When working with Captain Data’s API, a common question arises: Should you process inputs one by one or in batches? The answer depends on your needs: chiefly, how quickly you need results and how much data you are processing.

TL;DR

| Single Input Processing | Batch Input Processing |
| --- | --- |
| Real-time results for fast actions | 🏋️‍♂️ Efficient for large datasets |
| ⏱️ Process one input at a time | 🚀 Send multiple inputs in one go |
| 🚫 Cold start may occur | ✅ Reduces network overhead |
| 💡 Best for time-sensitive tasks | ⚡ Great for bulk updates |
| ❌ More API calls for large data | 🛠️ Fewer API calls, better for scaling |
| 🔄 Needs faster feedback | 📈 Best for large-scale enrichment |

🚀 Single Input Processing

What is it?

Single input processing means sending one input per request to the API via POST Launch a Workflow. Each input gets its own run, so every record is processed independently.
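As a rough sketch, a single-input launch could look like the following. The base URL, endpoint path, and payload keys here are illustrative assumptions, not Captain Data’s documented schema; check the API reference for the real ones.

```python
import json
import urllib.request

API_BASE = "https://api.captaindata.example"  # hypothetical base URL
API_KEY = "YOUR_API_KEY"

def build_launch_payload(inputs):
    """Wrap one or more inputs in a launch request body (assumed shape)."""
    return {"inputs": inputs}

def launch_workflow(workflow_uid, single_input):
    """POST a single input to Launch a Workflow (illustrative only)."""
    body = json.dumps(build_launch_payload([single_input])).encode()
    req = urllib.request.Request(
        f"{API_BASE}/workflows/{workflow_uid}/launch",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)  # one request per input

# One input per call: this profile is processed as its own run.
payload = build_launch_payload(
    [{"linkedin_profile_url": "https://www.linkedin.com/in/example"}]
)
```

Each call launches one run, which is what gives you the real-time, per-record feedback described above.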

When to Use It?

Use single input processing when you need to act quickly on individual pieces of data, especially when timing is crucial.

Examples:

  1. Enriching a new user:
    • A user just signed up on your platform, and you need their data enriched immediately for a personalized experience.
    • Sending one input at a time ensures real-time updates, keeping the user experience smooth.
  2. Updating specific database entries:
    • When updating 100 profiles in your HubSpot database, sending inputs one by one ensures you only process what’s necessary, avoiding the overhead of handling entire batches.

Why?

  • Single input processing is ideal for small to medium-sized datasets.
  • It delivers faster, more targeted results for time-sensitive needs.

📦 Batch Input Processing

What is it?

Batch input processing groups multiple inputs into a single API request. Instead of sending inputs one by one, you send a large set of inputs in one go.
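Sketched in code, the difference is simply how many inputs go into one request body. The payload shape below is an assumption for illustration; the `chunk` helper is a generic pattern, not part of Captain Data’s API.

```python
def chunk(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# 10,000 profiles in one launch instead of 10,000 separate API calls:
profiles = [{"email": f"user{i}@example.com"} for i in range(10_000)]
batch_payload = {"inputs": profiles}  # hypothetical request body shape

# If you prefer to cap the size of each request, split the list first:
batches = chunk(profiles, 2_500)  # 4 requests of 2,500 inputs each
```

One request carrying 10,000 inputs replaces 10,000 round trips, which is where the reduced network overhead comes from.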

When to Use It?

Batch processing is more efficient for large datasets or when performing bulk actions.

Examples:

  1. Updating your HubSpot database:
    • If you’re updating 10,000+ profiles, batch processing is much more efficient than handling each profile individually.
    • This method reduces unnecessary API calls and minimizes network overhead.
  2. Bulk data enrichment:
    • When enriching a list of leads, processing them in bulk reduces both processing time and API costs.

Why?

  • Batch input processing excels at handling high volumes of data efficiently.
  • It minimizes API calls, which is crucial for large-scale updates.

⚖️ Single vs. Batch: How to Decide

A few examples:

| Use Case | Recommendation | Reason |
| --- | --- | --- |
| Enriching a user at signup | Single Input | Faster response for real-time needs. |
| Updating large databases | Batch Processing | More efficient for high volumes of data. |

Rate Limiting on Parallel Inputs

Captain Data applies rate limits depending on how many inputs you launch in parallel.

If you exceed your rate limit, you will receive a 429 error. Runs won’t be queued automatically, so you’ll need to handle this error on your side.
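Because runs are not queued for you, your client needs its own 429 handling. Here is a minimal retry-with-exponential-backoff sketch; `launch` is a stand-in for whatever function performs your actual API call, and `_FakeResponse` only exists to demonstrate the flow.

```python
import time

class _FakeResponse:
    """Stand-in for your HTTP client's response object."""
    def __init__(self, status):
        self.status = status

def launch_with_retry(launch, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Call `launch()`, retrying with exponential backoff on HTTP 429."""
    for attempt in range(max_retries + 1):
        response = launch()
        if response.status != 429:
            return response
        sleep(base_delay * (2 ** attempt))  # wait 1s, 2s, 4s, ...
    raise RuntimeError(f"still rate-limited after {max_retries} retries")

# Simulate two 429 responses followed by a success:
responses = [_FakeResponse(429), _FakeResponse(429), _FakeResponse(200)]
result = launch_with_retry(lambda: responses.pop(0), sleep=lambda s: None)
```

In production you would keep the default `sleep` and plug in your real launch call; the injectable `sleep` just keeps the example fast to run.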

Examples Based on a Rate Limit of 100 Inputs in Parallel

  • Launching 100 workflows with 1 input each is acceptable. The 101st workflow will be rate-limited and return a 429.
  • If you launch a single workflow with 5,000 inputs, the parallel-input limit does not apply, because the inputs are batched into a single run.
  • Launching 20 workflows with 5 inputs each will hit the parallel input limit (20 × 5 = 100 inputs).
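Since a single batch launch sidesteps the parallel-input limit, one practical workaround is to merge many small launches into one. This helper is a hypothetical illustration, not a Captain Data API call:

```python
def merge_launches(launches):
    """Combine several per-workflow input lists into one batch launch body."""
    merged = []
    for inputs in launches:
        merged.extend(inputs)
    return {"inputs": merged}  # hypothetical request body shape

# 20 workflows x 5 inputs would hit a 100-parallel-input limit;
# the same 100 inputs in one batch launch would not.
launches = [[{"id": w * 5 + i} for i in range(5)] for w in range(20)]
body = merge_launches(launches)  # 100 inputs, a single launch
```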

Rate Limiting by Plan

  • AppSumo, Pro, Lunar, Mars plans: Limited to 100 inputs launched in parallel.
  • Scale & Growth plans: Limited to 1,000 inputs launched in parallel.
  • Enterprise plan: No rate limits apply.