
Integration · February 25, 2026

Salesforce API Limits - What Every Developer Should Know

A complete guide to every Salesforce API type, its limits, and the strategies that keep you from hitting them in production

☕ 10 min read 📅 February 25, 2026
  • API call limits are per 24-hour rolling window and scale with your user licenses, not just your edition
  • Bulk API 2.0 is the right choice for any operation over 2,000 records; it consumes API calls differently and is far more efficient
  • Monitor /limits endpoint proactively so you catch consumption spikes before they cause failures

I’ve been involved in Salesforce integrations that failed spectacularly in production because nobody thought seriously about API limits during design. A middleware that worked perfectly during UAT would start returning 503 errors on the third day of a marketing campaign because request volume spiked. Understanding limits is not an afterthought; it’s architecture.

Let me walk you through every API type Salesforce offers, what limits apply to each, and how to design systems that stay inside those limits.

The Limits Landscape at a Glance

API Types and Typical Daily Limits (Enterprise Edition Baseline)

  • REST API: 1,000,000 requests / day (shared pool with SOAP)
  • SOAP API: 1,000,000 requests / day (shared pool with REST)
  • Bulk API 2.0: 150 million records / day
  • Streaming API: 200,000 events / day
  • Pub/Sub API: 50,000 events / day

* Limits scale with user licenses. Enterprise Edition baseline shown. Check the /limits endpoint for your actual org.

Every number above is a baseline. Your actual limit scales based on the number of full (non-portal) user licenses your org has. The formula Salesforce uses is: base limit + (licenses × per-license allocation). For REST/SOAP combined, that’s roughly 1,000,000 + (users × 1,000) for Enterprise Edition.
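
As a rough sanity check, that scaling formula can be expressed as a tiny helper. The 1,000,000 baseline and 1,000-calls-per-license allocation are the Enterprise Edition figures quoted above; always confirm the real numbers for your org against the /limits endpoint.

```python
def effective_daily_api_limit(base_limit: int, full_licenses: int,
                              per_license_allocation: int = 1_000) -> int:
    """Estimate the rolling 24-hour API call limit for an org.

    base_limit: edition baseline (1,000,000 for Enterprise per this article)
    full_licenses: count of full (non-portal) user licenses
    """
    return base_limit + full_licenses * per_license_allocation

# An Enterprise org with 150 full user licenses:
print(effective_daily_api_limit(1_000_000, 150))  # 1150000
```

Note that 150 licenses already lifts the pool to 1,150,000, which matches the sample /limits response shown later in this post.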

REST API

The REST API is the default choice for most modern integrations. It returns JSON, supports standard HTTP verbs, and is straightforward to implement from any language or platform.

Key limits:

  • Combined daily API calls (shared with SOAP): 1,000,000 + user license allocation
  • Concurrent long-running requests (>20 seconds): 25 per org
  • Maximum response size: 10 MB per request
⚠️ Warning

The concurrent limit is the one that surprises teams. If you have a middleware that fires 30 simultaneous queries that each take 30 seconds, you will hit it. The mitigation is designing for async patterns and connection pooling.
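
One way to design for that async pattern is to gate outbound calls behind a semaphore so the integration can never exceed its share of the 25-slot concurrent budget. This is a hedged sketch, not a full client: the cap of 10 and the sleep standing in for a slow query are illustrative choices.

```python
import asyncio

# Cap in-flight calls well below Salesforce's 25 concurrent
# long-running request limit, leaving headroom for other clients.
MAX_IN_FLIGHT = 10

peak = 0    # peak observed concurrency (for demonstration only)
active = 0

async def call_salesforce(slots: asyncio.Semaphore, query_id: int) -> str:
    """Placeholder for a real HTTP call to the REST API."""
    global peak, active
    async with slots:               # block until a slot frees up
        active += 1
        peak = max(peak, active)
        await asyncio.sleep(0.01)   # stands in for a 30-second query
        active -= 1
    return f"result-{query_id}"

async def main() -> list[str]:
    slots = asyncio.Semaphore(MAX_IN_FLIGHT)
    # Fire 30 "simultaneous" requests; only MAX_IN_FLIGHT ever run at once.
    return await asyncio.gather(*(call_salesforce(slots, i) for i in range(30)))

results = asyncio.run(main())
print(peak)  # never exceeds MAX_IN_FLIGHT
```

The same idea works with a thread-pool semaphore if your middleware is synchronous; the point is that the cap lives in your code, not in hope.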

For reading records efficiently via REST, always use field selection in your SOQL queries rather than requesting all fields:

Inefficient

GET /services/data/v60.0/query?q=SELECT+FIELDS(ALL)+FROM+Account+LIMIT+200

Efficient

GET /services/data/v60.0/query?q=SELECT+Id,Name,Type,BillingCity+FROM+Account+WHERE+IsDeleted=false+LIMIT+200

❌ Anti-Pattern: Polling for Changes Every 5 Minutes

A middleware hits Salesforce every 5 minutes to check for new or updated Contacts — 288 calls per day per integration instance. With 10 integration instances running, that’s 2,880 calls/day just for idle polling. During a marketing campaign with real changes, each call also fetches 1,000-record pages.

✅ Better: Webhook-First with Delta Queries

Platform Events notify the middleware only when a Contact changes. The middleware calls Salesforce only to act on a confirmed change, and filters by LastModifiedDate > [last sync timestamp]. API consumption drops by 90%+ and the integration is faster and more accurate.
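
The delta-query half of that pattern can be sketched as a small URL builder. The Contact fields are illustrative; the timestamp format is the ISO 8601 UTC form SOQL expects for unquoted datetime literals.

```python
from datetime import datetime, timezone
from urllib.parse import quote

def build_delta_query_url(last_sync: datetime, api_version: str = "v60.0") -> str:
    """Build a REST query URL fetching only Contacts changed since last_sync."""
    # SOQL datetime literals are unquoted ISO 8601 with an explicit offset.
    stamp = last_sync.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    soql = (
        "SELECT Id, FirstName, LastName, Email "
        f"FROM Contact WHERE LastModifiedDate > {stamp} "
        "ORDER BY LastModifiedDate ASC"
    )
    return f"/services/data/{api_version}/query?q={quote(soql)}"

url = build_delta_query_url(datetime(2026, 2, 25, 8, 0, tzinfo=timezone.utc))
print(url)
```

Persist the high-water-mark timestamp after each successful sync so a crash never silently skips a window of changes.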

SOAP API

SOAP shares the same daily API call pool as REST. The primary reason to use SOAP today is compatibility with legacy enterprise middleware (MuleSoft implementations from 2015, SAP integrations, etc.) that were built against the WSDL before REST was available.

Unique SOAP capabilities
  • retrieve() call: fetch specific fields from specific record IDs in bulk
  • describeSObjects(): get metadata about objects
  • merge(): merge duplicate records programmatically (not available in REST)

If you are building something new, use REST. SOAP exists for maintenance and legacy compatibility.

Bulk API 2.0

This is the API that fundamentally changes how you think about data volume. Bulk API 2.0 was designed for operations involving thousands to millions of records.

ℹ️ Info

The critical difference: Bulk API calls do not count against your standard daily API call limit in the same way. Instead, Salesforce limits bulk operations to 150 million records per day for most editions, and the number of batches (jobs) you can run concurrently.

When to use Bulk API:

  • Any data load over 2,000 records
  • Nightly data syncs from external systems
  • Mass updates (update all records where field X = Y)
  • Data migrations
# Bulk API 2.0 flow
# 1. Create a job
POST /services/data/v60.0/jobs/ingest
{
  "object": "Contact",
  "operation": "upsert",
  "externalIdFieldName": "External_Id__c",
  "contentType": "CSV",
  "lineEnding": "LF"
}
# Response includes job ID

# 2. Upload data (CSV)
PUT /services/data/v60.0/jobs/ingest/{jobId}/batches
[CSV data as body]

# 3. Close the job to trigger processing
PATCH /services/data/v60.0/jobs/ingest/{jobId}
{"state": "UploadComplete"}

# 4. Poll for results
GET /services/data/v60.0/jobs/ingest/{jobId}

# 5. Get success/failure records
GET /services/data/v60.0/jobs/ingest/{jobId}/successfulResults
GET /services/data/v60.0/jobs/ingest/{jobId}/failedResults
💡 Bulk API 2.0 vs 1.0: Use 2.0

Bulk API 1.0 required splitting your data into batches of up to 10,000 records manually and managing each batch separately. Bulk API 2.0 handles batching internally: you upload the full CSV and Salesforce splits it. Unless you have a legacy integration locked to 1.0, always use 2.0.
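
The polling step of the flow above reduces to a small state handler. The state names are the documented Bulk API 2.0 job states; the action labels and any polling interval you pair with "poll_again" are your own design choices.

```python
# Map the "state" field from GET /jobs/ingest/{jobId} to a poller action.
TERMINAL_STATES = {"JobComplete", "Failed", "Aborted"}

def next_action(job_state: str) -> str:
    """Decide what a Bulk API 2.0 poller should do next."""
    if job_state in TERMINAL_STATES:
        return "fetch_results" if job_state == "JobComplete" else "handle_failure"
    if job_state in {"Open", "UploadComplete", "InProgress"}:
        return "poll_again"   # e.g. sleep 10s and re-check; interval is a choice
    raise ValueError(f"unknown job state: {job_state}")

print(next_action("InProgress"))   # poll_again
print(next_action("JobComplete"))  # fetch_results
```

Treating Failed and Aborted as a distinct branch matters: only JobComplete has successfulResults worth downloading, while failedResults should be fetched and logged on the failure path.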

Streaming API and Platform Events

Streaming API (PushTopic, Generic Streaming) and Platform Events let you subscribe to real-time data changes rather than polling.

Streaming limits (Enterprise Edition):

  • 200,000 daily deliveries per org
  • 1,000 concurrent clients per org
  • 24-hour subscription duration (clients must re-subscribe daily)

Platform Events are the modern replacement for PushTopic streaming. They’re more flexible, support replay IDs for at-least-once delivery, and integrate natively with Apex triggers and Flow.

// Publish a platform event from Apex
Order_Update__e event = new Order_Update__e(
    Order_Id__c = orderId,
    Status__c = 'Shipped',
    Tracking_Number__c = trackingNum
);
Database.SaveResult sr = EventBus.publish(event);
⚠️ Warning

The 200,000 daily delivery limit sounds large until you have a high-volume order management system with 10 subscribers per event. 200,000 deliveries / 10 subscribers = 20,000 events before you hit the cap. Design your event volume accordingly.
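
That budgeting math is worth making explicit, because each event counts one delivery per subscriber. A one-liner makes the trade-off easy to plug into capacity planning (the 200,000 cap is the Enterprise figure quoted above):

```python
def daily_event_budget(delivery_limit: int, subscribers_per_event: int) -> int:
    """Events per day that fit under the delivery cap, given that every
    published event is delivered once to each subscriber."""
    return delivery_limit // subscribers_per_event

# Enterprise delivery cap, order events fanned out to 10 subscribers:
print(daily_event_budget(200_000, 10))  # 20000
```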

Pub/Sub API (gRPC)

Pub/Sub API is the newest and most efficient option for high-throughput event streaming. It uses gRPC (binary protocol) instead of HTTP, which dramatically reduces overhead.

Best for: CDC (Change Data Capture) subscriptions, high-frequency event consumers, building real-time data pipelines into external systems like Kafka or Snowflake.

Pub/Sub API limit details

The default limit is lower than the standard Streaming API’s, but Salesforce positions Pub/Sub as the enterprise-grade option whose limits can be raised through contract negotiation.

Strategies to Optimize API Consumption

Composite API requests: Bundle up to 25 REST API calls into a single HTTP request. This is the single most impactful optimization for chatty integrations.

POST /services/data/v60.0/composite
{
  "allOrNone": false,
  "compositeRequest": [
    {
      "method": "GET",
      "url": "/services/data/v60.0/sobjects/Account/{!accountId}",
      "referenceId": "Account"
    },
    {
      "method": "GET",
      "url": "/services/data/v60.0/query?q=SELECT+Id+FROM+Contact+WHERE+AccountId='@{Account.Id}'",
      "referenceId": "Contacts"
    }
  ]
}

Caching: If your integration queries the same reference data repeatedly (price books, product catalogs, record type IDs), cache those results in your middleware layer. Reference data doesn’t change often; querying it 10,000 times a day is pure waste.

Delta queries with LastModifiedDate: Instead of pulling all records on every sync, filter by LastModifiedDate > [timestamp of last sync]. This can reduce your API consumption by 90%+ for mature orgs where most records don’t change daily.

Webhook-first design: Instead of polling Salesforce for changes, have Salesforce notify your system when something changes using Outbound Messages or Platform Events. This inverts the cost: you only make API calls when you need to act on a change, not just to check.

Monitoring Your API Usage

The /limits endpoint is your best friend. It returns the current state of every governor limit in your org, including how many API calls remain in the current 24-hour window.

GET /services/data/v60.0/limits
Authorization: Bearer {token}

The response includes:

{
  "DailyApiRequests": {
    "Max": 1150000,
    "Remaining": 847232
  },
  "DailyBulkV2QueryFileStorageMB": {
    "Max": 10240,
    "Remaining": 9980
  }
}
💡 Set Your Alert Threshold at 20%, Not 10%

Teams commonly set their API limit alerts at 10% remaining, which gives almost no reaction time. If you discover you’re at 10% at 2 PM with 10 hours left in the window, you’re already in crisis mode. Alert at 20% remaining: that gives you time to throttle integrations, defer batch jobs, or contact Salesforce support for a temporary limit increase before users are affected.
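
Wiring that threshold to the /limits response is a few lines. This sketch assumes you have already fetched and parsed the JSON shown above; the 20% threshold is the recommendation from this post, not a Salesforce default.

```python
def should_alert(limits: dict, threshold: float = 0.20) -> bool:
    """True when remaining daily API calls fall below the threshold fraction.

    `limits` is the parsed JSON body from GET /services/data/vXX.X/limits.
    """
    daily = limits["DailyApiRequests"]
    return daily["Remaining"] / daily["Max"] < threshold

# The sample response from above: ~74% remaining, no alert yet.
sample = {"DailyApiRequests": {"Max": 1_150_000, "Remaining": 847_232}}
print(should_alert(sample))  # False
```

Run a check like this on a schedule (every 15 minutes is plenty; the check itself consumes one API call) and page whoever owns the integrations when it flips to True.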

The Limit You Never Think About: Data Storage

API call volume is the most commonly discussed limit, but data storage catches teams off guard just as often. Every record you create via API consumes storage. Enterprise Edition includes 10 GB of file storage and 1 GB of data storage per org, with data storage growing by 20 MB per user license.

Salesforce counts most records as about 2 KB of data storage regardless of how many fields they hold, so 1 GB of storage is roughly 500,000 records. A daily sync creating 50,000 records would fill that in about 10 days. Plan your archival and deletion strategy before you need it.
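
A quick projection helper makes this planning concrete. The 2 KB default reflects Salesforce's per-record storage accounting for most objects; pass a different bytes_per_record if your objects are billed differently.

```python
def days_until_storage_full(storage_gb: float, records_per_day: int,
                            bytes_per_record: int = 2 * 1024) -> float:
    """Project how many days until a daily sync exhausts data storage.

    bytes_per_record defaults to the ~2 KB Salesforce counts for most
    records, independent of field count.
    """
    capacity_records = storage_gb * 1024**3 / bytes_per_record
    return capacity_records / records_per_day

# 1 GB of data storage, 50,000 new records created per day:
print(round(days_until_storage_full(1, 50_000)))  # 10
```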

Summary Checklist

Before any integration goes to production, I run through this list:

  • Daily API call estimate calculated against org limit
  • Operations over 2,000 records use Bulk API 2.0
  • Composite API used for multi-step operations
  • Delta queries implemented โ€” no full-table syncs
  • Monitoring alert on /limits endpoint at 80% consumption
  • Error handling covers rate-limit errors (REQUEST_LIMIT_EXCEEDED, HTTP 403) with exponential backoff
  • Data storage growth projection reviewed
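
The backoff item on that checklist can be sketched as a retry schedule with full jitter. The base delay, cap, and retry count here are illustrative defaults to tune per integration, not Salesforce-mandated values.

```python
import random

def backoff_delays(max_retries: int = 5, base: float = 1.0,
                   cap: float = 60.0, jitter: bool = True) -> list[float]:
    """Exponential backoff schedule, optionally with full jitter.

    Jitter spreads retries out so a fleet of throttled clients does not
    hammer the API again in lockstep.
    """
    delays = []
    for attempt in range(max_retries):
        delay = min(cap, base * 2 ** attempt)
        delays.append(random.uniform(0, delay) if jitter else delay)
    return delays

print(backoff_delays(jitter=False))  # [1.0, 2.0, 4.0, 8.0, 16.0]
```

Sleep for each delay in turn between retries, and give up (and alert) once the schedule is exhausted rather than retrying forever against a hard daily cap.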

What API limits have caught your production integrations off guard? Are you proactively monitoring or mostly discovering limits after theyโ€™ve been hit? Iโ€™d love to hear what monitoring setup your team has in place.

