If you’ve been building on Salesforce long enough, you’ve hit this wall: you need to process millions of records, so you write Batch Apex. Then you deal with the rigid start() -> execute() -> finish() lifecycle, the lack of fine-grained control over chunk sizes, the inability to easily chain jobs, and the painful debugging cycle when something breaks at batch #847 of 2,000.
Apex Cursors change this entirely. Introduced as Beta in Summer ‘24 and reaching GA maturity in Spring ‘26, cursors give you a pointer to a SOQL result set — not the result itself. You fetch only the slice you need, when you need it.
The Pain Every Salesforce Developer Knows
The Problem
Scenario: A financial services org processes 12 million Account records nightly to recalculate risk scores. Their Batch Apex job takes 4+ hours, and when it fails at record 8 million due to a transient lock error, the entire job restarts from scratch.
The Cursor Solution
Use an Apex Cursor inside a Queueable job. Process 2,000 records per execution, track position automatically, and chain the next Queueable. If one chunk fails, retry that chunk only — no restart required.
How Apex Cursors Actually Work
An Apex Cursor is a server-side pointer to a SOQL query result set. When you call Database.getCursor(query), Salesforce executes the query and caches the result on the server — but returns only a lightweight locator object. No records are loaded into heap memory until you explicitly call cursor.fetch(position, count).
Core Syntax
// 1. Create a cursor — no records loaded yet
Database.Cursor cursor = Database.getCursor(
    'SELECT Id, Name, Amount FROM Opportunity WHERE StageName = \'Closed Won\''
);

// 2. Check total records
Integer totalRecords = cursor.getNumRecords();

// 3. Fetch in controlled chunks
Integer position = 0, chunkSize = 2000;
while (position < totalRecords) {
    List<SObject> records = cursor.fetch(position, chunkSize);
    processRecords(records); // your processing logic
    position += records.size();
}

Scenario I: Mass Email Campaign

A re-engagement campaign needs to email every Contact with no activity in the past 400 days. A Queueable job holds the cursor, processes one chunk per execution, and re-enqueues itself until the result set is exhausted:
public class ReEngagementEmailJob implements Queueable {
    private Database.Cursor cursorLocator;
    private Integer position;
    private Integer batchSize = 2000;
    private Integer retryCount = 0;

    // Initial job: build the cursor over all eligible contacts
    public ReEngagementEmailJob() {
        this.cursorLocator = Database.getCursor(
            'SELECT Id, Name, Email FROM Contact ' +
            'WHERE LastActivityDate < LAST_N_DAYS:400 ' +
            'AND Email != null AND HasOptedOutOfEmail = false'
        );
        this.position = 0;
    }

    // Chaining constructor: carries the cursor, position, and retry count forward
    private ReEngagementEmailJob(Database.Cursor cursorLocator,
                                 Integer position,
                                 Integer retryCount) {
        this.cursorLocator = cursorLocator;
        this.position = position;
        this.retryCount = retryCount;
    }

    public void execute(QueueableContext ctx) {
        Integer totalRecords = cursorLocator.getNumRecords();
        if (position >= totalRecords) return;

        try {
            List<Contact> contacts =
                (List<Contact>) cursorLocator.fetch(position, batchSize);
            sendReEngagementEmails(contacts); // email helper defined elsewhere
            position += contacts.size();
            retryCount = 0;
        } catch (TransientCursorException e) {
            retryCount++;
            // After three failed retries, skip this chunk and move on
            if (retryCount > 3) { position += batchSize; retryCount = 0; }
        }

        if (position < totalRecords) {
            System.enqueueJob(
                new ReEngagementEmailJob(cursorLocator, position, retryCount)
            );
        }
    }
}

Scenario II: 15M Record Data Migration
The Problem
An enterprise org is migrating 15 million Lead records with external API enrichment (address verification). Batch Apex’s default 200-record scope paired with the 100-callout limit per transaction makes orchestration a nightmare.
The Cursor Solution
Use a cursor with a chunk size of 50 records — matching your callout budget. The cursor tracks your exact position, so if the API is throttled, you pause and resume precisely where you left off.
public class LeadMigrationJob implements Queueable, Database.AllowsCallouts {
    private Database.Cursor cursor;
    private Integer position;
    private static final Integer CHUNK = 50; // matches the per-transaction callout budget

    // Initial job: create the cursor and start at the beginning
    // (the query fields shown here are illustrative)
    public LeadMigrationJob() {
        this.cursor = Database.getCursor(
            'SELECT Id, Street, City, State, PostalCode FROM Lead'
        );
        this.position = 0;
    }

    // Chaining constructor: resume from a known position
    private LeadMigrationJob(Database.Cursor cursor, Integer position) {
        this.cursor = cursor;
        this.position = position;
    }

    public void execute(QueueableContext ctx) {
        Integer total = cursor.getNumRecords();
        if (position >= total) return;
        List<Lead> leads = (List<Lead>) cursor.fetch(position, CHUNK);
        for (Lead ld : leads) {
            AddressVerification result =
                AddressAPI.verify(ld.Street, ld.City, ld.State);
            ld.Street = result.standardizedStreet;
            ld.PostalCode = result.zip;
        }
        Database.update(leads, false);
        position += leads.size();
        if (position < total) {
            System.enqueueJob(
                new LeadMigrationJob(cursor, position)
            );
        }
    }
}

Scenario III: LWC Pagination — 500K Records

The controller below creates a cursor on the first call, then hands the serialized cursor back to the Lightning Web Component so each later page request resumes from the same result set:
public with sharing class CasePaginationController {

    @AuraEnabled
    public static PaginationResult getPage(String serializedCursor,
                                           Integer position,
                                           Integer pageSize) {
        Database.Cursor cursorLocator;

        if (serializedCursor == null) {
            // First call — initialize a new cursor
            cursorLocator = Database.getCursor(
                'SELECT Id, CaseNumber, Subject, Status, Priority ' +
                'FROM Case ORDER BY CreatedDate DESC'
            );
        } else {
            // Restore cursor from serialized state
            cursorLocator = (Database.Cursor)
                JSON.deserializeStrict(serializedCursor, Database.Cursor.class);
        }

        PaginationResult result = new PaginationResult();
        result.records = (List<Case>) cursorLocator.fetch(position, pageSize);
        result.totalRecords = cursorLocator.getNumRecords();
        result.currentPosition = position;
        result.serializedCursor = JSON.serialize(cursorLocator);
        return result;
    }

    public class PaginationResult {
        @AuraEnabled public List<Case> records;
        @AuraEnabled public Integer totalRecords;
        @AuraEnabled public Integer currentPosition;
        @AuraEnabled public String serializedCursor;
    }
}

Feature Comparison
| Feature | Batch Apex | Apex Cursors |
|---|---|---|
| Max Records | 50 million | 50 million per cursor |
| Chunk Size | Fixed per job (default 200, max 2,000) | Any size you choose, per fetch call |
| Error Recovery | Restarts entire job | Retry specific chunk |
| Chaining | Limited chain support | Queueable chains |
| Concurrent Limit | 5 batch jobs queued or active at once | No cursor-specific concurrency limit (standard Queueable limits apply) |
| Bidirectional | Forward only | Forward and backward (see the sketch after this table) |
| Callout Support | Requires AllowsCallouts | Requires AllowsCallouts (same as Batch) |
| UI Pagination | Not designed for UI | PaginationCursor class |
| Heap Usage | Loads entire batch | Only fetched chunk |
| Code Complexity | 3-method interface | 2-3 lines of code |
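On the bidirectional point: a cursor's fetch position is absolute, so moving "backward" just means fetching an earlier offset from the same cached result set. A minimal sketch, using the same Database.getCursor and fetch calls shown above (the query and variable names are illustrative):

Database.Cursor cursor = Database.getCursor('SELECT Id, Name FROM Account ORDER BY Name');
Integer pageSize = 200;

// Page 3 of the result set (records 400 to 599)
List<SObject> pageThree = cursor.fetch(400, pageSize);

// Step back to page 2 (records 200 to 399), which Batch Apex cannot do
List<SObject> pageTwo = cursor.fetch(200, pageSize);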
Governor Limits to Know
Apex Cursors come with their own governor limits. Know these before deploying to production, and build monitoring into your Queueable chains.
Use Limits.getApexCursorRows() and Limits.getFetchCallsOnApexCursors() to monitor your cursor consumption at runtime. Pair these with OrgLimits.getMap().get('DailyApexCursors') for daily tracking.
Architect’s Best Practices
1. Right-Size Your Chunk Size
Your chunk size should be driven by what you’re doing in each transaction — not by a default. For DML-heavy operations (inserts, updates, deletes), keep chunks around 200 records to stay safely within DML limits. For read-only processing like aggregations or reporting, you can push up to 2,000 records per fetch. If you’re making external callouts, keep chunks under 100 — each transaction allows only 100 callouts, and you’ll want headroom for retries.
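A minimal sketch of that sizing logic as a helper. ChunkSizing and Workload are illustrative names, not a platform API; the numbers simply encode the guidance above:

public class ChunkSizing {
    public enum Workload { DML_HEAVY, READ_ONLY, CALLOUT }

    public static Integer chunkSizeFor(Workload w) {
        if (w == Workload.CALLOUT) return 50;     // 100 callouts per transaction; leave headroom for retries
        if (w == Workload.DML_HEAVY) return 200;  // stays comfortably under the 10,000-row DML limit
        return 2000;                              // read-only aggregation can take the largest fetches
    }
}

Inside a Queueable you would then fetch with, for example, cursor.fetch(position, ChunkSizing.chunkSizeFor(ChunkSizing.Workload.CALLOUT)).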
2. Always Handle TransientCursorException
The platform throws TransientCursorException when the server-side cursor cache is temporarily unavailable. This is not a fatal error — it’s a signal to retry. Build a retry counter into your Queueable, and set a maximum (typically 3 retries). If all retries fail, log the failure with the exact position and skip forward. This granular recovery is the single biggest advantage over Batch Apex.
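A compact sketch of that pattern inside a Queueable's execute() method, assuming the cursorLocator, position, batchSize, and retryCount fields from Scenario I; handleRecords is a placeholder for your own logic:

try {
    List<SObject> records = cursorLocator.fetch(position, batchSize);
    handleRecords(records);              // placeholder for your business logic
    position += records.size();
    retryCount = 0;                      // a healthy fetch resets the counter
} catch (TransientCursorException e) {
    retryCount++;
    if (retryCount > 3) {
        // Retries exhausted: log exactly where we gave up, then skip this chunk
        System.debug(LoggingLevel.ERROR,
            'Cursor fetch failed at position ' + position + ': ' + e.getMessage());
        position += batchSize;
        retryCount = 0;
    }
    // Otherwise leave position unchanged so the next chained job retries the same chunk
}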
3. Monitor with the Limits Class
Salesforce exposes detailed cursor metrics through both the Limits and OrgLimits classes. Check Limits.getApexCursorRows() within your transaction, and use OrgLimits for daily consumption tracking. Build guardrails: if you’re within 20% of the daily cursor limit, log a warning and optionally pause your chain until the next 24-hour window.
// Transaction-level checks
System.debug('Cursor rows used: ' + Limits.getApexCursorRows());
System.debug('Fetch calls used: ' + Limits.getFetchCallsOnApexCursors());
System.debug('Max fetch calls: ' + Limits.getLimitFetchCallsOnApexCursors());
// Daily org-level check
Map<String, System.OrgLimit> limitsMap = OrgLimits.getMap();
System.OrgLimit cursorLimit = limitsMap.get('DailyApexCursors');
Integer remaining = cursorLimit.getLimit() - cursorLimit.getValue();
System.debug('Daily cursors remaining: ' + remaining);
// Safety guardrail
if (remaining < 500) {
    System.debug('WARNING: Approaching daily cursor limit. Pausing chain.');
    return; // Don't enqueue next job
}

4. Know When Batch Apex Is Still the Right Tool

Batch Apex is still the right choice for simple, scheduled, set-it-and-forget-it bulk processing where you don’t need fine-grained control. A nightly job that deactivates expired users? Batch is fine. Use cursors when you need custom chunk sizes, error recovery, Queueable chaining, or UI-driven pagination. Think of cursors as a scalpel and Batch Apex as a sledgehammer — both have their place in the toolkit.
5. Serialize Cursor State for Cross-Transaction Use
Cursors can be serialized to JSON and restored in a subsequent transaction. This is what makes the Queueable chaining pattern work — each job receives the serialized cursor string, restores it, fetches the next chunk, and passes it forward.
// Serialize cursor for passing between transactions
String serializedCursor = JSON.serialize(cursorLocator);

// Restore in next transaction
Database.Cursor restored = (Database.Cursor)
    JSON.deserializeStrict(serializedCursor, Database.Cursor.class);

Key Takeaways
Apex Cursors fundamentally change how we approach large data processing in Salesforce. They are not a minor convenience — they solve real architectural pain points that have plagued the platform for over a decade.
Use Cursors when: you need custom chunk sizes, granular error recovery, Queueable chaining, UI pagination beyond 2,000 records, or when your Batch Apex jobs are hitting the 5-concurrent-job ceiling and blocking other processes.
Keep Batch Apex when: you have simple, scheduled, set-it-and-forget-it jobs that process records in a straightforward manner without needing fine-grained control.
The bottom line? If you’ve ever watched a Batch Apex job fail at 3 AM and restart from scratch — Apex Cursors are your answer. This feature is GA, production-ready, and available now.
Set up a scratch org, write a Queueable job with a cursor, and process a million records. The Salesforce developer documentation has complete API references for Database.Cursor, PaginationCursor, and the related Limits methods.
What’s your experience with large data processing in Salesforce? Have you tried cursors yet, or are you still wrestling with Batch Apex? I’d love to hear how you’re handling high-volume scenarios in your org.