I want to start with a number: 100. That is the maximum number of SOQL queries you can execute in a single Apex transaction. In a sandbox with a few hundred records, you can easily write code that executes 3 queries per record, never notice, and then watch that same code fail at 33 simultaneous record updates in production because a workflow also fires 3 queries, putting you at 102.
SOQL optimization is not optional for production Salesforce development. Here are the ten mistakes I see most often and exactly how to fix each one.
The Query Execution Plan: Your Debugging Tool
Before covering the mistakes, you need to know how to investigate them. The Query Plan tool in the Developer Console shows you exactly how Salesforce's query optimizer will execute your SOQL.
Open Developer Console → Help → Query Plan. Paste your SOQL and look at the Cost column. Any value above 1.0 means Salesforce's optimizer prefers a full table scan over using an index. In small orgs, this is tolerable. In orgs with millions of records, this is a time bomb.
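For example, you might paste a query like the following (Status__c is a hypothetical custom field used for illustration):

```sql
SELECT Id, Name
FROM Account
WHERE Status__c = 'Active'
```

If the plan shows a TableScan leading operation with a Cost above 1.0, the filter is not selective; an Index leading operation with a Cost below 1.0 means the optimizer found a usable index for that filter.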
Mistake 1: SOQL Inside Loops
This is the most famous mistake and still the most common. Every SOQL query executed inside a loop iteration counts separately against the 100-query limit.
The Problem
Scenario: A trigger on Opportunity runs in a sandbox with 20 records and passes all tests. It contains a SOQL query inside a loop to fetch the parent Account name for each Opportunity. In production, a data import updates 150 Opportunities at once: the trigger fires 150 SOQL queries, hits the 100-query governor limit, and the entire import fails with a cryptic "Too many SOQL queries: 101" error.
The Solution
Move all SOQL outside the loop. Collect all AccountIds from Trigger.new into a Set, run a single query to build a Map<Id, Account>, then iterate over the records using in-memory map lookups. One query handles 1 or 10,000 records identically.
Bad: Query per Record
for (Opportunity opp : opportunities) {
    Account acc = [SELECT Name FROM Account WHERE Id = :opp.AccountId];
    // do something with acc
}
Good: Map Pattern
Set<Id> accountIds = new Set<Id>();
for (Opportunity opp : opportunities) {
    accountIds.add(opp.AccountId);
}
Map<Id, Account> accountMap = new Map<Id, Account>(
    [SELECT Id, Name FROM Account WHERE Id IN :accountIds]
);
for (Opportunity opp : opportunities) {
    Account acc = accountMap.get(opp.AccountId);
    // do something with acc
}
The Map pattern is the standard fix. It uses exactly one SOQL query regardless of how many records you are processing.
As a sanity check, run the fixed query through the Query Plan tool: any query with Cost > 1.0 on an object with more than 100,000 records is a future production incident waiting to happen.
Mistake 2: Querying Without Selective Filters
A selective query is one where the WHERE clause filters out enough records that Salesforce can use an index efficiently. The thresholds Salesforce uses are:
- Standard indexes: selective if the filter returns fewer than 30% of total records, capped at 1 million records
- Custom indexes: selective if the filter returns fewer than 10% of total records, capped at 333,333 records
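As a worked example using the simplified thresholds above, consider a hypothetical org with 2 million Contact records:

```text
Standard index: 30% of 2,000,000 = 600,000 rows (below the 1M cap)
Custom index:   10% of 2,000,000 = 200,000 rows (below the 333,333 cap)
```

A WHERE clause on a standard-indexed field stays selective only if it returns fewer than 600,000 rows; on a custom-indexed field, fewer than 200,000.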
Bad: Non-Selective
List<Contact> contacts = [SELECT Id, Name FROM Contact WHERE Status__c = 'Active'];
Better: Combined Filters
List<Contact> contacts = [
    SELECT Id, Name FROM Contact
    WHERE Status__c = 'Active'
    AND CreatedDate = LAST_N_DAYS:30
];
Adding a date filter dramatically reduces the result set and makes the query selective.
Mistake 3: Missing Index on Filter Fields
Salesforce automatically indexes: Id, Name, OwnerId, CreatedDate, SystemModstamp, RecordTypeId, and lookup/master-detail relationship fields. Everything else requires an explicit custom index request through Salesforce Support, or you need to redesign your filter strategy.
When to request a custom index from Salesforce Support
Before filing a support case for a custom index, verify the field has high cardinality, meaning many distinct values. Indexing a field with only 5 possible values does not help because Salesforce's threshold math means the index is never selective enough to be useful.
Fields worth requesting custom indexes for: external IDs used in integrations, frequently queried unique text fields, date fields used in range queries.
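As a sketch, assume an integration writes records keyed by a hypothetical Legacy_Id__c text field that has been granted a custom index. An equality filter on a high-cardinality field like this stays selective even in a large org:

```apex
// Legacy_Id__c is a hypothetical high-cardinality external key with a custom index
String externalKey = 'INV-2024-000123';
List<Account> matches = [
    SELECT Id, Name
    FROM Account
    WHERE Legacy_Id__c = :externalKey
];
```

Note that marking the field as External ID when you create it indexes it automatically, which avoids the support case entirely.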
Mistake 4: Over-Fetching Fields
Bad: Over-Fetching
List<Opportunity> opps = [SELECT Id, Name, Amount, CloseDate, StageName,
    Account.Name, Account.BillingCity, Account.Phone, Account.Website,
    Account.AnnualRevenue, Account.Industry ... FROM Opportunity];
Good: Minimal Fields
List<Opportunity> opps = [SELECT Id, Name, Amount, StageName FROM Opportunity
    WHERE CloseDate = THIS_QUARTER];
Over-fetching increases heap size (limited to 6 MB per synchronous transaction, 12 MB for asynchronous) and increases query execution time. Query only the fields your code references.
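To see the heap cost of a query directly, you can log consumption with the standard Limits methods; a quick sketch for anonymous Apex:

```apex
List<Opportunity> opps = [SELECT Id, Name, Amount, StageName
                          FROM Opportunity
                          WHERE CloseDate = THIS_QUARTER];
// Compare heap used so far against the transaction ceiling
System.debug('Heap: ' + Limits.getHeapSize()
    + ' of ' + Limits.getLimitHeapSize() + ' bytes');
```

Run it once with a minimal field list and once with every field you are tempted to include, and compare the two numbers.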
Mistake 5: Not Using Query Locators for Batch Apex
In Batch Apex, the start() method should return a Database.QueryLocator rather than a list of records held in memory. This is not just style: a QueryLocator can stream up to 50 million records through the batch, while records returned directly from a query are subject to the normal 50,000-row retrieval limit.
Bad: List Return
global List<Account> start(Database.BatchableContext bc) {
    return [SELECT Id FROM Account WHERE IsActive__c = true];
}
Good: QueryLocator
global Database.QueryLocator start(Database.BatchableContext bc) {
    return Database.getQueryLocator(
        'SELECT Id FROM Account WHERE IsActive__c = true'
    );
}
Mistake 6: Relationship Queries When a Single Query Works
Parent-to-child relationship queries (with an inner SELECT) are convenient, but they have their own constraint: child rows are returned in chunks of 200 per parent, and in some contexts (Batch Apex in particular) iterating a subquery result with many child records can fail with an "Aggregate query has more child records than can be processed" error.
Risky: Inner Query
List<Account> accs = [
    SELECT Id, Name,
        (SELECT Id, Subject FROM Cases WHERE Status = 'Open')
    FROM Account
    WHERE Type = 'Customer'
];
Safer: Two Queries + Map
List<Account> accs = [SELECT Id, Name FROM Account WHERE Type = 'Customer'];
Set<Id> accIds = new Map<Id, Account>(accs).keySet();
Map<Id, List<Case>> casesByAccount = new Map<Id, List<Case>>();
for (Case c : [SELECT Id, AccountId, Subject FROM Case
               WHERE AccountId IN :accIds AND Status = 'Open']) {
    if (!casesByAccount.containsKey(c.AccountId)) {
        casesByAccount.put(c.AccountId, new List<Case>());
    }
    casesByAccount.get(c.AccountId).add(c);
}
Mistake 7: Using LIKE for Exact Matches
Bad: LIKE for Exact Match
List<Account> accs = [SELECT Id FROM Account WHERE Email__c LIKE 'user@example.com'];
Good: Equality Operator
List<Account> accs = [SELECT Id FROM Account WHERE Email__c = 'user@example.com'];
Use LIKE only when you genuinely need wildcard matching (LIKE '%partial%'). A leading wildcard (LIKE '%value') always forces a full table scan.
Mistake 8: Ignoring Null Handling in Filters
Queries using = null generally cannot use standard indexes, because standard indexes exclude null values. The syntax = null is valid SOQL, but on most indexed fields it behaves like a full table scan. If you frequently query for null values, consider a checkbox field that is explicitly set instead.
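Keeping such a flag accurate requires a small piece of automation. A minimal sketch, assuming a hypothetical Has_Account__c checkbox on Contact:

```apex
trigger ContactAccountFlag on Contact (before insert, before update) {
    // Keep the hypothetical Has_Account__c checkbox in sync so other code
    // can filter on it instead of querying AccountId = null
    for (Contact c : Trigger.new) {
        c.Has_Account__c = (c.AccountId != null);
    }
}
```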
Problematic: Null Check
List<Contact> contacts = [SELECT Id FROM Contact WHERE AccountId = null];
Better: Explicit Flag
List<Contact> contacts = [SELECT Id FROM Contact WHERE Has_Account__c = false];
Mistake 9: Querying in Triggers Without Limiting Scope
Bad: Unscoped Query
trigger OpportunityTrigger on Opportunity (after update) {
    List<Opportunity> allOpps = [SELECT Id FROM Opportunity
        WHERE AccountId IN :accountIds]; // accountIds built from Trigger.new
}
Good: Change-Filtered Query
trigger OpportunityTrigger on Opportunity (after update) {
    Set<Id> changedAccountIds = new Set<Id>();
    for (Opportunity opp : Trigger.new) {
        Opportunity oldOpp = Trigger.oldMap.get(opp.Id);
        if (opp.StageName != oldOpp.StageName) { // Only process changed records
            changedAccountIds.add(opp.AccountId);
        }
    }
    if (changedAccountIds.isEmpty()) return; // Early exit if nothing changed
    List<Opportunity> relevantOpps = [SELECT Id FROM Opportunity
        WHERE AccountId IN :changedAccountIds AND StageName = 'Closed Won'];
}
Always filter trigger context queries by what actually changed, not by the full Trigger.new set.
Mistake 10: Aggregate Queries Without GROUP BY Selectivity
Bad: Non-Selective Aggregate
AggregateResult[] results = [
    SELECT COUNT(Id), StageName FROM Opportunity
    WHERE IsDeleted = false
    GROUP BY StageName
];
Better: Date-Bounded
AggregateResult[] results = [
    SELECT COUNT(Id), StageName FROM Opportunity
    WHERE CloseDate = THIS_FISCAL_YEAR
    GROUP BY StageName
];
Aggregate queries (COUNT, SUM, GROUP BY) still obey selectivity rules. A COUNT(Id) over 5 million records with a non-selective filter will time out in production.
Benchmark Reality Check
I want to give you concrete numbers from Salesforce's own documentation and common experience:
| Scenario | Rows in Org | Query Time (approx) | Risk |
|---|---|---|---|
| Id filter on 10M records | 10M | < 50ms | None |
| Indexed field, selective | 10M | 50-200ms | Low |
| Non-selective on Status field | 10M | 2-10 seconds | High |
| Full table scan, 10M records | 10M | Timeout / Error | Critical |
The difference between a query that takes 50ms and one that times out is almost always selectivity โ whether the WHERE clause enables Salesforce to use an index to jump directly to the relevant rows.
The Quick Audit Process
Full SOQL audit process (step by step)
When I audit SOQL performance in an org, I follow this process:
- Export all SOQL from triggers, classes, and flows using VS Code SOQL extension
- Run each query through the Query Plan tool with large record counts in mind
- Flag any query with Cost > 1.0 for rewriting
- Check all trigger handlers for SOQL inside loops
- Verify Batch Apex start() methods use Database.getQueryLocator
- Review aggregate queries for date or selective field boundaries
This takes about an hour for a medium-sized org and has saved me from multiple production incidents.
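Alongside the audit, it can help to instrument suspect code paths directly; a sketch using the standard Limits class, dropped at the end of a trigger handler or service method:

```apex
// Log how much of the SOQL budget this code path actually consumed
System.debug(LoggingLevel.WARN, 'SOQL: ' + Limits.getQueries()
    + '/' + Limits.getLimitQueries()
    + ' | Rows: ' + Limits.getQueryRows()
    + '/' + Limits.getLimitQueryRows());
```

Watching these numbers in debug logs under realistic data volumes surfaces query-hungry paths before production does.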
What SOQL pattern have you found causes the most problems in your experience? I am particularly curious whether people encounter the aggregate query issues or if loop-based queries remain the dominant culprit.