Salesforce provides tools for automating business processes, either declaratively (without code) or programmatically (using code). These tools allow you to meet various business requirements, from simple field updates to advanced workflows and asynchronous operations.
Declarative tools allow you to automate processes without writing code. They are user-friendly and ideal for many business scenarios.
Programmatic automation is used when declarative tools can’t meet the requirements. It provides advanced flexibility through coding.
Triggers are pieces of code that execute automatically in response to database events, such as inserting, updating, or deleting records.
The following trigger updates the Description field of a Contact record before it is saved:
trigger ContactTrigger on Contact (before insert) {
    for (Contact con : Trigger.new) {
        con.Description = 'Welcome, ' + con.FirstName;
    }
}
Explanation:
- Trigger.new: a list of the records being processed. The loop iterates over each Contact and updates its Description field.

Best Practices:
- Bulkify your logic: operate on the full Trigger.new collection, and avoid SOQL queries or DML statements inside loops.
- Keep triggers thin: delegate the actual logic to a handler class.
Asynchronous tasks handle time-consuming processes, such as processing large data sets or calling external APIs, without blocking the user.
What are Future Methods?
Future methods are static methods annotated with @future that run asynchronously in a separate transaction with their own governor limits. They can accept only primitive parameters (or collections of primitives).
Example:
public class EmailNotifier {
    @future
    public static void sendEmail(String email) {
        // Logic to send the email asynchronously
        System.debug('Sending email to ' + email);
    }
}
Best Practices:
- A future method cannot call another future method.
- Pass record IDs rather than sObjects, because the records may change between enqueueing and execution.
- Use @future(callout=true) when the method makes HTTP callouts.
What is Batch Apex?
Batch Apex processes large data sets in smaller chunks by implementing the Database.Batchable<sObject> interface; each chunk runs with its own set of governor limits.
Key Components:
- start: Retrieves the data to be processed.
- execute: Processes each batch of data.
- finish: Executes post-processing logic.

Example:
global class BatchExample implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext context) {
        return Database.getQueryLocator('SELECT Id FROM Account');
    }
    global void execute(Database.BatchableContext context, List<Account> scope) {
        for (Account acc : scope) {
            acc.Description = 'Processed in batch';
        }
        update scope;
    }
    global void finish(Database.BatchableContext context) {
        System.debug('Batch processing completed.');
    }
}
What is Queueable Apex?
Queueable Apex runs asynchronous jobs that are submitted with System.enqueueJob. It is similar to @future but more flexible.

Features:
- Accepts non-primitive types, such as sObjects and custom classes, as member variables.
- Supports chaining: one job can enqueue another from its execute method.
- Returns a job ID that can be used to monitor the job.

Example:
public class QueueableExample implements Queueable {
    public void execute(QueueableContext context) {
        System.debug('Executing asynchronous job');
    }
}
What are Scheduled Jobs?
Scheduled jobs run Apex at specific times by implementing the Schedulable interface.

Use Cases:
- Nightly data cleanup or archiving.
- Periodic synchronization with external systems.
- Sending scheduled reports or reminders.
Example:
global class ScheduledJob implements Schedulable {
    global void execute(SchedulableContext sc) {
        System.debug('Scheduled job executed');
    }
}
To schedule:
ScheduledJob job = new ScheduledJob();
String cron = '0 0 12 * * ?'; // Noon daily
System.schedule('Daily Job', cron, job);
Declarative automation tools are great for straightforward processes, while programmatic automation provides the power to handle complex or large-scale tasks. Combining these approaches ensures you can automate virtually any business scenario in Salesforce effectively.
Workflow Rules can do:
- Update fields, send email alerts, create tasks, and send outbound messages.

Workflow Rules cannot do:
- Create new records, update unrelated records, or call Apex code.
| Feature | Workflow Rules | Process Builder |
|---|---|---|
| Multiple Actions per Rule | No | Yes |
| Create New Records | No | Yes |
| Call Apex Code | No | Yes |
| Scheduled Actions | Yes | Yes |
| Future Support | Being phased out | Being phased out (Use Flow instead) |
Salesforce is deprecating Process Builder in favor of Flow because Flow is more powerful and scalable.
If the process requires multiple steps or decisions → Use Flow
If the process needs to interact with multiple objects → Use Flow
If user input is required (e.g., a guided screen flow) → Use Flow
If automation needs to be migrated from Workflow or Process Builder → Use Flow
If the process requires complex calculations or loops → Use Apex
If the process requires external API calls (HTTP Callouts) → Use Apex
If the automation needs to run asynchronously (background jobs) → Use Apex
| Scenario | Use Flow? | Use Apex? |
|---|---|---|
| Simple field update | Yes | No |
| Create or update related records | Yes | No |
| Work across multiple objects | Yes | No |
| Perform external API calls | No | Yes |
| High-performance batch processing | No | Yes |
A Trigger Framework is a best practice for managing triggers in large applications.
Trigger Code (Minimal Logic)
trigger AccountTrigger on Account (before insert, after update) {
    AccountTriggerHandler.handleTrigger(Trigger.new, Trigger.oldMap);
}
Handler Class (Encapsulated Logic)
public class AccountTriggerHandler {
    public static void handleTrigger(List<Account> newRecords, Map<Id, Account> oldRecords) {
        for (Account acc : newRecords) {
            acc.Description = 'Handled by Trigger Framework';
        }
    }
}
Preventing Trigger Recursion: use static variables to track execution state within a transaction.
Example:
public class PreventRecursion {
    private static Boolean isTriggerRunning = false;

    public static void executeTriggerLogic() {
        if (isTriggerRunning) return;
        isTriggerRunning = true;
        try {
            // Trigger logic here...
        } finally {
            // Reset the flag even if the logic throws an exception
            isTriggerRunning = false;
        }
    }
}
Flow is a declarative tool, while Apex provides advanced control over logic. Choosing between them depends on complexity and performance needs.
| Scenario | Use Flow? | Use Apex? |
|---|---|---|
| Simple record updates | Yes | No |
| Automate approvals | Yes | No |
| Perform API callouts | No | Yes |
| Perform bulk processing (batch jobs) | No | Yes |
| Work with complex data structures | No | Yes |
Salesforce provides four main asynchronous execution methods: @future, Queueable, Batch Apex, and Scheduled Apex.
| Method | Use Case | Supports Chaining? | Supports Complex Objects? |
|---|---|---|---|
| @future | Lightweight, short-lived async jobs (e.g., API callouts) | No | No |
| Queueable | Similar to @future, but supports chaining and complex objects | Yes | Yes |
| Batch Apex | Process large datasets (e.g., bulk updates) | Yes | No |
| Scheduled | Run jobs at specific times (e.g., nightly cleanups) | Yes | No |
Using @future for API Callouts
public class AsyncExample {
    @future(callout=true)
    public static void callExternalAPI() {
        Http http = new Http();
        HttpRequest request = new HttpRequest();
        request.setEndpoint('https://example.com/api');
        request.setMethod('GET');
        HttpResponse response = http.send(request);
    }
}
Using Queueable for Complex Async Processing
public class QueueableExample implements Queueable {
    public void execute(QueueableContext context) {
        System.debug('Executing Queueable Job...');
    }
}
System.enqueueJob(new QueueableExample());
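Because Queueable supports chaining (unlike @future), one job can enqueue a follow-up job from its execute method. A minimal sketch, with hypothetical class names not taken from the original:

```apex
// Sketch: chaining one Queueable job from another (illustrative class names).
public class FirstStepJob implements Queueable {
    public void execute(QueueableContext context) {
        System.debug('First step complete');
        // Chain the next job; note that chained jobs cannot be enqueued
        // from within Apex tests.
        System.enqueueJob(new SecondStepJob());
    }
}

public class SecondStepJob implements Queueable {
    public void execute(QueueableContext context) {
        System.debug('Second step complete');
    }
}
```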
Using Batch Apex for Large Data Processing
global class BatchExample implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Account');
    }
    global void execute(Database.BatchableContext bc, List<Account> scope) {
        for (Account acc : scope) {
            acc.Name += ' - Updated';
        }
        update scope;
    }
    global void finish(Database.BatchableContext bc) {
        // Required by Database.Batchable; runs once after all batches complete
        System.debug('Batch processing completed.');
    }
}
Database.executeBatch(new BatchExample());
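Database.executeBatch also accepts an optional second argument that sets the scope (batch) size:

```apex
// Optional second argument sets how many records each execute call receives;
// the default is 200, and the maximum is 2,000.
Database.executeBatch(new BatchExample(), 100);
```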
The main asynchronous options in Apex are:
- @future (but limited functionality)
- Queueable Apex
- Batch Apex
- Scheduled Apex

When should a developer choose Apex triggers instead of record-triggered flows?
Apex triggers should be used when automation requires complex logic, cross-object operations, or large-scale processing beyond Flow capabilities.
Record-triggered flows can handle many automation scenarios such as field updates, sending emails, and simple logic. However, Apex triggers are more suitable when developers need complex algorithms, advanced error handling, integration with external services, or highly optimized bulk processing.
For example, if a requirement involves calculating values across hundreds of related records or integrating with a third-party API, Apex provides greater control and scalability. Salesforce recommends using declarative tools like Flow first, but when the logic becomes too complex or performance-critical, Apex triggers become the better solution.
Understanding when to choose Flow versus Apex is a frequent scenario-based question in the PDI exam.
Demand Score: 90
Exam Relevance Score: 94
What is the main purpose of bulkifying an Apex trigger?
Bulkification ensures the trigger can process multiple records efficiently in a single transaction without exceeding governor limits.
Salesforce processes records in batches, often up to 200 records at a time when operations occur through the API or data imports. If a trigger is written to handle only one record at a time, it may fail when processing larger batches.
Bulkified triggers operate on collections instead of individual records. Developers use lists, sets, and maps to process multiple records simultaneously and reduce the number of SOQL queries and DML operations.
For example, instead of querying related records inside a loop, developers collect IDs into a set and perform one query outside the loop. This approach avoids hitting limits such as the maximum number of queries per transaction.
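The pattern described above can be sketched as follows; the object and field choices here are illustrative, not from the original:

```apex
// Bulkification sketch: collect parent IDs in a set, query once outside
// the loop, then look results up in a map (illustrative object/field names).
public class OpportunityHandler {
    public static void setAccountIndustry(List<Opportunity> newOpps) {
        Set<Id> accountIds = new Set<Id>();
        for (Opportunity opp : newOpps) {
            if (opp.AccountId != null) {
                accountIds.add(opp.AccountId);
            }
        }
        // One SOQL query for the whole batch instead of one per record
        Map<Id, Account> accounts = new Map<Id, Account>(
            [SELECT Id, Industry FROM Account WHERE Id IN :accountIds]
        );
        for (Opportunity opp : newOpps) {
            Account acc = accounts.get(opp.AccountId);
            if (acc != null) {
                opp.Description = 'Account industry: ' + acc.Industry;
            }
        }
    }
}
```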
Bulkification is considered one of the most critical Apex development practices tested in the PDI certification.
Demand Score: 92
Exam Relevance Score: 96
Why can a trigger run multiple times during a single transaction?
Because automation tools like workflow rules or flows can update the same record again, causing triggers to fire repeatedly.
In Salesforce’s order of execution, certain automation tools may modify a record after triggers have already executed. For example, a workflow field update or Flow action may change a field value, which causes Salesforce to re-evaluate triggers for that record within the same transaction.
If developers do not design triggers carefully, this can lead to recursion, where the same logic runs repeatedly and may update records indefinitely. This behavior can cause governor limit exceptions or unintended data changes.
To prevent this, developers often use recursion control patterns such as static Boolean variables or sets that track processed record IDs. Understanding trigger recursion is important for debugging automation conflicts in real projects and appears frequently in certification scenarios.
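The set-based pattern mentioned above might look like this (a sketch; class and method names are illustrative):

```apex
public class AccountRecursionGuard {
    // Tracks records already processed in this transaction
    private static Set<Id> processedIds = new Set<Id>();

    public static void handleAfterUpdate(List<Account> newRecords) {
        List<Account> toProcess = new List<Account>();
        for (Account acc : newRecords) {
            // Skip records this trigger has already handled in this transaction
            if (!processedIds.contains(acc.Id)) {
                processedIds.add(acc.Id);
                toProcess.add(acc);
            }
        }
        // Run the real logic only on first-time records...
    }
}
```

Unlike a single Boolean flag, a set of IDs still allows the trigger to run for records that genuinely have not been processed yet in the same transaction.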
Demand Score: 84
Exam Relevance Score: 90
What is the difference between a before trigger and an after trigger in Apex?
Before triggers are used to modify record values before they are saved, while after triggers run after the record is committed to the database.
Before triggers allow developers to change field values directly on the records being saved without performing additional DML operations. This makes them efficient for tasks like populating calculated fields or enforcing business logic before data is stored.
After triggers execute once the record has already been saved. They are commonly used when developers need record IDs or must create or update related records in other objects.
Choosing the correct trigger type is important because some operations are only possible in specific contexts. For example, modifying values on Trigger.new is allowed in before triggers but not in after triggers.
Understanding the appropriate trigger context helps developers design efficient automation and avoid unnecessary database operations.
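To illustrate the two contexts side by side (an illustrative sketch, not code from the original): field values are set directly in the before context, while related records are created in the after context once record IDs exist.

```apex
trigger AccountContextExample on Account (before insert, after insert) {
    if (Trigger.isBefore) {
        // Before insert: modify fields directly; no extra DML needed
        for (Account acc : Trigger.new) {
            acc.Description = 'Created via trigger';
        }
    } else if (Trigger.isAfter) {
        // After insert: record IDs now exist, so related records can be created
        List<Task> tasks = new List<Task>();
        for (Account acc : Trigger.new) {
            tasks.add(new Task(WhatId = acc.Id, Subject = 'Follow up'));
        }
        insert tasks;
    }
}
```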
Demand Score: 80
Exam Relevance Score: 88
Why should developers avoid having multiple triggers on the same object?
Multiple triggers on the same object can create unpredictable execution order and make maintenance more difficult.
Salesforce does not guarantee the order in which multiple triggers on the same object execute. If several triggers perform related logic, they may interfere with each other or produce inconsistent results depending on which runs first.
To avoid this problem, developers typically implement a single trigger per object pattern. This trigger acts as an entry point that calls handler classes responsible for different pieces of logic.
Using a structured trigger framework improves readability, maintainability, and testability of the code. It also ensures that developers can control the order of execution within the handler class rather than relying on unpredictable system behavior.
This design pattern is widely recommended in Salesforce development best practices.
Demand Score: 77
Exam Relevance Score: 85
Why might a record-triggered Flow update fail when processing large data loads?
Because flows can still encounter governor limits when handling large batches of records.
Although Flows are declarative tools, they still operate within Salesforce’s governor limits. If a record-triggered flow performs many queries, loops, or record updates during bulk operations, it can exceed limits such as CPU time or database operations.
For example, a flow that queries related records inside a loop may work fine for one record but fail when processing hundreds during a data import. Developers must design flows with bulk processing in mind, similar to Apex triggers.
Best practices include reducing loops, retrieving data once, and using collection operations whenever possible. Understanding how declarative automation interacts with platform limits is a common troubleshooting topic for Salesforce developers and often appears in certification questions.
Demand Score: 81
Exam Relevance Score: 87