ADM-201 Managing Data

Managing Data Detailed Explanation

Effective data management is essential in Salesforce to ensure the accuracy, consistency, and security of your organization’s data. This includes importing new data, cleaning up existing data, and exporting data for backups or analysis.

6.1 Data Import

Data import allows you to add new records to Salesforce or update existing records efficiently. Depending on the volume of data and complexity, you can choose between two primary tools: Data Import Wizard and Data Loader.

Data Import Wizard

What It Is

  • A Salesforce-provided tool for importing small to medium volumes of data.
  • Best suited for users who need a simple, guided import process.

Features

  • Duplicate Matching and Merging:
    • Detects duplicates during the import process and merges them based on matching criteria.
  • Objects Supported:
    • Standard objects like Accounts, Contacts, Leads, and custom objects.
  • File Formats:
    • Accepts CSV files.

How to Use

  1. Navigate to Setup and search for Data Import Wizard.
  2. Select the object you want to import data into (e.g., Leads or Accounts).
  3. Upload your CSV file.
  4. Map the CSV fields to Salesforce fields.
    • Example: Map "Account Name" in your file to the "Account Name" field in Salesforce.
  5. Review and confirm the import.

When to Use

  • Small datasets (less than 50,000 records).
  • When you need to import and merge records in a user-friendly interface.

Practical Example

You have a CSV file with 200 new Leads. Use the Data Import Wizard to upload the file, map the fields, and import the records without creating duplicates.
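The field-mapping step in the wizard can be pictured as a simple translation from CSV column headers to Salesforce field API names. Below is a minimal Python sketch of that idea; the CSV contents and the header-to-field mapping are illustrative, not what the wizard actually runs internally.

```python
import csv
import io

# Hypothetical CSV for a Lead import; headers and values are illustrative.
csv_text = """Company,Last Name,Email
Acme Corp,Smith,smith@acme.example
Globex,Jones,jones@globex.example
"""

# The wizard's mapping step, expressed as a dict from CSV header
# to Salesforce field API name.
field_map = {"Company": "Company", "Last Name": "LastName", "Email": "Email"}

def map_rows(text, mapping):
    """Translate CSV rows into records keyed by Salesforce field API names."""
    reader = csv.DictReader(io.StringIO(text))
    return [{mapping[header]: value for header, value in row.items()}
            for row in reader]

records = map_rows(csv_text, field_map)
```

After mapping, each record is keyed the way Salesforce expects (`LastName` rather than "Last Name"), which is exactly the correspondence you confirm on the wizard's mapping screen.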

Data Loader

What It Is

  • A powerful client application for importing, exporting, and updating large volumes of data.
  • Suitable for more advanced users or admins handling large datasets.

Features

  • Handles up to 5 million records in one operation.
  • Allows complex operations like upsert (update existing records and insert new ones) and delete.
  • Requires manual configuration for duplicate detection.

How to Use

  1. Download and install the Data Loader application from Salesforce.
  2. Log in to Data Loader with your Salesforce credentials.
  3. Select the operation you want to perform (e.g., Insert, Update, Upsert).
  4. Upload your CSV file.
  5. Map CSV columns to Salesforce fields.
  6. Execute the operation and review the results.

When to Use

  • Large datasets (over 50,000 records).
  • Advanced data management tasks like mass deletion or bulk updates.

Practical Example

Your organization needs to update 200,000 Account records with new industry data. Use Data Loader to perform a bulk update efficiently.
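Bulk tools like Data Loader do not send all 200,000 records in one request; they split the job into smaller batches of API calls. The sketch below shows that chunking logic in plain Python (batch size and record shapes are illustrative; it is not Data Loader's actual code).

```python
def batches(records, size=200):
    """Yield successive fixed-size batches, mirroring how a bulk tool
    splits one large job into many smaller API calls."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# A small stand-in dataset: 1,000 Account records to update.
# (The 200,000-record scenario above would produce 1,000 batches of 200.)
accounts = [{"Id": f"001{i:015d}", "Industry": "Technology"} for i in range(1000)]
chunks = list(batches(accounts, size=200))
```

Every record appears in exactly one batch, so a failure in one batch can be retried without re-sending the whole job.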

6.2 Data Cleansing

Data cleansing ensures that your Salesforce data remains accurate, consistent, and free of duplicates.

Duplicate Rules

What They Are

  • Prevent duplicate records from being created by setting up matching rules.
  • Ensure data quality by alerting users when they try to create a duplicate record.

How to Configure

  1. Go to Setup and search for Duplicate Rules.
  2. Choose the object (e.g., Contacts, Leads).
  3. Define or select the matching rule that identifies potential duplicates (e.g., match records on the Email field).
  4. Set the action:
    • Alert: Notify the user about a potential duplicate.
    • Block: Prevent the duplicate record from being saved.

Practical Example

  • A Duplicate Rule for Contacts alerts users when they attempt to create a record with the same email address as an existing Contact.
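The matching logic behind a rule like this can be sketched in a few lines of Python. This is a simplified, case-insensitive email match for illustration only; real Salesforce matching rules support fuzzy matching and multiple fields.

```python
def find_duplicates(existing, incoming, key="Email"):
    """Return incoming records whose key matches an existing record,
    compared case-insensitively (a simplified matching rule)."""
    seen = {rec[key].lower() for rec in existing if rec.get(key)}
    return [rec for rec in incoming if rec.get(key, "").lower() in seen]

contacts = [{"Name": "Ada Lovelace", "Email": "ada@example.com"}]
new_entries = [
    {"Name": "A. Lovelace", "Email": "ADA@example.com"},    # duplicate email
    {"Name": "Grace Hopper", "Email": "grace@example.com"},  # unique
]
flagged = find_duplicates(contacts, new_entries)
```

A record flagged this way would trigger the rule's configured action: an alert the user can override, or a block that prevents the save.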

Merge Tools

What They Are

  • Tools that combine duplicate records into one, ensuring all associated data remains intact.

How to Merge Records

  1. Open one of the duplicate records, or open the Duplicate Record Set created by your duplicate rules.
  2. Click View Duplicates and select up to three records to merge.
  3. Use the Merge button (available in Lightning Experience).
  4. Choose the master record and specify which field values to retain from each record.
  5. Confirm the merge.

Practical Example

  • Two duplicate Accounts, "ABC Inc." and "ABC Incorporated," exist. Use the Merge tool to combine them into a single "ABC Inc." record while retaining associated Opportunities and Contacts.
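The field-selection step of a merge can be thought of as: keep the master's values, and fill in any blanks from the duplicates. The Python sketch below illustrates that policy with made-up Account data; in the real tool you pick each surviving value explicitly, and related Opportunities and Contacts are reparented automatically.

```python
def merge_records(master, duplicates):
    """Keep the master's values; fill any blank fields from the
    duplicates, in order (a simplified field-selection policy)."""
    merged = dict(master)
    for dup in duplicates:
        for field, value in dup.items():
            if not merged.get(field) and value:
                merged[field] = value
    return merged

abc_inc = {"Name": "ABC Inc.", "Phone": "", "Website": "abc.example"}
abc_incorporated = {"Name": "ABC Incorporated", "Phone": "555-0100", "Website": ""}
result = merge_records(abc_inc, [abc_incorporated])
```

The surviving record keeps the master's name and website while gaining the phone number that only the duplicate had.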

6.3 Data Export

Regular data exports are essential for creating backups, migrating data, or performing external analysis.

Why Export Data?

  • Protect against accidental data loss.
  • Analyze data outside of Salesforce using tools like Excel or Tableau.
  • Migrate data to other systems.

How to Export Data

Manual Data Export

  1. Navigate to Setup and search for Data Export.
  2. Select the objects you want to export (e.g., Accounts, Opportunities).
  3. Choose your export options; the export is generated as CSV files (delivered in zip archives).
  4. Download the export file once it is generated.

Scheduled Data Export

  • Set up automatic, regular exports to ensure you always have an up-to-date backup.
  • Configure the frequency (e.g., weekly, monthly) and select the objects to export.

Practical Example

  • Your organization exports Opportunity data weekly to track sales performance in an external reporting system.
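An export file is just records serialized to CSV. As a minimal sketch (the Opportunity data and field list here are invented), producing one looks like this:

```python
import csv
import io

def export_csv(records, fields):
    """Serialize records to CSV text, like a weekly Opportunity export file."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=fields)
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

opportunities = [
    {"Name": "Renewal - Acme", "StageName": "Closed Won", "Amount": "12000"},
    {"Name": "New Biz - Globex", "StageName": "Prospecting", "Amount": "5000"},
]
csv_output = export_csv(opportunities, ["Name", "StageName", "Amount"])
```

A file in this shape can be opened directly in Excel or loaded into an external reporting tool such as Tableau.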

Step-by-Step Summary

1. Data Import

  • Use Data Import Wizard for small, simple imports with duplicate handling.
  • Use Data Loader for large or complex data operations.

2. Data Cleansing

  • Configure Duplicate Rules to prevent duplicate data entry.
  • Use Merge Tools to clean up existing duplicate records.

3. Data Export

  • Regularly export data for backups and external analysis.
  • Schedule exports to ensure consistent data snapshots.

Key Takeaways for Beginners

  • Choose the right tool for data imports based on volume and complexity.
  • Implement duplicate prevention strategies to maintain clean, accurate data.
  • Regularly back up your data to protect against accidental loss.

Managing Data (Additional Content)

1. Upsert (Updating or Inserting Records Efficiently)

Why is it important?

  • Upsert allows users to update existing records and insert new records in a single operation, avoiding duplicate data imports.
  • Unlike Insert, which only creates new records, Upsert checks for an existing record first and updates it if found.
  • The Data Import Wizard offers only a simplified "add new and update existing records" option; Data Loader provides a dedicated Upsert operation with External ID matching.

How to Perform an Upsert Operation

  1. Prepare a CSV file with data, ensuring it contains a unique identifier (e.g., Record ID or External ID).
  2. Open Data Loader and select Upsert.
  3. Upload the CSV file.
  4. Map fields, ensuring that the External ID or Record ID field is correctly mapped.
  5. Click Next and run the Upsert operation.

Example Scenario

  • A company imports customer data from an external system.
    • If the Email Address already exists, the system updates the existing record.
    • If the Email Address does not exist, a new record is created.

2. External ID (Ensuring Seamless Data Integration Across Systems)

Why is it important?

  • External IDs allow Salesforce to match records from external systems (such as ERP, CRM, or accounting software) without using Salesforce Record IDs.
  • This helps in data synchronization and avoids duplicate records.

How to Create an External ID

  1. Navigate to Setup → Object Manager → select the object → Click Fields & Relationships.
  2. Click New Field → Select Text, Number, or Email as the field type.
  3. In the Field Properties, check the External ID box.
  4. Save the field.
  5. Now, this External ID can be used in Upsert operations.

Example Scenario

  • A company synchronizes customer data between Salesforce and an ERP system.
    • The ERP system uses Customer ID (e.g., "ERP-001234") as the unique identifier.
    • The Customer ID is stored in Salesforce as an External ID, allowing Upsert to match records using this ID instead of Salesforce Record ID.

3. Mass Delete (Cleaning Up Unnecessary Records in Bulk)

Why is it important?

  • Over time, Salesforce accumulates redundant data (e.g., old Leads, inactive Accounts, duplicate Cases).
  • Mass Delete allows administrators to remove thousands of records at once.

How to Perform a Mass Delete

  1. Navigate to Setup → Search for Mass Delete Records.
  2. Select the object to delete records from (e.g., Leads, Accounts, Cases).
  3. Define filter criteria (e.g., delete Leads created more than a year ago).
  4. Click Delete.

Example Scenario

  • A company cleans up old Leads that:
    • Have not been contacted in over 2 years.
    • Do not have any related Opportunities.
    • Are marked as "Junk Leads".

4. Data Archiving (Reducing Storage Usage Without Losing Data)

Why is it important?

  • Salesforce has data storage limits, and storing old records may slow down performance.
  • Archiving data ensures historical records are preserved without consuming active Salesforce storage.

Best Practices for Data Archiving

  1. Identify data to archive (e.g., Opportunities closed 5+ years ago).
  2. Export data using Data Loader and store it in external storage (e.g., AWS, Google Cloud, or a database).
  3. Use Salesforce Big Objects to store large volumes of historical data without consuming standard storage.
  4. Set up automated archiving for records older than X years.

Example Scenario

  • A company archives old Case records that are more than 5 years old.
  • Instead of keeping them in Salesforce, the data is exported to a secure cloud database and removed from active Salesforce storage.
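The archiving decision is a partition of records by age. As a minimal sketch (case data and the five-year threshold mirror the scenario; field names are illustrative):

```python
from datetime import date

def partition_by_age(cases, today, max_age_years=5):
    """Split cases into (keep, archive) lists based on how long ago they closed."""
    keep, archive = [], []
    for case in cases:
        age_years = (today - case["ClosedDate"]).days / 365.25
        (archive if age_years > max_age_years else keep).append(case)
    return keep, archive

today = date(2024, 6, 1)
cases = [
    {"CaseNumber": "00001", "ClosedDate": date(2015, 3, 1)},  # ~9 years old
    {"CaseNumber": "00002", "ClosedDate": date(2023, 9, 1)},  # recent
]
keep, archive = partition_by_age(cases, today)
```

In practice the `archive` list would be exported (e.g., with Data Loader) and verified in external storage before the records are deleted from Salesforce.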

5. Salesforce Data Masking (Protecting Sensitive Information in Sandbox Environments)

Why is it important?

  • Data Masking prevents sensitive information from being exposed in sandbox environments where developers and testers have access.
  • It helps protect personally identifiable information (PII), such as:
    • Customer names
    • Phone numbers
    • Credit card details

How to Implement Data Masking

  1. Use Field-Level Security (FLS) to hide sensitive fields.
  2. Enable Salesforce Shield (Platform Encryption) for encrypting critical data.
  3. Use data anonymization tools to replace real customer data with fake but realistic data.

Example Scenario

  • A testing sandbox is created for developers.
    • Instead of showing real customer names, the system replaces them with placeholders such as "John Doe" and "Jane Smith".
    • Credit card numbers are scrambled to "XXXX-XXXX-XXXX-1234".
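The two masking transformations in this scenario are easy to sketch in Python. This is an illustrative anonymization routine, not Salesforce's Data Mask product: names are swapped for a fixed placeholder and card numbers keep only their last four digits.

```python
import re

def mask_name(_real_name):
    """Replace any real customer name with a fixed placeholder."""
    return "John Doe"

def mask_card(number):
    """Keep only the last four digits, as in 'XXXX-XXXX-XXXX-1234'."""
    digits = re.sub(r"\D", "", number)  # strip spaces, dashes, etc.
    return "XXXX-XXXX-XXXX-" + digits[-4:]

masked_card = mask_card("4111 1111 1111 1234")
masked_name = mask_name("Ada Lovelace")
```

Because the masked values still look realistic, testers can exercise page layouts, validation rules, and automation without ever seeing real PII.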

6. Data Governance (Ensuring Data Quality and Consistency)

Why is it important?

  • Poor data quality leads to duplicate records, incorrect reports, and operational inefficiencies.
  • A strong data governance strategy ensures that Salesforce data remains clean, consistent, and reliable.

Best Practices for Data Governance

  • Standardize Data Entry: Use picklists instead of free text to maintain consistency.
  • Validation Rules: Prevent users from entering incorrect or incomplete data.
  • Regular Data Audits: Review records to remove duplicates and outdated information.
  • User Training: Ensure end users understand data entry best practices.

Example Scenario

  • A company ensures that:
    • All Accounts have at least one Contact.
    • Emails follow a valid format (e.g., "[email protected]").
    • Phone numbers are stored in a standardized format.

Final Summary

  • Upsert: Updates existing records and inserts new ones in one operation. Best for avoiding duplicate data imports.
  • External ID: Identifies records based on external system IDs. Best for data synchronization with ERP or other systems.
  • Mass Delete: Deletes large volumes of outdated records. Best for removing old Leads, Accounts, and Cases.
  • Data Archiving: Moves old records to external storage to free up space. Best for archiving closed Opportunities, Cases, and historical records.
  • Salesforce Data Masking: Protects sensitive data in sandbox environments. Best for hiding customer PII from developers.
  • Data Governance: Implements data quality best practices. Best for ensuring accuracy and consistency of Salesforce data.

Frequently Asked Questions

What is the difference between Data Import Wizard and Data Loader?

Answer:

Data Import Wizard is simpler and web-based for smaller jobs, while Data Loader is more powerful and better for large or complex data operations.

Explanation:

The Data Import Wizard is designed for administrators who need an easier interface and are working with supported standard objects and moderate data volumes. It includes basic matching options and is a good choice for common imports such as Leads, Contacts, or Accounts. Data Loader is a client application that supports a wider range of objects and actions, including insert, update, delete, export, and upsert. It is better for large-scale operations and advanced administrative work. On the exam, the decision usually depends on scale and flexibility. If the scenario involves many records, unsupported objects, scheduled processing, or advanced operations, Data Loader is typically the correct answer. If it is a smaller, simpler import, Data Import Wizard is often preferred.

Demand Score: 90

Exam Relevance Score: 95

What does upsert mean in Salesforce?

Answer:

Upsert updates an existing record if a match is found, or inserts a new record if no match exists.

Explanation:

Upsert combines update and insert into a single operation, making it very useful during migrations and integrations. Salesforce determines whether a record already exists by using a record ID or an external ID field. If a matching record is found, the existing record is updated; if no match is found, Salesforce creates a new one. This helps administrators avoid duplicate creation while still loading new data efficiently. On the exam, upsert is often the right choice when the system must process mixed datasets containing both new and existing records. The key concept is matching logic: without a reliable identifier such as an external ID, upsert cannot work correctly. That makes field design and data quality important parts of the solution.

Demand Score: 84

Exam Relevance Score: 91

What is the purpose of duplicate management in Salesforce?

Answer:

Duplicate management helps detect and control duplicate records so the database stays accurate and trustworthy.

Explanation:

Salesforce duplicate management uses matching rules and duplicate rules. Matching rules define how records are compared, such as by email, name, or phone number. Duplicate rules define what happens when a possible duplicate is found, such as blocking the save or warning the user. This is especially important for Leads, Contacts, and Accounts, where duplicates can damage sales visibility, reporting accuracy, and customer experience. On the exam, administrators should know that duplicate management is preventative and configurable. It does not automatically clean all existing data, but it does help stop new duplication and manage user behavior. Good duplicate control reduces confusion, improves analytics, and supports better automation outcomes.

Demand Score: 81

Exam Relevance Score: 89

When should an administrator use an external ID field?

Answer:

Use an external ID when records need to be matched with data from another system during imports, updates, or integrations.

Explanation:

An external ID is a custom field marked as a unique or indexed identifier for data coming from outside Salesforce. It is commonly used when another system, such as an ERP or billing platform, has its own customer or transaction ID. During imports or integrations, Salesforce can use that field to match records without relying on internal Salesforce record IDs. This becomes especially useful for upsert operations and relationship loading. On the exam, the right answer often appears when the business needs to sync or reconcile records between systems. External IDs improve efficiency, reduce duplicate risk, and allow cleaner automation between platforms. They are one of the most important data management concepts for practical administration.

Demand Score: 76

Exam Relevance Score: 88

Why is data quality important in Salesforce administration?

Answer:

Because poor data quality leads to unreliable reports, broken automation, user distrust, and weak business decisions.

Explanation:

Salesforce depends on structured, accurate, and consistent data. If records are incomplete, duplicated, or inconsistent, dashboards become misleading, flows may behave incorrectly, and users may stop trusting the system. Administrators support data quality through validation rules, picklists, duplicate management, controlled imports, and clear ownership processes. Exam questions often present symptoms such as inconsistent reports or automation errors and expect the candidate to recognize that the real issue is poor data governance. Good data quality is not just a cleanup task; it directly affects user adoption, forecasting, service performance, and executive decision-making. For an administrator, protecting data quality is one of the most business-critical responsibilities.

Demand Score: 73

Exam Relevance Score: 84