
How to Build a Salesforce API Integration: Code Samples & Architecture

A comprehensive, technical blueprint for building a Salesforce API integration. Learn how to handle OAuth lifecycles, REST vs Bulk API 2.0, custom objects, and strict rate limits.

Roopendra Talekar · 14 min read

If you are a product manager or engineering lead looking to publish a hands-on Salesforce implementation guide with code samples for your internal team, or simply trying to ship a native integration without derailing your upcoming product roadmap, you already know the stakes. Salesforce is not just another integration. It is a mandatory requirement for selling software to mid-market and enterprise buyers.

Building a native Salesforce integration is notoriously difficult. The documentation is fragmented across decades of legacy features, the API limits are aggressively enforced, and every customer's instance is heavily customized. You cannot just map your data to a standard contact object and call it a day.

If you need to build a CRM integration connecting your B2B SaaS product with Salesforce, here is the short version: set up a Connected App for OAuth 2.0, use the REST API for real-time single-record operations, switch to Bulk API 2.0 for anything over 2,000 records, and plan your architecture around the 100,000-request daily limit before it surprises you in production.

This guide provides a highly technical, architectural blueprint for building a Salesforce API integration. We will walk through each step with code samples, covering authentication lifecycles, the architectural trade-offs between REST and Bulk APIs, dynamic custom object handling, and rate limit management.

The Business Case for a Native Salesforce Integration

Salesforce dominates the CRM market. According to IDC, in 2024, Salesforce led all CRM vendors with a 20.7% share and generated the highest revenue among all CRM vendors. This recognition marks the 12th consecutive year Salesforce has earned the top spot. In fiscal year 2025, Salesforce reported $37.9 billion in total revenue, up about 9% year-over-year.

What does that mean for your product? If you sell to enterprise accounts, more than 90% of Fortune 500 companies use Salesforce. When your sales team asks for a native Salesforce integration, they are responding to direct buyer demands. Enterprise buyers expect your product to read and write records in their system of record without requiring manual CSV uploads or brittle middleware scripts. Your prospects are not asking if you integrate with their CRM - they are asking how deeply. A missing Salesforce integration is a disqualifier during procurement. Not a nice-to-have.

However, the cost of building this infrastructure in-house is steep. Industry data shows that simple REST-only integrations with a few standard objects can run $10,000-$20,000 in engineering time. But the moment you need custom objects, bidirectional sync, Bulk API support, and multi-tenant OAuth management, budgets quickly climb past $100,000 to $150,000 - and that is before ongoing maintenance. Every Salesforce API version upgrade, every connected app policy change, every customer with a unique Deal_Registration__c object adds to the long-term burden.

You have to decide whether to build this infrastructure from scratch or leverage an abstraction layer. Before making that decision, you need to understand exactly what building it entails.

Authentication: Setting Up Salesforce OAuth 2.0

The first technical hurdle is authentication. Salesforce uses OAuth 2.0, but configuring it correctly for a multi-tenant SaaS application requires careful orchestration.

Salesforce supports three primary OAuth 2.0 flows for API integrations:

  • Authorization Code with PKCE - for apps acting on behalf of end users
  • Client Credentials - for server-to-server communication with no user context
  • JWT Bearer Token - for certificate-based machine-to-machine auth

For most B2B SaaS integrations where your customers connect their own Salesforce org, Authorization Code with PKCE is the right choice.

Creating the Connected App

Before you can use any OAuth 2.0 flow, you need to register your app in Salesforce as a Connected App (or its newer successor, an External Client App). This app provides the client_id (Consumer Key) and client_secret (Consumer Secret) needed for the OAuth flow.

Key Configuration Steps:

  1. In Salesforce Setup, navigate to App Manager and create a New Connected App.
  2. Enable OAuth Settings.
  3. Set your Callback URL (e.g., https://api.yoursaas.com/oauth/callback/salesforce).
  4. Select the necessary OAuth Scopes. At minimum, you need api (or full) and refresh_token, offline_access ("Perform requests on your behalf at any time") - Salesforce treats refresh_token and offline_access as synonyms for the same scope.

Security Warning: Never request more scopes than your application strictly requires. Over-permissioned OAuth apps are frequently flagged during enterprise security reviews, blocking deals.

Here is the authorization code exchange in practice:

// Step 1: Redirect the user to Salesforce's authorize endpoint
// (build the query string with URLSearchParams so no stray
// whitespace ends up in the URL)
const params = new URLSearchParams({
  response_type: 'code',
  client_id: CONSUMER_KEY,
  redirect_uri: CALLBACK_URL,
  scope: 'api refresh_token offline_access',
});
const authUrl = `https://login.salesforce.com/services/oauth2/authorize?${params}`;
 
// Step 2: Exchange the auth code for tokens
const tokenResponse = await fetch(
  'https://login.salesforce.com/services/oauth2/token',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      grant_type: 'authorization_code',
      code: authorizationCode,
      client_id: CONSUMER_KEY,
      client_secret: CONSUMER_SECRET,
      redirect_uri: CALLBACK_URL,
    }),
  }
);
 
const { access_token, refresh_token, instance_url } = await tokenResponse.json();

The September 2025 Connected App Restrictions

This is a gotcha that will bite you if you are not paying attention. Starting in early September 2025, Salesforce changed how connected apps work: a connected app that has not been installed in an org is blocked for most users. Salesforce also eliminated the OAuth Device Flow completely.

The practical impact: your customers' Salesforce admins must explicitly install your Connected App in their org before end users can authorize it. If you have been relying on "self-service" OAuth where any user can authorize, you now need an admin installation step in your onboarding flow.

The OAuth Token Lifecycle and Concurrency

Once a user authenticates, Salesforce issues an access token (typically valid for 1-2 hours) and a refresh token. Your integration must handle refresh tokens proactively. The common failure pattern here is concurrency.

If your application runs multiple background sync jobs or parallel API requests, they might all detect an expired access token simultaneously. Two concurrent API calls both race to refresh it, one succeeds, the other fails, and the losing caller retries with a now-invalid token. Worse, if several workers present the same refresh token at nearly the same moment, Salesforce may invalidate the token entirely, forcing the user to re-authenticate manually.

The fix is a locking mechanism that ensures only one refresh operation executes per account at a time - concurrent callers wait for the in-progress refresh instead of triggering their own.

sequenceDiagram
    participant Worker A
    participant Worker B
    participant Token Lock
    participant Salesforce API

    Worker A->>Token Lock: Request Lock for Account 123
    Token Lock-->>Worker A: Lock Granted
    Worker A->>Salesforce API: POST /services/oauth2/token (Refresh)
    Worker B->>Token Lock: Request Lock for Account 123
    Token Lock-->>Worker B: Lock Denied (Wait)
    Salesforce API-->>Worker A: New Access Token
    Worker A->>Token Lock: Release Lock & Update DB
    Token Lock-->>Worker B: Lock Granted (Use New Token)
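
The flow above can be sketched as a promise-based lock for a single Node process. This is a minimal illustration, not a production implementation: doRefresh is a placeholder for the actual POST to /services/oauth2/token with grant_type=refresh_token, and multi-process deployments need a distributed lock (e.g. Redis or a database row lock) instead of an in-memory map.

```javascript
// In-process lock: at most one token refresh runs per account at a time.
// Concurrent callers await the same in-flight promise instead of
// triggering their own refresh.
const inFlightRefreshes = new Map();

async function refreshAccessToken(accountId, doRefresh) {
  // If a refresh is already running for this account, piggyback on it
  if (inFlightRefreshes.has(accountId)) {
    return inFlightRefreshes.get(accountId);
  }
  const refreshPromise = doRefresh(accountId).finally(() => {
    inFlightRefreshes.delete(accountId); // release the "lock"
  });
  inFlightRefreshes.set(accountId, refreshPromise);
  return refreshPromise;
}
```

Because Node's event loop is single-threaded, the has/set pair cannot interleave, so two callers in the same process can never start duplicate refreshes for one account.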

For a deeper dive into setting up this connection, see our Integrate Salesforce: Step-by-step guide by Truto.

Salesforce REST API vs. Bulk API 2.0: Which Should You Use?

Salesforce enforces strict API limits. Hitting these limits causes integration failures across your entire customer base. To avoid this, you must choose the right API architecture for the right task.

The golden rule: Use the REST API for real-time, low-volume operations (under 2,000 records). Use Bulk API 2.0 for anything larger.

This is not a style preference - it is a constraint imposed by Salesforce's limit architecture.

The Standard REST API

The standard REST API is synchronous. You send a request, you get a response. It is ideal for real-time operations, single-record updates, fetching a specific account record immediately after a user clicks a button, or running SOQL queries that return small result sets.

Example: Querying Contacts via REST API

curl -X GET "https://your-instance.my.salesforce.com/services/data/v60.0/query/?q=SELECT+Id,FirstName,LastName,Email+FROM+Contact+WHERE+LastModifiedDate>=YESTERDAY" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN"

Example: Creating a Contact via REST API

curl -X POST \
  "https://your-instance.my.salesforce.com/services/data/v60.0/sobjects/Contact" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "FirstName": "Jane",
    "LastName": "Rodriguez",
    "Email": "jane@example.com",
    "AccountId": "001xx000003DGXYZ"
  }'

The problem with the REST API is pagination and volume. If you need to sync 50,000 contacts, making 50,000 individual GET or POST requests will instantly consume the customer's daily API quota.
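
For completeness, here is a sketch of draining a paginated query result by following nextRecordsUrl, which Salesforce returns relative to the instance URL whenever done is false. It assumes Node 18+ (global fetch); instanceUrl and accessToken are the values from the OAuth token response.

```javascript
// Follow nextRecordsUrl until done:true, collecting all records.
// Each page costs one API call against the daily quota.
async function queryAll(instanceUrl, accessToken, soql) {
  let url = `${instanceUrl}/services/data/v60.0/query/?q=${encodeURIComponent(soql)}`;
  const records = [];
  while (url) {
    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${accessToken}` },
    });
    if (!res.ok) throw new Error(`Query failed: ${res.status}`);
    const page = await res.json();
    records.push(...page.records);
    // done:false means more pages remain; nextRecordsUrl is relative
    url = page.done ? null : instanceUrl + page.nextRecordsUrl;
  }
  return records;
}
```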

Bulk API 2.0

For high-volume data syncs, you must use the Bulk API 2.0. Bulk API 2.0 is a RESTful API that allows you to perform large-scale data operations on Salesforce objects, such as insert, update, delete, and query. Salesforce automatically chunks the data for your job into multiple internal batches to improve performance while processing the job.

The workflow is asynchronous: you create a job, upload a batch of CSV data, close the job to start processing, and poll for the results.

Example: Creating an Ingest Job

# Step 1: Create an ingest job
curl -X POST \
  "https://your-instance.my.salesforce.com/services/data/v60.0/jobs/ingest" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "object": "Contact",
    "operation": "upsert",
    "externalIdFieldName": "External_Id__c",
    "contentType": "CSV",
    "lineEnding": "LF"
  }'

Example: Uploading and Closing the Job

# Step 2: Upload CSV data to the job
curl -X PUT \
  "https://your-instance.my.salesforce.com/services/data/v60.0/jobs/ingest/${JOB_ID}/batches" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: text/csv" \
  --data-binary @contacts.csv
 
# Step 3: Close the job to start processing
curl -X PATCH \
  "https://your-instance.my.salesforce.com/services/data/v60.0/jobs/ingest/${JOB_ID}" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"state": "UploadComplete"}'

Once closed, you poll the job status endpoint until the state changes to JobComplete, at which point you download the results or error logs.
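
The polling step might look like the following sketch (Node 18+ for global fetch; jobId comes from the create-job response, and the interval is a tuning choice, not a Salesforce requirement):

```javascript
// Poll a Bulk API 2.0 ingest job until it reaches a terminal state.
async function waitForBulkJob(instanceUrl, accessToken, jobId, intervalMs = 5000) {
  const url = `${instanceUrl}/services/data/v60.0/jobs/ingest/${jobId}`;
  for (;;) {
    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${accessToken}` },
    });
    if (!res.ok) throw new Error(`Job status check failed: ${res.status}`);
    const job = await res.json();
    if (job.state === 'JobComplete') return job; // fetch results/failures next
    if (job.state === 'Failed' || job.state === 'Aborted') {
      throw new Error(`Bulk job ${jobId} ended in state ${job.state}`);
    }
    await new Promise((r) => setTimeout(r, intervalMs)); // wait before re-polling
  }
}
```

Keep the polling interval generous: each status check is an API call, and large jobs can take minutes to hours.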

The Decision Matrix

Factor              | REST API                     | Bulk API 2.0
Record volume       | < 2,000 records              | > 2,000 records
Latency             | Synchronous (real-time)      | Asynchronous (minutes to hours)
Daily API call cost | 1 call per request           | 1 call per ~10,000 records
SOQL support        | Full SOQL                    | Limited SOQL (no compound fields)
Error handling      | Immediate per-record errors  | Batch error reports after completion
Use case            | UI-driven reads/writes       | Data migrations, nightly syncs

Users can expect a 2x to 6x improvement in export times when using the Bulk API 2.0 compared to the REST API.

Building infrastructure to handle both synchronous REST calls and asynchronous Bulk jobs requires significant engineering effort. You have to manage state, polling intervals, and CSV parsing logic on your backend.

Managing Salesforce API Rate Limits and Retries

Salesforce API limits are unforgiving. Salesforce enforces a 100,000 daily API request limit for Enterprise Edition orgs, plus 1,000 additional requests per user license. Limits are calculated on a 24-hour rolling basis rather than a fixed calendar day.

That sounds generous until you realize your integration is not the only thing hitting the API. When marketing automation, support tools, and business intelligence platforms all access Salesforce concurrently, combined usage often exceeds both daily quotas and concurrent request limits.

Beyond daily limits, Salesforce caps how many long-running (20+ second) API calls can run simultaneously - in production, this cap is typically 25 concurrent requests.

Handling Limit Errors in Your Integration

Here is what happens when you hit the wall: exceeding these limits results in blocked requests with HTTP 403 status and REQUEST_LIMIT_EXCEEDED error until usage drops. Alternatively, you may receive an HTTP 429 Too Many Requests error. If your integration does not handle this gracefully, data syncs will fail silently, leading to data corruption and angry support tickets.

Your integration needs to respect the Sforce-Limit-Info header that Salesforce returns with every response. It looks like this:

Sforce-Limit-Info: api-usage=45032/100000

Parse it. Track it. Set alerts at 80% capacity. When you hit limits, you must implement exponential backoff - waiting progressively longer intervals between retry attempts based on limit reset windows.
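
As a starting point, parsing the header and computing a backoff delay can be as simple as this sketch (the base and cap values are illustrative defaults, not Salesforce-mandated numbers):

```javascript
// Parse "api-usage=45032/100000" from the Sforce-Limit-Info header.
function parseApiUsage(headerValue) {
  const match = /api-usage=(\d+)\/(\d+)/.exec(headerValue || '');
  if (!match) return null;
  const used = Number(match[1]);
  const limit = Number(match[2]);
  return { used, limit, ratio: used / limit };
}

// Exponential backoff with jitter for 429/403 limit errors.
// Doubles the delay each attempt, capped at maxMs, with random
// jitter so retrying workers do not stampede in lockstep.
function backoffDelayMs(attempt, baseMs = 1000, maxMs = 60000) {
  const exp = Math.min(baseMs * 2 ** attempt, maxMs);
  return exp / 2 + Math.random() * (exp / 2); // jitter in [exp/2, exp)
}
```

Feed parseApiUsage the header from every response; when ratio crosses your alert threshold (say 0.8), pause background syncs and reserve the remaining quota for user-facing operations.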

Standardizing Rate Limit Headers

Handling rate limits across different APIs (Salesforce, HubSpot, NetSuite) is a nightmare because every provider formats their rate limit headers differently. If you use an abstraction layer like Truto, this complexity is normalized. Truto translates upstream rate limit information from Salesforce into standard IETF headers (ratelimit-limit, ratelimit-remaining, ratelimit-reset).

It is important to note that Truto does not automatically retry, throttle, or apply backoff on rate limit errors. When the Salesforce API returns an HTTP 429 or 403, Truto passes that exact error to the caller. This is an intentional architectural decision: the caller (your application) maintains total control over retry and backoff logic, ensuring you can prioritize critical operations over background syncs.

Learn more in our guide on Best Practices for Handling API Rate Limits and Retries Across Multiple Third-Party APIs.

Handling Salesforce Custom Objects and Fields

Standard objects like Account, Contact, and Opportunity are the easy part: you can read the docs, write your SOQL, and move on. But they represent only a fraction of how enterprise companies use Salesforce. Custom objects are where integration projects go to die.

Whenever a user creates a custom field in Salesforce, the system appends __c at the end of the field name. Custom objects get the same treatment. The API mechanics are identical for both standard and custom objects - that is the good news. The bad news is that you can't know the schema ahead of time.

The Schema Discovery Problem

Your test org has Account.Industry. Your first enterprise customer has Account.Industry_Vertical__c. Your second customer has a completely custom object called Deal_Registration__c with 47 fields that do not map to anything in your data model.

You can discover the schema dynamically using the Describe endpoint:

# Get all fields on a custom object
curl -X GET \
  "https://your-instance.my.salesforce.com/services/data/v60.0/sobjects/Deal_Registration__c/describe" \
  -H "Authorization: Bearer YOUR_ACCESS_TOKEN"

This returns every field, its type, whether it is nullable, picklist values, and relationship references. Use this during your post-connection setup to build dynamic field mapping UIs.
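
The describe payload is large; for a field mapping UI you typically only need a slice of it. Here is a minimal sketch of extracting just the custom fields (the exact properties kept are a design choice, not a Salesforce requirement):

```javascript
// Extract custom fields (API names ending in __c) from a describe
// response, keeping only what a mapping UI typically needs.
function extractCustomFields(describeResult) {
  return describeResult.fields
    .filter((f) => f.name.endsWith('__c'))
    .map((f) => ({
      name: f.name,                  // API name, e.g. Deal_Value__c
      label: f.label,                // human-readable label for the UI
      type: f.type,                  // e.g. currency, picklist, reference
      nillable: f.nillable,          // whether the field accepts null
      picklistValues: (f.picklistValues || []).map((p) => p.value),
    }));
}
```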

Querying Custom Objects with SOQL

SOQL (Salesforce Object Query Language) is how you retrieve data from custom objects. The syntax resembles SQL but with important differences - no wildcard selects, specific join syntax, and governor limits on query complexity.

-- Query a custom object with relationship traversal
SELECT Id, Name, Deal_Value__c, Account__r.Name, Account__r.Industry
FROM Deal_Registration__c
WHERE Stage__c = 'Approved'
  AND CreatedDate >= 2025-01-01T00:00:00Z
ORDER BY Deal_Value__c DESC
LIMIT 200

For lookup fields, replace __c with __r to access related object fields. You can go up to five levels deep in relationships.

The Hardcoded Schema Trap

The most common mistake engineering teams make is hardcoding the Salesforce schema into their database or integration code. They create a table with columns for first_name, last_name, and email. Then, an enterprise customer renames Lead_Source__c to Marketing_Channel__c, and your sync breaks silently.

If your schema is hardcoded, fulfilling this request requires a database migration, backend code changes, and a deployment. This approach does not scale across hundreds of customers.

To handle custom fields properly, your integration must query the object metadata dynamically or allow customers to map fields in your UI. When constructing queries, you must build SOQL statements programmatically.

// Example: Dynamically building a SOQL query with custom fields.
// Field names must come from a trusted allowlist (e.g. the Describe
// metadata), never raw user input, to avoid SOQL injection.
function buildSalesforceQuery(standardFields, customFields) {
  const allFields = [...standardFields, ...customFields].join(', ');
  return `SELECT ${allFields} FROM Contact WHERE LastModifiedDate > 2024-01-01T00:00:00Z`;
}
 
const query = buildSalesforceQuery(
  ['Id', 'FirstName', 'LastName', 'Email'],
  ['LTV_Score__c', 'Internal_ID__c']
);

For a detailed breakdown of how to manage these mappings at scale, read How to Handle Custom Salesforce Fields Across Enterprise Customers.

The Data-Driven Alternative: Declarative Mapping at Scale

Building a reliable Salesforce integration requires managing OAuth concurrency, implementing Bulk API polling, writing dynamic SOQL query builders, tracking the Sforce-Limit-Info header, and normalizing error handling. Doing this in-house will consume 3-6 months of engineering time. The long-tail complexity is what kills you - month 6 when you have 50 customers with 50 different Salesforce configurations and your integration engineer just quit.

There is a vastly superior architectural approach: declarative integration mapping via a unified API.

At Truto, we handle complex integrations like Salesforce without writing a single line of integration-specific code in our backend. Instead, we use a generic execution pipeline and store the integration logic as data using JSONata expressions. No if (provider === 'salesforce') branches. No per-integration code paths.

sequenceDiagram
    participant App as Your Application
    participant API as Unified API
    participant Config as Mapping Config<br>(JSONata)
    participant SF as Salesforce API

    App->>API: GET /unified/crm/contacts
    API->>Config: Resolve Salesforce mappings
    Config-->>API: SOQL query + field mappings
    API->>SF: GET /services/data/v60.0/query<br>?q=SELECT Id,FirstName...
    SF-->>API: { Id, FirstName, LastName, ... }
    API->>Config: Transform response
    Config-->>API: Normalized contact object
    API-->>App: { id, first_name, last_name, ... }

How Truto Maps Salesforce Data Declaratively

When a developer requests a unified list of CRM contacts through Truto, the platform executes a JSONata configuration specific to Salesforce. This configuration automatically translates standard query parameters into SOQL WHERE clauses and maps the PascalCase Salesforce response into a clean, unified JSON object.

Here is an example of the actual JSONata configuration used to map a Salesforce Contact response:

response_mapping: >-
  response.{
    "id": Id,
    "first_name": FirstName,
    "last_name": LastName,
    "name": $join($removeEmptyItems([FirstName, LastName]), " "),
    "title": Title,
    "account": { "id": AccountId },
    "email_addresses": [{ "email": Email }],
    "phone_numbers": $filter([
      { "number": Phone, "type": "phone" },
      { "number": MobilePhone, "type": "mobile" }
    ], function($v) { $v.number }),
    "created_at": CreatedDate,
    "updated_at": LastModifiedDate,
    "custom_fields": $sift($, function($v, $k) { $k ~> /__c$/i and $boolean($v) })
  }

Notice the custom_fields line. Truto automatically detects any field ending in __c and maps it into a custom_fields object. Your application receives the custom data without requiring any schema changes or database migrations.

Similarly, building the SOQL query is handled entirely via configuration:

query_mapping: >-
  (
    $whereClause := query
      ? $convertQueryToSql(
        query.{
          "created_at": created_at,
          "email_addresses": email_addresses ? $firstNonEmpty(email_addresses.email, email_addresses),
        },
        ["created_at", "email_addresses"],
        {
          "created_at": "CreatedDate",
          "email_addresses": "Email"
        }
      );
    {
      "q": query.search_term
        ? "FIND {" & query.search_term & "} RETURNING Contact(Id, FirstName, LastName, Email)",
      "where": $whereClause ? "WHERE " & $whereClause,
    }
  )

This architecture guarantees that you never have to write integration-specific logic in your codebase. You interact with a clean, RESTful unified API, and the platform handles the Salesforce-specific eccentricities. When Salesforce changes an API behavior, it is a config update, not a code deploy.

For a deep technical breakdown of this architecture, read Zero Integration-Specific Code: How to Ship API Connectors as Data-Only Operations.

Strategic Wrap-Up: Building for Scale

Whether you build in-house or use a platform, your Salesforce integration architecture needs to handle five things well:

  1. OAuth lifecycle management - proactive token refresh, concurrency control to prevent race conditions, and admin-gated Connected App installation.
  2. API selection logic - REST for real-time operations, Bulk API 2.0 for high-volume batches.
  3. Rate limit awareness - track the Sforce-Limit-Info header, implement exponential backoff, and set alerts before hitting the ceiling.
  4. Dynamic schema handling - use the Describe API for field discovery and build configuration-driven mappings that adapt per customer.
  5. Custom object support - full __c field extraction, relationship traversal via __r, and per-tenant mapping overrides.

Shipping a Salesforce integration is not a one-time project. It is an ongoing commitment to maintaining OAuth lifecycles, monitoring rate limits, supporting new custom objects, and responding to API deprecations.

If you build this entirely in-house, your engineering team will inevitably become an integrations maintenance team. By leveraging a declarative unified API, you offload the infrastructure burden. You get the depth of a native integration, the ability to read and write custom fields dynamically, and the reliability of distributed token management - all while keeping your engineering team focused on your core product. Furthermore, for anything the unified model does not cover, you can leverage a proxy API to interact directly with the Salesforce instance.

The worst outcome is shipping a half-built integration that only works with standard objects. Your enterprise customers will test it against their real orgs - with their custom fields, their automation rules, their unique object hierarchies - and if it breaks, you lose the deal.

FAQ

What is the difference between Salesforce REST API and Bulk API 2.0?
The REST API is synchronous and designed for real-time, single-record operations (under 2,000 records). Bulk API 2.0 is asynchronous and optimized for processing large datasets. It automatically batches records and helps you avoid strict daily API limits.
How do you handle Salesforce custom fields via API?
Custom fields and objects in Salesforce are appended with a __c suffix. To handle them dynamically, your integration must query the object metadata using the Describe endpoint and map these fields programmatically without hardcoding the schema.
What changed with Salesforce Connected Apps in September 2025?
Salesforce now blocks uninstalled Connected Apps for most users. Your customers' Salesforce admins must explicitly install your Connected App before end users can authorize OAuth connections. The OAuth Device Flow was also completely removed.
What happens when you hit Salesforce API rate limits?
Salesforce Enterprise Edition enforces a 100,000 daily API request limit. Exceeding it returns an HTTP 403 REQUEST_LIMIT_EXCEEDED or HTTP 429 Too Many Requests error. You must track the Sforce-Limit-Info header and implement exponential backoff.
How do you prevent OAuth token refresh race conditions?
If multiple background sync jobs detect an expired access token simultaneously, they can race to refresh it, causing Salesforce to invalidate the token entirely. You must implement a concurrency lock per integrated account to serialize token refreshes.
