Inserting data into Salesforce using Data Loader from the command line

Posted on 21-01-2013 09:12 by graham
This tutorial shows how to configure Salesforce Data Loader so that it can be used from the command line.

Downloading and installing Data Loader


First, you need to download Data Loader and install it on your system. Once this is done, find the location where it resides - on Windows it will usually be c:\Program Files\salesforce.com\Apex Data Loader 23.0\.
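
The command-line tools used later in this tutorial (process.bat and encrypt.bat) live in the bin subdirectory of the installation. Assuming the default path mentioned above, you can quickly check that they are there:
dir "c:\Program Files\salesforce.com\Apex Data Loader 23.0\bin"

# Output (trimmed - you should see at least these two files)
encrypt.bat
process.bat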

Creating the configuration


Data Loader uses two configuration files: process-conf.xml and config.properties. The names of the files are fixed and cannot be changed.

Creating the process-conf.xml file


We'll start with the main configuration file, process-conf.xml. First, create a directory where you will keep all your configuration - let's call it account-load. In this directory, create a file called process-conf.xml with the following content:
process-conf.xml
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>
    <bean id="insertAccounts"
          class="com.salesforce.dataloader.process.ProcessRunner"
          singleton="false">
        <description>Inserting new accounts</description>
        <property name="name" value="insertAccounts"/>
        <property name="configOverrideMap">
            <map>
                <entry key="sfdc.debugMessages" value="false"/>
                <entry key="sfdc.debugMessagesFile" value="test.log"/>
                <entry key="sfdc.endpoint" value="https://test.salesforce.com"/>
                <entry key="sfdc.username" value="username@domain.com"/>
                <entry key="sfdc.password" value="af6ea5a96e45e10ff8042247cf4eec85"/>
                <entry key="sfdc.timeoutSecs" value="540"/>
                <entry key="sfdc.loadBatchSize" value="200"/>
                <entry key="sfdc.entity" value="Account"/>
                <entry key="process.operation" value="insert"/>
                <entry key="process.mappingFile" value="c:\path\to\your\directory\mapping.sdl"/>
                <entry key="process.outputError" value="c:\path\to\your\directory\errors.csv"/>
                <entry key="process.outputSuccess" value="c:\path\to\your\directory\success.csv"/>
                <entry key="dataAccess.name" value="c:\path\to\your\directory\new-accounts.csv"/>
                <entry key="dataAccess.type" value="csvRead"/>
                <entry key="process.initialLastRunDate" value="2007-06-06T00:00:00.000-0800"/>
            </map>
        </property>
    </bean>
</beans>

The names of the properties are mostly self-explanatory, but let's go through the important ones:
- sfdc.username - your Salesforce username
- sfdc.password - your encrypted Salesforce password - how to encrypt the password is explained further in this tutorial
- process.operation - the type of operation - can be insert, upsert, update or delete (a short upsert example follows this list)
- process.mappingFile - a file that describes the mapping between the columns in the source CSV file and the Salesforce object fields - this is also explained below
- dataAccess.name - location of the file that contains the data you want to insert, in CSV format
- process.outputError - location of a file into which all rows that could not be processed will be written
- process.outputSuccess - location of a file into which all rows that have been successfully processed will be written
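
For illustration, if you wanted to upsert instead of insert, the relevant entries would look roughly like this - sfdc.externalIdField (which, to the best of my knowledge, names the external ID field used to match existing records) is the only addition, and My_External_Id__c is just a hypothetical custom field:
<entry key="process.operation" value="upsert"/>
<entry key="sfdc.externalIdField" value="My_External_Id__c"/>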

Creating the config.properties file


This is a simple file that contains only the endpoint URL of the server you will be logging in to. Create it in your working directory with the following content:
config.properties
#Loader Config
#Thu Sep 10 09:37:47 PDT 2009
sfdc.endpoint=https\://login.salesforce.com

or, if you are working on a sandbox:
config.properties
#Loader Config
#Thu Sep 10 09:37:47 PDT 2009
sfdc.endpoint=https\://test.salesforce.com

Encrypting the password


The password that is put into the configuration file needs to be encrypted with the encrypt utility that comes with Data Loader. You will find it at <data-loader-dir>/bin/encrypt (encrypt.bat on Windows). In this example, let's assume your password is "xyz". Execute:
encrypt -e xyz

# Output
af6ea5a96e45e10ff8042247cf4eec85

Copy the displayed value and paste it into process-conf.xml as the value of the sfdc.password property.
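
If the encrypt tool is not on your PATH, you can run it with its full path. Assuming the default Windows installation directory mentioned at the beginning, a session could look like this:
cd "c:\Program Files\salesforce.com\Apex Data Loader 23.0\bin"
encrypt.bat -e xyz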

Prepare a mapping file


In your working directory, create a mapping file called mapping.sdl. Its purpose is to define the correspondence between the columns in your CSV data file and the fields of the target Salesforce object.
mapping.sdl
Name=Name
First_Name__c=First_Name__c
Last_Name__c=Last_Name__c
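
In this example the CSV column headers happen to match the Salesforce field names, which makes the mapping look trivial. In general, the left-hand side is the column header from your CSV file and the right-hand side is the API name of the Salesforce field, so a data file with differently named columns (the headers below are made up for illustration) would be mapped like this:
ACCOUNT_NAME=Name
FIRSTNAME=First_Name__c
LASTNAME=Last_Name__c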

Prepare a CSV data file


This is the file referenced by the dataAccess.name property in process-conf.xml:
new-accounts.csv
"Name","First_Name__c","Last_Name__c"
"User1","Mark","Davies"
"User2","Jenny","Smith"

Inserting the data


Now that we are done configuring the job, run it:
process.bat "c:\path\to\your\directory\" insertAccounts

The meaning of the parameters is as follows:
process.bat <directory-where-configuration-resides> [optional-operation-name]

If you don't specify the operation name, all operations defined within the configuration file will be executed.
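
Note that process.bat itself lives in the bin directory of the Data Loader installation, so a complete run (again assuming the default installation path) could look like this:
cd "c:\Program Files\salesforce.com\Apex Data Loader 23.0\bin"
process.bat "c:\path\to\your\directory" insertAccounts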

Checking if insert succeeded


To check whether the insert operation succeeded, open the success.csv file. It will contain, in CSV format, all the records that were successfully inserted.
The errors.csv file, on the other hand, will contain the records for which the insert failed, along with an error message for each.
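
As a rough, hypothetical illustration (the exact column names and status text can differ between Data Loader versions), a success file for the data above could look like this, with the ID column holding the IDs of the newly created records:
success.csv
"NAME","FIRST_NAME__C","LAST_NAME__C","ID","STATUS"
"User1","Mark","Davies","001000000000001AAA","Item Created"
"User2","Jenny","Smith","001000000000002AAB","Item Created"
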
Comments
Hi,

Thank you very much for the info. I have a query -
In the line
<entry key="dataAccess.name" value="c:\path\to\your\directory\new-accounts.csv" />
instead of new-accounts.csv, I would like to give something like *.csv, where I am not sure about the file name. Could you please help?

Regards,
PN
Added on 24-04-2014 19:44 by anonymous
