How to create a data flow to share data between AWS and Salesforce using Amazon AppFlow?

Reading Time: 10 minutes

In our previous blog, How to create a data flow to share data between AWS and Salesforce using Amazon AppFlow (Part 1)?, we discussed Amazon AppFlow, its features, benefits, and use cases. In this blog, we will see how to use the Amazon AppFlow service to create a data flow that shares data between AWS (an Amazon S3 bucket) and Salesforce.

Hands-On

In this hands-on, we will use the Amazon AppFlow service to create a data flow that shares data between AWS (an Amazon S3 bucket) and Salesforce. Amazon AppFlow provides two-way communication between AWS and SaaS-based systems. We will create a Salesforce Developer account, an S3 bucket to store the file containing the data, and a data flow using Amazon AppFlow that shares data between AWS and Salesforce. We will also add a formula to concatenate selected columns, along with validations and filters that determine which data is shared; the result will be stored as a CSV-formatted file in the S3 bucket we create.

To implement this, we will do the following:

  • If you do not have a Salesforce account, navigate to the Salesforce Developer signup page and create one.
  • Verify your email address and navigate to the Salesforce dashboard.
  • Log in to your AWS account and navigate to the AWS Management Console.
  • On the console, search for the S3 service and navigate to the S3 dashboard.
  • Create a new bucket with a unique name that will be used to store the file containing the shared data.
  • Navigate to the Amazon AppFlow service to configure the flow for the data-sharing process.
  • Explore the dashboard and create a new flow for the data to be shared.
  • Connect to your already existing or newly created Salesforce account to authorize the data-sharing process.
  • Map the fields, and add validations and filters based on your requirements.
  • Follow the five-step process to create the flow and review the configurations.
  • Execute the flow to begin the data-sharing process and test it.
  • Review the success message to check for the count of records shared based on the filters and validations.
  • Navigate to the S3 console and download the created file.
  • Change the file's extension to .csv and open it to review the data.

Create a Salesforce account (developer edition) using the following URL:

https://developer.salesforce.com/signup


Fill in the details in the form as shown below and click on register.

You will be navigated to the page as shown below. You are then required to verify the email address to proceed.


The verification email received will be as shown below. Click on Verify Account to verify your account and proceed with the execution.


Once you click on verify, you will be navigated to the page as shown below wherein you are required to enter a password.

Enter a new password and confirm the password as required. Select the security question and answer and click on Change Password once done.


Once done, you will be navigated to the dashboard. The dashboard might take a few minutes to be configured.

On success, you will see the dashboard as shown in the image below.


Now, log in to your AWS account and navigate to the AWS console.


Search for S3 and click on the service to open the S3 dashboard.


You will be navigated to the S3 dashboard as shown below. Click on Create Bucket to create a new bucket where the data file will be stored.

Fill in the details and configure your bucket.


Enter a name for your bucket, select the AWS region and select the Block all public access checkbox.

Scroll down, enable versioning, enable encryption for your bucket and add tags (if needed) for your bucket. Once done, click on Create bucket.

On success, you will see the message as shown in the image below and you will find the bucket in the list of the buckets.
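If you prefer to script the bucket setup, the same configuration can be expressed as request payloads for boto3's S3 client. This is a hypothetical sketch: the bucket name and region are placeholders, and the actual calls are shown commented since they require AWS credentials.

```python
# Request payloads mirroring the console steps above: create the bucket,
# enable versioning and default encryption, and block all public access.
# BUCKET and REGION are placeholders -- substitute your own values.

BUCKET = "my-appflow-data-bucket"  # bucket names must be globally unique
REGION = "us-east-1"

create_bucket_request = {
    "Bucket": BUCKET,
    # Outside us-east-1, also pass:
    # "CreateBucketConfiguration": {"LocationConstraint": REGION},
}

versioning_request = {
    "Bucket": BUCKET,
    "VersioningConfiguration": {"Status": "Enabled"},
}

encryption_request = {
    "Bucket": BUCKET,
    "ServerSideEncryptionConfiguration": {
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
}

public_access_block_request = {
    "Bucket": BUCKET,
    "PublicAccessBlockConfiguration": {
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
}

# With boto3 installed and credentials configured, you would run:
#   s3 = boto3.client("s3", region_name=REGION)
#   s3.create_bucket(**create_bucket_request)
#   s3.put_bucket_versioning(**versioning_request)
#   s3.put_bucket_encryption(**encryption_request)
#   s3.put_public_access_block(**public_access_block_request)
```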

Now, search for Amazon AppFlow on the AWS console and open the AppFlow dashboard.


You will be navigated to the Amazon AppFlow dashboard. You can scroll down to look at its various features, integrations and benefits.

Amazon AppFlow provides integration with the following external services to create a data-sharing flow.

Here are some of the benefits and features offered by Amazon AppFlow.


Scroll up to the top of the page and click on View Flows. On the Flows dashboard, click on Create flow.

You have to configure the flow by following a five-step process. Configure the settings as shown in the upcoming images.


Enter a name for your flow and a description for the flow (if needed).

Scroll down and check the data encryption checkbox (if needed). Add a tag for your flow (if needed).

In step 2, we need to select our data source, which is Salesforce.

Click on the dropdown and search and select Salesforce.


Under Choose Salesforce connection, select Create new connection.

A new modal will appear to create a new connection. For the Salesforce environment, select Production. The AWS KMS key is auto-selected.

Enter a name for your new connection as shown in the image below. Once done, click on continue.


A new window will open up as shown in the image below. Click on Allow to authorize the connection between AWS and Salesforce.

On success, you will see the connection added in the text box. Now, a new field will appear below the connection. Select Salesforce objects and proceed.

Click on the Choose Salesforce object drop-down and select the schema whose data you want to save. In our case, we selected the Account schema.
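The objects offered in this dropdown can also be discovered programmatically through boto3's "appflow" client. This is a hypothetical sketch: the connection name is a placeholder, and the calls are commented out because they require AWS credentials and an existing connection.

```python
# Request payloads for listing the Salesforce objects available through an
# AppFlow connection, and for describing one object's fields.
# CONNECTION is a placeholder -- use the connection name you created above.

CONNECTION = "my-salesforce-connection"

list_entities_request = {
    "connectorProfileName": CONNECTION,
    "connectorType": "Salesforce",
}

describe_entity_request = {
    "connectorEntityName": "Account",   # the schema we selected above
    "connectorProfileName": CONNECTION,
    "connectorType": "Salesforce",
}

# With credentials configured, you would run:
#   appflow = boto3.client("appflow")
#   entities = appflow.list_connector_entities(**list_entities_request)
#   fields = appflow.describe_connector_entity(**describe_entity_request)
```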

Click on the Choose destination drop-down and select Amazon S3 as the destination for your data file.

Once done, click on the Choose an S3 bucket drop-down and search for and select the bucket you created above. Add a prefix (if needed).


Expand the additional settings section to configure other settings. Choose the options for each configuration as shown in the image below. You may change it based on your requirements if needed.

Once done, scroll down, and for the flow trigger, choose either Run on demand or Run flow on schedule. For this blog, we will choose Run on demand. Click on Next.

For the map data fields configuration, select Manually map fields. Under source to destination field mapping, click on the drop-down.

From the drop-down, select Map all fields directly to map all the fields of the schema. Alternatively, you can select individual columns based on your preferences and requirements.


Once done, you will see all the fields mapped in the area below.


To apply a formula to selected fields, click on the drop-down again and select two or more fields to which you wish to apply the formula. Once done, click on Map fields with formula.

A modal will appear as shown in the image below. Click on the drop-down for formula and select the formula.


As you can see, Concatenate is one of the default formulas. Choose it, then enter a new name for the field into which the selected fields will be concatenated. Verify the details and click on Map fields with formula.

You can also add a formula by selecting the fields from the mapped fields as shown in the image below. Once selected, click on add formula.


Follow the same process as above and click on Map fields with formula.


Now, expand the additional settings option. Select the options if required for your configuration.

Let’s add a validation to the flow if you wish to apply conditions to the data. Expand the Validations tab to add a new validation.


For this blog, we added a validation on the Account Id field, as shown in the image below. With this validation, the shared data will not contain any record with an empty Account Id field. Once done, click on Next.

In the next step, you can add filters for the data if required.

For this blog, we added a filter on Account Name, as shown in the image below, such that only records whose account name contains the letter A will be shared. After adding the filter, click on Next.
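For reference, a validation and a filter like the ones above correspond to "tasks" in the shape expected by AppFlow's create_flow API. This is a hypothetical sketch: the exact operator strings for the Salesforce connector are assumptions, so verify them against the AppFlow API reference before use.

```python
# The validation (non-empty Account Id) and the filter (Account Name
# contains "A") expressed as AppFlow API task payloads. Operator names
# are assumed -- check the AppFlow API reference for your connector.

validate_task = {
    "taskType": "Validate",
    "sourceFields": ["Id"],
    "connectorOperator": {"Salesforce": "VALIDATE_NON_NULL"},
    "taskProperties": {},
}

filter_task = {
    "taskType": "Filter",
    "sourceFields": ["Name"],
    "connectorOperator": {"Salesforce": "CONTAINS"},
    "taskProperties": {"VALUE": "A"},
}

tasks = [validate_task, filter_task]
# These would go in the "tasks" list passed to appflow.create_flow(...),
# alongside one "Map" task per mapped field.
```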

In the last step of the flow, review and verify all the settings you configured in the above steps.


Once done, scroll down to the bottom of the page and click on Create flow.

On success, if all the configurations are correct, you will see a success message as shown in the image below. Now, to share the data, click on Run flow.

The flow execution might take a few minutes to run and store the data in your S3 bucket.

If your flow is configured properly, you will see a success message as shown in the image below. The message also gives you other details such as the transfer size, the count of records processed based on the filters and validations you applied, and a link to the S3 bucket where the data is stored. Click on the S3 bucket link to navigate there and download the file containing the data transferred from your Salesforce database.
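The same run-and-check cycle can be driven through boto3's "appflow" client instead of the console button. This is a hypothetical sketch: the flow name is a placeholder, and the calls are commented out because they require AWS credentials and the created flow.

```python
# Request payloads for triggering the flow and inspecting its latest run.
# FLOW_NAME is a placeholder -- use the name you gave your flow above.

FLOW_NAME = "salesforce-to-s3-demo"

start_flow_request = {"flowName": FLOW_NAME}
describe_runs_request = {"flowName": FLOW_NAME, "maxResults": 5}

# With credentials configured, you would run:
#   appflow = boto3.client("appflow")
#   appflow.start_flow(**start_flow_request)
#   runs = appflow.describe_flow_execution_records(**describe_runs_request)
#   latest = runs["flowExecutions"][0]
#   print(latest["executionStatus"])   # e.g. "Successful"
```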


You will see that a folder will be created in your S3 bucket. Open that folder to fetch the file.

You will see the file containing the data inside the folder. Select the file and download it. On your local machine, change the file's extension to .csv.

On opening the file, you will see the data that matches the criteria you applied in your AppFlow flow.
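As a quick sanity check, you can verify locally that the exported rows satisfy the two rules the flow enforced. The sketch below uses in-memory sample rows in place of the real download, which we assume was renamed to a .csv file.

```python
# Verify the flow's rules against the exported CSV: every Account Id is
# non-empty (validation) and every Account Name contains the letter "A"
# (filter). Sample rows stand in for the downloaded file.
import csv
import io

sample = io.StringIO(
    "Id,Name\n"
    "001xx0001,Acme Corp\n"
    "001xx0002,Alpha Industries\n"
)

rows = list(csv.DictReader(sample))
assert all(row["Id"] for row in rows)                   # validation check
assert all("a" in row["Name"].lower() for row in rows)  # filter check
print(f"{len(rows)} records passed the checks")
```

For the real file, replace the in-memory sample with `open("accounts.csv", newline="")` (the renamed download).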


Conclusion

In this blog, we created a Salesforce Developer account, an S3 bucket using the Amazon S3 service, and a data flow using the Amazon AppFlow service, along with formulas, validations, and filters. We explored how to integrate the Amazon AppFlow service with a Salesforce account by following a five-step process to configure the data-sharing flow. Amazon AppFlow provides two-way communication between AWS and SaaS-based systems, and it uses AWS PrivateLink to route the data flow between AWS and Salesforce, helping us fetch data as per our requirements. We will discuss more use cases of Amazon AppFlow in our upcoming blogs. Stay tuned for updates about our upcoming blogs on AWS and related technologies.

Meanwhile …

Keep Exploring -> Keep Learning -> Keep Mastering

This blog is part of our effort towards building a knowledgeable and kick-ass tech community. At Workfall, we strive to provide the best tech and pay opportunities to AWS-certified talents. If you’re looking to work with global clients, build kick-ass products while making big bucks doing so, give it a shot at workfall.com/partner today.
