Overview

Dynamic Customer Profiles (DCP) is a collection of your customers' attributes, including demographic data, behavioral characteristics, and any other information particular to your industry and customers. DCP provides a consolidated, dynamic view of your customers, enabling you to refine this view as you obtain more information and to take action based on it.

A single customer profile contains attributes collected by you, or by services that you use, and provided to Optimizely to create a single view of the customer. These attributes are organized and stored in Tables (also known as datasources) and linked across Tables using identity aliases.

You can use customer profiles to create audiences for targeting, and export them for use in other integrations or analysis.

Use the v1 REST API to configure DCP Services, Tables, and attributes. You can also use it to read and write customer profiles and to get and create aliases.

Currently, all DCP REST APIs use only classic tokens for authorization; they do not accept personal tokens to authorize API requests.

To enable DCP for your account, please contact techpartners@optimizely.com.

Remember, your terms of service prohibit you from collecting or sending any personally identifiable information (such as names, social security numbers, email addresses, or any similar data) to Optimizely's services or systems through Dynamic Customer Profiles or any other feature.

Please read the article on PII to learn more about sending data to Optimizely and handling personally identifiable information.

Customer Profiles

Customer Profiles are a collection of your customers' attributes across several Tables. The following APIs allow you to create, update, and read customer attributes for a single Table.

To use these APIs, we recommend that you first read the sections on DCP Services, Tables, and attributes.

Using the consolidated customer profile API call, you can retrieve the complete Customer Profile across all Tables.

See the API Reference for endpoints related to customer profiles.
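As a rough sketch, a profile read with a classic token might be assembled as below. The endpoint path, IDs, and token here are illustrative assumptions, not the documented routes; consult the API Reference for the exact URLs.

```python
import urllib.request

# Illustrative values -- substitute your own classic API token,
# DCP service ID, Table ID, and customer ID. The path shown here
# is an assumption; see the API Reference for the real routes.
TOKEN = "abcdefg123456"
URL = ("https://www.optimizelyapis.com/dcp/v1/"
       "services/123/tables/456/profiles/CUSTOMER_1")

# Classic tokens are sent in a Token header (DCP APIs do not
# accept personal tokens).
request = urllib.request.Request(URL, headers={"Token": TOKEN})

# urllib.request.urlopen(request) would perform the call; here we
# only inspect the request that was built.
print(request.get_method())         # GET
print(request.get_header("Token"))  # abcdefg123456
```

The same pattern (a `Token` header on every request) applies to the Table, attribute, and alias endpoints as well.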

Bulk Upload

You can upload a CSV (or TSV) file to the optimizely-import S3 bucket using the provided Table S3 path. We will parse the given CSV, validate each row of data against the registered attributes, and store the successfully processed rows. Each row is treated as an update request.

Using the provided AWS credentials, it's possible to upload CSV files in a variety of ways. The simplest approach is to use an S3 client application, such as Cyberduck.

Using Cyberduck

Once you have downloaded and installed Cyberduck, follow these steps to upload a file:

  1. Retrieve the datasource's AWS Access Key, AWS Secret Key and S3 Import Path from the Optimizely interface (Audiences > Attributes > More Actions > Data Upload).
  2. Launch Cyberduck and create a new Bookmark (Bookmark menu > New Bookmark).
  3. Enter a descriptive name for the datasource and your AWS Access Key.
  4. Click "More Options" to enter the S3 Import Path.

    Note: Cyberduck requires the full S3 import path, including both the optimizely-import bucket and the given s3_path, for example, when s3_path=dcp/567/678, use /optimizely-import/dcp/567/678

  5. View your bookmarks and double-click to connect to the S3 bucket. Cyberduck will prompt you to enter the AWS Secret Key from Step 1.

  6. You are now connected to your Datasource's S3 bucket! You can drag and drop files to be uploaded via the Cyberduck interface.

Please note that this process may change in future versions of Cyberduck. See the official Cyberduck website for current documentation.

Programmatically

It's also possible to upload files programmatically using the AWS CLI or one of the available AWS SDKs and libraries.

You can retrieve the AWS credentials and S3 path from the Table endpoint. This information is also displayed in the Optimizely web application in the Data Upload menu within each Table.

After uploading a file to S3, you can check Upload History to confirm that your file has been processed. If you need to upload an updated version of your file, make sure that any files with the same name have already been processed. Uploading multiple files with the same name to the same S3 path can cause the files to be processed out of order and prevent your updates from being detected.
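One simple way to avoid the same-name hazard above is to give every upload a unique, timestamped filename before sending it. A minimal sketch (the naming convention is our own suggestion, not an Optimizely requirement):

```python
from datetime import datetime, timezone
from pathlib import Path

def timestamped_name(filename: str) -> str:
    """Append a UTC timestamp so repeated uploads never share a name."""
    stem, suffix = Path(filename).stem, Path(filename).suffix
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return f"{stem}_{stamp}{suffix}"

# e.g. customers_20240101T120000Z.csv
print(timestamped_name("customers.csv"))
```

Because each file name is unique, uploads can never collide with an unprocessed file of the same name, so updates are processed in order.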

CSV Formatting Requirements
  • You can download a template CSV file for each of your Tables under the Data Upload menu. This file includes a correctly formatted header row with each of the attributes registered for the Table.

  • Each column in the header row must be a registered attribute name. A CSV may contain a subset of the registered attributes.
  • The header row must include a column named customerId (case-insensitive), and every row must contain a valid value for this column. Note that the values of this column should correspond with the expected values for the Table's Customer ID field (defined during Table configuration). For example, if your Table's Customer ID Field is a cookie named "logged_in_id", then the value of the customerId column for each row should be the user's ID as it would appear in their "logged_in_id" cookie.
  • If a column header does not correspond to a registered attribute name, the upload will fail.
  • If an attribute value does not respect the attribute's datatype/format, the upload will fail.
  • If you add an attribute of type String or Text, escape any double quotation mark that is part of the value by prefixing it with another double quotation mark. No other symbols or characters need to be escaped.

    For example, a formatted data row in your input file might look like this:

      "CustomerId1", "John Doe", "Height: 6'1"""
    

    As shown for the "Height" attribute in the sample row above, double quotes that are part of the data must be escaped with another double quote even though the field value is enclosed in its own double quotes, while the single quote in the attribute value can be used as-is.

  • Here's an example of what your CSV file might look like for a Table containing two attributes, most_viewed_category and LTV (the values shown are illustrative):

      customerId,most_viewed_category,LTV
      "CUSTOMER_1","electronics","123.45"
      "CUSTOMER_2","apparel","67.80"
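The doubled-quote escaping described above is exactly what Python's csv module produces by default; a short sketch, assuming you generate your upload files in Python:

```python
import csv
import io

buf = io.StringIO()
# doublequote=True is the default, so embedded quotes become ""
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)

# The double quote inside 6'1" is escaped as "" on output.
writer.writerow(["CustomerId1", "John Doe", 'Height: 6\'1"'])

print(buf.getvalue().strip())
# "CustomerId1","John Doe","Height: 6'1"""
```

Letting a CSV library handle quoting avoids hand-rolled escaping bugs that would cause the upload to fail validation.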

Alias

You can target a customer with any attribute from your datasources (also called "tables") by creating an alias.

Aliases are links between your customer IDs and the Optimizely user IDs. With multiple datasources linked, Optimizely is able to generate consolidated customer profiles, which enable advanced, specific customer targeting.

Optimizely automatically generates each Optimizely user ID and stores it in the optimizelyEndUserId cookie.

Figure: ANON_ID_1 and OEU_2 are in separate columns of the same row in the Alias Table. Two arrows show that ANON_ID_1 and OEU_2 originate from separate tables.

In the above figure, the highlighted alias indicates that ANON_ID_1 in "My Datasource" is the same customer as OEU_2 in "Optimizely Datasource".

See the API Reference for endpoints related to aliases.
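The consolidation that aliases enable can be sketched with plain dictionaries: given attributes stored per Table and an alias linking your customer ID to the Optimizely user ID, the attributes merge into one view. All table and attribute names below are illustrative, and this is only a conceptual model of what the service does.

```python
# Attributes stored per Table, keyed by that Table's customer ID.
tables = {
    "my_datasource": {"ANON_ID_1": {"most_viewed_category": "shoes"}},
    "optimizely": {"OEU_2": {"browser": "chrome"}},
}

# An alias links a customer ID in one Table to an ID in another.
aliases = {("my_datasource", "ANON_ID_1"): ("optimizely", "OEU_2")}

def consolidated_profile(table, customer_id):
    """Merge a customer's attributes across Tables linked by an alias."""
    profile = dict(tables[table].get(customer_id, {}))
    linked = aliases.get((table, customer_id))
    if linked:
        linked_table, linked_id = linked
        profile.update(tables[linked_table].get(linked_id, {}))
    return profile

print(consolidated_profile("my_datasource", "ANON_ID_1"))
# {'most_viewed_category': 'shoes', 'browser': 'chrome'}
```

Without the alias, the two rows would remain separate profiles; with it, targeting conditions can combine attributes from both datasources.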

Profile Integrations

Customer Profile Integrations allow you to upload attributes to enrich a customer profile.

This section walks through how to build an integration that uploads customer attributes to Optimizely.

Pre-requisites

  • Your application must store customer IDs (e.g. hashed email addresses or other unique IDs)
  • Customer IDs must not be personally identifiable, as per Optimizely's Terms of Service
  • Customer IDs must be accessible via the customer's device (e.g. in a cookie, query parameter, or JavaScript variable)
  • You are comfortable using REST APIs

1. Get Access

If you are interested in building an integration with DCP and need DCP enabled for your account, please contact techpartners@optimizely.com.

2. Register your application

We highly recommend that you use OAuth 2.0 to authenticate with the Optimizely REST API. This will allow you to provide a seamless experience to users in your application and periodically send data to Optimizely. Learn how to connect to Optimizely using OAuth 2.0.

3. Create a DCP Service

After connecting with Optimizely, you should create a DCP Service, and associate it with your project. This service will contain all your datasources. If you already have a DCP service, you may proceed to the next step.

4. Create a Table

Create a table within your DCP Service. This will be the location for all of your application's customer data. A table allows you to send customer data to Optimizely, organized under a common ID space, without worrying about the relationship of customers across Tables.

5. Register Attributes

Register attributes for the table with create attribute. Attributes must be registered prior to customer profile data being uploaded to that table.

6. Upload data

Write customer profile attribute values for the registered attributes. You can also bulk upload attribute data by dropping a CSV (comma-separated values) file into the table's S3 Import path: optimizely-import/<s3_path>; the s3_path is included in the table's metadata.
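A single-profile write from step 6 might be assembled as below. The endpoint path, payload shape, and token are assumptions for illustration; check the API Reference for the exact routes and request body.

```python
import json
import urllib.request

TOKEN = "abcdefg123456"  # illustrative classic token
URL = ("https://www.optimizelyapis.com/dcp/v1/"
       "services/123/tables/456/profiles")  # illustrative path

# Attribute values for one customer; the attribute names must
# already be registered on the Table (step 5).
payload = {
    "customerId": "CUSTOMER_1",
    "attributes": {"most_viewed_category": "shoes", "LTV": 1200},
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Token": TOKEN, "Content-Type": "application/json"},
    method="POST",
)

# urllib.request.urlopen(request) would send it; inspect instead.
print(request.get_method())  # POST
```

For larger datasets, prefer the bulk CSV upload path described above rather than one request per customer.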

7. QA integration

To test the integration end-to-end, verify that:

  • A table has been created in the Audiences > Attributes dashboard.
  • Table attributes appear in the audience builder under the list of External Attribute conditions and you can create an audience based on these attributes.
  • You can run an experiment targeted to this audience based on uploaded data.