BidTheatre DSP can be used as a generic store of user data in key-value form. The user data can be used for subsequent campaign targeting and affinity reporting.
At the heart of the data collection lies the cookie group. The cookie group contains any number of keys, and each key can hold any number of values. Each unique key and value should be given a compact non-readable name that can be used to communicate segments through browser scripts and pixels.
Each key-value pair can be given a readable description within the system; this mapping is called a dictionary. Cookie Group data is populated in two ways:
Either by rendering tags in the user's browser
Or through a server-to-server (s2s) integration
Disallowed characters in keys and values are:
Attempts to set or update values containing these characters will be discarded.
Tag Based Population of Cookie Group Data
Read more about how to set up cookie groups and inject data into them via browser tags here.
Server-To-Server Syncing of Cookie Group Data
With this technique, a text file with the cookie data is uploaded to an Amazon S3 bucket (preferred) or via SFTP, and injected into the relevant cookie group.
The text file must contain one user id per row, followed by a tab, and a comma-separated list of key=value pairs. If the =value part is omitted, a default value of 1 is set.
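The row layout above can be sketched as follows. The user ids and segment names are hypothetical examples; only the userid-tab-pairs shape comes from the format described here.

```python
# Build a cookie group data file body in the documented layout:
# one user id per row, a tab, then a comma-separated list of
# key=value pairs. The ids and segment names are made up.

def format_row(user_id, segments):
    """Render one row. A key with value None is written bare,
    which BidTheatre treats as key=1 (the documented default)."""
    pairs = []
    for key, value in segments.items():
        pairs.append(key if value is None else f"{key}={value}")
    return f"{user_id}\t{','.join(pairs)}"

rows = [
    format_row("a1b2c3", {"seg_auto": None, "age": "35-44"}),
    format_row("d4e5f6", {"seg_travel": "1"}),
]
content = "\n".join(rows) + "\n"  # this string is the file body to upload
```

The first row above renders as `a1b2c3<TAB>seg_auto,age=35-44`, with `seg_auto` defaulting to 1 on import.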
By default, a cookie group data file contains the full set of user data: any existing data on the cookie group for the users in the file is purged and replaced. Files can also be sent as incremental changes. In this scenario, the file should contain only rows with changes to a user's data. To indicate deletion of an existing value, use the value %DEL%.
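An incremental row can be sketched like this, with %DEL% marking values to remove. The helper name and the example user id and segments are hypothetical:

```python
# Build one row of an incremental update file: only users whose
# data changed appear, and key=%DEL% removes an existing value.

def incremental_row(user_id, added=None, removed=None):
    pairs = [f"{k}={v}" for k, v in (added or {}).items()]
    pairs += [f"{k}=%DEL%" for k in (removed or [])]
    return f"{user_id}\t{','.join(pairs)}"

# User a1b2c3 gains seg_sports and loses seg_travel in one row.
line = incremental_row("a1b2c3", added={"seg_sports": "1"},
                       removed=["seg_travel"])
# → "a1b2c3\tseg_sports=1,seg_travel=%DEL%"
```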
Server-To-Server Syncing of the Cookie Group Dictionary
Cookie group dictionaries can also be uploaded to an Amazon S3 bucket (preferred) or via SFTP and injected into a cookie group.
The text file must contain one key, value, and description triplet per row, in the following format:
If the =value is omitted, a default value of 1 is used.
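The exact row layout is not reproduced above; as a rough sketch, and purely as an assumption to verify against the current BidTheatre spec, the following assumes a `key=value,description` layout with the `=value` part optional (defaulting to 1):

```python
# ASSUMED layout: key=value,description per row, '=value' optional.
# Verify the real column layout before using this in production.

def dictionary_row(key, value, description):
    left = key if value is None else f"{key}={value}"
    return f"{left},{description}"

row = dictionary_row("seg_auto", None, "Auto intenders")
```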
If data import volumes are low, start by verifying that:
The feed contains the expected segments.
The feed is properly pushed to our S3 bucket.
User matching is properly initiated and BidTheatre user IDs are properly collected.
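The first check above can be automated with a small pre-upload audit: confirm every row has the userid-tab-pairs shape and that the feed actually contains the segments you expect. The function name and sample segments are hypothetical:

```python
# Sanity-check a data feed before upload: validate the row shape
# (userid<TAB>key=value,...) and report expected segments that
# never appear anywhere in the feed.

def audit_feed(lines, expected_segments):
    seen = set()
    for n, line in enumerate(lines, 1):
        user_id, sep, pairs = line.rstrip("\n").partition("\t")
        if not sep or not user_id or not pairs:
            raise ValueError(f"row {n}: expected 'userid<TAB>key=value,...'")
        for pair in pairs.split(","):
            seen.add(pair.split("=", 1)[0])   # key part only
    return sorted(expected_segments - seen)   # segments missing from the feed

missing = audit_feed(["a1b2c3\tseg_auto=1,age=35-44"],
                     {"seg_auto", "seg_travel"})
# → ["seg_travel"]
```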
Supported External DMPs
BidTheatre has integrations with the following DMPs: