Uniqueness is a database concept that can be tricky, so let's discuss it in more depth here.
When setting up a Smart Connector, the system needs to be able to interpret your data correctly. A configuration called uniqueness tells TapClicks which field, or combination of fields, makes a particular entry in your dataset "unique".
Where uniqueness is hard to interpret, the system may treat two distinct entries in your database as the same. This can lead to overwriting errors or inaccurate reporting.
These errors can occur because, when you upload multiple files over time, the system has to correctly match each line of data to its historical counterpart. Without that, the system may interpret each new line as a brand-new or duplicate entry, and reports will not show the time series you want to display.
Let's use an example. Suppose I have a dataset used to track the performance of various ads for a car dealership client of mine. This dealer has 20 different storefronts and runs the same ads at many of them.
I want to upload this data using a Smart Connector. I have to ask, "What attributes of this data make a given line in my spreadsheet unique?" Said another way, what factors combine to make one line different from another?
In this case, I have a few attributes that, if interpreted individually, could lead to misleading results. They include the following:
- The ad
- The location
- The product advertised
- The campaign
If I uploaded my data into the Smart Connector without telling it "all of these attributes must be considered when deciding whether a line matches one from a previous upload", then it might treat every line with the same ad, or the same location, as the same entry.
However, if I tell the system "a single element in my dataset is defined by a combination of these factors", then it will interpret things correctly.
In this case, I would mark all four items as being factors to determine uniqueness.
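To make the idea concrete, here is a minimal sketch in Python of how composite-key matching behaves. This is an illustration of the concept, not how TapClicks implements it internally; the field names and sample rows are invented for the example.

```python
def ingest(store, rows, key_fields):
    """Upsert each row into `store`, keyed on the chosen uniqueness fields.

    A later row with the same key overwrites the earlier one, which is
    exactly the "overwriting error" described above when the key is too coarse.
    """
    for row in rows:
        key = tuple(row[field] for field in key_fields)
        store[key] = row
    return store


# Two rows for the same ad, running at two different storefronts.
upload = [
    {"ad": "Spring Sale", "location": "Downtown", "product": "SUV",
     "campaign": "Q2", "clicks": 10},
    {"ad": "Spring Sale", "location": "Uptown", "product": "SUV",
     "campaign": "Q2", "clicks": 25},
]

# Keying on the ad alone collapses the two storefronts into one entry:
too_coarse = ingest({}, upload, ["ad"])
print(len(too_coarse))  # 1 -- the Uptown row overwrote the Downtown row

# Keying on all four attributes keeps each storefront's data distinct:
composite = ingest({}, upload, ["ad", "location", "product", "campaign"])
print(len(composite))  # 2
```

Marking all four attributes as uniqueness factors is the equivalent of the composite key in the second call: only a line that matches on every factor is treated as an update to an existing entry.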