Intrusion and innovation in advertising


How do we reconcile the fact that people don’t like their data being shared, yet publish it openly online? It’s clear that users are uncomfortable with advertisers having access to their data, and many are unsure how that data is used to target ads. In reality, advertisers never see a user’s data unless the user gives it to them or it comes from the advertiser’s own properties; it is simply made available as a targeting option on a variety of platforms. It’s an important distinction, but not one many people are aware of.

For advertisers, the goal is to serve fewer ads that reach the target audience more effectively. If my ad is annoying, I’ll do anything I can to stop showing it to you. Digital therefore offers a unique opportunity for brands to target users who match certain criteria and to filter out those who don’t, saving money by avoiding people unlikely to become customers.

The industry refers to this as ‘the filter bubble’: there is so much content on the web that users can’t take it all in, so media owners simply don’t show them all of it. Critics argue that this only reinforces our existing world-views and increases the likelihood of ‘confirmation bias’ – the theory that we only notice what supports our beliefs and ignore what doesn’t.

But how much of this ‘filter’ is algorithm versus personal choice? Facebook recently released a study which found that the level of ‘filter bubbliness’ was indistinguishable whether its algorithms were running or not.

So if advertisers are working hard to reduce unwanted ad exposure, why are users still reluctant to share data? The problem is a lack of understanding of how data is used and, ultimately, a lack of trust.

Anybody with significant ad tech experience could easily exploit the least technically proficient players to conduct personal tracking on an enormous scale. It’s more than likely that people are making large sums collecting and selling data right now, and consumers suspect as much.

But advertisers typically don’t see that data. They see only first-party data that a user has explicitly provided, and they almost never collect, store or process data themselves.

The problem from a user’s perspective is that opt-outs aren’t universal. The industry does provide some shared preference controls, but the technology itself is a limitation: the anonymous technology used to track behavior is the very same technology used to store opt-outs and preferences.
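To illustrate why that’s awkward, here is a minimal sketch (TypeScript, browser context, with hypothetical cookie names – not any specific vendor’s implementation): both the anonymous tracking identifier and the opt-out flag live in ordinary cookies, so clearing cookies – the very thing a privacy-conscious user is most likely to do – wipes the opt-out along with the tracker.

```typescript
// Hypothetical example: tracking ID and opt-out flag share the same storage.

function setCookie(name: string, value: string, days: number): void {
  const expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000);
  document.cookie = `${name}=${encodeURIComponent(value)}; expires=${expires.toUTCString()}; path=/`;
}

function getCookie(name: string): string | null {
  const match = document.cookie
    .split("; ")
    .find(row => row.startsWith(`${name}=`));
  return match ? decodeURIComponent(match.split("=")[1]) : null;
}

// A tracking tag might assign an anonymous visitor ID on first visit...
if (getCookie("visitor_id") === null && getCookie("ad_opt_out") !== "true") {
  setCookie("visitor_id", crypto.randomUUID(), 365);
}

// ...and the opt-out is recorded the same way, in the same place,
// so it disappears whenever the user clears their cookies.
function optOutOfTargeting(): void {
  setCookie("ad_opt_out", "true", 365);
  setCookie("visitor_id", "", -1); // expire the tracking ID
}
```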

So, can anything be done? Legislation is coming that will improve the landscape, but it’s going to prove unenforceable without industry backing. Unless users can be informed, they can’t make decisions, and they can’t be informed while the world of tracking is such a mess. It’s up to website owners to take control, and it’s up to legislators to mandate it.

About the author

Alistair Dent, head of product at iProspect
