To understand how Forrester’s predictions will unfold in the retail industry in 2020, my colleague Madeline Cyr and I interviewed experts within Forrester for our “Applying 2020 Predictions To Retail” series. To learn more about how privacy and data ethics regulations will affect retail marketers, we spoke with Andrew Hogan and Fatemeh Khatibloo, who are Forrester experts on user experience and data privacy, respectively.
Madeline: You predict that the UK’s ICO (Information Commissioner’s Office) will upend digital advertising. What threats does this pose to retailers’ digital marketing initiatives like personalization?
Fatemeh: Things like real-time bidding (RTB) will take a big hit. RTB lets advertisers instantaneously automate bidding for “impressions” on publisher and content sites. This means, as an advertiser, you might not know all the places your ad is being shown. This lack of insight is a problem for brand safety and, according to the ICO, currently violates three different provisions of the GDPR (General Data Protection Regulation):
- Advertisers share user data that is defined as personal information (PI) under the GDPR or CCPA with their advertising vendors. As this data — like device ID, ad ID, or cookie history — moves downstream to fourth-, fifth-, and sixth-party vendors, the data is no longer under the advertiser’s control. The ICO says that this constitutes data leakage.
- RTB as practiced today violates consent requirements. Consent cannot simply be passed down the chain to every ad exchange that makes the real-time bidding happen.
- Real-time bidding violates transparency. Because you do not know where your ads are appearing, or why your ad was placed on a site, you can’t be transparent with consumers about where their data has been shared and for what purposes.
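To make the mechanics behind these three problems concrete, here is a minimal sketch of an RTB-style auction. This is an illustration, not the OpenRTB protocol: all names and structures are hypothetical, but it shows how a single bid request carrying personal identifiers fans out to exchanges and their downstream vendors, which is exactly where the advertiser loses visibility and control.

```python
import random

# Hypothetical sketch of a simplified real-time bidding auction.
# Real systems use the OpenRTB protocol; names here are illustrative.

def build_bid_request(user):
    # The request carries identifiers that GDPR/CCPA treat as personal data.
    return {"device_id": user["device_id"],
            "cookie_id": user["cookie_id"],
            "page_url": user["page_url"]}

def run_auction(bid_request, exchanges):
    bids = []
    for exchange in exchanges:
        # Each exchange forwards the request to its own downstream vendors --
        # the "data leakage" the ICO describes: the advertiser no longer
        # controls who receives these identifiers, or where the ad appears.
        for vendor in exchange["vendors"]:
            bids.append({"vendor": vendor, "price": random.uniform(0.1, 2.0)})
    # Highest bid wins the impression.
    return max(bids, key=lambda b: b["price"])

user = {"device_id": "d-123", "cookie_id": "c-456",
        "page_url": "news.example/article"}
exchanges = [{"name": "exchange-a", "vendors": ["dsp-1", "dsp-2"]},
             {"name": "exchange-b", "vendors": ["dsp-3"]}]
winner = run_auction(build_bid_request(user), exchanges)
print(winner["vendor"])  # whichever downstream vendor bid highest serves the ad
```

Note that the advertiser only ever sees the winner; the identifiers were still broadcast to every vendor in every exchange along the way.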
Of course, only a small percentage of global retailers are governed by the ICO. But regional regulators tend to look at what other jurisdictions have done to inform their own investigations and enforcement actions. In other words, the California Attorney General could look at the ICO’s report and decide that California consumers have the right to sue companies over real-time bidding.
Madeline: What should retail stores that use surveillance technology be aware of in 2020?
Fatemeh: A lot of companies are currently flouting biometric privacy laws like the Biometric Information Privacy Acts (BIPAs) in Illinois and Michigan. They should pay attention to the $550 million settlement Facebook agreed to pay for using its facial recognition algorithm on users’ photos without consent. Under the Illinois Biometric Information Privacy Act, some of the tools retailers might use for gait analysis or facial recognition require express consent, and that needs to be made very clear to the consumer. This can be extremely challenging: you will need to show consumers some sort of opt-out option and determine how you will suppress the biometric information of those who do opt out.
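The suppression requirement can be sketched in a few lines. This is a hypothetical illustration (the registry and function names are ours, not any vendor's API): frames associated with opted-out shoppers are dropped before any biometric analysis runs, rather than being analyzed and discarded afterward.

```python
# Hypothetical sketch of consent-gated biometric processing: data from
# anyone on an opt-out list is suppressed before analysis ever runs.

OPTED_OUT = {"shopper-17", "shopper-42"}  # illustrative consent registry

def process_frames(frames):
    results = []
    for frame in frames:
        if frame["shopper_id"] in OPTED_OUT:
            continue  # suppress biometric data for opted-out shoppers
        results.append({"shopper_id": frame["shopper_id"], "analyzed": True})
    return results

frames = [{"shopper_id": "shopper-17"}, {"shopper_id": "shopper-99"}]
print(process_frames(frames))  # only shopper-99 reaches the analysis step
```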
A lot of BIPA-like regulations are coming up in other states, and since Illinois’ BIPA has stood up in court without being successfully challenged, it is looking like a model law. I anticipate more states will adopt similar statutes, and tactics like in-store facial recognition and gait analysis will come under more scrutiny as a result.
Madeline: You predict that dark patterns will earn brands fines and bad press. What are dark patterns, and how should retailers avoid them?
Andrew: Dark patterns are interface and design decisions that deceive users or bias them into making choices that are not in their best interest. Some examples we see on retail sites are:
- Activity notifications such as “X of this item were ordered [or added to the cart] in the last hour.” (These figures are sometimes randomly generated.)
- Confirmshaming, whereby the option to dismiss a pop-up is worded to shame the user out of declining.
- Misdirection, whereby, for example, the most expensive option is the preselected default you must opt out of, while cheaper alternatives are visually de-emphasized.
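The first example above can be made concrete with a short sketch. This is a hypothetical illustration, with function names of our own invention: the deceptive version invents its "social proof" number with a random generator, while the honest version only reports a real count from order data and stays silent when there is nothing to report.

```python
import random

# Hypothetical illustration of the "activity notification" dark pattern
# versus an honest alternative. Names are illustrative, not a real API.

def deceptive_activity_banner():
    # Randomly generated "social proof" -- the dark pattern described above.
    return f"{random.randint(5, 30)} people ordered this item in the last hour!"

def honest_activity_banner(orders_last_hour):
    # Reports only a real count, and shows nothing when there is none.
    if orders_last_hour == 0:
        return ""
    return f"{orders_last_hour} people ordered this item in the last hour!"

print(honest_activity_banner(3))  # 3 people ordered this item in the last hour!
print(honest_activity_banner(0))  # prints an empty line: no fabricated urgency
```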
When consumers become aware of dark patterns, they are likely to lose trust in a brand that uses them, and our data shows that as trust drops, loyalty drops with it. There have also been a few notable cases in other industries of fines stemming from dark patterns, and we may see more of that in the coming years; for retailers and brands, that is precedent to heed and learn from. To avoid dark patterns, retailers need to design experiences that not only generate revenue but also serve the customer’s ideal experience, fostering long-term relationships rather than relying on cheap tricks.
Madeline: You predict that top talent will avoid companies with bad data ethics. How will privacy and ethics affect the retail employee experience down to the store associate level?
Fatemeh: We have seen that top tech companies with data ethics and privacy problems often have a hard time attracting top talent. For example, job offer acceptance rates for some positions at Facebook fell from around 90% to nearly 50% after the Cambridge Analytica scandal. In a similar vein, companies that are known for making good choices or that have demonstrable corporate social values find it easier to attract and retain talent.