
Plenty of online ads are targeted to people based on attributes like their gender, geographic location, or interests they've expressed through their Web browsing behaviours. But when does that targeting constitute discrimination?

That's a question raised in a new study from Carnegie Mellon University and the International Computer Science Institute in Berkeley, Calif. The study's authors examined which ads appeared based on user profiles, and have suggested that greater caution may be needed to ensure targeting does not cross the line.

To conduct the study, which was published in the journal Proceedings on Privacy Enhancing Technologies, researchers built a tool that self-identified with different user profiles, including different genders, and surfed the Web to analyze the types of ads delivered to each user through Google's ad network.

The various simulated Internet users first browsed 100 popular employment-related websites to identify themselves as job seekers. They then all visited the same websites, belonging to The Times of India and The Guardian, to see the types of ads delivered to them. One of the major differences was an ad for a career coaching service for "$200k+" executive jobs, which appeared on The Times of India site – that ad was shown to the male profiles 1,852 times. It was shown to female profiles just 318 times.
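The scale of that disparity can be sanity-checked with a quick significance test. The sketch below is not from the study's code; it assumes both profile groups saw a comparable total volume of ads, and asks how far the 1,852-to-318 split deviates from the even split you would expect if gender played no role.

```python
import math

male, female = 1852, 318  # impression counts reported in the study
n = male + female

# Under the null hypothesis that the ad is shown to male and female
# profiles equally often, the male count follows Binomial(n, 0.5).
p0 = 0.5
mean = n * p0
sd = math.sqrt(n * p0 * (1 - p0))

# Normal approximation: a z-score this far from zero is effectively
# impossible under the null hypothesis.
z = (male - mean) / sd
print(f"z = {z:.1f}")  # roughly z = 32.9
```

A z-score above 3 is already strong evidence; a score in the thirties means the split is nowhere near what chance alone would produce, which is why even a single finding like this drew the researchers' attention.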

Of course, that's just one finding on one website. (The study looked at ads on just the two websites because it used a tool to automatically analyze the ads, and that takes programming tailored to each website's design.) But the fact that a correlation did appear was troubling, the researchers said.

"It illustrates that discrimination can arise in these large, automated [ad serving] systems," said Michael Carl Tschantz, a researcher at the International Computer Science Institute and one of the study's authors. "Showing ads promoting [coaching for] high-paying jobs only to men could have negative societal effects, so seeing that it can happen is concerning."

It is not unusual for advertisers to target consumers for their ads based on gender or age. In the most obvious case, a female Web user would be more likely to see ads for women's clothing, while male Web users might be more likely to see ads for men's clothing. This happens in other media too: It's why you see more commercials for trucks on TV during sports broadcasts, for example, or cosmetics ads in magazines aimed at women.

It may well reinforce gender stereotypes, say, to consistently advertise makeup to women and trucks to men. The difficulty is identifying when ad targeting – which, by its nature, excludes some audiences – is discriminatory.

"While neither of our findings of opacity or discrimination are clear violations of Google's privacy policy and we do not claim these findings to generalize or imply widespread issues, we find them concerning and warranting further investigation by those with visibility into the ad ecosystem," the researchers wrote.

It is also hard to assign blame for the alleged discrimination. Google's policies allow advertisers to target ads by gender, and the targeting may have been the result of other factors, such as a pattern in which men click on the ad more often, causing it to be automatically targeted to more men.

Google Inc. declined requests for an interview, but provided the following statement: "Advertisers can choose to target the audience they want to reach, and we have policies that guide the type of interest-based ads that are allowed. We provide transparency to users with 'Why This Ad' notices and Ad Settings, as well as the ability to opt out of interest-based ads."

Advertising providers such as Google, Yahoo and Microsoft do provide Web pages where people can change the profiles (including interests, age, and gender) that influence the ads targeted to them. However, the study also raises the concern that these tech companies may not be fully transparent about the complete profile they have on users, or may not offer as much control as they could.

That's because the experiments showed a change in advertising with no corresponding change in the profiles of each user. In one case, the experiment had different user profiles visit 100 popular websites associated with substance abuse. After that, simulated users visiting the website of The Guardian were shown 3,309 ads (or 16 per cent of the ads) that related to drug and alcohol rehab. Comparatively, another group that did not visit the substance abuse-related websites did not see those ads. But when this occurred, the experimenters noted no change in the user profile on Google's ad settings site. The experimenters wrote that this suggests the information given to people on how ads are targeted to them is not entirely transparent.
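Classifying thousands of collected ads by topic is mostly a matter of text matching. The following sketch is purely illustrative and uses made-up ad strings, not data from the study; it shows the general idea of tagging ad text by keyword and computing the share of rehab-related ads, which the researchers did at a much larger scale.

```python
# Hypothetical ad texts standing in for ads collected during browsing.
ads = [
    "Luxury Drug Rehab - Private Treatment Center",
    "Cheap Flights to London",
    "Alcohol Rehab Programs Near You",
    "New SUV Models This Year",
]

# Illustrative keyword list for tagging an ad as rehab-related.
keywords = ("rehab", "treatment", "recovery")
related = [ad for ad in ads if any(k in ad.lower() for k in keywords)]

share = len(related) / len(ads)
print(f"{len(related)} of {len(ads)} ads ({share:.0%}) are rehab-related")
```

In the actual experiment, the equivalent tally came to 3,309 ads, or 16 per cent of everything shown to the group that had visited the substance-abuse sites.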

The use of sensitive information – visiting websites about substance abuse for example – has been an issue before. Last year, the Office of the Privacy Commissioner of Canada found that a man had been served ads for sleep apnea devices through Google Inc.'s AdSense service based on Web searches he had done on the subject. "Health or medical information" is among the sensitive data that Google's policies forbid advertisers from using to target ads. Since then, Google has been working to reinforce its policies on retargeting based on sensitive information, according to the OPC. But problems can still arise.

"Google's ad network is so large that it could be very difficult for Google to ensure that this policy is obeyed by every advertiser, since there are so many of them," said International Computer Science Institute researcher Mr. Tschantz. " … There is a need to develop tools to prevent things like this from happening."
