Sun, October 23, 2022
The online professional networking platform LinkedIn conducted a five-year experiment on 20 million users, testing which types of contacts resulted in new job opportunities. But it did so without the express consent of users, something that privacy experts told CBC is concerning.
A five-year study by LinkedIn on nearly 20 million of its users raises ethical red flags since some unknowing participants in the social experiment likely had job opportunities curtailed, experts in data privacy and human resources suggest.
The online networking and social media platform randomly varied the number of strong and weak acquaintances in users' "People You May Know" suggestions to test a long-held theory: that people are more likely to get a new job through distant acquaintances than through close contacts.
The resulting study, published in the journal Science on Sept. 15 by LinkedIn, MIT, Stanford and Harvard researchers, confirmed the idea: users shown contacts with whom they had only 10 mutual friends doubled their chances of landing a new job, compared to those shown people with 20 mutual friends.
But that also means the LinkedIn users whose suggestions were inundated with "close contacts" — those with 20 or more mutual friends — connected with fewer opportunities through the networking site.
Given the possible consequences, it's unlikely many people would knowingly consent to have their network, and livelihoods, manipulated as they were for the study, said Jonathon Penney, a law professor whose research focuses on internet, society and data policy at York University's Osgoode Hall Law School.
'No way they would have consented'
It was "a huge number of people that could be negatively affected in terms of job prospects simply because of this study," Penney said of the 20 million subjects. More than five million participants were said to be from North America in the 2019 phase of the study.
"Most users, if you asked them, would say there's no way they would have consented to this kind of study … I have real concerns with the ethics."
While academics are held to a rigorous standard of ethics and disclosure, it's not unusual for marketing or media companies to use algorithms to gauge the success of new products or services. The process is known as A/B testing: different users are shown different versions of an online tool or experience so the company can analyze how each group engages with it.
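The mechanics of such a test can be sketched in a few lines: users are deterministically bucketed into variants, and outcomes are compared per bucket. The function names, bucketing scheme and numbers below are hypothetical illustrations, not LinkedIn's actual implementation.

```python
import random

def assign_variant(user_id: str, variants=("A", "B"), seed: int = 42) -> str:
    """Deterministically bucket a user into a test variant.

    Seeding a private RNG with the user ID keeps the assignment stable
    across sessions. Illustrative only, not LinkedIn's implementation.
    """
    rng = random.Random(f"{seed}:{user_id}")
    return rng.choice(variants)

def conversion_rates(outcomes):
    """Per-variant conversion rate, e.g. the share of users who landed a job."""
    return {v: sum(flags) / len(flags) for v, flags in outcomes.items() if flags}

# Toy comparison with made-up outcomes (1 = got a job via a suggested contact).
rates = conversion_rates({"weak_ties": [1, 0, 1, 1], "strong_ties": [0, 0, 1, 0]})
print(rates)  # {'weak_ties': 0.75, 'strong_ties': 0.25}
```

In a real experiment the outcome data would come from logged user behaviour, and the difference between buckets would be checked for statistical significance before drawing conclusions.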
In an email to CBC News, a LinkedIn spokesperson said the company hoped to use the data to tailor its services.
"Through these observations we were able to determine that you're more likely to get a job from an acquaintance over your best friend," LinkedIn said in an email. "We can't wait to see how the study helps companies, recruiters and job seekers change the way we think about the labour market."
A blanket privacy policy
Though the company never notified its users of the experiment while it was underway, its privacy policy states that LinkedIn can use members' profiles to conduct research.
But online privacy experts who spoke to CBC News suggest that the standard privacy policies people click through when registering for an online platform give the companies too much latitude in how they use people's information.
In fact, the purpose limitation principle in Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) states that user data can only be used for the purpose declared at the moment of collection — but companies often push the envelope, said Ignacio Cofone, Canada Research Chair in artificial intelligence law and data governance at McGill University.
"The problem … is that corporations very rarely know the purpose for which they're going to use data later on," Cofone said in an interview.
As such, "the way the law has evolved in business has allowed very wide purposes [of user profile use]."
LinkedIn's study "is a perfect illustration of how empty the meaning of consent is in our online interactions for companies," Cofone continued. For example, it would take someone 250 hours to read the average number of privacy policies they agree to in a year, he said — and those policies often change unilaterally.
Penney said he recognizes the purpose of LinkedIn's study: a practical look at big data and human behaviour. And the study was reviewed by an institutional review board for human subject research, unlike Facebook's hidden 2014 psychological experiment, which sparked an investigation by British data protection authorities.
Nonetheless, Penney said accepting a lengthy and intentionally vague privacy policy upon registration is not the same thing as the "informed consent" required of typical human subject studies — especially ones that may carry real-life consequences.
There are often significant hoops university-level studies need to clear to conduct research on human subjects, Penney said. "You have to be very [precise] about the study and its purposes. If there's any kind of deception, there's often additional safeguards that have to be put in place."
He also shared concerns that LinkedIn might be using its study to test new avenues for profit.
"You can easily imagine that the kind of design affordance that LinkedIn is testing could be used for intention bias, where the best jobs [and] hiring opportunities are channelled to wealthier users," said Penney.
Favouring wealthier users
The platform has made a notable shift toward offering paying users benefits over the past five years, said Neil Wiseman, a senior consultant for Pivotal recruitment and HR services in Mississauga who uses LinkedIn in his line of work.
LinkedIn's premium subscription, starting at $30 a month, allows users to directly contact anyone on the platform. Those with free accounts, meanwhile, can only contact people they've connected with.
"When people reach out [via LinkedIn Premium], I try to give them something of value. They're taking the time, and they're paying to touch base with me," said Wiseman. And he notes that those who directly reach a company or hiring manager usually see more success in the job market.
Relying on the algorithms
Refer HR, a recruitment firm that's served 42 corporate clients since opening in Vancouver in 2019, also scours LinkedIn for potential hiring candidates, said general manager Kobe Tang. Recommendations made by LinkedIn's algorithms play a significant role in his search and eventual hiring, he said.
The networking site was also an essential space for Canadian tech workers following prominent layoffs by Shopify, Wealthsimple, Hootsuite and Unbounce in 2022, said Refer HR marketing manager Rob Gido.
"Adding so-called weaker [connections] definitely improves your chances of finding new opportunities and new work," said Gido.
The Office of the Privacy Commissioner (OPC) said in an email that it had not received a complaint regarding the study, but that such a complaint, if received, could prompt an investigation.
But Cofone and Penney said the leniency around consent in Canada's privacy legislation is one sign that the law is less rigorous than its counterparts around the world. The European Union's data protection law has been updated twice since Canada's legislation was enacted 22 years ago, while this country's personal information protection law has seen no major change in that time.
Penney said he would like to see legislative changes that give the federal privacy commissioner more powers of investigation and enforcement — and that limit how company privacy policies can be used when it comes to personal data.
The act should be updated to reflect fundamental user rights — and to place liability on companies that tread on them, said Cofone. Were a person's employment harmed by a company's use of their profile, for example, "we should not be exempting them from liability just because they have the illusion of consent," he said.
"If Canadians are unhappy with being guinea pigs in a platform study like this, they should vote with their feet for the party that is proposing more robust data protection and privacy laws," Penney said.
"Politicians should be paying attention to this issue … these kinds of platform practices may entrench social and economic inequality."