Aleks Krotoski finds out what happens, for ill or good, when we are all turned into digital lab rats, often without our knowledge or explicit consent.

When you go online, there is a 100% chance that you will be part of an experiment. We are constantly observed, and tested upon, in the digital space, and more often than not it’s done without our knowledge or explicit consent.

Many experiments are simple and narrow, focused on how to keep our eyeballs on a particular page, how to get us to click a particular button, or how to sort us into categories that can be targeted for exploitation - did your hotel or plane tickets cost more or less than another person's on the same site?

But should we be afraid of every test in the digital world?

Aleks finds out how a glitch in World of Warcraft resulted in the first virtual plague, which allowed epidemiologists to study human behaviour in a pandemic situation without risk of anyone really being harmed - and in ways that were startlingly analogous to real-world behaviour.

And she delves into the now infamous Facebook Emotional Contagion study, and finds out that the public outrage may not only have been misplaced, but may have done far more, and longer-lasting, harm than could have been predicted.

29 minutes

ERIC LOFGREN
Dr. Lofgren is an infectious disease epidemiologist whose research focuses on mathematical and computational models of disease transmission, particularly the transmission of antimicrobial-resistant infections within and outside healthcare settings, as well as emerging infectious diseases. His work is often aimed at producing policy-relevant results, working hand-in-hand with clinicians and policy makers to produce reproducible, quantitative guidance for designing and evaluating public health interventions.

He tells us about his research into the Corrupted Blood Incident, the first virtual plague, and how, under the right conditions, online data can yield useful research findings about real life.

MICHELLE MEYER
Michelle Meyer is an Assistant Professor at Geisinger, where she co-directs an interdisciplinary lab that investigates judgments and decision-making related to research, innovation, and healthcare. She is also Associate Director for Research Ethics at Geisinger, chairs Geisinger's IRB Leadership Committee, directs its Research Ethics Advice and Consulting Service, and serves on its AI Governance Committee. Her writing has appeared in leading journals of bioethics (American Journal of Bioethics, Hastings Center Report, Kennedy Institute of Ethics Journal), law (Harvard Law Review, Administrative Law Review), and science (Nature, PNAS), as well as in popular media outlets (New York Times, Slate, Wired, Los Angeles Times, and Forbes.com). She has served on numerous boards and commissions, including National Academies study committees and working groups, an American Psychological Association blue ribbon commission, the editorial board of Advances in Methods and Practices in Psychological Science, the Board of Directors of Open Humans Foundation (formerly PersonalGenomes.org), and the Ethics and Compliance Advisory Board of PatientsLikeMe.

She talks to us about the 2014 Facebook Emotional Contagion Study, and why the public backlash against the experiment, and the resulting aversion to testing in general, could have lasting, harmful effects on billions of people.

CHRISTO WILSON
Christo Wilson is an Associate Professor in the Khoury College of Computer Sciences at Northeastern University. He is a founding member of the Cybersecurity and Privacy Institute at Northeastern, and serves as director of the BS in Cybersecurity program. Professor Wilson's research focuses on online security and privacy, with a specific interest in algorithmic auditing. Algorithmic auditing is an emerging, interdisciplinary area that uses experimental techniques to measure the black-box algorithmic systems that pervade daily life in order to increase transparency and accountability of these systems. His work is supported by the U.S. National Science Foundation, a Sloan Fellowship, the Mozilla Foundation, the Knight Foundation, the Russell Sage Foundation, the Democracy Fund, the Data Transparency Lab, the European Commission, Google, and Verisign Labs.

He tells us about his research into A/B testing online, how some companies don't even realise that they are undertaking human experimentation, and reveals some of the ways that testing is dividing people online, with potentially harmful results.
