Our 2nd annual Innovations in Online Research Conference took place on September 23, 2022.

Keynote Speaker: Dr. Joseph Cimpian, New York University

Dr. Joseph Cimpian is Professor of Economics and Education Policy at New York University. He earned a Ph.D. in Economics of Education from Stanford University. His research focuses on the use and development of novel and rigorous methods to study equity and policy, particularly concerning language minorities, gender, and sexual minorities. His work has been funded by the Spencer Foundation, the AERA Grants Board, the National Science Foundation, and the Institute of Education Sciences. His research has been published in some of the top journals in education, psychology, health, and policy, and has been featured by the New York Times, the Washington Post, NPR, and Brookings, among other outlets.
Dr. Joseph Cimpian
How Invalid and Mischievous Survey Responses Bias Estimates Between Groups
In this talk, Dr. Cimpian discusses how mischievous responders, and invalid responses more generally, can perpetuate narratives of heightened risk for LGBQ youth rather than narratives of greater resilience in the face of obstacles. The talk reviews several recent and ongoing studies that use pre-registration and replication to test how invalid data affect LGBQ-heterosexual disparities across a wide range of outcomes.
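The core issue here is that a small minority group can be swamped by even a tiny share of invalid responses. As a minimal illustrative sketch (not taken from the talk), the Python simulation below assumes a 5% minority group, a 1% rate of mischievous responders who falsely claim minority status and endorse an extreme outcome, and no true difference between groups; the contaminated data nonetheless show a large apparent disparity.

    import numpy as np

    rng = np.random.default_rng(0)

    n = 100_000        # total survey respondents
    p_minority = 0.05  # assumed true share of the minority group
    p_mischief = 0.01  # assumed share of mischievous responders
    true_rate = 0.10   # true outcome rate, identical in both groups

    # Honest responses: group membership and the outcome are independent
    minority = rng.random(n) < p_minority
    outcome = rng.random(n) < true_rate

    # Mischievous responders falsely claim minority status and endorse the outcome
    mischief = rng.random(n) < p_mischief
    minority[mischief] = True
    outcome[mischief] = True

    print(f"Apparent minority rate: {outcome[minority].mean():.3f}")   # ~0.25
    print(f"Apparent majority rate: {outcome[~minority].mean():.3f}")  # ~0.10

Removing the mischievous responses returns both rates to roughly 10%, which is why screening for invalid data matters most when studying small groups.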

Data Quality Across Platforms

Online data collection continues to offer great benefits to researchers, but there is a pressing need for validated methods to ensure high data quality across platforms. In this session, Chris Berry and Jeremy Kees provide an overview of data quality on MTurk and professional panel samples, Joseph Goodman discusses the way online data collection is perceived among marketing researchers, and Efrain Ribeiro offers a deep dive into the effects of sampling automation on respondent fraud.
Chris Berry
Assistant Professor of Marketing, Colorado State University
Jeremy Kees
Professor of Marketing, Villanova University
In this session, we provide an overview of data quality on MTurk and professional panel samples. We discuss common concerns about data quality, methods for ensuring high-quality responses, and how to compare different platforms for your research needs.
Joseph Goodman
Professor of Marketing, Washington University
This talk examines how online data collection is perceived among marketing researchers and discusses best practices for conducting rigorous online research that meets academic standards.
Efrain Ribeiro
Associate Professor of Political Science, University of Georgia
A deep dive into how automation in sampling affects the prevalence and detection of fraudulent responses in online surveys.

Connect Workshop

Learn about CloudResearch Connect, our platform for conducting online research studies. This workshop covers the key features and benefits of using Connect for your research projects.
Aaron Moss
Senior Research Scientist, CloudResearch
An in-depth workshop on using CloudResearch Connect for online research studies, covering setup, participant recruitment, and data collection best practices.

2022 CloudResearch Grant Recipients

Meet the 2022 CloudResearch Grant recipients and learn about their innovative research projects that are pushing the boundaries of online research methods.
Asli Cerencinar
PhD Candidate, New York University
Research project description and findings from the 2022 CloudResearch Grant recipient.
Farnoush Reshadi
PhD Candidate, University of Toronto
Research project description and findings from the 2022 CloudResearch Grant recipient.
Gilad Feldman
Assistant Professor, University of Hong Kong
Research project description and findings from the 2022 CloudResearch Grant recipient.
Yefim Roth
Lecturer, University of Haifa
Research project description and findings from the 2022 CloudResearch Grant recipient.

Prime Panels Workshop

Learn about CloudResearch Prime Panels and how to access high-quality participant pools for your research studies.
Cheskie Rosenzweig
Co-CEO & Chief Technology Officer, CloudResearch
An overview of CloudResearch Prime Panels and how to effectively use them for high-quality data collection.

2021 CloudResearch Grant Recipients

The 2021 CloudResearch grant recipients present updates on the projects they have been working on for the past year. Michiel Spape discusses time perception, Art Marsden presents an AI-generated realistic face stimuli database, and Nick Byrd exhibits the Socrates Platform for facilitating reflective cognition.
Michiel Spape
Adjunct Professor in Cognitive Neuroscience, The University of Helsinki
Having recently discovered that time perception can be reliably altered by imagining movement, Michiel presents evidence in this talk that the effect is enhanced in depression, which may have important implications for diagnosis and treatment.
Art Marsden
Assistant Professor, Syracuse University
Art presents an AI-generated realistic face stimuli database designed specifically for psychological research, addressing issues of consent and representation in facial research studies.
Nick Byrd
Assistant Professor, Stevens Institute of Technology
Nick exhibits the Socrates Platform, an innovative tool designed to facilitate reflective cognition and improve decision-making processes in research participants.

Technological Advances in Online Research

This session primarily focuses on technological advances that can facilitate online research and lead to more innovation and creativity in research methods and designs. Hiromichi Hagihara uses webcams for eye-tracking, Matt Lease introduces a novel measure of annotator agreement, Susan Persky uses Virtual Reality to enhance experimental control and generalizability, and Carlos Ochoa digs into people's willingness to participate in in-the-moment surveys triggered by their geolocation.
Hiromichi Hagihara
Research Fellow, International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo
Matt Lease
Professor of Computer Science, University of Texas at Austin
Reduced experimental control in online experiments allows factors such as lighting or distance from the webcam to interfere with measurement. Hiromichi discusses a video dataset that systematically includes factors that may affect automated gaze coding, and its potential to improve data quality. Matt introduces a novel measure of annotator agreement.
Susan Persky
Investigator, National Human Genome Research Institute, NIH
Carlos Ochoa
Co-founder and Chief Technology Officer, Netquest
Susan discusses how Virtual Reality can enhance experimental control and generalizability in online research, while Carlos explores people's willingness to participate in location-triggered surveys.

Sentry Workshop

Learn about CloudResearch Sentry, our data quality validation tool that helps ensure the integrity of your research data by detecting and filtering out low-quality responses.
Aaron Moss
Senior Research Scientist, CloudResearch
An in-depth workshop on using CloudResearch Sentry for data quality validation, covering setup, configuration, and best practices for maintaining high-quality research data.

Accessing, Incentivizing, and Verifying Niche Samples

These talks focus primarily on niche samples in online research. Leah Hamilton discusses her use of MTurk to reach public assistance recipients, Spencer Baker focuses on recruiting religious samples through social media, Rachel Hartman presents a method for verifying age online, and Michael Maniaci talks about offering personalized feedback as an incentive.
Leah Hamilton
Assistant Professor, Appalachian State University
Spencer Baker
PhD Candidate, University of Virginia
Leah discusses using MTurk to reach public assistance recipients, while Spencer focuses on recruiting religious samples through social media platforms.
Rachel Hartman
PhD Candidate, Florida International University
Michael Maniaci
Associate Professor, Florida Atlantic University
Rachel presents methods for verifying age online, and Michael discusses offering personalized feedback as an effective incentive for research participation.

Toloka Workshop: Beyond WEIRD Samples

Learn about Toloka, a crowdsourcing platform that enables researchers to access diverse, global participant pools beyond the typical Western, Educated, Industrialized, Rich, and Democratic (WEIRD) samples commonly used in research.
Elena Brandt
Behavioural Research Lead, Toloka
An introduction to Toloka and how researchers can use this platform to access more diverse, global participant populations for cross-cultural and international research studies.

Click here for a summary of our 2021 Innovations in Online Research Conference.