Can smartphones replicate traditional in-house cognitive tests?

A group of researchers from the University of Oxford’s Centre for Human Brain Activity and the University of Birmingham’s School of Psychology recently published a trial in which 20,800 users submitted data while performing cognitive tasks via smartphone games. These tasks, which cover a range of cognitive domains (perception, action inhibition, decision-making and short-term memory), were designed as part of a crowdsourced video game. The game paired each player’s demographic data, such as age, sex, education, location, and a rating of overall life satisfaction, with their performance.

The result: smartphone games are feasible, can engage the public, and produce reliable data. The cognitive results mirrored those typically found in traditional in-house tasks.
“These data demonstrate that canonical experimental results can be replicated using smartphone games,” Harriet Brown and colleagues write, “despite the relatively uncontrolled environment when compared to laboratory testing.”

The game, The Great Brain Experiment, has shown that a distraction-filtering task produces comparable results on smartphones and in the laboratory, and has supported a mathematical model that predicts players’ momentary happiness from their in-game decisions.
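A model of that kind typically scores happiness as a weighted, exponentially decaying history of recent outcomes. The sketch below illustrates the general shape only; the weights, decay parameter, input names, and trial values are illustrative assumptions, not the published fit.

```python
# Illustrative sketch of a momentary-happiness model: happiness after trial t
# is a baseline plus weighted, exponentially decaying sums of recent certain
# rewards, expected values of gambles, and reward prediction errors.
# All parameter values here are made up for demonstration.

def momentary_happiness(certain_rewards, expected_values, prediction_errors,
                        w0=0.0, w1=0.5, w2=0.3, w3=0.6, gamma=0.7):
    """Predicted happiness after the last trial in the input lists."""
    t = len(certain_rewards)
    # More recent trials get larger decay weights (gamma ** 0 == 1 for the latest).
    decay = [gamma ** (t - 1 - j) for j in range(t)]
    return (w0
            + w1 * sum(d * cr for d, cr in zip(decay, certain_rewards))
            + w2 * sum(d * ev for d, ev in zip(decay, expected_values))
            + w3 * sum(d * rpe for d, rpe in zip(decay, prediction_errors)))

# Example with three hypothetical gambles (values are illustrative only):
print(momentary_happiness(certain_rewards=[0.0, 20.0, 0.0],
                          expected_values=[35.0, 0.0, 50.0],
                          prediction_errors=[15.0, 0.0, -50.0]))
```

The key design feature is the decay term: a large loss on the most recent gamble outweighs older wins, which is what lets the model track moment-to-moment mood rather than a running average.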

Brown et al. also note parallels with other crowdsourced science projects, including classifying galaxy shapes, folding proteins, and tracing neurons through the retina.

However, they note several limitations concerning the accuracy of smartphones, the recruitment and demographics of participants, and the design of such smartphone experiments. For instance, smartphones vary in graphics and processor performance, which matters for precisely timed on-screen stimuli. Experiments requiring ultra-fast stimulus durations may not work if a smartphone cannot display items quickly enough.
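One concrete reason is that a display can only change its image once per refresh, so any requested stimulus duration gets quantized to a whole number of frames. The sketch below assumes a 60 Hz screen for illustration; the function name and rounding rule are my own, not from the paper.

```python
# Illustrative only: why display refresh rate limits timed stimuli.
# A screen redraws once per refresh interval, so the shortest presentable
# stimulus is one frame, and any requested duration is rounded to whole frames.

def achievable_duration_ms(requested_ms, refresh_hz=60):
    """Round a requested stimulus duration to the nearest whole-frame
    duration the display can actually produce (minimum one frame)."""
    frame_ms = 1000.0 / refresh_hz
    frames = max(1, round(requested_ms / frame_ms))
    return frames * frame_ms

print(achievable_duration_ms(10))  # a 10 ms request stretches to one ~16.7 ms frame
print(achievable_duration_ms(50))  # 50 ms is exactly 3 frames at 60 Hz
```

At 60 Hz a frame lasts about 16.7 ms, so a lab paradigm that flashes a stimulus for 10 ms simply cannot be reproduced faithfully on such a phone.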

Recruitment of participants also differs from that of traditional local studies. The team invested considerable effort in marketing the app through traditional and social media, including Twitter and the Wall Street Journal Speakeasy blog. And there may be more than one player per phone.

The type of experiment that can be crowdsourced through an app also needs to encourage participants to play and replay the game. The average time to complete a game is under five minutes, for instance. Successful apps need to be designed to be “short, fast-paced, easy at the beginning,” with rewards such as high scores.

You can read the full open-access paper at PLoS ONE. Download the app on Google Play or the Apple App Store.