Apple announced today the launch of three new ResearchKit apps from Johns Hopkins, Duke University, and Oregon Health & Science University (OHSU). The studies being launched at these universities will use data collected through the apps, including, in the case of the Johns Hopkins study, heart rate and motion data from the Apple Watch.


EpiWatch

EpiWatch was developed by neurologists at Johns Hopkins in collaboration with THREAD Research. If the latter sounds familiar, it’s because the company has worked on other ResearchKit studies, including the PRIDE study from UCSF.

In a first for ResearchKit apps, EpiWatch will use an Apple Watch app to capture heart rate and activity data. A major focus is determining whether seizure detection can be automated, which could enable automatic alerts to family members or activation of emergency response.

According to the press release,

The study will test whether the wearable sensors included in Apple Watch can be used to detect the onset and duration of seizures. During the first phase of this study, researchers will use a custom application on the Apple Watch to provide patients with one-touch access to trigger the custom watch app to capture accelerometer and heart rate sensor data, to capture the digital signature of their seizure and send an alert to a loved one. The app will keep a log of all seizures and the participant’s responsiveness during the event.

The app will also include other tools, like medication and symptom tracking as well as screening for medication side effects. All of this adds up to a rich set of peri-seizure data that could be used to identify triggers or other predictors.
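To make the automation idea concrete, here is a minimal sketch of windowed detection over heart rate and accelerometer samples. The thresholds, window size, and function names are invented for illustration; the EpiWatch study is collecting data precisely because no validated detection criteria exist yet.

```python
import statistics

# Hypothetical cutoffs for illustration only; a real seizure detector
# would be trained on clinical data, not hand-picked thresholds.
HR_THRESHOLD_BPM = 120        # "elevated" mean heart rate
MOTION_STDEV_THRESHOLD = 1.5  # "erratic" accelerometer magnitude (g)

def flag_windows(heart_rate, accel_magnitude, window=10):
    """Flag sample windows where heart rate is elevated AND motion is
    erratic, a crude stand-in for the signals EpiWatch captures."""
    flagged = []
    for start in range(0, len(heart_rate) - window + 1, window):
        hr_win = heart_rate[start:start + window]
        acc_win = accel_magnitude[start:start + window]
        if (statistics.mean(hr_win) > HR_THRESHOLD_BPM and
                statistics.stdev(acc_win) > MOTION_STDEV_THRESHOLD):
            flagged.append(start)
    return flagged

# A calm stretch followed by a high-heart-rate, high-motion stretch:
hr = [70] * 10 + [130] * 10
acc = [1.0] * 10 + [0.0, 4.0] * 5
print(flag_windows(hr, acc))  # only the second window is flagged
```

The point of the sketch is the combination: either signal alone (a workout raises heart rate; fidgeting raises motion variance) would produce far too many false alarms.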

Autism & Beyond

Autism & Beyond was developed by Duke University to screen children for autism. The app works by showing children videos and using the front-facing camera to record their responses. Researchers will then analyze those recordings for signs of autism and other developmental disorders. According to a report on Vox,

The app is set up to play 20-minute videos while using an iPhone or iPad’s built-in camera to scan viewers’ facial expressions, analyze their microreactions, and then indicate if there’s a potential risk of autism. It’s intended for parents to use with their children, who see videos of lights, sounds, and storytellers….When I smile, the dots that line the video version of my face turn green. When I frown, they shade red.

Signs of autism are frequently missed early in life, which delays connecting children to appropriate therapy and resources. An automated screening tool that could be administered with nothing more than a tablet could make screening scalable, though that of course assumes it works.

Mole Mapper

Mole Mapper was initially developed by a cancer biologist to help a family member track skin lesions between dermatology clinic visits. Now it’s being launched as part of a broader OHSU study aimed at improving early detection of skin cancer as well as tracking of skin lesions.

Users can take pictures of moles or skin lesions alongside a reference object, like a coin. The app uses that reference object to automatically measure the size of the lesion, captures information on other characteristics, and tracks changes over time. The images will be uploaded to a central database at OHSU to inform the design and improvement of automated skin cancer detection tools.
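The reference-object trick boils down to simple proportionality: a coin of known physical diameter fixes the millimeters-per-pixel scale for the whole photo. A minimal sketch, assuming a US dime as the reference; the function name and pixel values are illustrative, not details from the app:

```python
# Known physical diameter of the reference coin (US dime, in mm).
US_DIME_DIAMETER_MM = 17.91

def lesion_size_mm(lesion_px: float, coin_px: float,
                   coin_mm: float = US_DIME_DIAMETER_MM) -> float:
    """Convert a lesion's measured pixel diameter to millimeters using
    a coin of known diameter photographed in the same frame."""
    mm_per_px = coin_mm / coin_px  # scale factor for this photo
    return lesion_px * mm_per_px

# If the dime spans 358.2 px and the mole spans 120 px in the photo:
print(round(lesion_size_mm(120, 358.2), 1))  # 6.0 (mm)
```

Because the scale is recomputed per photo, measurements stay comparable over time even as camera distance and zoom vary between visits, which is exactly what longitudinal mole tracking needs.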