A recent press release by 2Morrow Inc. announced the availability of a new app called SmartQuit. 2Morrow Inc. markets this app as the “first smoking cessation app proven effective in a clinical trial.” According to its creators, SmartQuit uses a strategy for smoking cessation that differs from most smoking cessation methods. The method, called Acceptance and Commitment Therapy (ACT), was developed at the Fred Hutchinson Cancer Research Center, a world-renowned research institute known for its cancer diagnostic and treatment research. The institute also conducts cancer prevention research, and it was this research that led to SmartQuit.
Marketing the app as proven by a clinical trial reflects a growing concern among the medical app industry and government regulators that certain health-related apps should be held to clinical research standards when they make claims that appear medical in nature. While the research supporting the app is not a clinical trial endorsed by the government or funded by a source free of conflicts of interest, it is peer-reviewed and may be one of the best available approaches for proving this technology is valuable – short of FDA approval for all health-related apps, which neither government nor industry would favor at this point.
So, what is the evidence supporting this app’s effectiveness? So far, the effectiveness of SmartQuit is supported by one randomized clinical trial conducted by Jonathan Bricker at Fred Hutchinson and researchers at the University of Washington. The researchers based the app’s design on the ACT method, which focuses on helping individuals do the following: 1) accept their physical cravings, emotions, and thoughts related to smoking and 2) commit to their values. After development, the app was tested for usability by four internal and four external individuals, and changes were made based on their feedback. The researchers recruited 196 participants through various means, including Facebook, and randomized them to either SmartQuit or an app called QuitGuide. The participants were asked to use the apps for a period of eight weeks and received weekly reminder emails to use them.
The key results of the study were that users of SmartQuit had higher quit rates, although the confidence interval around that difference was very wide. SmartQuit users also opened their app more frequently (p < .0001). Overall, the researchers concluded that the app was a feasible method of delivering ACT and showed higher quit rates and levels of engagement than QuitGuide.
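To see why a trial of this size can produce a “very wide” confidence interval, here is a minimal Python sketch using entirely hypothetical quit counts (not the study’s actual data) and a simple Wald interval for the difference between two proportions:

```python
import math

def wald_ci_diff(x1, n1, x2, n2, z=1.96):
    """95% Wald confidence interval for the difference between two
    independent proportions (x successes out of n trials in each group)."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    # Standard error of the difference between the two sample proportions
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff - z * se, diff + z * se

# Hypothetical example: 13 of 98 quit with one app vs. 8 of 98 with the other
lo, hi = wald_ci_diff(13, 98, 8, 98)
print(f"difference: {13/98 - 8/98:.3f}, 95% CI: ({lo:.3f}, {hi:.3f})")
```

In this hypothetical example, with roughly 100 participants per arm, even a noticeable difference in quit rates yields an interval that spans zero – which is exactly why the authors call for a larger study focused on effectiveness.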
This study is a great step forward in smoking cessation app research, but it has clear limitations. The researchers themselves indicate that the study offered useful pilot results and that a larger study focused on effectiveness is needed. Also, the abstract describes the study as double-blind, but the methodology gives no indication of how participants or researchers were blinded, so the blinding status is unclear. A future study might blind participants by providing apps stripped of any identifying name, and blind researchers by withholding the list of individuals assigned to each app. That said, true double blinding is unlikely to work for apps, because participants usually download the app themselves and know what they are using. In addition, the researchers did not follow an objective process for selecting a comparison app; the app they chose was not designated in any way as the best or the most popular smoking cessation app on the market.
Future clinical trials of new apps for smoking cessation (and other health behaviors) would be more useful to clinicians if they tested the new app against the best apps on the market. Another useful approach would be randomizing participants to a comparison app designed similarly to the intervention app but lacking the specific features of interest. Despite its limitations, this study is a great step forward and a model for researchers working in this area in the future.