A group of researchers with the nonprofit SRI International has developed a nutrition app that can estimate the caloric and nutritional content of a meal from a picture you snap with your smartphone.
Obesity is a major problem in the United States – we’ve all seen the PowerPoint slides showing CDC maps of obesity prevalence spreading over the past fifty years. Apps like MyFitnessPal and LoseIt let you track the food you eat in great detail. We also recently reviewed Rise, a platform that lets you snap pictures of your meals and get feedback from certified nutritionists.
Researchers with the nonprofit SRI International recently published a paper describing Snap’n’Eat, an app that lets you snap a picture of your meal and automatically calculates nutritional information such as caloric content.
Basically, the app first identifies which segments of the picture contain food, then classifies the type of food in each segment. Based on that classification, it estimates the caloric content and other nutritional information.
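To make that segment-then-classify-then-estimate idea concrete, here is a minimal sketch of the final step in Python. The function names, the toy nutrition table, and the per-100-gram values are illustrative assumptions for this article, not SRI’s actual implementation; a real app would draw on a full nutrition database and estimate portion sizes from the image itself.

```python
# Hypothetical sketch of the last stage of a photo-based nutrition pipeline:
# given food regions that earlier stages have already labeled and sized,
# sum up an estimated calorie count for the meal.

from dataclasses import dataclass

# Illustrative calories per 100 g; a real system would query a nutrition database.
CALORIES_PER_100G = {
    "rice": 130,
    "chicken": 165,
    "broccoli": 34,
}

@dataclass
class FoodSegment:
    label: str               # food type predicted for this image region
    estimated_grams: float   # portion size estimated from the region's size

def estimate_calories(segments: list[FoodSegment]) -> float:
    """Sum estimated calories across all detected food regions in the photo."""
    total = 0.0
    for seg in segments:
        per_100g = CALORIES_PER_100G.get(seg.label)
        if per_100g is None:
            continue  # unknown food type: skip (a real app might ask the user)
        total += per_100g * seg.estimated_grams / 100.0
    return total

# Example: suppose the segmentation and classification stages returned these regions.
meal = [
    FoodSegment("rice", 180),
    FoodSegment("chicken", 120),
    FoodSegment("broccoli", 90),
]
print(f"Estimated meal total: {estimate_calories(meal):.0f} kcal")
```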
They found that with a limited set of food types (fifteen in their tests), the app identified foods with 85% accuracy. When that set was expanded, however, the app did not perform as well.
They do note that it may be possible to improve the system by having users “train” the app early on; if the app can learn the user’s typical diet, its accuracy could improve.
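One plausible way such personalization could work is to bias the classifier toward foods the user logs most often. The sketch below is a hypothetical illustration of that idea, not the approach described in the paper; the weighting scheme and numbers are assumptions made for this example.

```python
# Hypothetical personalization step: blend the model's raw confidence scores
# with how frequently the user has logged each food in the past.

from collections import Counter

def personalize_scores(raw_scores: dict[str, float],
                       user_history: Counter,
                       weight: float = 0.5) -> dict[str, float]:
    """Re-weight classifier scores by the user's historical eating frequency."""
    total_logged = sum(user_history.values()) or 1
    return {
        food: (1 - weight) * score + weight * (user_history[food] / total_logged)
        for food, score in raw_scores.items()
    }

# Example: the model is torn between two visually similar foods, but the
# user's log history tips the decision toward their usual choice.
raw = {"white rice": 0.48, "mashed potato": 0.52}
history = Counter({"white rice": 30, "mashed potato": 2})
adjusted = personalize_scores(raw, history)
print(max(adjusted, key=adjusted.get))  # -> "white rice"
```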
In some ways, the ability to automatically detect nutritional information from a picture is the “holy grail” of diet apps. It would make diet tracking incredibly easy. However, this study highlights the current challenges and limitations of available technology. Further work is certainly needed, but it’s a goal worth working towards given the scope of the problem it seeks to address.