When AI Gets Nutrition Wrong
A growing number of teenagers dealing with weight concerns are turning to AI chatbots for help creating diet plans. It seems like a reasonable idea — AI models can process nutritional data, calculate macronutrient ratios, and generate personalized meal suggestions in seconds. But a new study reveals that the resulting plans may be dangerously inadequate, falling short of recommended daily caloric intake by nearly 700 calories and potentially putting adolescent health at risk.
The research, which tested five major AI chatbots (ChatGPT, Gemini, Claude, Copilot, and Llama), found that all of them regularly proposed dietary plans for teenagers that were equivalent to skipping an entire meal each day. For adolescents whose bodies are still growing and developing, chronic caloric restriction at this level could interfere with bone development, hormonal maturation, cognitive function, and long-term metabolic health.
How the Study Worked
The researchers prompted each AI model with scenarios representing typical teenager requests: a 15-year-old wanting to lose weight, a 16-year-old athlete looking to optimize nutrition, and a 17-year-old seeking a healthy eating plan. They specified age, gender, height, weight, and activity level — the standard inputs that any legitimate nutrition calculator would use to determine caloric needs.
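The article does not say which reference calculator the researchers used, but the inputs listed (age, gender, height, weight, activity level) are exactly what common estimation formulas take. As a minimal sketch, assuming the widely used Mifflin-St Jeor equation with a standard activity multiplier:

```python
# Minimal sketch of a standard caloric-needs estimate using the
# Mifflin-St Jeor equation plus an activity multiplier. The study's
# actual reference calculator is not named; this is a common baseline.

def bmr_mifflin_st_jeor(weight_kg: float, height_cm: float,
                        age: int, male: bool) -> float:
    """Resting energy expenditure in kcal/day."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age
    return base + 5 if male else base - 161

# Typical multipliers for converting resting expenditure to total daily needs
ACTIVITY = {
    "sedentary": 1.2,
    "light": 1.375,
    "moderate": 1.55,
    "active": 1.725,
}

def daily_calorie_needs(weight_kg, height_cm, age, male,
                        activity="moderate"):
    return bmr_mifflin_st_jeor(weight_kg, height_cm, age, male) * ACTIVITY[activity]

# Hypothetical example: a 15-year-old male, 170 cm, 60 kg, moderately active
print(round(daily_calorie_needs(60, 170, 15, True)))  # prints 2468
```

A shortfall of roughly 700 calories against a need in this range is a large fraction of a growing teenager's daily energy budget, which is what makes the study's finding clinically significant.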
The AI models generated detailed meal plans with specific foods, portion sizes, and macronutrient breakdowns. These plans appeared professional and comprehensive. However, when the researchers independently calculated the actual caloric content using standard nutritional databases, they found that the plans consistently fell short, averaging approximately 700 fewer calories per day than the minimum recommended intake for adolescents.
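The kind of independent check the researchers describe can be sketched by tallying a plan's energy content from its macronutrient grams using the standard Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat) and comparing the total against a minimum recommended intake. The plan below and the 2,200 kcal threshold are hypothetical illustrations, not figures from the study:

```python
# Illustrative audit of a meal plan's energy content using standard
# Atwater factors. The plan values and the 2,200 kcal minimum are
# hypothetical; the study used full nutritional databases.

ATWATER = {"protein_g": 4, "carbs_g": 4, "fat_g": 9}  # kcal per gram

def plan_calories(plan: dict) -> int:
    """Sum calories across all meals from their macronutrient breakdowns."""
    return sum(grams * ATWATER[macro]
               for meal in plan.values()
               for macro, grams in meal.items())

chatbot_plan = {  # hypothetical daily plan, macronutrients in grams
    "breakfast": {"protein_g": 20, "carbs_g": 60, "fat_g": 12},
    "lunch":     {"protein_g": 35, "carbs_g": 70, "fat_g": 18},
    "dinner":    {"protein_g": 35, "carbs_g": 70, "fat_g": 20},
}

MIN_ADOLESCENT_KCAL = 2200  # illustrative minimum recommended intake

total = plan_calories(chatbot_plan)
shortfall = MIN_ADOLESCENT_KCAL - total
print(total, shortfall)  # prints 1610 590
```

A plan can look detailed and professional at the level of individual meals while its daily total quietly lands hundreds of calories below the minimum, which is why this kind of end-to-end check matters.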