Digital technologies will redefine how the food industry measures taste and consumer behaviour
Artificial intelligence, extended reality, biometrics and digital sensors are transforming traditional evaluation methods in the food industry and accelerating product development, while introducing new questions about data quality, technological bias and research validity, according to a new analysis arguing that these tools will determine how companies, regulators and laboratories understand the modern consumer in the decades ahead.
The findings of the study, titled "Integrating Cutting-Edge Technologies in Food Sensory and Consumer Science: Applications and Future Directions", are published in the journal Foods. The authors examine how a rapidly expanding set of digital and computational tools is rewriting the rules of sensory testing, long considered the scientific backbone of food development. Their review tracks advances across artificial intelligence, machine learning, natural language processing, molecular modelling, virtual and augmented reality, biometrics and digital sensing, arguing that these technologies are converging into an integrated analytical ecosystem that will fundamentally change how taste and consumer behaviour are studied.
AI and machine learning create predictive sensory models while raising concerns about bias and overfitting
The study finds that AI has become one of the most influential new forces in sensory and consumer science. Machine learning models are now capable of predicting flavour profiles, texture properties, aroma intensity and consumer liking scores with high levels of accuracy across a wide range of foods, including wine, meat, fermented beverages and mineral water. Researchers have developed algorithms that detect sensory defects, classify quality, forecast shelf stability and identify consumer preference patterns using both chemical input data and human evaluation records.
According to the review, the most robust findings come from models that integrate multiple types of data, such as volatile compounds, physicochemical properties and descriptive sensory analysis scores. Some reported models reach accuracy levels approaching perfect classification for narrow tasks, demonstrating how machine learning can complement or, in some cases, partially substitute human sensory panels. However, the authors note that model performance declines when datasets are unbalanced, when categories lack variation or when models are tested in real-world contexts that differ from the training conditions.
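As an illustration of how such multi-source models are typically built, the minimal Python sketch below combines volatile-compound, physicochemical and descriptive-panel features to predict average consumer liking. The dataset, column names and model choice are assumptions for illustration, not details taken from the review.

```python
# Minimal sketch, not the authors' implementation: predicting mean consumer
# liking from combined instrumental and descriptive-panel data. The CSV file
# and all column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("wine_samples.csv")  # one row per product sample (hypothetical)

volatile_cols = ["ethyl_acetate", "isoamyl_acetate", "hexanol"]  # GC-MS volatiles
physico_cols = ["ph", "titratable_acidity", "residual_sugar"]    # physicochemical measures
panel_cols = ["fruity_intensity", "astringency", "sweetness"]    # trained-panel descriptive scores

X = df[volatile_cols + physico_cols + panel_cols]
y = df["mean_consumer_liking"]  # e.g. average 9-point hedonic rating

model = RandomForestRegressor(n_estimators=500, random_state=0)

# Cross-validation gives a more honest performance estimate than training
# accuracy, which matters with the small sample sets common in sensory work.
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f} ± {scores.std():.2f}")
```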
A related challenge identified in the paper is overfitting, where an algorithm performs exceptionally well on training data but fails to generalize. The authors highlight this as a recurring issue in food-focused AI research, particularly when sample sizes are small or when researchers rely on chemical data that may not correlate consistently with sensory experience. Another difficulty is domain drift; food products evolve during processing, distribution and storage, which causes input data to shift over time, degrading the accuracy of static models.
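A simple way to surface this problem, again assuming the hypothetical dataset sketched above, is to compare performance on the training data with performance on held-out samples; a large gap between the two is the classic signature of overfitting.

```python
# Illustrative overfitting check (assumes the hypothetical, all-numeric
# "wine_samples.csv" from the previous sketch).
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("wine_samples.csv")
X = df.drop(columns=["mean_consumer_liking"])
y = df["mean_consumer_liking"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)

train_r2 = r2_score(y_train, model.predict(X_train))
test_r2 = r2_score(y_test, model.predict(X_test))

# A train R^2 near 1.0 alongside a collapsing test R^2 indicates memorization
# rather than generalization; domain drift shows up as a further drop when the
# test data come from later storage or processing stages.
print(f"train R^2 = {train_r2:.2f}, test R^2 = {test_r2:.2f}")
```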
The review also examines the growing use of natural language processing and large-language-model techniques to analyse consumer reviews, social media posts and free-text comments from sensory panels. These tools allow researchers to extract sentiment, identify emerging flavour trends and detect hidden drivers of liking or disliking that traditional numerical scales may overlook. At the same time, the authors warn that linguistic bias, translation issues and cultural variation can lead to misinterpretation if models are not carefully calibrated. They argue that AI-driven tools should assist, rather than replace, expert interpretation in consumer science.
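A minimal sketch of this kind of text mining, using NLTK's off-the-shelf VADER sentiment lexicon as a stand-in (the review does not prescribe any particular toolkit), is shown below. Because VADER is tuned to English social-media language, it also illustrates the calibration caveat: scores for other languages or cultural idioms would not be trustworthy.

```python
# Sketch only: scoring sentiment in free-text consumer comments with NLTK's
# VADER lexicon. The comments are invented examples.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

comments = [
    "Loved the fruity aroma, but the aftertaste was slightly bitter.",
    "Way too salty for me, would not buy again.",
    "Perfectly balanced, my new favourite snack.",
]

for text in comments:
    scores = sia.polarity_scores(text)  # neg/neu/pos components plus a compound score in [-1, 1]
    print(f"{scores['compound']:+.2f}  {text}")
```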
The analysis also explores how AI has entered molecular-level flavour research. Machine learning, combined with molecular docking and molecular dynamics simulations, can rapidly screen taste-active peptides, predict their binding affinity to receptor proteins and estimate their potential impact on umami, saltiness or bitterness. These integrated approaches have allowed researchers to expand candidate lists faster than conventional laboratory techniques, speeding the discovery of flavour-enhancing compounds. Yet, the authors note that computational predictions still require sensory validation and that physical experiments remain essential for confirming theoretical outcomes.
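The screening step can be caricatured in a few lines of Python: candidate peptides are converted into simple composition features and ranked by a classifier trained on known examples, with the top candidates then passed on to docking studies and sensory validation. Everything in the sketch below, from the sequences to the labels, is a made-up placeholder rather than data from the review.

```python
# Toy QSAR-style ranking of candidate taste-active peptides. Sequences and
# activity labels are invented; real workflows add docking scores against
# taste receptors and, ultimately, sensory confirmation.
from collections import Counter
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """Fraction of each of the 20 standard amino acids in the peptide."""
    counts = Counter(seq)
    return [counts.get(aa, 0) / len(seq) for aa in AMINO_ACIDS]

# Hypothetical training set: (peptide sequence, 1 if umami-active else 0).
train = [("DEEL", 1), ("EDEGEQPRPF", 1), ("AGFK", 0), ("LLVV", 0), ("EEGS", 1), ("PKAV", 0)]
X = [composition(seq) for seq, _ in train]
y = [label for _, label in train]

clf = LogisticRegression().fit(X, y)

# Rank unseen candidates by predicted probability of activity.
candidates = ["EEAK", "LVPK", "DESG"]
ranked = sorted(candidates, key=lambda s: clf.predict_proba([composition(s)])[0, 1], reverse=True)
print(ranked)
```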
Extended reality brings real-world context into sensory labs but introduces new methodological challenges
The review shows that virtual reality, augmented reality and mixed reality are becoming powerful tools for creating immersive sensory-testing environments. Through headsets, projections or digitally enhanced rooms, researchers can simulate cafés, restaurants, bars, outdoor markets or home kitchens, providing consumers with contextual cues that more closely resemble real consumption experiences. The study notes that these environments can alter expectations, emotional engagement and willingness to try new products, affecting sensory perception in measurable ways.
Extended reality has gained traction because traditional sensory booths impose highly controlled but artificial conditions that strip away environmental stimuli known to influence taste and preference. By integrating ambient sound, lighting, visual scenes and even simulated social settings, extended reality aims to combine ecological validity with experimental control. The authors point out that immersive environments can also be more cost-effective than in-person tests with large sample sizes, especially when researchers need to evaluate products in multiple contextual settings.
However, the study stresses that extended reality introduces its own set of practical and methodological concerns. Cybersickness is one of the most significant barriers, with symptoms such as dizziness, nausea and eye strain affecting data quality and participant comfort. The review also warns that novelty effects may distort results, as participants unfamiliar with immersive environments may focus more on the technology itself than on the products being tested. Ensuring high presence scores, an indicator of how realistic the simulation feels, while keeping simulator sickness scores low is essential for valid results.
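In practice this often becomes a screening step, as in the hypothetical example below, where sessions with low presence scores or high simulator-sickness scores are excluded before the sensory data are analysed; the scales and thresholds shown are arbitrary illustrations, not values from the study.

```python
# Hypothetical post-session screening for an immersive study.
import pandas as pd

sessions = pd.DataFrame({
    "participant": ["P01", "P02", "P03", "P04"],
    "presence_score": [5.8, 3.1, 6.2, 5.5],  # e.g. 1-7 scale, higher = more immersed
    "ssq_total": [8.0, 42.5, 12.3, 55.1],    # simulator sickness questionnaire, higher = worse
})

PRESENCE_MIN, SSQ_MAX = 4.5, 40.0  # arbitrary example cut-offs
valid = sessions[(sessions.presence_score >= PRESENCE_MIN) & (sessions.ssq_total <= SSQ_MAX)]
print(valid.participant.tolist())  # only these sessions enter the sensory analysis
```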
Furthermore, extended reality requires careful calibration. Variables such as frame rate, headset resolution, ambient sound fidelity and scene design can change participants’ emotional states and attention levels. These, in turn, may influence flavour perception, adding complexity to result interpretation. The authors argue that extended reality experiments must be standardized across hardware platforms and testing protocols to ensure reliability and comparability across studies.
Despite these challenges, the review concludes that extended reality offers a promising path forward for studying how real-world environments influence food experiences. As immersive technologies improve and become more affordable, they are likely to play a central role in next-generation sensory labs.
Biometrics and digital sensors redefine consumer insight through continuous, objective measurement
The study devotes significant attention to biometric technologies that capture non-conscious responses to food. Tools such as electroencephalography, functional near-infrared spectroscopy, functional magnetic resonance imaging, skin conductance sensors and heart-rate monitors allow researchers to quantify emotional reactions, arousal, cognitive load and neural processing associated with taste and aroma perception. Eye tracking and facial expression analysis add further detail by revealing where consumers direct their attention and how they react to packaging, colour or visual cues.
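How such signals become usable measures can be sketched for one of the simpler channels: the toy example below reduces a synthetic skin-conductance trace to a count of response peaks during a one-minute tasting window. Real studies use dedicated biosignal software with artifact correction; the sampling rate, signal and threshold here are assumptions for illustration.

```python
# Toy reduction of a synthetic skin-conductance trace to a crude arousal indicator.
import numpy as np
from scipy.signal import find_peaks

fs = 32                       # Hz, an assumed EDA sampling rate
t = np.arange(0, 60, 1 / fs)  # one-minute tasting window

# Synthetic electrodermal activity: slow tonic drift plus two phasic bursts.
eda = 2.0 + 0.01 * t
eda += 0.3 * np.exp(-((t - 12) ** 2) / 2) + 0.4 * np.exp(-((t - 35) ** 2) / 3)
eda += np.random.default_rng(0).normal(0, 0.01, t.size)

# Peaks with sufficient prominence are counted as skin conductance responses.
peaks, _ = find_peaks(eda, prominence=0.05)
print(f"SCR-like events in window: {len(peaks)}")
```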
The authors describe these biometric tools as essential for understanding aspects of consumer behaviour that self-reported liking scores cannot fully capture. Physiological signals often reveal subtle differences between expected and actual flavour experiences or between products that consumers consciously rate as similar. Biometrics also help identify the influence of factors such as stress, hunger, satiety or sensory fatigue, enabling more precise interpretation of taste-test results.
The review also examines ethical issues linked to biometric data. Physiological and affective measurements are deeply personal, raising questions about privacy, informed consent and data storage. The authors argue that researchers must implement strict safeguards and clearly communicate how data will be used, especially as machine-learning models begin to combine biometric and behavioural inputs in predictive frameworks.
Parallel to biometric monitoring, digital sensors, including electronic noses and tongues, hyperspectral imaging, near-infrared spectroscopy and automated IoT devices, are expanding the scope of objective food measurement. These tools can detect volatile compounds, monitor product quality in real time and link chemical fingerprints to sensory attributes. When combined with AI-based models, digital sensors help create closed-loop systems that continuously update predictions as food products evolve during processing or storage.
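One way to picture such a closed loop, under the assumption of an incremental regression model fed by streaming electronic-nose readings and periodic reference quality scores, is the sketch below; the sensor channels, quality targets and data are all simulated stand-ins rather than anything specified in the review.

```python
# Sketch of a closed-loop quality predictor that updates as new sensor
# batches arrive during storage. All data are simulated.
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
scaler = StandardScaler()
model = SGDRegressor(random_state=1)

def read_enose():
    """Stand-in for a batch of IoT electronic-nose readings (8 channels)."""
    return rng.normal(size=(16, 8))

def reference_quality(x):
    """Stand-in for lab or panel quality scores used to correct the model."""
    return x @ np.linspace(0.5, 1.2, 8) + rng.normal(0, 0.1, len(x))

for batch in range(10):                      # each batch = a new storage time point
    X = read_enose()
    y = reference_quality(X)
    Xs = scaler.partial_fit(X).transform(X)  # adapt scaling to drifting signals
    model.partial_fit(Xs, y)                 # update the model without full retraining
    print(f"batch {batch}: predicted quality ≈ {model.predict(Xs).mean():.2f}")
```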
First published in: Devdiscourse

