Understanding multisensory food perception through the internet

People working on the project: Carlos Velasco, Charles Spence, Olivia Petit, Kasun T. Karunanayaka, and Adrian D. Cheok.


Our research here focuses on using the sensors and capabilities of mobile technologies to understand how people perceive and interact with food. In particular, we are interested in designing mobile applications that allow us to gather information in context. For example, it is now common to take pictures of our food, yet it is less common to also record the sonic environment in which the picture is taken, or people's expectations and perception of the food. By combining these variables, it may be possible to understand how sonic cues, plating (and its corresponding graphical parameters), and people's subjective experience of the food interact.
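
To make the idea concrete, here is a minimal sketch (in Python) of the kind of record such an in-context data-gathering app might log for each meal. All field names, rating scales, and plating descriptors here are hypothetical illustrations, not the project's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class FoodPerceptionSample:
    """One in-context observation: a food photo plus the sonic
    environment and the eater's subjective ratings (hypothetical schema)."""
    photo_path: str                # picture of the plated food
    ambient_audio_path: str        # short recording of the sonic environment
    timestamp: datetime = field(default_factory=datetime.now)
    # Hypothetical subjective measures on a 1-9 scale
    expected_liking: Optional[int] = None     # rated before tasting
    perceived_liking: Optional[int] = None    # rated after tasting
    # Hypothetical plating descriptors extracted from the photo
    dominant_colour: Optional[str] = None
    plating_symmetry: Optional[float] = None  # 0 (asymmetric) to 1 (symmetric)

def is_complete(sample: FoodPerceptionSample) -> bool:
    """A sample is analysable once both subjective ratings are in."""
    return (sample.expected_liking is not None
            and sample.perceived_liking is not None)

# Example: one record as a mobile app might log it
sample = FoodPerceptionSample(
    photo_path="meals/2015-12-10_lunch.jpg",
    ambient_audio_path="meals/2015-12-10_lunch.wav",
    expected_liking=7,
    perceived_liking=6,
    dominant_colour="green",
    plating_symmetry=0.8,
)
print(is_complete(sample))  # True
```

Keeping the photo, the ambient audio, and the before/after ratings in a single record is what would let the sonic, visual, and subjective variables described above be analysed jointly rather than in isolation.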