Teaching machines how to smell


By Aishwarya Badiger

Ever wished you knew what the food at a restaurant tasted like before choosing a place to eat on a Friday evening? If you are like me, you probably spend about an hour trying to make sense of Google reviews and eventually end up choosing a tried and “tasted” restaurant. Right now, asking my digital assistant about a restaurant goes something like this: “Hey Google, what does the falafel at Brassica taste like?”, and I get the same response every time: “Sorry, I don’t understand.” Digitizing the senses of smell and taste remains an unsolved challenge, yet artificial intelligence and machine learning have allowed scientists to make huge strides in this effort.

Our complex sense of smell:

A lot of what we taste depends on our sense of smell; so much so that people suffering from anosmia (the inability to smell) tend to have a hard time appreciating food and often lose their appetite [1]. When we eat a banana, what we are actually smelling is hundreds of small molecules that drift up into our nasal cavity and bind to olfactory receptors (ORs), allowing these receptors to communicate with our brain and associate the smell with “banana”. While our vision relies on just three photoreceptor types (tuned to red, green, and blue light), our olfactory space is vast, with around 400 types of olfactory receptors [2]. This is what makes digitizing smell so much more complex than digitizing vision. So why not use an algorithm inspired by neuronal connections [3,4] to model how the olfactory system turns molecules into smells?

How would it work?

In 2019, the Google Research Brain team developed an algorithm that predicts the smell of a compound from its molecular structure [5]. While the facial recognition feature on your phone can break an image down into its smallest components (i.e., pixels), the equivalent smallest unit for a 3D odorant molecule was not as well defined. To approach this issue, Google created the pixel equivalent of an odorant molecule using graph neural networks (GNNs). In simple terms, a neural network is a system designed to imitate the way neurons are activated in our brain. In a GNN, each atom of an odorant compound is a node that encodes information from neighboring atoms, and vectors describe the interactions between these nodes. To train their model, Google used a database of ~5,000 common flavor molecules, each assigned odor descriptors (such as “buttery” for 2,3-butanedione). The graph for each molecule was treated as the input layer of the neural network. Adjusting the parameters of a neural network trains the model to correctly recognize odors, just as you would tune a guitar to get the desired musical scale. This is done through multiple GNN layers (steps of transformation), and with each layer the “resolution” (the ability to distinguish the smell percept of one odorant molecule from another) keeps getting better.
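For the curious, the message-passing idea above can be sketched in a few lines of code. This is a toy illustration under invented assumptions, not Google's actual model: the atom features, layer sizes, random weights, and odor labels are all made up for the example, and the molecule is the “buttery” 2,3-butanedione mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy molecule: 2,3-butanedione (heavy atoms only).
# Structure: C1-C2(=O)-C3(=O)-C4; nodes are atoms, edges are bonds.
atoms = ["C", "C", "O", "C", "O", "C"]
bonds = [(0, 1), (1, 2), (1, 3), (3, 4), (3, 5)]

n = len(atoms)
adj = np.zeros((n, n))
for i, j in bonds:
    adj[i, j] = adj[j, i] = 1.0

# One-hot initial node features: is the atom carbon or oxygen?
feat = np.array([[1, 0] if a == "C" else [0, 1] for a in atoms], float)

def gnn_layer(h, adj, W):
    """One message-passing step: each atom averages its bonded
    neighbors' vectors, adds its own, then applies a learned
    transform and a ReLU nonlinearity."""
    deg = adj.sum(axis=1, keepdims=True)
    msg = adj @ h / np.maximum(deg, 1.0)   # mean over neighbors
    return np.maximum(0.0, (h + msg) @ W)

# Stack layers (the "resolution" improves with depth), then pool all
# atom vectors into one molecule vector and score odor descriptors.
dims = [2, 8, 8]
hs = feat
for d_in, d_out in zip(dims[:-1], dims[1:]):
    W = rng.normal(scale=0.5, size=(d_in, d_out))  # untrained weights
    hs = gnn_layer(hs, adj, W)

mol_vec = hs.mean(axis=0)                  # graph-level embedding
odor_labels = ["buttery", "fruity", "floral"]
W_out = rng.normal(scale=0.5, size=(len(mol_vec), len(odor_labels)))
logits = mol_vec @ W_out
probs = np.exp(logits) / np.exp(logits).sum()  # softmax over labels
print(dict(zip(odor_labels, probs.round(3))))
```

In the real system, the weights would be fitted to the ~5,000 labeled flavor molecules so that the predicted descriptor probabilities match the assigned odor labels; here they are random, so the output is meaningless except as a demonstration of the data flow.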

Source: Sanchez-Lengeling, B. et al. (2019) Machine learning for scent: Learning generalizable perceptual representations of small molecules [5]

By looking at a molecular structure, Google’s model can predict what a particular odorant will smell like. Furthermore, it has enabled them to identify molecular neighbors with similar smell percepts. This can be especially useful for substituting rare or high-priced molecules, such as vanillin (a key molecule in vanilla flavor), with more readily accessible compounds without compromising on flavor.
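Finding such “molecular neighbors” amounts to a nearest-neighbor search in the model's learned embedding space. A minimal sketch, where the embedding vectors and molecule names are invented stand-ins for what a trained model would actually produce:

```python
import numpy as np

# Hypothetical learned embeddings (the real ones would come out of a
# trained GNN); these 3-dimensional values are purely illustrative.
embeddings = {
    "vanillin":       np.array([0.90, 0.10, 0.30]),
    "ethyl vanillin": np.array([0.88, 0.15, 0.28]),
    "limonene":       np.array([0.10, 0.90, 0.20]),
    "linalool":       np.array([0.20, 0.80, 0.50]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest(query, embeddings):
    """Rank every other molecule by similarity to the query."""
    q = embeddings[query]
    others = [(name, cosine(q, v))
              for name, v in embeddings.items() if name != query]
    return sorted(others, key=lambda t: -t[1])

print(nearest("vanillin", embeddings))
# ethyl vanillin ranks first: its vector sits closest to vanillin's,
# mirroring how a cheaper substitute with a similar percept is found.
```

The design choice here (cosine similarity over raw distance) is one common convention for comparing embeddings; any metric that respects the learned geometry would serve the same substitution-search purpose.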

What’s next?

Looking forward, several challenges remain to be tackled. Going back to the banana example, hundreds of molecules make up the familiar banana odor. While scientists have been successful in elucidating QSORs (quantitative structure-odor relationships) for individual compounds, how compounds behave in the presence of each other is not well understood. Moreover, what one person perceives as a “pine” odor may be described as “floral” by another, which could partly be driven by differences in receptor function from person to person [6]. Nevertheless, this was the first step toward digitizing smell. Physically recreating smells is a whole other challenge. One futuristic idea that seems straight out of a Black Mirror episode is sending code to an implantable Neuralink brain chip [7]. Such a chip could activate specific olfactory receptor neurons and make a person perceive an odor without physically inhaling any odorant molecules. Could that someday enable people with anosmia to smell the world?

A whiff of the future:

There’s more than just convenience to digitizing smell. Machine learning has begun to help scientists understand how a particular molecule is transformed into a smell percept in our brain. It would also allow us to reverse engineer smells: to design molecules that might be cheaper, safer, or more sustainable for food and fragrance applications. Dogs are capable of sniffing out a wide array of odors, from disease markers to the human scents that help catch criminals. Although I’m sure detectives love spending time with their furry buddies, training is a time- and cost-intensive process, and that’s where an AI capable of recognizing scents could begin to replace trained sniffing dogs. Just a few years ago, we didn’t know what Face ID on our phones would look like. The possibilities of using machine learning to understand our very complex sense of smell are limited only by our imagination. Could we, in the future, digitally share scents the way we share images? Imagine using Google Maps Street View for a restaurant and being able to smell a menu item. Personally, I can’t wait for a time when my digital assistant will recreate scents and take me back to my favorite places.

References

  1. Burges Watson, D. L. et al. Altered smell and taste: anosmia, parosmia and the impact of long Covid-19. medRxiv 1–19 (2020). doi:10.1101/2020.11.26.20239152
  2. Purves, D. et al. Neuroscience. Sinauer Associates Inc. (2004).
  3. Adan, Y. Do neural networks really work like neurons? Medium. Available at: https://medium.com/swlh/do-neural-networks-really-work-like-neurons-667859dbfb4f. (Accessed: 12th March 2021)
  4. Liljeqvist, I. The essence of artificial neural networks. Medium. Available at: https://medium.com/@ivanliljeqvist/the-essence-of-artificial-neural-networks-5de300c995d6. (Accessed: 12th March 2021)
  5. Sanchez-Lengeling, B. et al. Machine learning for scent: Learning generalizable perceptual representations of small molecules. arXiv (2019).
  6. Monell Chemical Senses Center. Do you smell what I smell? From genes to receptors to perception: Olfaction unraveled. ScienceDaily (2019). Available at: https://www.sciencedaily.com/releases/2019/04/190430164208.htm. (Accessed: 12th March 2021)
  7. Musk, E. & Neuralink. An integrated brain-machine interface platform with thousands of channels. bioRxiv (2019). doi:10.1101/703801

Aishwarya Badiger | LinkedIn

Guest SMF Blog Writer

Originally from India, Aishwarya is now a PhD candidate at The Ohio State University studying compounds generated in milk during storage and their sensory perception. The goal of her research is to improve techniques for depicting shelf life, as an alternative to the date labels known to promote food waste. She is passionate about building a sustainable food system and hopes to use her food science skills to continue this effort in the food industry after graduation. In her free time, she likes experimenting in the kitchen with new cuisines (if only someone else could clean), reading fiction, volunteering with food recovery, and occasionally painting.

Science Meets Food

The IFT Student Association (IFTSA) is a forward-looking, student-governed community of IFT members. Through competitions, scholarships, networking, and leadership opportunities, you’ll set yourself apart from your classmates (unless they’re members too).
