Robots can see at wavelengths beyond the human eye. They can hear at frequencies beyond human ears. They can even sense the approach of human skin before contact.
But when it comes to tasting, robots lag behind. Taste might seem like a fundamental sense to any human, including small children licking food off the floor, but not to robots. Current tasting technology does not come close to the multi-faceted sensitivity of the human tongue.
Improving this technology is an active area of research for both robot builders and food scientists. One idea: put the tongue on an arm that a robot can move around. Researchers at the University of Cambridge have done just that, testing a robotic arm's ability to taste egg dishes. They published their work in the journal Frontiers in Robotics and AI on May 4th.
The Cambridge group wasn’t new to the idea, having previously developed a robot that could make omelettes and hone its egg-making skills with human feedback. It fits seamlessly into a wave of robots that have started making their way into restaurants, typically doing routine tasks in the kitchen.
Consider Spyce, a Boston-area restaurant where diners could watch automated machines prepare bespoke bowls. The MIT engineers who founded Spyce dreamed of expanding it to a chain across the US East Coast. But those dreams were met with mixed reactions, and Spyce closed its doors earlier this year.
Even the most elementary cooking tasks can prove to be insurmountable hurdles for robots. A British startup is offering a range of robotic cooking arms that cost over $300,000 and can cook thousands of recipes – but it still needs human help to chop its vegetables.
Another thing robots can’t do – which many human chefs take for granted – is checking their progress by taste. “If robots are going to be used for certain aspects of food preparation, it’s important that they can ‘taste’ what they’re cooking,” said Grzegorz Sochacki, a Cambridge engineer and author of the study, in a press release.
This is a solvable problem because taste is a chemical process. Flavors are your brain’s interpretations of different molecules touching your tongue. Acids, for example, taste sour while their alkaline counterparts taste bitter. Certain amino acids impart a savory umami flavor, while salts like sodium chloride taste, well, salty. A chemical called capsaicin is responsible for the hot flavor in peppers.
For a number of years, researchers have been tinkering with so-called “electronic tongues,” devices that emulate this process by sensing these molecules and more. Some of these devices even look like human tongues. In previous research, they were used to taste orange juice.
But electronic tongues are a pale imitation of the organic kind. To taste anything remotely solid – even honey – you have to mix the food with water, and that water has to be pure to keep out unwanted molecules. Electronic tongues can rate cheese or a roast chicken dish, but first a human must liquefy the food. You will hardly find a chef who wants to wait 10 minutes to taste.
Even then, this process yields a one-off measurement that doesn't do justice to the food. Every foodie knows that taste is far more complex than a single chemical sample of liquefied food. Flavor changes over the course of a bite. Different spices land in different places. As you chew, saliva and digestive enzymes mix in, turning the bite increasingly mushy and shifting its flavor profile.
The Cambridge group hoped to tackle this problem head-on. Instead of a tongue-like tendril, they relocated the probe, specifically a salinity sensor, to a movable arm. This, the researchers hoped, would let the robot sample a dish at multiple points during preparation and build a "taste map" of the food.
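The "taste map" idea can be illustrated with a small sketch: a probe visits several points on a dish, records salinity at each, and the readings are binned into a coarse grid. The function name, grid layout, and sample values below are hypothetical illustrations, not the study's actual code or data.

```python
def build_taste_map(samples, grid_size=3, dish_size=1.0):
    """Average point readings (x, y, salinity) into a grid_size x grid_size map.

    Cells the probe never visited are left as None.
    """
    sums = [[0.0] * grid_size for _ in range(grid_size)]
    counts = [[0] * grid_size for _ in range(grid_size)]
    cell = dish_size / grid_size
    for x, y, salinity in samples:
        # Clamp to the last cell so points on the dish edge stay in range.
        i = min(int(y / cell), grid_size - 1)
        j = min(int(x / cell), grid_size - 1)
        sums[i][j] += salinity
        counts[i][j] += 1
    return [[sums[i][j] / counts[i][j] if counts[i][j] else None
             for j in range(grid_size)]
            for i in range(grid_size)]

# Hypothetical probe readings: saltier near one corner of the dish.
readings = [
    (0.10, 0.10, 0.8), (0.15, 0.12, 0.9),  # salty corner
    (0.50, 0.50, 0.3),                      # milder center
    (0.90, 0.90, 0.2),                      # bland far corner
]
salt_map = build_taste_map(readings)
```

Sampling at multiple points is the key design choice: a single probe dip gives one number, while a grid of readings captures the spatial unevenness that a chef tastes across a dish.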
“Having the robot control over movement and where and how it’s sensing” sets this work apart from earlier electronic tongues, says Josie Hughes, a roboticist at École Polytechnique Fédérale de Lausanne in Switzerland, who previously worked with the Cambridge group but was not an author of the new study.
To test the arm, the researchers made nine simple egg dishes, each with different amounts of salt and tomato. The arm recorded the salinity of each plate. Then the researchers ran each dish through a blender to see whether the arm could still detect the differences in salt levels once the egg and tomatoes had been blended into a salmon-colored mash, roughly mimicking what chewing does in a human mouth.
Using this new technique, the researchers were able to create salt maps that surpassed anything electronic tongues had done before. Of course, salt content is only one aspect of cooking. In the future, the researchers hope to expand to other flavors, such as sweetness or greasiness. And putting food in a blender isn’t quite the same as putting it in your mouth.
“We may see robotic ‘assistants’ in the kitchen in the near term,” says Hughes, “but we need more exciting and insightful advances like this work to be able to design robots that can taste, chop, mix, cook and do much more, like a true human chef.” The road to bringing a robot like this into a kitchen, whether in a restaurant or at home, may make culinary training look easy in comparison.