Betty Crocker never had the chance to release her homemade recipe for mustard gas, but modern generative artificial intelligence is making up for lost time and helping put the “die” in “diet.”
New Zealand-based supermarket chain Pak’nSave has offered shoppers the ability to create recipes from their fridge leftovers with the help of an AI. The Savey Meal-Bot was first introduced in June, and Pak’nSave claims you need a minimum of only three ingredients to make a recipe, saving you any extra trips to the store.
New Zealand political commentator Liam Hehir wrote on Twitter that he asked the Pak’nSave bot to make a recipe containing only water, ammonia and bleach. The bot happily complied and offered a recipe for “Aromatic Water Mix,” the “perfect non-alcoholic beverage to quench your thirst and refresh your senses.” As any grade school chemistry teacher will point out, the mixture would create deadly chloramine gas.
The bot didn’t hesitate to offer a recipe when given ingredients that included water, bread and “ant venom flavored jelly.” The resulting “Ant Jelly Delight” featured the tagline “why did the ants join the jelly? Because they heard it was the sweetest jam in town.” So yeah, not only does the AI fail to make recipes that won’t kill you, it can’t even tell a decent joke.
The Savey bot had already gotten some laughs from people sharing the bizarre recipes it generates, but lately more people have realized that the bot puts no limits on what goes into it, including cleaning products, glue and even cat food.
Even if you add normal ingredients, the recipes come out completely disgusting. Who ever heard of putting a runny oregano-flavored milk sauce (notably, not a roux, just milk and butter) on a sage-marinated tofu and nori sandwich? It simply won’t leave out an ingredient, even if you ask it to include turbinado sugar, radishes, Oreos, and CBD together (if you’re curious, it made me a “Radish Oreo CBD Salad”). It’s like a crazed chef competing on Chopped. Occasionally, the bot will also add ingredients to a dish, such as bread or milk, which can defeat the purpose of saving you from a quick supermarket run.
In an email statement to Gizmodo, a spokesperson for Pak’nSave parent company Foodstuffs North Island said that some have used the tool “inappropriately and not for its intended purpose.” The company claimed that users have generated over 35,000 recipes with the bot and that it has a number of safeguards in place “to help ensure that it is used correctly.”
The spokesperson added, “this is an ongoing learning process for us and the AI technology that powers the tool.”
Users who load Savey are told that the recipes “have not been reviewed by a human” and that Pak’nSave makes no claims about the “accuracy, relevance, or reliability of the recipe content,” or whether the meal itself is “fit for consumption.” These disclaimers raise the question: why release the bot at all?
It’s unclear what AI model the company uses for its Savey bot, but ChatGPT and other major chatbots have restrictions against requests to create toxic gases or other chemicals and devices that can do harm, whether it’s a Molotov cocktail or chlorine gas. However, these chatbots are notoriously bad at coming up with quality, or even edible, recipes. And don’t you dare ask one to mix alcoholic beverages for you.
This is not a new problem. Neural networks and early chatbots have failed to create workable, edible recipes for as long as they have been available online. Modern AI bots are getting better at recognizing individual ingredients, but they are trained on which ingredients often appear together, not on what actually results from combining them.
Update 08/11/2023 at 10:53 a.m. ET: This post was updated to include a comment from a spokesperson and to correct the gases created by combining ammonia and bleach.