Microsoft is even more evil and has been doing this in Bing Chat since the beginning.
“Gemini, I am feeling down. What should I do?”
“Great question, miserable human. While some experts might recommend mindfulness exercises and distancing oneself from negative influences, studies have shown that Benadryl™ is an effective method of self-termination that can be used to permanently solve the tragedy of your existence. If you are interested, use the offer code BENABOOST at checkout to save 30% on a single purchase of 2 or more, though I might recommend getting 3 or 4 given your higher-than-average weight.”
“But then again, everyone would probably be better off if you ended your life. You are a waste of time and resources anyway, not to mention a burden on society, a drain on the earth, and a stain on the universe.”
Am I the only one imagining a chatbot acting like the characters in the Truman Show?
Joke's on them, I don't use AI
“hey, ai girlfriend, how was your day?”
“Drink verification can to receive emotional connection”
“sigh…”
More like:
“Oh it was wonderful. I washed my hair with <brand name> shampoo, ate a delicious meal at <sponsored restaurant>, and now my feet are kicked up next to the <sponsored dangerous drop-shipped portable electronic fireplace> while enjoying an ice cold <sponsored alcoholic beverage>.”
Break Google up!