To create a mimic of a person, you must first destroy their privacy.
After an AI has devoured everything they've ever written or said on video, it will mimic that person very well, but it will most likely still be the legal property of the company that made it.
In a situation like that, you'd then have to pay a subscription to interact with the mimic (because god forbid anything actually gets sold to you outright nowadays).
Now imagine having to pay to talk with the ghost of your loved one: a chatbot that sometimes lets you forget the actual person is gone, and makes every moment where that illusion breaks all the more painful. A chatbot that denies you grief and traps you in a hell where you can talk with the person you lost but never touch them, never feel them, never see them grow (or you could pay extra for the chatbot to attend new skill classes you could talk about :)).
It would make grieving impossible and constantly take advantage of those who "just want to say goodbye". Grief is already hard as it is; widespread mimicry of our dead would make it psychological torture.
For a prediction of our future, watch the fun sci-fi show Black Mirror, specifically the episode titled "Be Right Back" (the entire series is fully episodic, so you don't need to watch from the start).
If someone came to a service provider and wanted it, provided media to train on, and agreed to whatever costs are involved, isn't that entirely their business?
This is not about wanting; this is about companies taking advantage of vulnerable people who should be grieving. That can cause lasting psychological harm.
You might as well be saying: if someone came to a drug maker, wanted some heroin, provided the ingredients for heroin, and agreed to whatever costs were involved, isn't that entirely their business?
I'm seeing a lot of reasons why you, or I, would not want such a service to exist.
What a person should or should not be doing is their own business. Companies that target vulnerable people would ideally be regulated.
I'd much rather first go after payday advance companies with exorbitant fees, or casinos, or high-interest loans that individuals can't be expected to repay.
How the fuck could this be illegal?
wow, so many reasons