- cross-posted to:
- [email protected]
cross-posted from: https://sh.itjust.works/post/1062067
In a similar case, the US National Eating Disorders Association laid off its entire helpline staff. Soon after, its replacement chatbot was disabled for giving out harmful information.
Yeah, the Bing chatbot in particular is too cute for a job like this; it even added a 😅 later on in our little chat. Though maybe the CEO should have taken some advice from it in this case.