- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
ChatGPT is leaking passwords from private conversations of its users, Ars reader says | Names of unpublished research papers, presentations, and PHP scripts also leaked.
It would have had to have been trained on their passwords and shit for this to be even possible. It can't even remember its own story points it gave me for a DnD session within the same chat. No way is it spitting out passwords fed to it from one user to another, because it's not storing them.
Wow, never realized we had such a weird grammatical construction. What tense is that even called?
It's not a single tense (would have - past conditional, had to - past modal, have been - pluperfect); it's a hypothetical past state being caused by a hypothetical past event, but the trick here is that the "past state" is omitted because it's understood from context. If you were giving full context it'd read: "If it was spitting out sensitive information, it would have had to have been trained on it."
Take that, ESL learners!
I can make it even worse, because the first part of this should be in the subjunctive mood. So it should be:
“If it were spitting out sensitive information, it would have had to have been trained on it.”