The developer of WormGPT is selling access to the chatbot, which can help hackers create malware and phishing attacks, according to email security provider SlashNext.
WormGPT Is a ChatGPT Alternative With ‘No Ethical Boundaries or Limitations’
Not joking, actually. The problem with jailbreak prompts is that they can get your account banned. I’ve already had one banned myself. And eventually you can no longer use your phone number to create a new account.
Oh damn, I didn’t know that. Guess I’d better be careful then.