CHATGPT PROMPTS AND JAILBREAKS 1000+ (GITHUB)
by LilFrag - Sunday April 7, 2024 at 04:16 PM
thanks bro, i will check
seems interesting, will test out

cooool nice one boyyy
will test, let's see
ok, good one, hurrah!!!!
nice one, gotta try this
thanks a lot, dude!
I'm wondering if these prompts or jailbreaks are still valid.
I'm anxious to test it.

Is this on GitHub too?
I'm gonna check it.


