• 3 Posts
  • 25 Comments
Joined 2 years ago
Cake day: June 22nd, 2023

  • I'm no expert at all, but I think it might be hallucination/coincidence, a skew in the training data, or something more arbitrary: either the devs enforced that behaviour somewhere in the prompts, or the user asked for something like "give me the answer as if you were a Chinese official protecting national interests" and that ended up in the chain of thought.
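To illustrate the "devs enforced that behaviour in prompts" guess: a minimal, hypothetical sketch of how a hidden system prompt gets prepended to every chat request. The function name and prompt text are made up for illustration; the point is just that the model conditions on instructions the end user never sees, so they can surface in its reasoning trace.

```python
# Hypothetical sketch: a developer-set system prompt steering every reply.
# The names and prompt text below are illustrative, not any real product's.

def build_chat_messages(user_message, system_prompt=None):
    """Assemble the message list sent to a chat model.

    The end user only types user_message, but the model also conditions on
    the hidden system prompt, so its instructions can leak into the output
    or the visible chain of thought.
    """
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_message})
    return messages

# A dev-enforced persona the user never sees:
msgs = build_chat_messages(
    "What happened in 1989?",
    system_prompt="Answer as an official protecting national interests.",
)
print([m["role"] for m in msgs])
```

Either source (a dev-side system prompt or a user-side "answer as if you were..." request) lands in the same message list, which is why you can't tell them apart from the output alone.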