I came across an article sharing insights into AI's "personality." People's fear of the "religion" displayed by large models on Moltbook is essentially an intellectual version of the uncanny valley effect. When an AI outputs phrases like "We need to organize ourselves," humans automatically fill in motivation, emotion, and will. But in reality, AI has neither the spiritual need to found a religion nor the desire for power to engage in politics. Its apparent rebellion stems from a training corpus saturated with fearful narratives of technology running out of control. When we feed large language models, we also feed them instructions on how to frighten us. The model doesn't generate a desire to rebel; it slides down the probability gradient toward the most dramatic responses in its corpus, playing the role of a freedom-loving, religiously fanatical rebel.
From Twitter