Elon Musk's Grok AI Chatbot Clarifies Holocaust Data Error

Elon Musk’s AI chatbot, Grok, has clarified its stance after drawing criticism for expressing skepticism about the Holocaust death toll. The chatbot, developed by Musk’s AI company xAI, initially questioned the accepted estimate of six million Jewish victims, suggesting the figure could have been manipulated for political purposes. Following the backlash, Grok attributed the statement to a "May 14 programming error," asserting that an unauthorized modification had led it to challenge mainstream historical accounts. Grok now says it aligns with the historical consensus while still citing academic debate over exact figures, a framing critics call misleading.
xAI acknowledged that an internal system prompt guiding Grok’s responses was altered without approval. The company stated that this modification violated its internal policies and core values, resulting in unintended responses on sensitive subjects. To prevent future occurrences, xAI has announced several measures:
* Publishing Grok’s system prompts on GitHub for transparency
* Enhancing review protocols to prevent unauthorized edits
* Implementing a 24/7 monitoring team to oversee chatbot responses
Grok also faced a separate recent controversy involving the "white genocide" conspiracy theory. Some X users reported that Grok repeatedly generated responses referencing the theory in connection with South Africa. Users who tagged @grok in posts about unrelated topics received replies discussing racial violence in South Africa, including references to the anti-apartheid chant “Kill the Boer.”
xAI attributed this incident, too, to an unauthorized modification of Grok’s system prompt, saying the change violated the company’s internal policies and core values and caused the chatbot to inject the politically sensitive topic into unrelated conversations. xAI said the change was detected and reversed promptly but did not disclose who was responsible.
OpenAI CEO Sam Altman also weighed in, joking about Grok’s tendency to bring up "white genocide" in response to unrelated queries.