News
In the two-plus years since generative artificial intelligence took the world by storm following the public release of ChatGPT, trust has been a perpetual problem. Hallucinations, bad math and ...
Elon Musk's AI chatbot, Grok, developed by xAI, ignited a firestorm on 14 May 2025, when it repeatedly posted about 'white genocide' in South Africa in response to unrelated X queries, from ...
In response to X user queries about everything from sports to Medicaid cuts, the xAI chatbot inserted unrelated information ...
Last Wednesday, as I watched Grok bring up white genocide in response to an anodyne query about the Toronto Blue Jays pitcher ...
The Grok meltdown underscores some of the fundamental problems at the heart of AI development that tech companies have so far yada-yada-yada’d through anytime they’re pressed on questions of ...
Kolin Koltai, a researcher at the Netherlands-based investigative outlet Bellingcat, recently discovered that X users were using the platform’s AI chatbot, Grok, to undress women in photos that ...
The faked image was created by the platform’s AI assistant, Grok, which seemed to be operating ... a human influencer would do to, you know, influence people. But the potential for misuse ...
Musk now finds himself in an ironic situation where his chatbot resists his influence. The big question is whether he will take action. Shutting Grok down could trigger a global debate over AI ...
In 2025, data is the new currency, and AI is the judge. According to a consensus of top AI platforms, including ChatGPT (OpenAI), Grok (xAI), Microsoft ... platforms assess influence differently than ...
Grok is a free AI tool built into X. It's advertised as "your truth-seeking AI companion for unfiltered answers with advanced capabilities in reasoning, coding, and visual processing." Users can ...
Hallucinations, bad math and cultural biases have plagued results, reminding users that there's a limit to how much we can rely on AI, at least for now. Elon Musk's Grok chatbot, created by his ...