Malaysia and Indonesia first to block Grok after slew of AI-generated sexually explicit and non-consensual images
Elon Musk listens as President Donald Trump speaks during a news conference in the Oval Office of the White House, May 30, 2025, in Washington

MALAYSIA and Indonesia have become the first countries to block Grok, the artificial intelligence chatbot developed by US billionaire Elon Musk’s xAI, after authorities said it was being misused to generate sexually explicit and non-consensual images.

The moves reflect growing global concern over generative AI tools that can produce realistic images, sound and text, while existing safeguards fail to prevent their abuse.

The Grok chatbot, which is accessed through the far-right billionaire’s social media platform X, has been criticised for generating manipulated images, including depictions of women in bikinis or sexually explicit poses, as well as images involving children.

Regulators in the two south-east Asian nations said existing controls were not preventing the creation and spread of fake pornographic content, particularly involving women and minors. Indonesia’s government temporarily blocked access to Grok on Saturday, followed by Malaysia on Sunday.

“The government sees non-consensual sexual deepfakes as a serious violation of human rights, dignity and the safety of citizens in the digital space,” Indonesia’s Communication and Digital Affairs Minister Meutya Hafid said in a statement on Saturday.

The south-east Asian restrictions come amid mounting scrutiny of Grok elsewhere, including in the European Union, Britain, India and France.

Grok last week limited image generation and editing to paying users following a global backlash over sexualised deepfakes of people, but critics say it did not fully address the problem.
