Elon Musk-run X has finally blocked searches for Taylor Swift after AI-generated explicit images of her went viral on the platform last week.
The popular singer's images were seen by millions of users before X removed them, and the company was criticised for its slow response.
Searches for Swift on the platform now return a message: "Something went wrong. Try reloading."
The social media platform said it was a "temporary action" to prioritise safety.
The action was taken "with an abundance of caution as we prioritise safety on this issue," the company told the BBC.
In an earlier statement, X said it has a zero-tolerance policy towards such content.
"Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them", said X.
The White House last week called for legislation to protect people from AI-generated deepfakes after the fake images of Swift spread widely.
White House press secretary Karine Jean-Pierre called the incident "alarming" and said it's among the AI issues the Joe Biden administration has been prioritising.
Microsoft Chairman and CEO Satya Nadella said the explicit AI-generated fakes of Swift are "alarming and terrible".
Swift was reportedly weighing possible legal action against the website responsible for generating the deepfakes.
(With inputs from IANS)