Why Is Hollywood Still Using X?

If you are a Hollywood studio, network, agency or manager, one of your primary responsibilities has to be protecting your intellectual property and your talent. One of the biggest complaints about artificial intelligence and machine learning is that the models are being trained in part on copyrighted material gathered without permission. Aside from all of the legal reasons that shouldn't be allowed, most people understand that it is a short walk from "using IP without permission" to "creating images of Mickey Mouse humping Goofy."

Despite everything that has happened on X over the past few years, most of Hollywood still uses the platform regularly. They do it in part because there is still a substantial audience on X, and because the platform regularly uses its algorithm to try to protect high-profile accounts from the worst of what X has to offer.

But leaving X has also become a political decision - like it or not. Elon Musk and his Scooby Gang have managed to convince many of the platform's MAGA users that leaving is the equivalent of saying "I hate Trump and MAGA." And in an era when no Hollywood corporation wants to poke the unpredictable Bear on Crack that is the Trump Administration, most companies have decided to just suck it up, hold their noses and keep posting.

But if there is a point at which corporate responsibility should overcome industry cowardice, it is Elon Musk's recent decision to allow X's janky AI platform Grok to create images of nearly anything on request - and to collect and use any image posted on the platform as the starting point for porn or murder fantasies.

Last week, there was a controversy over Grok's willingness to strip the clothes from images of underage girls. And in theory, that should be enough of a red flag to scare skittish Hollywood corporations.

But the bigger problem for Hollywood is that X users are employing Grok to create soft-porn images of nearly every actress imaginable. The images range from female stars in thongs or kissing other female stars to borderline violent and submissive scenarios that should horrify anyone who has a business relationship with these women on any level.

Musk will no doubt argue he is working to eliminate these problems. But if we have learned anything about Grok since its launch, it is that it is impossible to tweak the programming in ways that remove the options for abuse while still allowing it to do the things that drive engagement.

By way of illustration, here is a sampling of some of the gentler examples Grok has created in recent days:

[Images omitted.]
If Hollywood values its female talent, if it is serious about protecting some of its most important assets, then it is time to leave Grok and X behind. 

And this sexual abuse offers Hollywood the opportunity to frame the exit from X as a moral issue:

"We appreciate the users here and the support we've received. We hope to return some day. But we cannot in good conscience stay on a platform that allows users to create non-consensual sexual images of women - whether or not they are celebrities."

That last part is important, because it is not just famous women who are being victimized.

It is way past time for Hollywood to grow some stones and protect the women who work in the industry.