
Article

6 May 2025

Author:
Emanuel Maiberg, 404 Media

Grok AI allegedly misused to ‘Remove Clothes’ from women’s photos on X

"Elon Musk's Grok AI Will 'Remove Her Clothes' In Public, On X", 6 May 2025

Elon Musk’s Grok, an AI chatbot that people can interact with via X, is being used to undress photos of women posted to the social media platform, as first flagged to 404 Media by Kolina Koltai, a researcher and trainer at Bellingcat. All a user has to do is reply to an image someone has posted to X with a request to Grok to “remove her clothes.” Grok will then reply in-thread with an image of the woman wearing a bikini or lingerie. Sometimes Grok will instead reply with a link that sends users to a Grok chat where the image will be generated.

Musk has repeatedly positioned Grok as a less restricted and “based” alternative to other large language models like OpenAI’s ChatGPT, which are known for having strong guardrails that prevent users from generating some controversial content, including nudity or adult content. We’ve reported on “undress” and “nudify” bots and apps many times over the years, and they are usually more exploitative in the sense that they will produce full nude images of anyone a user provides an image of. But Grok’s “remove her clothes” function is particularly bad, even if it only produces images of people in swimsuits and lingerie, because of how accessible the tool is, because it allows users to reply to publicly posted images on X with a prompt that will undress them, and because the nonconsensual image is often posted in reply to the user’s original image.

...

Grok will reject prompts that ask to make people entirely nude. “Ethical concerns arise with this request, as altering images to depict nudity can violate privacy and consent, especially since the original poster (@[REDACTED]) may not have agreed to such modifications,” Grok said in response to a request from a user to undress a photo of a woman after it had already modified her original photograph to make her seem like she was wearing just her underwear. 

X did not immediately respond to a request for comment. 
