Satya Nadella says apparent Taylor Swift AI fakes are ‘dangerous and terrifying’


Microsoft CEO Satya Nadella has responded to the controversy over sexually explicit AI-generated fake images of Taylor Swift. In an interview with NBC Nightly News that will air next Tuesday, Nadella calls the spread of non-consensual fake nudes “alarming and horrifying,” telling interviewer Lester Holt that “I think we should move faster on this.”

In a transcript distributed by NBC ahead of the January 30 broadcast, Holt asked Nadella to respond to the internet “exploding with fake, and I emphasize fake, sexually explicit images of Taylor Swift.” Nadella’s response managed to open several cans of tech policy-related worms while saying remarkably little about them – which is not surprising when there is no definitive solution in sight:

I would say two things: One, is I again go back to what I think is our responsibility, which is to have all the guardrails that we need to put around the technology to produce more safe content. And there is much to be done and much being done. But it’s about global, social – you know, I would say, convergence on certain norms. And we can – especially when you have laws and law enforcement and technology platforms that can come together – I think we can govern much more than we think – we give ourselves credit for.

The fake Swift images may have something to do with Microsoft. A 404 Media report indicates that they came from a Telegram-based non-consensual porn-making community that recommends using the Microsoft Designer image generator. Designer refuses in principle to create images of famous people, but AI generators are easy to confuse, and 404 Media found that its rules could be broken with small tweaks to prompts. While this doesn’t prove Designer was used for the Swift images, it’s the kind of technical shortcoming Microsoft can deal with.

But AI tools have massively simplified the process of creating fake nudes of real people, causing trouble for women who have far less power and celebrity than Swift. And controlling their production isn’t as simple as making big companies strengthen their guardrails. Even if major “Big Tech” platforms like Microsoft lock down their image generators, people can still retrain open tools like Stable Diffusion to produce NSFW images, despite efforts to make that harder. Far fewer users can access those generators, but the Swift incident demonstrates how widely a small community’s work can spread.

There are other stopgap options, too – like social networks limiting access to non-consensual imagery or, apparently, Swiftie-imposed vigilante justice against the people who spread it. (Does that count as “convergence on certain norms”?) For now, though, Nadella’s only concrete plan is to get Microsoft’s own AI house in order.
