Sam Hawley: She's one of the most famous women in the world, so when sexually explicit images of Taylor Swift began appearing on social media, they went viral. The images were AI-generated and were viewed by tens of millions of people around the world. Today we meet the American journalist who uncovered how a Microsoft tool was manipulated to produce the images. He explains why everyone is vulnerable as the technology becomes ever more sophisticated. I'm Sam Hawley on Gadigal Land in Sydney. This is ABC News Daily.
Emanuel Maiberg: My name is Emanuel Maiberg. I'm a co-founder and journalist at 404 Media.
Sam Hawley: Emanuel, just tell me, what is 404 Media? Explain it.
Emanuel Maiberg: 404 Media is a website started by me and three of my colleagues. We report about AI, we report about niche online communities, the right to repair, consumer rights, this kind of thing.
Sam Hawley: Alright, and you've been looking into how these images of Taylor Swift were generated and who was behind them. Take me back, Emanuel, to when you first heard that these images were being circulated.
Emanuel Maiberg: Yeah, so for reasons that you and many of your listeners are probably familiar with, I'm trying to spend less time on Twitter these days. So I actually missed the images popping up in my feed. I noticed them when my co-founder Jason sent them to me. He knows this is an area I cover, and he said, maybe this is something we should look into. This kind of thing happens all the time: non-consensual images of celebrities generated by AI. But I was shocked by how many people had seen them. Twitter now displays how many people see a post, and according to Twitter, millions of people saw these images.
Sam Hawley: It was something like 47 million people viewed those images. It was a huge number of people, wasn't it?
Emanuel Maiberg: Yeah, and it's hard to put an exact number on it because you'd have to tally all these different posts. It wasn't just one account that posted them, it was a few. And once those accounts posted them, other people posted them too. So for sure we can say that millions and millions of people saw the images.
Sam Hawley: It was actually Taylor Swift's massive fan base, known as Swifties I believe, that really pushed for these images to be removed, or at least blocked. So what did they do? Just tell me about that.
Emanuel Maiberg: So Twitter has this feature called trending topics where, depending on who you follow and your Twitter behaviour, it displays topics that are popular in real time that you might be interested in. And Taylor Swift and Taylor Swift AI were trending topics because so many people were looking at the images, so if you were to click on them, you would see the images. And once the Swifties saw this, they made a concerted effort to flood those trending topics with other posts that weren't these images, in order to push them down and out of the feed. So if you clicked on the topic, you wouldn't see the images. And actually, by the time I was looking into it, it was hard to find them naturally. You'd have to dig a bit, because the Swifties had done such a good job of burying the images.
Sam Hawley: And for a time, X had to block searches for Taylor Swift altogether from the platform. Were the images just on X, or did they appear elsewhere as well?
Emanuel Maiberg: So X is definitely where they went viral. But once they hit X, they also made their way to other websites that are dedicated to celebrity nudes and pornography websites that steal pornography from legitimate adult content creators. And prior to that, they were circulated in private Telegram groups. And the earliest instance of them that I could find was actually on 4chan.
Sam Hawley: Alright. So now let's have a look at these images and how they were actually generated, because they're not just Photoshopped, right? Just tell me more about the technology that was used and how it works.
Emanuel Maiberg: Yeah, so I've seen some people call them deepfakes. I understand why people use that term, but they're not deepfakes in the way that we think about them traditionally, and certainly not how we saw deepfakes when they first appeared in 2017. With a deepfake, you take an adult video and the face of a different person, and you use software to paint that face into the porn video. The Taylor Swift images were made with generative AI, using what's called a text-to-image generator. Basically, you type a text prompt into a box and the AI instantly generates the image. So broadly, that's the technology that was used to make these.
Sam Hawley: And you've also come close to finding the source of these images. So tell me now what you've found in the dark corners, I suppose, of the internet.
Emanuel Maiberg: Yeah, so when I looked at the images, I recognised their style. I've been reporting on deepfakes and non-consensual pornography since 2017, when they first appeared, and I'm embedded in these communities. I'm looking at what these people do, the kinds of images they make, all the time. And when I saw the images, I knew I had seen them before, so I went to all the places online where these people hang out. And sure enough, I found a Telegram channel where the images appeared a day before they hit Twitter. What I saw is that the community these images came out of was very focused on one tool made by Microsoft called Designer. Designer is one of these text-to-image AI tools, and obviously Microsoft didn't make it to produce non-consensual pornography. Designer was made for things like making your PowerPoints look fancier with AI images. But what these users on Telegram found is that they could game the guardrails Microsoft put in place on Designer to stop people from making pornography, and generate pornography anyway. Generally, the way it works is they find ways to describe an image that would look sexual in nature without using sexual terms, and that way they're able to work around the protections Microsoft had in place.
Sam Hawley: Wow. Okay. And what have the users or the members of this group had to say about all of this?
Emanuel Maiberg: I mean, initially they were very excited, and they made a ton of these images. That's why the Telegram channel was filled with them and why 4chan was filled with them, and they were teaching each other how to do it and how to game Microsoft's system. Once we reported the story and explained that it appeared Microsoft's tool was able to do this, we reached out to Microsoft. We told them what we saw, and we told them some of the workarounds people were using. And then, I would say within a few hours, I started to see people in those communities complaining that the loopholes no longer worked. So it appears that Microsoft has made changes to prevent most of this behaviour. Microsoft doesn't go into details about how the AI works, how the guardrails work, or what changes they made. But I can see the reaction in the communities in real time, and they're upset that they're no longer able to generate most of these images.
Sam Hawley: And the Microsoft CEO, Satya Nadella, he spoke to NBC News. What's he been saying about all of this? Because it's a big worry for Microsoft, isn't it?
Emanuel Maiberg: I suppose, yeah. I mean, they're getting a lot of heat for this, understandably.
NBC News Interviewer: What does that tell you about this technology and whether we could ever get the toothpaste back in the tube?
Satya Nadella, Microsoft CEO: I'd say two things. One is, again, I go back to, I think, what's our responsibility, which is all of the guardrails that we need to place around the technology so that there's more safe content that's being produced. And there's a lot to be done there and a lot being done there. But it is about global societal, I would say, convergence on certain norms.
Emanuel Maiberg: His position is that we as a society need to come to some understanding about the technology and decide what the norms around it are. In his defence, he also admitted that Microsoft needs to do more to have guardrails in place. I think it's great that he said it, but he said it after the fact. I mean, the guardrails should have been there before these images were produced. It's not as if the workarounds were that advanced.
Sam Hawley: So, Emanuel, let's look now at what could be done better to regulate this space, because this could happen to anyone, right? It doesn't have to be an A-list celebrity like Taylor Swift. Everyone's vulnerable.
Emanuel Maiberg: So, what can be done, what needs to be done? There are a few levels to this. One is that the companies that make the tools can build better guardrails, and the way to do that is to stress test these tools before they get to market. I think if they and other companies slowed down and really thought about the consequences and the abuse that can happen before they put the tools out, we would be in a better place. So that's one level of it. The other level, which I think can't be emphasised enough, is that people have been making non-consensual nude images of Taylor Swift for years and years. As I've said, I report on this space; I see these images every day, and I've seen them for years. The reason we're talking about this right now is because of how big they got on Twitter. Twitter has had a diminished moderation capacity since Elon Musk bought it. Moderators were let go, including a lot of the trust and safety team, the people in charge of keeping Twitter safe. It's just a less safe online space now, and that's why this is such a big problem. And then there need to be some sort of legal consequences. I think until we see people really pay a heavy price for making these images, people are going to be pretty cavalier about making them.
Sam Hawley: Yeah, alright. Well, the White House press secretary was concerned, calling the fake images alarming, so it's obviously got the attention of the government there.
Karine Jean-Pierre, White House press secretary: And so we're going to do what we can to deal with this issue.
Reporter: Should there be legislation?
Karine Jean-Pierre, White House press secretary: Yeah, there should be legislation to deal with this issue. But, as I just stated, the president's taking action.
Sam Hawley: And if we can't trust online companies or social media companies to keep us safe, isn't it up to governments to really step in here? It seems like this is all just getting so far ahead of us and we're kind of trying to catch up with it.
Emanuel Maiberg: I think it would certainly help if there were some rules in place and, more importantly, if there was precedent. It's not just that we need the rules; we actually need to see this play out. Somebody has to make an image, get sued, pay a fine, go to jail, whatever. Until there is actual accountability, I don't think we're going to see this slow down at all. I think when you see a community like Taylor Swift's fans come together and really push back, that gets people alarmed, that gets attention. You get the CEO of Microsoft and the White House responding. So, yeah, some sort of community response, people speaking up, people maybe telling these companies to slow down, I think would really help.
Sam Hawley: This episode was produced by Bridget Fitzgerald and Nell Whitehead. Audio production by Sam Dunn. Our supervising producer is David Coady. Over the weekend, catch the new podcast from journalists Geraldine Doogue and Hamish Macdonald. It's called Global Roaming. And this week, they look at what Iran really wants as tensions rise with the United States. I'm Sam Hawley. ABC News Daily will be back again on Monday. Thanks for listening.
Featured:
Emanuel Maiberg, journalist and co-founder of 404 Media