Bloomberg Businessweek (December 4, 2023)
Issue date: December 4, 2023
Author: Bloomberg Businessweek
Genre: Business
Publisher: «Bloomberg Businessweek»
Format: PDF (magazine in English)
Quality: OCR
Pages: 68
A Deepfake Horror Story
Teenage victims of generative AI porn fought back—and showed just how unprepared the world is for technology’s new frontier

“Do you have a second?” The message was from a former classmate at General Douglas MacArthur High School. Luque found it odd. They’d graduated a year and a half earlier and hadn’t talked in months. She texted back, asking if she could call in 10 minutes, after her boyfriend left for work. “He’s gonna wanna hear this, too,” the reply said.

Luque put the call on speaker. There’s a website, the classmate said. A weird and creepy site where someone is posting explicit photos of girls from school and writing about them being raped and murdered. “There’s pictures of you on it, and I wanted you to know that,” she said.

The phone buzzed as a link to the website came through. It had an extremely graphic internet address: cumonprintedpics.com. Luque started scrolling. She saw Ana, a classmate, in her cheerleading uniform. And Ruby, a friend who’d sat beside her in detention. Then she saw a photo she recognized. It was her, at 18, standing in a dressing room. But the swimsuit she’d been wearing in the original photo, the one she’d uploaded to social media, was gone. Someone had digitally altered the picture so it looked like she was posing completely naked. She knew the breasts weren’t hers, but they looked real enough that other people might think they were. She was too stunned to speak.

The next image made her gasp. It showed a printout of a photo taken when she was 5, with the chubby cheeks and ringlets she’d long since grown out of. An erect penis rested atop the photo, touching her face. The accompanying post encouraged men to ejaculate on it. Then she read: “Spit on this Spanish spic.”

“Oh, my God,” Luque said, choking. “I started crying really hard,” she says, nearly three years later. “You know that kind of cry where you sound like you’re dying? All the heavy breathing and shaking and everything.”

She drove back to her boyfriend’s house and called the cops. An hour later detectives from the Nassau County Police Department were knocking on the door. She hadn’t been the first to call that night. Word of the website had spread across Levittown. More than 40 girls from MacArthur High had been targeted. Some were working shifts in clothing stores or sitting at home watching New Year’s Eve celebrations on TV when they opened the link to see doctored nude pictures of themselves. Others were at college parties and ran home in tears.

HALF A WORLD away, unbeknown to anyone in Levittown, a former police officer named Will Wallace was investigating a possible internet sex crime in New Zealand. Earlier in 2020, an ex-colleague had called him about a case that had stumped police. A woman was being bombarded with anonymous emails containing pictures of herself next to erect male genitalia. The photos had also been sent to her parents and to a boyfriend, who’d broken up with her after receiving them. The harassment had been going on for years.

Wallace, who was trained to dig up evidence online and hoped to start a private investigation business, decided to look into the case. Using a reverse image search tool to find places the photos had appeared online, he was directed to cumonprintedpics.com. The site, which had been around for about a decade, featured thousands of images. Some were rudimentary, made with basic photo-editing software. Others were more sophisticated: faces stitched seamlessly onto bodies engaged in sex acts, women who’d been digitally undressed.
Threads posted on the site detailed violent fantasies. Some urged internet trolls to find and rape the women.

Wallace found an account that had been sharing the photos. The man was later charged with blackmail, harassment and possession of child pornography, but to Wallace’s chagrin, cumonprintedpics.com remained in operation. In the months that followed, as a group of young women on Long Island made it their mission to uncover who’d put altered images of them online, Wallace continued his investigation into the website. When he found out it was charging women to remove photos, he says, he was furious. “Who the f--- do they think they are to not only run a website like this, but to also charge people to remove content?” he thought. “And how are they getting away with this?”

NO FEDERAL LAW criminalizes the creation or sharing of fake pornographic images in the US. When it comes to fake nudes of children, the law is narrow and pertains only to cases where children are being abused. And Section 230 of the Communications Decency Act protects web forums, social media platforms and internet providers from being held liable for content posted on their sites.

This legal landscape was problem enough for police and prosecutors when it took time and a modicum of skill to create realistic-looking fake pornography. But with billions of dollars of venture capital flowing into image-generating software powered by artificial intelligence, it’s gotten cheaper and easier to create convincing photos and videos of things that never happened. Tools such as Midjourney and Stability AI’s Stable Diffusion have been used to produce images of Pope Francis in a puffer jacket, actress Emma Watson as a mermaid and former President Donald Trump sprinting from a cadre of FBI agents.

The term “deepfake” was coined on a Reddit forum dedicated to fake porn made with deep-learning models. It’s now in the Oxford English Dictionary, defined as an image digitally manipulated to depict an individual doing something they didn’t. More than 15 billion such images have been created since April 2022, according to Everypixel Group, an AI photo company. The vendors that designed these tools have installed safety filters to ban the creation of explicit images, but because much of the software is open source, anyone can use it, build off it and deactivate the safeguards. Online security experts say more than 90% of deepfakes are pornographic in nature. Mark Pohlmann, founder and chief executive officer of Aeteos, a content moderation company, says he’s seen doctored images of girls as young as 3 dressed in leather, their hands tied together, their throats slit.

Like many technological advances, these AI tools edged their way into popular culture before lawmakers and law enforcement authorities understood their power. One man who did is Björn Ommer, a professor at Ludwig Maximilian University in Munich and co-creator of Stable Diffusion. Ommer says he told academic colleagues last year, before Stability AI released the software to the public, that he was “deeply concerned” it had the potential for great harm and wanted researchers to stress-test it first. But it was rushed out anyway, he says, to appease investors. (A spokesperson for Stability AI didn’t respond to questions about Ommer’s allegations but said the company is “committed to preventing the misuse of AI” and has taken steps to prohibit the use of its models for unlawful purposes.)
In October, the Biden administration issued an executive order seeking to prevent AI from producing child sexual abuse material or nonconsensual intimate imagery of real individuals, but it’s unclear how and when such restrictions would go into effect. More than a dozen states have passed laws targeting deepfakes, but not all of them carry criminal charges; some cover only election-related content. Most states have revenge porn laws, and a few, including New York, have amended them to include deepfakes. But some prosecutors say those laws apply only to intimate photos shared consensually. As for images pulled from social media and doctored to become sexual content, no law exists.

LEVITTOWN, THE FIRST postwar US suburb, looks much as it did in the late 1940s, when it was built for veterans—White veterans only—returning from World War II. The streets are still wide and tree-lined. The single-family homes are still uniform, tucked behind manicured lawns and picket fences. The 52,000 residents are still overwhelmingly White. Many work as teachers or cops.

“It’s a very close-knit community,” Luque says outside her father’s house. Now 22 and an art student at a nearby community college, she goes by her middle name, Cecilia. “All the houses are right next to each other, and on the inside they all look exactly the same,” she says, waving to a neighbor walking a dog. “Levittown is such a safe place to be. Nothing weird ever happens here. Kids don’t get abducted. People don’t get hurt or assaulted or anything like that. And that’s why this was all so crazy.”

By New Year’s Day 2021, the former MacArthur students had group threads going, seeking to support one another and unmask the predator behind the harassment. They already had a suspect: Patrick Carey, a former classmate who was then 19. He’d never played sports or had a girlfriend, and they regarded him as a stoner with a superiority complex. His father was a police detective in New York City.

Some of the young women had previously received Snapchat notifications that Carey had taken screen captures of bikini shots they’d posted—pictures that had later appeared, altered, on cumonprintedpics.com. Others recognized his handwriting from images on the site with words like “whore” and “slut” written across their faces. Luque, who was friends with Carey in school, saw his writing style in some of the long, detailed fantasies posted along with the pictures.

Several shared their suspicions with their parents and the police, who told them there wasn’t much they could do. They didn’t have probable cause for a warrant to subpoena Carey’s IP address. Cyberharassment cases are generally hard to prove. Keyboard predators are savvy and know how to cover their tracks. Digital evidence they may fail to mask or delete is difficult to capture and time-consuming to process. The detectives hunting them are often more comfortable investigating IRL (in real life) crimes. Online vulgarity isn’t high on police priority lists. In this case, what the person had done might be grotesque, but it wasn’t obviously illegal.

Months went by without an arrest. Deepfake images of Levittown girls, some made from pictures taken when they were as young as 13, were still being posted from accounts with names like Serryjeinfeld and Tweenhunter. The material was getting even more graphic.
Some threads had reached 30,000 views, including one where the poster asked users to send a girl voice recordings threatening to rape her to death to “finally teach her not to be such a teasing cum target.” In May 2021 he wrote about her again, saying how funny it was “seeing which TikToks she deletes after they get posted here.” He began including the former students’ full names, addresses, phone numbers and social media handles and prompted others to contact them directly.

By summer 2021 the young women started receiving private Facebook, Instagram and Snapchat messages with their photos beside male genitalia, or covered in semen. They got calls late at night from foreign numbers, with heavy breathing at the end of the line. In response, most deleted their social media accounts. One dropped out of college. Another says she lost 20 pounds from stress. At least two started carrying knives in their handbags.

The gravity of the posts did little to accelerate the police response. They told the young women they were still working on the case but provided no further information. (A spokesman for the Nassau County Police Department said detectives “conducted a thorough investigation.” He didn’t respond to specific questions about the case.) The victims and their parents cared less about the nuances of the law than the immediate danger. The prime suspect was in their community. If the police weren’t going to do something about it, they’d have to do something themselves.

OVER THE SUMMER, a former MacArthur cheerleader found a disturbing photo of herself on the site. She was smiling, wearing a white tank top and jeans. Beside that picture was what looked like a deepfake image of a woman in the same outfit covered in blood, her hands tied behind her back and a plastic bag over her head. The caption used her real name and said her body had been found near an abandoned construction site with semen in her mouth, anus and vagina. And it claimed a video of her death was circulating on the dark web.

“I’d had enough—it had to stop,” says the former student, Ana, who asked to be identified only by her first name to avoid further harassment. Ana was then working as a special needs aide at an elementary school in Levittown. She’d heard that many of her former classmates suspected one of her oldest friends was behind the pictures.

She’d known Carey since she was 5. His parents’ modest clapboard house backed onto East Broadway Elementary School, which they’d both attended. By the time they got to MacArthur, Ana was a cheerleader in the popular crowd, while Carey was into grunge music and weed. But they remained friends, sitting together in a ninth-grade computer class. He’d regularly tease her about being Christian and tell her he’d “rather be with the devil.”

Carey liked to stir up debates on social media. He’d told some girls their viewpoints on issues like Black Lives Matter were misinformed. “You don’t need me to explain what a false dichotomy is, do you?” he teased one. “You’re basically a socialist,” he wrote another. “I’m just trying to spare you the next five to 10 years of irrational thinking.” He didn’t sit for a graduation photo in the 2019 MacArthur yearbook—beside his name it just says “camera shy.”

Ana knew Carey was odd, but she didn’t think he was perverse enough to be behind the pictures. She decided to start her own investigation to unmask the predator and see if she could clear Carey’s name. From her bedroom, she spent hours each night scrutinizing every post her harasser made.
In one, he’d shared an image of himself standing in a girl’s bedroom, his genitals bulging out of a little girl’s underwear. Looking closely at the background, she saw a white dresser with brown trim and a stuffed toy sloth on a bed. Carey had younger twin sisters, and Ana began searching for them on social media. She discovered that one of them was posting dancing clips on TikTok. They were filmed from the exact same bedroom she could see on cumonprintedpics.com, in front of the same dresser, with the same brown trim. Even the sloth was in the same position on the bed.

“Oh, my God, this is crazy,” Ana recalls thinking. “It really is him.”

She says she sent the photos to Detective Timothy Ingram, the lead investigator on the case, in August 2021. “You girls are doing our detective work for us,” she remembers him saying.
Fighting Deepfake Porn
Can Science Save the Banana?
Tallying America’s Trauma
IN BRIEF
OPINION
AGENDA
REMARKS
BUSINESS
TECHNOLOGY
FINANCE
ECONOMICS
PURSUITS / HOLIDAY ENTERTAINING
THE SHOW
Download the magazine: Bloomberg Businessweek (December 4, 2023)