AuthorTopic: Things That Make Me Say, "Dafuq?"  (Read 3926 times)

Online Surly1

  • Administrator
  • Master Chef
  • *****
  • Posts: 11641
    • View Profile
    • Doomstead Diner
The Purge of AI-Assisted Fake Porn Has Begun
« Reply #75 on: January 31, 2018, 11:21:29 AM »
Who knew this was even a thing?

The Purge of AI-Assisted Fake Porn Has Begun


It was only a matter of time before more sophisticated fake porn videos surfaced online. But a crackdown on this super-realistic fake porn is already beginning.

Reddit and Gfycat, two popular platforms where users have been uploading the fake porn, have begun to eradicate the manipulated smut, which is often so convincing that it blurs the contours of reality itself.

This type of fake porn, also referred to as deepfakes, involves mapping someone else’s face onto a porn star’s body. While fake porn has existed for years, free and more powerful tools using artificial intelligence now afford trolls a way to create more realistic videos, given they have enough images of their victim to recreate a scene. The Deepfakes subreddit dedicated to this type of content has thousands of users who often use celebrity faces, though the practice has also evolved to include classmates and former partners. These are often done without the person’s permission.

As the Next Web pointed out on Wednesday, some of these videos and GIFs are being removed from some of the more popular sites hosting them after news outlets began reporting on the trend. “I just noticed that […] my upload yesterday was deleted, could be a copyright issue,” a Reddit user wrote, according to the Next Web. “I don’t think it’s copyright since random Japanese idols with little to minimal presence in the West have been removed,” another redditor reportedly wrote, “[e]ven the botched ones that have no resemblance to any human being are gone. That suggests someone from [G]fycat proactively removed all the gifs linked here.”

Another Redditor posted in Deepfakes last night, noting that older Gfycat links had been removed from the site. “Seems like a targeted purge, so people should avoid using the website,” they wrote. “Anyone who posted a link before should try and rehost as well.”

In a statement emailed to Gizmodo, Gfycat confirmed that it is proactively purging its platform of deepfakes. Gfycat’s terms of service doesn’t have an explicit policy on revenge porn, but it does prohibit any content that is “unlawful, harmful, threatening, abusive, harassing, tortious, excessively violent, defamatory, vulgar, obscene, libelous, invasive of another’s privacy, hateful racially, ethnically or otherwise objectionable.” The company’s spokesperson said deepfakes clearly violate the site’s rules.

“Our terms of service allow us to remove content we find objectionable,” the Gfycat spokesperson said. “We find this content objectionable and are actively removing it from our platform.”

Reddit hasn’t yet responded to our request for comment, but deepfakes are likely a violation of the site’s terms of service as well. Reddit clearly states that unwelcome content includes “the posting of photographs, videos, or digital images of any person in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission.” Of course, the whole matter is complicated by the fact that the naked bodies in deepfake porn do not belong to the people whose faces are attached to them.

Fake porn videos aren’t just harassing and a gross invasion of privacy—they likely violate copyright laws. To create deepfakes, someone will need hundreds of images of their victim. To collect these, they can use an open-source photo-scraping tool that will grab photos of their victim that are available online. But that victim can request the removal of those images if they are posted without their permission, citing copyright infringement. Both Reddit and Gfycat have copyright infringement policies that state they’ll remove offending content.

Just as advanced fake porn was a predictable outcome of our trollish reality, so was the inevitable backlash and policing. And trolls are already trying to figure out which platforms they can flock to next, posting alternative sites like Russian social networks vk.com and ok.ru for those worried about their videos being deleted. It’s a vicious cycle—and entirely unsurprising.

"It is difficult to write a paradiso when all the superficial indications are that you ought to write an apocalypse." -Ezra Pound

Online Eddie

  • Administrator
  • Master Chef
  • *****
  • Posts: 12829
    • View Profile
Re: The Purge of AI-Assisted Fake Porn Has Begun
« Reply #76 on: January 31, 2018, 12:41:41 PM »


That really made me wish I was able to use Photoshop so I could paste Melanie Ehrenkranz's head on a pornstar body. Just too tempting.
What makes the desert beautiful is that somewhere it hides a well.

Offline luciddreams

  • Administrator
  • Sous Chef
  • *****
  • Posts: 3259
    • View Profile
    • Epiphany Now
Re: The Purge of AI-Assisted Fake Porn Has Begun
« Reply #77 on: January 31, 2018, 02:15:32 PM »


This is a perfect example to illustrate why you cannot be sure that any video you watch is depicting reality.  Video is proof of nothing now.  Talk about fake news.  Just about the only reality you can believe is the reality you are able to determine with your senses...and even then only half of that can be believed these days. 

Take Dump for instance.  Just a few short years ago Dump as president was fiction on the Simpsons, now it's reality, and I'm still having a hard time believing it. 

Reality has managed to become stranger than fiction. 

Online Surly1

  • Administrator
  • Master Chef
  • *****
  • Posts: 11641
    • View Profile
    • Doomstead Diner
I guess it's a tough gig to be a fox in a henhouse.

Scott Pruitt Says He Had to Spend So Much on First Class Flights Since Backlash to Him Is So 'Toxic'


Environmental Protection Agency chief Scott Pruitt, who either denies climate change or thinks it might be good, actually, depending on how he feels that day, has done his very best to decimate the agency’s ranks at the same time he’s spent hundreds of thousands in taxpayer money on elaborate security measures like 24-hour guards and biometric office locks. Now, per the New Hampshire Union Leader, Pruitt has an explanation for the $90,000-plus he spent flying in largely first-class seats in June 2017.

It’s that people hate him so much he requires extra security, lest randos jump him in coach or whatever.

“Unfortunately ... we’ve had some incidents on travel dating back to when I first started serving in the March-April timeframe,” Pruitt told the Union Leader on Tuesday. “We live in a very toxic environment politically, particularly around issues of the environment.”

“We’ve reached the point where there’s not much civility in the marketplace and it’s created, you know, it’s created some issues and the (security) detail, the level of protection is determined by the level of threat,” Pruitt continued. “I’m not involved in any of those decisions ... Those are all made by the (security) detail, the security assessment in addition to the chief of staff.”

As the Huffington Post noted, the EPA has defended Pruitt’s lavish flight expenditures as legitimate government spending, and CNN reported he’s received numerous death threats:

The EPA defended Pruitt’s travel in an interview with The Washington Post on Sunday, saying ethics officials had approved the expenses. Federal regulations state that government employees must “consider the least expensive class of travel” for their needs, but security concerns do allow for more expensive bookings.

CNN reported in October that Pruitt gets at least “four to five times the number of threats” as his predecessor. He’s also the first person in the role to have a full-time security detail at a cost of about $2 million a year.

More threats than his predecessors may not exactly be a Code Red situation, though, as prior agency chiefs did not request anywhere near the same level of security as Pruitt. Former EPA administrator Christine Todd Whitman, a vocal Pruitt critic who was herself controversial during her tenure under President George W. Bush, told CNN that she felt no need to install elaborate security systems in her office and that cleaning staff could enter and exit with no restrictions.

It’s pretty obvious that sending Pruitt death threats is bad, extremely dumb, and absolutely will not do anything to make him think twice about his relentless crusade against the environmental mission of his own agency. But at the same time, something tells me he’s been reading a little too much about “antifa supersoldiers.”

In any case, perhaps the reason the EPA is suddenly attracting so much negative publicity over the course of the past year is things like Pruitt suggesting that it would be “arrogant” for humans to predict what temperatures are ideal for the continued survival of the species at the turn of the century, or his efforts to deregulate major polluters, or raise radiation safety limits, and various stuff of that nature. Not sure it says that in How to Win Friends and Influence People, though hey, there’s a lot of reading Pruitt should probably catch up on.

"It is difficult to write a paradiso when all the superficial indications are that you ought to write an apocalypse." -Ezra Pound
