TikTok, Snapchat, OnlyFans and others to combat AI-generated child abuse content


Major social platforms, AI companies, governments and NGOs issued a joint statement pledging to combat AI-generated abusive content, such as explicit images of children.

A coalition of major social media platforms, artificial intelligence (AI) developers, governments and non-governmental organizations (NGOs) has issued a joint statement pledging to combat abusive content generated by AI.

On Oct. 30, the United Kingdom released the policy statement, which has 27 signatories, including the governments of the United States, Australia, South Korea, Germany and Italy, along with the social media platforms Snapchat, TikTok and OnlyFans.

It was also signed by the AI companies Stability AI and Ontocord.AI, as well as a number of NGOs working on internet safety and children's rights, among others.

The statement says that while AI offers “enormous opportunities” for tackling the threat of online child sexual abuse, it can also be used by predators to generate such material.

It cited data from the Internet Watch Foundation showing that, of 11,108 AI-generated images shared on a dark web forum over a one-month period, 2,978 depicted content related to child sexual abuse.

Related: US President Joe Biden urges tech firms to address risks of AI

The U.K. government said the statement stands as a pledge to “seek to understand and, as appropriate, act on the risks arising from AI to tackling child sexual abuse through existing fora.”

“All actors have a role to play in ensuring the safety of children from the risks of frontier AI.”

It encouraged transparency around plans for measuring, monitoring and managing the ways AI can be exploited by child sexual offenders, and urged countries to build policies on the issue.

Additionally, it aims to maintain a dialogue around combating child sexual abuse in the AI age. The statement was released in the run-up to the U.K.'s global summit on AI safety this week.

Concerns over child safety in relation to AI have been a major topic of discussion amid the technology's rapid emergence and widespread adoption.

On Oct. 26, 34 U.S. states filed a lawsuit against Meta, the parent company of Facebook and Instagram, over child safety concerns.

Magazine: AI Eye: Get better results being nice to ChatGPT, AI fake child porn debate, Amazon’s AI reviews
