European Commission Requests Information from Meta Platforms and TikTok


The European Commission has called on Meta Platforms and TikTok to detail the actions they have taken to combat disinformation and illegal content on their platforms. The request follows the recent attack by Hamas on Israel.

Under the newly implemented Digital Services Act, the companies have until Wednesday to submit their reports on crisis response strategies during the conflict. The European Union’s executive arm emphasized its focus on tackling the dissemination and amplification of illegal content and misinformation.

The ongoing conflict between Israel and Hamas has posed a significant content-moderation challenge for social media platforms, which must sift through a flood of misidentified videos, fabricated information, and graphic violence.

The European Union has specifically urged TikTok, a subsidiary of Chinese tech giant ByteDance, to disclose the measures it has taken against the spread of terrorist and violent content, as well as hate speech, on its platform.

In response, TikTok has stated that it is currently reviewing the request and plans to release its first transparency report under the Digital Services Act next week. Additionally, the company announced the establishment of a command center to address the crisis, alongside a series of other measures.

Stay tuned for further developments as Meta Platforms and TikTok respond to the Commission's requests regarding their efforts to combat disinformation and illegal content.

Enhanced Measures to Address Graphic and Violent Content

Meta, the parent company of Facebook and Instagram, has announced additional measures to curb the spread of graphic and violent content. These measures aim to protect both moderators and users from exposure to harmful material. One such measure is real-time proactive automated detection, which swiftly identifies and removes inappropriate content.

Furthermore, Meta plans to increase the number of moderators proficient in Arabic and Hebrew, which will help it identify and address harmful content in those languages more effectively.

In a recent update, Meta reiterated that it prohibits content praising Hamas, which it designates a dangerous organization. Default settings in the region have also been adjusted to shield users from unwanted comments, and the platform has introduced a way to delete comments in bulk.

The European Commission (EC) is closely monitoring these efforts and has requested information from Meta. It will evaluate the company's response and may open formal proceedings if necessary; failure to comply with its requests can result in penalties.

Under the Digital Services Act, social media companies bear legal responsibility for the content shared on their platforms. Failure to remove illegal or harmful content can lead to significant fines, potentially amounting to 6% of the company’s global turnover.

Meta has yet to respond to media inquiries seeking comment on these recent developments.

In a separate matter, the EC has requested information from both social media platforms about measures undertaken to safeguard the integrity of electoral processes. The deadline for providing this information is November 8th.

