Vietnam tells foreign social media to use AI to detect 'toxic' content

FILE PHOTO: A man uses a mobile device in a coffee shop in Hanoi

HANOI (Reuters) - Vietnam has told cross-border social platforms to use artificial intelligence (AI) models that can detect and remove "toxic" content automatically, the latest requirement in its stringent regime for social media firms, state media reported on Friday.

Vietnam has repeatedly asked companies like Meta's Facebook, Google's YouTube and TikTok to coordinate with authorities to stamp out content deemed "toxic", such as offensive, false and anti-state content.

"This is the first time Vietnam has announced such an order," state-run broadcaster Vietnam Television (VTV) reported from the information ministry's mid-year review event, which was open to selected newspapers.

The report did not give details on when and how cross-border platforms had to abide by the new requirement.

During the first half of this year, in accordance with government requests, Facebook removed 2,549 posts, YouTube removed 6,101 videos and TikTok took down 415 links, the information ministry said in a statement.

The announcement came as Southeast Asian countries are drawing up governance and ethics guidelines for AI that will impose "guardrails" on the booming technology, Reuters reported this month.

Vietnam in recent years has issued several regulations, alongside a cybersecurity law, that target foreign social media platforms in a bid to battle disinformation and to force foreign tech firms to establish representative offices in Vietnam and store data in the country.

The country last month undertook a comprehensive inspection of short-video platform TikTok's local operations, and preliminary results showed "various" TikTok violations, the information ministry has said.

VTV reported the ministry as saying at Friday's event that U.S. streaming giant Netflix had submitted the documents needed to open a local office in Vietnam.

(Reporting by Phuong Nguyen; Editing by Mark Potter)