The first batch of reports from the platforms must be submitted in the second half of 2024. Each report must detail the steps the platform has taken to mitigate Singapore users’ exposure to harmful content, how much and what types of harmful content users here encounter on the service, and what actions were taken on user reports. IMDA will also collect annual online safety reports from each platform, which the agency will publish online to help users make an informed choice about which platform is best suited to provide a safe user experience.

Platforms are required to prioritise user reports based on severity or imminence as a general principle, IMDA told ST in a separate statement. The agency did not give a timeframe within which platforms are expected to respond to reports.

The platform should take appropriate action on user reports in a timely and diligent manner, and inform the users concerned of its decision and any action taken in response to the reports, said IMDA. This includes providing safety resources or information on support centres. Each social media service is expected to have effective and easy-to-use reporting mechanisms to flag harmful content or unwanted interactions. Users who use high-risk terms related to self-harm or suicide must be actively offered local safety information that is easy to understand.

Parents and guardians must also be given tools to manage the content their children can see, the public visibility of their accounts, and permissions for who can contact and interact with them. Platforms are also required to include tools that allow children or their parents to manage their safety on these services, and mechanisms for users to report harmful content and unwanted interactions. This could apply to advertisements that involve alcohol or body-modification and weight-loss products, The Straits Times understands.
“Accounts belonging to children must not receive advertisements, promoted content and content recommendations that designated social media services are reasonably aware to be detrimental to children’s physical or mental well-being,” said IMDA.

Users should also be given tools to manage their own safety, such as the option to hide harmful content and unwanted interactions, and to limit location sharing and the visibility of their accounts to other users. Each platform must also create separate community guidelines for younger users, moderate content for them, and provide online safety information that they can easily understand. Using technology and other processes, the social media service must minimise users’ exposure to any content related to child sexual exploitation and abuse, or content promoting terrorism, said IMDA.