SINGAPORE (ANN/STRAITS TIMES) – From March 31, app stores in Singapore will be required to screen the age of young users before allowing them to download apps meant for adults. The move, likened to age checks at pub entrances, is part of new regulations aimed at protecting children from exposure to harmful content online.
Under the new guidelines, users under the age of 18 will be barred from downloading adult apps, including dating apps and those with explicit or sexual content. The Infocomm Media Development Authority (IMDA), Singapore’s media regulator, is introducing a new Code of Practice for Online Safety covering app stores to enforce this age-screening requirement. Although the code takes effect on March 31, app stores will have until March 2026 to put the age checks in place.
The primary goal of the initiative is to safeguard children from harmful content such as violence, sexual material, and cyberbullying. Apple, Google, Huawei, Samsung, and Microsoft, which operate the major app stores, will all be affected by the new rule.
Violations of the rule could result in fines of up to S$1 million or the blocking of non-compliant services under the Broadcasting Act, which was amended in 2023 to address concerns over social media platforms and app stores.
Benefits of the new code
One key feature of the new code is age screening, which aims to block children from downloading apps meant for adults. Chew Han Ei, an adjunct senior researcher at the Institute of Policy Studies, explained that app stores act as “gatekeepers,” ensuring better content regulation by implementing age checks at the point of access.
For parents like Madam Salizawati Abdul Aziz, a teacher with six children, the new code offers a solution to the ongoing issue of children circumventing age verification systems to access inappropriate content. “By having this code, app stores can help restrict young users from downloading inappropriate apps and prevent unnecessary exposure,” she said.
The code will also help the authorities curb harmful user-generated content within apps, such as inappropriate chats or posts on social media. App stores will be required to act against developers who fail to address complaints about harmful content or to take steps to prevent it; such action may include banning or removing the offending app from the store.
The focus on user-generated content is critical, as platforms like Minecraft and Roblox have been criticised for allowing predators and inappropriate content to target children via in-game chats and forums.
Collaboration with existing regulations
This new code will complement the 2023 Code of Practice for Online Safety, which mandates that social media platforms address harmful content affecting children. App developers will be required to monitor their platforms for online harms, provide parental controls, and act on user reports of harmful content.
Associate Professor Carol Soon from the National University of Singapore’s (NUS) Department of Communications and New Media emphasised that the new code adds an extra layer of protection by targeting app stores, which act as entry points to a wide range of online services and products not covered by existing laws.
Challenges ahead
One of the major challenges in enforcing the code is verifying a user’s age accurately. The IMDA has suggested two methods: checking government-issued IDs, or using technology such as artificial intelligence (AI)-based facial age estimation.
While government IDs are considered accurate, concerns over privacy and data security remain. Josh Lee, a tech adviser from Rajah & Tann, warned that personal data collected for verification could be misused. To address this, some experts have suggested zero-knowledge proofs, a cryptographic technique that lets a user prove a claim, such as being over 18, without revealing the underlying data, such as a birthdate.
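To make the idea concrete, the sketch below illustrates the data-minimisation principle behind such proposals in Python. It is deliberately not a real zero-knowledge protocol: it stands in a simpler signed attestation from a hypothetical trusted identity provider, and every name in it (ISSUER_KEY, make_attestation and so on) is invented for illustration. What it demonstrates is the property Lee’s concern points at: the app store can verify an “over 18” claim without ever seeing the birthdate.

```python
import hmac
import hashlib
from datetime import date

# Hypothetical secret held by a trusted identity provider. A real deployment
# would use public-key signatures, so the verifier cannot forge attestations;
# HMAC simply keeps this sketch self-contained.
ISSUER_KEY = b"issuer-demo-secret"

def make_attestation(birthdate: date, today: date) -> tuple[bool, bytes]:
    """The identity provider checks the birthdate privately and signs
    only the boolean outcome; the raw date never leaves the issuer."""
    had_birthday = (today.month, today.day) >= (birthdate.month, birthdate.day)
    age = today.year - birthdate.year - (0 if had_birthday else 1)
    is_adult = age >= 18
    msg = b"adult" if is_adult else b"minor"
    tag = hmac.new(ISSUER_KEY, msg, hashlib.sha256).digest()
    return is_adult, tag

def verify_attestation(is_adult: bool, tag: bytes) -> bool:
    """The app store checks the issuer's tag and learns only the boolean."""
    msg = b"adult" if is_adult else b"minor"
    expected = hmac.new(ISSUER_KEY, msg, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# The store receives only (claim, tag), never the birthdate itself.
claim, tag = make_attestation(date(2010, 6, 1), date(2026, 3, 31))
assert verify_attestation(claim, tag)
assert claim is False  # a 15-year-old is correctly flagged as a minor
```

A production scheme would replace the shared HMAC key with the issuer’s public-key signature, since a key shared with the app store would let the store mint its own attestations; a true zero-knowledge proof goes further still, letting the user prove the claim without the issuer participating in each check.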
Another option under consideration is facial age estimation, which infers a user’s age from a camera image rather than identifying the individual. However, experts have raised concerns over the accuracy of such systems, as variables like camera quality, skin tone, and facial structure can affect the estimate. Estimation is also less reliable for teenagers, who undergo rapid physical changes, which places the largest error margins precisely around the 18-year threshold.
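One common mitigation in age-assurance systems, described here as general background rather than anything the Singapore code prescribes, is to apply a buffer around the threshold: only estimates comfortably above 18 are accepted automatically, and borderline results fall back to a stronger check such as an ID document. A minimal Python sketch, with hypothetical function names and thresholds:

```python
def screening_decision(estimated_age: float,
                       threshold: float = 18.0,
                       buffer: float = 3.0) -> str:
    """Hypothetical triage rule for a facial age estimate.

    The buffer absorbs the estimator's error margin: clear-cut cases
    are decided automatically, while the teen range, where estimates
    are least reliable, is routed to a stronger check.
    """
    if estimated_age >= threshold + buffer:
        return "allow"              # clearly an adult even with model error
    if estimated_age < threshold - buffer:
        return "block"              # clearly a minor
    return "escalate_to_id_check"   # 15-21 band: the estimate alone is not enough

# A 19-year-old estimate falls inside the uncertain band and is escalated.
assert screening_decision(19.2) == "escalate_to_id_check"
```

Widening the buffer trades convenience for safety: fewer teenagers slip through on a flattering estimate, but more young adults are asked for ID.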
Despite these challenges, parents will continue to play a crucial role in monitoring and controlling their children’s app usage. Experts suggest that notifications, such as alerts sent to parents when a child attempts to download an adult app, could help mitigate the risks.
Potential gaps and future considerations
As the five major app stores begin to implement these age verification measures, gaps may remain on platforms outside the code’s scope, such as the PC gaming marketplace Steam. With millions of users worldwide, Steam has been criticised for lax age verification that merely asks users to state their birthdate before downloading mature-rated games.
Professor Soon also noted that while the new code targets the five most dominant app stores, it could eventually be expanded to include other platforms as regulators refine their age verification methods.
In the meantime, experts and parents alike agree that the code is an important step forward in protecting children from online harms. Its success, however, will ultimately depend on effective implementation and on the involvement of both tech companies and parents.