The UK may be preparing to hit a handful of tech companies with enforcement orders (and possible fines) related to a children’s online privacy and safety code that has now been in force for a year.
“The ICO is currently investigating how more than 50 different online services comply with the code and has four investigations underway. We have also audited nine organisations and are currently assessing their results,” the data protection watchdog said in a blog post yesterday, marking the first anniversary of the code coming into application.
The Telegraph reported today that two of the four companies under investigation are household-name social media and tech firms.
A decision on whether the ICO will take enforcement action is expected within weeks, the newspaper’s report said.
“This code makes it clear that children are not like adults online and their data needs greater protection,” information commissioner John Edwards told The Telegraph. “We will use our enforcement powers where needed.”
The companies in question have not been named by the newspaper or the ICO, but last November the regulator wrote to Apple and Google after raising concerns over how the pair assess apps on their respective mobile app stores to determine what age ratings to apply.
The ICO described that outreach at the time as an “evidence-gathering process to determine compliance with the code” – so it remains to be seen whether the two tech giants are among the four companies that could face enforcement, or whether they are simply part of the wider group whose compliance the regulator is keeping an eye on.
“Unfortunately, due to the ongoing investigation, we cannot name these companies at this time,” an ICO spokesperson confirmed when asked whether it could share more details.
The ICO first published the children’s code back in 2020. It contains 15 standards billed as “age-appropriate design” – essentially a set of design recommendations for web services likely to be accessed by children, covering features such as setting high privacy defaults and avoiding manipulative engagement tactics and other nudges that may encourage an unhealthy overuse of digital services.
While the ICO regulates personal data (not content), the overarching goal of the guidance is to encourage platforms both to protect children from accessing inappropriate content and to prevent them from being commercially data-mined – with the content side of that responsibility set to fall to Ofcom under the incoming Online Safety Bill (assuming another change of British prime minister does not lead to a legislative rethink).
This split of regulatory responsibilities has created some friction with the child safety campaigners who back the code – indeed, Baroness Kidron, the 5Rights chair and life peer who was a prime mover behind the standard’s adoption (and has since pushed for amendments from her seat in the upper house), is among those complaining about the “gap” while they wait for a content-focused safety law to make it through parliament.
As a result, the ICO has come under pressure to turn its attention to adult sites as well – that is, to require porn sites to comply with the code too – rather than only scrutinising the games and social media apps most popular with children.
Age checks for porn sites?
The end goal of the child safety campaigners’ push is to force adult websites to apply robust age checks so that children cannot access pornography online – essentially a revival of the mandatory age checks for porn sites that UK lawmakers have been discussing for years, most recently revived (earlier this year) as an(other) addition to the Online Safety Bill, after a standalone age verification scheme was dropped in 2019 in the face of criticism that it was unworkable.
Back in February, the government said it would mandate the use of “age verification technology” on adult websites to make it harder for children to access or stumble across pornography. But campaigners evidently will not sit idly by and wait for that legislation to pass – not when the children’s code and UK data protection law already exist for them to leverage…
In a related change of approach announced yesterday, the ICO has bowed to pressure by expanding its interpretation of the code to cover pornographic sites – or at least those that “may” be accessed by children (whatever that means) – writing in its blog post: “We have … revised our position to clarify that adult-only services fall within the scope of the Children’s Code if they may be accessed by children.”
This evolution in how the guidance is applied follows petitioning by child safety campaigners and others warning of the “data protection harms” children risk when visiting pornographic sites, the ICO said.
“We will continue to improve our approach and listen to others to ensure the code has the greatest impact,” it continued. “For example, we’re seeing a growing body of research (from the NSPCC, 5Rights, Microsoft and the UK’s film ratings board) showing that children may be accessing adult-only services, and that these services pose data protection harms – children lose control of their data, or their data may be used to manipulate them into providing more – in addition to content harms.”
This shift in application does not (cannot) extend the ICO’s remit to cover content itself. (“We do not regulate content,” its spokesperson confirmed. “We regulate how children’s personal data is used or processed in order to serve content to children. This is the step before children see the content.”)
It’s clear, though, that porn sites’ data-gathering habits are not the primary concern of child safety campaigners – the content is – but if campaigners can use children’s privacy rules as a lever to force porn sites to implement age checks, they do not look likely to be too picky about the route.
In a statement welcoming the ICO’s revision bringing adult-only websites into the scope of the code, the child safety campaign group the 5Rights Foundation said:
“The UK age-appropriate design code applies to all services likely to be used by children under the age of 18, even if they are not aimed at children. Through investigative work submitted to the ICO last year, 5Rights found that sites including gambling, dating and pornography services were being accessed by children and were not following the code – in particular, profiling children to serve them harmful material.”
“The ICO’s announcement on adult-only websites will provide much-needed clarity for companies that believed the law did not apply to them,” added Duncan McCann, its head of policy enforcement, in a separate supporting statement. “They will no longer have grey areas to hide behind, and we hope this development will help to further improve young people’s lives online.”
While the UK children’s code is not itself legally binding, it is linked to the country’s wider data protection rules, namely the Data Protection Act and the UK GDPR, and ICO guidance notes that in-scope online services “need to follow” the standards to “ensure that they comply with their obligations under data protection law to protect children’s data online”.
Under the GDPR, the ICO has extensive powers to sanction privacy breaches: violators can be fined up to 4% of annual global turnover (or up to £17.5 million, whichever is higher). So the subtext here is essentially “comply with the code or risk GDPR-grade enforcement”, giving the ICO a big stick with which to encourage in-scope digital services to apply the gold-plated rules. That could end up producing a rather more age-gated internet, because who knows what other services children might be “likely” to access?
When asked how adult websites should assess whether children are likely to access their services, a spokeswoman for the ICO replied: “Services must take responsibility for their decisions and be able to provide evidence to support their view on whether children are likely to access their service. To determine whether they fall under the code, adult services need to understand who their users are and establish whether children make up a significant portion of those users. To do this, online services can conduct research on their users; review academic research or commissioned market research; consider the types of content and activities children are interested in and the appeal of their services to children; or consider whether children are known to enjoy similar services.”
The phrase “understand who their users are and establish whether children make up a significant portion of those users” is doing a lot of work in that sentence – although the ICO stops short of explicitly recommending that a service use age verification technology to determine whether it falls within the scope of the code. That came next…
“If children are likely to access an adult-only online service, the service needs to take steps to restrict children’s access, such as by implementing age assurance measures, or must implement the code’s standards in a proportionate, risk-based way to protect children’s privacy online,” the ICO’s spokeswoman also told us, adding: “It is vital that children are looked after online rather than being treated like adults. Embedding the children’s code is a long-term, transformational process, but we are seeing more and more changes that benefit children and make the online industry more innovative, which is the right thing to do.”
The ICO’s blog post also notes that the (privacy) watchdog will be working with Ofcom (the incoming content watchdog) and the Department for Digital, Culture, Media and Sport (DCMS) to “determine how the code applies in practice to adult-only services and what they should expect”. So expect further “evolutions” in how the code is applied as more pieces of the UK’s digital regulation strategy land (or, well, fall away).
The ICO has credited the code with prompting a number of policy tweaks to children’s accounts by major platforms over the past year, including Facebook, Instagram, YouTube, Google and Nintendo – such as Meta-owned platforms restricting the ad targeting of under-18s to age, gender and location only; and YouTube turning off autoplay by default and switching on take-a-break and bedtime reminders by default for Google accounts of under-18s, to cite a couple of the changes it flags.
The UK code has also been credited with inspiring similar policy moves in other jurisdictions – it reportedly influenced a California bill that state legislators passed just this week (and which, if signed into law, would apply similar protections for under-18s in the state), as well as other efforts by regulators and policymakers elsewhere to protect children online.