eSafety Commissioner continues to take a tough stance in Australia

Recent developments regarding CSEA transparency notices and registration of various industry codes for class 1 material


Key Points

Since the Online Safety Act 2021 (Cth) (the Act) came into force in January 2022, Australia’s eSafety Commissioner (Commissioner) has:

  • issued two sets of transparency notices to large technology multinationals;
  • published two transparency reports regarding child sexual exploitation and abuse (CSEA) material, a particularly high-risk and highly harmful issue that has seen sustained growth in Australia and globally;
  • issued a fine to X (formerly Twitter) in respect of its response to a transparency notice;
  • indicated that she will continue to take a tough stance against CSEA material in Australia;
  • registered six industry codes which regulate class 1 material, with obligations commencing on 16 December 2023 or 12 March 2024 depending on the type of online service; and
  • refused to register the draft industry codes regulating class 1 material for relevant electronic services and designated internet services because they did not provide appropriate community safeguards for users in Australia.

The Commissioner’s proactive regulatory approach indicates that social media services, relevant electronic services and designated internet services must be ready to respond to questions from the Commissioner (particularly given that notices will likely include tight deadlines for responses) and, should a notice be issued, to provide sufficiently detailed and specific (as opposed to generic) responses.

Our detailed update is below.

Background

The Act empowers Australia’s Minister for Communications to set ‘basic online safety expectations’ for social media services (defined in s 13A of the Act), relevant electronic services and designated internet services (defined in s 14 of the Act) (together, the Services) by way of a determination. As at the date of writing, only the Online Safety (Basic Online Safety Expectations) Determination 2022 (Expectations) is in force, which applies to each of the Services.

The Expectations require, for example, providers of the Services to have in place:

  • clear and readily identifiable mechanisms that enable end-users of the Service, as well as any person ordinarily resident in Australia, to report, and make complaints about, certain material;
  • terms of use;
  • policies and procedures in relation to the safety of end-users;
  • policies and procedures for dealing with reports and complaints; and
  • standards of conduct for end-users (including in relation to material that may be posted using the Service by end-users, if applicable), and policies and procedures in relation to the moderation of conduct and enforcement of those standards.

Such terms, policies and standards are required to be readily accessible, regularly reviewed and updated, and set out in plain language.

Service providers are also required to take reasonable steps to:

  • ensure end-users are able to use the relevant Service in a safe manner;
  • proactively minimise the extent to which material or activity on the Service is unlawful or harmful;
  • minimise the extent to which certain material is provided on the Service, including a non-consensual intimate image of a person, cyber-bullying material targeted at Australian children and cyber-abuse material targeted at an Australian adult; and
  • ensure that penalties for breaches of its terms of use are enforced against all accounts held or created by the end-user who breached the terms of use of the Service.

Under the Act (The Act, ss 49, 56), the Commissioner can require a provider of the Services to report (whether periodically or on a one-off basis) on the extent to which it has complied with applicable or specified Expectations during a period determined by the Commissioner. Civil penalties apply in relation to a failure to comply with a notice or determination from the Commissioner requiring the provision of a periodic report (Ibid, s 50).

The Commissioner is permitted to publish summaries of the information received through the notices and has indicated that the purpose of doing so is ‘to improve transparency and accountability of providers by providing better information about what they are actually doing – or not doing – to keep Australians safe, and to incentivise services to improve their safety standards’ (First Transparency Report: ‘Basic Online Safety Expectations: Summary of industry responses to the first mandatory transparency notices’, August 2022).

The Commissioner has issued 13 non-periodic notices since the Act came into force in January 2022 (Annual report for the eSafety Commissioner, FY2022-23, 207) and has subsequently released reports regarding the findings in respect of those notices. Those findings are summarised below.

First mandatory transparency notices and findings

In August 2022, the Commissioner issued non-periodic notices to Apple, Meta (and WhatsApp), Microsoft (and Skype), Omegle and Snap (the provider of Snapchat), requiring each provider to report on its implementation of the Expectations with respect to CSEA material. Service providers were asked about the tools, policies and processes they were using to address various forms of CSEA, such as the proliferation of online CSEA material, the online grooming of children, and the use of video calling and conferencing services to provide live feeds of child abuse.

The Commissioner released her report regarding the responses to those notices in December 2022 (First Transparency Report). As the notices were issued only in respect of the period 24 January 2022 to 31 July 2022, the First Transparency Report reflects the responses provided by each of those entities as at that point in time. It is important to note that the providers of the Services may have implemented changes to policies and processes since this information was provided. Findings from the First Transparency Report included, in relation to:

  • the detection of previously confirmed (‘known’) images and videos: many providers used image hash matching tools to detect such images and videos; however, Microsoft (in respect of OneDrive), Snap (noting that snaps are taken live) and Apple (in respect of iMessage) did not;
  • the detection of new material: some providers used tools, for example via artificial intelligence, to detect the sharing of new CSEA material. Microsoft, Skype and Snap were found not to use such tools; however, Microsoft’s OneDrive, Teams and Xbox Live used classifiers to detect nudity, which also picked up some CSEA material;
  • the detection of CSEA in livestreams or video calls: only Omegle had measures in place to detect such CSEA, while Microsoft (in respect of Teams and Skype) and Apple (in respect of FaceTime) did not;
  • the detection of grooming: while some providers used language analysis tools, Microsoft (on Skype and Outlook), Snap and Apple did not;
  • the detection of recidivism (in this context, referring to offenders creating multiple new accounts after they have been banned from a platform): all providers took steps to prevent recidivism; however, WhatsApp and Omegle were identified as having only ‘minimal’ indicators to prevent banned users from re-registering. In addition, while Microsoft and Skype had cross-service bans in place to ensure that users banned on one subsidiary service for CSEA-related activities were not able to use the provider’s other services, Meta only had such measures in place ‘in specific circumstances’ and WhatsApp did not have such a ban in place; and
  • responding to user reporting of illegal content: the median time for user-reported CSEA material to be actioned by the Service provider ranged from 4 minutes (Snap) to 2 days (Microsoft and Skype). Apple and Omegle did not provide times; the Commissioner had to ask each of them specifically to confirm its user reporting options, as eSafety could not find this information displayed prominently on the relevant Services.

Second mandatory transparency notices and findings

In February 2023, the Commissioner issued a second round of non-periodic reporting notices, also in relation to CSEA material, to Google, Twitter (which later became X), TikTok, Discord and Twitch.

The Commissioner released her report regarding the responses to those notices on 16 October 2023, building on her findings from the First Transparency Report (Second Transparency Report). As the notices were issued only in respect of the period 24 January 2022 to 31 January 2023, the Second Transparency Report reflects the responses provided by each of those entities as at that point in time. It is important to note that the providers of the Services may have implemented changes to policies and processes since this information was provided. Findings from the Second Transparency Report included, in relation to:

  • the detection of CSEA in livestreams or video calls: despite the availability of technology to help detect CSEA material in livestreams or video calls, some providers, including Discord, were not using it;
  • the detection of grooming: Discord, X and Google (in respect of Meet, Chat, Messages and Gmail) were not using language analysis technology to detect online grooming;
  • the blocking of URLs linking to known CSEA material: despite the availability of databases that identify URLs linking to known CSEA material and websites dedicated to it, some providers were not using them, including Discord, Google (in respect of YouTube, Drive, Meet, Chat, Photos, Messages, Gmail and Blogger) and TikTok (in respect of direct messages);
  • community moderation:
    • ‘In user-governed online communities, some Service providers used appointed volunteers to actively support content moderation and enforcement of community rules, alongside the tools and resources deployed by the Service itself. These volunteer roles, such as creators, streamers, administrators and moderators, are given administrative power to remove unacceptable material and ban violators. Where there is no standards enforcement policy that outlines the responsibilities and expectations of these volunteers, enforcement of rules can be inconsistent.’
    • ‘In addition, a lack of engagement between volunteer moderators and the Trust and Safety staff of a [Service provider] increases the risk of sexual predators continuing to abuse and re-victimise children, because they may only be banned from a specific channel or group, rather than the whole [Service].’
    • Discord was found not to have a standards enforcement policy outlining the responsibilities and expectations of volunteer moderators or administrators. Professional Trust and Safety staff at Discord and Twitch were also found not to be automatically notified when volunteer moderators, administrators or creators took action against CSEA material, and users on both Discord and Twitch were not able to directly report volunteer moderators, administrators or creators for failing to enforce rules;
  • language coverage: it is important that Service providers have human moderators operating in the languages of the communities that use their Services. The number of languages of human moderators on the platforms ranged from 12 (X) to 73 (TikTok). In this respect, X noted that it ‘has the ability to seek and conduct vendor services in a range of additional languages, which include but are not limited to those needed in the event of additional reviews, or for emergencies’; and
  • responding to user reporting of illegal content: the responses to the notices highlighted significant differences in the time that Service providers took to consider and respond to user reports about CSEA material. For those Service providers that provided a median time for the actioning of CSEA material, times ranged from 5.2 minutes (TikTok) to 13 hours (Discord, in respect of direct messages).

First infringement notice fine

In the Second Transparency Report, the Commissioner also indicated that she had closely considered the responses provided and found that X did not comply with the transparency notice given to it to the extent it was able: in the 7 months following the issue of the notice, X failed to provide any response to some questions (including by leaving boxes entirely blank) and in other instances provided responses that were incomplete and/or inaccurate. X was given further opportunities to provide the required information, but the Commissioner found that significant gaps remained. As a result, the Commissioner issued X a service provider notification and an infringement notice fine of AU$610,500 for its non-compliance. See our update here.

The Commissioner also issued a formal warning to another notice recipient because she found that it had provided answers that were irrelevant or generic and, in other instances, had provided aggregated information across multiple services in response to requests for information regarding specific services.

Takeaways

Across both the First Transparency Report and Second Transparency Report, the Commissioner has found that there are serious shortfalls in how some Service providers detect, remove and prevent CSEA material, inconsistencies in how providers deal with this material across their different Services, and significant variations in the time it takes them to respond to public reports regarding CSEA material.

More generally, the Commissioner has indicated that she will continue to use the full range of powers available to her under the Act to ensure transparency and to hold Service providers to account.

Looking forward

The Act also provides for industry bodies to develop codes and standards and for the Commissioner to register such codes if they meet the statutory requirements set out in the Act.

Following consultation with industry bodies, the Commissioner has decided to register five codes which regulate ‘class 1 material’ and relate to social media services, app distribution services, hosting services, internet carriage services and equipment. The obligations contained in these codes will come into effect on 16 December 2023. An additional industry code for internet search engine services was registered on 12 September 2023 and will come into effect on 12 March 2024.

The Commissioner decided not to register the industry codes submitted in relation to relevant electronic services and designated internet services on the basis that they ‘did not provide appropriate community safeguards for users in Australia’. The Commissioner will draft industry standards for those industry sections following consultation with those industries and the public.

The second phase of codes development which focuses on class 2 material (such as online pornography) has not yet commenced.