Documents containing statistics on reports and outcomes

This request has been withdrawn by the person who made it. There may be an explanation in the correspondence below.

Dear eSafety Commissioner,

If possible, please treat this as an informal or administrative request. Otherwise, please treat this as a formal request for documents under the Freedom of Information Act 1982.

I request copies of documents created between 1 Jan 2020 and 1 Jan 2021, inclusive, that contain the following information:

- The number of reports made to the eSafety Commissioner, and their category (e.g. image-based abuse, cyber-bullying, illegal or harmful content, etc.)
- The number of reports made to the eSafety Commissioner that resulted in a successful takedown of content, and the type of content that was taken down or otherwise removed.
- The number of reports made to the eSafety Commissioner that did not result in a successful takedown of content, and the reason the takedown was unsuccessful.
- The number of reports made to the eSafety Commissioner that were not actionable or were abandoned, and the reason they were not actionable or were abandoned.
- The number of reports made to the eSafety Commissioner that were determined to be fake, malicious, or otherwise not made in good faith.
- The number of reports made to the eSafety Commissioner that were repeats or follow-ups to earlier reports.
- The number of reports made to the eSafety Commissioner that were appealed by either the subject of the report or the platform hosting the content, and the outcome of that appeal.
- The number of reports made to the eSafety Commissioner that resulted in a takedown of content where the takedown was later found to have been made in error.

If there are multiple drafts of a document, I request only the final version. If there is no final version, I request only the latest draft. For the avoidance of doubt, if a document is regularly produced, such as a monthly report, I request the final or latest draft of each instance of the report (i.e. monthly for monthly reports, weekly for weekly reports, etc.).

I request that the documents be provided in electronic form.

I further request that any charges associated with this request be waived on public interest grounds. The eSafety Commissioner administers a substantial budget, and its ability to use this budget effectively to achieve its objectives is a matter of public interest; effective oversight of public expenditure is a key function of the Freedom of Information Act. The requested documents will also help to inform debate on issues that are of public importance.

Yours faithfully,

Justin Warren

enquiries@esafety.gov.au, Children’s eSafety Commissioner

5 Attachments

Dear Justin,

 

We are writing in relation to your correspondence of 24 February 2021.

 

We confirm that we are happy to handle your request as an administrative
access request, rather than pursuant to the formal process outlined under
the Freedom of Information Act 1982 (Cth) (FOI Act). Relevantly, section
3A(2) of the FOI Act specifically states that the Act does not limit an
agency’s power to give access to information under other legislative or
administrative arrangements.

 

Given this is not a formal freedom of information request, please provide
a direct email address that we can use to correspond with you about your
administrative access request.

 

We look forward to hearing from you.

 

Kind regards,

 

FOI Coordinator

eSafety enquiries

The eSafety Commissioner

 


 

eSafety acknowledges the Traditional Custodians of country throughout
Australia and their continuing connection to land, water, culture and
community. We pay our respects to Elders past, present and emerging.

 

 

 


Dear [email address],

I have contacted you via email to discuss your offer to provide this information by administrative means.

Yours sincerely,

Justin Warren

Justin Warren left an annotation ()

I sent the following email, personal details redacted:

Dear FOI Coordinator,

Thank you for your offer to provide the information I have requested in
my FOI request
(https://www.righttoknow.org.au/request/d...)
via administrative means.

I can be contacted at this email address, and on the number below, to
discuss your offer.

Justin Warren left an annotation ()

I received the following email from the eSafety FOI Coordinator:

Dear Justin,

Thank you for confirming you’re happy to handle your request as an administrative access request. Thank you also for providing us a direct email and phone number to correspond with you about your request.

We are in the process of preparing a response to your request. We will provide you a response in due course.

If you have any questions, please contact enquiries@esafety.gov.au

Kind regards,

FOI Coordinator

The eSafety Commissioner

Justin Warren left an annotation ()

I responded to the FOI Coordinator's email:

On 4/3/21 12:06 pm, enquiries@esafety.gov.au wrote:
>
> Dear Justin,
>
>
>
> Thank you for confirming you’re happy to handle your request as an administrative access request. Thank you also for providing us a direct email and phone number to correspond with you about your request.

You misunderstand me. I confirmed no such thing.

I said I was happy to discuss how you will fulfil my request using administrative means. If I am satisfied that this will provide me with the documents I have requested in a timely fashion, then I will withdraw my FOI request.

Please confirm that you intend to provide me with all of the documents I have asked for and an estimate of when I will receive this information.

Otherwise, the FOI request remains valid and I expect eSafety to comply with the statutory requirements for processing such a request.

Justin Warren left an annotation ()

I'll be chatting with eSafety about this request on Monday 8 March 2021.

Justin Warren left an annotation ()

Some additional email correspondence:

>
>
> Dear Justin,
>
>
>
> Thank you for your time on Monday discussing your request. Thank you also for providing a follow up email clarifying your request, this was helpful.
>
>
>
> As discussed, please see below the information we anticipate being able to provide you for your request. We hope this assists you in making a decision as to whether you want to proceed with your request as an administrative access request or a formal request under the Freedom of Information Act 1982 (Cth) (FOI Act).
>
>
>
> To ensure we can process your request in a timely manner, we would appreciate you confirming as soon as possible whether you would like to proceed with your request as an administrative access request or a formal request under the FOI Act.

I think we can proceed with an administrative request, based on the information below.

Some additional clarifying questions inline below.

>
>
> ________________________________________________________________________________
>
>
>
>
>
> - The number of reports made to the eSafety Commissioner, and their category (e.g. image-based abuse, cyber-bullying, illegal or harmful content, etc.)
>
>
>
> This information can be provided for all schemes.
>
>
>
> - The number of reports made to the eSafety Commissioner that resulted in a successful takedown of content, and the type of content that was taken down or otherwise removed.
>
>
>
> This will depend on the scheme. We can provide information in relation to reports where this information is available. We can also outline the factors and circumstances as to why this information may not be available.
>
>
>
> - The number of reports made to the eSafety Commissioner that did not result in a successful takedown of content, and the reason the takedown was unsuccessful.
>
>
>
> We can provide high level information relating to this request.
>
When you say "high level information", what does this mean?

>
>
>
> - The number of reports made to the eSafety Commissioner that were not actionable or were abandoned, and the reason they were not actionable or were abandoned.
>
>
>
> eSafety does not ‘abandon’ any reports.
>
>
>
> We can provide high level information relating to this request, including information about the matters we take into account when:
> exercising our discretion to investigate; and
> considering what action to take in respect of a report.
>
Okay, that would be useful.

To clarify, I'm after an idea of how many reports didn't progress past the initial triage process for one reason or another.

An aggregate count of all requests that couldn't be progressed, possibly due to your exercising discretion, would be fine. Ideally also grouped by category of reason (discretion exercised, insufficient detail, that sort of thing) if you do that.

If you don't categorise these kinds of reports, that'd be good to know instead.
>
>
>
>
> - The number of reports made to the eSafety Commissioner that were determined to be fake, malicious, or otherwise not made in good faith.
>
>
>
> We can provide information about the number of hoax, frivolous or vexatious complaints.
>
>
>
> - The number of reports made to the eSafety Commissioner that were repeats or follow-ups to earlier reports.
>
>
>
> We can provide high level information about the reasons people may reengage with eSafety on the same or connected reports.
>
Again, what does "high level information" mean?

Ideally I'm trying to get a feeling for how many times people "re-engage". Their reasons for doing so (grouped by category) would also be useful to know.
>
>
>
> - The number of reports made to the eSafety Commissioner that were appealed by either the subject of the report or the platform hosting the content, and the outcome of that appeal.
>
>
>
> No reports have been subject to appeal.
>
>
>
> - The number of reports made to the eSafety Commissioner that resulted in a takedown of content where the takedown was later found to have been made in error.
>
>
>
> No reports that have resulted in takedown of content have later been found to have been made in error.
>
>

Justin Warren left an annotation ()

Email response from eSafety:

Dear Justin,

Thank you for your email.

Based upon your response, we will proceed with an administrative access request, rather than a formal request under the Freedom of Information Act 1982 (Cth). As such, please withdraw your freedom of information request of 24 February 2021, so that we can provide you the information under an administrative access request.

Where we have indicated high level, we will do our best to provide information in response to your comments.

We will endeavour to get back to you as soon as possible, ideally within a week.

Kind regards,

FOI Coordinator

eSafety Commissioner

Dear [email address],

As discussed over email, I withdraw this FOI request and look forward to receiving the information I have requested administratively.

Yours sincerely,

Justin Warren

enquiries@esafety.gov.au, Children’s eSafety Commissioner

5 Attachments

Hi Justin,

 

Thank you for your email confirming withdrawal of your request under the
Freedom of Information Act 1982 (Cth).

 

We are currently processing your administrative access request and
anticipate being able to send you our response early next week.

 

Thank you for your patience.

 

Kind regards,

 

eSafety enquiries

The eSafety Commissioner


 

 


enquiries@esafety.gov.au, Children’s eSafety Commissioner

7 Attachments

Dear Justin,

 

Please see below responses to your administrative access request.

 

 1. The number of reports made to the eSafety Commissioner, and their category (e.g. image-based abuse, cyber-bullying, illegal or harmful content, etc.)

 a. Cyber Report - 24,198 URLs reported between 1 January 2020 and 28 February 2021.
 b. Image-based abuse - 3,660 reports received between 1 January 2020 and 28 February 2021.
 c. Cyberbullying - 2,699 reports received between 1 January 2020 and 28 February 2021.

 

 2. The number of reports made to the eSafety Commissioner that resulted in a successful takedown of content, and the type of content that was taken down or otherwise removed.

 a. Cyber Report

 i. Cyber Report prioritises the investigation of online child sexual abuse material (CSAM). In 2020, all CSAM investigated was hosted overseas and referred to law enforcement. eSafety does not have visibility of what happens to our CSAM results once they are referred to law enforcement partners.

 ii. In the time period, we issued 6 Abhorrent Violent Material (AVM) notices, 5 of which led to removal. AVM notices put a provider on notice of material that, if not removed, could lead to prosecution under the Criminal Code (Cth).

 b. Image-based abuse

 i. During the 2019-2020 annual reporting period, eSafety was successful in having image-based abuse material removed in 82 per cent of cases where removal was requested.

 ii. During the 2019-2020 annual reporting period, eSafety also alerted social media services to almost 970 accounts that were being misused to share, or threaten to share, intimate content.

 iii. The majority of the material was posted on exposé or pornography sites.

 c. Cyberbullying

 i. Approximately 90 per cent of the material eSafety requested to be removed in 2020 was removed.

 

 3. The number of reports made to the eSafety Commissioner that did not result in a successful takedown of content, and the reason the takedown was unsuccessful.

 a. Reports may not always result in the takedown of content. Further, where a request is made for removal of content, eSafety may not always know the outcome of such a request. Examples of why this may be the case include:

 i. the matter may be dealt with by local or overseas law enforcement partners. For example, eSafety does not have visibility of what happens to our CSAM results once they are referred to law enforcement partners;

 ii. not all reports require the removal of content. For example, reporters are often seeking advice about their options, including in relation to preventative action. In addition, where a report relates to a threat to share intimate images, rather than the actual sharing of intimate images, this will also be captured by the scheme even where the images have not been posted online;

 iii. eSafety may take other steps to assist reporters. For example, where eSafety is unable to effect removal of intimate content posted online, it takes steps to limit the discoverability of the content, typically by requesting removal of the content from search engine results;

 iv. material may have already been removed; or

 v. there are mitigating circumstances of which eSafety was previously unaware because of subscriber or privacy walls.

 

 4. The number of reports made to the eSafety Commissioner that were not actionable or were abandoned, and the reason they were not actionable or were abandoned.

eSafety does not ‘abandon’ any reports. When exercising our discretion to investigate, and considering what action to take, we consider a number of different matters, depending on the applicable legislative scheme, as set out below:

 a. Cyber Report

 i. All content that is reported to Cyber Report is assessed by an investigator to determine hosting location and likely classification under the National Classification Scheme.

 ii. Cyber Report prioritises investigations into online CSAM, pro-terror content, and Australian-hosted prohibited content.

 iii. The team exercises discretion where content is not sufficiently serious.

 b. Image-based abuse

 i. The most common reason reports are not actioned is that there is no Australian connection, as required by the legislation.

 ii. Less common reasons for reports not being actioned include the report not involving intimate content (as defined by the legislation) or the reporter having no standing to report under the legislation.

 iii. Whether eSafety can take action also depends on whether we have received sufficient information or whether we require more information from the complainant and are waiting for this to be provided.

 iv. Unless a report is deemed a hoax, eSafety always responds to complainants and provides advice, even when a report is assessed as not actionable or where the complainant does not continue to engage with us.

 c. Cyberbullying

 i. Often a matter will be closed because the complainant doesn’t respond to a request for more information.

 ii. Matters also may not meet the legislative threshold for serious cyberbullying, so won’t result in removal requests to the providers.

 iii. In these cases, we still provide advice and support to the complainant with managing cyberbullying and building resilience.

 5. The number of reports made to the eSafety Commissioner that were determined to be fake, malicious, or otherwise not made in good faith.

In respect of hoax, frivolous or vexatious complaints:

 a. Cyber Report

 i. No complaints have been considered to be hoax, frivolous or vexatious complaints.

 b. Image-based abuse

 i. Only a very small number of reports have been considered to be hoax reports. eSafety generally provides a response to all reports if valid contact details are provided.

 c. Cyberbullying

 i. 169 complaints were identified as hoaxes.

 6. The number of reports made to the eSafety Commissioner that were repeats or follow-ups to earlier reports.

Individuals commonly re-engage with eSafety on the same or connected reports for a number of reasons.

The nature of some material, such as CSAM, means it is likely to be reposted at different URLs, which may result in people making multiple reports about the same content at different locations. Online content reports can also be made anonymously.

Victims of image-based abuse may also discover new content online and report this additional content to eSafety. These reports are generally saved on the original report file. Additionally, multiple reports are sometimes submitted on or around the same time as the initial report. These are categorised as ‘duplicate reports’.

 

 7. The number of reports made to the eSafety Commissioner that were appealed by either the subject of the report or the platform hosting the content, and the outcome of that appeal.

No reports have been subject to appeal.

 8. The number of reports made to the eSafety Commissioner that resulted in a takedown of content where the takedown was later found to have been made in error.

No reports that have resulted in takedown/removal of content have later been found to have been made in error.

 

Thank you again for your patience.

 

 

Regards,

 

eSafety enquiries

The eSafety Commissioner

 


 

 


 

 


Dear [email address],

Thank you for this information.

I note that counts of the sub-categories of Cyber Report are not provided for anything other than Abhorrent Violent Material (AVM). I am somewhat surprised that these subtotals are not provided, given that these reports made up the vast majority of reports to eSafety within the period (79% of reports), and that I asked for totals of reports to be provided by category. Perhaps you misunderstood what I was asking for?

You have explained that your process for Cyber Reports involves an assessment "by an investigator to determine hosting location and likely classification under the National Classification Scheme." You further explain that you prioritise "investigations into online CSAM, pro-terror content, and Australian-hosted prohibited content."

Could you please provide totals for each sub-category of Cyber Report, including whether the content was successfully taken down or not, and the reason (if known)?

I would therefore expect to see sub-totals that add up to the total 24,198 Cyber Reports, broken into the following categories, as a minimum:

- CSAM
- Pro-Terror Content
- Australian-hosted prohibited content
- [other categories you may use]
- Non-priority reports

It would also be useful to know the totals for the determination categories made by investigators:
- Hosting location
- Likely classification under the National Classification Scheme

This will assist in understanding how many reports received by eSafety fall into the priority categories, and whether they are actionable within Australia or must be referred to other entities.

Yours sincerely,

Justin Warren

Justin Warren left an annotation ()

No response to this follow up so far, so I've resent it from my personal email.

Justin Warren left an annotation ()

eSafety responded this morning. Email received at 10:34am:

Thank you for your email. We confirm we have received your request and will respond as soon as possible.