June 2024
• If NDIA decides not to extend use of Copilot, trial ends – 30 June 2024.
• Removal of existing licenses and return to BAU.
Goals and objectives
Business objectives
• To meet the project objectives, as outlined by the DTA.
• To effectively engage with trial staff and encourage their active participation in using Copilot.
• Capture useful information through various feedback mechanisms.
• To build the knowledge, capabilities and skill base of trial staff with Copilot, in line with the NDIA’s
continuous improvement approach.
• To raise and build awareness and knowledge of Copilot across all NDIA staff, in readiness for
wider implementation of the application in the future.
• To raise awareness of Copilot as one of the use cases of safe and responsible use of AI in the
workplace to improve administrative efficiencies.
Communications and engagement objectives
Objective: To build awareness and foster the desire for change to using Copilot for all NDIA staff.
How it will be measured: Engagement statistics for Intranet and Huddle notices and milestone stories.

Objective: To build knowledge about Copilot and increase confidence in technical abilities for trial staff.
How it will be measured: Statistics from Copilot intranet page; attendance and engagement in training/workshops; survey feedback; community of practice feedback.

Objective: Influence buy-in with trial staff to ensure active participation and successful engagement.
How it will be measured: Attendance and engagement at presentations, leadership forums and community of practice discussions; click-through to links in CEO newsletter.
Current and future state
• Current state
o The NDIA is participating in the whole of government trial of Copilot. The trial is one way
the Australian Government, in partnership with Microsoft, is exploring the safe and
responsible use of AI in the public service.
o The introduction of Copilot will help trial staff to more efficiently complete administration
tasks. This means they will have more time to focus on valuable work that aligns with
NDIA’s strategic goals.
o Copilot is designed to enhance productivity and efficiency by automating repetitive and time-consuming tasks, allowing employees to focus on higher-value work.
Communications and engagement risk
Risk: Concern about the impact of AI on staff roles.
Mitigation: Clear and transparent messaging in all collateral, presentations and meetings. Follow up on any feedback and circulate.

Risk: Negative sentiment, limited understanding and scepticism about use of AI technologies in the workplace.
Mitigation: Build awareness and educate through clear messaging about the benefits of AI and its application in Copilot. Provide examples.

Risk: Trial staff face unexpected challenges when using the application (e.g. technical issues).
Mitigation: Reference project contingency plan and mitigation strategies. Project team to manage messaging via service desk.

Risk: Concerns about data security, particularly for NDIS participants.
Mitigation: Clear and consistent messaging across all communications and engagement channels.

Risk: Concerns about the accessibility aspects of using Copilot.
Mitigation: Regularly share and highlight EDN insights about accessibility.

Risk: Change fatigue and resistance to change – trial staff concerned they do not have the skills or face extra effort to learn a new system; concern it may impact job duties.
Mitigation: Feedback loop and proactive encouragement from leaders. Consistent messages to address concerns.

Risk: Self-directed online training and passive level of engagement with trial staff.
Mitigation: Engaging tech savvy staff as coaches or super-users to support less confident staff. Community of practice approach with active discussion boards and presentations to drive use of technology.
Key messages
Audience 1: General
• The NDIA is participating in a whole of government trial of Copilot for Microsoft 365
Enterprise (Copilot), starting from February 2024.
• Copilot is an artificial intelligence (AI) assistant for Microsoft 365. It is designed to improve
productivity by helping staff complete administration tasks more efficiently.
• By using Copilot, participating staff in the trial will also be able to test its accessibility and how
it works with assistive technology.
• The NDIA was invited to provide valuable insights about accessibility using Copilot. This
information will inform the Digital Transformation Agency and Microsoft about how to better
support the needs of people with disability.
• Staff in the initial trial will include:
o Senior Executive Service (SES) staff and their support staff.
o Employee Disability Network (EDN) members.
o Office of the Chief Information Officer (OCIO) staff.
• Led by the OCIO, the project team will be collecting feedback and data throughout the trial to
report to the Digital Transformation Agency. We will regularly share updates with staff.
• If the whole of government trial is successful, we will look to roll this out to all NDIA staff in the
future.
• You can find out more on the Copilot for Microsoft 365 intranet page.
How can I use Copilot?
• During the trial, the capabilities of Copilot will be tested in a public service setting. Examples
of how Copilot may be used include:
o Creating PowerPoint presentations from Word documents.
o Creating first drafts in Word sourced from your files and ready for you to edit. Or you
can prompt Copilot to shorten, rewrite or give feedback.
o Using Copilot in a Teams meeting to develop minutes from the meeting. It can also be
used to identify where people agreed and where they disagreed. It can suggest action
items in real time.
o Summarising long email threads or providing drafts of replies in Outlook to quickly
clear your inbox.
o Assisting with different tasks, like drafting emails and managing deadlines. For
example, Copilot can quickly provide a summary of large documents to help staff with
dyslexia.
o Checking and amending documents so they meet accessibility requirements.
o Using Copilot in Excel to quickly analyse trends and create data visualisations.
• Copilot uses natural language processing to understand user commands, making it user-friendly for individuals who may find traditional user interfaces challenging.
• Copilot is compatible with screen readers, making it accessible to visually impaired users.
• Copilot will only access internal information from Microsoft applications in this trial. It will not
be able to access information from Jira, Confluence or Salesforce, as these applications
remain outside of the scope of the trial. This will minimise the possibility of NDIS participant
data being accessed. You can visit the Microsoft website to find out more about data, privacy
and security.
Audience 2: Trial staff
• Microsoft will be providing training to trial staff to introduce Copilot and to develop their AI
skills and literacy.
• The dedicated Copilot intranet page includes all knowledge articles, training sessions,
resources, frequently asked questions, success stories and helpful links. This information will
help you use Copilot with confidence.
• Trial staff can access the 2024 Copilot for Microsoft 365 trial – Workspace in Microsoft
Teams. This channel will operate like a community of practice, where you can ask questions
and share insights in real time.
• We have been assigned a limited number of Copilot licences. Active trial staff will regularly
use Copilot in their day-to-day work. If a Copilot licence is not being used regularly, it will be
reassigned to another staff member.
• Trial staff will need upgrades to their Microsoft Outlook and Microsoft Teams applications
before using Copilot. The OCIO project team will manage these upgrades for all trial staff.
• Copilot will only be trialled on your laptop.
• You are in control of navigating Copilot so that it works for you. Copilot follows your prompt
and works with your data to give you a first draft. You can then edit the first draft or add more
content. You always need to check each draft for accuracy, as Copilot continues to learn what you need.
• Copilot is a tool used to improve what we do by changing the amount of time we spend on
administrative tasks. Copilot enhances our job roles and is not designed to replace them.
• We want to hear your feedback about using Copilot. If you have any questions or concerns
when using the application, please contact the ICT Service Desk – Service project. This will
help us to track, compile and analyse all feedback about using the new technology.
• Please email s47E(d) - certain operations of agencies if you have any comments, concerns or insights about the
trial you would like to share. We will answer your questions and regularly share information
with you.
Audience 3: SES staff
• SES staff are encouraged to take up the opportunity to actively explore what Copilot can do.
• SES staff will be change leaders by improving their digital skills and learning more about the
benefits of AI in the workplace.
• You will have more time to focus on leadership activities by simplifying some of your
administration tasks.
Audience 4: EDN
• EDN members are encouraged to take up the opportunity to actively explore what Copilot can
do.
• You will be making an important contribution by testing how well Copilot meets accessibility
requirements, particularly for people with disability.
• Participating EDN members will provide valuable insights about their experience when using
Copilot.
Audience 5: Technical messages
• Copilot uses a large language model (LLM), a type of AI that recognises and generates text and performs related tasks. It can analyse vast amounts of data and learn the patterns and connections between words and phrases.
• Copilot’s LLM connects to Microsoft Graph, which accesses your data and files. Microsoft Graph brings together your content and context: your emails, files, meetings, chats and calendars (see the illustrative sketch after this list).
• Copilot is embedded within Microsoft 365 applications. It understands your prompts and can
seek information from your data to provide solutions across different Microsoft 365
applications.
• Copilot cannot access the internet to create new documents in the way ChatGPT does. This
is a built-in security feature that ensures it always uses a human prompt and works with your
own data and information.
• Copilot is designed to ensure that your data privacy is protected and it remains safe and
secure. Copilot will respect and follow the privacy, security and compliance policies of the
NDIA.
• Copilot follows Microsoft’s principles of responsible use of AI. These principles include
fairness, reliability, safety, privacy, security, inclusiveness, transparency and accountability.
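The Microsoft Graph access described above is an open, documented interface. Purely as an illustration (not part of the trial configuration or any NDIA system), the sketch below shows a minimal Python call to Microsoft Graph that lists a signed-in user's recent email subjects. The helper name and token placeholder are hypothetical assumptions; Graph returns only content the authenticated user is already authorised to access, which is the permission boundary referred to in the messages above.

```python
# Illustrative sketch only: a minimal Microsoft Graph request of the kind an
# assistant grounded in "your content and context" relies on. The access token
# placeholder and helper name are assumptions, not NDIA trial configuration.
import requests

GRAPH_BASE = "https://graph.microsoft.com/v1.0"


def list_recent_messages(access_token: str, top: int = 5) -> list[dict]:
    """Return the signed-in user's most recent email subjects.

    Microsoft Graph only returns items the authenticated user is already
    authorised to see, so results are bounded by that user's permissions.
    """
    response = requests.get(
        f"{GRAPH_BASE}/me/messages",
        params={"$top": top, "$select": "subject,receivedDateTime"},
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("value", [])


if __name__ == "__main__":
    token = "<access-token-obtained-via-your-organisation-sign-in>"  # placeholder
    for message in list_recent_messages(token):
        print(message["receivedDateTime"], "-", message["subject"])
```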
Evaluation
The project team will establish clear metrics of success at the start of the project. Metrics could include
user satisfaction ratings, estimated reduction in time spent on administrative tasks and qualitative
feedback from users about the effectiveness of the application.
Lessons learned from the trial will be documented by the project team and shared across the NDIA to
foster a culture of learning and innovation. This information will inform future AI advancements.
The effectiveness of the communications and engagement objectives will be measured by:
• The number of trial staff who have downloaded knowledge articles.
• Level of intranet page visits and interaction with direct links.
• Survey data reports.
• Success story examples.
• Engagement activity with the Teams workspace page.
• Number of downloads of Huddle notices and anecdotal feedback about discussions in Huddles.
• Number of unique visits to milestone stories.
• Assessment of feedback from presentations and meetings.
Communications and Engagement schedule
Go live coordinator: Vicki s47F - personal ,
privac Senior Communications and Engagement Officer, Strategic Communications.
Business readiness schedule
Date: Feb 2024
Target audience: All
Activity: Preparing the business for Copilot.
Purpose: Set up systems for effective use of Copilot for trial staff.
Outcome: All administrative steps completed, ensuring a positive experience for all trial staff.
Responsibility: OCIO
Comments: May need to advise stakeholders about these steps: information ready for search; pre-requisites in place; licenses assigned.

Date: Feb/Mar 2024
Target audience: All
Activity: Training and support – workshops; Microsoft Teams channel/community of practice.
Purpose: To train trial staff to use and adapt Copilot in their daily work practices.
Outcome: Clear, consistent understanding of how to use Copilot in the workplace. Identify and adjust any accessibility issues.
Responsibility: OCIO; DTA and Microsoft setting up training sessions.
Comments: Use responsive feedback loop for insights, accessibility, issues/opportunities – maximise engagement and learning.
Internal communications and engagement schedule
Date: Feb 2024
Target audience: All
Activity: Intranet page.
Purpose: Key source of truth with all resources, training, FAQs, news and stories.
Outcome: All stakeholders are aware and well informed about Copilot.
Responsibility: OCIO to draft. SPC to review.
Comments: Placeholder intranet page – plan for it to be moved to ICT Project Hub section.

Date: Mar – June 2024
Target audience: All staff
Activity: Celebrating NDIA milestone stories – intranet page.
Purpose: Raise awareness and understanding about benefits of Copilot and AI.
Outcome: Increased awareness about Copilot and contribution by NDIA staff.
Responsibility: SPC to draft. OCIO to upload.
Comments: Opportunity to highlight NDIA contributions regarding accessibility in Copilot.

Date: Early March / early May / end June 2024
Target audience: All staff
Activity: CEO newsletter – update at start, midpoint and end of trial.
Purpose: Visibility of CEO support of project and raises awareness of trial. Helps to counteract resistance to change.
Outcome: NDIA staff aware of strategic overview of Microsoft partnership and Copilot trial in a public service environment and contribution by NDIA trial staff.
Responsibility: SPC
Comments: Active senior management support – key ADKAR approach.

Date: Early March / early May / end June 2024
Target audience: All staff
Activity: Intranet notices – update at start, midpoint and end of trial.
Purpose: Build awareness and increase understanding about Copilot, AI and the objectives of the trial.
Outcome: NDIA staff have raised awareness and greater understanding about Copilot, AI and the whole of government trial.
Responsibility: SPC
Comments: Subjects include: about the trial; how AI and Copilot work; highlights and results.

Date: Early March / early May / end June 2024
Target audience: All staff
Activity: Huddle notices – update at start, midpoint and end of trial.
Purpose: To build awareness and increase understanding about Copilot, AI and the trial objectives.
Outcome: NDIA staff have raised awareness and greater understanding about Copilot, AI and the whole of government trial.
Responsibility: SPC
Comments: Subjects include: about the trial; how AI and Copilot work; highlights and results.
Post implementation review
To be completed for both Communications and Engagement:
• Things that worked well – What worked well and should continue next time?
• Opportunities – What didn’t work well and how can we do things differently?
5. Copilot Trial
5.1. The Copilot Trial was conducted from February to June 2024, with access to 300 licences (at a cost of s47G - business information). 194 licences were used in the trial. The trial set out to better understand the functionality and applicability of Copilot as an AI tool, its compliance with cyber security requirements, and overall usability.
5.2.
The trial initially used feedback from experts in the ICT, Cyber Security, Legal, and
Accessibility Technology teams to identify risks and benefits.
5.3.
On completion of the trial the Copilot Project team will submit a report outlining the
outcomes of the trial, key insights, and recommendations.
5.4.
Preliminary findings of the Copilot trial indicate:
5.4.1. Support for SES in Microsoft applications: Users have seen significant timesaving
benefits using Copilot to summarise information from the Microsoft suite of
applications. This means users can search for key information and Copilot can
provide a summary combining information from all applications.
5.4.2. Extensive Use in Microsoft Teams: Copilot is being utilised for minute taking,
recapping meetings, and providing actions from meetings in MS Teams.
5.4.3. Enhanced Accessibility across M365 Applications: Copilot has significantly
improved accessibility within M365 applications for employees with disabilities,
enabling them to interact with technology more efficiently.
5.5. A survey of trial participants, conducted in May 2024, found that Copilot exceeded expectations in Word, Outlook, and PowerPoint. Copilot exceeded user expectations for 67.4% of trial members, and 84% of users reported being highly confident with the use of AI due to their participation in the trial.
6.
Extension of the Copilot Trial
6.1.
The DTA Copilot Trial found value in the AI tool, noting there continues to be broad
discussion regarding aspects such as value for money, integration into operating
environments and risk management.
6.2. While the Agency seeks approval to continue the Copilot trial, further work will be undertaken to understand the risk exposure that AI poses to the Agency and to evaluate, on completion of the Copilot trial, the key mitigation strategies that are in place.
6.3.
Continuation of the trial will allow further opportunities for the Employee Disability
Network (EDN) to explore the accessibility benefits of using Copilot in their business-as-
usual activities. This will provide further data on the use of Copilot as an accessibility
assistant.
6.4. An extension of the trial will be separate from the current DTA trial, which finishes in June 2024. Microsoft has quoted s47G - business information per Copilot licence to allow a continuation of the trial, with our current pool of 300 licences, equating to a cost of s47G - business information (GST inc) for 2024/25.
6.5.
The NDIA Trial will utilise all 300 licences by including most NDIA SES, and their
Executive Assistants. This will expand the trial participants into more business areas and
work types, further testing the capability, value, and risks of Copilot.
7.
Risks
7.1.
An interim policy was implemented to address the risk of using AI and machine learning.
The policy is supported with mandatory learning modules for users that address the
ethical, legal and security risks.
7.2.
The AI Working Group is coordinating with the Agency Risk, Audit and Resilience
Division to undertake a documented risk assessment, consolidating existing risk
mitigation strategies and to identify any emerging risks.
7.3.
The Agency’s Strategic Leadership Team has decided that any uses of AI and machine
learning must be considered by the Agency’s Ethics Committee.
7.4.
The NDIA conducted a Cyber Security risk assessment on the Microsoft suite of
applications that incorporate Copilot. The DTA has also undertaken an assessment of
Copilot, which the NDIA has reviewed for risks that may impact the Agency. The Cyber
Security risks of using Copilot are considered LOW due to:
7.4.1. Copilot operates as a private tool, which is separated from other Microsoft
customers and the public. Copilot only accesses information and data in the
Microsoft suite of applications which the user has authorisation to access.
7.4.2. Sensitive NDIA systems and data (such as the CRM and Protected Enclave)
cannot be accessed by Copilot.
7.4.3. The NDIA’s use of Copilot is designed to work within the boundaries of the NDIA ICT environment and does not access the wider internet. Microsoft has made a commitment to the Australian Government that data from the use of Copilot will never be used to train its large language model.
8.
Participant Impact
8.1.
Stable, accessible, secure, and participant-orientated technology is a crucial enabler for the NDIA’s service delivery.
9.
Sustainability Impacts
9.1.
Increased administrative productivity will lead to time savings and quicker responses, improving participant and stakeholder satisfaction, increasing the Agency’s reputation, and supporting the long-term sustainability of the Scheme.
10.
Other Impacts
10.1.
The extended trial will be overseen by the current AI Working Group and will provide
further understanding of the impact, risks, benefits, and value of Copilot as a productivity
support for NDIA staff.
11.
Responsibility and next steps
11.1.
The Office of the Chief Information Officer will administer the extended trial with
continued oversight provided by the Agency AI Working Group.
11.2.
The Branch Manager of Enterprise Architecture and Governance will allocate Copilot
licences across the Senior Leadership Group.
Internal use only –removed for meeting
Prepared by: [Action Officer]
Division: [Action officer’s division]
Phone: [Action officer’s phone number]

Approved by: [GM Name]
[GM title]
Phone: [GM phone number]

Cleared by ELT: [ELT date]
EXECUTIVE BRIEF
3.2. While the Agency seeks approval to continue the Copilot trial from July 2024 to June 2025, further work will be undertaken to understand the risk exposure that AI poses to the Agency and to evaluate, on completion of the trial, the key mitigation strategies that are in place.
3.3. The majority of DTA trial participants across government have elected, where possible, to continue using Copilot. Departments such as Home Affairs have a Protected network and are unable to continue the trial in this environment. The Agency does not have this restriction and so is not limited by its operating environment and policy settings.
3.4. Continuation of the trial will allow further opportunities for the EDN to explore the
accessibility benefits of using Copilot in their business-as-usual activities. This will
provide further data on the use of Copilot as an accessibility assistant and provide
opportunities for the Agency to work with Microsoft as an active partner in improving
accessibility features.
3.5. An extension of the trial will be separate from the current DTA trial, which finished in June 2024. Microsoft has quoted s47G - business information per Copilot licence to allow a continuation of the trial, with our current pool of 300 licences, equating to a cost of s47G - business information (GST Inc.) for 2024/25.
3.6. The extended trial at the NDIA will utilise all 300 licences by including most NDIA
SES, and their Executive Assistants. This will expand the trial participants into more
business areas, further testing the capability, value, and risks of Copilot.
4. Risk Mitigation
Risks
4.1. There is a recognised potential for Copilot to generate inaccurate, misleading, or
inappropriate content that may affect the quality and integrity of NDIA's documents
and communications.
4.2. The use of Copilot-generated content without proper verification, attribution, or consent carries legal and ethical implications, especially when dealing with personal or sensitive information of participants, staff, or stakeholders. This is particularly problematic where use of the tool is unregulated and users are not trained and supported in its use.
4.3. The continued development of a robust governance framework and accountability
mechanism for the continued use of Copilot, including clear roles and
responsibilities, escalation and approval processes, and feedback and evaluation
mechanisms is essential.
4.4. Such a governance framework is further enhanced by ongoing communication and
engagement with Copilot users to raise awareness of functions and risks inherent in
the tool, whilst encouraging best practice and soliciting feedback on the benefits and
challenges of using Copilot.
Copilot's access and separation
4.5. Prior to the implementation of the trial, the Agency conducted a Cyber Security risk assessment (Authority to Operate) on the Microsoft suite of applications that incorporate Copilot. The DTA has also undertaken an assessment of Copilot, which was reviewed to identify risks that may impact the Agency. The Cyber Security risks of using Copilot are considered LOW due to the following:
4.5.1. Copilot operates as a private tool, which is separated from other Microsoft
customers and the public. Copilot only accesses information and data in the
Microsoft suite of applications which the user has authorisation to access.
4.5.2. Sensitive NDIA systems and data (such as PACE, SAP CRM and Protected
Enclave) cannot be accessed by Copilot.
4.5.3. The NDIA’s use of Copilot is designed to work within the boundaries of the
NDIA ICT environment and does not access the wider internet.
4.5.4. Microsoft has made a commitment to the Australian Government that the use
of Copilot will never train its large language model to gain insights and
intelligence from Government processes and decision making.
Artificial Intelligence and Machine Learning Policy
4.6. Risk will continue to exist where a virtual office assistant based on a large language model (LLM) of Artificial Intelligence is being used. An LLM is the deep learning model in AI tools that summarises, translates and generates human-like responses, ideas and concepts.
4.7. The Use of Artificial Intelligence, Generative AI, and Machine Learning – Interim Policy was implemented in April 2024 to address the inherent risks of using AI and machine learning within the Agency (Interim Policy).
4.8. The Interim Policy was developed in consultation with the DTA to support users of the technology and is accompanied by mandatory learning modules for users that address the ethical, legal and security risks (Attachment B).
4.9. The Interim Policy has been compared with the National Framework for the
Assurance of Artificial Intelligence in Government published by the DTA on June 21,
2024, to determine if adjustments are necessary.
4.10. The review concluded that no modifications to the Interim Policy are needed at this time regarding Copilot usage. Another review is expected by December 2024 to include insights from the DTA’s Copilot survey conducted across all participants in the trial and the Privacy Assurance Advice provided to the Agency on August 8, 2024 (Attachment C).
Privacy and data protection measures
4.11. Some of the anticipated updates are:
4.11.1. Establishing clear and transparent criteria for the selection of Copilot users
and ensuring that they are adequately trained and informed of their privacy
obligations.
4.11.2. Implementing a Privacy Impact Assessment (PIA) process for any new use
cases of Copilot that involve personal information or sensitive data.
4.12.
Developing and enforcing clear guidelines and protocols for the use of
Copilot, including the types of documents and data that can or cannot be accessed
or generated by Copilot, the retention and deletion of Copilot outputs, and the
reporting and resolution of any privacy incidents or breaches.
4.13.
Conducting regular audits and reviews of Copilot's performance, accuracy,
and compliance with privacy laws and NDIA policies.
Risk management and governance
4.14. The Agency’s AI Working Group collaborated with the Agency Risk, Audit and Resilience Division to undertake a risk assessment (Attachment D), consolidating existing risk mitigation strategies and identifying any emerging risks (Attachment E).
4.15.
The project team will engage with the Agency Ethics Committee anticipated to
be established in October 2024 through the Integrity Agencies and Ethics team in
the Legal Services Branch. The Agency Ethics Committee will maintain a standing
item on the use of AI and will establish a review mechanism for all new AI and
machine learning technology proposals. The project team will provide updates in
status reporting to that Committee.
4.16. The Agency will continue to consult with the DTA on the Copilot trial to support the adoption of government-wide practices and policies that put guardrails around the use of AI.
5. Recommendation
5.1. The COO:
5.1.1. notes the preliminary findings of the Copilot trial and the evolving AI
governance and policy.
5.1.2. approves the extension of the DTA Copilot trial through to July 2025.
5.1.3. notes that the project team will continue to engage on the ethical and
responsible use of Copilot and inform our evolving governance and policy
position on AI under consultation with the DTA.
6. Authorisation
Approval in PDMS History
Signature: ……………………………………..
Date: 08 July 2024
Natasha Murphy
Branch Manager, Enterprise Architecture and Governance
Phone / Mobile: s47F - personal privacy

Prepared by: Carol s47F - personal privacy
Mobile: s47F - personal privacy
Signature: s47F - personal privacy
Date: 21 August 2024
Samuel Porter
Deputy CEO Enabling Services / Chief Operating Officer
COO Comment
Cleared on the following basis:
- Ethics committee consideration occurs asap and use thereafter is subject to the
committee’s views
- CIO and I taking the agency's current policy on AI (at attachment B) through SLT and the
ethics committee (in that order)
- confirmation (in a separate brief) about the guardrails we've put around copilot use for
EDN to ensure that it is separated from service delivery activities (in particular planning).
7.
Attachments:
Attachment A: End of Trial Report (DTA Evaluation)
Attachment B: Use of AI Interim Policy
Attachment C: Privacy Assurance Advice – Copilot for Microsoft 365 Project
Attachment D: AI Working Group Terms of Reference
Attachment E: Final Copilot Project Risk Register
End of Trial Report
Copilot for Microsoft 365
January – June 2024

1. Executive summary
1.1 Background
1.2 Objectives of the trial
1.3 Purpose
2. Copilot at the NDIA
3. Implementation of Copilot for Microsoft 365
3.1 Approach
3.2 Planning and Configuration
3.3 Training and Support
3.4 Implementation and Monitoring
4. Results
4.1 Using Copilot
4.2 Accessibility
Neurodivergent User
JAWS User
Dyslexic User
Hearing Impaired User
4.3 Ongoing Training, User Monitoring, and Guidance
4.4 Network Integration
5. Summary of Benefits
6. Key Findings
6.1 User Productivity
6.2 User Satisfaction
6.3 Accessibility Usage
6.4 Stakeholder Engagement
7. Perceived Risks and Mitigation Strategies
8. Controls for Mitigating Risks During the Trial
8.1 Data Privacy Controls
8.2 User Training and Support
8.3 Ethical Guidelines
9. Summary of Recommendations
10. Conclusion
1. Executive summary
1.1 Background
In the wake of rapid advancements in Artificial Intelligence (AI) technologies, the Agency
recognises the potential of AI to transform public service delivery, enhance operational
efficiency, and foster innovation.
The Australian Public Service (APS) commenced a six-month trial of Copilot for Microsoft
365 in January 2024 to test new ways to enhance productivity and develop skills,
capabilities and preparedness for the use of generative AI tools by APS staff. The trial was
coordinated by the Digital Transformation Agency (DTA) and involved over 50 participating
Commonwealth agencies.
The Agency is participating in the trial alongside numerous Australian Government
Agencies and Departments and obtained 300 licenses. The trial started as a Proof of
Concept (POC) for four months, and then continued as a Trial for the remaining trial period.
A smaller group of users was chosen for the POC, and more users were included for the Trial. Staff who participated were urged to use Copilot to
help with their daily administrative tasks.
Over the six-month trial period, data was collected and analysed to assess security posture,
user productivity, satisfaction, and accessibility improvements.
1.2 Objectives of the trial
• Evaluate effectiveness: Assess how staff use of Copilot impacts productivity and user satisfaction.
• Determine value for money: Determine whether future use by all NDIA staff represents value for money.
• Ongoing AI risk mitigation: Assess if having an internal AI product like Copilot reduces the likelihood of staff accessing external/unapproved AI services, and the corresponding threat to Agency data that arises from unapproved use.
• Policy development: Use the trial to develop a policy position and unified approach to the use of Artificial Intelligence within ICT services.
• Accessibility impact: Measure the accessibility improvements for users requiring support.
• Gather data for informed decisions: Collect quantitative and qualitative data to inform future AI tool implementations.
1.3 Purpose
The purpose of this report is to:
• Describe the implementation of Copilot for Microsoft 365 at the Agency
• Illustrate use cases of Copilot at the Agency
• Evaluate the effectiveness of Copilot for Microsoft 365 on productivity and
accessibility within the Agency.
2. Copilot at the NDIA
A description of how Copilot can assist NDIA staff with their daily tasks is tabled below.
Ensuring policy compliance: Assistance with preparing documents and emails and aligning with NDIA policies and guidelines by:
• interpreting NDIA policies and generating a summary of the purpose
• identifying amendments to a policy document and the responsible person

Meeting scheduling: Quickly schedule and reschedule meetings:
• without manually comparing participants’ calendars for availability, and
• creating meeting invites, adding participants and setting the meeting date and time based on availability

Checking document accessibility: Improve the accessibility of documents by providing suggestions for:
• readability and simplicity
• formatting
• complying with the NDIA’s accessibility standards
• colour contrast for colour vision deficiencies
• converting into structured PDF or plain text for accessibility
• accessibility best practices

Email summary and creation: Assist users with:
• summarising key points and decisions from large email chains
• drafting email responses
• managing email inboxes (eg flagging important emails)

Assisting with data analysis: Assist users with:
• creating complex formulas in Excel spreadsheets
• generating charts and graphs to visualise data
• interpreting numbers and trends
• sorting and filtering data
• reminding users to keep data up to date

Providing in-context training: Assist users with learning Microsoft 365:
• navigating tools and processes by providing step-by-step instructions
• helping learn more about advanced features

Document summarisation and reading aloud: Assist visually impaired staff with reviewing extensive documents and reports by:
• providing summaries of Word documents with the key points, decisions and action items; and
• text-to-speech functionality.

Presentation creation assistance: Assist staff with creating engaging and informative presentations by:
• suggesting a slide structure from an outline of the presentation
• using the appropriate NDIA templates
• retrieving and including relevant statistics, images and case studies from NDIA data
• assisting with presentation rehearsals
• assisting with managing live Q&A sessions by quickly providing information which answers the audience’s questions

Accessible data analysis and reporting: Assist visually impaired users to interpret complex data sets, charts and graphs by:
• interpreting data and providing simple and understandable insights in a format which is understandable without visual aids
• analysing trends, summarising data points and predicting future patterns based on historical data
• creating accessible reports through the translation of the data analysis into descriptive narratives in a format which is inclusive for all
• allowing reports to be easily shared through Microsoft 365
• ensuring documents created are compliant with accessibility standards and easy to navigate

Accessible project management: Assist hearing impaired users with managing projects by:
• enabling real-time transcription during Teams meetings so the user receives live captions and can confirm any discussion points which may have been missed
• generating notes, listing tasks and assisting the user in other ways during the meeting
• organising project tasks and timelines; the user can provide Copilot verbal instructions to update task statuses, add new assignments, and set reminders for deadlines.
3. Implementation of Copilot for Microsoft 365
3.1 Approach
The trial faced difficulties due to:
• scepticism about the usefulness of generative AI tools such as Copilot
• concerns for the security and protection of Agency data Copilot would access
• the perception that the trial was being conducted to evaluate its effect on
productivity as part of a plan to reduce staff numbers
• findings of the recent “Robodebt” Royal Commission
To mitigate, the Copilot trial was reduced to a Proof of Concept (POC) for the initial four
months of the trial. This provided the additional time required to carefully examine and
mitigate risks before conducting the trial. Add some points in here about what was scaled
down for the POC. A communication ban was enforced that impacted the ability to
communicate broadly.
The POC was extended to a full trial in May 2024. Add some points in here about what was
scaled up for the Trial. What happened to the communication ban?
3.2 Planning and Configuration
• Planning phase: Detailed planning sessions were held to define the scope, objectives, and success criteria for the trial.
• Infrastructure configuration: Necessary IT infrastructure and security measures were put in place to support the deployment of Copilot.
• Agency AI Working Group formation: Approximately 10 executives across different NDIA Business Areas, including staff requiring accessibility support.
• POC Group selection: Select up to 300 NDIA staff across different Groups, Divisions and Sections, including users requiring accessibility support.
• Trial Group selection: Select up to 300 NDIA staff across different Groups, Divisions and Sections, including users requiring accessibility support.
3.3 Training and Support
• Information sessions: Comprehensive training sessions were conducted to ensure users were familiar with Copilot features and functionalities.
• Learning modules: A mandatory learning module ensured that all users were taken through the ethical and professional obligations of using AI.
• Support channels: Dedicated support channels were established, including a helpdesk, FAQs, and user guides.
3.4 Implementation and Monitoring
• Phased rollout: The implementation was carried out in phases to manage any potential issues effectively.
• Monitoring and feedback: Continuous monitoring of usage and performance, along with regular feedback collection from users, was conducted to identify areas for improvement.
4. Results
4.1 Using Copilot
The following benefits were demonstrated by users using Copilot, and could also be
realised by users across all Australian Government Departments:
Enhanced productivity: Copilot assisted users to achieve their tasks faster, for example by capturing summaries of meetings, assisting with summarising and prioritising email trails, and assisting with scheduling meetings. This outcome indicates significant improvements to staff productivity.

Improved accessibility: Copilot enhanced accessibility support by providing guidance to enhance the accessibility of document content; interpreting complex text, graphs and analytical data; providing assistance during and after meetings through transcripts and summaries; and other capabilities that increased a staff member’s independence to complete tasks. This improved engagement and productivity of staff relying on accessibility tools.

Data-driven decision making: Copilot has the ability to search through NDIA data for relevant statistics, images, and case studies. This feature ensures that the content is accurate and visually appealing, aiding in informed decision-making.

Presentation preparation and delivery: Using Copilot’s assistance to prepare and rehearse presentations was a key finding. Copilot can generate slides with suggested titles and content, provide a script based on the slide content, and offer feedback on delivery, which analyses pacing, clarity, and tone.

Streamlining administrative activities: All use cases indicate that Copilot will streamline routine administrative activities, such as writing briefs and initial drafts of correspondence. This has the potential to provide significant time savings to administrative staff within the Agency.
4.2 Accessibility
The participation of Employee Disability Network (EDN) members in the NDIA trial provided
valuable insights into Copilot's impact on their work. The tool received overwhelmingly
positive feedback, with notable benefits.
Neurodivergent Users: Copilot significantly improved task management, communication, and focus, offering tailored suggestions and automating routine tasks.

JAWS Users: Copilot’s integration with JAWS screen readers enhanced accessibility, making information more accessible and interactions more efficient. The ability to use Copilot to recap meetings allowed visually impaired users the freedom to participate fully in meetings rather than concentrating to ensure that the transcript matched the tone of the discussion.

Hearing Impaired Users: Copilot’s application within Microsoft Teams was the biggest gamechanger for this group of users. The recap function, the live transcript in meetings, and the ability to ask Copilot for clarification when the conversation moved too quickly were revolutionary for hearing impaired users.

Users with Dyslexia: The ability to draft content from simple prompts or rewrite written content allowed users with Dyslexia the comfort of knowing that their work was correct, raising their confidence and shortening editing and review time across the board.
4.3 Ongoing Training, User Monitoring, and Guidance
Feedback from users indicated that Copilot made it very easy to slip into usage patterns
outside of the trial and interim policy parameters. Ongoing education around the ethical use
of AI would be required to ensure that users complied with data protection expectations.
Training and Support: All use cases indicated that the implementation of any AI tool required ongoing and comprehensive training sessions. A new tool like Copilot also required dedicated support channels, including a helpdesk, FAQs, and user guides.

Ethical Use and Data Privacy: The establishment of clear guidelines, transparency, and accountability measures to monitor the ethical use of AI tools, along with strict data privacy controls, are critical use cases that demonstrate responsible AI implementation.
4.4 Network Integration
Microsoft Managed and Device (MMD) program: The Agency’s MMD program, coupled with its greenfield relationship with Microsoft, facilitated a smooth and efficient activation of Copilot. This advanced setup allowed for quick deployment and reduced downtime, ensuring minimal disruption to ongoing operations.

Identity Management: Identity management functions were configured effectively, allowing users to access data and information strictly within Identity and Access Management (IDAM) rules. This setup not only enhanced security but also ensured compliance with internal data governance policies, providing a robust framework for secure and efficient use of Copilot across the Agency.
5. Summary of Benefits
The following benefits were realised during the POC and trial of Copilot:
Enhanced Collaboration: Copilot facilitates better collaboration by summarising conversations, providing real-time suggestions, and enabling seamless sharing of information, thus enhancing team productivity.

Process Optimisation: By automating routine tasks and providing intelligent suggestions, Copilot helps streamline workflows, reducing manual effort and minimising errors.

Time Efficiencies: Copilot saves time by automating repetitive tasks, drafting emails, and summarising lengthy documents and conversation threads, allowing users to focus on more complex tasks.

Skill Development: Copilot assists users in learning and improving their skills by providing context-aware tips and recommendations, thus enhancing their overall proficiency in using Microsoft 365 tools.

Accessibility Improvements: Copilot’s accessibility features significantly improved the experience for all users with disabilities within the trial.

Cyber Security Mitigations: By providing an in-house AI tool like Copilot, the Agency mitigated the risk of users seeking potentially insecure, web-based AI tools, ensuring Agency data remains within the secure government ecosystem.

Improved Decision Making: Copilot supports better decision-making by providing data-driven insights, analysing trends, and generating reports, enabling users to make informed and timely decisions.

Enhanced User Satisfaction: The intuitive interface and helpful suggestions from Copilot lead to higher user satisfaction and engagement, contributing to a more positive work environment.

Reduced Workload: By automating repetitive and time-consuming tasks, Copilot reduces the overall workload on employees, helping them manage their time more effectively and reduce burnout.

Consistency in Work Output: Copilot ensures consistency in documents and communications by providing standard templates and suggesting improvements, leading to a more uniform and professional output.
6. Key Findings
7. Perceived Risks and Mitigation Strategies
Risk Category: Data Privacy External
Risk Description: Potential risks of data breaches or misuse of sensitive information by members.
Likelihood: Medium. Impact: High.
Mitigation Strategy: Strict adherence to data privacy regulations, regular security audits, and user awareness training.

Risk Category: Data Privacy Internal
Risk Description: Risks of accidental data exposure or unauthorised access by internal users.
Likelihood: Medium. Impact: High.
Mitigation Strategy: Implementation of robust access controls, continuous monitoring of data access and usage, regular internal audits, and comprehensive training for employees on data handling practices.

Risk Category: Integration Issues
Risk Description: Technical challenges in integrating Copilot with existing systems.
Likelihood: Low. Impact: Low.
Mitigation Strategy: Thorough testing, phased rollout, and collaboration with Microsoft for troubleshooting.

Risk Category: User Resistance
Risk Description: Resistance to adopting new AI tools.
Likelihood: High. Impact: Medium.
Mitigation Strategy: Comprehensive training, ongoing support, and a communication plan that highlights benefits and lowers expectations.

Risk Category: Dependence on AI
Risk Description: Over-reliance on AI tools might reduce critical thinking skills.
Likelihood: Low. Impact: Low.
Mitigation Strategy: Encouraging balanced use, where Copilot is seen as a support to repetitive tasks and not a replacement of human judgment.

Risk Category: Ethical Concerns
Risk Description: Ethical implications of using AI in decision-making processes.
Likelihood: High. Impact: High.
Mitigation Strategy: Preparation of the Interim Use of AI Policy, establishing clear supporting guidelines, and ensuring transparency in how the Agency assesses AI.
8. Controls for Mitigating Risks During the Trial
8.1 Data Privacy Controls
Encryption: All sensitive data was encrypted both in transit and at rest.
Access Controls: Strict access controls were implemented to ensure only authorised
personnel could access sensitive information.
Regular Audits: Periodic security audits were conducted to identify and mitigate potential
vulnerabilities.
8.2 User Training and Support
Comprehensive Training: Detailed training programs were conducted to ensure users
were well-versed in using Copilot.
Continuous Support: Ongoing support was provided through helpdesks, user guides, and
FAQs to address any user concerns promptly.
8.3 Ethical Guidelines
Transparency: Clear guidelines were established to ensure transparency in how AI tools
were used within the organisation.
Accountability: Accountability measures were put in place to monitor the ethical use of AI
tools.
9. Summary of Recommendations
Extend the Trial: Extend the trial for an additional 12 months to gather more data and
further evaluate the long-term impact.
Expand user base: Include more users, particularly those requiring accessibility support, to
get a broader understanding of the tool’s impact.
Enhance training: Continue to provide comprehensive training and support to maximise
user engagement and effectiveness.
Strengthen partnerships: Foster stronger relationships with Microsoft and other AI tool
providers to ensure continuous improvements and support.
Establish clear guidelines: Develop and implement clear guidelines for the ethical use of
AI tools within the organisation.
10. Conclusion
The Copilot for Microsoft 365 trial has shown significant potential in enhancing productivity,
user satisfaction, and accessibility within the NDIA.
The six-month trial of Copilot for Microsoft 365 demonstrated significant benefits for the
Agency, particularly for employees with accessibility needs. The tool's positive impact on
neurodivergent users and those relying on JAWS underscores its potential as an inclusive
productivity aid.
Additionally, robust data security measures ensured the protection of Agency and
participant data throughout the trial.
Extending the trial period and expanding the user base will provide more comprehensive
data, allowing the agency to make informed decisions about the long-term adoption of AI
tools.
The controls and mitigation strategies implemented during the trial have effectively
managed risks, and the lessons learned will guide future initiatives to further leverage AI
tools for organisational benefits.
Based on these findings, it is recommended that the Agency consider wider adoption of
Copilot to further enhance productivity and accessibility for all employees.
Policy Summary
The National Disability Insurance Agency (NDIA or the ‘Agency’) requires all staff
(including contractors and partner staff) to implement and adhere to the Agency Policies
and Procedures. This document establishes the policy for Use of Artificial Intelligence,
Generative AI, and Machine Learning in the Agency.
Document Control
Document Name: Use of Artificial Intelligence, Generative AI, and Machine Learning – Interim Policy
HPE Document No or SharePoint Link: Use of AI Interim Policy Draft 0.1.docx
Date: April 2024
Status: Final
Version: 1.0
Owner: Branch Manager, Enterprise Architecture and Governance Branch

Approval Status Log
Version: 1.0
Reviewed by: BM Enterprise Architecture and Governance
Publication date: April 2024
Approved by: Ajay Satyan, Chief Information Officer
Approval date: April 2024
Table of Contents
Use of Artificial Intelligence, Generative AI, and Machine Learning – Interim Policy
Policy Summary
1. Introduction
2. Purpose
3. Applicability
3.1. Policy exclusions
3.2. Policy exemptions
4. Policy Principles
5. Roles and responsibilities
5.1. Chief Information Officer (CIO)
5.2. AI Working Group
5.3. Project Team
5.4. Copilot trial members
6. Non-compliance
7. Authority and review
Appendix A – Glossary of terms and abbreviations
Appendix B – References, legislation, and standards
1. Introduction
The National Disability Insurance Agency (Agency) is participating in a trial of Copilot for
Microsoft 365 (the trial) along with other Federal Government Agencies. The trial is
sponsored by the Prime Minister and is supported by the Digital Transformation Agency
(DTA). The use of Copilot as part of a trial aligns with interim Government guidance on the
use of generative AI tools.
Copilot by Microsoft brings the power of generative AI to professional settings while offering
protection from unauthorised data sharing and unregulated internet access.
The aim of the trial is to help to shape the future use of Artificial Intelligence (AI) by
Government staff. Copilot is built on top of advanced AI tools to provide intelligent assistance
to users to potentially enhance productivity and collaboration.
Copilot is not a mechanism for making independent decisions.
During the trial, emphasis will be placed on protecting Agency data and personal privacy.
Finally, the trial will explore the benefits and any potential impacts of integrating Copilot
within the Agency and the wider APS.
2. Purpose
This policy outlines expectations in the use of AI functionality provided by Copilot within the
trial.
The policy establishes a framework for the ethical, legal, secure, and effective use of
Artificial Intelligence (AI), Generative AI, and Machine Learning (ML) within the Agency. The
policy ensures the safety and security of our staff, critical data, and the National Disability
Insurance Scheme (the Scheme) information.
The policy aims to harness the benefits of generative AI services to enhance business
outcomes and administrative efficiencies for Agency staff.
This policy is written within the context of the Agency ICT Policy Framework.
3. Applicability
This policy applies to all Agency staff, including labour hire workers, participating in the DTA
trial of Copilot, and to the use of AI products and related activities within the Agency during
the Microsoft Copilot trial period.
3.1. Policy exclusions
There are NO exclusions to this policy.
3.2. Policy exemptions
There are NO exemptions to this policy unless approved by the CIO and recommended by
the Agency AI Working Group.
4. Policy Principles
AI tools offer transformative possibilities for public sector services, particularly in enhancing
the efficiency and effectiveness of administrative and operational functions. To ensure
confidence in the Agency’s processes and alignment with the operational and strategic
goals, users of Copilot must ensure that the following policy principles are met:
• Accountability: Users must be able to explain, justify and take ownership of any advice or
decisions made when using AI. Users are responsible for the outcomes of AI generated
artefacts. Exercising due diligence when using AI tools is required to ensure the highest
standards of quality and ethical responsibilities are met.
• Transparency: Decisions made by Agency staff utilising AI within Microsoft productivity
software must be understandable and explainable and provide a clear auditable trail of how
the decision was made. AI must not be used for automated independent decision making
which may impact participants or the strategic direction of the Agency.
• Accessibility and Inclusion: Any AI tool used by the Agency must adhere to and actively
support accessibility standards such as the Web Content Accessibility Guidelines (WCAG).
• Privacy: All Agency staff have a responsibility to protect classified, personal, or otherwise
sensitive information held by the Agency. Acceptable use of AI tools complements existing
data and information use expectations and must protect the privacy of Participants,
Partners, and other stakeholders. The use of AI must adhere to the requirements of the
Privacy Act 1988 (Cth). The recommendations in the privacy assurance advice in respect of
Copilot must be implemented.
• Use of Scheme data: Any recording, use or disclosure of protected Agency information by
the AI tools must comply with the secrecy provisions in the National Disability Insurance
Scheme Act 2013 (Cth) (NDIS Act). Unless otherwise expressly [authorised by the CIO and]
authorised under the NDIS Act, AI tools must not access participant records in PACE, store
protected Agency information outside of Australia, disclose protected Agency information to
any third parties, or use any protected Agency information for any purposes other than
provision of services to the NDIA.
• Compliance: Use of AI tools must comply with existing Agency and ICT specific policies
including but not limited to the Acceptable Use Policy, the Digital Collaboration Policy, and
the ICT Security Policy. The use of AI tools must also take into consideration and comply
with the APS Code of Conduct and other relevant legislation.
• Training: All users of AI tools must have familiarised themselves with Agency supplied
information and training on the responsibilities and ethical use of AI tools within the Agency.
• Governance: A dedicated AI governance function, established via the Agency AI Working
Group, will oversee this trial. The governance body will include representatives with
expertise in accessibility, technology, ethics, and legal compliance.
• Human Intervention and Review Processes: Procedural guidelines will ensure that human
oversight and input are integral to any continued use of AI tools after the trial.
This policy, supported by procedural documents and information, reflects our commitment to
ethical AI usage across the Australian Government.
5. Roles and responsibilities
5.1. Chief Information Officer (CIO)
The CIO is responsible for ensuring that, as far as reasonably possible, effective controls are
in place during the trial period so that Copilot is used in line with the requirements of this
policy. The CIO is responsible for resourcing the Copilot trial.
5.2. AI Working Group
The AI Working Group is the responsible governance body for the trial. It is responsible for
monitoring the use of AI and ensuring that users have access to training and resources that
provide clear guidance on the use of AI during the trial. The group will review the outcomes
of the Microsoft Copilot trial when it concludes on 30 June 2024. The group is also
responsible for informing the DTA and other agencies of any insights that will assist the
government in implementing and guiding the use of AI tools throughout the APS. The
feedback collated from the AI Working Group will be incorporated when developing the
permanent Use of AI Policy, should the Copilot trial be embedded further into the agency.
The AI Working Group is not responsible for individual quality control of artefacts produced
as part of the trial.
5.3. Project Team
The project team for the trial is responsible for monitoring the use of Copilot and ensuring
that, throughout the trial, users are informed of their responsibilities and the need to comply
with this policy. The project team is accountable for undertaking a review of the trial and
providing feedback to the DTA. The project team is responsible for working with the
Information Law and Privacy team in Legal Services Group to facilitate the privacy
assurance advice for the Copilot trial.
The project team will be responsible for reporting the outcomes of the trial to the Agency
Senior Leadership Team (SLT) and the Board in the form of a decision paper on the future
use of AI within the Agency in alignment with the approach of the Australian Government.
The project team will engage with DTA during the trial period, to socialise and review the
policy, to ensure consistency with DTA’s expectations of the Copilot trial.
5.4. Copilot trial members
The members of the Copilot trial are responsible for ensuring that they understand and
comply with the requirements of this policy and that they provide unbiased and factual
feedback in a timely manner regarding their experience using Copilot.
6. Non-compliance
Any intentional, repeated, or negligent breach of this policy or any other ICT policy by
Agency personnel may be considered a breach of the APS Code of Conduct, which could
give rise to a range of possible sanctions including termination.
Breaches will be managed in line with Agency Human Resource policies.
7. Authority and review
This policy was approved by the CIO in April 2024 and will be reviewed at the completion of
the trial, pending recommendations from the DTA.
Appendix A – Glossary of terms and
abbreviations
The Glossary contains definitions for key terms used in this document that carry a specific
meaning in the context of this policy.
Term or Abbreviation: Definition
Artificial Intelligence (AI): Systems performing tasks requiring human intelligence. For the
Agency, this could include AI-assisted communication tools for clients with speech or
hearing impairments.
Machine Learning (ML): A subset of AI where systems learn from data and improve results
over a period. In the Agency context, this could involve using ML to analyse data indicating
software use to inform decisions around the purchase of additional licences.
Generative AI: AI tools that generate new content based on data. For the Agency, this can
include summarising documents, providing meeting notes from a recorded transcript, or
creating a PowerPoint presentation from a reading.
Copilot for Microsoft 365 Trial: A government initiative using AI in public sector operations.
The Agency may leverage this for automated reporting and administration tasks, reducing
the workload on staff and enhancing service delivery.
Copilot for Microsoft 365: Copilot is an AI-powered productivity tool that coordinates large
language models (LLMs), available and appropriate Agency data, and the Microsoft 365
apps such as Word, Excel, PowerPoint, Outlook, and Teams. This integration provides
real-time quasi-intelligent assistance, enabling users to enhance their creativity, productivity,
and skills.
Appendix B – References, legislation, and
standards
Legal Frameworks and Acts
Privacy Act 1988 (Cth)
Disability Discrimination Act 1992
National Disability Insurance Scheme Act 2013
National Disability Insurance Scheme (Protection and Disclosure of Information) Rules 2013
Ethical Guidelines and Standards
Australia’s Artificial Intelligence Ethics Framework
European Union's Ethics Guidelines for Trustworthy AI
Australian Government AI Resources
Digital Transformation Agency's AI resources and guidelines
Australia's Tech Future policy document
Engaging with Artificial Intelligence
International AI Guidelines
OECD Principles on AI
IEEE Standards on AI and Autonomous Systems
Best Practice Guidelines for AI in Public Sector
World Economic Forum's Guidelines on AI Governance
United Nations Guidelines on AI and Public Service Delivery
s42 - legal professional privilege

OCIO Artificial Intelligence Working Group – Terms of Reference March 2024
• Stakeholder Engagement and Feedback Synthesis: Engaging with a broad spectrum of
stakeholders, including trial participants, the DTA, and the public, to garner diverse perspectives on
AI deployment. This engagement will inform the Group's recommendations on the future direction of
AI initiatives within the Agency.
Membership
The AI Working Group shall comprise Directors and Branch Managers from various business areas within
the Agency to ensure a diverse range of perspectives and expertise. The composition is as follows:
• Chair: Phil Bergersen, Branch Manager of the OCIO Enterprise Architecture and Governance
Branch, will serve as the Chair of the Working Group, responsible for leading discussions, setting
agendas, and ensuring the Group's objectives are met.
• Deputy Chair: Cathy s47F - personal privacy, Director of the OCIO Portfolio Management Directorate, will support
the Chair and act in their stead as required.
• Additional members will include Directors and Branch Managers from Legal, Procurement, Human
Resources, and Accessibility and Inclusion, each bringing unique insights into how AI can serve their
area's objectives and challenges.
Member Responsibilities
Members of the AI Working Group are entrusted with the following responsibilities:
• Strategic Oversight: Ensure that the AI initiatives, particularly the Copilot for Microsoft 365 trial and
the development of the Interim Use of AI Policy, align with the Agency's strategic goals and comply
with Australian Government standards for ethical AI use.
• Collaboration and Communication: Facilitate effective communication and collaboration across
departments and with external stakeholders to ensure a comprehensive understanding and
integration of AI technologies.
• Policy Development: Lead the creation, refinement, and approval process of the Interim Use of AI
Policy, ensuring it is comprehensive, practical, and reflective of stakeholder feedback.
• Evaluation and Reporting: Assess the outcomes of the Copilot trial and synthesise feedback from
all relevant sources to inform policy development and future AI strategy.
Operations
The AI Working Group will convene every fortnight via Microsoft Teams to ensure regular monitoring and
progress on its objectives. These meetings will:
• Discuss Progress: Review the status of the Copilot trial and policy development efforts.
• Resolve Issues: Identify and address any challenges or concerns arising from the trial or policy
formulation process.
• Strategic Decisions: Make decisions regarding the direction of AI initiatives and policy adjustments
as required.
Special meetings may be called by the Chair or upon request by any member if urgent issues or decisions
arise.
Reporting and Review
• Reporting Structure: The AI Working Group will report directly to the Chief Information Officer
(CIO), the Chief Operating Officer (COO) and the Agency's Senior Leadership Team, providing
regular updates on the progress, findings, and recommendations regarding the Copilot trial and AI
policy development.
• Review Process: The ToR will be subject to an annual review, or as needed, to ensure it remains
relevant and effective in guiding the Working Group's activities. Feedback from members and
stakeholders will be integral to this review process.
Approval and Amendment Process
• The initial ToR is to be approved by the Chief Information Officer following consultation with the AI
Working Group members.
• Amendments to the ToR can be proposed by any member of the Working Group but must be
ratified by a majority vote within the Group before submission for final approval by the Senior
Leadership Team.
DOCUMENT 4.5
The content of this document is OFFICIAL: SENSITIVE.
Risk & Issue Review Process
The following process steps need to be included in the Risk Register for consistency.
Project teams should follow the below steps to determine the applicable risk rating for each of the identified project risks and/or issues:
1. Open the Project Risk Register.
2. Refer to the NDIA Risk Rating Matrix (below).
3. Consider the Likelihood of the risk or issue being realised (percentage-based assessment), taking account of any existing controls in place to manage/mitigate the risk.
4. Consider the most material Consequence to the project if the risk is realised, using the Project Consequence matrix (for project delivery risks) or the Strategic & Operational Consequence matrix (for delivered risks).
5. Document both Likelihood (numeric) and Consequence (alphanumeric) rating outcomes in the Risk Register.
6. The applicable risk rating (Low/Medium/High/Critical) will be calculated based on the intersection of the Likelihood and Consequence ratings determined in steps 3 and 4 (e.g., 3D = High).
Assessments under points 3 and 4 above are to be based on the current / prevailing project environment.
Risk assessments should be conducted on a regular basis (weekly is preferred, with alignment to reporting timeframes being the minimum).
All high and critical rated risks should be referred to the SES project lead for confirmation and reported to the Program Management Office as appropriate.
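To illustrate the intersection lookup described in step 6, the following is a minimal, illustrative sketch only (not an Agency tool): the function name and data structure are assumptions, and the severity values mirror the NDIA Risk Rating Matrix set out below.

# Illustrative sketch of the step 6 lookup: Likelihood (1-5) x Consequence (A-E) -> risk rating.
# Severity values mirror the NDIA Risk Rating Matrix below; names are illustrative only.
RATING_MATRIX = {
    "E": ["Medium", "High", "High", "Critical", "Critical"],   # Extreme
    "D": ["Medium", "Medium", "High", "High", "Critical"],     # Major
    "C": ["Medium", "Medium", "Medium", "High", "High"],       # Moderate
    "B": ["Low", "Low", "Medium", "Medium", "Medium"],         # Minor
    "A": ["Low", "Low", "Low", "Low", "Low"],                  # Insignificant
}

def risk_rating(likelihood: int, consequence: str) -> str:
    """Return the risk rating for a numeric Likelihood and an alphabetic Consequence."""
    if likelihood not in range(1, 6):
        raise ValueError("Likelihood must be 1 (Rare) to 5 (Almost Certain)")
    row = RATING_MATRIX.get(consequence.upper())
    if row is None:
        raise ValueError("Consequence must be A (Insignificant) to E (Extreme)")
    return row[likelihood - 1]

print(risk_rating(3, "D"))  # "High", matching the 3D = High example in step 6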
NDIA Risk Rating Matrix
The Agency has risk assessment criteria, including a risk matrix (below), to support the process of applying a severity rating to each identified risk.
The matrix contains consequence assessment criteria for both strategic and operating risks (left hand side of the matrix) and project risks (right hand side of the matrix), with the same likelihood ratings and severity matrix used for all risk categories (centre of the matrix).
Likelihood ratings (likelihood of the risk occurring in the next year):
One - Rare: less than 5%
Two - Unlikely: 5% to 20%
Three - Possible: 20% to 50%
Four - Likely: 50% to 80%
Five - Almost Certain: greater than 80%

Strategic and operational consequence criteria cover Participant Outcomes, Scheme Sustainability (scheme support and Agency costs), Staff and Partners in the Market (service supply, workforce capacity and provider sentiment) and Public Confidence & Trust. Project consequence criteria cover Schedule (critical path), Cost (budget) and Outcomes / Benefits. The criteria scale from Extreme (for example, scheme support costs more than 20% over target, fatalities or serious sustained health impacts, complete loss of public confidence and trust leading to significant Government intervention and/or inquiry) down to Insignificant (for example, scheme support costs less than 2% over target, minor and isolated damage to public confidence and trust within one region or issue).

Project consequence ratings:
E - Extreme: more than 3 months or more than 30% of schedule; more than $3M or more than 30% of budget (whichever is the greater); fundamental impact on business case and benefit, requiring immediate assessment of project viability.
D - Major: more than 1 month and less than 3 months, or less than 30% of schedule; more than $1M and up to $3M, or less than 30% of budget; tangible impact on business case and benefit, requiring immediate reporting to the Steering Committee.
C - Moderate: more than 10 days and less than 1 month, or less than 20% of schedule; more than $250k and up to $1M, or less than 20% of budget; minor impact on business case and benefit.
B - Minor: more than 5 and less than 10 days, or less than 10% of schedule; more than $50k and up to $250k, or less than 10% of budget; no impact to business case and benefit.
A - Insignificant: up to 5 days or less than 5% of schedule; up to $50k or less than 5% of budget; no impact to business case and benefit.

Severity matrix (risk rating by Consequence, for Likelihood One to Five):
E - Extreme: Medium, High, High, Critical, Critical
D - Major: Medium, Medium, High, High, Critical
C - Moderate: Medium, Medium, Medium, High, High
B - Minor: Low, Low, Medium, Medium, Medium
A - Insignificant: Low, Low, Low, Low, Low
Project Risk Register
Columns: Project Risk (an uncertain event that may or may not occur during a project); Causes (list each Cause as a bullet point); Impacts (list each Impact as a bullet point); Current Consequence Rating; Current Likelihood Rating; Status.

Risk 1 (Active)
Risk/Causes: There could be configuration issues leading to data breaches where Copilot retrieves sensitive information that the user should not have had permissions for. Broader concerns regarding data governance in terms of how data is stored and who has access to it.
Impacts: Inappropriate access of information by the Copilot tool; breach of data governance and agency process; risks to data and information security; impact to participants' privacy; sensitive information compromised.
Current Consequence Rating: B - Minor. Current Likelihood Rating: Two - Unlikely.

Risk 2 (Active)
Risk/Causes: Copilot implementation lacks appropriate training for users; inadequate time allowed by the agency to train staff; agency more focussed on reducing costs, so funds are not spent on training; participants' data used within the tool to help arrive at decisions.
Impacts: Inefficient use of the tool; confusion and errors; retraining and further agency costs; more time consuming; inadvertent data breaches where private and confidential data is released.
Current Consequence Rating: C - Moderate. Current Likelihood Rating: Three - Possible.

Risk 3 (Active)
Risk/Causes: People not updating patches (live updates on PCs, software updates, Windows updates); phishing email; adversarial attacks, i.e. deliberate attempts to confuse or mislead AI systems, such as feeding them untrustworthy data to induce malfunctions.
Impacts: The introduction and outputs of Copilot will bring increased risks and channels for threats (Copilot can often provide external agency links, which may bring with them increased cyber threats); likelihood of phishing incidents due to the high volume of links present in Copilot output; ransom demands from cyber criminals against the agency and individuals; reputational damage, loss of credibility and public trust; lawsuits, damages and further regulatory fines for the agency.
Current Consequence Rating: D - Major. Current Likelihood Rating: Two - Unlikely.

Risk 4 (Active)
Risk/Causes: Microsoft updates including Copilot changes that have not been patched; reset of settings in Microsoft products enabling access to Copilot; lack of monitoring by the agency on use of Copilot.
Impacts: Unintended access to Copilot by Agency staff; data breaches, participant data breaches and confidentiality breaches; reputational damage, loss of credibility and public trust; lawsuits, damages and further regulatory fines for the agency.
Current Consequence Rating: B - Minor. Current Likelihood Rating: Three - Possible.

Risk 5 (Active)
Risk/Causes: Change-resistant culture; lack of planning and ineffective change management process; leadership not supportive of change management activities.
Impacts: Staff confusion and ineffective use of the tool; cost of AI licences is under-utilised; likelihood of data breaches and information security issues.
Current Consequence Rating: B - Minor. Current Likelihood Rating: Three - Possible.

Risk 6 (Active)
Risk/Causes: Copilot is a new project, process, and technology which will pose new risks to the agency.
Impacts: Benefits of Copilot won't be realised.
Current Consequence Rating: C - Moderate. Current Likelihood Rating: Three - Possible.
Current Risk Rating, Rationale and Controls
Columns: Current Risk rating (based on the Likelihood/Consequence fields); Current Risk Rating Rationale (rationale for the Current Risk Rating, including justification for the Current Likelihood and Consequence rating); Control Name (one line summary/title); Date when the Risk will be reviewed.

Risk 1 - Current rating: Low. Review: Ongoing/Review.
Rationale: The Agency needs to ensure its information management and security processes are prepared for Copilot, i.e. actively manage permissions and IT infrastructure to address data storage and access issues and to provide enhanced data governance to maintain appropriate oversight of data and information assets. Without the appropriate infrastructure and governance in place, there are risks to data and information security.
Controls: Existing ICT policies; interim policy for the trial; Authority to Operate; Microsoft agreement on data protection with the DTA allowing Copilot access only to files based on user identity.

Risk 2 - Current rating: Medium. Review: Ongoing/Review.
Rationale: Training needs to incorporate both technical information and practical training on the use of Copilot. The Agency needs to provide both technical information on Copilot and practical training on how to use it, such as how to author prompts. In addition, there is also a need for tailored training that reflects participants' roles and use of Copilot.
Controls: Training - all users of AI tools must have familiarised themselves with Agency supplied information (CIA) and training on the responsibilities and ethical use of AI tools within the Agency; better communication plan; effective change impact assessment; awareness of Copilot risks.

Risk 3 - Current rating: Medium. Review: Ongoing/Review.
Rationale: Valuable data on vulnerable participants; lack of security within agency firewalls; increased cyber threats due to evolving technology and more sophisticated scams.
Controls: Awareness for staff on scams and cyber threats; increased number of staff to manage cyber security; strengthened monitoring controls have been developed to capture when participant data or confidential data is entered into the AI tool; continuous improvement in IT controls; Copilot does not access CRM or the Protected Enclave.

Risk 4 - Current rating: Medium. Review: Ongoing/Review.
Rationale: Possibility of Microsoft embedding Copilot links in their existing product suites that are available to staff within the agency.
Controls: Policy in place restricting staff from unintentionally accessing Copilot; stronger monitoring in place; more staff reconfiguring settings and applying patches.

Risk 5 - Current rating: Medium. Review: Ongoing/Review.
Rationale and controls: Effective communication plan and strategy followed by the project team; comms plan drafted by Internal Communications; staff training and awareness; effective transition plan for the agency and government; high staff competency level; learning from other government agencies; conduct surveys.

Risk 6 - Current rating: Medium. Review: Ongoing/Review.
Rationale: Lack of controls within Insight for cyber security; unknown knowledge about AI technology; ability to enforce breaches is limited.
Controls: New controls for AI technology; strengthened control assurance program; stronger enforcement where policy breaches occur.
Control details
Columns: Primary Control Category (refer to Controls taxonomy sheet); Control Description (detailed description); Control Owner (SES accountable for the Control; Surname, First name); Control Manager (responsible for day-to-day management of the Control; Surname, First name); Status (for approved status, please ensure the Control has been approved by the Control Owner).

Control Description: Copilot accesses files based on the user's identity and current level of permission. Copilot does not access CRM or the Protected Enclave.
Primary Control Category: ICT_and_Physical_Security. Status: Active.
Control attributes
Columns: Control Sub-Category (refer to Controls taxonomy sheet, aligned with the primary control); Control Type (refer to Definitions sheet); Control Automation (refer to Definitions sheet); Frequency (refer to drop-down options); Is this control linked to a risk and/or a Regulatory Obligation? (risk, regulatory obligation, or both); How does the control mitigate the risk? (describe the effect of the control on the cause(s) and how it mitigates the risk(s)); Is the control documented within a process? (Yes or No); Is there an assurance process behind this control? (Yes or No); What is the assurance/review schedule of the control?

Control Sub-Category: Digital Access & Cyber Security. Control Type: Preventative. Control Automation: Manual with quality assurance. Frequency: Ad-hoc. Linked to: Risk.
How the control mitigates the risk: The Windows and Microsoft 365 identity controls provide effective measures to restrict access to Agency data. Microsoft has quarantined Agency data in line with the Copilot trial agreement made with the Digital Transformation Agency.
Documented within a process: Yes. Assurance process behind this control: No. Assurance/review schedule: N/A.
Treatment/Action Plan
Columns: Treatment Name (one line summary/title); Treatment Description (detailed description); Treatment/Action Plan Owner (Surname, First name); Manager (Surname, First name); What additional actions are required/underway.

Treatment Description: Will be reviewed if there is a decision to move Copilot into production and make it available beyond the trial.
Columns: Original Due Date (DD/MM/YYYY, e.g. 01/01/2021); Status (select status).
Status recorded against each of the six entries: In Progress.
Controls Taxonomy
A controls taxonomy is a controlled vocabulary of terms used to categorise and organise the Controls library.
The Taxonomy:
• ensures a consistent standard of controls for the Agency
• enables the effective management of a more sustainable Controls Library
• reduces duplication of controls
• supports a more mature risk environment
Columns: Primary Control Category; Definition; Control Sub-Category; Examples.

Primary Control Category: Standards & Documentation
Definition: Documents that govern the design, operations, specifications and regulations of the Agency and the Scheme.
Sub-categories and examples:
• Legislation: NDIS Act, PGPA Act, Privacy Act, NDIS Rules
• Policy & Frameworks: Fleet Vehicle Policy, Risk Management Framework
• Standard Operating Procedures: Standard Operating Procedures
• Practise Guidance & Knowledge Management: Guidance materials
• Templates & Checklists: Medium Term Accommodation Checklist, Off-System Planning, Severity Tools
• Enterprise Agreements & Employment Policies: Enterprise Agreements, Policies

Primary Control Category: Validation & Reviews
Definition: The monitoring, assessment and validation of data to ensure we are a high performing NDIA.
Sub-categories and examples:
• Formal Compliance & Assurance Processes: Controls Assurance, Compliance Checks
• Fraud & Risk Detection Profiles: Typologies, Reviews, Assessments
• Reconciliation: Validating expenditure
• On-System Verification: Payment Validations, mandatory and structured data fields
• System Alerts: Pace Alerts, Geolocating, payment flags
• Data Extraction & Validation: BI Reports, Data Collation for reporting, Survey Results

Primary Control Category: Organisational Governance
Definition: The tasks associated with the daily running of business.
Sub-categories and examples:
• Defined Roles & Responsibilities: Documented Roles and Responsibilities, Span of Control
• Segregation of Duties: Separate review / sign-off / approval processes
• Instruments of Delegation: HR Delegations, Financial Authorities, Board Attestation, CEO/ELT Representation letters
• Reporting: Financial Reporting, Incident Reporting, Declarations, Risk Reporting, ELT Reporting, Attestations
• Operational Management: Procurement & Contract Management, On-Boarding & Off-Boarding Process, Action Logs, Comms Plans

Primary Control Category: Resource Management
Definition: Building staff capability and supporting a high performing NDIA.
Sub-categories and examples:
• Discretionary Training (E-Learning & Facilitated): Conferences, technical training
• Mandatory Training: Legislative, Agency Assigned
• Probation Procedure: Probation
• Performance Management: Annual Performance Plans & Performance Support

Primary Control Category: ICT & Physical Security
Definition: Managing the systems and physical assets of the Agency.
Sub-categories and examples:
• Building Security & Safety: Building Passes, Fire Alarms, Locked Doors
• Digital Access & Cyber Security: System Access and protecting data
• Records Management: Recording and Storage of documents
• Asset Management: Phones, Laptops, Fleet Vehicles, Key Registers
• Business Continuity: Crisis Communications System
• Incident Management: Incident & Emergency Reporting & Response (ICT, Cyber, Media, WHS), Speak Up
• System Operations: ICT Maintenance, ICT Upgrades, New ICT Programs

Primary Control Category: Work Health & Safety
Definition: Protecting the health and safety of staff and participants.
Sub-categories and examples:
• Workplace Supports: Ergonomic Workstations, Disability Adjustments, EAP
• Health, Safety & Wellbeing: Personal Emergency & Evacuation Plans, Wellbeing Programs
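To show how the taxonomy can be applied, the sketch below is illustrative only: the structure and function names are assumptions, and only two primary categories are included. It records part of the vocabulary above and checks that a control's sub-category sits under its primary control category, as with the Copilot access control recorded earlier.

# Illustrative sketch: part of the Controls Taxonomy as a lookup used to validate
# that a control's sub-category belongs to its primary control category.
# Names and structure are illustrative only; remaining categories omitted for brevity.
CONTROLS_TAXONOMY = {
    "Standards & Documentation": {
        "Legislation", "Policy & Frameworks", "Standard Operating Procedures",
        "Practise Guidance & Knowledge Management", "Templates & Checklists",
        "Enterprise Agreements & Employment Policies",
    },
    "ICT & Physical Security": {
        "Building Security & Safety", "Digital Access & Cyber Security",
        "Records Management", "Asset Management", "Business Continuity",
        "Incident Management", "System Operations",
    },
}

def is_valid_categorisation(primary: str, sub_category: str) -> bool:
    """True if the sub-category sits under the given primary control category."""
    return sub_category in CONTROLS_TAXONOMY.get(primary, set())

print(is_valid_categorisation("ICT & Physical Security", "Digital Access & Cyber Security"))  # True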
Risk Register drop-down options (List data)
Primary Control Category: Standards & Documentation; Validation & Reviews; Organisational Governance; Resource Management; ICT & Physical Security; Work Health & Safety.
Control Sub-Category (sub-categories are aligned with a specific primary control; refer to the control taxonomy tab for mappings): Legislation; Policy & Frameworks; Standard Operating Procedures; Practise Guidance & Knowledge Management; Templates & Checklists; Enterprise Agreements & Employment Policies; Formal Compliance & Assurance Processes; Fraud & Risk Detection Profiles; Reconciliation; On-System Verification; System Alerts; Data Extraction & Validation; Defined Roles & Responsibilities; Segregation of Duties; Instruments of Delegation; Reporting; Operational Management; Discretionary Training (E-Learning & Facilitated); Mandatory Training; Probation Procedure; Performance Management; Building Security & Safety; Digital Access & Cyber Security; Records Management; Asset Management; Business Continuity; Incident Management; System Operations; Workplace Supports; Health, Safety & Wellbeing.
Control Type: Corrective; Detective; Directive; Preventative.
Control Evaluation (automation): Automated with Intervention; Fully Automated; Manual; Manual with quality assurance.
Frequency: Ad-hoc; Daily; Weekly; Monthly; Quarterly; Bi-Annually; Annually.
Linked to risk or obligation: Risk; Regulatory Obligation; Both.
Control documented within a process?: Yes; No.
What is the assurance/review schedule of this control?: Daily; Weekly; Monthly; Quarterly; Annually; N/A.
DOCUMENT 5
ADDITIONAL ESTIMATES 2023–24
TITLE:
USE OF AI/ML AND ADVANCED TECHNOLOGIES WITHIN NDIA
WITNESS: Sam Porter, Chief Operating Officer, DCEO Enabling Services
Strategic Narrative
• The NDIA is supportive of the safe, responsible, and ethical use of
Artificial Intelligence (AI) in line with the advice provided by the AI in
Government Taskforce.
• AI, Machine Learning, and generative AI will provide opportunities for the
NDIA in the future to improve the efficiency and effectiveness of delivery
to NDIS participants.
• Machine Learning is a subset of AI that involves the use of algorithms to learn from
data and make predictions or decisions without being explicitly programmed.
• Generative AI is a type of AI that involves the use of machine learning models to
generate new data that is like a training set. It differs from other forms of machine
learning in that it doesn't just make predictions or decisions based on data but can
also create new data.
• The Agency is still developing its policy position on the use of AI,
especially Generative AI, which is changing quickly in the market. We are
listening to various views from across the Australian Public Service to
inform the NDIA approach.
• On 20 September 2023, the Hon Ed Husic MP, Minister for Industry and
Science and Senator Katy Gallagher, Minister for Finance announced the
AI in Government Taskforce.
• In January 2024, the Australian Government released its interim
response to the Safe and Responsible AI in Australia consultation, which
seeks to ensure that AI is safe and responsible.
• The response focuses on the use of AI in high-risk settings, where harms
could be hard to undo, while allowing low-risk AI use to keep growing.
• The government is thinking about mandatory rules for AI development
and deployment in high-risk settings and is taking immediate actions
such as working with industry to develop a voluntary AI Safety Standard
and options for voluntary labelling and watermarking of AI-generated
materials. This work will be vital to the safe and responsible use of AI
within the NDIS.
• The NDIS will develop approaches to the governance, risk management
and adoption of AI at an agency level, based on the outcomes of the AI in
Government Taskforce and future update to the interim guidance.
• As part of the upcoming Microsoft Copilot trial coordinated by the Digital
Transformation Agency (DTA), we will work through uplifting Agency
capability to provide a better understanding of how AI may impact
Agency security and support the disability sector.
• Since AI can easily take inputs, NDIA staff will be educated on the risks of
disclosing classified information or personal information. The
Privacy Act
1988 (Privacy Act) and Protective Security Policy Framework (PSPF) will
be referenced in all training.
• Aligned with the DTA guidance, the Chief Information Officer will be clear
in warning that participant data, plans, or supports should never be
included in any interaction with an AI service, whether it is approved or
not.
• In May 2023, the Agency stopped access to Open AI products (see
Background) in the Agency Information and Communications Technology
Environment.
• This decision was driven by compliance obligations with the Privacy Act
along with the risk of participant data being stored overseas and the re-
use of that data by third parties.
• AI tools offer new and innovative ways for government agencies to
improve effectiveness and efficiency of services, but AI also gives threat
actors new and improved abilities for targeting ICT systems, which has
already been seen with better and more targeted phishing campaigns.
Background
• OpenAI products are a collection of AI tools and models created by the research
organisation OpenAI, with ChatGPT and similar language-based models being the
most popular products.
• OpenAI and similar products can influence their operating environment by generating
an output (results, recommendations, or options) based on a given set of objectives.
They use a combination of machine learning, generative AI, and/or data and inputs
from humans to perceive real and/or virtual environments and simplify these
perceptions into models through automated analysis. The output of these models can
be documents or images.
• Access to Open AI products was largely restricted in May 2023 and the NDIA has
continued to track the launch of new products into the market.
Improved AI monitoring in the future
• The NDIA Cyber Security operations team continues to provide regular scanning of
the broader technology and AI landscape.
• In September 2023, NDIA Cyber Operations broadly blocked all generative AI
services.
• This block excludes services like Read Speaker, Adobe Sensei, and Microsoft’s Bing
native AI chat function which can be used to support Agency assistive technology
requirements.
• Access to Generative AI services was blocked due to the following risks:
− Quality of data outputs
− NDIS data being part of the AI training set which might include Agency
protected data
− NDIA would not have control over NDIS data once it was used in external AI
services
− The use of generative AI products has led to a big rise in malware created by
criminal and state-based actors who want to attack systems and services.
AI Opportunities for the NDIA
• By using Machine Learning AI, Generative AI, and process automation, the NDIA can
enhance its performance and productivity in administrative tasks. This can lead to
better results for Participants and lower costs to improve scheme sustainability.
• The NDIA will work with the DTA, the AI in Government Taskforce, and other
government entities to establish a cohesive strategy and effective governance
model for the management and use of AI in NDIA to ensure:
− the accuracy of data and outcomes developed through AI
− the security of the systems and data engaging with AI
− the development of clear governance structures for the use of AI
− effective educative supports allow NDIA users and ICT staff to work within
whole of government and legislative guiderails when using AI to keep critical
data secure.
• Examples of sanctioned use of generative AI within the Agency ICT environment
include:
− Adobe Sensei: This product allows NDIA business areas to utilise generative AI
to assist in editing images quickly within Adobe products, saving critical time in
the creation of marketing or educative materials.
− ReadSpeaker.com: This web-based generative AI application provides
text-to-speech services on the NDIA external websites. The AI functionality
allows text to mimic natural language and context-driven speech patterns and is
an advance on robotic text translators.
CoPilot for Microsoft 365
• In November 2023, the NDIA was invited to be one of 30 agencies to take part in a
managed trial of CoPilot for Microsoft 365 (M365) coordinated by the DTA.
• The trial of Copilot for M365 will provide a safe and integrated AI environment that
can be fully controlled by the Agency. The trial will begin in February 2024 with 300
initial licences and will include members of the NDIA Employee Disability Network.
Scheme Actuary algorithms
• The Agency’s Scheme Actuary and Data Analytics area do not use generative AI
within any of the algorithms that currently support the scheme.
• Machine learning is utilised within draft budgets (or Typical Support Packages)
for first plans, based on key information from a participant's profile. However, the
algorithm is only ever used to make recommendations, with decisions made by actual
delegates.
• The machine learning recommendations are used to assist delegates by speeding
up the initial analysis to provide quicker resolutions for participants and improved
service.
Policy Development Update
• The NDIA will work with the DTA to develop an interim policy for the governance
and use of AI, based on the outcomes of the Copilot for M365 trial and the AI in
Government Taskforce.
DOCUMENT 6
ADDITIONAL ESTIMATES 2023-2024
TITLE:
USE OF AI/ML AND ADVANCED TECHNOLOGIES WITHIN NDIA
WITNESS: Sam Porter, Chief Operating Officer, Deputy Chief Executive Officer,
Enabling Services
Strategic Narrative
• The NDIA is supportive of the safe, responsible, and ethical use of
Artificial Intelligence (AI) in line with the advice provided by the AI in
Government Taskforce.
• AI, Machine Learning, and generative AI will provide opportunities for
the NDIA in the future to improve the efficiency and effectiveness of
delivery to NDIS participants.
• Machine Learning is a subset of AI that involves the use of algorithms to learn from
data and make predictions or decisions without being explicitly programmed.
• Generative AI is a type of AI that involves the use of machine learning models to
generate new data that is like a training set. It differs from other forms of machine
learning in that it doesn't just make predictions or decisions based on data but can
also create new data.
• The Agency is still developing its policy position on the use of AI,
especially Generative AI, which is changing quickly in the market. We are
listening to various views from across the Australian Public Service to
inform the NDIA approach.
• On 20 September 2023, the Hon Ed Husic MP, Minister for Industry and
Science and Senator Katy Gallagher, Minister for Finance announced the
AI in Government Taskforce.
• In January 2024, the Australian Government released its interim
response to the Safe and Responsible AI in Australia consultation, which
seeks to ensure that AI is safe and responsible.
• The response focuses on the use of AI in high-risk settings, where harms
could be hard to undo, while allowing low-risk AI use to keep growing.
• The government is thinking about mandatory rules for AI development
and deployment in high-risk settings and is taking immediate actions
such as working with industry to develop a voluntary AI Safety Standard
and options for voluntary labelling and watermarking of AI-generated
materials. This work will be vital to the safe and responsible use of AI
within the NDIS.
• The NDIS will develop approaches to the governance, risk management
and adoption of AI at an agency level, based on the outcomes of the AI in
Government Taskforce and future update to the interim guidance.
• As part of the upcoming Microsoft Copilot trial coordinated by the Digital
Transformation Agency (DTA), we will work through uplifting Agency
capability to provide a better understanding of how AI may impact
Agency security and support the disability sector.
• Since AI can easily take inputs, NDIA staff will be educated on the risks of
disclosing classified information or personal information.
The Privacy Act
1988 (Privacy Act) and Protective Security Policy Framework (PSPF) will
be referenced in all training.
• Aligned with the DTA guidance, the Chief Information Officer will be clear
in warning that participant data, plans, or supports should never be
included in any interaction with an AI service, whether it is approved or
not.
• In May 2023, the Agency stopped access to Open AI products (see
Background) in the Agency Information and Communications Technology
Environment.
• This decision was driven by compliance obligations with the Privacy Act
along with the risk of participant data being stored overseas and the re-
use of that data by third parties.
• AI tools offer new and innovative ways for government agencies to
improve effectiveness and efficiency of services, but AI also gives threat
actors new and improved abilities for targeting ICT systems, which has
already been seen with better and more targeted phishing campaigns.
Background
• OpenAI products are a collection of AI tools and models created by the research
organisation OpenAI, with ChatGPT and similar language-based models being the
most popular products.
• OpenAI and similar products can influence their operating environment by generating
an output (results, recommendations, or options) based on a given set of objectives.
They use a combination of machine learning, generative AI, and/or data and inputs
from humans to perceive real and/or virtual environments and simplify these
perceptions into models through automated analysis. The output of these models can
be documents or images.
• Access to Open AI products was largely restricted in May 2023 and the NDIA has
continued to track the launch of new products into the market.
Improved AI monitoring in the future
• The NDIA Cyber Security operations team continues to provide regular scanning of
the broader technology and AI landscape.
• In September 2023, NDIA Cyber Operations broadly blocked all generative AI
services.
• This block excludes services like Read Speaker, Adobe Sensei, and Microsoft’s Bing
native AI chat function which can be used to support Agency assistive technology
requirements.
• Access to Generative AI services was blocked due to the following risks:
− Quality of data outputs
− NDIS data being part of the AI training set which might include Agency
protected data
− NDIA would not have control over NDIS data once it was used in external AI
services
− The use of generative AI products has led to a big rise in malware created by
criminal and state-based actors who want to attack systems and services.
AI Opportunities for the NDIA
• By using Machine Learning AI, Generative AI, and process automation, the NDIA
can enhance its performance and productivity in administrative tasks. This can lead
to better results for Participants and lower costs to improve scheme sustainability.
• The NDIA will work with the DTA, the AI in Government Taskforce, and other
government entities to establish a cohesive strategy and effective governance
model for the management and use of AI in NDIA to ensure:
− the accuracy of data and outcomes developed through AI
− the security of the systems and data engaging with AI
− the development of clear governance structures for the use of AI
− effective educative supports allow NDIA users and ICT staff to work within
whole of government and legislative guiderails when using AI to keep critical
data secure.
• Examples of sanctioned use of generative AI within the Agency ICT environment
include:
− Adobe Sensei: This product allows NDIA business areas to utilise generative AI
to assist in editing images quickly within Adobe products, saving critical time in
the creation of marketing or educative materials.
− ReadSpeaker.com: This web-based generative AI application provides
text-to-speech services on the NDIA external websites. The AI functionality
allows text to mimic natural language and context-driven speech patterns and is
an advance on robotic text translators.
CoPilot for Microsoft 365
• In November 2023, the NDIA was invited to be one of 30 agencies to take part in a
managed trial of CoPilot for Microsoft 365 (M365) coordinated by the DTA.
• The trial of Copilot for M365 will provide a safe and integrated AI environment that
can be fully controlled by the Agency. The trial will begin in February 2024 with 300
initial licences and will include members of the NDIA Employee Disability Network.
Scheme Actuary algorithms
• The Agency’s Scheme Actuary and Data Analytics area do not use generative AI
within any of the algorithms that currently support the scheme.
• Machine learning is utilised within draft budgets (or Typical Support Packages)
for first plans, based on key information from a participant's profile. However, the
algorithm is only ever used to make recommendations, with decisions made by actual
delegates.
• The machine learning recommendations are used to assist delegates by speeding
up the initial analysis to provide quicker resolutions for participants and improved
service.
Policy Development Update
• The NDIA will work with the DTA to develop an interim policy for the governance
and use of AI, based on the outcomes of the Copilot for M365 trial and the AI in
Government Taskforce.
DOCUMENT 7
BUDGET ESTIMATES 2024–25
TITLE:
USE OF AI/ML AND ADVANCED TECHNOLOGIES WITHIN NDIA
WITNESS: Sam Porter, Chief Operating Officer, Deputy Chief Executive Officer,
Enabling Services
Strategic Narrative
• The NDIA is supportive of the safe, responsible, and ethical use of
Artificial Intelligence (AI) in line with the advice provided by the AI
in Government Taskforce.
• AI, Machine Learning, and Generative AI will provide opportunities
for the NDIA to improve the Agency’s efficiency and effectiveness.
• However, there are also important questions about how to ensure
that the use of AI is safe, responsible and ethical.
• We are closely following whole-of-government developments in this
space, and taking a cautious approach.
• The Australian Government is thinking about mandatory rules for AI
development and deployment in high-risk settings and is taking
immediate actions such as working with industry to develop a
voluntary AI Safety Standard and options for voluntary labelling and
watermarking of AI-generated materials. This work will be vital to the
safe and responsible use of AI within the NDIA.
• The Agency has developed an interim policy position on the use of AI,
especially Generative AI, which is changing quickly in the market. We
are working with the AI in Government Taskforce on this dynamic
policy issue.
• See Attachment A for the Use of Artificial Intelligence, Generative AI,
and Machine Learning – Interim Policy.
• The NDIA will develop approaches to the governance, risk
management and adoption of AI at an agency level, based on the
outcomes of the AI in Government Taskforce and future updates to the
interim guidance.
• As part of the trial of CoPilot for Microsoft 365 (M365) coordinated by
the Digital Transformation Agency (DTA), we are uplifting Agency
capability to provide a better understanding of how AI may impact
Agency security and support the disability sector.
• In May 2023, the Agency stopped access to Open AI products (see
Background) in the Agency Information and Communications
Technology Environment.
• This decision was driven by concerns about preserving the security,
confidentiality and privacy of information held by the Agency.
• On 20 September 2023, the Hon Ed Husic MP, Minister for Industry
and Science and Senator Katy Gallagher, Minister for Finance
announced the AI in Government Taskforce.
• In January 2024, the Australian Government released its interim
response to the Safe and Responsible AI in Australia consultation,
which seeks to ensure that AI is safe and responsible.
• The response focuses on the use of AI in high-risk settings, where
harms could be hard to undo, while allowing low-risk AI use to keep
growing.
• On 17 January 2024, interim advice from the AI in Government
Taskforce was released to guide the safe and responsible use of AI.
The advice indicated that mandatory guiderails for AI development
and deployment of generative AI in high-risk settings may occur
through specific changes to legislation to protect privacy and sharing
of confidential information.
• The DTA updated its advice to Government Departments and
corporate entities on the use of Generative AI to support the
management of risk in November 2023.
• The NDIA welcomes the AI in Government Taskforce, created on
20 September 2023, and will consider outcomes from the Taskforce
and any guidance on public use of generative AI platforms.
• The NDIA suspended use of Open AI products in May 2023 to prevent
the input of personal information into AI tools and to ensure alignment
with the Privacy Act.
• The Agency continues to monitor the rapidly evolving AI and
Generative AI technology landscape and block access to tools in the
interest of protecting Agency and participant data.
Background
• Machine Learning is a subset of AI that involves the use of algorithms to
learn from data and make predictions or decisions without being explicitly
programmed.
• Generative AI is a type of AI that involves the use of machine learning models
to generate new data that resembles its training data. It differs from other forms
of machine learning in that it does not just make predictions or decisions based
on data but can also create new data.
• OpenAI products are a collection of AI tools and models created by the
research organisation OpenAI, with ChatGPT and similar language-based
models being the most popular products.
• OpenAI and similar products can influence real or virtual environments by
generating outputs (results, recommendations, or options) for a given set of
objectives. They use a combination of machine learning, generative AI, and/or data
and inputs from humans to perceive real and/or virtual environments and distil
these perceptions into models through automated analysis. The outputs of these
models can include documents or images.
• Access to Open AI products was largely restricted in May 2023 and the NDIA
has continued to track the launch of new products into the market.
Improved AI monitoring in the future
• The NDIA Cyber Security operations team continues to provide regular scanning
of the broader technology and AI landscape.
• In September 2023, NDIA Cyber Operations broadly blocked all Generative
AI services.
• This block excludes services such as ReadSpeaker, Adobe Sensei, and Microsoft’s
Bing native AI chat function, which can be used to support Agency assistive
technology requirements.
• Access to Generative AI services was blocked due to the following risks:
− the quality of data outputs
− NDIS data becoming part of the AI training set, which might include
Agency protected data
− the NDIA losing control over NDIS data once it is used in external
AI services
− the use of Generative AI products driving a significant rise in malware
created by criminal and state-based actors seeking to attack systems and
services.
AI Opportunities for the NDIA
• By using Machine Learning, Generative AI, and process automation, the NDIA
can enhance its performance and productivity in administrative tasks. This can
lead to better results for participants and lower costs, improving scheme
sustainability.
• The NDIA will work with the DTA, the AI in Government Taskforce, and
other government entities to establish a cohesive strategy and effective
governance model for the management and use of AI in NDIA to ensure:
− the accuracy of data and outcomes developed through AI
− the security of the systems and data engaging with AI
− the development of clear governance structures for the use of AI
− effective educative supports that allow NDIA users and ICT staff to work within
whole-of-government and legislative guardrails when using AI to keep
critical data secure.
• Examples of sanctioned use of generative AI within the Agency ICT
environment include:
− Adobe Sensei: This product allows NDIA business areas to utilise generative
AI to assist in editing images quickly within Adobe products, saving critical
time in the creation of marketing or educative materials.
− ReadSpeaker.com: This web-based generative AI application provides
text-to-speech services on the NDIA’s external websites. The AI functionality
allows text to mimic natural language and context-driven speech patterns and
is an advance on robotic text translators.
CoPilot for Microsoft 365
• In November 2023, the NDIA was invited to be one of 30 agencies to take part in
a managed trial of CoPilot for M365 coordinated by the DTA.
• The trial of Copilot for M365 provides a safe and integrated AI environment that
can be fully controlled by the Agency. The trial began in February 2024 with 300
initial licences and includes members of the NDIA Employee Disability Network.
• CoPilot is a decision support system that supports Microsoft products and
services. It automates basic activities such as minute-taking and calculations when
directed. It does not make decisions or independently generate output.
Security of CoPilot for Microsoft 365
• CoPilot is part of the M365 tenancy, which the NDIA uses for its Operating
Environment. The Operating Environment has a set of features incorporated into
its design which protect Agency information and systems.
• The NDIA’s Operating Environment has been subject to a separate and rigorous
cyber security assessment.
• CoPilot is a private large language model, which is separate from other Microsoft
customers. Agency information is not available to the public and is not used to
train CoPilot.
• The Agency has performed a separate risk assessment on CoPilot and reviewed
DTA’s IRAP Assessment.
• CoPilot uses existing access levels and configurations across the Microsoft suite.
It leverages the Agency’s existing controls to prevent unauthorised access to
information.
• CoPilot users undertake training on how CoPilot works and their responsibilities
when using it at work. This includes their responsibility for the use of all CoPilot
output.
Scheme Actuary algorithms
• The Agency’s Scheme Actuary and Data Analytics area does not use Generative
AI within any of the algorithms that currently support the scheme.
NDIA Policy Update
• In April 2024, the CIO approved the NDIA’s Use of Artificial Intelligence,
Generative AI, and Machine Learning – Interim Policy. The policy was
considered by ICT, cyber security and legal experts in the NDIA. See
Attachment A for the policy.
• The NDIA shared the policy with the DTA as part of our participation in the
trial of Copilot for M365. Feedback and comments from the DTA and the AI in
Government Taskforce will be considered as the NDIA actively monitors the AI
environment.
Attachments
• Attachment A - Use of AI Gen AI and ML Interim Policy
DOCUMENT 8
Title:
Use of AI/ML and Advanced Technologies Within NDIA
Witness: Sam Porter, Chief Operating Officer, Deputy Chief Executive Officer
Enabling Services
Talking Points
• The NDIA supports the safe, responsible and ethical use of Artificial
Intelligence (AI). This aligns with advice provided by the AI in Government
Taskforce led by the Digital Transformation Agency (DTA).
• AI, Machine Learning (ML) and generative AI provide opportunities
to improve the Agency’s efficiency and effectiveness.
• The Agency follows guidance from the DTA to ensure its use of AI is safe,
responsible and ethical.
• The NDIA is part of a Government AI working group that is developing rules
for AI development and deployment in high-risk settings. This will
complement the work underway by the DTA with industry to develop a voluntary
AI Safety Standard and options for voluntary labelling and watermarking of
AI-generated materials. The Agency’s work with the DTA will be vital for the
safe and responsible use of AI within the Agency.
• The NDIA has developed an interim policy position on the use of AI, noting
this technology is changing quickly with a growing market, expansion of AI
tools and integration of AI within existing tools.
• The NDIA took part in the Copilot for Microsoft 365 trial coordinated by the
DTA, which ran from January 2024 to June 2024.
• This trial provided valuable data on the impact AI can have and how it can
better support the disability sector. In comparison to open AI tools, Copilot
provides the benefits of a large language model while protecting Agency
data. Risk is reduced because open-market AI tools harvest data as part of
their operating model.
Use of AI in the NDIA – Copilot Trial
• The NDIA provides training to employees and contractors on the responsible
use of AI. This has included micro-credentials at the AI introductory level,
and an AI literacy short course targeting the Agency’s senior leaders.
The Agency provided specific training to staff who participated in the
Microsoft Copilot trial to ensure appropriate, ethical and responsible use
of AI.
• The DTA Policy for the responsible use of AI in government (external) and
the NDIA’s Use of AI, Generative AI, and Machine Learning - Interim Policy
are clear that sensitive and security classified information must not be
entered into generative AI tools.
• In May 2023, the Agency ceased access to Open AI products in
the Agency’s operating environment due to concerns regarding the security,
confidentiality and privacy of information held by the Agency.
• The NDIA has blocked access to public generative AI services, continues
to monitor web content filtering and data loss prevention controls, and
assesses and blocks new websites as they arise.
DOCUMENT 9
Title:
Use of AI/ML and Advanced Technologies Within NDIA
Witness: Sam Porter, Chief Operating Officer, Deputy Chief Executive Officer
Enabling Services
Talking Points
• The NDIA supports the safe, responsible and ethical use of Artificial
Intelligence (AI). This aligns with advice provided by the AI in Government
Taskforce led by the Digital Transformation Agency (DTA).
• AI, Machine Learning (ML) and generative AI provide opportunities
to improve the Agency’s efficiency and effectiveness.
• The Agency follows guidance from the DTA to ensure its use of AI is safe,
responsible and ethical.
• The NDIA is part of a Government AI working group that is developing rules
for AI development and deployment in high-risk settings. This will
complement the work underway by the DTA with industry to develop a
voluntary AI Safety Standard and options for voluntary labelling and
watermarking of AI-generated materials. The Agency’s work with the DTA
will be vital for the safe and responsible use of AI within the Agency.
• The NDIA has developed an interim policy position on the use of AI, noting
this technology is changing quickly with a growing market, expansion of AI
tools and integration of AI within existing tools.
• The NDIA took part in the Copilot for Microsoft 365 trial coordinated by the
DTA, which ran from January 2024 to June 2024. The trial has been
extended for a further 12 months (to June 2025) to enable a more thorough
review of use cases and benefit to the organisation.
• This trial provided valuable data on the impact AI can have and how it can
better support the disability sector. In comparison to open AI tools, Copilot
provides the benefits of a large language model while protecting Agency
data. Risk is reduced because open-market AI tools harvest data as part of
their operating model.
Use of AI in the NDIA – Copilot Trial
• The NDIA provides training to employees and contractors on the responsible
use of AI. This has included micro-credentials at the AI introductory level,
and an AI literacy short course targeting the Agency’s senior leaders.
The Agency continues to provide specific training to staff participating in the
Microsoft Copilot trial to ensure appropriate, ethical and responsible use
of AI.
• The DTA Policy for the responsible use of AI in government (external) and
the NDIA’s Use of AI, Generative AI, and Machine Learning - Interim Policy
are clear that sensitive and security classified information must not be
entered into generative AI tools.
• In May 2023, the Agency ceased access to Open AI products in
the Agency’s operating environment due to concerns regarding the security,
confidentiality and privacy of information held by the Agency.
• The NDIA has blocked access to public generative AI services, continues
to monitor web content filtering and data loss prevention controls, and
assesses and blocks new websites as they arise.
DOCUMENT 10
Title:
Use of AI/ML and Advanced Technologies Within NDIA
Witness: Sam Porter, Chief Operating Officer, Deputy Chief Executive Officer,
Enabling Services
Talking Points
• The NDIA supports the safe, responsible and ethical use of Artificial
Intelligence (AI). This aligns with advice provided by the AI in Government
Taskforce led by the Digital Transformation Agency (DTA).
• AI, Machine Learning (ML) and generative AI provide opportunities
to improve the Agency’s efficiency and effectiveness.
• The Agency follows guidance from the DTA and other agencies such as the
Australian Cyber Security Centre to ensure its use of AI is safe, responsible
and ethical.
• The NDIA ICT Governance, Risk and Assurance Team participates in a
whole-of-government, DTA-led AI working group that is developing rules for
AI development and deployment in high-risk settings. This will complement
the work underway between industry and the DTA to develop a voluntary AI
Safety Standard and options for voluntary labelling and watermarking of
AI-generated materials. The Agency’s work with the DTA will be vital for the
safe and responsible use of AI within the Agency.
• The NDIA has developed an interim policy position on the use of AI and
established a whole-of-agency AI working group to provide recommendations
on investment and management. This acknowledges the rapid evolution of AI
technology, the expanding market, and the increasing integration of AI within
existing tools.
• The NDIA took part in the whole-of-government trial of Microsoft 365 Copilot
coordinated by the DTA, which ran from January 2024 to June 2024. The
trial has been extended for a further 12 months (to June 2025, capped at
300 licences) to enable a more thorough review of use cases and benefit to
the organisation.
• This trial provided valuable data on the impact AI can have and how it can
better support the disability sector. In comparison to open AI tools, Copilot
provides the benefits of a large language model while protecting Agency
data. Risk is reduced because open-market AI tools harvest data as part of
their operating model.
Use of AI in the NDIA – Copilot Trial
• The NDIA provides training to employees and contractors on the responsible
use of AI. This has included micro-credentials at the AI introductory level,
and an AI literacy short course targeting the Agency’s senior leaders.
The Agency continues to provide specific training to staff participating in the
Microsoft 365 Copilot trial to ensure appropriate, ethical and responsible
use of AI.
• The DTA Policy for the responsible use of AI in government (external) and
the NDIA Use of AI, Generative AI, and Machine Learning - Interim Policy
are clear that sensitive and security classified information must not be
entered into generative AI tools.
• In May 2023, the Agency ceased access to Open AI products in
the Agency’s operating environment due to concerns regarding the security,
confidentiality and privacy of information held by the Agency.
• In January 2025, the Agency blocked access to DeepSeek AI products in its
operating environment in response to security concerns.
• The NDIA has blocked access to public generative AI services, continues
to monitor web content filtering and data loss prevention controls, and
assesses and blocks new websites as they arise.