Hot on the heels of our Australian Government Data Forum, our recent Work with Purpose podcast guests, Anthony Murfett, Head of Division, Technology and Digital at the Department of Industry, Science and Resources, and Sally Bayley-Nelson, manager at AusIndustry Insights, share their views on how to start using artificial intelligence (AI) tools in public sector work.
While the public sector isn’t new to AI, the emergence of generative tools such as ChatGPT has created fresh momentum and a stronger awareness of the technology’s potential to improve public services.
Yet many are unsure how to introduce AI into their workplaces, or how to mitigate the risks that come with it.
Our recent Work with Purpose guests – Sally Bayley-Nelson from AusIndustry Insights and Anthony Murfett from the Department of Industry, Science and Resources – say that as long as generative AI is based on bespoke data that is protected and contained, there are immense opportunities for everyone to work more efficiently.
“My hope is saving us from some of the soul-crushing drudgery that we do and free us up for more of the higher value and more interesting work,” Sally says.
Before you get started with any new technology and try to find ways to apply it in your work, Sally recommends assessing what the business needs.
“Not that there's anything wrong with playing with shiny things, I love it, and we've all played with ChatGPT, but we've had the most success in applying it and getting good benefits and managing the risks where we've looked for a problem or a process that AI can help with, rather than starting from the place of, what can we apply AI to,” Sally says.
Anthony agrees and advocates for not using technology for technology's sake, but asking yourself: “where does it make sense to use these particular technologies?”
Once you have made up your mind about where to apply artificial intelligence in your role or organisation, it’s important to test and iterate in small, controlled environments.
This will not only build a better understanding of the technology’s risks and limitations, but also help you build support for your work.
At every step of this journey, make sure to keep humans involved to do fact-checking.
“Scope something really small and manageable, test it, get it working. That allows you to manage those risks in a really controlled way,” Sally says.
“For example, we would need a call recording transcribed into words, or lots of information clustered into key themes, or texts re-written in a certain style. Those are all things that AI can do as part of its business as usual.”
“[At the Department of Industry] we are thinking about whether there are sandboxes that we can create where it's a safe environment to experiment,” Anthony adds.
“We also think about some barriers and test how this technology evolves. I think it becomes quite important to have humans in the loop to look for data, to look for bias, and to understand the data sources.”
To really embed AI in your business as usual, you will need to get buy-in from across your organisation – from leadership down to your IT team.
“As you're thinking through AI, engage the executive, talk through what it is, explain the opportunities because the more we build awareness, the greater the understanding of both the opportunity and how we manage the risk becomes,” Anthony says.
“That's going to be really important because if we don't maintain the dialogue, there can be an aversion to looking at what some of the opportunities are.”
“You will also absolutely need to build support from your IT team because they hold the keys to all of the resources that you need for these computing resource-heavy models to actually run,” Sally says.
Making the most of these new tools will require continuous training and improvement and a close eye on managing risks such as bias to ensure AI is used for good.
“Those that are looking to use AI need to make sure they've got the skills and they're building their teams around them with the relevant skills, so they understand the implications,” Anthony says.
“As the public service, we need to make sure we keep those democratic values, meet communities’ expectations, but importantly deliver a service to the community and providing advice to government.”
Anthony concludes that AI is changing the work that the public service does, and that the sector needs to embrace re-skilling to ensure staff demonstrate exemplary use of the technology, both in terms of delivery and ethics.
“I'd encourage those that are thinking about AI to read the AI ethics principles that we've released and understand what those are,” Anthony says.