AI Sydney Tour: Australian Federal Police shares how it’s using AI to protect Australia and its people

Today the Australian Federal Police (AFP) shared how it's using AI to protect Australia and its people as part of Microsoft's AI Sydney Tour. The AFP's Manager Technology Strategy and Data, Benjamin Lamont, discussed how the agency is using AI and trialling the technology responsibly to enhance its operations and the wellbeing of its people.

The AFP is Australia’s national policing agency, dedicated to protecting Australians and Australia’s interests. With more than 7,000 staff members, it has jurisdiction to investigate federal crimes across Australia and those that occur in the Australian Capital Territory.

Building on earlier innovations

In recent years, the agency has increased its use of commercial cloud computing services, including Microsoft Azure. This has enabled it to leverage Azure AI services and work with Microsoft to develop various custom AI solutions.

For example, the AFP is working with Microsoft and specialised software vendors to use AI to better detect deepfake images and other problematic content. Microsoft is providing the underlying technology and engineering expertise to integrate these solutions.

This work has potential applications in child protection, where the AFP has successfully partnered with Microsoft and shown that the technology can be effective in fighting crime.

In 2008, the AFP launched the Child Exploitation Tracking System, which Microsoft helped develop to support law enforcement agencies in more effectively tracking down child predators and rescuing victims. Today, the agency is also collaborating with the Australian Centre to Counter Child Exploitation to address challenges related to generative AI and deepfakes.

“AI is a powerful tool for detecting and addressing issues relating to deepfakes and other crimes,” said AFP Commander Helen Schneider. “It also offers many possibilities for making us more efficient and effective as a police force.”

Data analysis, Copilot and staff wellbeing

The AFP and Microsoft are also working on new ways to securely quarantine, clean and analyse material that the agency seizes as part of its investigations. This includes operating in a secure, fully disconnected environment to ensure the integrity of data and evidence.

This work and other initiatives support the AFP's goal of building an AI capability that spans from the tactical edge of its technology environment to its hyperscale cloud resources, enabling the organisation to achieve outcomes more quickly. The AFP expects to use the full capabilities of Microsoft's AI platform, from generative AI to cognitive services.

The AFP is also one of more than 50 Australian Public Service agencies that have trialled Microsoft 365 Copilot this year. It found that the generative AI service, which integrates with Microsoft 365 business applications such as Word, could significantly increase its officers’ efficiency by automating document and report creation.

Finally, the AFP is exploring how AI can be used to better support its staff as they complete difficult tasks that arise in law enforcement.

For example, generative AI can be used to create text summaries of visual content such as images or videos before officers view the content. This helps prepare officers for what they are about to see and reduces the potential for heightened reactions that can harm their mental health. The AFP is also exploring how AI might be used to modify graphic content to reduce its impact, such as by converting images from colour to greyscale and removing audio from videos.
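As a purely illustrative example, this kind of pre-processing can be sketched in a few lines of Python. This is not AFP or Microsoft code: it assumes the Pillow library and an ffmpeg binary are available, and the file names are hypothetical placeholders.

```python
# Minimal sketch (not AFP code): pre-processing graphic material to reduce
# its impact on reviewing officers, as described above.
# Assumes Pillow is installed and ffmpeg is on the PATH; paths are hypothetical.
import subprocess
from PIL import Image


def image_to_greyscale(src: str, dst: str) -> None:
    """Convert a colour image to greyscale before an officer views it."""
    Image.open(src).convert("L").save(dst)


def strip_audio(src_video: str, dst_video: str) -> None:
    """Copy a video's streams without audio (-an) so it can be reviewed muted."""
    subprocess.run(
        ["ffmpeg", "-i", src_video, "-c", "copy", "-an", dst_video],
        check=True,
    )


if __name__ == "__main__":
    image_to_greyscale("evidence_colour.jpg", "evidence_grey.jpg")
    strip_audio("evidence.mp4", "evidence_silent.mp4")
```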

“We’ve been pleased to help the Australian Federal Police migrate to the cloud, and now to explore how the agency can use AI in ways that are both effective and responsible,” said Steven Worrall, Managing Director of Microsoft ANZ. “As criminals move more and more into the digital environment, it’s vital that law enforcement authorities leverage the latest capabilities to keep the community and their own people safe.”

Taking a responsible approach

According to Schneider, the AFP carefully considers ethical and community concerns before deploying technologies such as AI.

“The AFP is very committed to ensuring that AI is used responsibly and in a way that aligns with our ethics as an organisation,” she said. “We conduct proactive due diligence, focus on robust human oversight and accountability, and carefully consider the values, norms and expectations of the community we serve before deploying any technology.

“Reflecting this commitment, we have developed a Responsible and Ethical AI Framework that leverages Microsoft’s well-developed principles. It has been very helpful to be able to work with Microsoft as we consider the new possibilities and issues presented by AI.”
