Teaching and learning in the AI era: How the University of Sydney is empowering staff and students with AI across the institution

The University of Sydney is Australia’s oldest institution of higher learning and is ranked among the world’s top 20 universities. Throughout its 174-year history, the University has maintained a forward-looking approach. Today, it’s known as much for its commitment to embracing new technologies and driving innovation as it is for academic excellence.

“The University of Sydney is committed to ensuring that equity and inclusion are at the heart of every learning experience we deliver, regardless of the physical location or mode of delivery,” says Frank Grippi, the University’s Director, Strategy & Architecture.

“Leveraging technology underpins our approach here – it allows us to effectively support around 70,000 students and close to 26,000 academic and support staff, including part-timers. It also helps us create a more connected learning environment across multiple teaching and research locations, including a station in the Great Barrier Reef Marine Park.”

The University of Sydney was early to embrace generative AI. In February 2023, the institution declared that “AI tools will become part of every workplace, and we want our students to be those who master the technology. They will need to learn how to build on the work produced by AI”.

As Jim Cook, Innovation Lead, explains, the University wants to take everything it knows about effective pedagogy and consider how AI can support it, making teaching and research easier and improving students’ learning.

Adopting AI responsibly

The University’s AI journey began about eight years ago when its Digital Innovation team started exploring the potential of large language models to benefit educational approaches by offering personalised and interactive learning experiences for students and teachers alike.

“Our computer science research teams have been exploring new horizons in AI for decades and doing outstanding academic work. But we’ve been waiting for the technology to hit a pivot point where it would be fiscally responsible to implement AI tools more broadly across the organisation,” says Cook. “That moment has now arrived with the emergence of powerful generative AI tools that are also widely accessible.”

Today, the University is taking proactive steps to equip every member of its community to use AI responsibly and productively. This has included running workshops and consultations for staff and students and developing guidelines and educational resources on the appropriate use of AI for learning and assessments.

Empowering staff and students to harness AI’s potential

The University’s eagerness to embrace generative AI has led to considerable interest from staff and students in developing new teaching and learning solutions that harness the technology’s capabilities. Support staff have also expressed interest in finding ways to make administrative processes more efficient, such as onboarding students.

Anticipating this response and to meet new use case demand, the University’s ICT division started developing a large-language-model-as-a-service platform based on Microsoft Azure in late 2022.

“The University uses a shopfront model whereby ICT acts as the enabler by providing a common platform that other University groups focused on specific education, research and operational needs can use,” says Grippi. “Using our approach, it’s possible to create an application in about 20 minutes – it’s one-click, based on deploying simple code and gets a basic product into people’s hands for trialling and testing very quickly.”

The team chose to use OpenAI’s GPT-4 Turbo and GPT-3.5 Turbo large language models through Microsoft Azure OpenAI Service. It also deployed Azure AI Search, Azure AI Studio and other related services, all of which allow it to use Microsoft’s robust security features and advanced generative AI capabilities.
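As an illustrative sketch only (the endpoint, deployment name, API version and prompts below are placeholders, not the University’s actual configuration), a chat request to an Azure OpenAI Service deployment takes roughly this shape when built for the service’s REST API:

```python
# Sketch of assembling a chat-completions request for Azure OpenAI Service.
# All values here are hypothetical; the request is built but not sent.
import json

def chat_request(endpoint: str, deployment: str, api_version: str,
                 system_prompt: str, user_question: str) -> tuple[str, bytes]:
    """Return the URL and JSON body for a chat-completions call."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    body = json.dumps({
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_question},
        ],
        "temperature": 0.2,  # low temperature suits factual study support
    }).encode("utf-8")
    return url, body

url, body = chat_request(
    "https://example-resource.openai.azure.com",  # placeholder resource
    "gpt-4-turbo",                                # placeholder deployment name
    "2024-02-01",
    "You are a study assistant for a university unit of study.",
    "Explain retrieval-augmented generation in one paragraph.",
)
```

In practice the request would be sent with an `api-key` header, and a platform like the University’s would wrap this behind a shared service rather than have each team call the API directly.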

But as Cook explains, creating the platform was a complex job: “We spent about six months figuring out how we’d put it all together, then we built it in just three months, launching in March 2023. It was pretty complex because when we started, the tools for what we wanted to do didn’t really exist.

“Luckily for us, Microsoft quickly provided everything we needed to make our approach much faster and more agile. We also worked closely with Microsoft’s AI First Movers team, which helped us integrate our platform with our service delivery mechanisms, among other things.”

Developing innovative use cases

So far, the University’s ICT team has built 33 minimum viable products on its Azure platform. One of the most successful solutions is Cogniti, an AI assistant for students, developed by Danny Liu, Associate Professor of Educational Innovation.

Cogniti allows teachers to create their own AI chatbot ‘agents’ that can be steered with specific instructions and resourced with specific contextual information from units of study. These agents can be embedded into the University’s learning management system, providing a seamless experience for students – they do not have to sign up for a separate account, and the institution provides their AI access.
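While Cogniti’s internals are not described here, the general pattern of steering an agent with teacher-written instructions and grounding it in unit-of-study materials can be sketched as follows. Everything in this snippet is hypothetical, and a production system would use a vector index such as Azure AI Search rather than naive keyword matching:

```python
# Illustrative sketch (not Cogniti's actual code) of a teacher-configured agent:
# the teacher's instructions steer the model, and snippets retrieved from the
# unit's materials are appended as grounding context.
from dataclasses import dataclass

@dataclass
class Agent:
    instructions: str          # teacher-written steering prompt
    unit_materials: list[str]  # contextual documents from the unit of study

    def retrieve(self, question: str, k: int = 2) -> list[str]:
        """Naive keyword-overlap retrieval; a real system would use a
        vector index such as Azure AI Search."""
        words = set(question.lower().split())
        scored = sorted(self.unit_materials,
                        key=lambda doc: -len(words & set(doc.lower().split())))
        return scored[:k]

    def build_prompt(self, question: str) -> str:
        """Combine instructions, retrieved context and the student's question."""
        context = "\n".join(self.retrieve(question))
        return (f"{self.instructions}\n\n"
                f"Unit context:\n{context}\n\n"
                f"Student question: {question}")

agent = Agent(
    instructions="Coach the student; do not give away full answers.",
    unit_materials=[
        "Week 3 covers assessment rubrics and marking criteria.",
        "Week 5 introduces occupational therapy intervention planning.",
    ],
)
prompt = agent.build_prompt("How do I plan an occupational therapy intervention?")
```

Embedding such an agent in the learning management system means the grounded prompt, rather than the raw model, is what students interact with, which is what lets teachers steer behaviour per unit of study.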

Teachers have full visibility over conversations with Cogniti agents, and students can flag and give feedback on AI messages.

“Using Cogniti, students can get instant, personalised support, guidance and feedback, including explanations of key concepts and coaching on study techniques,” says Cook. “It can also boost staff productivity by helping teachers with time-consuming tasks such as creating rubrics to establish criteria for assessment.”

Since Cogniti’s soft launch in October 2023, educators from 30 institutions in Australia, New Zealand and Singapore have created more than 600 AI agents using the solution. These AI agents have engaged in over 31,500 conversations with more than 10,000 users and answered thousands of syllabus and content questions.

Cogniti has also improved personalised feedback for thousands of students, including those who used Cogniti agents during a workshop to develop occupational therapy intervention plans in the University’s Faculty of Medicine and Health.

“I thought the AI was very good,” says one student. “There was a balance between challenging our suggestions that encourages us to think and back up our ideas, and affirming our suggestions with add-ons that improve the strategy delivery.”

Cogniti is currently used across 300 units of study, and the University expects this number to double by the end of 2024.

Another successful project on the University’s Azure platform is the generative AI policy navigator developed by the University’s Digital Innovation team. This tool, built using Microsoft’s public sector Information Assistant accelerator, simplifies staff access to the University’s 360 policies.

“Navigating our extensive policy library can be challenging for our 26,000 staff. However, as they become literate in working with generative AI and ask the policy navigator the right questions, they can gain some real value from it,” says Cook.

“A big reason this project has been so successful is the flexibility of Microsoft’s Information Assistant accelerator, which allowed us to provide a robust and safe tool customised to our needs.”

Reimagining assessment and feedback

The University’s dedication to the responsible and ethical use of AI means it is continuously evolving its governance approaches. “As our use of the technology expands and matures, we’re constantly reviewing our robust standards and policies, including quality assurance processes for AI outputs, and assessing what the next level of governance is that we need to embed,” Cook says.

This responsible stance reflects the University’s commitment to complementing human skills with technology and ensuring a harmonious integration of AI into its community.

“With generative AI becoming ubiquitous in everything we use, like Microsoft 365, search engines and social media, universities need to help students learn how to use it well,” says Liu.

“A key part of this will be through assessments, many of which will need to be reimagined to scaffold and support students to apply generative AI productively and responsibly. This will help them adapt to their present reality and prepare for their future reality, where working alongside AI will become as commonplace as using the internet. Authentic assessments will be ones where AI is an integrated part of the process of learning.”

The University’s Educational Innovation team has recently begun pioneering an innovative ‘two-lane’ model for assessments, including tests and exams, to ensure students develop the ability to work ethically with AI technologies and maintain academic integrity.

“We offer ‘Lane 1’ assessments with two options. In Option 1, generative AI is not allowed – this applies to typical scenarios like tests, exams and oral assessments. In Option 2, generative AI is allowed but always under supervised conditions,” says Grippi. “This helps our educators confirm certain levels of student attainment.

“With ‘Lane 2’ assessments, by contrast, students can collaborate with AI to help motivate and engage them in their learning.”

Supporting Australia to be an ethical AI leader

As the University continues its AI journey, it is excited to explore additional use cases for the technology. For example, it rolled out Microsoft Copilot in early 2024 to boost productivity and creativity for around 130,000 students and staff members. Importantly, this move ensured equitable access to generative AI technology among students, particularly benefitting those who cannot afford a Copilot subscription.

However, it also has a much broader mission, with partnership and collaboration at its core.

“Our advice for any educational institution embarking on an AI journey is to see it as a coalition,” says Grippi. “Build a community of practice within your university and with fellow universities, developers and stakeholders. Be transparent about how you’re formulating your principles for safe, responsible AI use. The more you’re willing to share your knowledge about how you’re doing that, the better outcomes you’ll achieve.”

As Cook points out, the University’s long-term vision is to support Australia in becoming a leader in ethical AI teaching and learning, as well as research and applications. “Our journey here is evolving in a very fast-moving, international context. But while we want to go fast, we also want to go safely.

“That’s one reason we recently signed a memorandum of understanding with Microsoft to help us achieve our vision. We want to build on the strong foundation we’ve already established through new collaborations related to our research, education and operations, and together harness the power of AI for good.”