Oxford colleges gain secure access to Google's Gemini 3 AI model

Google has struck a formal partnership with Oxford University to roll out its newest generative‑AI offering across the college network. The deal, announced this week, places a suite of tools directly in the hands of students, lecturers and researchers, with the university purchasing a campus‑wide “Gemini Pro” licence. Security‑focused provisioning means the model runs behind Oxford’s own firewalls, keeping data under institutional control while still tapping Google’s cloud‑scale infrastructure.

The arrangement is billed as a joint effort between the tech giant’s AI team and education specialists, aiming to align the system’s capabilities with pedagogical best practices. In practice, the package bundles the core Gemini 3 engine with an additional assistant called Deep Research, which is described as an AI‑powered aide for scholarly work. The university hopes the combination will streamline literature reviews, data analysis and citation management, though details on performance and rollout timelines remain limited.

---

Oxford's colleges and academic departments now have secure access to Gemini 3, Google's latest model, which is grounded in learning science and built in partnership with education experts. The university's Gemini Pro licences include access to Deep Research, an AI-powered academic assistant that can formulate multi-step research plans, browse hundreds of relevant web sources, reason through the findings, and synthesise them into a comprehensive, multi-page report with citations. These cutting-edge tools will enable Oxford's community to enrich its research and unlock more effective and personalised pathways to learning, while fostering crucial AI literacy and skill-building.

Oxford University is pioneering the provision of secure AI tools in higher education, to support students, accelerate research breakthroughs, and create efficiencies for administrative staff and faculty members. Alwyn Collinson, Head of the AI Competency Centre at the University of Oxford, said: 'Many of our staff and students are already experimenting with AI. The Gemini for Education and NotebookLM tools we're making available through our partnership with Google will provide secure access to leading AI models, supported by training and guidance to ensure they are used safely and responsibly for work and study.

'They will help to ensure that our researchers and academics can harness AI's potential to accelerate high-impact research, facilitate breakthroughs, and drive innovation which could help us to address key global challenges.' Oxford's pilot programme identified key use cases, such as accelerating research, supporting grant proposals, and improving productivity using Gemini and NotebookLM, all critical functions for a world-leading research institution. Through this collaboration, Oxford University and Google share a commitment to advancing critical AI literacy and skill-building, supporting the next generation as they prepare for a future in which AI tools will be woven into many aspects of work and study.

Oxford's colleges now have secure access to Gemini 3, Google's newest model, built with education experts and grounded in learning science. The partnership provides tools such as Gemini for Education, NotebookLM and the Deep Research assistant, all bundled under Gemini Pro licences. During a pilot, 85% of respondents said their productivity rose, a figure the university cites as evidence of the system's utility.

Yet the rollout leaves open questions about data privacy, integration with existing curricula and the long-term effects on student learning. Because the model is described as "secure", one might assume safeguards are in place, but the specifics of those protections have not been disclosed. Likewise, while Deep Research is billed as an AI-powered academic assistant, its actual impact on scholarly rigour remains unclear.

The collaboration marks a notable step toward embedding advanced AI in higher education, but whether the promised efficiencies translate into sustained academic benefit will depend on how faculty and students adapt to these tools.
