Deploying Azure OpenAI at a Local Authority: Governance, Data Security, and Staying Ahead of the Curve
The challenge
In early 2023, AI was moving fast. Large Language Models (LLMs) were making headlines, but for local government, the dominant question was not “how do we use this?” but “how do we know it is safe?”
One local authority wanted to give staff access to an LLM tool for drafting, summarising, and answering queries. The technical capability existed. The problem was governance: where would data go, who could see it, and could confidential information end up training an external model?
Without credible answers to those questions, any AI deployment would face significant resistance from IT security. And in early 2023 there was no established playbook: the governance frameworks that exist today had not yet been written.
The architecture decision
The key to unlocking the project was where the AI would run.
Rather than connecting staff to an external AI service, the solution was to deploy the LLM entirely within the council’s existing Microsoft Azure tenancy, using the Azure OpenAI Service API. PowerApps provided the front-end interface. Staff could interact with an LLM without any data leaving the council’s environment. The model ran inside Azure, under the same tenancy governance as SharePoint and the rest of the Microsoft 365 estate.
Confidential documents, case notes, internal briefings: all of it could be fed in freely, because it was subject to exactly the same access controls and data residency requirements as any other internal file. There was no external model to worry about, no data leaving the boundary, and no new security perimeter to create.
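For illustration, here is a minimal sketch of what a back-end call to the Azure OpenAI Service from inside the tenancy can look like, authenticated with Microsoft Entra ID so that access is governed by the same identity platform as the rest of the estate. It is written against the current openai Python SDK for clarity; the endpoint and deployment names are placeholders, and the actual deployment was fronted by PowerApps rather than called directly like this.

```python
# Minimal sketch: calling an Azure OpenAI deployment from inside the tenancy.
# Endpoint and deployment names are placeholders, not the council's.
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Authenticate with Microsoft Entra ID rather than shared API keys, so access
# is controlled by the same role assignments as other resources in the tenancy.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # resource in the chosen region
    api_version="2024-02-01",
    azure_ad_token_provider=token_provider,
)

response = client.chat.completions.create(
    model="gpt-35-turbo",  # name of the model deployment created on the resource
    messages=[
        {"role": "system", "content": "You are a drafting assistant for council staff."},
        {"role": "user", "content": "Summarise these case notes in three bullet points."},
    ],
)
print(response.choices[0].message.content)
```

Token-based Entra ID authentication, rather than shared API keys, keeps access control in the same place as the rest of the Microsoft estate.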
This also resolved the cost question. An organisation-wide Microsoft Copilot rollout carries a significant per-user per-month licence cost. For staff with low or irregular AI usage, that cost is hard to justify. The Azure OpenAI API approach meant capability was available to all staff, with usage costs incurred only when the tool was actually used. For a large organisation with mixed usage patterns, the saving was considerable.
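To make "considerable" concrete, here is a purely illustrative comparison; every figure below is an assumption for the sake of the arithmetic, not a number from the engagement.

```python
# Illustrative only: every figure here is an assumption, not the council's actual numbers.
staff = 5_000                         # staff who could use the tool
licence_gbp_per_user_per_month = 25   # assumed seat price for a Copilot-style licence
annual_licence_cost = staff * licence_gbp_per_user_per_month * 12
print(f"Seat licences for everyone: £{annual_licence_cost:,} per year")  # £1,500,000

# Pay-per-use API: only the people who actually use the tool generate cost.
regular_users = 500                   # assumed number of regular users
api_spend_gbp_per_user_per_month = 5  # assumed average token spend per regular user
annual_api_cost = regular_users * api_spend_gbp_per_user_per_month * 12
print(f"API usage billing: £{annual_api_cost:,} per year")                # £30,000
```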
Making the case to IT security
Having the right architecture was necessary. Getting it approved was a separate piece of work.
Rather than presenting to the IT security board and waiting for objections, the business case was written to pre-empt every likely concern before it was raised. Data residency, model training behaviour, access controls, audit logging, GDPR compliance, third-party data processing: each question was addressed directly, with the technical controls documented alongside the answer.
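As one hypothetical illustration of the kind of control such a case can document (nothing from the actual business case is reproduced here), audit logging can be handled by wrapping every call to the model so that the caller's identity, a timestamp, and the prompt are written to an internal log before the request is sent.

```python
# Hypothetical sketch of an audit-logging wrapper; field names and log target are illustrative.
import json
import logging
from datetime import datetime, timezone

audit_log = logging.getLogger("ai_audit")

def audited_chat(client, deployment: str, user_id: str, messages: list[dict]):
    # Record who asked what, and when, before the request is sent to the model.
    audit_log.info(json.dumps({
        "user": user_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "deployment": deployment,
        "messages": messages,
    }))
    return client.chat.completions.create(model=deployment, messages=messages)
```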
When the proposal went before the IT security board, it was approved without objection. Every concern the board might have raised had already been answered in writing.
The proposal was approved not because the board was persuaded to accept a risk, but because there was no meaningful risk left to raise.
This approach reflects something Gibbs Brothers brings to AI work in local government: not just the ability to build the product, but the understanding of what governance bodies need to see, and how to present complex technical architecture in terms that a security board can evaluate and approve with confidence.
What was delivered
The deployed tool gave staff access to an LLM within their normal working environment: a PowerApps interface, Microsoft 365 authentication, and the same login they used for everything else. The tool was live, in active use, and delivered what it was built to do.
At a time when most organisations were still debating whether AI could be used safely with sensitive data, this council was already doing it. The architecture was proven, the governance framework was in place, and staff had a working AI tool inside their existing environment — without a Copilot licence in sight.
Exploring AI for contract management
Alongside the internal assistant, a second application was explored: an LLM grounded in the New Engineering Contract (NEC) suite, designed to help project managers respond to early warnings and compensation events under NEC3 and NEC4.
The problem it was addressing is real. Even well-resourced commercial teams cannot give every compensation event and early warning the attention the contract demands. Events get cherry-picked. A commercial manager covering ten live projects cannot respond thoroughly to every notification, and lower-priority items inevitably receive less rigour. The gaps that result are routinely exploited by contractors who administer the contract more carefully than the client side does. The cost consequences can be substantial and are, in most cases, entirely avoidable.
The concept of an AI assistant grounded in the NEC suite was to complement experienced commercial support, not replace it. A well-resourced commercial team still needs to lead. The tool was intended to help cover the events that would otherwise go under-administered, and to give project managers without a commercial background a reliable reference point when they needed to respond quickly.
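For context, "grounding" an assistant in a contract suite typically means retrieval-augmented generation: the clauses most relevant to a question are retrieved from an indexed copy of the contract and supplied to the model alongside the project manager's query, so answers stay anchored to the contract text. The sketch below is hypothetical; it assumes a pre-built local index of clause embeddings, the file name, endpoint, and deployment names are placeholders, and it is not the system that was trialled.

```python
# Hypothetical sketch of grounding an assistant in an indexed copy of contract clauses.
# Assumes nec_index.json holds [{"clause": "...", "text": "...", "embedding": [...]}, ...]
# built in advance with the same embedding model. All names here are illustrative.
import json
import numpy as np
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_version="2024-02-01",
    azure_ad_token_provider=get_bearer_token_provider(
        DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
    ),
)

def answer(question: str, k: int = 3) -> str:
    with open("nec_index.json") as f:
        index = json.load(f)
    # Embed the question and pick the k most similar clauses by cosine similarity.
    q = np.array(client.embeddings.create(model="text-embedding-ada-002",
                                          input=question).data[0].embedding)
    scored = sorted(
        index,
        key=lambda c: float(np.dot(q, c["embedding"]) /
                            (np.linalg.norm(q) * np.linalg.norm(c["embedding"]))),
        reverse=True,
    )[:k]
    context = "\n\n".join(f'{c["clause"]}: {c["text"]}' for c in scored)
    # Ask the model to answer strictly from the retrieved clauses.
    reply = client.chat.completions.create(
        model="gpt-4",  # placeholder deployment name
        messages=[
            {"role": "system",
             "content": "Answer using only the contract clauses provided. "
                        "If they do not cover the question, say so.\n\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return reply.choices[0].message.content
```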
In practice, the models available in early 2023 were not capable enough to perform reliably in a contract administration context. The outputs required too much checking to be genuinely useful, and the assistant was not formally deployed.
AI capability has advanced considerably since then. The case for a well-grounded NEC assistant is stronger now than it was in 2023, and Gibbs Brothers considers this an area worth revisiting with the right client.
If you are working in local government or the public sector and want to deploy AI tools securely inside your existing Microsoft environment, Gibbs Brothers can help you build the product, design the architecture, and get it through governance. Get in touch.