Public Sector Cloud Strategy and Transformation in Australia

Are Australian governments unlocking the full cloud promise, or just testing the waters? 

Australia’s public sector has embraced cloud as a strategic enabler, not just an IT upgrade. But with rising expectations for secure, citizen-centric digital services, agencies now face a tougher challenge: moving from cloud adoption to cloud maturity. 

The ambition is high. The complexity is real. And the decisions government leaders make now will shape the next decade of public sector capability. 

Embedding Hybrid & Public Cloud into the National Digital Government Vision 

Australia’s Data and Digital Government Strategy (2023) and the forthcoming Whole-of-Government Cloud Computing Policy (effective July 2026) mark a pivotal shift: cloud-first is no longer aspirational. It’s the default operating model for modern government. 

This isn’t just a technology upgrade. It’s a structural change in how the government delivers services, manages risk, and builds resilience. 

These frameworks push agencies to: 

  • Use public cloud for new digital services 
  • Actively retire legacy and high-risk systems 
  • Prioritise reusable, interoperable platforms 
  • Modernise procurement and governance 
  • Strengthen whole-of-government consistency 

In short, we’re moving from siloed ICT to shared national digital infrastructure, a foundation that supports collaboration, agility, and citizen trust. 

Hybrid cloud plays a critical role in this vision. Public cloud accelerates innovation and scalability, but hybrid architectures allow agencies to keep sovereignty over sensitive workloads while still gaining flexibility and cost efficiency. It’s not about choosing one or the other. It’s about designing a model that balances speed with control. 

This shift raises a leadership question: 
Does your Digital Investment Plan treat cloud as infrastructure or as a strategic capability that shapes service delivery? 

Secure Cloud Strategy: Building Resilience, Assurance & Agility 

The Secure Cloud Strategy, supported by ASD’s Blueprint for Secure Cloud, is designed to move the public sector away from “lift and shift” thinking toward secure design. 

This isn’t just a checklist for compliance. It’s a mindset shift: from treating cloud as a convenient hosting option to recognising it as part of Australia’s critical national infrastructure. 

The strategy provides practical tools: 

  • Architecture patterns for secure deployment 
  • Risk assessment templates 
  • Guidance for configuration and hardening 
  • Clear shared responsibility boundaries 
  • Controls aligned to the PSPF and Essential Eight 

But the real goal is bigger: modernising how government builds, tests, and operates services in an environment where threats are constant and citizen expectations are uncompromising. 

A secure cloud strategy must help agencies: 

  • Detect threats faster 
  • Mitigate failures gracefully 
  • Respond to crises without downtime 
  • Maintain high trust in public digital services 

Cloud is no longer just a place to store applications. It’s the backbone of digital government. Every outage, every breach, every delay impacts public confidence. Treating cloud as critical infrastructure means designing resilience, agility, and assurance in from day one. 


Is your cloud security posture reactive or embedded as a strategic capability that protects trust and continuity? 

Navigating Sovereignty, Security & Complexity 

Australia’s cloud environment isn’t just shaped by technology choices; it’s defined by regulatory guardrails, security expectations, and sovereignty obligations that few other markets face. For public sector leaders, these aren’t optional considerations. They’re foundational to trust and compliance. 

Data Sovereignty: More Than a Location Requirement 

Under the PSPF, Privacy Act, and sector-specific rules like the My Health Records Act and APRA CPS 234, agencies must ensure sensitive data: 

  • Stays within Australian jurisdiction 
  • Is processed by vetted and accredited providers 
  • Aligns with sovereign risk and resilience standards 

This is why sovereign cloud regions, such as those in Canberra and Sydney, matter. They’re not just technical zones. They’re protected environments for workloads with national sensitivity, ensuring that critical data remains under Australian control. 
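As a rough sketch of how that obligation can be made operational, the check below validates a workload’s target region against an allowlist of sovereign regions before deployment. The region names, classifications, and rule itself are illustrative placeholders, not any provider’s actual identifiers or the PSPF’s actual logic.

```python
# Illustrative residency gate: higher-classification workloads must land in a
# sovereign region. Region names here are hypothetical, not real provider IDs.
SOVEREIGN_REGIONS = {"au-canberra", "au-sydney"}

def check_residency(workload: str, region: str, classification: str) -> bool:
    """Return True if the placement satisfies this sketch's residency rule:
    PROTECTED-and-above workloads must run in a sovereign region."""
    if classification in {"PROTECTED", "SECRET"}:
        return region in SOVEREIGN_REGIONS
    return True  # lower classifications: unconstrained in this simplified sketch

# A PROTECTED workload bound for an offshore region fails the check.
assert check_residency("payroll", "au-canberra", "PROTECTED")
assert not check_residency("payroll", "us-east", "PROTECTED")
```

In practice a gate like this would sit in the deployment pipeline, so non-compliant placements are blocked before they exist rather than found in an audit.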

Across Australia’s cloud policy, protective security framework, and secure cloud guidance, the message is consistent: sovereignty is a foundation for resilience. Control over data, jurisdiction, and access is not about geography alone; it is about reducing national risk, strengthening security posture, and ensuring continuity in the face of disruption. 

Security Expectations: Continuous, Not Occasional 

Australia’s Essential Eight maturity model, ISM controls, and PSPF frameworks demand more than periodic audits. They require ongoing posture management, because in a cloud world, risk is dynamic. 

That means: 

  • Continuous monitoring 
  • Policy automation 
  • Zero Trust architecture 
  • Governance at scale 

Security isn’t a bolt-on. It’s a living capability that evolves as threats evolve. 
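To make “continuous monitoring” concrete, here’s a minimal sketch of one such posture check: flagging hosts whose last patch date has drifted past a policy window. The 14-day window and host names are invented for illustration and are not a statement of any Essential Eight maturity requirement.

```python
from datetime import date, timedelta

# Illustrative posture check: which hosts are overdue for patching?
PATCH_WINDOW = timedelta(days=14)  # example policy window, not ACSC guidance

def overdue_hosts(inventory: dict[str, date], today: date) -> list[str]:
    """Return hosts whose last successful patch is older than the window."""
    return sorted(host for host, patched in inventory.items()
                  if today - patched > PATCH_WINDOW)

inventory = {"web-01": date(2025, 1, 1), "db-01": date(2025, 1, 20)}
print(overdue_hosts(inventory, today=date(2025, 1, 22)))  # ['web-01']
```

A real implementation would pull the inventory from a CMDB or endpoint agent and run on a schedule, feeding alerts rather than print statements.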

Operational Complexity: The Hidden Challenge 

Cloud promises simplicity, but reality often looks different. Agencies face: 

  • Multi-cloud governance friction 
  • Cost unpredictability 
  • Talent shortages for cloud-native skills 
  • Risk of over-dependence on a single vendor 

Recent ANAO audits show that failures rarely stem from cloud itself. They come from gaps in governance, maturity, and lagging adoption. Technology moves fast; policy and capability must keep pace. 

 
Is your agency treating sovereignty, security, and complexity as compliance hurdles or as strategic levers for trust and resilience? 

Sharpening Cloud ROI & Agility: CIO Best Practices 

Cloud maturity isn’t measured by how many systems an agency migrates. It’s measured by how effectively cloud supports outcomes: resilience, cost efficiency, service improvement, and risk reduction. The question isn’t “How much cloud do we have?” but “How much value does it deliver?” 

High-performing CIOs in the public sector share a common approach. They treat cloud as a strategic capability, not just infrastructure. Here’s what sets them apart: 

1. Strategic Governance Built-In 

Cloud strategy must be embedded early in Digital Investment Plans, not bolted on later. Governance isn’t paperwork. It’s the guardrails that keep transformation on track. 

What does this look like? 

  • Portability clauses to avoid lock-in 
  • Vendor-neutral patterns for flexibility 
  • Reusable reference architectures 
  • Clear multi-cloud guardrails 

This ensures consistency across agencies and reduces reinvention. It’s about building a system that scales without chaos. 

2. Cost Transparency & Control (FinOps for Government) 

Cloud can be a silent cost escalator if left unchecked. That’s why the government is adopting FinOps disciplines, blending finance and operations to make spending visible and accountable. 

Key practices include: 

  • Real-time monitoring across providers 
  • Workload right-sizing 
  • Clear unit costing 
  • Independent audits of cloud use 

The goal? Every dollar spent on cloud should map to measurable public value. 
Read the FinOps Public Sector Whitepaper. 
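The “clear unit costing” practice can be sketched in a few lines: apportion each service’s monthly spend across its transaction volume to get a cost-per-transaction figure. All figures below are invented for illustration.

```python
# FinOps-style unit costing sketch: map monthly spend to cost per transaction.
def unit_costs(spend: dict[str, float], transactions: dict[str, int]) -> dict[str, float]:
    """Cost per transaction for each service with a non-zero volume."""
    return {svc: round(spend[svc] / transactions[svc], 4)
            for svc in spend if transactions.get(svc)}

spend = {"payments": 12000.0, "portal": 8000.0}   # monthly cloud bill, AUD
tx = {"payments": 400_000, "portal": 100_000}     # monthly transactions
print(unit_costs(spend, tx))  # {'payments': 0.03, 'portal': 0.08}
```

The point of the metric is the trend: if cost per transaction rises while volumes are flat, something needs right-sizing.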

3. Agile, Risk-Aware Security Models 

ASD’s blueprint stresses continuous, adaptive security. CIOs must: 

  • Align provider responsibilities 
  • Automate compliance checks 
  • Standardise configurations across environments 

Security isn’t a static policy. It’s a living system that evolves as threats evolve. 
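One way to automate those compliance checks is a simple drift report: diff a live environment’s settings against an agreed baseline. The setting names below are hypothetical; a real baseline would come from the agency’s hardening standard.

```python
# Illustrative configuration-drift check against a hardening baseline.
BASELINE = {"mfa_required": True, "public_buckets": False, "logging": "on"}

def drift(live: dict) -> dict:
    """Return settings that differ from, or are missing against, the baseline."""
    return {key: live.get(key)
            for key, expected in BASELINE.items()
            if live.get(key) != expected}

live = {"mfa_required": True, "public_buckets": True}  # logging key is missing
print(drift(live))  # {'public_buckets': True, 'logging': None}
```

Run on every environment on every deploy, a report like this turns “standardise configurations” from a policy statement into an enforced property.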

4. Effective Hybrid Architecture 

Sensitive workloads often remain in sovereign regions or protected private environments, while scalable digital services leverage public cloud elasticity. The challenge? Integration. 

Legacy systems and modern cloud-native platforms must interoperate seamlessly, securely, reliably, and under consistent governance. This is where architecture discipline meets operational reality. 

5. Culture, Skills & Centres of Excellence 

Technology transformation fails without workforce capability. Agencies benefit from creating: 

  • Cloud Centres of Excellence (CCoE) 
  • Cloud-native training pathways 
  • Shared learning across government 
  • Communities of practice 

This builds consistent standards and accelerates adoption. Cloud isn’t just a tech shift. It’s a cultural one. 

6. Measuring Business Outcomes 

CIOs are moving beyond technical KPIs to outcome-based metrics: 

  • Reduced operating risk 
  • Improved citizen experience 
  • Strengthened service resilience 
  • Shorter delivery cycles 
  • Lower cost-to-serve 

Cloud success is strategic, not technical. It’s about impact, not infrastructure. 

 
Is your cloud program measured by migration milestones, or by the outcomes that matter most to citizens and government resilience? 

Looking Ahead: The Cloud-Enabled Public Sector 

Australia is building a public sector cloud ecosystem that balances innovation with sovereignty, resilience, and trust. 

The next step is consolidation. Not just running hybrid environments, but aligning them into a cohesive, cloud-native platform for the entire public sector. 

The real test: 
Can agencies deliver unified, citizen-first public services while managing risk, cost, and national control? 

 
What would success look like for your agency’s cloud-first journey in 2026 and beyond? 

Resources 

• Data and Digital Government Strategy (DTA) 

https://www.dta.gov.au/our-initiatives/data-and-digital-government-strategy

• Whole-of-Government Cloud Computing Policy 

https://www.digital.gov.au/cloud-policy

• Secure Cloud Strategy 

https://architecture.digital.gov.au/strategy/secure-cloud-strategy

• ASD Blueprint for Secure Cloud 

https://blueprint.asd.gov.au/

• Cyber.gov.au – Cloud Computing Guidance 

https://www.cyber.gov.au/business-government/protecting-devices-systems/cloud-computing

• Protective Security Policy Framework (PSPF) 

https://www.protectivesecurity.gov.au/

• Privacy Act 1988 

https://www.oaic.gov.au/privacy/privacy-legislation/the-privacy-act

• Digital Investment Management Framework (DIMF) 

https://www.dta.gov.au/our-initiatives/digital-investment-management

• Australian Government Architecture – Cloud and Hosting 

https://architecture.digital.gov.au/domains/cloud-and-hosting

• ASD Essential Eight 

https://www.cyber.gov.au/resources-business-and-government/essential-cyber-security/essential-eight

• ANAO Reports and Audit Insights 

https://www.anao.gov.au/

• FinOps Framework (FinOps Foundation) 

https://www.finops.org/framework/

Cloud in Healthcare: How Australia is Using AI to Transform Digital Health 

What if your next medical breakthrough isn’t a new drug or device, but the cloud infrastructure running quietly behind the scenes? 

Australia’s healthcare system is undergoing a quiet revolution. And at the heart of it isn’t just AI, or machine learning, or cutting-edge telehealth tools—it’s the rapid evolution and reach of cloud computing. 

From telemedicine in remote towns to real-time hospital analytics in the CBD, cloud infrastructure is no longer an IT decision. It’s a care decision. And it’s accelerating faster than most organisations are ready for. 

The Rise of Cloud in Australian Healthcare 

Cloud computing in Australian healthcare has gone from experiment to essential. 

In 2022–23, 20% of all GP services were delivered via telehealth—phone and video are now a standard part of care delivery, particularly in rural and aged care settings. 

Electronic Health Records (EHRs) are evolving from static repositories to dynamic, AI-ready platforms. 

Predictive analytics is helping hospitals forecast admissions, manage resources, and reduce waiting lists. 

But with every new capability comes a challenge: integration, security, governance, and compliance. 

Cloud has shifted from a back-end technology to a strategic engine for growth and innovation. It’s becoming the backbone of modern health delivery—and the risk and compliance surface has expanded accordingly. 

AI in Action: Smarter, Faster, Fairer Care 

Australia is at the forefront of AI and ML innovations in healthcare. 

  • AI triage bots are helping assess symptoms and direct patients to appropriate care pathways. 
  • Machine learning models are predicting patient deterioration in emergency rooms. 
  • Natural language processing is accelerating clinical documentation, giving practitioners more time with patients. 
  • Computer vision is assisting radiologists in detecting anomalies more quickly and accurately. 

These use cases are not hypothetical. They are operational today—and they rely on scalable, secure cloud environments. 

However, these technologies are only as strong as the infrastructure they run on. And in healthcare, that infrastructure must meet an exceptionally high bar. 
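To show the shape of the deterioration-prediction use case (and nothing more), here is a purely illustrative, rules-based flag of the kind a ward-monitoring system might wrap a model around. Every threshold below is invented for the example and is in no way clinical guidance.

```python
# NOT a clinical tool: toy example of threshold-based deterioration flagging.
def flag_deterioration(vitals: dict) -> bool:
    """Flag a patient for clinician review if any vital breaches an
    example threshold. Thresholds are illustrative only."""
    return (vitals.get("resp_rate", 0) > 24      # breaths per minute
            or vitals.get("spo2", 100) < 92      # oxygen saturation, %
            or vitals.get("heart_rate", 0) > 130)  # beats per minute

print(flag_deterioration({"resp_rate": 18, "spo2": 90, "heart_rate": 88}))  # True
```

Production systems replace the hand-set thresholds with trained models, but the surrounding plumbing, streaming vitals in, alerts out, is exactly the workload that has to run on secure, resilient cloud infrastructure.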

The Privacy and Compliance Tightrope 

Healthcare cloud adoption in Australia must navigate a complex environment of privacy laws, ethical obligations, and system-wide compliance expectations. 

Technology teams supporting healthcare are not simply managing digital records—they are stewards of public trust. 

The Privacy Act 1988 and the My Health Records Act 2012 impose clear responsibilities around data sovereignty, consent, and transparency. 

The Australian Digital Health Agency maintains national standards for interoperability, access controls, and cybersecurity. 

Accreditation frameworks such as ISO/IEC 27001 and IRAP (Information Security Registered Assessors Program) are becoming mandatory in procurement processes. 

Choosing the wrong cloud partner is not just a technical oversight. It becomes a compliance issue, a reputational risk, and an ethical liability. 

Choosing the Right Cloud Partner for Healthcare in Australia 

For healthcare leaders, selecting a cloud partner in healthcare is no longer a purely operational decision—it is a strategic one. 

At a minimum, ensure your cloud solution offers: 

  • Data residency within Australia 
  • IRAP-assessed infrastructure 
  • Proven interoperability with national digital health systems 
  • Capacity to support AI and machine learning workloads 
  • Transparent security protocols, SLAs, and audit trails 

Above all, choose a partner who understands that in this sector, the goal is not disruption. The goal is safe, sustainable, patient-focused innovation. 
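Interoperability with national digital health systems in practice usually means speaking FHIR. As a minimal sketch, here’s what a bare-bones FHIR R4 Patient resource looks like as JSON; the field values are made up, and a real integration would follow the HL7 FHIR specification and national conformance profiles.

```python
import json

# Minimal FHIR R4 Patient resource (illustrative values only).
patient = {
    "resourceType": "Patient",
    "name": [{"family": "Citizen", "given": ["Alex"]}],
    "birthDate": "1980-01-01",
}

# Serialise for exchange, then parse it back as a receiving system would.
payload = json.dumps(patient)
print(json.loads(payload)["resourceType"])  # Patient
```

A cloud partner claiming “proven interoperability” should be able to show exactly this kind of standards-conformant exchange against national systems, not just a generic REST API.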

Final Thought 

If you’re leading technology in a healthcare organisation, the question is no longer whether cloud and AI should be adopted. 

The real question is: are we building the kind of infrastructure that can support the next decade of health innovation? 

Because in the end, this is not just about platforms and data. It is about empowering clinicians. It is about faster, more informed decisions. And ultimately, it is about improving lives—quietly, securely, and intelligently in the background. 

Let’s build that future—thoughtfully, together. 

Resources 

1. MBS Telehealth Post-Implementation Review Final Report 
https://www.health.gov.au/sites/default/files/2024-06/mbs-review-advisory-committee-telehealth-post-implementation-review-final-report.pdf 

2. Patient Experiences in Australia 
https://www.abs.gov.au/statistics/health/health-services/patient-experiences/latest-release 

3. Australia Telehealth Market Report 2025–2034 
https://www.expertmarketresearch.com.au/reports/australia-telehealth-market 

4. Privacy Act 1988 
https://www.oaic.gov.au/privacy/privacy-legislation/privacy-act-1988 

5. My Health Records Act 2012 
https://www.legislation.gov.au/Details/C2012A00184 

6. IRAP – Information Security Registered Assessors Program 
https://www.cyber.gov.au/acsc/view-all-content/programs/irap 

7. ISO/IEC 27001 – Information Security Management 
https://www.iso.org/isoiec-27001-information-security.html 

8. FHIR (Fast Healthcare Interoperability Resources) 
https://www.hl7.org/fhir/ 

9. Real-Time AI for Patient Deterioration Prediction

Source: National Library of Medicine (PubMed)

https://pubmed.ncbi.nlm.nih.gov/37150397/

10. AI Chatbots in Australian Healthcare

Source: University of Melbourne, Pursuit
https://pursuit.unimelb.edu.au/articles/the-promise-and-peril-of-ai-chatbots-in-healthcare

11. Computer Vision in Radiology (SA Medical Imaging)

Source: Adelaide Now (News Corp Australia)
https://www.adelaidenow.com.au/news/south-australia/artificial-intelligence-advising-on-xray-diagnoses-in-sa-medical-imaging/news-story/ae20cc4c30320354069d586ca1d23846

Publishing a .NET Core App

In my last blog, I mentioned a basic Web App that I had put together using Visual Studio 2015 tooling for .NET Core. Over the last couple of days, I’ve been looking at publishing the App, and the steps involved.

.NET Core is a new beast with a lot of potential… the aim is that it will run anywhere, on anything. So to keep my costs down, I’m going to trial it on my Amazon Linux VM. Note to self: SQL Server for Linux is about a year down the track – unfortunate, as the MVC scaffold uses SQL Server LocalDB – so I’ll have to figure out what database I can use.

But, first things first… how to publish my App to Linux?

First step – publish the App using dotnet publish. Initially this failed because Bower was not referenced in my Path environment variable. Bower was installed with Visual Studio 2015 Professional – in a sub-folder of my Visual Studio 2015 folder. I’m not sure if it was installed because I had installed the tooling for .NET Core, or if it comes by default. Anyway, once that was sorted, dotnet publish worked fine and created a portable build for me to use.

I copied all of the files in the folder that dotnet publish created over to my Amazon Linux server (I used WinSCP for this). Then I found I needed to install .NET Core on Amazon Linux. Installation was easy, but when I tried to run the dotnet CLI, I received an error. Running dotnet --info from a bash shell, I saw:

dotnet: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.18' not found (required by dotnet)

After several hours of searching, I found this was due to the libstdc++ library version on my Amazon Linux distribution, which was libstdc++47. I had ensured my VM was up to date, so it seems the libstdc++ version was lagging (for whatever reason). After running the below, I was able to successfully run the dotnet CLI, with a valid response from dotnet --info.

sudo yum install libstdc++48

So, I had installed .NET Core and fixed up the library it needed. After that, I needed a way to access my Web App. Kestrel is a web server built into .NET Core and can listen on any port you tell it to. However, it is better to put a full web server in front of it as a proxy/reverse-proxy to relay requests and responses to Kestrel.

I already had a web server on my Amazon Linux VM. So I configured it with proxy and reverse-proxy mappings (to the Kestrel server in my .NET Core App). The calls to my web server forward on to the Kestrel server in the App, and vice-versa. I ran the App with dotnet run and checked access via the web server. All good 🙂
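I didn’t name the web server above, so purely as an illustration, here’s roughly what the reverse-proxy mapping might look like if it were nginx. The port and headers are assumptions (Kestrel listened on localhost:5000 by default at the time), not my exact config.

```nginx
# Hypothetical nginx reverse-proxy sketch for a Kestrel-hosted app.
server {
    listen 80;

    location / {
        proxy_pass http://localhost:5000;                # forward to Kestrel
        proxy_set_header Host $host;                     # preserve original host
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Whatever server you use, the idea is the same: the public-facing server handles ports 80/443 and passes everything through to Kestrel on a loopback port.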

Almost there.

Finally, I installed supervisor to manage start/stop/restart of my App. And set up a script to ensure supervisor is re-started whenever my Amazon Linux machine restarts.
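For reference, a supervisor program entry for an app like this looks roughly as follows. Every path, name, and user below is a placeholder, not my actual setup.

```ini
; Hypothetical supervisor entry for a .NET Core app (placeholder paths/names).
[program:mywebapp]
command=/usr/local/bin/dotnet /var/www/mywebapp/mywebapp.dll
directory=/var/www/mywebapp
autostart=true
autorestart=true
user=ec2-user
```

With that in place, supervisor keeps the app running and restarts it if it crashes, which is exactly what you want from an unattended VM.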

Note, this would have taken days if not for the early groundwork of several people who blog their efforts.

Now I have a Web App but no database. So next step is to get the database up and running.

 

Scrapy Spiders, Python Processing & Web APIs

Over the past couple of weeks, I’ve spent some time drafting a Web App for Touch Footy results. The App is built on .NET Core, and this gave me a great opportunity to review the new Visual Studio .NET Core tooling. But once I had my bare bones App, I needed some data to play with. Enter Scrapy and Python…

What I wanted was a data set that I could use in the Touch Footy App. I had a good data source and I figured my best bet was a web scraper. Scrapy made it easy for me to scrape together my test data set. It’s built using Python (hence you need some understanding of Python to use it). Python is an interpreted language. It’s great for list processing and it’s easy to read/write.

Scrapy is an open source framework for writing web crawlers, or spiders. It gives you control over how and when you execute the spiders you’ve written. And a great shell as a part of the framework to test/debug commands. After looking at other web scraping options, I decided on Scrapy as a neat way to get my data.

After a few hours coding, I had a crawler that collected the data I wanted, i.e. groups, teams, fixtures and results for my Web App. I wanted to store the data in JSON – for easy processing – and Scrapy made that easy too. It was simple then to write some Python code to process the JSON for groups, teams, fixtures and results.

All good so far, and fun to boot. My next step – how to get the data to the Touch Footy Web App? Well, Visual Studio 2015 tooling for .NET Core makes it easy to add a Web API to an MVC Web Application…. several hours later I had a working spider populating data into my App.