We all know legacy technology is a systemic challenge in the public sector. It is complex, outdated, and often closed or siloed, making data interoperability extremely difficult. Yet it is often integral, underpinning vital citizen services.
In the last twelve months, we’ve sat down with technologists, cloud architects and regulators to unpick the legacy challenge, in a bid to encourage a shift in thinking. The Cloud-First policy served a purpose: it stimulated transformation, removing some of the barriers to, and nervousness about, cloud technologies and new suppliers. However, our research suggests that ‘cloud’ as a term is often confused and ill-defined, meaning the government directive was perceived, in some instances, as ‘cloud-only’, or at least encouraged that ambition.
This ambition begins to falter when legacy technology is considered. Our research found that as much as 70% of public sector organisations’ infrastructure and 73% of data remains on-premises. Around 25% of respondents stated that over 50% of their infrastructure is legacy. Over 78% of respondents also shared they have third-party services and solutions that are not ready or fit for public cloud migration.
This is a stark indicator of the sector’s ongoing battle with legacy technology, meaning digitisation is no mean feat. A ‘lift and shift’ approach is not viable for moving these systems and workloads to the public cloud. Doing so can result in infrastructure that is more complex, cumbersome, and costly than before.
In 2013, the National Audit Office reported that £480bn of government operating revenues and £210bn of non-staff expenditure were reliant on legacy ICT, citing DWP and HMRC as case studies. The DWP pension service, introduced in 1987, serves around 13 million customers a year. Its mainframe was installed in 1974, and only in 2021 did the department bring support for it in-house, after a £143m project that it hopes will save around £257m over the next 7-8 years. We understand significant progress has been made in modernising the mainframe, and we explored this in an interview with Bryan Nelson, Lead Transformation Manager for Hybrid Cloud Services at Department for Work and Pensions (DWP) Digital, earlier this year.
Almost a decade after the 2013 NAO report, the Cabinet Office reported that the UK Government spends £2.3 billion annually on operating, maintaining, and supporting legacy systems, out of its £4.7 billion total annual IT budget. We wanted to understand why this is.
Nelson shared, “The challenge for us is the legacy estate and the complexity of these systems – many of which have been operating since the DWP was founded over 20 years ago. They cannot be easily moved into a cloud environment. We were also cognisant of the long-range cost implications of investing in cloud – especially when storing such enormous quantities of data.”
This isn’t unique to central government. Mark Gannon, Former Director of Business Change and Information Solutions at Sheffield City Council, shared that:
“It’s a very complicated world out there because not everything can go into any kind of cloud, let alone public cloud. Particularly in local government where there are lots of legacy application and data issues.”
Legacy technology often comes hand in hand with legacy thinking. Unsurprisingly, local government suffers from a significant lack of skills in cloud technology. A reliance on multi-generational IT has meant next-generation digital skills are often not required. Local government is not only competing with the private sector for digital talent but also with central government, as many new candidates seek roles within central government over other public bodies.
Mark says development of digital skills is imperative, not only for technical team members, but for public sector leaders too. “We often struggle with the digital literacy of financial leadership - I believe this is the reason so many transformation projects fail. Those in charge lack the knowledge to confidently make the big decisions, making them risk-averse, and stunting project progress”.
Limited digital skills also strengthen the case for hybrid cloud. Mark went on to share that his team developed a hybrid cloud strategy to support public cloud adoption where it was deemed appropriate.
Decades of underinvestment in the Council’s IT infrastructure meant essential services were reliant on multi-generational technology. Something that Mark refers to as a ‘spaghetti ball’ of legacy architecture. He commented, “It makes sense to move some public sector services to public cloud, but it’s not always appropriate. Local councils hold sensitive citizen data, so protecting this is a high priority. We also have applications that cannot currently move. A business critical 20-year-old application runs on a very specific device, which is technologically incompatible with modern systems, so modernising it to enable it to be cloud-based will take time.”
The Council focused initially on moving appropriate applications into a cloud environment. For those systems and applications that operate better in data centres, Mark and his team made plans to modernise existing infrastructure and leave workloads where they operate best.
This is echoed by Bryan at DWP Digital, “What’s important to us at DWP Digital is the way that we operate. Because when you have on-premises services and public cloud services, often the way you orchestrate those services, and the way you maintain and manage those services, is different. We are very much on that journey at the minute where we are doing bimodal operations. We are doing traditional IT operations and cloud-native IT administrations of our services. It’s really about how can we start introducing that private cloud capability. Just because you have on-premises IT does not mean you have a private cloud. DWP Digital is really trying to align the way we work in the public cloud with the way we work on-premises.”
He continues, “We chose the hybrid approach, which allows us to operate a public cloud and a private cloud experience from our on-premises servers. By actively subscribing to this methodology and incorporating it into our strategy, we benefit from increased data portability and efficiency where our on-premises and cloud systems can work together in unison, providing the best solutions for both our internal and external customers.”
Legacy technology will still need to be supported for the foreseeable future. Moving it to the public cloud simply hosts it in a different environment, perpetuating the issue and providing little in the way of modernisation.
The UK government’s Central Digital and Data Office has acknowledged the need to prevent future legacy IT in its Digital, Data and Technology (DDaT) Playbook, citing it as a key policy.
The playbook argues that legacy IT is the result of a failure to plan for the end of a contract’s, product’s, or service’s life, leading to technical debt. Its remedy is to ensure contracts are designed with the right length of time in mind, and that expiry, extension, transition, and termination are planned in good time.
While this is good advice, the ‘need to plan’ is arguably only a small step towards the shift in thinking required to truly combat legacy technology.
We believe the focus should be on placing the right workloads, in the right place, for the right reason. Shoehorning multi-generational IT into public cloud environments takes time, costs money, and diverts resources away from modernising legacy systems in a stable, controllable on-premises environment.
Adopting a consciously hybrid approach, and making considered, deliberate decisions about where data and workloads should be housed, can alleviate the unplanned state of flux between multiple operating models. It is time to stop reinventing the wheel every time legacy tech doesn’t work in the public cloud. Let’s drive a conscious movement, together.
Watch the full interviews with Bryan Nelson and Mark Gannon at hpe.com/uk/forthegood.