The trusty old mainframe may seem synonymous with a bygone era of computing. Left behind by cloud computing and walled off from next-gen functionalities such as artificial intelligence and business process as a service, it once seemed destined for the dustbin of IT history.
But a funny thing happened on the road to obsolescence. People kept using mainframes. No matter how attractive cloud platforms become or how imperative modern features are to leading businesses, the mainframe continues to offer a compelling value proposition. Mainframes often host applications that can’t be moved to the cloud because doing so would be either cost-prohibitive, given the substantial work needed to refactor them, or too risky, given the possibility of breaking system dependencies.
The trick is getting the mainframe to communicate with modern applications, and this is where leading enterprises are getting creative. To be sure, enterprises have been working on this problem for years, but they’re giving it a fresh look as they see the costly and potentially risky work associated with cloud migration. Rather than rip and replace legacy core systems, businesses are increasingly looking to link them to emerging technologies using innovative new connectors so that each family of systems can do what it does best.
That’s the approach taken by Meuhedet, an Israeli health insurance and care provider, whose mainframe-based electronic medical record system continues to serve as an effective store of patient data. “The vision is not to move on from legacy systems – because they work,” says Katy Bar-Shalom, the organization’s chief information officer. “The things they do are good, just not good enough. But with layers, web services, and applications, we can enlarge and bring new data and insights to our medical staff.”
Business users today expect to rely on modern applications such as CRM, data dashboards, and machine learning – and reconciling the business logic between mainframes and modern applications can be a technical challenge. This is partly because most mainframe code is written in COBOL, a language few computer science majors learn today. Modern applications are typically at the heart of enterprises’ digital transformation efforts, and legacy systems are often seen as a hurdle.
Traditional efforts to link mainframes to modern applications have focused on APIs, which work well but have limitations. Applications need prebuilt connections, or engineers need to build those connectors, which isn’t always realistic for every piece of software. Building and deploying APIs can be a complex, time-consuming process.
Organizations are meeting this challenge by redoubling their efforts on tried-and-true approaches to core system modernization that allow them to connect legacy applications to even the most modern of tools. These include AI-powered middleware solutions, advanced microservices applications, and refreshed user interfaces that harness the power of data as fuel. The result is a powerful pairing: the trusted functionality of core legacy systems with the expansive capabilities of emerging technologies.
In this way, legacy systems don’t have to be roadblocks on the path to digital transformation, but rather, engines that drive the business forward.
Now: Mainframe Remains Business Critical
Mainframes aren’t just hanging around. Nearly three-quarters of business and IT executives believe mainframes have long-term viability in their organization, and more than 90% expect to expand their mainframe footprint. Mainframes are still commonly used in tasks such as payroll processing, transaction recording, insurance underwriting, and much more. Mainframes do what they were intended to do, and they do it well.
The problem is people aren’t getting what they want from them in terms of modern functionality. More than 60% of businesses say integrating legacy tools with new applications is a challenge, and 57% say lack of business agility – an inability to respond to emerging business challenges and opportunities – is a problem with legacy systems.
New: Innovative Takes On Established Approaches Extend Legacy System Capabilities
For years, enterprises have been reinvigorating their legacy systems with the five Rs of core modernization: replatform, remediate, revitalize, replace, and retrench. Those approaches are still bearing fruit. Newer variations on these approaches introduce novel extensions that are breathing fresh life into core systems and extending their functionality for the modern, digital enterprise.
For example, the US Air Force recently began using a tool originally developed by the Defense Advanced Research Projects Agency called STITCHES, which is essentially a library of technical standards and translations that allow various applications to pass data back and forth, regardless of their underlying code. In practice, one application sends data or instructions into STITCHES’ library, which processes it into the standards of the next system. Various tools can connect to each other without requiring a common interface language.
Colonel William “Dollar” Young, the first commander of the 350th Spectrum Warfare Wing in the Air Force, says developing and deploying bespoke APIs to connect various pieces of software is time-consuming and complex. Each connection must be planned ahead of time, which limits the ability of people in the field to make connections between programs on the fly. But with STITCHES, anyone can link two or more pieces of software as soon as they need to, enhancing agility while improving connectivity between applications. “STITCHES allows humans to do what they do best, which is dream up a concept, and then the tool assembles the capabilities,” Young says.
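The hub-and-spoke idea behind a tool like STITCHES can be sketched in a few lines. This is an illustrative Python sketch only, not the actual tool: the function names, message shapes, and example systems are all invented. The key point it demonstrates is that each system registers one pair of translators to and from a shared canonical form, so N systems need N adapters rather than N² pairwise connectors.

```python
# Hypothetical hub-and-spoke translator, in the spirit of STITCHES.
# Each system registers translators to/from a shared canonical form.
canonical_adapters = {}

def register(system, to_canonical, from_canonical):
    """Register translators between a system's native format and the hub's."""
    canonical_adapters[system] = (to_canonical, from_canonical)

def send(source, target, payload):
    """Translate a payload from the source's format to the target's."""
    to_canon, _ = canonical_adapters[source]
    _, from_canon = canonical_adapters[target]
    return from_canon(to_canon(payload))

# Example: a radar system emits positions as (lat, lon) tuples, while a
# mapping tool expects {"latitude": ..., "longitude": ...} dicts.
register("radar",
         to_canonical=lambda p: {"lat": p[0], "lon": p[1]},
         from_canonical=lambda c: (c["lat"], c["lon"]))
register("mapper",
         to_canonical=lambda d: {"lat": d["latitude"], "lon": d["longitude"]},
         from_canonical=lambda c: {"latitude": c["lat"], "longitude": c["lon"]})

print(send("radar", "mapper", (32.1, 34.8)))
# {'latitude': 32.1, 'longitude': 34.8}
```

Because neither system knows the other's format, new tools can be linked "on the fly" in the field, as Young describes: registering one adapter is enough to talk to everything already on the hub.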
Others are putting fresh integration layers on top of legacy systems that incorporate more advanced capabilities. Far more than the tried-and-true APIs of old, these applications have flexible file systems that can work with data in many formats and translate them to the standards of other applications. They help bring data from legacy systems to life in new ways.
This was the approach BMW took when it used NVIDIA’s Omniverse platform to help make its UK manufacturing facility more efficient. BMW wanted to transform its assembly line to be more responsive to customization requests and support the production of more electric vehicles. But its software infrastructure was geared mostly toward producing traditional vehicles.
Rather than retool its whole software infrastructure, BMW was able to connect and extend its existing tools. NVIDIA’s Omniverse software utilizes an open-source file format that allows users to create scenes composed of many different file types. To enable multiple software systems to work in conjunction, it supports different client applications and microservices. In practice, this means that legacy data stores, ERP systems, computer-aided design software, and purchasing tools, to name just a few, can all sync up, connecting the tried-and-true functionality of legacy systems with the value-adding capabilities of emerging software.
As NVIDIA’s industry product manager, Mike Geyer, says: “You’ve spent 15 years putting data into a software system. You can keep using it. Now you can just do more with it.”
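The shared-interchange pattern behind a platform like Omniverse (which builds on the open USD scene format) can be sketched as follows. The schema and names below are invented for illustration; the point is that each tool contributes its own slice of data to one neutral representation rather than integrating pairwise with every other tool.

```python
# Hypothetical sketch of a shared, neutral "scene" that multiple systems
# contribute to. Not the real Omniverse/USD API; invented for illustration.
scene = {}  # neutral representation, keyed by asset ID

def contribute(asset_id, source, attributes):
    """Merge one tool's view of an asset into the shared scene."""
    entry = scene.setdefault(asset_id, {})
    entry[source] = attributes

# Legacy CAD supplies geometry; the ERP system supplies part metadata.
contribute("robot_arm_07", "cad", {"mesh": "arm_v3.obj", "scale": 1.0})
contribute("robot_arm_07", "erp", {"part_no": "A-1138", "supplier": "ACME"})

# A downstream simulation can now read one combined record.
print(scene["robot_arm_07"]["erp"]["part_no"])
# A-1138
```

The design choice mirrors Geyer's point: the 15 years of data already in each system stay where they are, and the shared layer simply makes them addressable by everything else.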
In another example, a commercial airline built a new app for customers to manage their membership, loyalty, and points program. The app itself is hosted in a cloud environment. A rules engine references data in the airline’s mainframe without changing any of the mainframe data. The rules engine and cloud platform allow the airline to change offerings and functionality as needed without forcing it to completely revamp its data platform, which would have been a heavy lift given that the airline industry is particularly dependent on mainframes.
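The airline's pattern, a cloud-hosted rules engine reading mainframe data without modifying it, can be sketched minimally. The fetch function, member records, and rule shapes below are all hypothetical stand-ins; the sketch only shows the read-only separation the article describes.

```python
# Minimal sketch (assumed details) of a cloud rules engine that evaluates
# loyalty offers against member records read from a mainframe, never
# writing anything back to the system of record.
def fetch_member(member_id):
    """Stand-in for a read-only query against the mainframe record store."""
    records = {"M1001": {"miles": 62000, "tier": "silver"}}
    return dict(records[member_id])  # copy: the engine never mutates source data

# Rules live in the cloud layer and can change without touching the mainframe.
rules = [
    lambda m: "gold_upgrade" if m["miles"] >= 50000 and m["tier"] == "silver" else None,
    lambda m: "bonus_miles" if m["miles"] < 10000 else None,
]

def offers_for(member_id):
    member = fetch_member(member_id)
    return [offer for rule in rules if (offer := rule(member))]

print(offers_for("M1001"))
# ['gold_upgrade']
```

Because offerings are expressed as rules in the cloud layer, the airline can add or retire them without revamping the underlying data platform.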
Next: Mainframe Levels Up To Meet Emerging Needs
Thanks to emerging technologies, the mainframe might actually become more relevant in the years ahead. A recent report from Allied Market Research found that the market for mainframe systems is expanding, in part thanks to increased adoption of Internet of Things (IoT) systems that produce massive volumes of data that would be cost-prohibitive to move to the cloud.
Economies of scale may continue to favor mainframes. Indeed, nearly 70% of business and technology executives expect mainframe computing performance to grow in the years ahead, making mainframe systems even better at these types of workloads.
There is a class of problems that is computationally deep and demands world-class processing capability. For these types of problems, mainframes may share some characteristics with supercomputers, particularly as mainframe processing power continues to increase. For jobs that require high volume and precision – such as checking account balances at large, international banks – mainframes are likely to grow even more capable and remain the choice for enterprises. When processes get more complex and require shifting data between applications – training machine learning algorithms, for example – cloud may offer better functionality.
Whether to keep applications in mainframes or move them to the cloud will continue to be a complex question. While refactored applications can work more seamlessly with modern, cloud-native applications, the process of refactoring takes a lot of work. Many businesses instead choose to lift and shift, but that approach simply replicates existing roadblocks in the cloud. Then there’s the cost to consider. Legacy applications running on on-premises hardware may already be paid for and shifting those applications to the cloud could constitute new costs.
This doesn’t mean there’s no cost to keeping applications in a mainframe, however. Especially given the lack of skilled workers available, finding people to maintain these systems – or worse, respond in the case of an outage – could become very expensive. More than 90% of business leaders say it’s moderately or extremely difficult to acquire the right talent to maintain mainframes. And maintaining applications in an on-premises environment could carry the opportunity cost of causing businesses to miss out on the broader gains that come with digital transformation enabled by cloud technologies.
Dave Linthicum, chief cloud strategy officer at Deloitte Consulting LLP, says the pull of the cloud is strong today because it’s trendy and mainframes are generally seen as passé. And while cloud platforms are likely to offer advanced capabilities that are difficult to replicate in a mainframe environment, businesses should still carefully examine the business case rather than jumping into the cloud simply to be on the cutting edge.
“People manage by what they read in magazines,” Linthicum says. “They aren’t necessarily making decisions based on business requirements. They’re making emotional decisions based on where they think they should go. It may work if you spend a lot of money, but you may incur a million dollars more in operational costs because you move to a platform that is difficult to adjust to the needs of the business.”
Enterprises will have to weigh the costs and benefits of moving applications from mainframes to the cloud. They should evaluate what business needs have changed, and what opportunities exist in cloud versus mainframes to meet those needs. With more and more modern applications emerging that extend the functionality of the mainframe, it may not always make sense to throw out processes that are working simply in the name of modernization.
originally posted on deloitte.com