Unlocking Data From The Legacy Stack In Federal IT
Unix-based systems. Code bases written in COBOL. End-of-life software. The ubiquity of legacy IT systems in the federal government has become a fact of life. Federal agencies saved approximately $2.8 billion between 2011 and 2015 by consolidating data centers, but consolidation is no longer enough. Civilian agencies still spend over $36 billion (about 71 percent) of their IT budgets maintaining legacy IT, driving the need for more aggressive action. And if cost alone is not reason enough, there's the issue of national security. Last month, the Obama administration proposed a $3.1 billion IT modernization fund to improve U.S. cybersecurity and trim data center costs by retiring, replacing, and modernizing legacy systems.
But historically, federal IT modernization projects have had a less-than-favorable track record. The IRS, for instance, has failed to modernize its master system of taxpayer history not once but twice; the latter attempt was a failed $4 billion investment that could have caused a government shutdown had it gone worse. More recently, the Department of Agriculture's project to modernize its agricultural systems has failed to deliver 80 percent of planned functionality and has doubled its initial cost estimate of $330 million.

Federal modernization projects fail for a variety of reasons. One is that IT is inherently interconnected, across systems, data centers, and even agencies, making it difficult to simply unplug and replace a single component. The longer a system is around, the more entangled it becomes in the legacy web.

At the center of the mix is data. Many applications and users require access to the same data. Data is not only the lifeblood of applications but also the connective tissue between systems, data centers, and clouds. More often than not, however, data is a primary source of friction. The complex, manual process of provisioning data to test environments takes days or weeks, placing a strain on operations teams. Despite noble efforts, developers and testers still lack access to the data they need when they need it. The reality is that data is trapped in legacy systems, creating a major bottleneck that prevents many federal IT organizations from completing modernization projects on time and on budget.

Every modernization project should have a proper data strategy, one that decouples data from legacy systems, automates manual processes, and curbs the rapid proliferation of data. And as insider and outsider threats continue to rise, so will the importance of data security and privacy. To securely modernize systems, government agencies must look to new, innovative technologies to automate data management.
Having the proper tools and strategy in place for data will be critical to running IT like a well-oiled machine. For more information on the challenges agencies face in application development and cloud migration, and how secure virtual data can help agencies do more with less, check out a webinar here.