This is a guest post by Tom Austin from Redgate Software, a Coeo partner.
The financial services industry is heavily regulated, complicated, and challenging in IT terms. Yet it’s also the industry that has the most to gain from DevOps initiatives.
Let’s start by trying to define DevOps. I’ve seen 101 different presentations and heard 101 ways of describing it, but I like the one from Donovan Brown, Principal DevOps Manager at Microsoft:
“DevOps is the union of people, process, and products to enable continuous delivery of value to our end users.”
Unfortunately, it’s not something you can just buy or decide to do tomorrow. Instead, it’s a shift that needs the right guidance to become reality. While there are significant challenges and costs to adopting DevOps in the financial industry, the benefits are too great to ignore – as are the risks of not delivering value quickly enough and losing customers to competitors or new fintech disruptors. So what are the drivers for adopting DevOps in financial services, how can you include the database, and how can you assess your database DevOps maturity against your peers?
DevOps drivers in financial services
As the 2017 State of DevOps Report from DORA and Puppet shows, there are big advantages for companies and organizations that embrace DevOps. They typically deploy changes, updates, and improvements 46 times more frequently, their change failure rate is 5 times lower, and when failures do occur they recover 96 times faster. In the financial services sector specifically, DevOps enables companies to perform better in three key areas.
- Increasing the speed of delivery: Financial services companies are under increasing pressure to release software faster. Whether it’s new entrants to the market such as mobile-only banks, the likes of Apple and Google entering the mobile payments space, or increased investment in fintech start-ups, change is afoot. Adopting DevOps practices has been proven to significantly increase the speed of delivery, with high IT performers deploying multiple times per day, and low performers deploying once a week or even once a month.
- Reducing downtime: In its 2015 report, DevOps and the Cost of Downtime, IDC calculated that infrastructure failures cost large enterprises an average of $100,000 per hour. The 2016 Cost of Data Center Outages report from the Ponemon Institute goes further, indicating that the cost of unplanned outages in the financial services industry is the highest of any business sector, and more than double that of the public sector. In an ever more competitive industry, today’s financial institutions can’t afford these costly outages, especially when DevOps practices have been proven to significantly reduce downtime: as mentioned above, the mean time to recovery (MTTR) of high IT performers is 96 times faster than that of low performers.
- Improving compliance: The financial services industry is one of the most highly regulated sectors in the world. While introducing DevOps may at first appear to be the antithesis of such regulation, the opposite is true. DevOps practices allow for greater risk management, with small, iterative changes being thoroughly tested by processes like continuous integration. This in turn leads to levels of confidence far higher than the traditional software development cycle can offer. The 2017 State of DevOps Report also found that high performers spend 50% less time remediating security issues than low performers.
Including the database in DevOps
DevOps is about changing the culture of software development and improving collaboration between development and operations teams. But it’s also about automating many of the common jobs in delivering software, such as source control, testing, compliance and security checks, and deployments. With the automation in place, a process is established that is now common in application development: development progresses from source control through continuous integration to release management before changes are deployed. At each stage, the changes are checked and tested so that errors are picked up earlier in the cycle and software releases are both faster and more reliable.
Databases, however, are more problematic because business-critical data needs to be safely and correctly preserved. In addition, there are specific challenges in financial services, such as extremely complex systems, legacy databases, and siloed departments. Tools and processes have now been introduced, however, that allow databases to be developed alongside applications by plugging into and integrating with the systems and infrastructure already in place.
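To make that flow a little more concrete, here is a minimal sketch of the kind of pipeline step involved, written in Python against an in-memory SQLite database. The migrations folder, the accounts table, and the stage functions are hypothetical stand-ins, not a description of any particular toolchain; a real pipeline would be driven by a CI server and purpose-built database tooling rather than a hand-rolled script.

```python
# A minimal sketch of the pipeline stages described above, assuming a
# hypothetical migrations/ folder of versioned SQL scripts held in
# source control. A real pipeline would be driven by a CI server and
# dedicated database tooling, not a hand-rolled script.
import glob
import sqlite3


def build(db: sqlite3.Connection) -> None:
    """Build a throwaway database by applying every versioned
    migration script from source control, in order."""
    for path in sorted(glob.glob("migrations/*.sql")):
        with open(path) as script:
            db.executescript(script.read())


def test(db: sqlite3.Connection) -> None:
    """Run automated checks against the freshly built schema
    (the accounts table is a hypothetical example)."""
    tables = {row[0] for row in db.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")}
    assert "accounts" in tables, "expected table is missing from the build"


def release(db: sqlite3.Connection) -> None:
    """Hand validated changes on to release management; packaging
    and approval gates would live here."""
    print("Database changes validated and ready for deployment")


if __name__ == "__main__":
    # Every stage runs against a disposable database, so errors surface
    # here rather than during deployment to production.
    connection = sqlite3.connect(":memory:")
    for stage in (build, test, release):
        stage(connection)
```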
With this approach, rather than database development being separate from that of the application and managed at the very end by a siloed team, it becomes an integral and natural part of the whole development process. This is a real advantage for companies and institutions where, typically, the database has been a bottleneck. Because the application and database are developed and tested together, errors or potential issues are highlighted much earlier in the development process, avoiding problems when changes are deployed. Compare this to the conversations I’ve had with many DBAs who are required to review thousands of lines of script when it comes to deploying database changes; that can take days, depending on how many errors they find. By committing database changes to source control on a regular basis, you can introduce automated builds and tests to make sure that all of those small units of change are tested and validated multiple times before you are ready to deploy to your next environment. This results in releases that are more reliable and less time-consuming, and it also means you can respond to change a lot faster.
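As an illustration of what one of those automated tests might look like, here is a hedged sketch that applies a single small schema change to a disposable copy of test data and confirms that existing rows survive. The accounts table, the currency column, and the use of SQLite are assumptions made for the example, not a prescription for any particular tool.

```python
# A sketch of one automated check: apply a small, versioned schema
# change to a disposable copy of test data and confirm existing rows
# survive. Table and column names are hypothetical.
import sqlite3


def apply_and_verify(migration_sql: str) -> None:
    db = sqlite3.connect(":memory:")
    # Stand-in for a restored copy of production-like test data.
    db.executescript("""
        CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL);
        INSERT INTO accounts (id, balance) VALUES (1, 100.0), (2, 250.5);
    """)
    rows_before = db.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]

    # The small unit of change under test, exactly as committed to
    # source control.
    db.executescript(migration_sql)

    rows_after = db.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
    assert rows_after == rows_before, "migration lost existing rows"
    db.execute("SELECT currency FROM accounts")  # new column must exist


if __name__ == "__main__":
    apply_and_verify(
        "ALTER TABLE accounts ADD COLUMN currency TEXT NOT NULL DEFAULT 'GBP';"
    )
    print("Migration validated against test data")
```

Because checks like this run on every commit, a mistake in a database change fails the build within minutes instead of surfacing days later in a manual script review.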
This article is based on a series of posts from the Redgate blog about the leading DevOps drivers in financial services. To find out more, please visit their blog.