October 29, 2012 will forever be remembered as the day Hurricane Sandy made landfall in the U.S. What was by then a post-tropical cyclone arrived in New Jersey, with a storm surge that rapidly flooded New York City’s streets. It was notable in the financial services industry because, despite organizations’ disaster recovery plans, a huge amount of disruption ensued, costing the sector billions of dollars almost in the blink of an eye.
Why? These organizations had followed what was then best practice: back up your data so that if one data center failed, another nearby would take its place, keeping both data transaction latency and disaster recovery time to a minimum. Largely based in Manhattan, they held their data on both sides of the Hudson River. It wasn’t enough.
When the storm surge hit both sides of the river, it took out both primary and secondary data centers in New York State and New Jersey. The result was a force majeure incident and a costly lesson in data management we all thought we’d learned from Hurricane Katrina, seven years earlier.
The lesson? Best practice guidelines can still leave many enterprises adrift, especially now during the COVID-19 crisis, with its race to get data into the cloud. The sector wants to take advantage of computing flexibility – at low cost – while freeing itself from the crushing cost and management burden that legacy infrastructure and apps place on it.
As they rush to take advantage of the cloud and the flexibility of remote working, mistakes are being made and best practice is no longer the North Star it once was.
Between a rock and a hard place
According to Julien Courbe, Global FS Technology Leader at PwC (PDF report), “It is now becoming obvious that the accelerating pace of technological change is the most creative force – and also, the most destructive one – in the financial services ecosystem today.” Although he recommends embracing disruption, that’s still a grim warning to the financial services industry that the best intentions to migrate to the cloud can go awry.
The time for change in the financial services industry is here, and to borrow a line attributed to Winston Churchill, “never let a good crisis go to waste.” Many firms have taken this to heart and are using COVID-19 and the Work From Home (WFH) precepts that followed to equip their employees to meet the demands of the new WFH normal.
As Courbe says in his report, “Customers have had their expectations set by other industries; they are now demanding better services, seamless experiences regardless of channel, and more value for their money. Regulators demand more from the industry too, and have started to adopt new technologies that will revolutionize their ability to collect and analyse information. And the pace of change shows no signs of slowing.”
Indeed, it’s this pace of change that is causing some major issues. Even as it dawns on organizations that they will never fully return to office-based working, flaws in file sharing and collaboration – critical to customer service in the financial services sector – are emerging as networks are stress-tested and fail to deliver.
That’s because organizations tend to focus on giving users remote access to applications when they’re unable to come into the office, but put less focus on providing fast access to crucial data. The logical answer would seem to be to move data and workflows to the cloud, where they can be accessed from anywhere. However, these organizations often have several hundred homegrown applications – sometimes up to a couple of thousand – to migrate to the cloud.
Given the stark choice between remaining with the status quo and rewriting hundreds of applications for the cloud – with all the cost and disruption that involves – many firms have cooled on the idea of moving everything to the cloud. Of those organizations that have moved applications to the cloud, 74% have moved an app back after experiencing performance or security issues.
Surely, there’s a better way? Because data is the lifeblood of financial services, nothing should ever disrupt the critical path of data between organizations, their customers, and trading platforms worldwide. Then there’s data security to worry about: moving into private, public or hybrid clouds carries concerns, particularly where data connects directly to a financial value and contains a wealth of very private, highly regulated information.
However, with new technology, the choice to move to the cloud is no longer black or white. Financial services firms are moving to the cloud because the upsides, coupled with the danger of standing still, are providing the impetus; staying put has become the greater risk.
As they migrate to the cloud, the keys are data durability – ensuring stored data doesn’t become corrupted or inaccurate – together with data transaction speed, minimal latency, computational flexibility and data availability.
So, how do we combat uncertainty and ‘get there from here?’
Best practices for uncertain times
Even in these uncertain times, there are a number of best practice points that offer a tried and trusted way forward. For financial services organizations that want to move to the cloud as rapidly as possible, there are a number of worries: migrating apps that won’t run without being rewritten; security and regulatory concerns, including data sharing and ransomware; and a lack of immediate data consistency across locations, which makes collaboration virtually impossible.
In the face of these difficulties, we have the answers and here are our new best practice tips for organizations that want to get ahead without incurring unnecessary risk.
You can now migrate to the cloud rapidly
The biggest challenges for the financial sector in moving to the cloud are rewriting applications and achieving immediate data consistency. It’s classically a complex process, but it doesn’t need to be!
In reality, organizations can pursue a hybrid cloud migration model: migrate data to the cloud while leveraging on-premises filers for local processing power, preserving file services so that applications do not need to be rewritten as data moves gradually into the cloud. This means companies can move to the cloud right now – migrating the most critical applications first – while retaining resilience through data being stored in both a primary and a secondary data center.
Make your dual supplier solution fit your needs
Many financial services firms have dual supplier agreements, which offer resilience: data operations from one vendor can be switched over to another for disaster recovery and business continuity purposes. But failing over from one to the other can be expensive and disruptive, as data must be actively written to the alternative vendor. Often, by the time it’s written, customers and revenue have been lost.
New cloud mirroring technology allows enterprises to write data to two different cloud providers at the same time. This is an effective way of avoiding the cost, worry and disruption of dual supplier agreements while allowing core data to be backed up and usable from either of the two providers. With dual vendor support, cloud mirroring can enable automatic switchover without disruption in the case of a service outage, and business as usual even in chaotic circumstances.
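The pattern behind cloud mirroring can be sketched as follows. This is a minimal illustration, not any vendor’s SDK: `InMemoryStore`, `mirror_put` and `failover_get` are hypothetical stand-ins showing how the same write goes to two providers in parallel, and how a read fails over to whichever provider is still reachable.

```python
from concurrent.futures import ThreadPoolExecutor

class InMemoryStore:
    """Hypothetical stand-in for one cloud provider's object store."""
    def __init__(self, name):
        self.name = name
        self.objects = {}
        self.available = True

    def put(self, key, data):
        self.objects[key] = data

    def get(self, key):
        if not self.available:
            raise ConnectionError(f"{self.name} is unreachable")
        return self.objects[key]

def mirror_put(providers, key, data):
    # Issue the same write to every provider concurrently.
    with ThreadPoolExecutor(max_workers=len(providers)) as pool:
        list(pool.map(lambda p: p.put(key, data), providers))

def failover_get(providers, key):
    # Serve the read from the first provider that responds.
    for p in providers:
        try:
            return p.get(key)
        except ConnectionError:
            continue
    raise RuntimeError("all providers unavailable")

a, b = InMemoryStore("cloud-a"), InMemoryStore("cloud-b")
mirror_put([a, b], "trades/2020-04-01", b"EURUSD,1.0831")

a.available = False  # simulate an outage at provider A
restored = failover_get([a, b], "trades/2020-04-01")
```

Because both copies are written at the same time, the switchover needs no bulk re-write of data to the surviving vendor – the second copy already exists.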
Stay secure by using an immutable data architecture
Data encryption is nothing new, but the way it is administered by today’s cloud providers involves unnecessary risk. Because data is encrypted in the cloud, the provider holds the encryption keys, making the provider a single point of failure for enterprise cybersecurity. Solutions that allow enterprises to encrypt their own data locally at the edge of the network, before it enters the cloud, put cybersecurity responsibility back into the hands of enterprises.
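The encrypt-before-upload pattern can be sketched in a few lines. The cipher below is a deliberately toy keystream, for illustration only – a real deployment would use an authenticated cipher such as AES-GCM from a vetted cryptography library. The point is only the data flow: the key never leaves the premises, and the provider stores nothing but ciphertext.

```python
import hashlib
import itertools

def _keystream(key: bytes, length: int) -> bytes:
    # TOY keystream for illustration only -- use AES-GCM from a
    # vetted library in any real system.
    out = bytearray()
    for counter in itertools.count():
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        if len(out) >= length:
            return bytes(out[:length])

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice restores the plaintext.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

key = b"kept-on-premises-never-uploaded"   # stays at the network edge
record = b"account=12345;balance=9870.00"

cloud_bucket = {}                          # stand-in for the cloud provider
cloud_bucket["records/12345"] = xor_cipher(key, record)   # ciphertext only
```

The provider can store, replicate and serve `records/12345` without ever being able to read it; only the enterprise, holding `key`, can decrypt on download.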
In addition to encryption services, an immutable data architecture is a critical feature to protect against malware such as crypto lockers. An immutable data architecture means that all data is written as new immutable data blocks (Write Once, Read Many), so if ransomware attempts to encrypt corporate data, the existing data is unaffected. Reverting to an earlier, protected snapshot taken before the attack then neatly sidesteps the issue, making immutable data architectures highly resistant to ransomware and crypto lockers.
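The mechanism described above can be shown with a toy write-once store – a simplified sketch, not any product’s implementation. Data blocks are content-addressed and never overwritten; a “file” is just a pointer to a block, and a snapshot freezes the pointer table. A ransomware write can only create new blocks, so reverting the pointers recovers everything.

```python
import hashlib

class ImmutableStore:
    """Toy Write Once, Read Many (WORM) store with snapshots."""
    def __init__(self):
        self._blocks = {}      # sha256 -> bytes, append-only, never mutated
        self._files = {}       # path -> block hash (only pointers change)
        self._snapshots = []   # frozen copies of the pointer table

    def write(self, path, data):
        h = hashlib.sha256(data).hexdigest()
        self._blocks.setdefault(h, data)   # new data becomes a NEW block
        self._files[path] = h

    def read(self, path):
        return self._blocks[self._files[path]]

    def snapshot(self):
        self._snapshots.append(dict(self._files))
        return len(self._snapshots) - 1

    def revert(self, snap_id):
        self._files = dict(self._snapshots[snap_id])

store = ImmutableStore()
store.write("ledger.csv", b"2020-04-01,credit,500.00")
snap = store.snapshot()

# Ransomware "encrypts" the file -- but it can only add new blocks;
# the original block is untouched.
store.write("ledger.csv", b"\x8f\x02ransomed\x9c")

store.revert(snap)   # roll the pointers back to the protected snapshot
```

After the revert, `store.read("ledger.csv")` returns the original record, because the attack never touched the underlying block.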
Many of the cloud services offered today come with an embedded security solution, and while those offer protection, they can interfere with existing enterprise security policies. Better to have a cloud service that plugs into the existing enterprise security stack, giving businesses the flexibility to choose their own security solution rather than relying on an embedded one.
Use object storage
Unlike Block or File storage, Object storage adds comprehensive metadata to the file and eliminates the tiered structure used in file storage. It places everything into a flat address space, dramatically collapsing the traditional file storage hierarchy. This means that data stored as Objects is much more extensible, can be retrieved in parallel to offset latency in the cloud, and is less costly to store.
Data stored as Objects also has greater durability and is less susceptible to corruption or data rot over time. That’s why financial services organizations are rushing to take advantage of Object storage: record keeping is vital. The speed of data retrieval is also key, as each millisecond can represent a change in the financial value of a transaction, and a delay can leave the organization bridging the delta between a higher and a lower share price, which is unacceptable. The quicker a transaction is completed, the better it is for the organization and its customers.
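The two properties above – a flat namespace with per-object metadata, and parallel retrieval to offset cloud latency – can be sketched as follows. The bucket, key scheme and metadata fields are hypothetical examples, not any provider’s API.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy flat-namespace object store: every object lives at a single key and
# carries its own metadata -- no directory hierarchy to traverse.
bucket = {
    f"trades/2020/04/{i:04d}": (f"trade-record-{i}".encode(),
                                {"content-type": "text/csv", "seq": i})
    for i in range(200)
}

def get_object(key):
    data, metadata = bucket[key]
    return key, data, metadata

# Issue many GETs concurrently, so per-request latency to the cloud
# overlaps instead of accumulating request by request.
with ThreadPoolExecutor(max_workers=16) as pool:
    results = list(pool.map(get_object, sorted(bucket)))
```

With sixteen requests in flight at once, total retrieval time approaches the latency of the slowest request rather than the sum of all of them – which is what makes Object storage practical for latency-sensitive financial workloads.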
Tie your investment in cloud infrastructure to the benefits of new ways of working
Investing in the cloud clearly brings a whole host of IT benefits from new data infrastructure and architectures. But with integrated global file services making data ‘present’ – easily accessible – wherever an employee is working from, real-time collaboration on files becomes a reality. Whether it’s personnel accessing the same data from multiple locations or applications accessing data from multiple data centers, the focus is on data durability and increased productivity.