Each day, thousands of a bank’s clients perform financial transactions: making deposits, withdrawing cash, moving money between accounts, and so on. Banks must also run daily calculations required for regulatory compliance, including monitoring assets and reporting key metrics. All told, a bank today performs an ever-growing number of daily transactions and calculations that reaches into the millions, a burden that is increasingly difficult for traditional systems to bear.
Banking software designed in the latter half of the 20th century is a poor fit for today’s digital, real-time era, yet it remains prevalent. Bank personnel might need a report to be available online immediately, not at the end of the week. A banking customer may need to know the exact balance of each of her accounts at a given moment, but the bank’s systems may not be able to deliver that information with real-time accuracy. That customer would expect to see the same balance whether she uses a mobile banking app, an ATM, or the bank’s call center; however, because these channels often run on three siloed pieces of software from different vendors, this may not be the case.
The banking industry’s move toward real-time processing across channels requires rethinking how legacy core systems are designed and deployed. In many cases, legacy solutions based on mainframe computing are inadequate for tomorrow’s banking needs. What will be needed in the near future are faster systems that can handle and store large volumes of structured and unstructured data, and that can be deployed across institutions’ channels and lines of business.
Overview by Ed O’Brien, Director, Banking Channels Advisory Service at Mercator Advisory Group