The right solution can reduce costs and complexity, while enhancing security and operating efficiency, particularly in integrating key systems.
If you run a bank and have a controlling, hands-on disposition, the idea of software as a service (SaaS) – storing something as vital as your data and the means to analyze and report it off your premises and on someone else’s servers – must be hard to get used to. But the trend is robust and should be embraced. In all meaningful respects, institutions that adopt a SaaS solution maintain tighter control over their data than they may realize, and they accrue a range of operational and financial benefits that may be out of reach for firms that keep their data closer at hand.
The right SaaS solution is likely to be cheaper, simpler and safer to maintain and run. It can enhance efficiency, most notably in the integration of key information systems covering accounting, loan origination, customer relationship management, risk management, and regulatory reporting. Running everything in house through your own architecture, by contrast, can hinder productivity, while creating the illusion of control more than control itself.
The urge to keep data systems at your fingertips is understandable. Data is one of a financial institution’s most important assets; it must be protected from prying eyes and made available for all manner of analyses and regulatory oversight. And bankers have a familiarity with, and maybe an odd affection for, their existing systems. These systems do an adequate job, at least as the job was defined whenever they were installed. Combined with an understandable wariness of novelty, the desire to stick with what you have can be compelling. Or as cynics through the ages have put it, better the devil you know.
The limits to an in-house solution
The limitations of in-house data infrastructure are revealed when its components are required to work with one another, and to interface with supervisors’ systems, as they increasingly must these days.
Take the mundane case of a prospective client who visits your website to apply for a credit card. In line with the custom in your market, you need to provide a yes or a no, and the credit limit if the answer is yes, within a minute or so of the applicant entering some basic personal details. If a card is approved, your customer relationship management program will have to inform your accounting, risk management and regulatory reporting programs that the bank has extended, say, a $5,000 revolving credit line, ideally before your new cardholder has booked that dream vacation in Sardinia.
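To make that coordination concrete, here is a minimal sketch, in Python, of the fan-out a single approval triggers. The event fields, system names and `broadcast` helper are invented for illustration; they do not describe any particular vendor's interface.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class CardApproval:
    """Event the CRM emits when a new credit card is approved."""
    customer_id: str
    credit_limit: float
    currency: str
    approved_at: datetime

class Ledger:
    def on_approval(self, e: CardApproval) -> None:
        print(f"accounting: booked {e.currency} {e.credit_limit:,.2f} line for {e.customer_id}")

class RiskEngine:
    def on_approval(self, e: CardApproval) -> None:
        print(f"risk: exposure updated for {e.customer_id}")

class RegReporter:
    def on_approval(self, e: CardApproval) -> None:
        print(f"reporting: new facility for {e.customer_id} queued for next submission")

def broadcast(event: CardApproval, subscribers: list) -> None:
    # The approval must reach every downstream system before the
    # cardholder books that vacation in Sardinia.
    for s in subscribers:
        s.on_approval(event)

broadcast(
    CardApproval("C-1042", 5000.00, "USD", datetime.now(timezone.utc)),
    [Ledger(), RiskEngine(), RegReporter()],
)
```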
Your existing tech setup may be ill-prepared for this task, which is not as simple as it seems on the surface. Chances are your systems were installed piecemeal, over a long span of time, each component intended to satisfy a single, discrete need, and to do it with a different data taxonomy, dictionary and usage policy. When it comes time for one bit of software to communicate with others and transfer data, such as details of your new cardholder’s account, you may find that the structure you have built resembles the Tower of Babel.
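A toy example of that Babel, with invented field names: the same $5,000 line rendered three different ways, and one of the many pairwise translation maps an in-house shop ends up writing and maintaining.

```python
# The same $5,000 credit line as three hypothetical legacy systems see it.
crm_record        = {"cust_no": "C-1042", "limit_usd": 5000.0}
accounting_record = {"party_id": "00C1042", "facility_amt": "5,000.00", "ccy": "840"}
risk_record       = {"counterparty": "C-1042", "exposure": 5000.0, "currency": "USD"}

# Moving data between any two of them means writing, and forever
# maintaining, a translation map like this one.
def crm_to_accounting(rec: dict) -> dict:
    return {
        "party_id": "00" + rec["cust_no"].replace("-", ""),
        "facility_amt": f"{rec['limit_usd']:,.2f}",
        "ccy": "840",  # ISO 4217 numeric code for US dollars
    }

print(crm_to_accounting(crm_record))  # matches accounting_record
```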
And the lengthy process of assembling your data systems may have left them stuck in the past, unable to keep up with greater demands for connectivity – pushing more data, faster, to more internal and external destinations. This trend has been driven by two sources of change: competitive pressures that give you seconds to let your prospective customer know if he can use your money for that vacation, and a priority among regulators to automate submissions so they can be delivered faster, more often and in greater detail.
Create a new solution, or use one that is already here
If bankers could go back in time, they might build new data infrastructure from the ground up. It would feature a single, fully integrated platform covering risk, analytics of all sorts and regulatory reporting. Common languages and scripts. No silos. Everyone on the same page, using the same systems. No reconciliation of data from multiple sources and dictionaries required.
If violating the laws of physics is a lot to ask, an institution can always start from scratch in the here and now, learn from past mistakes and install something new and better. An easier and more useful alternative is to go with a SaaS solution that has a fully integrated architecture run on secure servers. Such a solution, when designed well, will include all necessary programs, updated as soon as new versions are released, along with the requisite licenses, and it will be housed on servers owned by industry leaders like Amazon and Microsoft.
If that also seems like a lot to ask, well, it is. But it is doable – if you can be flexible about how flexible you can be. Managing your data your way on your tech may be satisfying, but it may slow you down and create barriers to communication in an increasingly chatty world. Just as a train will not get far if each rail network it runs on uses a different gauge of track, information will have trouble getting where it needs to go if different users perform the same task on it in unique ways.
A standardized solution that is just flexible enough
The ideal SaaS solution, when configured effectively, will thread a slender needle. It will feature a common platform with a suite of programs that execute all necessary functions within and among key systems, while being tailored in limited but important ways to individual clients’ needs, mainly at the point where the solution connects with a client’s legacy systems. This limited but targeted flexibility can be achieved through adapters and connectors built on application programming interfaces (APIs), which serve as switches that get a user’s information rumbling down the right track to its destination.
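Sketched in code, the pattern might look like the following; the class and field names are hypothetical, and the point is simply that client-specific logic is confined to a thin adapter at the boundary while everything behind it stays standardized.

```python
from abc import ABC, abstractmethod

# The platform's single canonical record format (illustrative).
CanonicalAccount = dict

class LegacyAdapter(ABC):
    """One adapter per legacy system: the only client-specific code."""
    @abstractmethod
    def to_canonical(self, raw: dict) -> CanonicalAccount: ...

class LegacyCrmAdapter(LegacyAdapter):
    # Translates a hypothetical legacy CRM export into the common
    # format; downstream of this point, every client looks the same.
    def to_canonical(self, raw: dict) -> CanonicalAccount:
        return {
            "customer_id": raw["cust_no"],
            "credit_limit": raw["limit_usd"],
            "currency": "USD",
        }

record = LegacyCrmAdapter().to_canonical({"cust_no": "C-1042", "limit_usd": 5000.0})
print(record)  # {'customer_id': 'C-1042', 'credit_limit': 5000.0, 'currency': 'USD'}
```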
Another key element is a modular design that furnishes access only to features that meet a client’s particular needs, and that can be modified as those needs change. The objective is a turnkey solution that makes it easy for an institution to forgo some of the flexibility it has become accustomed to in return for operational simplicity and enhanced functionality. When that is accomplished, a benefit emerges that furnishes users with control of another, critical sort: control over costs. The licensing fees in a SaaS contract are recurring, known expenses, whereas the costs associated with owned architecture are potentially open-ended capital outlays.
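As a sketch of what such modularity could look like in configuration terms (the module names, and the FINREP/COREP regimes, are chosen purely for illustration):

```python
# Hypothetical tenant configuration: the client pays for, and sees,
# only the modules it needs, and the list can change as needs do.
tenant_config = {
    "institution": "Example Bank",
    "modules": {
        "credit_risk": True,
        "liquidity_risk": False,   # can be switched on later
        "regulatory_reporting": {"enabled": True, "regimes": ["FINREP", "COREP"]},
    },
}

def enabled_modules(cfg: dict) -> list[str]:
    """Return the module names switched on for this tenant."""
    out = []
    for name, setting in cfg["modules"].items():
        if setting is True or (isinstance(setting, dict) and setting.get("enabled")):
            out.append(name)
    return out

print(enabled_modules(tenant_config))  # ['credit_risk', 'regulatory_reporting']
```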
Costs are likely to be lower, too, for SaaS clients. The specialists who maintain the systems are likely to be more efficient at it than in-house tech departments, which, with this work outsourced, can be smaller and require less training. Offering a service that is standardized in important ways, moreover, creates economies of scale for service providers, which they pass on to customers in this competitive marketplace. Achieving scale is acutely important for SaaS providers, and therefore immensely beneficial to them and their clients, because their solutions must be equipped with a multitude of programs to meet the needs of clients engaged in a variety of activities across many jurisdictions, as well as the increasing demands of financial supervisors.
When shopping around for a SaaS provider, therefore, it is essential to go with a large firm that can make the most of the economies of scale that SaaS affords, and that has the expertise that comes from operating in many markets and serving customers involved in a range of activities.
Familiarity with supervisory authorities and competitive dynamics in diverse markets will help in implementing a solution and in the continual updating of software and licenses that is a fact of modern life for a financial institution. Ultimately the best way to control your data and how you use it is to make the right decision about who gets to manage it.