It has been almost twenty years since Salesforce first introduced the concept of cloud computing to the business world. Despite its revolutionary concept, or maybe because of it, adoption of Salesforce and cloud computing in general was initially slow. Today, cloud computing is so commonplace that even the mortgage industry, a notoriously slow adopter of technology, has fully embraced it. But in order to truly exploit its potential, a little self-reflection is required: how well do you really know cloud computing?
The Industry’s Indispensable Shift to Cloud Computing
In the early days of cloud computing, software vendors broadcast their applications over the internet, allowing multiple users to access them through “virtualization.” In this model, a single copy of the application resides on a central server and lightweight “thin client” software is installed on each user’s machine. Virtualization was a big hit.
IT managers loved virtualization because there was only one copy of the software to maintain. Business owners loved it because their IT costs were drastically reduced. Software vendors especially loved virtualization because they could “spin up” their old, on-premise software into a cloud subscription service with minimal cost and effort.
But in reality, virtualization represents an incremental shift in technological innovation. Other than giving users convenient access and their IT teams more free time, the core technology behind virtualized software does not change at all. It’s not as if “going to the cloud” turns legacy software into an entirely new application. In fact, you could take an 8-bit version of VisiCalc, broadcast it through a server and call it a cloud computing system.
The Value Added for Vendors in a Multi-tenancy Solution
To realize the true value of cloud computing, it’s important to understand the way an application’s underlying database is structured. In a virtualized environment, each client runs its own separate, independent instance of the software code and its database. Any upgrades, fixes or integrations to the software must be installed separately in each instance. This is known as single-tenancy.
In contrast, a multi-tenant database model is truly transformative because the software code and data reside in a single, unified database. Every user across multiple clients is using the same application simultaneously. Therefore, distributing updates, fixes or integrations requires virtually zero effort from the vendor because all of their users are in the same environment concurrently.
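The difference between the two models can be sketched in a few lines of code. The following is a minimal illustration, not any vendor’s actual schema: in a multi-tenant design, every client’s records live in one shared table, separated only by a `tenant_id` column, so a single schema change reaches every client at once. The table and client names are hypothetical.

```python
import sqlite3

# One shared database for all clients -- the essence of multi-tenancy.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE loans (
        tenant_id TEXT NOT NULL,   -- which client owns this row
        loan_id   INTEGER,
        amount    REAL
    )
""")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?, ?)",
    [("lender_a", 1, 250000.0),
     ("lender_b", 1, 410000.0),
     ("lender_a", 2, 180000.0)],
)

# Each client sees only its own rows by filtering on tenant_id.
rows = conn.execute(
    "SELECT loan_id, amount FROM loans WHERE tenant_id = ?", ("lender_a",)
).fetchall()
print(rows)  # [(1, 250000.0), (2, 180000.0)]

# A schema change is applied once and instantly reaches every tenant --
# the "single upgrade" advantage, versus repeating it per instance
# in a single-tenant deployment.
conn.execute("ALTER TABLE loans ADD COLUMN status TEXT DEFAULT 'open'")
```

In the single-tenant model, by contrast, the `CREATE TABLE` and `ALTER TABLE` steps would have to be repeated once per client database, which is exactly the maintenance burden the article describes.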
One of the main advantages of multi-tenancy is the ability to scale an application extremely efficiently. It is similar to fixing a centralized furnace in a 100-unit apartment building versus fixing each furnace in 100 single family homes. Multi-tenancy also ensures that every client is using the same exact version of the application and has access to the same integrations.
Beyond efficient scaling, vendors can become active participants in the implementation, configuration and end-user operation of the system because they have direct access to a client’s system and associated data. The old “batteries not included” paradigm of software delivery gives way to a white-glove approach, where vendors embed consultative services into their product offering.
Big Data: The Next Level of Cloud-Computing
However, these advantages pale in comparison to the true potential of cloud computing: Big Data. In a multi-tenant system, all the data generated by every client resides in one database. This means that data aggregation and normalization become non-issues, allowing for analysis and forecasting insight that can approach clairvoyant levels.
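The claim that aggregation becomes a non-issue can be made concrete with a small sketch. Assuming the kind of shared schema described above (the table, column and client names are illustrative), a benchmark across the entire customer base is a single query, with no per-client export or normalization step:

```python
import sqlite3

# Hypothetical shared table: every tenant's loans, one schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE loans (tenant_id TEXT, amount REAL, days_to_close INTEGER)"
)
conn.executemany(
    "INSERT INTO loans VALUES (?, ?, ?)",
    [("lender_a", 250000.0, 41),
     ("lender_b", 410000.0, 38),
     ("lender_c", 180000.0, 45),
     ("lender_a", 320000.0, 40)],
)

# Because all tenants share one schema, an industry-wide metric is
# one aggregate query rather than a data-integration project.
avg_days, = conn.execute("SELECT AVG(days_to_close) FROM loans").fetchone()
print(round(avg_days, 1))  # 41.0
```

In a single-tenant world, the same answer would require exporting data from every client instance, reconciling schema differences, and merging the results before any analysis could begin.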
Imagine being able to determine the precise cost of originating any given loan scenario, or the secondary-market gains you could capture by knowing the exact date every loan in your pipeline will close. This level of predictability can only be achieved using big data, and big data can only be gathered in a cloud solution with a multi-tenant database architecture.
Is big data making an impact in the mortgage industry? Not yet. Big data is still in an evolutionary phase. Even industries with mature cloud computing capabilities are only beginning to scratch the surface of big data. But as we’ve seen before, technological advancements happen quickly. Moving to a multi-tenant cloud computing model is the first step towards big data.
While transforming a single-tenant database into a multi-tenant one takes time and money, it is important for vendors to begin evaluating its place in their organizations now. Legacy software vendors have no choice but to start from scratch, a monumental task, but one that is manageable with the appropriate planning.
Vendors should consider a few different things when planning to adopt a multi-tenant database. First, how well prepared is their organization to support the business model? And second, can the vendor deal with client expectations that are completely different from their legacy model? Gone are the days of delivering pre-packaged software (including virtualized software) and expecting an IT professional to install and maintain it. Vendors should note that configurability, not customization, is what streamlines the process while still delivering results: a multi-tenant solution serves every client from one shared system, just like the furnace analogy mentioned previously.
Twenty years is a long time. Salesforce and other “true” cloud computing vendors have already experienced and adjusted to the growing pains of delivering software in a multi-tenant environment. Legacy software vendors who are just entering multi-tenancy – even large, well-funded vendors – will discover how difficult it is to bridge the gap from their single-tenant virtualization model. Virtualization might be a shortcut to cloud computing, but multi-tenancy is the only path to a big data future.