Data Analytics


What a surprise! Here I am writing about data again. Data collection has become much more effective with the automation and digitization of the retail mortgage industry’s business processes. The downside is we are overwhelmed by the sheer volume of data, both structured and unstructured.

The development of MISMO’s Logical Data Dictionary over the last ten years has played a large role in reducing both the time and effort in connecting the numerous entities involved in financial transactions. As more organizations adopt this data standard, it will significantly increase the electronic exchange of data.

So what does this mean for the mortgage industry? Let's start with an Information Management article by William McKnight that emphasized that data must have true quality across the enterprise. McKnight's theory regarding data quality centers on the following observations:

The volume of available information is exploding. This includes internal data, data from channels, and third-party sources.

Our current business environment operates in real time. Opportunities are lost when decisions and actions are delayed.

It has never been more true that information is a key business asset, regardless of your industry. An organization's ability to manage its information is a powerful competitive differentiator.

McKnight further asserts that enterprise data lacks standardization, which handicaps our ability to project how business will be conducted in the next decade, and that intra-organization cooperation on data quality is key to an organization’s success. The interesting point is that this article is now over five years old.


Recently, I read the article "Top Three Mortgage Quality Control Trends You Need to Know for 2015" by ACES Risk Management (ARMCO). With their permission, I have extracted some key points from that article. "2015 is shaping up as another year of significant changes in the mortgage industry. Increased regulatory scrutiny continues to raise the stakes for mortgage lenders' operations. As a result, quality control will be front and center among lenders' enterprise risk management strategies and truly mission-critical to their success."

1: The Changing Nature of Risk for Mortgage Professionals. “There has been significant reduction in underwriting and credit quality risk in recent years. Although troubling for lending volume, QM and QRM have made it difficult for unqualified consumers to obtain mortgage loans, without resorting to outright fraud. So where is the risk now?”

2: Mortgage Lender Accountability and Responsibility. “Regulators have made it clear through various guidelines that outsourcing, contracting, subcontracting, or even eliminating certain functions does not reduce a lender’s or servicer’s accountability for those functions.”

3: 2015: The Year of Big Data. “In thinking about big data in the mortgage industry, there are two types to consider – loan data and process data. When lending risk resided in underwriting, loan data became critically important. We’ve now reached a point where underwriting standards are so high that the risk of making a flawed decision based on bad consumer information is relatively minimal for the typical lender. For example, loan quality control defect rates continue to be as high as 10% and more at certain lenders. Typically, these are not loans that are in danger of default because of the borrower’s inability to pay. Their defects rest in the manufacturing process, and the largest single category of those defects, according to our client data, is missing documentation.” (For more information, visit www.armco.us)
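To make the loan-data versus process-data distinction concrete, here is a minimal sketch in Python of how a lender might tally QC defect rates by category from post-closing review results. The records, loan IDs, and category names are hypothetical, invented purely for illustration; real QC data would come from a lender's own review systems.

from collections import Counter

# Hypothetical post-closing QC results: (loan ID, defect category); None means a clean file
qc_reviews = [
    ("A100", None), ("A101", "missing documentation"), ("A102", None),
    ("A103", "missing documentation"), ("A104", "data entry error"),
    ("A105", None), ("A106", None), ("A107", "missing documentation"),
    ("A108", None), ("A109", None),
]

total = len(qc_reviews)
defects = Counter(cat for _, cat in qc_reviews if cat is not None)

# Overall rate, then a breakdown by defect category (largest first)
print(f"Overall defect rate: {sum(defects.values()) / total:.0%}")  # 40% in this toy sample
for category, count in defects.most_common():
    print(f"  {category}: {count / total:.0%}")

Even a toy breakdown like this makes the point: the risk signal is in the manufacturing process, not the borrower's credit file.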

Today, the buzz is all about Big Data, defined as extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions. So, let’s explore this concept a little further and look at some definitions and clarifications.

Analytics vs. Analysis: You can’t talk about Big Data without talking about analysis and analytics. Analysis requires a breakdown of data into its constituent elements, along with an identification of metadata that provides context and supports meaningful information taxonomy. Analytics can refer to the methodology for conducting the analysis, but it also defines the output of the analysis: analytics are the meaningful information derived from data analysis. While analysis is a study of the past, analytics supplies a model that can be used to predict future results.
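As a rough illustration of that distinction, the short Python sketch below first performs analysis, summarizing what already happened, and then analytics, fitting a simple least-squares trend line to project the next period. The monthly loan-volume figures and the linear model are assumptions chosen for demonstration only, not a prescribed forecasting method.

from statistics import mean

# Hypothetical monthly funded-loan volumes; analysis looks backward at these
volumes = [120, 135, 128, 150, 162, 158]

# Analysis: break the data down and describe the past
print(f"Mean: {mean(volumes):.1f}, min: {min(volumes)}, max: {max(volumes)}")

# Analytics: fit a simple trend line and project the next month
months = list(range(len(volumes)))
m_bar, v_bar = mean(months), mean(volumes)
slope = (
    sum((m - m_bar) * (v - v_bar) for m, v in zip(months, volumes))
    / sum((m - m_bar) ** 2 for m in months)
)
intercept = v_bar - slope * m_bar
print(f"Projected next month: {slope * len(volumes) + intercept:.0f} loans")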

Analytics vs. Business Intelligence: Business intelligence, or BI, is a technology-driven process for analyzing data and presenting actionable information to help organizations make better business decisions. Depending on the organization and the industry, BI may draw upon a wide range of tools and methodologies to collect data from internal and external sources, prepare it for queries and analysis, and ultimately deliver reports of analytical results to decision makers. Tom Davenport wrote, "I think of analytics as a subset of BI based on statistics, prediction and optimization. The great bulk of BI is much more focused on reporting capabilities. Analytics has become a sexier term to use."
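To show the reporting flavor of BI that Davenport describes, here is a small Python sketch that rolls hypothetical loan records up into a summary by origination channel. The field names and dollar figures are invented for illustration; the point is that this aggregates and presents, rather than predicts.

from collections import defaultdict

# Hypothetical loan records drawn from internal sources
loans = [
    {"channel": "retail", "amount": 250_000},
    {"channel": "wholesale", "amount": 310_000},
    {"channel": "retail", "amount": 180_000},
    {"channel": "correspondent", "amount": 275_000},
    {"channel": "wholesale", "amount": 295_000},
]

# BI-style report: count and dollar volume per channel
totals = defaultdict(lambda: [0, 0])
for loan in loans:
    totals[loan["channel"]][0] += 1
    totals[loan["channel"]][1] += loan["amount"]

for channel, (count, volume) in sorted(totals.items()):
    print(f"{channel:>14}: {count} loans, ${volume:,}")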

Why is this relevant? In Tom Davenport's book, Analytics at Work, he reports that "two-thirds of large US companies researched believe they need to improve their enterprise analytics capabilities" and "nearly three quarters said they are working to increase their company's business analytic usage." Some recent articles include these statistics: 89% of US businesses are investing in data and data analytics, and 85% of CEOs say that digital technologies related to data and data analysis are creating high value for their organizations. According to Attivio, 90% of an organization's data is hidden in silos that aren't known or can't be reached.

Broken down by industry, the Big Data Analytics Survey by Peer Research showed this initiative at 43% in banking, 14% in technology, and 9% in consumer sectors. The key results were:

>> Better decision making (49%)

>> Better enablement of key strategic initiatives (16%)

>> Better relationships with customers (10%)

>> Better sense of risk (9%)

>> Better financial performance (9%)

>> Others (6%)

I believe it is finally time for the lending community to take full control of its data, and that will be difficult if lenders don't have an in-depth understanding of what that means and all that it entails.

Having control of your data across your entire enterprise will provide the following:

1.) A better understanding of your customers' needs.

2.) More efficient processes.

3.) Reduced risk, through proactively recognizing and addressing vulnerable areas.

4.) And finally, reduced costs, which is always an objective.

If your organization doesn't embrace this, you can rest assured your competitors will. By the time you realize that, it may be too late.

About The Author

Roger Gudobba
Roger Gudobba is passionate about the importance of quality data and its role in improving the mortgage process. He is vice president, mortgage markets at Compliance Systems and chief executive officer at PROGRESS in Lending Association. Roger has over 30 years of mortgage experience and has been an active participant in the Mortgage Industry Standards Maintenance Organization (MISMO) for 17 years. He was a Mortgage Banking Technology All-Star in 2005. He was the recipient of Mortgage Technology Magazine's Steve Fraser Visionary Award in 2004 and the Lasting Impact Award in 2008. Roger can be reached at rgudobba@compliancesystems.com.