News reports suggest cautious optimism ahead for 2014. Unemployment has fallen to pre-recession levels, stock prices are moving higher and creating wealth for those invested, and home prices in many markets are rising as foreclosed properties are sold and absorbed back into the marketplace. But caution still abounds, and if pressed to reduce my concerns for the upcoming year to one word, I would have to say it is all about the “data.” In my view, data will drive compliance efforts by in-house risk managers in two key areas: Fair Lending and qualified mortgages.
For a second time, the U.S. Supreme Court was denied the opportunity to weigh in on whether raw data is proof positive of discriminatory intent, because the parties to the proceeding settled their claims. As a result, the Department of Housing and Urban Development’s discriminatory effects rule, which became effective on March 13, 2013, and formalized the “disparate impact” theory, is now the baseline tool for establishing liability under the Fair Housing Act.
My past articles in this and other periodicals have harped on “scraping data” from your loan origination system and/or mortgage document preparation software to look for loan or borrower outliers. While any outlier might have a “legally sufficient justification” (i.e., it is necessary to achieve a legitimate, nondiscriminatory interest of the lender that could not be served by another practice), knowing about such outliers arms your compliance team with the information to take action, if needed. Likewise, knowledge and assembly of that data are critical to defending against statistics-based claims of discriminatory intent asserted on behalf of a borrower or a regulator.
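The kind of outlier scan described above can be sketched in a few lines of code. The example below is purely illustrative: the field names (`product`, `apr`, `id`) are hypothetical stand-ins for data exported from a loan origination system, and a simple standard-deviation test stands in for whatever statistical method a compliance team would actually choose. It flags loans whose APR sits unusually far from the mean for loans of the same product type.

```python
# Hypothetical sketch: flag loans whose APR sits far from the mean APR for
# loans of the same product type -- one simple way to surface "outliers"
# worth a closer fair-lending review. Field names are illustrative only.
from collections import defaultdict
from statistics import mean, stdev

def flag_apr_outliers(loans, threshold=2.0):
    """Return ids of loans whose APR is more than `threshold` standard
    deviations from the mean APR of their product-type group."""
    by_product = defaultdict(list)
    for loan in loans:
        by_product[loan["product"]].append(loan)

    flagged = []
    for product, group in by_product.items():
        if len(group) < 3:   # too few loans to compute a meaningful spread
            continue
        aprs = [l["apr"] for l in group]
        mu, sigma = mean(aprs), stdev(aprs)
        if sigma == 0:       # all loans priced identically -- nothing to flag
            continue
        for loan in group:
            if abs(loan["apr"] - mu) / sigma > threshold:
                flagged.append(loan["id"])
    return flagged
```

A real program would group by more than product type (for example, by lien position or channel) and would pair any statistical flag with a file-level review, since an outlier alone proves nothing about intent.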
Using data mined from your systems is also critical to compliance with the Ability-to-Repay Rule, effective January 10, 2014. Under the Rule, loans that are “qualified mortgages,” because they were underwritten against certain guidelines designed to maximize the likelihood of repayment, enjoy either a “safe harbor” or a rebuttable presumption that the lender complied with the Rule. But the Rule itself is a minefield of “if-then logic,” blending fees, points, amortization, debt levels, and APR (each a calculated amount in its own right) into a cornucopia of calculations that either stamps a loan with the “QM” designation … or not.
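To illustrate that “if-then logic,” the sketch below encodes a few of the better-known QM criteria (the 3% points-and-fees cap, the 43% debt-to-income ceiling, the 30-year term limit, and the ban on negative-amortization and interest-only features) along with the safe-harbor versus rebuttable-presumption split based on APR relative to the average prime offer rate. This is a simplification for illustration only, not a statement of the Rule; the actual regulation contains many more thresholds, tolerances, and exemptions than shown here.

```python
# Illustrative only -- the actual Ability-to-Repay/QM rule contains many
# more conditions, thresholds, and exemptions than this sketch shows.
from dataclasses import dataclass

@dataclass
class Loan:
    loan_id: str
    amount: float            # principal, in dollars
    points_and_fees: float   # total points and fees, in dollars
    term_months: int
    dti_ratio: float         # back-end debt-to-income, e.g. 0.41
    apr: float               # annual percentage rate, e.g. 0.045
    apor: float              # average prime offer rate for a comparable loan
    negative_amortization: bool
    interest_only: bool

def qm_screen(loan):
    """Return the QM criteria this loan fails (empty list = passes this screen)."""
    failures = []
    if loan.points_and_fees > 0.03 * loan.amount:
        failures.append("points and fees exceed 3% of the loan amount")
    if loan.term_months > 360:
        failures.append("term exceeds 30 years")
    if loan.dti_ratio > 0.43:
        failures.append("debt-to-income ratio exceeds 43%")
    if loan.negative_amortization:
        failures.append("loan permits negative amortization")
    if loan.interest_only:
        failures.append("loan permits interest-only payments")
    return failures

def presumption(loan):
    """Safe harbor vs. rebuttable presumption turns on APR relative to APOR."""
    if qm_screen(loan):
        return "not QM"
    # A higher-priced QM (APR more than 1.5 points over APOR for a first
    # lien) receives only a rebuttable presumption of compliance.
    return "rebuttable presumption" if loan.apr > loan.apor + 0.015 else "safe harbor"
```

Run against a full loan tape, a screen like this is what lets a compliance team see at a glance which data point knocked a given loan out of QM status.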
Being able to drill quickly through the forest of data to find the offending data point(s) that prevent QM designation demonstrates the importance of the data. But, as I have pointed out in my past writings, understanding that data also enables you to peer into your business model and learn where profits are elusive, where your income is “leaking,” or which loan products (or even producers) are stellar performers. And if your mortgage document preparation system and/or loan origination system does not give you the data in a usable form, compel your provider to do so … it is your data, and it is essential to the financial health of your business and key to mitigating compliance risk.
Years ago, magazines would profile their subscribers and sell that mined data to advertisers interested in marketing to a targeted audience. Credit card statements were filled with leaflets suggesting products in line with your purchasing habits. While the progression from paper to ever-faster computers has produced data in volumes never before imagined, effectively funneling that data into manageable, useful “bytes” (the pun is intended) is a necessary step toward a best-practices loan compliance program, and it equally serves as a scorecard of your financial health.
About The Author
Jonathan M. Herman is a partner with The Middleberg Riddle Group (“MRG”), a law firm whose principal office is located in Dallas, Texas. MRG Document Technologies is a national mortgage compliance services practice group within MRG. Through data analysis, Mr. Herman looks to pinpoint both specific loan issues and global enterprise issues that bear upon whether the loan or enterprise is compliant with pertinent regulations.