The MBA’s Technology Conference has just passed us by. Other than the special events, such as the awards ceremony presented by Progress in Lending, there was little excitement building up to the meetings. Yes, the topic of the first general session interested many people, since disruptive technology has rapidly become part of the technologist’s lexicon, but besides this there did not seem to be many exciting new advances to discuss. Unfortunately, there really should have been.
One of the largest gaps still existing in technology is the support needed by risk management, most specifically in the areas of regulatory compliance and quality control. These areas, which are now the focus of regulators, investors and consumers, have seen few, if any, technological breakthroughs since the emergence of automated underwriting. While some companies have developed internal calculations for issues such as high-cost loans, the reality is that the results still have to be input into another system for inclusion in the overall loan analysis.
Such issues as the “Ability to Repay” (ATR) and “Qualified Mortgages” (QM) are still primarily evaluated by individual underwriters, who are subject to making mistakes no matter how hard they work at getting it right or how many times the work is checked. Another issue that has been with us for years is HMDA data. Regardless of how many validity and quality checks are built into the HMDA geo-coding programs, the fact of the matter is that numerous errors are still found in HMDA data. These programs don’t check for such things as manufactured data or loans that are not included in the LAR. A recent study using Mortgage True View’s HMDAnalysis program found that the total number of loans submitted on the LAR matched the number in the LOS, but few, if any, of the origination and decision dates actually matched what was in the LOS or the loan files. The analysis also showed that, on average, lenders approved loans with income of around $48,000 for every $100,000 loaned. However, some lenders required as much as $66,000 per $100,000, and one lender was as low as $17,000 of income for each $100,000 lent. Why do lenders not know this before they submit their LARs? Where is the technology to test this data before it is submitted? Is it any wonder we have to contend with all the accusations of unfair lending practices?
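As a rough sketch of what pre-submission testing could look like, the check below reconciles LAR records against the LOS and computes the income-per-$100,000 ratio the study describes. The record layout and field names here are hypothetical, not any actual HMDA or LOS schema; a real edit-check tool would run against a lender’s own LOS export.

```python
# Hypothetical pre-submission HMDA check: reconcile LAR records against the
# LOS and compute the average borrower income per $100,000 lent.
# Field names (loan_id, action_date, income, loan_amount) are illustrative.

def reconcile_lar(lar_records, los_records):
    """Return loan IDs whose key dates disagree between the LAR and the LOS."""
    los_by_id = {r["loan_id"]: r for r in los_records}
    mismatches = []
    for rec in lar_records:
        los = los_by_id.get(rec["loan_id"])
        # Flag loans missing from the LOS or with a different action date.
        if los is None or rec["action_date"] != los["action_date"]:
            mismatches.append(rec["loan_id"])
    return mismatches

def income_per_100k(records):
    """Average borrower income per $100,000 loaned, for outlier screening."""
    total_income = sum(r["income"] for r in records)
    total_loaned = sum(r["loan_amount"] for r in records)
    return total_income / (total_loaned / 100_000)
```

Run before submission, a check like this would surface the date mismatches and the outlier income ratios the study found, while the file can still be corrected.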
Of course, quality control programs are not any better. They are still basically an automated checklist that individuals must complete in order to conduct the loan file review. We continue to use paper documents to reverify the information included in the loan decision process and to provide a subjective analysis of the results. Agency requirements preclude many changes to this process, but surely there are advancements in technology that could not only speed the process but make it more objective. There are risk models in existence today that relate operational mistakes to the probability of default, but they are not being used because there is no technology to support them. And because the sampling programs in use are run independent of the QC process, they cannot give management information that tells them whether the issues found in a review are just random mistakes or real process issues that need to be addressed. That distinction would allow lenders to focus on the things that will improve their operational efficiency instead of chasing random errors.
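One way to separate random mistakes from real process issues is a simple statistical test on QC sample results: if the number of defects found in a sample would be improbable under a tolerable random-error rate, the problem is likely systemic. The sketch below illustrates the idea with a binomial tail test; the sample sizes, rates, and threshold are assumptions for illustration, not any agency’s prescribed methodology.

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of finding k or more
    defects in n sampled files if defects occur at random rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def likely_process_issue(sample_size, defects_found, tolerable_rate, alpha=0.05):
    """Flag a systemic process issue when the observed defect count would be
    improbable (probability below alpha) under the tolerable random rate."""
    return binomial_tail(sample_size, defects_found, tolerable_rate) < alpha
```

For example, 10 defects in a 100-file sample against a 2% tolerable rate would be flagged as a process issue, while 2 defects in the same sample would be consistent with random error.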
Technology has proven to be a valuable part of the mortgage lending process. However, if those involved in technology can’t come up with advanced support for managing risk, technology will become nothing more than window dressing, and lenders will, out of necessity, look for the disruptive technology that can handle their problems.
About The Author