Value Your Data

We have heard industry veterans like Roger Gudobba and others say, “It’s all about the data.” The phrase has become so overused that it almost means nothing anymore. However, lenders and vendors alike should listen to this sound advice. Roger was talking about how data can improve the mortgage lending process, and that’s true, but I’m here to tell you that data can improve your marketing process as well.

In a white paper entitled “Put Data First: Why Data Quality in CRM and Marketing Automation are Top Priority,” written by RingLead, the author states that whatever your situation may be, you will quickly realize that it all comes back to data, because data is the real value in your CRM and marketing automation platform.

We don’t mean to trivialize the importance of workflows, automated processes and drip nurturing campaigns that these systems offer. These features are among the primary reasons that organizations invest so much time and money in their implementation and ongoing administration and improvement, but many are rendered utterly useless when they come into contact with dirty data.

Dirty data has a way of silently infiltrating your organization, creating frustration, inefficiency, and loss of confidence (e.g., dismal user adoption) in the systems themselves. It can affect each department and group of stakeholders in a very different way, but unless there is a “State of Our Data” address, the problem is not brought to the forefront of the organization’s collective psyche.

One of the key requirements of a customer and prospect database is to easily segment the records, allowing your organization to interact with one set of contacts differently from others. This can be easy if you have a strict set of values for each field and the input is controlled at the insertion point.

A common requirement is segmentation by job title, but there are simply too many variations on an individual’s job title to try to account for each with a picklist value, so the standard method of insertion is via a regular text field. This creates a pretty big problem for segmentation.
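To make the segmentation problem concrete, here is a minimal sketch, in Python, of normalizing free-text job titles into a handful of segments. The keyword lists and segment names are invented for illustration; a production CRM would need a far larger, continuously maintained mapping (or a dedicated data-quality tool).

```python
import re

# Illustrative keyword-to-segment mapping (hypothetical); a production CRM
# would need a much larger, continuously maintained list.
SEGMENT_KEYWORDS = {
    "technology": {"cto", "cio", "engineer", "developer", "technology"},
    "sales_marketing": {"sales", "marketing", "cmo"},
    "executive": {"ceo", "cfo", "coo", "president", "chief", "owner"},
}

def segment_for_title(raw_title: str) -> str:
    """Map a free-text job title to a coarse marketing segment."""
    words = set(re.sub(r"[^a-z ]", " ", raw_title.lower()).split())
    for segment, keywords in SEGMENT_KEYWORDS.items():
        if words & keywords:
            return segment
    return "unknown"

print(segment_for_title("Sr. V.P., Sales & Marketing"))   # sales_marketing
print(segment_for_title("Chief Technology Officer"))      # technology
print(segment_for_title("Loan Officer"))                  # unknown
```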

According to a 2013 Experian QAS survey, 94 percent of businesses believe there is some level of inaccuracy within their CRM systems. When you think about the time, money and focus that go into CRM, an allowance for inaccurate and useless data is mind-boggling.

Think about the time your organization is wasting sifting through inaccurate or, worse, completely useless data. Inaccurate data leads to:

>>Wasted sales efforts on useless bits of information stored in CRM, leading to discontent and potential abandonment of the CRM system (i.e., decreased user adoption)

>>Longer wait times for support as reps are forced to piece together information while on the phone with a customer, leading to decreased customer satisfaction

At Sirius Summit 2013, Jim Ninivaggi, Service Director of Sales Enablement Strategies at Sirius Decisions, cited a study that found roughly 30% of an enterprise salesperson’s time is spent doing research on the Internet. If you think about that in the context of an 8 am – 6 pm workday, that means that 35 days per year are spent doing research.

In Data Driven: Profiting from Your Most Important Business Asset, Thomas C. Redman sums up the advantage of good data as “a moat around our business [that] gives us a unique competitive advantage.” Nowhere is this more evident than in the area of data completeness.

Duplicate records in CRM and marketing automation platforms are a familiar aspect of bad data. The errors and frustration that duplicates cause can be felt across most departments at almost every level.

Reports are skewed, the wrong messages are being sent, and quarrels are created over one record that somehow made it into the system twice and was distributed to two different sales reps.

How are duplicate records created? Today’s CRM and marketing automation platforms come equipped with very basic duplicate identification, which is, in almost every case, based on a scan for an exact-match email address.
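To illustrate why exact-match email scanning catches so little, here is a rough Python sketch of a slightly more forgiving approach: normalize the email first, then fall back to a name-plus-company key. The field names are hypothetical, and real deduplication tools go much further with fuzzy matching.

```python
from collections import defaultdict

def email_key(record):
    """Normalize the email so 'Jane.Doe@Acme.com ' and 'jane.doe@acme.com' collide."""
    return (record.get("email") or "").strip().lower()

def name_company_key(record):
    """Fallback key for records with a missing email address."""
    return (
        (record.get("first_name") or "").strip().lower(),
        (record.get("last_name") or "").strip().lower(),
        (record.get("company") or "").strip().lower(),
    )

def find_duplicates(records):
    """Group CRM records that share a normalized email or a name+company key."""
    groups = defaultdict(list)
    for rec in records:
        key = email_key(rec) or name_company_key(rec)
        groups[key].append(rec)
    return [recs for recs in groups.values() if len(recs) > 1]

crm = [
    {"first_name": "Jane", "last_name": "Doe", "company": "Acme", "email": "Jane.Doe@Acme.com "},
    {"first_name": "Jane", "last_name": "Doe", "company": "Acme", "email": "jane.doe@acme.com"},
]
print(find_duplicates(crm))  # both records land in the same group
```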

Many modern, technology-enabled organizations are using more than one software platform to manage their customer and prospect data. It is crucial to keep your data in sync across your email, ERP systems, CRM, marketing automation platform, and more. If your data quality plan is limited to one platform, you’re only solving part of the problem.

It’s important to remember that dirty data can be a big problem, but it can be easily solved. Analyze the problem and try to home in on the areas that are causing the most pain. Then get in touch with a team that has experience in resolving these types of issues.

For example, NexLevel Advisors is focused on companies that are looking to take their business to the next level. NexLevel Advisors assists you in elevating your results. Creating new opportunities, developing executable strategies, and delivering results creates an environment that promotes continual growth and business value for your company.

We add value through strategic advice specific to your company. Our team has years of experience and has been in your situation and position. These individuals possess in-depth knowledge of your complex product and service offerings, the nuances of your market segment, and the challenges of your product roadmap and lifecycle. We deliver customized differentiation in the marketplace for your organization while producing measurable results.

What this means for your business is that you get customized programs from accomplished executives who offer proven results-oriented solutions specifically created to take your organization to the next level, quicker and more strategically than you could on your own.

NexLevel’s experience spans multiple industries, including financial services, healthcare, legal services and insurance, with extensive expertise in complex, technology-oriented products and services. Our customized solutions help you to sell more, more frequently, to more people by clearly establishing your specific value propositions. This is where real-world experience, strategy and execution deliver measurable results for your organization.

For over 20 years the advisors of NexLevel have been building and leading market leaders in business, delivering success after success in taking companies to the next level in revenues and profitability. This vast expertise comes from real-world experience in running companies, building organizations and holding the following positions of leadership: CEO, CMO, VP Business Strategy, and Director of Sales & Marketing. Our experience makes the difference in your business.

If your CRM reporting seems “off”, if your marketing campaigns are less than impressive, if your sales team is underperforming, then this is your system flashing the Check Engine light. More often than not, dirty data is the root cause.

About The Author

Michael Hammond
Michael Hammond is chief strategy officer at PROGRESS in Lending Association and is the founder and president of NexLevel Advisors. They provide solutions in business development, strategic selling, marketing, public relations and social media. He has close to two decades of leadership, management, marketing, sales and technical product experience. Michael held prior executive positions such as CEO, CMO, VP of Business Strategy, Director of Sales and Marketing and Director of Marketing for a number of leading companies. He is also only one of about 60 individuals to earn the Certified Mortgage Technologist (CMT) designation. Michael can be contacted via e-mail at mhammond@nexleveladvisors.com.

eClosings Are Picking Up Steam

Talk of the Digital Mortgage is everywhere. This is the big push for lenders looking to increase efficiency and cut costs. And vendors are stepping in to help lenders achieve this goal. For example, Visionet Systems has released an enhanced web services integration layer for CD2UCD, a solution that converts Closing Disclosure (CD) images to Uniform Closing Dataset (UCD) XML. Visionet will be exhibiting its solutions at the National Technology in Mortgage Banking Conference & Expo 2017, an event held by the Mortgage Bankers Association at the Hyatt Regency, Chicago from March 26 to 29.

CD2UCD addresses situations where the data required to produce the UCD is not available, such as CDs modified at the closing table and loans acquired through correspondent or bulk channels.  The enhanced interface allows lenders, closing agents, document preparation providers and Loan Origination System (LOS) vendors to easily pass a CD document and the required GSE data fields to the CD2UCD service and receive back a fully GSE-compliant UCD.  CD2UCD also fulfills single and bulk requests via secure web portal and file transfer mechanisms.
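The article describes the interface only at a high level, but conceptually the exchange is a simple request/response. The sketch below uses Python’s requests library against a placeholder endpoint; the URL, field names, and response shape are assumptions for illustration, not Visionet’s actual CD2UCD API.

```python
import requests

# Hypothetical endpoint and payload; Visionet's actual CD2UCD web service
# interface is not documented here, so treat this purely as a sketch of
# the request/response pattern the article describes.
CD2UCD_URL = "https://example.com/cd2ucd/convert"  # placeholder URL

def convert_cd_to_ucd(cd_pdf_path: str, gse_fields: dict, api_key: str) -> bytes:
    """Send a Closing Disclosure image plus the required GSE data fields,
    and return the UCD XML produced by the (hypothetical) service."""
    with open(cd_pdf_path, "rb") as cd_file:
        response = requests.post(
            CD2UCD_URL,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"closing_disclosure": cd_file},
            data=gse_fields,
            timeout=120,
        )
    response.raise_for_status()
    return response.content  # UCD XML document

# ucd_xml = convert_cd_to_ucd("cd_final.pdf", {"CasefileID": "..."}, api_key="...")
```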

“We’ll be discussing our latest enhancements to the CD2UCD platform at MBA Tech 17, and highlighting the important industry problem it addresses,” said Norman Gottschalk, CTO at Visionet. “CD2UCD is the result of two years of investment, collaboration and learning, and we continue to invest in its development.  We look forward to sharing our expertise and experience with our present and future clients and colleagues, and we encourage partnerships and integrations with all industry participants.”

Visionet’s solutions help lenders, servicers and investors stay compliant with the ever-changing regulatory environment, while staying as efficient and cost-effective as possible. CD2UCD is a prime example of how the application of technology can significantly boost back-office productivity. Instead of spending countless hours compiling and entering the information required to produce each UCD, mortgage firms get a simple, scalable solution that drastically reduces the time and resources they spend on investor delivery. Visionet is proud to be the first to bring such a game-changing product to market.

About The Author

Tony Garritano
Tony Garritano is chairman and founder at PROGRESS in Lending Association. As a speaker Tony has worked hard to inform executives about how technology should be a tool used to further business objectives. For over 10 years he has worked as a journalist, researcher and speaker in the mortgage technology space. Starting this association was the next step for someone like Tony, who has dedicated his career to providing mortgage executives with the information needed to make informed technology decisions. He can be reached via e-mail at tony@progressinlending.com.

The Truth About Taxes

Some of you may be familiar with the old adage that there are only two things you have to do in life: die and pay taxes. While some people might add working out and watching your gluten intake, paying taxes remains unavoidable. For property taxes to be paid correctly, it is crucial for servicers to set up loans correctly on their system during the boarding process. Not doing so could lead to penalties or even the loss of a property.

The tax line is a record in a loan servicing system that includes all the data needed to identify when property taxes need to be paid, the amount of payment and what jurisdiction receives the payment.

LERETA national data from prior servicer/lender acquisitions, covering July 2015 through April 2016, tracks the monthly number of items and dollar amounts for occurrences of paying the wrong amount of property taxes.

One important area to be cognizant of is the tax line setup. The line includes the following fields (a minimal data-structure sketch appears after this list):

>> The tax payee code is the unique number assigned to identify the tax collection jurisdiction on the servicing system. This code cross-references to another file that includes the taxing jurisdiction name and mailing information.

>> The tax identification number is the actual number assigned to the property by the taxing jurisdiction and is required when obtaining any information from the taxing jurisdiction about the property.

>> The disbursement amount or last amount paid is the last amount paid to the taxing jurisdiction for this tax line (can be an annual or installment amount) and is updated with the current amount to be paid when new taxes become due.

>> The due date is the date the servicer assigns to this line to alert them to an upcoming payment requirement (this date is based on business rules and is generally 15 to 30 days before taxes are due to the taxing jurisdiction).

>> The type is a code that identifies the type of taxes due, typically county property taxes, city taxes, school taxes and some non-standard payees, such as sewer taxes, garbage taxes, ground rents, etc. Servicers are required to report escrow payments annually to borrowers, and the tax type is used in preparing tax deductions on the borrower’s income tax return.

>> The term is used to identify the frequency the taxes are being paid, for example a Term of three indicates the annual taxes are being paid every three months, or quarterly (three months x four installments = 12 months).

>> Other codes as determined by the business rules of the servicer.
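Pulling those fields together, a tax line record might be modeled roughly as follows. This is a minimal Python sketch; the field names and sample values are illustrative and do not reflect any particular servicing system’s schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TaxLine:
    """Illustrative tax line record; fields mirror the list above,
    not any specific servicing system's schema."""
    tax_payee_code: str         # cross-references the taxing jurisdiction file
    tax_identification: str     # parcel/tax ID assigned by the jurisdiction
    disbursement_amount: float  # last amount paid (annual or installment)
    due_date: date              # internal alert date, ~15-30 days before taxes are due
    tax_type: str               # e.g., county, city, school, sewer, ground rent
    term_months: int            # payment frequency: 12 = annual, 3 = quarterly
    other_codes: dict           # servicer-specific business-rule codes

line = TaxLine(
    tax_payee_code="048-123",
    tax_identification="12-345-678-90",
    disbursement_amount=1850.00,
    due_date=date(2017, 11, 15),
    tax_type="county",
    term_months=3,
    other_codes={},
)
```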

Servicers cannot rest on their laurels once the tax line is set up on a loan. It is extremely important that tax lines are monitored and maintained to ensure the accuracy of the data and to catch any potential changes. Those changes could include new payees, the consolidation of taxing jurisdictions (a jurisdiction no longer collects its own taxes and they are now collected by the county) or the separation of taxing jurisdictions (a city decides to collect its own taxes instead of having them included by the county). There could also be changes in the due dates, contact information or tax identification numbers.

Servicers can either take on the responsibility of managing this process or work with a tax service vendor that can offer and facilitate the tax line setup and tax line audit services on behalf of the servicer. Either way, the taxes have to be paid correctly, or penalties will be incurred, which no one wants.

About The Author

Ted Smith
Ted Smith is a vice president and client relations manager at LERETA. He has been with the company for nearly 17 years. Since 1986, LERETA has provided the mortgage and insurance industries the fastest, most accurate and complete access to property tax data and flood hazard status information across the U.S. LERETA is committed to giving customers extraordinary service and cost-effective property tax and flood solutions. LERETA’s services are designed to increase efficiency, reduce penalties and liabilities and improve processes for mortgage originators and servicers. LERETA’s dedicated teams of real estate tax and flood professionals along with LERETA’s experienced management team allow the company to lead the industry in service and technology.

Bloom’s Taxonomy

Let’s start with describing Bloom’s Taxonomy. In 1956, Benjamin Bloom, with collaborators Max Englehart, Edward Furst, Walter Hill, and David Krathwohl, established a framework for categorizing educational goals. Generally referred to today as Bloom’s Taxonomy, this framework has been applied by generations of educators at the primary, high school, and collegiate levels.

The Original Taxonomy (1956)

Here are the authors’ brief explanations of these main categories.

  1. Knowledge: Involves the recall of specifics and universals, the recall of methods and processes, or the recall of a pattern, structure, or setting.
  2. Comprehension: Refers to a type of understanding or apprehension such that the individual knows what is being communicated and can make use of the material or idea being communicated without necessarily relating it to other material or seeing its fullest implications.
  3. Application: Refers to the use of abstractions in particular and concrete situations.
  4. Analysis: Represents the breakdown of a communication into its constituent elements or parts such that the relative hierarchy of ideas is made clear and/or the relations between ideas expressed are made explicit.
  5. Synthesis: Involves the putting together of elements and parts so as to form a whole.
  6. Evaluation: Engenders judgments about the value of material and methods for given purposes.

Although it received little attention when first published, Bloom’s Taxonomy has since been translated into 22 languages and is one of the most widely applied and most often cited references in education.

The Revised Taxonomy (2001)

One of the basic questions facing educators, whose core mission is to improve thinking, has been where to start. As always, definitions are in order. Before we can make a thing better, we need to know more about what the thing is.

In 2001 a group of cognitive psychologists, curriculum theorists, instructional researchers, and testing and assessment specialists published a revision to Bloom’s Taxonomy that focuses on a more dynamic classification. The changes occur in three broad categories: terminology, structure, and emphasis.

A. Terminology: Changes in terminology between the two versions are readily apparent. In short, Bloom’s six major categories were changed from noun to verb forms. The use of verbs more accurately describes the cognitive processes by which thinkers encounter and work with knowledge. It is also notable that the top two categories are switched in this revision so that creating (formerly synthesis) occupies the top position.

Let’s look at the revised Bloom’s Taxonomy and add some reference to MISMO.

  1. Remembering: Retrieving, recognizing, and recalling relevant knowledge from long-term memory. MISMO: The initiative started with a small group of individuals organizing a list of data elements based on their past experience and interactions, partially influenced by their area of interest or expertise.
  2. Understanding: Constructing meaning from oral, written, and graphic messages through interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining. MISMO: The first goal was to define a system or method to attach a label and definition of the data element in the hopes of identifying and eliminating duplicates. Obviously, there was a lot of discussion and different opinions.
  3. Applying: Carrying out or using a procedure through executing or implementing. MISMO: This was the Logical Data Dictionary (LDD). Looking back, it was probably the most significant achievement in the exchange of information (data) between two entities where they both had the same definition. Although many feel we are in maintenance mode, the increasing demand for more data-driven processes will continue to be a major initiative going forward.
    • The first 3 steps were building the foundation.
    • The next 3 steps were bringing it all together.
  4. Analyzing: Breaking material into constituent parts, determining how the parts relate to one another and to an overall structure or purpose through differentiating, organizing, and attributing. MISMO: Based on the XML standard at the time and our knowledge and experience, some of the earlier transaction sets were defined as Document Type Definitions (DTD). Specifically, they were created around defined transaction types, like credit, mortgage insurance, etc., independent of each other.
  5. Evaluating: Making judgments based on criteria and standards through checking and critiquing. MISMO: Next, the development of the schema and business reference model was also very significant. However, to the non-technical person, the visual of this model can be overwhelming. The need to get the business side involved is paramount to the continuing success of the organization and the industry.
  6. Creating: Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure through generating, planning, or producing. MISMO: The development of the Logical Data Dictionary and the Business Reference Model by all the volunteer contributors from all areas of the industry was unprecedented. Kudos to all!

B. Structure: Structural changes to the taxonomy are well-considered and provide an easy-to-grasp understanding of the structure’s logical underpinnings. Bloom’s original cognitive taxonomy was a one-dimensional form. The Revised Bloom’s Taxonomy, with the addition of products, takes the form of a two-dimensional table. One of the dimensions identifies The Knowledge Dimension (or the kind of knowledge to be learned) while the second identifies The Cognitive Process Dimension (or the process used to learn).

C. Emphasis: Emphasis is the final category of changes. Bloom himself came to understand that his taxonomy was being used by many groups and organizations that never considered an audience for his original publication. In contrast, the revised version of the taxonomy is deliberately intended for a broader audience. Certainly, the same could be said for MISMO.

People around the world are familiar with the original Bloom’s Taxonomy and are not necessarily quick to embrace its change. After all, change is difficult for most people. The mortgage industry is no exception.

The goals for MISMO are threefold: 1) Increase adoption. 2) Increase membership, especially in the lender community. 3) Be cognizant of new opportunities to further the advancement of the standard.

My goal always is to present something that you might not have known about in the hopes that it will spur you to think differently. So, why do I bring this up? Are you focused on what MISMO is doing right now? Maybe you are and maybe you’re not, but if you’re not, you should be. Data standardization and the industrywide acceptance of that data standard is absolutely necessary for the mortgage industry to advance. Change may not be comfortable, but you can’t have true advancement without embracing change.

About The Author

Roger Gudobba
Roger Gudobba is passionate about the importance of quality data and its role in improving the mortgage process. He is vice president, mortgage markets at Compliance Systems and chief executive officer at PROGRESS in Lending Association. Roger has over 30 years of mortgage experience and has been an active participant in the Mortgage Industry Standards Maintenance Organization (MISMO) for 17 years. He was a Mortgage Banking Technology All-Star in 2005. He was the recipient of Mortgage Technology Magazine’s Steve Fraser Visionary Award in 2004 and the Lasting Impact Award in 2008. Roger can be reached at rgudobba@compliancesystems.com.

Yes, Innovation Really Matters

I say it a lot because I believe it: Those companies that innovate will be tomorrow’s industry leaders. For example, RealtyTrac unveiled a new multi-sourced national property database named the ATTOM Data Warehouse that will be curated by ATTOM Data Solutions, a newly created parent company operating a rapidly expanding property data licensing business along with existing consumer websites.

The new ATTOM Data Warehouse features enhanced and standardized data for more than 150 million U.S. property parcels — representing an expanded footprint that covers 99 percent of the U.S. population. The enhanced data warehouse is more than 9 terabytes and contains 13.9 billion rows of data and more than 6,000 discrete data attributes. Data available includes current and historical property tax assessor information, deed, mortgage, foreclosure, environmental risk, natural hazard, health hazard, neighborhood characteristics, and other property characteristics — all mapped to a unique ATTOM ID for each property.
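To illustrate the idea behind a single parcel identifier, here is a small Python sketch of linking assessor, deed, and hazard records on one shared key. The record layouts and the sample key are invented for the example and are not ATTOM’s actual schema.

```python
# Simplified illustration of linking multi-sourced records by one parcel key;
# the layouts below are invented for the example, not ATTOM's actual schema.
assessor = {"ATTOM123": {"assessed_value": 310_000, "year_built": 1987}}
deeds    = {"ATTOM123": {"last_sale_price": 355_000, "last_sale_date": "2015-06-30"}}
hazards  = {"ATTOM123": {"flood_zone": "X", "wildfire_risk": "low"}}

def property_profile(attom_id: str) -> dict:
    """Merge every source's record for one parcel into a single view."""
    profile = {"attom_id": attom_id}
    for source in (assessor, deeds, hazards):
        profile.update(source.get(attom_id, {}))
    return profile

print(property_profile("ATTOM123"))
```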

“The new ATTOM ID is an innovation long-overdue in the real estate data industry, linking all property-centric data from myriad sources to one unique parcel identifier, and it’s just one example of how the new ATTOM Data Warehouse creates value for our customers and elevates our industry,” said Rob Barber, CEO at ATTOM Data Solutions. “Under the new ATTOM Data Solutions brand, our mission as a company will continue to be increasing real estate transparency for businesses and consumers. That mission will be carried out in a variety of venues, including bulk file licenses, APIs and customized reports, along with our increasingly popular consumer websites.”

The ATTOM Data Warehouse fuels predictive analytics and machine learning developed by Audantic, a Seattle-based company that provides top-tier real estate investors — those typically purchasing about 100 homes a year — with marketing lists of homeowners likely to sell.

“We’ve tried to source national property data from many places, and the ATTOM Data Warehouse is head and shoulders above the rest,” said Franklin Sarkett, CTO of Audantic, which has grown from just two clients in 2014 to now having clients in more than 200 counties. “Without the data we wouldn’t have a business.”

ATTOM’s high-quality, easy-to-ingest data has been key to Audantic’s meteoric growth and 95 percent client retention rate, according to Sarkett, who previously worked as a data scientist at Facebook.

“When we’re creating machine learning models, the better the data is, the more predictive the result is. If we’re feeding in bad data or inconsistent data or incomplete data … when we go to do the actual marketing it doesn’t work,” he said, noting the ATTOM data fed through Audantic’s predictive models have produced improved results for clients. “We can eliminate 80 percent of the population and we can double or triple their results.”

About The Author

Tony Garritano
Tony Garritano is chairman and founder at PROGRESS in Lending Association. As a speaker Tony has worked hard to inform executives about how technology should be a tool used to further business objectives. For over 10 years he has worked as a journalist, researcher and speaker in the mortgage technology space. Starting this association was the next step for someone like Tony, who has dedicated his career to providing mortgage executives with the information needed to make informed technology decisions. He can be reached via e-mail at tony@progressinlending.com.

The Power Of Data

My last article dealt with the proposed URLA/ULAD and how data should be and is the major focus for the mortgage industry going forward. Data is among the most powerful, underutilized, and sometimes misunderstood forces in technology today. The power of data can be used to create powerful change. It’s time to discover its full potential for problem solving. So let’s begin by exploring a number of philosophies about data.

We will start by examining the model of the Data-Information-Knowledge-Wisdom (DIKW) Hierarchy. This hierarchy is part of the canon of information science and is intended to represent the progression from knowing nothing to knowing why. While there are both pros and cons to the DIKW Hierarchy as a useful and intellectually desirable construct, my purpose here is to offer a different thought process when it comes to collecting, analyzing, interpreting, and using the results to make business decisions.

Most images depict DIKW as a pyramid or an ascending slope, where we increase our understanding by adding value at each step in the process. Since data is the focus of this article, we will use it as the foundation (see diagram 1) and build up from there.

Diagram 1

Data, level one, is simply a collection of disconnected, objective facts about an event. It is raw data with no patterns or relations. It can exist in any form, usable or not. We start giving context and add value as we move to the next level.

Information, level two, is where the data is analyzed and organized to describe who, what, where, and when. Data is categorized, calculated, corrected, condensed and given labels and definitions to provide structure. Through this organizational mapping of data, patterns emerge and meaningful relationships can be identified. This is the role of MISMO in the mortgage industry. We give meaning as we move to the next level.

We are just gathering information in the first two levels. We know nothing in the data level and start to know what in the information level. We start to develop a theory or a framework for explaining behavior. We create novel ideas. We rely on experience or knowledge gained through doing. Together theory and experience lead to moving to the next level.

Knowledge, level three, is where we are able to make informed decisions by asking how? The European Committee for Standardization, in its Guide to Good Practice in Knowledge Management, described knowledge as “the combination of data and information, to which is added expert opinion, skills and experience, to result in a valuable asset which can be used to aid decision making.” Think of this as contextualized and organized information. We give insight as we move to the next level.

The first three levels are based on observation and past experience, where hopefully we are doing things right. Now let’s look to the future with the next two levels where we hope we are doing the right things.

Wisdom, level four, is where we are able to explain why and demonstrate judgment. We start to have an evaluated understanding and are able to learn from our accumulated knowledge. We think about what is best. We give purpose as we move to the next level.

Decisions, level five, is not really one of the levels of DIKW, but it is where we achieve enlightenment and clarity of perception. Hopefully, this got you thinking.
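As a toy walk-through of the hierarchy, the Python sketch below takes raw payment records (data), summarizes them by segment (information), applies a rule of thumb (knowledge), and emits a decision. The records and the 10 percent threshold are invented purely for illustration.

```python
# Toy DIKW walk-through: the records and the 10% threshold are invented.
# Data: disconnected facts about individual loans.
payments = [
    {"loan_id": 1, "segment": "FHA", "days_late": 0},
    {"loan_id": 2, "segment": "FHA", "days_late": 45},
    {"loan_id": 3, "segment": "Conventional", "days_late": 0},
]

# Information: organize the raw facts to describe "what" is happening.
def delinquency_rate(segment: str) -> float:
    loans = [p for p in payments if p["segment"] == segment]
    late = [p for p in loans if p["days_late"] > 30]
    return len(late) / len(loans)

# Knowledge: combine information with experience ("how" to act on it).
def needs_outreach(segment: str, threshold: float = 0.10) -> bool:
    return delinquency_rate(segment) > threshold

# Wisdom/decision: choose the right action and explain "why".
for segment in ("FHA", "Conventional"):
    if needs_outreach(segment):
        print(f"{segment}: launch early-intervention outreach")
    else:
        print(f"{segment}: no action needed")
```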

Now, let’s look at this from the opposite direction. In his book, The Design of Business, Roger Martin unveils a new way of thinking that balances the exploration of new knowledge (innovation) with the exploitation of current knowledge (efficiency) to regularly generate breakthroughs and create value for companies. In short, design thinking converts need into demand. Think about how this approach could impact your organization.

Diagram 2

The Knowledge Funnel is a three-stage model (see diagram 2) for advancing knowledge from mystery to algorithm. The funnel:

1.) Creates value in the form of better efficiency.

2.) Requires continuous exploration and exploitation of knowledge.

3.) Functions especially well in design-thinking-oriented communities.

This form of thinking is rooted in how knowledge advances from one stage to another—from mystery (something we can’t explain) to heuristic (a rule of thumb that guides us toward solutions) to algorithm (a predictable formula for producing an answer) to code (when the formula becomes so predictable it can be fully automated). As knowledge advances across the stages, productivity grows and costs drop, creating massive value for companies.

The mystery stage comprises the exploration of the problem. We discover disparate ideas and concepts. At the heuristic stage a rule of thumb is generated to narrow work to a manageable size. We experiment and start to define and develop models. Your intellectual capital and advantage lie here. In the algorithm stage the general heuristic is converted to a fixed formula, taking the problem from complexity to simplicity. We deploy systems and procedures. Once systemized, it is possible that the algorithms can be commoditized.
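A small Python sketch of the heuristic-to-algorithm step: a rule of thumb that lives in an analyst’s head is hardened into a fixed, repeatable formula that could then be systemized. Both the rule of thumb and the 28 percent debt-to-income cap are illustrative assumptions, not figures from the book.

```python
# Heuristic: an experienced underwriter's rule of thumb (invented example).
def heuristic_max_loan(annual_income: float) -> float:
    """Rough guidance: 'about three to four times income, use judgment.'"""
    return 3.5 * annual_income  # the judgment call lives in the analyst's head

# Algorithm: the same knowledge hardened into a fixed, repeatable formula
# (a simple debt-to-income cap; the 28% figure is illustrative only).
def algorithmic_max_loan(annual_income: float, rate: float, years: int = 30,
                         dti_cap: float = 0.28) -> float:
    """Maximum loan whose monthly payment stays under the DTI cap."""
    monthly_rate = rate / 12
    months = years * 12
    max_payment = dti_cap * annual_income / 12
    return max_payment * (1 - (1 + monthly_rate) ** -months) / monthly_rate

print(round(heuristic_max_loan(80_000)))           # 280000
print(round(algorithmic_max_loan(80_000, 0.045)))  # payment-based cap
```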

There is no better time than now to take a step back and evaluate your IT infrastructure and process flow.

In today’s digital world, business users need easier ways to explore data, uncover new insights and make informed decisions instantly from any device. However, achieving this can be difficult for many organizations given the complexities of outdated IT infrastructures. (PwC)

I will leave you with this final thought, “No new ideas, concepts or innovations can be formed from past information, data or knowledge – all new ideas can be validated only through unfolding the future events.”

About The Author

Roger Gudobba
Roger Gudobba is passionate about the importance of quality data and its role in improving the mortgage process. He is vice president, mortgage markets at Compliance Systems and chief executive officer at PROGRESS in Lending Association. Roger has over 30 years of mortgage experience and has been an active participant in the Mortgage Industry Standards Maintenance Organization (MISMO) for 17 years. He was a Mortgage Banking Technology All-Star in 2005. He was the recipient of Mortgage Technology Magazine’s Steve Fraser Visionary Award in 2004 and the Lasting Impact Award in 2008. Roger can be reached at rgudobba@compliancesystems.com.

Data Integration Streamlines TRID Compliance

eLynx has completed an integration with Landtech Data, a provider of systems used by real estate settlement services providers.  Landtech Data is also noted for creating the first real estate settlement system for desktop computers in 1981. The integration enables lenders utilizing eLynx’s Expedite ID compliance solution to exchange property, fee and loan data electronically with the 3,000 settlement service providers in 47 states using Landtech Data’s XML Real Estate Settlement System. This bi-directional exchange of data simplifies the collaboration required for lenders to generate the Closing Disclosure mandated by the TILA-RESPA Integrated Disclosure rules (TRID).

LandTech users can now save significant time and effort by directly accessing eLynx’s Electronic Closing Network (eCN) from their desktops, eliminating the need to sign into the system separately. They can also receive lender fee data and send back settlement agent fees to the lender from within LandTech’s production solutions, further streamlining the process.

John Ralston, director of Title Services at eLynx, explained: “Direct integration is far and away the fastest, most reliable and most accurate option for lender-settlement agent collaboration and eliminates a major obstacle lenders face in complying with TRID rules: the timely sharing of data between lenders and settlement agents.”

Landtech Data’s Director of Sales and Marketing Benjamin Bell noted, “With this integration, settlement agents using Landtech Data’s XML Real Estate Settlement System can automate collaboration with lenders and help them meet their new obligations under the TRID regulations.”

“By seamlessly sharing the data required to comply with TRID, lenders and settlement service providers can increase accuracy as well as reduce the cycle times, potential closing delays and costs associated with consumer disclosures,” said Ralston. “It virtually eliminates the need for manual data entry and the manual exchange of data. eLynx is extending this option to the entire eLynx registry of U.S. settlement service providers through integrations with Landtech Data and other title production systems,” he said.

About The Author

Tony Garritano
Tony Garritano is chairman and founder at PROGRESS in Lending Association. As a speaker Tony has worked hard to inform executives about how technology should be a tool used to further business objectives. For over 10 years he has worked as a journalist, researcher and speaker in the mortgage technology space. Starting this association was the next step for someone like Tony, who has dedicated his career to providing mortgage executives with the information needed to make informed technology decisions. He can be reached via e-mail at tony@progressinlending.com.

The Evolution Of Data

Our mortgage industry is arguably the most data-rich industry, but unfortunately, we struggle with how to use this data effectively. For example, one mid-sized servicer revealed that its staff spends upwards of three full work days a month manually extracting basic data from its portfolio, which is a significant amount of time and resources for little reward. So why do today’s lenders and servicers struggle with harvesting the enormous amount of borrower and collateral data in their portfolios? One major contributor to this pain point is the fact that the mortgage industry’s technology platforms were designed by function rather than purpose.

The majority of mortgage technology platforms available in the market are built on legacy systems that skipped generations of functionality and, therefore, do not handle all the needs of today’s lenders and servicers, especially when it comes to data. Unlike many other industries, mortgage technology platforms have been updated and then pieced together again and again, versus just being replaced by entirely new technologies. As a result, these systems silo data, making it nearly impossible to communicate that data across different systems and serve stakeholders beyond primary business functions. The mortgage industry also faces the challenge of constantly changing regulatory requirements, which can slow technology innovation.

Millennials represent a huge opportunity for our industry – $1.3 trillion of potential mortgage originations over the next few years. Given the Millennial generation is the largest pool of potential homebuyers in history, the mortgage industry should take note from other industries and find a better way to convert this data into useful output to help improve profitability, efficiency, compliance and transparency. However, this does not necessarily require a complete technology replacement. Take for instance the airline industry – it is one of the few industries to still use dot matrix printers, but these printers work in tandem with advanced self-service technologies to efficiently accommodate an average of 100,000 flights per day in the U.S. This is proof that legacy and new systems can coexist and still progress to the benefit of consumers.

The mortgage industry needs agnostic tools connecting valuable data from disparate, legacy systems to build transparency across the board. The idea is not to transfer data, but to connect and present it in a digestible format, giving institutions the ability to slice and dice the view of that data and better model “what if” scenarios. To be effective, this tool must not only be enterprise-wide, but also drill down to cover specific functions within different business lines. Understanding performance and compliance data at the transactional level empowers users to make corrections in real time versus after the fact. It is also critical that this tool be accessible to and consumable by all involved parties, including servicers, lenders, investors and regulators.

Complete data transparency will enable lenders and servicers to identify performance indicators, operational risk and non-compliance quickly and ultimately make faster, better-informed decisions. Automating data analysis and reporting can also eliminate the need for at least one IT resource, which translates into approximately $100,000 in cost savings per year, depending on the size of the institution.

Additionally, this level of insight will reduce the need for manual quality control (QC) processes. For example, traditionally, servicers manually manage QC for 10 percent of their servicing book on a quarterly or monthly basis to monitor compliance. The ability to view loan-level performance and transactional compliance data throughout the process on 100 percent of the servicing book mitigates risk compared to a 10 percent sample with quarterly reviews. This saves an additional $75,000 or more for a small servicer with just 10,000 loans in the portfolio, translating, together with the IT savings above, into roughly $17.50 per loan out of a reported total of $220 per loan in servicing costs. That alone could wipe out roughly eight percent of the industry’s cost to service a loan.
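The per-loan arithmetic appears to work out as follows, assuming the $17.50 combines the roughly $100,000 in automation savings and the $75,000 in QC savings across a 10,000-loan portfolio; this is one reading of the numbers above, not the author’s stated model.

```python
loans = 10_000
it_savings = 100_000        # automated analysis/reporting, per the text above
qc_savings = 75_000         # 100% automated review vs. a 10% manual sample

per_loan = (it_savings + qc_savings) / loans   # dollars saved per loan
share_of_cost = per_loan / 220                 # of ~$220 per-loan servicing cost

print(per_loan)                 # 17.5
print(round(share_of_cost, 3))  # ~0.08, i.e. roughly eight percent
```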

Our industry has made great strides in advancing data utilization. We have gone through the stages of operational data, canned reporting views and data dashboards. It is time to generate insight that enables better management for our collective future. Insight is the last stop on the evolution of mortgage data.

About The Author

Paul Imura
Paul Imura is chief marketing officer and senior executive for ISGN, an end-to-end provider of mortgage technology solutions and services. For more information about ISGN’s servicing and default solutions, visit www.isgn.com.
