A Bright Future Lies Ahead


The term “loan origination system,” or LOS, is bandied about a lot in mortgage lending. The LOS is the core system of record that lenders rely on. However, LOS companies are not known for innovation and are often blamed for holding lenders back from innovating themselves. That’s what has made Mortgage Cadence so successful: the company has always managed to stay ahead of the curve.

For example, the company recently made news when it transitioned all of its clients – more than 600 mortgage lenders across the U.S. – to the Accenture Mortgage Cadence Cloud to better manage loan processing cycle times, increase system reliability and leverage product upgrades. Brad Thompson, the current Mortgage Cadence Technology Lead at Accenture Credit Services, and Paul Wetzel, the Product Management Lead for Mortgage Cadence, talked with our editor about their careers, what’s next for Mortgage Cadence, and what lenders need to do in order to remain competitive these days. Here’s what they said:

Q: I understand you worked for Accenture in the early stages of your career. How has your experience been different this time around? Are there similarities?

PAUL WETZEL: Yes, I worked for Accenture – then Andersen Consulting – for my first 9 years out of college. I spent my full tenure working on airline software. First – as everyone did – in a coding role, then finishing up in a business development role. At the time, that project was a bit of an anomaly for Accenture, whereas software is a key area of focus for the firm now. One thing that remains the same is the very high caliber of people I work with. Then and now, Accenture is able to attract and retain the best available talent, and I very much enjoy the opportunity to work with them in serving our customers.

Q: How do you see the industry evolving over time?

PAUL WETZEL: First, the borrower experience will be revolutionized by 2020. This will include the time involved and process to apply, how updates are received and how the borrower interacts with the lender. Heavy emphasis on digital and ease-of-use. Forward-thinking lenders and technology partners will lead the way.

Second, top 20 lenders will continue switching from custom or quasi-custom in-house loan origination solutions to partnering with reliable, SaaS-based loan origination solution providers. The best technology partners continue to improve their offerings, while lenders are beginning to realize that risks are too great to remain on a one-off platform. Their loan origination process is better managed with an origination platform partner that can remain current, compliant, and secure, allowing them to focus on their business.

Third, by 2020 mortgages will be broadly bought and sold more easily in a purely digital way, reducing needless cost and risk for both buyers and sellers. The need is there, and the technology is there. Look for further traction from the industry in the coming years.

BRAD THOMPSON: First, I believe that with all of the recent regulations and the imminence of more in the near future, lenders are going to look for new ways to reduce their cost to originate in order to remain competitive. To do this, lenders will turn to their partners to deliver innovative and disruptive solutions. This will also allow them to capture the Millennial generation, who will become the predominant target market in very short order.

Second, with more regulations in store, and new generations requiring transparency and a simple origination experience, lenders will find themselves needing to offload the risks and the costs associated with mortgage lending. I’ve recently joined Accenture Credit Services to assist in their BPO (Business Process Outsourcing) practice. We plan to closely align this business with Mortgage Cadence in order to provide lenders with a simpler origination process, all while providing the very best mortgage technology and compliance support the industry has to offer.

Finally, I believe that the mortgage industry is poised to look more like a manufacturing business than a finance business. Quality will be derived through the automation of many currently manual processes. Ultimately, the fact remains: the less a loan is touched by people, the higher the loan quality and consistency will be.

Q: You are a long-time Mortgage Cadence employee, Brad, having gotten your start with the company nearly a decade ago. What has this journey been like for you?

BRAD THOMPSON: It’s been an exceptional journey watching Mortgage Cadence grow from a small technology firm to becoming one of the leading technology platforms in the industry. We’ve seen the industry go through some very prosperous and traumatic times that no one was immune from. Despite it all, Mortgage Cadence has always been true to its core values and mission, and I believe that is what allowed us to emerge on the other side as the premier mortgage technology provider and trusted partner of many of the top lenders in the space.

Q: What should lenders be thinking about now in order to prepare for the future lending landscape?

BRAD THOMPSON: Lenders should be looking at how their loans are originated today. Simplifying these processes will help future borrowers obtain a mortgage in the fastest and easiest way possible. The Millennial generation will expect this from their lender and will have very little patience for those that don’t simplify the process and give them complete transparency into how their loan is being originated.

Q: What is the single greatest threat for lenders in coming years?

PAUL WETZEL: More TRID-like waves of compliance could increase the cost and risk of doing business to new levels. There is always opportunity in this, however, on both the lender and vendor side. The best prepared will survive – and thrive.

BRAD THOMPSON: I agree with Paul. The cost to both technology providers and lenders was immense in order to roll out TRID. Very few technology companies or lenders will have the wherewithal to withstand more major regulatory changes in the near future. Flexibility in processes and technology is certainly the first step. Being prepared and forward-looking is the second.

Q: With that being said, what is the single greatest opportunity for lenders in the coming years?

PAUL WETZEL: To use technology as a true competitive differentiator, allowing lenders to scale and become more efficient, exceeding their own customers’ expectations in the process.

Q: What does the future hold for Mortgage Cadence?

PAUL WETZEL: We have seen tremendous success over the last few years with Accenture, which has granted us an incredible amount of autonomy. This, in large part, has continued to allow us to focus solely on providing the very best mortgage technology the industry has to offer while still being able to bring all the capabilities of Accenture to bear whenever appropriate in serving our customers. As a result, Mortgage Cadence is entering a new era. This is an exciting time for the company and presents a world of new possibilities. By 2020, the lending landscape will look entirely different. Today, we’re making the strides necessary to stay ahead of the industry.

BRAD THOMPSON: Paul said it best. While I am moving on to new opportunities within Accenture, I will continue to stay aligned with Mortgage Cadence to ensure we are delivering the most complete mortgage technology and services possible. As my successor, Katharine Loveland will take over as Client Services Lead. Her expertise will be invaluable to the company’s future success. Over the past three years, Katharine has led a dynamic team of Strategic Engagement Managers, defining the department and working closely with Professional Services to develop new and repeatable processes to streamline customer implementations and engagements. As a company, Mortgage Cadence is absolutely poised and ready to lead the way into this new era of mortgage lending.


Paul Wetzel is the Product Management Lead for Mortgage Cadence. Paul has led Product Development and Product Management activities through most of his software career, the last 10 years serving the Financial Services industry. In his current role, Paul manages both customer and industry requirements to drive product enhancements while also ensuring Accenture Mortgage Cadence leads the way in innovative loan origination technology. Through collaboration and true partnership, Paul and the entire Accenture Mortgage Cadence team are committed to providing the last lending technology customers will ever need.


Paul Wetzel thinks:

  1. The borrower experience will be revolutionized by 2020.
  2. The top 20 lenders will continue switching from custom or quasi-custom in-house LOS to partnering with reliable, SaaS-based LOS providers.
  3. By 2020 mortgages will be broadly bought and sold more easily in a purely digital way.


Brad Thompson, current Mortgage Cadence Technology Lead at Accenture Credit Services, is responsible for assisting Accenture Credit Services in providing their clients with powerful, technology-based BPO offerings leveraging the Mortgage Cadence suite of products. Over his ten years with Mortgage Cadence, Brad was a key contributor to the company’s success, assisting in go-to-market strategies, bringing key market data back into the organization to continually shape the product suite, and maturing the implementation methodology while growing the professional services team into the highly successful unit it is today.


Brad Thompson thinks:

  1. With all of the recent regulations and imminence of more in the near future, lenders are going to look for new ways to reduce their cost to originate.
  2. With more regulations in-store, and new generations requiring transparency and a simple origination experience, lenders will find themselves needing to offload the risks and the costs associated with mortgage lending.
  3. I believe that the mortgage industry is poised to look more like a manufacturing business than a finance business.

Breaking Through To Millennials


In today’s ultracompetitive mortgage market lenders of all sizes are realizing the need to connect with millennial borrowers. To do so, it is critical to understand who these potential borrowers are and the best ways to market to them.

I recently listened to a webinar that provided great insight into this topic. The webinar, “Marketing to Millennials: Essential Strategies for B2B and B2C,” was presented by Trish Witkowski of Foldfactory with Steve Belmonte of Accuzip.

Here are some of the key insights they shared. They started with who Millennials are and how big an opportunity this demographic represents.

Who are the Millennials?

  • a.k.a. “Gen Y”
  • Born 1980-2000 (15-35 years old)
  • Gen Y: 87+ million
  • Gen X: 69 million
  • Baby Boomers: 78 million

Gen Y will be the largest and most powerful group of consumers this nation has ever seen, and its members are already consuming at 500% of their parents’ level in age-adjusted dollars. (Kenneth Gronbach, The Age Curve)

Millennials are digital natives: they have grown up in digital environments, while the rest of us are digital pioneers. Only 6% think online advertising is credible, and 95% see friends as their most believable source of product information. (SocialChorus)

The topic of marketing to Millennials is not a new one, but everyone wants to talk about what the big players are doing. The challenge when looking at the big players is the resources they have at their disposal and understanding how to apply their approach to your business. Look at Coca-Cola: its marketing budget is just under $3 billion. Nike spends $2.4 billion on marketing.

The good news is that budget aside, the same principles that these companies use to market to Millennials can apply to your business.

Here are ten principles of Marketing to Millennials (M2M):

  1. Authenticity– Are you authentic? So you have to ask yourself…
    1. Are we who we say we are?
    2. Are we transparent?
    3. Do we accept responsibility?
    4. Do we share our passion?
    5. Are we consistent?
    6. Do we care?
  2. Accessibility– 70% of Twitter users surveyed expect a response from brands they reach out to on Twitter (Lithium Technologies); 53% want that response in under an hour, and 72% want it in under an hour when they have a complaint. The study noted that a negative brand experience can quickly turn positive if the response comes within the hour. Millennials want you to acknowledge them and let them know you are working to address their problem or concern.
  3. The Human Touch– Brands are moving from intellectual connections to emotional connections. The ability to have a human touch is really important. It’s all about creating the brand connections that build brand loyalty. 95% of all decisions are based on emotion. Think about it: you buy from people you like, and you buy products that you love. It’s about creating the relationship.
  4. Consciousness– Over 85% of Millennials correlate their purchasing decisions to the responsible efforts a company is making. (SquareSpace) Is your company making responsible efforts?

An example of a company applying this principle is Life is Good. The last page of their catalog is all about growing the good, giving back, and cause marketing. Cause marketing is when a for-profit business partners with a non-profit organization for mutual benefit. Cause marketing tips: be authentic, make a connection, work it into corporate culture, and don’t just write a check.

  5. Personalization– Personalized marketing communications stand out. Advancements in print media have taken the cost and complexity out of versioning, allowing companies of all sizes to do amazing things that really stand out.
  6. Speed– Millennials have a great need for speed. As a company, you need to think about the load speed of your web pages, quick clicks, a fast checkout process, and fast responses. Millennials don’t have a lot of patience for waiting.
  7. Technology Integration– In general, Millennials expect a technology-integrated brand experience. We now spend more than 8 hours a day consuming media (ZenithOptiMedia, 2015), and so much of that is digital: our phones, tablets, laptops, podcasts, etc. Print and the tangible experience therefore have the opportunity to make a huge impact. Think print + technology (email, PURLs, QR codes, etc.).
  8. Social Media– Millennials are very social. Social media tips: no blatant sales pitches, keep it short, use good social media manners, interact with them, create opportunities to share, and have a sense of humor. 41% of Millennials said the main reason they abandon content is that it’s too long (NewsCred), so get to the point before they abandon the content.
  9. Rewards– Millennials love loyalty programs. 68% of 20-34 year olds said they would change where they shopped if it meant getting more program rewards. (Bond Brand Loyalty) One third of those surveyed reported buying something they didn’t need or want just to earn points or increase membership status. (Bond Brand Loyalty)
  10. Ease of Use– There should be no barriers to entry in the brand experience: easy access to the website, mobile friendly, etc.

Reach Millennials on their mobile devices. Think about how to get them on their mobile devices. They are always on their mobile devices. Use print to enhance the digital experience.

B2B – 46% of potential buyers researching B2B products are Millennials today, up from 27% in 2012. (Google and Millward Brown Digital) By 2020, the U.S. workforce will be 50% Millennials. (Millennials as B2B Buyers) The key takeaway: the principles are the same.

So where should B2B marketers focus their efforts? Definitely mobile. 42% of researchers use a mobile device during the B2B purchasing process (Google and Millward Brown Digital), and that trend is only going up.

The “Mobile First” Strategy: Responsive web design, intuitive navigation, speed, ease of use, and integration of video. 70% of professionals researching B2B products and services now use video across the purchase path. (Google and Millward Brown Digital) Integrating video is a big part of the purchase process. Get your clients to take video of themselves using your product and post it to your website.

Millennials preferred attributes for doing business: ease of doing business, willingness to collaborate, and industry/marketplace expertise. (IBM Institute for Business Value) Millennials rely on analytics…and the opinions of others.

70% of B2B decision-makers today begin their research through a generic online search. (Google and Millward Brown Digital) Millennials are open-minded. Whether they are spending their own money or someone else’s, Millennials expect the best value. So what works? A stand-out format, personalized content, a targeted audience – and it must be memorable.

The key to breaking through to Millennials is understanding who they are and how they prefer to be marketed to. What works? Creative and interactive format, integration of mobile technology, combination of print and digital, integration of video, and simplicity of concept.

This is the digital generation who demands personalization, ease of use, and the ability to interact with your brand where and when they want. The combination of digital and print will help you stand out to the Millennial crowd.

The Future Of Compliant Doc Prep


The debate between synchronous and asynchronous integrations is a valid one; there are advantages and disadvantages to both. However, without understanding the context, the argument cannot be accurately made. In the case of document preparation, particularly with regard to the level of compliance validation and service necessary for lenders today, asynchronous integrations to a loan origination system (LOS) are and should be a requirement. They allow the most flexibility with regard to the processing time needed to create the required customized, compliant documentation. In this article, we will examine and establish a basis for this argument.

It is important to appreciate the distinction between a synchronous and an asynchronous integration in connection with the movement of data to accomplish the preparation of various products and programs for mortgage disclosure and closing documents. Synchronous integration simply implies that a service on system “A” (in this case, a LOS) sends a message containing data (e.g., a MISMO 3.3 data file) to a service on system “B” (a document preparation system). Then, system “B” responds a few moments later with the document package requested. This is a single, electronic transaction containing a request followed by a response.
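As a rough illustration, this single-transaction exchange can be sketched as one blocking call. The function names and payload fields below are hypothetical, invented for this sketch, and not drawn from any actual LOS or document preparation API:

```python
import time

def prepare_documents(mismo_data: dict) -> dict:
    """Stand-in for the doc-prep service (system "B"): validates the data
    and returns the document package within the same call."""
    time.sleep(0.01)  # represents validation, calculations, compliance checks
    return {"status": "complete", "package": f"docs-for-{mismo_data['loan_id']}"}

def request_documents_sync(mismo_data: dict) -> dict:
    """System "A" (the LOS) blocks until system "B" responds --
    one request, one response, a single transaction."""
    return prepare_documents(mismo_data)

package = request_documents_sync({"loan_id": "12345"})
print(package["status"])  # prints "complete"
```

The LOS thread does nothing else while it waits, which is exactly why processing time becomes the critical constraint in this model.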

Asynchronous integration, on the other hand, contains multiple electronic transactions. For example, an initial transaction may simply be a service on system “A” sending a message containing data (e.g., the same MISMO 3.3 data file) to a service on system “B”, and system “B” responds that the data was received and processing has begun. Then, within a certain timeframe depending upon what action is necessary (to be explained later in this article), a follow-up electronic transaction from system “B” sends a message containing the document package to a service on system “A”, and system “A” responds that the document package was received. This would be an example of a “push” transaction, where system “B” is pushing the document package back to a service on system “A”. Likewise, certain document preparation systems will also support “pull” transactions, where system “A” can send a status request to system “B”: if the document is ready at that time, the document will be provided; otherwise, a status of “pending” will be returned. The purpose is to provide a certain level of flexibility. All clients have differing requirements, and as we have seen over the past few years, those requirements change. Without that flexibility going into a relationship with a client, it is much more difficult to provide the services required.
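To make the multi-transaction flow concrete, here is a minimal sketch of the “pull” variant, with a background thread standing in for system “B”’s processing. All class, method, and field names are invented for illustration, not taken from any real doc-prep product:

```python
import threading
import time

class DocPrepService:
    """Hypothetical asynchronous doc-prep service (system "B"). It accepts
    a request immediately, processes in the background, and answers
    status ("pull") queries until the package is ready."""

    def __init__(self):
        self._results = {}
        self._lock = threading.Lock()

    def submit(self, loan_id: str, mismo_data: dict) -> dict:
        with self._lock:
            self._results[loan_id] = None  # mark as pending
        threading.Thread(target=self._process, args=(loan_id, mismo_data)).start()
        return {"status": "received"}  # transaction 1: acknowledge only

    def _process(self, loan_id: str, mismo_data: dict) -> None:
        time.sleep(0.05)  # validation, fee calculations, compliance tests
        with self._lock:
            self._results[loan_id] = {"package": f"docs-for-{loan_id}"}

    def status(self, loan_id: str) -> dict:
        with self._lock:
            result = self._results.get(loan_id)
        return {"status": "ready", **result} if result else {"status": "pending"}

service = DocPrepService()
service.submit("12345", {"note_rate": 4.25})  # control returns immediately

# "Pull" transactions: system "A" polls until the package is ready.
while (resp := service.status("12345"))["status"] == "pending":
    time.sleep(0.01)  # the LOS is free to do other work between polls
print(resp["package"])  # prints "docs-for-12345"
```

The essential point is the decoupling: the submit call returns at once, and the document package arrives in a later, separate exchange.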

Document preparation is all about the validation of data and the creation of compliant loan documentation. An asynchronous system is designed to accept package requests in the form of various data file formats, such as MISMO 2.6 and 3.3, along with other file types. Once the data is received, an asynchronous system goes about its business: validating the data, running fee calculations, running the necessary compliance tests and checks, building the lender-specific, customized, and tailored dynamic document package set mandated by that validated data, and delivering the resulting document package.

It is not unusual for a document system to delay processing due to an issue with the data or per specific legal requirements. Flags and triggers are set within a smart document preparation system to handle these legal interventions and exceptions so that an attorney or compliance officer can act rapidly from a legal aspect if necessary. If no legal interaction is required, the package is then delivered.

One example of this is in the case of Texas loans that require attorney intervention for title review. Other examples would be loans involving a POA (power of attorney), corporations or LLCs, trusts, and so on. In order to deliver the compliant product clients want and expect, sophisticated document systems are designed to handle any loan scenario that might occur, and sometimes that means extra data processing or legal intervention. While the goal is to deliver documents back to the LOS as quickly as possible, that should never come at the risk of delivering documents that did not pass through all appropriate system checks and balances.

In some document preparation systems, synchronous integration forces the document provider to return documents within a few seconds, or the process on system “A” will give up, or time out. That timeout forces the user to resubmit the document request a second time or, in the worst-case scenario, documents are presented back to the user that are incorrect or out of compliance.
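This timeout failure mode can be illustrated with a deliberately slow service. The 0.1-second window below stands in for system “A”’s patience; in practice the window might be a few seconds, and all names here are hypothetical:

```python
import concurrent.futures
import time

def slow_doc_service(payload: dict) -> dict:
    """Stand-in for system "B" whose compliance checks run longer
    than the synchronous caller is willing to wait."""
    time.sleep(0.5)
    return {"status": "complete"}

pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
future = pool.submit(slow_doc_service, {"loan_id": "12345"})
try:
    result = future.result(timeout=0.1)  # system "A" waits only briefly
except concurrent.futures.TimeoutError:
    # The LOS gives up; the user must resubmit, and any work system "B"
    # has already done on this request is wasted.
    result = {"status": "timeout"}
pool.shutdown(wait=False)
print(result["status"])  # prints "timeout"
```

Note that nothing in this model lets system “B” say “still working, check back shortly” – which is precisely the gap the asynchronous status transaction fills.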

From a LOS development standpoint, synchronous integration is the simplest form of integration that could be written, but it is not necessarily the best scenario for a lender and can yield documents that are fraught with mistakes upon final delivery. In an asynchronous integration, the user submits the document request and can receive a notification that the document preparation system has received it. Then, after the system runs compliance tests, validates the data, performs calculations, and determines whether the document package needs legal intervention or has errors to cure, the documents will appear. That time delay is typically less than two minutes, but it may be longer if something necessitates extra data processing or attorney intervention.

These two approaches may also affect a user’s workflow. If the workflow is designed to do one loan at a time, then the potential delays that are part of an asynchronous integration will be more evident. Whether the delay is disruptive is something to be determined; if it is, the workflow may have to be modified. A detailed discussion with the lender regarding workflow and processes at the early stages of the integration and implementation phase is recommended to achieve maximum efficiency and 100% accurate compliance results every time.

Beyond workflow, there are also service level considerations. Recovery from many kinds of system issues, on either side, is far more tolerant in an asynchronous environment than in a synchronous one. Once an asynchronous system acknowledges receipt of the request, control of system “A” is returned to the user. System “A” and system “B” can then process independently. Many things are happening in system “B” to ensure compliance. Once the document set is created, system “B” will continuously try to return the document set to system “A” until system “A” responds that it has accepted the document set, thus ending the entire transaction. In a synchronous system, any failure, or extended delay, from either end causes the entire transaction to fail.
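The retry-until-acknowledged behavior on the “push” side can be sketched as a small delivery loop. This is an illustrative fragment only – the delivery callback, exception choice, and attempt limit are all assumptions, not details of any real system:

```python
def push_with_retry(deliver, package: dict, max_attempts: int = 5) -> int:
    """Hypothetical "push" delivery: system "B" keeps re-sending the finished
    document set until system "A" acknowledges acceptance. Returns the
    attempt number on which delivery succeeded."""
    for attempt in range(1, max_attempts + 1):
        try:
            if deliver(package):  # system "A" returns True once it accepts
                return attempt
        except ConnectionError:
            pass  # system "A" is unavailable; the transaction survives the outage
    raise RuntimeError("delivery never acknowledged")

# A system "A" that fails twice before accepting -- the asynchronous
# hand-off tolerates the outage instead of failing the whole transaction.
attempts_seen = {"n": 0}

def flaky_los(package: dict) -> bool:
    attempts_seen["n"] += 1
    if attempts_seen["n"] < 3:
        raise ConnectionError
    return True

print(push_with_retry(flaky_los, {"loan_id": "12345"}))  # prints 3
```

In a synchronous exchange, either of those two failed attempts would have killed the transaction outright; here they merely delay it.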

Many LOS providers support asynchronous integrations because they see the value this type of integration brings not only to their clients, but to the industry as well. Others have modified their systems, or are in the process of modifying their systems, to support the technology. The development effort for a LOS project team to move toward providing an asynchronous option within their environment is not a simple proposition, but neither is it as daunting a task as some fear. The difference is how the results of the request are displayed. Remember, in an asynchronous environment, the documents are not returned as part of the “request” transaction. Instead, the user sees a notification that the documents have been ordered; behind the scenes, the system validates the data, performs the calculations, runs the necessary compliance tests, checks whether any legal or attorney intervention is necessary, and then displays and/or sends the 100% compliant results.

In certain circumstances, a report showing a list of errors that are preventing the documents from being ordered can be displayed. If the documents were successfully ordered, they will be made available in very short order. The user simply presses a button within the LOS to check the status of the documents in the document preparation system. If the documents are available, they are immediately retrieved; if not, another “pending” status is displayed. Upon a successful retrieval, various status information about the document request is updated within the LOS, and the user moves on to the next task. This would be an example of a manual “pull” transaction as described earlier in this article. “Pull” transactions are the most popular method of asynchronous integration among the LOS providers supporting this approach; however, “push” transactions are easily managed as well.

The legal compliance world is changing. Flexibility is the key to being able to adapt to those changes quickly and efficiently. Asynchronous systems were developed to maintain that flexibility and adaptability. Clients who benefit from this type of interface have come to expect a higher level of detail in their software systems, along with the ability to quickly satisfy any requirement from any source along the way. An LOS integration that invokes an essentially instantaneous “timeout” in all “documents ordered” transactions works against the core value proposition that a more robust asynchronous integration offers its clients.


Homeowners Are Getting Optimistic Again


Nearly half (46%) of all U.S. homeowners with a mortgage expect their equity will increase in 2016, even though three out of five (60%) report equity in their homes has already increased during the last three years of the housing recovery, according to new research conducted for loanDepot.

Of those who expect their equity to change this year, 85 percent expect it to rise as much as 10 percent, with more than a quarter (27%) expecting it to rise between 6 and 10 percent. More than half (58%) are expecting an equity bump between one and five percent. Industry-wide reports forecast 2016 annual price gains to range between 2.3 and 4.7 percent. Only 3 percent of homeowners expect their equity to fall in 2016, and 27 percent expect it to remain the same.

More than 100 U.S. housing experts forecast home values will reach an average annual growth rate of 3.65 percent through the end of 2016. Today, more than 49 million homeowners – or 66 percent of all homeowners – hold a mortgage on their home.

The loanDepot research also found that while 57 percent of homeowners believe their home’s value has appreciated in the past three years, the majority (80%) underestimate the amount of value their home has gained throughout the housing recovery. Of those who believe their home’s value has increased since 2013, one in four (27%) believe it increased between one and five percent. The Case-Shiller 20-city index shows prices rose twice that much – 10 percent from November 2013 to November 2015.

“Homeowners who bought during the housing boom are regaining equity many thought was lost forever, yet too many are not aware of the equity they have gained or they are unclear about how to determine changes in their equity,” said Bryan Sullivan, chief financial officer of loanDepot, LLC. “People who bought after the housing boom when prices were low are realizing homeownership can be a great investment and an asset that they can now leverage through equity to realize many dreams. Whether they choose to leverage their home equity now or reserve it for future needs, millions of homeowners have choices today not available just a few years ago.”


Getting Content Marketing Right


In a white paper written by HubSpot and Outbrain, the authors argue that content marketing is the new king of the digital marketing world. Marketers of all stripes are embracing non-interruptive inbound marketing. Why? Because it lets them build rapport with their prospects and customers organically. Content marketing’s power to connect with and grow engaged audiences is huge for one reason: people like it. It puts customers first. Instead of bothering them with ads, it supplies them with genuinely useful, entertaining, and interesting content.

The thing about content marketing is that to do it right, you need a plan. To make that plan, you have to really understand what drives readers to click on, read, share, or interact with different kinds of content. You also need to have clear and explicit business goals for a particular piece of content before creating it.

Generally speaking, there are three kinds of goals marketers have for content: traffic, engagement, and conversion. Each goal has its own metric for success, and each does something different for your organization.

In crafting your content marketing, everything starts with a good headline or title. Your headline or title is often a reader’s first interaction with your brand online, so it’s arguably the most critical component of your content marketing strategy. It’s your first, and maybe only, chance to grab your target audience’s attention.

Your reader has complete control—they either choose to click and engage with your content or they don’t. It’s that simple. Interesting and relevant stories are important, but if your headline doesn’t communicate value to the reader, nobody’s going to see them. You can (and should!) always optimize and try again, but every dud headline costs you crucial opportunities to reach your audience.

The most important headline rule is: respect the reader experience. In this era of clickbait (eye-catching content whose main purpose is to attract attention), it’s more important than ever to write a headline that delivers on its promise. At the most basic level, you want the reader to have a good experience with your brand. When she clicks on a link, you’ve got to be sure she’s getting what she expected and not being duped in some sort of digital shell-game. Once you lose that trust, it’s gone.

And let’s face it, clicks matter. Regardless of what your strategic goals are for your content marketing, without that first click, you’ve got nothing. If you’re trying to grow your traffic, CTR is the metric you want to be looking at. So what increases CTR and what hurts it?

>> When used in a headline, the words “photo” and “who” increase CTR, whereas the words “easy,” “how to,” “credit,” “cure,” “magic,” and “free” decrease CTR.

>> Making references to the reader by using the words “you,” “your,” or “you’re” in the headline decreases CTR.

>> Including positive superlatives (“best”, “always”) in headlines decreases CTR.

>> Headlines generate the highest level of engagement at moderate lengths (81-100 characters).

>> Bracketed clarifications (e.g., [Infographic]), which identify the type of content behind the headline, increase CTR when included in headlines.

>> When used in the headline, the words “simple,” “tip,” “trick,” “amazing,” and “secret” decrease CTR.

>> Using words that convey a sense of urgency (e.g., “need,” “now”) in the headline decreases CTR.
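The percentage comparisons behind findings like these rest on a simple calculation; here is a minimal sketch (the counts are illustrative, not from the study):

```python
# Illustrative sketch of how CTR and CTR lift are computed.
# The click and impression counts below are made up for the example.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: the share of impressions that led to a click."""
    return clicks / impressions

def lift(variant: float, baseline: float) -> float:
    """Relative CTR change of a headline variant vs. a baseline."""
    return (variant - baseline) / baseline

with_who = ctr(122, 10_000)   # headlines containing "who"
without  = ctr(100, 10_000)   # headlines without it

print(f"{lift(with_who, without):+.0%}")  # +22%
```

Running the same comparison across thousands of headlines is what produces figures like the 22% gain for “who” or the 37% drop for “why” discussed below.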

People care about the Whos, not the Whys. Headlines that included the word “who” generated a 22% higher CTR than headlines without the word “who.” “Why,” on the other hand, decreased CTR by 37%. When it comes to intriguing readers with your headlines, focus on who not why.

People want to be shown things. Headlines featuring the word “photo(s)” performed 37% better than headlines without this word, a margin even larger than we’ve found previously (29% increase among 2013 headlines).

Headlines with bracketed clarifications (e.g., [photos], [interview], [slideshow], etc.) performed 38% better than headlines without clarifications, suggesting readers are more likely to click when they have a clear picture of what lies behind the headline.

Similarly, certain words and headline concepts stand out as things people just don’t like to click. One consideration in this category is saturation—once a certain kind of headline becomes popular among clickbaiters, readers no longer trust the keywords in that headline. Because of that, the “bad words” are more likely to evolve over time.

People don’t want instructions. Headlines containing the phrase “how to” performed 49% worse than headlines without this phrase, showing that reader aversion to this phrase has not dissipated much since the 46% decrease we saw in 2013. This year we found that another instruction-oriented word, “tip,” also decreased CTR by 59%. These behaviors highlight the difference between a reader in search mode and in content consumption mode. How-tos can be highly desirable to people searching for specific content, but they’re less appealing to readers who are browsing.

Headlines with the word “easy” generated a 44% lower CTR than headlines without this word, consistent with the 46% decrease we saw in 2013. This year we also saw a 49% decrease in CTR among headlines containing the word “simple.” Readers are constantly bombarded with “easy ways” and “easy steps” that start to sound spammy. Further, “simple steps” speak more to the search mode reader with a goal and less to the person consuming content.

Headlines that used positive superlatives (“always” or “best”) performed 14% worse than headlines that did not, showing not much has changed since the 23% decrease we saw in 2013. Contrary to popular belief and their widespread use in headlines, these words do not appear to be compelling to readers. This may simply be a product of overuse, or it could be because readers are skeptical of sources’ motives for endorsement. On the flip side, sources of negative information may be more likely to be perceived as impartial and authentic.

Headlines that made references to the reader by including the words “you,” “your,” or “you’re” performed 36% worse than headlines that did not contain any of these words, showing a heightened distaste for this tactic since 2013, when we saw a 21% decrease in CTR among such headlines. The attempt to make readers feel as though they’re being spoken to directly appears to do more harm than good.

Headlines with language that conveys a sense of urgency (“need”, “now”) generated lower CTRs than headlines that didn’t use such pushy language (44% and 12% lower). The negative impact of the word “need” is something we also saw in 2013 (it seems readers have become more amenable to the word “must”, which previously appeared amongst pushy words that hurt CTR). Readers are resistant to words that demand action or attention. These words can have an advertorial feel rather than an editorial feel. Also, their overuse in headlines over time has weakened their ability to convey a true sense of urgency.

Okay, so you got the click. Now what? After successfully driving traffic to your site, often the next goal is engagement. That basically just means that you want your readers to stick around and consume more of your content. Successful engagement has a lot to do with the type of content you are driving audiences to: a slideshow or multi-page article naturally encourages readers to click along to read more. Ideally, though, the surrounding content on your site also engages readers by being interesting and relevant.

In the end, content marketing in its truest form should not be overtly promotional. What sets it apart from other marketing techniques is the focus on providing value to the reader or viewer. That said, an important goal of content marketing (and one that is growing in popularity) is generating conversions.

A conversion is getting your reader to take some action. That action could be opting into a newsletter, buying something, agreeing to have a salesperson contact them, or anything else that gets them more involved with your organization. Generating conversions without a hard sell requires finesse.

Whether you’re looking for clicks, engagement, conversions, or all three, a content marketing campaign starts with great headlines, or ends with bad ones.

By engaging a professional to help you, by experimenting, and by using metrics to measure the success of each experiment, you’ll eventually be able to develop your own set of best practices for headline engagement. Data can be the most powerful tool in your content marketing toolbox—use it.

Where Is The Technology We Need?


There is a contest going on these days, and it is not being played on a field or at a stadium. The prize is not a trophy or pictures on the news. This contest is to decide whether the mortgage industry can survive outside the protection of the big banks, the ones “too big to fail.” Without a doubt, those banks that were huge participants in the mid-2000s run-up to the mortgage crisis have significantly pulled back from offering home loans today. Over the past few years we have seen the #1 lender, Wells Fargo, reduce its lending from $125 billion in the fourth quarter of 2012 to $47 billion in that same quarter in 2015. J.P. Morgan Chase has had a similar retreat, going from $51.2 billion in the fourth quarter of 2012 to lending only $22.5 billion in the same period of 2015. This reduction in lending is driven primarily by the cost of capital and the associated cost of compliance, which includes not only regulatory changes but stricter underwriting requirements as well. According to Chris Whalen, a consultant in the industry, bankers believe that “…making home loans to American families is not worth the risk.” So who is going to pick up the slack?

For many in the industry the answer lies in community banks and credit unions. These entities, which are primarily local and well-known to their communities, now have the opportunity to step into the void left by the retreat of the big banks. Others, however, believe that the “non-banks” will ultimately take charge of the industry. Statistics seem to support their position. In 2014, non-banks had 43% of market share, up from 23% in 2007. Non-banks such as Quicken, PennyMac Financial and PHH Mortgage are rapidly moving in to fill the gap. However, community banks and credit unions are looking to increase their share of the market as well. In fact, the Credit Union National Association (CUNA) is holding an educational conference at the end of April that highlights mortgage and other consumer lending programs.

That these entities are venturing to capture larger shares of the market is likely no surprise to many. However, with the largest numbers of homebuyers divided between the Millennials and the Golden Boomers (those baby boomers who have reached their golden years), it is a toss-up to see which draws the largest numbers and originates the greatest volumes. Those who study lending trends are convinced that the selection of a lender will be bifurcated along these age lines. For those Millennials whose lives are wrapped up in their technology gadgets, it is expected that they will gravitate toward the non-banks, whose approach is typically through technological applications. These same observers also think that the more seasoned members of a community or profession will only be comfortable dealing with local bankers who provide a more personal touch. There is, however, a growing number of individuals who believe that this bifurcation may be exaggerated. Based on experience in dealing with both generations, they believe that while the Millennials may initially explore lenders that offer the best technology, they will ultimately turn to their parents and elders for advice. These individuals are more likely to recommend a local bank because of the human interaction involved.

Over the past weekend the contest heated up even more as Quicken began advertising its “Rocket Mortgage” on television. In case you were wondering what a Rocket Mortgage is, it appears to be one in which technology collects and analyzes the data and makes the mortgage underwriting decision within minutes. Sound familiar? Didn’t we hear this same approach during the early 2000s? There are, however, some issues that really aren’t rocket-paced, and others that exclude a large percentage of those seeking mortgages from the process. For example, Quicken advises on its website that “Products available on Fixed Rate Conventional Products only. No FHA, VA or Jumbo Products. No State Restrictions.” This obviously limits the opportunity for those seeking mortgages, especially first-time homebuyers. The website for the Rocket Mortgage has a video that describes the process. The couple applying for a mortgage in the video talk the viewer through the process. First, they must answer a “few” simple questions. Then they are asked to “share bank statements and income.” But what if you aren’t a salaried borrower? It doesn’t say. A message then pops up to tell them that Quicken is verifying their information. The couple explains that Quicken obtains their credit report for no charge. But what about income and assets? Next thing you know, they are approved and the rate is locked in. Seems great, especially for those with limited knowledge of how the mortgage process works.

The responses to these ads were swift. The majority of the industry members who commented were harsh in their comparisons of this process to the loan programs that caused the Great Recession and the subsequent myriad of new regulations that continue to plague mortgage lending. The appearance of no-verification loans was a scary sight for many. But were they the same, and are the applicants as naïve about the type of loan products available under this program? After all, the product available, it seems, covers only fixed-rate loans; no hybrid ARMs or Pay-Option loans.

Things have changed with more borrowers becoming more aware of potential issues at the same time they have become more sophisticated about technology. Will the Millennial generation be the first to actually benefit from this application process or will they be warned against using this technology by parents who were crushed by the debacle of earlier “easy application” mortgages?

Having said that, not all members of the Gen X and Baby Boom generations are accepting of the slow 60- to 90-day process that requires every piece of documentation that is available to the applicant. As one of my neighbors asked me when he was trying to refinance in 2010, “I just had a colonoscopy, do they want the results of that as well?” Many, in fact, want a much faster approval and have found that going through a community bank or credit union is a very slow and laborious process. While many of these institutions use automated underwriting systems, they are more comfortable applying stricter guidelines and requiring more documentation. After all, they still have to deal with the capital restrictions and tougher regulatory oversight.

Controlling Operations

This, it seems, is where the great divide occurs. For non-banks the process still involves the originate-and-sell model. The non-bank lenders worry more about having investors for their closed loans, knowing that they are passing the primary risk on. Community banks and credit unions, on the other hand, have to retain some, if not all, of the risks associated with the loan origination. So as the home buying and home refinancing population becomes more about the Millennial generation and less about the less tech-savvy generations, will the community banks and credit unions have fewer and fewer mortgage loans?

Unfortunately, what we have not yet realized is that this contest is less about technology and more about the ability to control the processes that produce the loans. The risk from the failure of people, process and/or technology has been shown over and over again to be the underlying cause of the Great Recession. This operational risk has been recognized for some time.

In 2007, an Operational Risk Score was introduced to the market. This product re-verified the majority of the data in a loan file and, using a proprietary algorithm, calculated a measure of increased default risk. While the model had been tested and validated, lenders at that time were not interested in knowing about the quality of their underwriting. Then in the summer of 2009, two members of Moody’s Investors Service presented their findings on the impact of poor-quality underwriting on loan performance. In “Underwriting versus economy: a new approach to decomposing mortgage losses,” Ashish Das and Roger M. Stein found that abnormal underwriting quality made a significant contribution to loan losses in vintages from 2004 Q4 through 2007 Q1. In other words, it is not just the use of technology that makes a difference but the additional risk created by a failure of the people, processes and/or technology. Yet non-banks, community banks and credit unions alike have done little to control or monitor that risk.

Controlling and monitoring operational risk lies primarily in the hands of an organization’s quality control group. This group is responsible for monitoring the quality of the underwriting and closing processes through data collection and analysis. These reviews, when properly analyzed, can determine if there is a weakness or failure in a process that creates additional risk, so that management can make changes. Unfortunately, this approach was not the required methodology during the years 2003 to 2009. When Fannie Mae recognized the need to change the program, it, along with Freddie Mac and FHA, introduced loan quality initiatives that were based on a far superior methodology. Following these changes, they announced that the “quality” of the loans sent to them was greatly improved. Of course, the fact that guidelines were much tighter and product parameters were changed was not mentioned as a contributing factor. The reality is that most lenders have not really changed their approach. They throw in a couple of classifications and go on doing what they have always done. Some, especially some outsourcing firms, have done nothing. Obviously we need technology support if this is going to change as the industry needs it to.

Where is the needed technology?

Having validated the impact of poor quality underwriting and the ability to model the risk it creates, we need technology to step up to the plate and create a quality control program that validates the acceptability of the underwriting and closing processes. Quantifying this risk benefits everyone in the industry. Lenders, knowing the risk, can make more sound business decisions, even to the point of proactively choosing to take on the risk. Investors will have a quantified measurement that they can incorporate into the pricing of loans rather than just assuming the underwriting process was done correctly. Regulators can easily determine the level of risk assumed by lenders and using that information ensure that capital levels are adequate. It’s a win-win for everyone. Yet technology companies have so far failed to take on the challenge.

It seems to me that the technology companies focused on this industry are too busy chasing updates that promise perfection in the name of regulation rather than looking for existing needs that require change. While they claim that they are trying to develop technology that will make the process better and cheaper, they ignore a process that is labor intensive, extremely costly and the bane of every production manager. Here they have the opportunity to change the face of the entire industry; make every company more effective and efficient and every potential borrower happier with the process. Yet they do nothing. Can’t someone step up to the plate and take on this challenge? It will make a difference in this newest survival contest.

Computer Power


In 1996, only twenty years ago, IBM’s Deep Blue became the first computer to win a game against a reigning world chess champion, Garry Kasparov, under standard tournament conditions. There wasn’t anything magical about this. It simply evaluated 200 million positions per second and determined the next logical move.

After Deep Blue’s match victory in 1997, IBM was on the hunt for a new challenge. In 2004, IBM Research manager Charles Lickel was at a restaurant with coworkers when everything went silent. The entire restaurant stopped mid-meal to watch Jeopardy! as Ken Jennings was in the middle of his successful 74-game run. Seeing this as an opportunity, Lickel approached IBM Research executive Paul Horn about taking on the challenge of playing Jeopardy! with an IBM system.

Watson, a computer system capable of answering questions posed in natural language, was developed at IBM by a research team led by David Ferrucci, the senior manager of IBM’s Semantic Analysis and Integration department. Watson was named after IBM’s first CEO, Thomas J. Watson.

In competitions managed by the United States government, a system named Piquant, Watson’s predecessor, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond. To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed impossible to solve. In initial tests run during 2006, Watson was given 500 clues from past Jeopardy! programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson’s first pass could get only about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems. In 2011, Watson competed on Jeopardy! against former winners Ken Jennings and Brad Rutter and received the first-place prize of $1 million. Watson had access to 200 million pages of structured and unstructured content consuming four terabytes of disk storage.

We have all heard about Moore’s Law that predicts that computational power doubles roughly every two years. To put this in perspective, a Samsung Galaxy S5 can perform 140 billion floating-point operations per second. That is more than 10 times the speed of IBM’s Deep Blue computer. Imagine what the capabilities will be in the future.
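The doubling described by Moore’s Law compounds quickly; a back-of-envelope sketch (the doubling period and projection are illustrative, not a precise forecast):

```python
# Back-of-envelope sketch of Moore's Law compounding: raw capability
# doubling roughly every two years. Figures are illustrative only.

def projected_flops(today_flops: float, years_ahead: float,
                    doubling_period: float = 2.0) -> float:
    """Project floating-point throughput forward under steady doubling."""
    return today_flops * 2 ** (years_ahead / doubling_period)

galaxy_s5 = 140e9  # ~140 billion FLOPS, per the figure above

# Ten doublings over 20 years comes to roughly a 1,000x increase.
print(projected_flops(galaxy_s5, 20) / galaxy_s5)  # 1024.0
```

Even if the real-world rate slows, the compounding is what makes two decades of progress so hard to picture.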

At a recent Gartner Symposium, Ginni Rometty, Chairman, President and CEO of IBM, said she hopes that in the next 20 years, people will have their first experience with IBM products in the form of Watson. IBM is making its big bet on what Ms. Rometty called the “cognitive era,” and the company announced on October 6, 2015 the formation of its Cognitive Business Solutions unit.

Ms. Rometty went on to say, “IBM will stake its claim with Watson. This is our moonshot. You stake the claim, and you move fast. Watson is not a supercomputer, but rather a set of 28 systems underpinned by more than 50 technologies.” Digital business + digital intelligence = the cognitive era. So what’s significant about this?


First, the amount of information (data) that can be accessed is enormous. To quote from IBM “We produce over 2.5 quintillion bytes of data every day and 80% of it is unstructured. Therefore, it’s invisible to current technology. IBM Watson is a cognitive system that can understand that data, learn from it and reason through it. That’s how industries as diverse as healthcare, retail, banking and travel are using Watson to reshape their industries.” The key point here is that the majority of data is unstructured. This is one reason why MISMO’s capability to exchange information both internally and externally is so important to the mortgage industry.

Secondly, the power and speed of computers should empower the mortgage industry to develop new, intuitive ways to interact with consumers and business partners. Let’s explore Watson’s capabilities a little further. “Watson is a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning and machine learning technologies to the field of open domain question answering. The key difference between QA technology and document search is that document search takes a keyword query and returns a list of documents, ranked in order of relevance to the query (often based on popularity and page ranking), while QA technology takes a question expressed in natural language, seeks to understand it in much greater detail, and returns a precise answer to the question.”

Finally, Artificial Intelligence has been defined as the ability to create computers and computer software that are capable of intelligent behavior. It’s been around for decades, but only recently has its power been unleashed. Automated underwriting engines are an early example. The field was founded on the claim that a central property of humans, human intelligence, can be so precisely described that a machine can be made to simulate it. Today, however, it is much more than that, primarily because of its proficiency in analyzing vast amounts of data at speeds beyond our comprehension.

All of this prompted me to think about the mortgage industry’s tendency to be comfortable with the status quo and slow to consider change. It’s time to change. “Instead of being disrupted, be the disrupter. I do it inside my own business,” said Ginni Rometty. “You will be the disrupter if you choose to do it.”

I believe some in the mortgage industry are beginning to think ‘outside the box’ and take a more holistic approach to development of a better process. If you are a small lender, community bank or credit union, you may be nimble and agile enough to tackle the challenges today. Certainly, the time and expense involved is manageable. However, if you are a major lender or large bank, and your product line is everything from origination through servicing, chances are you have a number of disparate systems (some legacy) that are disconnected. In either case, it is time to take a step back and analyze your overall business process.

The primary objective of any new or modified system should be an enhanced consumer-friendly interface, including online help and educational assistance. Historical and predictive data analytics will also become a key component. The bottom line is that the mortgage industry has to do a better job of harnessing the power of the computer to genuinely improve the process for all parties involved.

The Benefits Of A Data-Validated Mortgage


In 2015, industry observers might have thought that CFPB’s “Know Before You Owe” rule, colloquially known as the TILA RESPA Integrated Disclosure (TRID), was the single most important issue facing mortgage lenders. To be sure, TRID dominated the headlines and the discussions over the course of the year. But TRID also exposed a crucial issue that has been percolating over the last seven or eight years, and it is one the industry must solve: How do we assure investors and regulators that the content of the loan file is exactly as it is represented?

What is required is a new approach to accessing the specific data points that support each mortgage loan. Think of the new approach as one in which every data element is standardized and understood within an industry-wide context: a “Data-validated Mortgage.”

What is a data-validated mortgage?

A Data-validated Mortgage is one that has escaped the bonds of paper that have constrained mortgages since the 19th century. Paper wasn’t a problem back in the days when it was the highest-tech medium available, but today’s electronic world requires that mortgage information be accessible outside of the paper document.

A mortgage can be described as “data-validated” when information within the loan documents and from the mortgage process flow is utilized to do four things:

  1. Substantiate that an approved loan is consistent with the loan sent by the lender to the closing table, signed by the borrower and sold to investors
  2. Demonstrate loan quality to investors by providing greater transparency to the process, the data, and the compliance of the loan data and the documents in advance of a post-close review and audit
  3. Monitor regulatory requirements throughout the mortgage workflow, from pre-close to close to post-close, and highlight potential investor or regulatory errors
  4. Document evidence of compliance throughout each phase of the loan lifecycle
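The first of these checks — confirming the approved loan matches what reached the closing table — amounts to a field-by-field comparison once the data is accessible. A hypothetical sketch (the field names are invented for illustration, not actual MISMO or GSE element names):

```python
# Hypothetical sketch of check #1 above: comparing the approved loan's
# data against the data signed at closing. Field names are invented.

def find_discrepancies(approved: dict, closed: dict) -> dict:
    """Return fields whose values changed between approval and closing,
    mapped to an (approved_value, closed_value) pair."""
    return {
        field: (approved[field], closed.get(field))
        for field in approved
        if approved[field] != closed.get(field)
    }

approved = {"LoanAmount": 350000, "NoteRatePercent": 4.25}
closed   = {"LoanAmount": 350000, "NoteRatePercent": 4.50}

print(find_discrepancies(approved, closed))
# {'NoteRatePercent': (4.25, 4.5)}
```

In practice such checks would run against thousands of data elements per loan and flag exceptions for review, but the principle is the same: the comparison is only possible when the data exists outside the printed document.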

There are many implications to this simple description, and a number of things have to happen before the Data-validated Mortgage becomes possible. First, there is the matter of the information being locked in the paper documents the industry uses. It may seem like a simple matter to turn printed information into data, but it is not as easy as it sounds.

The mortgage business has evolved technologically, particularly over the last 15 or so years. But due in part to the legacy systems involved, it remains largely a document-centric industry. Printed or electronic print-formatted documents that have loan data embedded within them (think PDF or fax) are exchanged throughout the loan process, yet this data is not easy to manage. Lenders use loan origination systems to enter data and create documents; but once created, the data is unstructured and is no longer accessible outside of the document. For the data to become validated, it must be accessible independently of the document itself—and yet still follow the document through the loan process. For example, in a Data-validated Mortgage, while there is a loan document like the Form 1003 that can be printed, the document also exists in an electronic state as a digital, data-rich companion.

Portability and Siloes

Next, there is the matter of data portability. Data becomes portable when it can be shared, analyzed and put to good use by authorized participants to the process. In the Data-validated Mortgage, this is where the concepts of structure and standardization become tremendously important.

A Data-validated Mortgage requires data to be passed from one system (or person) to the next in the loan workflow. To do this properly, the data must have a structure that identifies the data’s context. So in addition to the raw data, each data element must also have a label. For example, “$350,000” isn’t very useful all by itself. Label it “Loan Amount,” and the recipient instantly understands its meaning. Structured data enables lenders, settlement agents, regulators and business partners to access and analyze the data to help improve quality, business processes, compliance and efficiency. However, structured data is only useful when everyone uses the same standards. Unfortunately, there is a basic problem that arises at this point, owing to a lack of standardization.
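The difference between raw and labeled data can be made concrete with a small sketch. The element names below are illustrative only, not actual MISMO element names:

```python
# Illustrative sketch of structured vs. unstructured loan data.
# Element names here are invented, not actual MISMO element names.

raw = "$350,000"  # unstructured: its meaning depends entirely on context

structured = {
    "LoanAmount": 350000,          # the label gives the value its context
    "LoanAmountCurrency": "USD",
}

def to_xml(record: dict) -> str:
    """Serialize a labeled record so any system that shares the same
    labels can interpret each value without guessing."""
    body = "".join(f"<{key}>{value}</{key}>" for key, value in record.items())
    return f"<Loan>{body}</Loan>"

print(to_xml(structured))
# <Loan><LoanAmount>350000</LoanAmount><LoanAmountCurrency>USD</LoanAmountCurrency></Loan>
```

The serialization step is where standardization matters: two systems can only exchange records like this if they agree on the same set of labels, which is exactly the problem described next.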

It took over 20 years for automobiles to develop standardized controls, with the clutch, brake and accelerator in the modern layout still used today. Early Model T drivers used a hand throttle, for example, and brake controls were located wherever the builder felt like putting them. Cars became far more useful in the 1920s when drivers could operate most of them pretty much in the same way.

The mortgage industry is going through a similar evolution. Problems have arisen when lenders use systems with proprietary, independently developed structures for data. As a result, mortgage data is generally “siloed” and accessible only by systems that can talk to one another. Even systems that have undergone integrations to become somewhat compatible with each other are seldom fully aligned. For example, servicers routinely must manually go through loans originated by others in order to board them from an origination system to a servicing platform. All this incompatibility makes data portability difficult, if not impossible – which is why data standards are critical to achieving the Data-validated Mortgage.

MISMO and the Uniform Data Reference Model

Thankfully, progress toward mortgage data standards is improving. The Mortgage Industry Standards Maintenance Organization (MISMO), a not-for-profit subsidiary of the Mortgage Bankers Association, has spent years working with lenders to create a standardized data format for mortgage lending. Instead of having every lender using its own method of handling loan information, MISMO enables all to be on the same page.

For a bit of reference, the mortgage crisis came at a time when about 100 data elements per loan were available to investors and rating agencies to evaluate risk. None of those data points could be digitally validated, by the way. Using MISMO, up to 10,000 individual fields of loan information can now be detailed, enabling very fine analysis to satisfy risk control and record keeping requirements for investors and other stakeholders.

After years of work and untold hours invested by committed industry professionals, MISMO has created a reference model that has become the de facto standard for residential and commercial real estate transactions, enabling data portability between systems that adhere to it. MISMO has also established both company-level and professional-level certification programs that help identify technology vendors and service providers that follow the MISMO standards.

One impactful example of the move towards standardized data portability is the GSEs’ Uniform Mortgage Data Program. The Uniform Closing Dataset (UCD) is the latest in a series of standards set by the GSEs, each focusing on a different part of the loan process. (In 2011, the GSEs released the Uniform Appraisal Dataset (UAD). That was followed in 2012 by the Uniform Loan Delivery Dataset (ULDD).) The UCD, which is based on the MISMO reference model, addresses data from the new Closing Disclosure. Lenders wanting to sell loans to Freddie Mac and Fannie Mae need to adhere to the UCD, which gives the GSEs the ability to use the data submitted with the loan to evaluate the integrity and quality of the loan prior to loan purchase. By basing the UCD on the MISMO reference model, the GSEs helped drive acceptance of the MISMO reference model and established it as the industry standard to follow.

In addition to data portability, a true Data-validated Mortgage requires connections for exchanging data between the various parties and systems associated with the loan. This allows lenders and partners to exchange data quickly and accurately. Ideally the connections are in the form of a network that links multiple lenders and partners. eLynx offers such a network now in the form of the electronic Closing Network (eCN). This enables all lenders and partners who connect to eCN to be connected to all of the member lenders and partners, seamlessly and instantaneously.

Long lasting benefits

Just as interchangeable parts and the assembly line revolutionized manufacturing during the Industrial Revolution, the Data-validated Mortgage brings numerous benefits to mortgage lending.

First and foremost, it improves the integrity of the loan package and enables granular evaluations that ensure loan quality. Nothing is lost in translation due to systems’ inabilities to understand one another’s “language.”

Second, Data-validated Mortgages allow lenders to vastly reduce exposure in compliance issues by ensuring loans are within regulatory tolerances, including the UCD validation for loans submitted to the GSEs and the upcoming HMDA reporting requirements. These and other analyses can be readily automated during all phases of the loan process, from TRID through delivery, thanks to the Data-validated Mortgage.
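An automated tolerance check of the kind described above can be sketched in a few lines. The 10% cumulative threshold mirrors TRID's treatment of certain third-party charges, but this is a simplified illustration with made-up fee names, not actual compliance logic.

```python
def cumulative_tolerance_exceeded(estimated_fees, final_fees, tolerance=0.10):
    """Compare summed final charges against summed estimates plus tolerance."""
    estimated_total = sum(estimated_fees.values())
    final_total = sum(final_fees.get(name, 0.0) for name in estimated_fees)
    return final_total > estimated_total * (1 + tolerance)

# Hypothetical fee data: estimates total $700, so the 10% limit is $770.
estimate = {"title_search": 400.0, "survey": 300.0}
final = {"title_search": 450.0, "survey": 340.0}  # final total is $790

print(cumulative_tolerance_exceeded(estimate, final))  # True: limit exceeded
```

With validated, field-level data, a check like this can run at every phase of the loan, flagging out-of-tolerance loans before they reach delivery.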

Third, it supports greater and more efficient interaction with the borrower, which is a focus of the CFPB and a clear preference of today’s consumer.

Lastly (and perhaps the most important long-term benefit) is the transparency the Data-validated Mortgage brings to the loan process. As the investor community regains confidence in the integrity of mortgage information and in its own ability to assess risk, the door opens to a future beyond today's government-centric environment, with private capital returning on a large scale.

In addition to these tremendous benefits, the Data-validated Mortgage gives lenders the unprecedented ability to leverage the information they collect to understand market trends for business opportunities and improve their internal processes to boost service levels to consumers. They can also gain much greater ability to monitor vendor performance as mandated by the CFPB.

Are we in the midst of a data revolution in lending? Without question. While the overarching potential of achieving data mastery has yet to be fully revealed, the Data-validated Mortgage is emerging as a revelation unto itself – and one with unprecedented potential to transform the mortgage industry in numerous positive ways.

About The Author

Industry Hot Topics


I usually discuss a given topic that's in the news or top of mind for me each month, but this month I want to switch it up a bit. Why talk about just one topic when there are so many to tackle?

For example, regulatory risk is a chief concern today in terms of both compliance and the increasing cost of compliance cutting into profit margins. So, what are the one or two compliance issues that are most important today? Are there other compliance risks that lenders should be aware of or preparing for?

“As a marketing company, we see a lot of concerns. There is fragmentation in mortgage marketing,” said Mary Beth Doyle, co-founder of LoyaltyExpress. “CFPB is coming down hard on what can be said; it’s getting heavily controlled. However, it is difficult for organizations to get streamlined so they are compliant.”

Featured Sponsors:


LoyaltyExpress is one of the nation’s largest providers of marketing automation and cloud-based CRM solutions for mortgage companies and banks. The company’s solutions enable lenders to automate marketing campaigns and easily manage customer, partner and recruiting data, while streamlining marketing activities, such as email, direct mail, print and gift fulfillment. This integrated approach enhances compliance by eliminating the need to share sensitive customer data with multiple vendors while ensuring that loan officers only use preapproved marketing materials that comply with regulatory guidelines.

“When we talk about the cost of compliance, there has been a psychology of recruiting the A-team and letting them go at it,” noted Doyle. “So, it’s a cultural change when the CFPB now places constraints. Having the controls in the system to keep communication automated and consistent is critical.”

Marketing isn’t the only sector impacted by skyrocketing regulation; the appraisal sector has been hit as well. “The regulations and compliance issues around appraising have skyrocketed,” said Jeffrey J. Bradford, founder and CEO of Bradford Technologies, Inc. “For example, the term ‘desirable’ can’t be used by an appraiser anymore because that’s subjective. There is a reliance on data and being factual. The fear of being sued or having to buy back a loan is amazing, and it’s causing a lot of money to go toward appraising.”

Bradford Technologies is an innovator of valuation tools and solutions for residential appraisers. The company pioneered computer-aided appraising, was the first to incorporate statistical support in both mainstream and alternative valuation products, and currently provides one of the most adopted technologies for residential appraisers. AppraisalWorld, the company’s online appraiser community with over 20,000 members, provides services focused on building trust and reliability in the appraisal industry.

The answer to maintaining compliance is consistency and accuracy, according to Ann D. Fulmer, a senior industry advisor for FormFree Holdings. “We hear a lot that loan officers are resistant to technological change that would protect the lender, but limit them. Another consideration down the road: to the extent that LOs are loosey-goosey, you open yourself up to fair lending issues. You have to defend yourself and your decision-making when you approved that loan.

“So, ask yourself: Are your decision-making processes consistent? Just because you can show that what you did was an industry standard, that doesn’t mean that you can’t be sued. The biggest risk to the industry is the lack of clarity around the new CFPB rules. Until the CFPB gets clear about what it expects and how they are going to enforce those rules, it’s paralyzing.”

“From a TPO standpoint, the paranoia ratchets up because you have to know who you’re doing business with,” added Gregory J. Schroeder, president of Comergence Compliance Monitoring, LLC. “If you are a small- to medium-sized lender you are in a crosshair because what do you do? You have to ensure compliance without destroying your profit margin. The CFPB has not targeted the smaller lenders yet, but it is coming.”

Comergence Compliance Monitoring, LLC, is a SaaS provider of vendor management solutions, currently focused on third-party originator and appraiser risk. Comergence provides lenders and appraisal management companies with tools that review and continually monitor registered mortgage loan originators and appraisers.

In the midst of all this change it’s important to assess and re-assess how you as a lender are handling this change. “Lenders are trying to do everything at once,” said Fulmer. “The CFPB was intended to protect consumers, but they are actually hurting consumers because there’s a lot of confusion.”

Doyle added that “there are state guidelines as well as federal guidelines to deal with. So, there’s a lot of interpretation to be done for sure. There’s such panic and intensity around compliance. You need to have a full audit trail to at least prove that you are trying to comply. And of course the consumer is hurt because these organizations are so paralyzed by how to interpret the new rules.”

Compliance aside, another hot topic is appealing to Millennials, both as prospective mortgage industry workers and as homebuyers. “I have a daughter who is a Millennial,” shared Bradford. “She wants to work for an employer that is having a social impact. They care about the climate, the world and the environment. There used to be 2,000 appraiser trainees; now we only see a few hundred. You now have to have a college degree and many hours of experience to be an appraiser, which makes it economically unfeasible for people to get into the appraisal industry.”

In addition to turning their backs on becoming appraisers or loan officers, Millennials are increasingly turning their backs on homebuying as well. So, how should the industry act to get this group of people to want to buy a home?

“The challenge is really providing a program to help these young people get into a home given that they are already carrying so much debt in the form of student loans,” answered Doyle.

“I don’t think you can entice these guys to buy a home,” pointed out Bradford. “They want to be mobile and live in the trendiest places. If you could refinance their student loans it might entice them to do other loans with you.”

As we can all see, lenders are facing a number of challenges, but there’s one area that gets less attention than it should: oversight of third-party relationships and vendor management. “Recent regulations driven by Dodd-Frank have made managing these relationships even more critical,” reported Schroeder. “Lenders aren’t just being held legally accountable for their own actions, but for the actions of all the third parties with whom they do business. That includes not just third-party originators but other parties as well, including appraisers and software providers. We’re also seeing that originators are facing demands for more information and documentation each time they decide to apply to a new wholesaler.

“In addition, lenders that allow borrowers to shop for third-party settlement services have legal responsibility under the CFPB’s new regs in case those providers do harm,” continued Schroeder. “It’s clear that mortgage lenders must have an effective process in place for managing their service providers. But how do you keep track of what your vendors are doing in a cost-effective manner while staying focused on your own business? It can be an overwhelming task.”

So, how does a lender stay ahead of all of these issues? Many see the answer in artificial intelligence and smart technology. Let me leave you with this thought: Technology will never replace a human, but if programmed well, it can standardize the process and increase accuracy and efficiency at the same time.

About The Author

Preventing Data Breaches


When sensitive data is to be transmitted to a vendor, regulations, regulatory guidance, and best practices dictate that you should have a written contract in place. That contract is an opportunity to fairly assign the risk of data loss or breach between you and the vendor. As such, these contracts represent opportunities—and potential pitfalls.

You’re not going to negotiate a one-sided risk allocation into every vendor agreement. You’re not going to negotiate every vendor agreement. Some vendor arrangements don’t present enough risk to justify redlining the vendor’s boilerplate contract. So how do you prioritize? And for priority vendors, how do you determine the right contract language to allocate the risk of data loss or breach?

First, you have to evaluate the potential consequences of a failure under an agreement. To do that, you have to understand what data you’ll be sending to the vendor. One tool that can be useful for these purposes is a Privacy Impact Assessment (PIA).

Featured Sponsors:


Consider performing a Privacy Impact Assessment

A PIA is a concept introduced in the federal E-Government Act of 2002. It requires that agencies of the federal government conduct an analysis of how personally identifiable information is collected, stored, protected, shared and managed, in new and existing systems. The assessment must be “commensurate with the size of the information system being assessed, the sensitivity of information that is in an identifiable form in that system, and the risk of harm from unauthorized release of that information.” An agreement with a vendor processing Social Security numbers would require a much more thorough assessment than the end user license agreement for Microsoft Word.

The Department of Homeland Security’s guidance on PIA prep says that “the purpose of a PIA is to demonstrate that system owners and developers have consciously incorporated privacy protections throughout the entire life cycle of a system. This involves making certain that privacy protections are built into the system from the start, not after the fact when they can be far more costly or could affect the viability of the project.” Closer to “home” for mortgage lenders and vendors, the Department of Housing and Urban Development says that a PIA

>> Identifies the type of personally identifiable information in the system (including any ability to combine multiple identifying elements on an individual);

>> Identifies who has access to that information (whether full access or limited access rights); and

>> Describes the administrative controls that ensure that only information that is necessary and relevant … is included.

HUD’s PIA guide and form is available at http://www.hud.gov/offices/cio/privacy/pia/piaquestionnaire.doc.

The first step is to identify the type of information vendors will have access to. What is the nature and sensitivity of the data? What is its source? Is the data or the sharing of the data subject to a law or regulation, or contractual provisions with another vendor, or an internal policy?

Next, determine how the data will travel to the vendor. Does the system require authentication? Is the vendor accessing and pulling the information from a server of yours? Who will be able to access it?

Why is the data being transferred? To what use will the vendor put the data? Is the vendor’s performance critical to your business?

What controls do you already have in place to help safeguard the data? What controls do the vendors have? Are they adequate?

Cataloguing data flows and vendor relationships in this way will help you gauge the level of risk of data loss or breach. What level of risk are you willing to assume? Can the risk be mitigated with additional administrative or technical controls, or by appropriate contract language?
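One hypothetical way to catalogue data flows as suggested above is to record each flow with a few attributes and score it, so that only the highest-risk vendor relationships get the full contract-negotiation treatment. The sensitivity classes, weights, and example vendors below are illustrative only, not drawn from any regulatory guidance.

```python
# Hypothetical sensitivity weights for classes of data shared with vendors.
SENSITIVITY = {"public": 0, "internal": 1, "pii": 3, "ssn_financial": 5}

def flow_risk(data_class, encrypted_in_transit, vendor_has_audit_rights):
    """Score one data flow; higher scores warrant deeper PIA and negotiation."""
    score = SENSITIVITY[data_class]
    if not encrypted_in_transit:
        score += 2  # unprotected transport raises exposure
    if not vendor_has_audit_rights:
        score += 1  # no audit right means less ongoing visibility
    return score

# Example inventory: (vendor, data class, encrypted?, audit rights in contract?)
flows = [
    ("doc-prep vendor", "ssn_financial", True, True),
    ("marketing vendor", "pii", False, False),
]
for vendor, cls, enc, audit in flows:
    print(vendor, flow_risk(cls, enc, audit))
```

Even a crude scoring scheme like this forces the questions above to be asked and answered for every vendor, which is the real point of the exercise.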

In addition to the HUD guide, there are many other guides and examples of federal government privacy audits. DHS’s guide is here: http://www.dhs.gov/xlibrary/assets/privacy/privacy_pia_guidance_march_v5.pdf. The Department of the Interior has a guide here, https://www.doi.gov/sites/doi.gov/files/migrated/ocio/information_assurance/privacy/upload/DOI-PIA-Guide-09-30-2014.pdf, and the State Department has dozens of examples of PIAs from 2000-09 here: http://2001-2009.state.gov/m/a/ips/c24223.htm.

Some contract terms to include

The contract is one step in the vendor engagement process, after diligence and selection. The same regulations that require written contracts require thorough diligence, so you should know enough about your vendor and its systems to perform a PIA. But there shouldn’t be any hesitation to ask for information you may be missing from a potential vendor. You should know whether they wipe computers clean when an employee leaves. You should know if their facilities are access controlled.

In the contracting phase, anticipate the next step in the process, management. For higher-risk exchanges, insist on the right to audit your vendor during the term of the contract if it should become necessary. If you’ve collected documentation from the vendor respecting its security measures, require notice of any change, or an annual or other regular provision of updated documents.

Some other terms to be included in a good contract with a vendor include: the vendor’s obligations upon data loss or breach, especially including immediate notice to you; a promise not to introduce any viruses or malware, if the vendor has access to your systems; and an agreement that the vendor will dispose of sensitive data as soon as practical after the exchange.

Legacy contracts

The costs associated with data breach are closer to the front of your mind now than they were three years ago. What do you do about contracts you entered into three years ago? Five years ago?

It’s crucial to perform some level of diligence on existing vendor relationships. Vendors who do business in mortgage or settlement services fields are likely to be ready with any documentation you might have requested if you were contracting with them now. Regardless of what the contract says, simply asking for documentation is likely to yield positive results.

If you encounter delay or resistance, you might find a hook in the legacy contract that provides for audits of financials: you might be able to obtain security and other information under that clause as well. There may be a general promise that the vendor will “cooperate,” and that might give you a leg to stand on. When all else fails, know the contract’s term and termination rights. Often a contract term will automatically renew unless one party notifies the other of its intent not to renew. There may be a separate right of termination for convenience. These give you leverage to obtain any diligence documentation you need, and they also represent opportunities to amend the contract to better protect you and your data.

Enterprise-wide data security assessments

Some organizations find that an enterprise-wide privacy assessment would be beneficial. There are several existing frameworks you can draw from to plan such an endeavor.

The National Institute of Standards and Technology (NIST) is a federal government agency which has implemented the president’s executive order requiring the development of a voluntary framework for “reducing cyber risks to critical infrastructure.” Background, information and an Excel spreadsheet of the framework are available at http://www.nist.gov/cyberframework/.

The International Organization for Standardization (confusingly acronymed ISO) has developed ISO/IEC 27018:2014 which establishes objectives, controls, and guidelines for implementing measures to protect sensitive information in a cloud computing environment. Details can be found at http://www.iso.org/iso/catalogue_detail.htm?csnumber=61498. ISO is a worldwide organization that promotes proprietary, industrial, and commercial standards.

And the Cloud Security Alliance, a non-profit organization that promotes security best practices, has available Security Guidance for Critical Areas of Focus in Cloud Computing at https://cloudsecurityalliance.org/group/security-guidance/.

Whether you’re evaluating a proposed vendor relationship or your entire organization, the key is understanding your data. You can’t effectively mitigate risk without knowing what you’ve got to lose.

About The Author