Listen To Forbes

For those of you who didn’t get a chance to read the centennial issue of Forbes magazine, you missed a collection of thoughts from 100 of the greatest living business minds. I thought I would select some excerpts from the visionaries and early adopters.

Steve Case, Co-founder, AOL & Revolution: The story of American business over the last 100 years is a story of different sectors rising and falling (and often rising again in unanticipated ways) in different regions of the country. When Detroit was an automobile powerhouse and Pittsburgh was the steel city, Silicon Valley was just fruit orchards. As the industrial revolution peaked and the technology revolution accelerated, the role of these places changed. As we enter the internet's third wave, where entrepreneurs will leverage technology to disrupt major real-world sectors—like health care, education, financial services—startups will increasingly move to cities where industry expertise exists. The opportunity to grow companies that spur job creation and economic growth holds great promise for what I call these "Rise of the Rest" cities. This will lead to a more dispersed innovation economy, where jobs and wealth are created across the country, not just on the coasts.

Michael Milken, Philanthropist: I came of age and went into business right in the middle of these past 100 years. Two issues of Forbes, the 50th and the 60th, had a particularly significant influence on me. Both issues really made me think about how financial structures changed over time and how leading companies changed. A century ago, the automobile was radically changing transportation and mobility. Ford Motor was the 21st-largest company. By the time it went public in 1956 with what was then the largest stock sale in history, it was one of the most valuable companies in the U.S. Today its total market value is less than the annual price variations of Amazon, Facebook, Apple, or Google. In 1917, when most of a car's cost was based on raw materials, the country's largest company by far was U.S. Steel. Today the American steel industry directly employs fewer than 140,000 workers. Today's growing challenge: create meaningful lives for the world's population. We've accomplished the greatest achievement of mankind, the extension of life. Since 1900, average life expectancy worldwide has grown from 31 to over 70. Economists estimate that about half of economic growth is tied to the public health and medical research advances that underlie increased longevity.

Bill Gates, Co-founder, Microsoft: In early 1975, when I was in college, my friend Paul Allen showed me an issue of Popular Electronics, featuring the Altair 8800 computer, the first commercially successful personal computer. We both had the same thought: “The revolution is going to happen without us!” We were sure that software was going to change the world, and we worried that if we didn’t join the digital revolution soon, it would pass us by. That conversation marked the end of my college career and the beginning of Microsoft.

We've only begun to tap artificial intelligence's ability to help people be more productive and creative. The pace of innovation is accelerating—and that opens up more ideas for exploration. Big advances in clean energy will make it more affordable and available, which will fight poverty and help us avoid the worst effects of climate change.

The next 100 years will create more opportunities, and we need people to keep believing in the power of innovation and to take a risk on a few revolutionary ideas.

Masayoshi Son, Founder, SoftBank: When I was 19 years old, I saw a photo of a microprocessor in a science magazine. It was just a tiny chip that could fit on a fingertip, but it represented an entire computer. "Oh my God," I said to myself, "this is going to change mankind's life. This is the biggest invention that man ever created." Those microprocessors were compacted into PCs, then linked together to create the internet and later smartphones. Now they are extending our knowledge and intelligence via artificial intelligence.

Tim Berners-Lee, Inventor, World Wide Web: I published my proposal for the World Wide Web in 1989. From the outset, I imagined it as an open, universal space, where anyone, anywhere could take their ideas and bring them to life without having to ask for permission or pay royalties. I hardwired these factors into the Web's design and made a conscious decision not to try to copyright or patent it. In 1993, CERN, my employer at the time, agreed to make the code available to anyone royalty-free, forever. But now, as the Web matures, this openness is under attack. For the economic, social, and political benefit of all, the Web must be recognized as a public good and locked open through appropriate corporate and government action—including the preservation of net neutrality.

Marc Benioff, Founder, Salesforce: We are living in the fourth industrial revolution, with advancements in robotics, genetics, stem cells, autonomous vehicles, and especially artificial intelligence. All will dramatically change life itself. We need to have a beginner's mind to think about what is happening. That idea of a beginner's mind is the core of innovation.

Michael Dell, Founder, Dell Technologies: The Computer Age is just beginning. Most companies today have about a thousand times more data than they actually use to make better decisions. When you overlay the latest in computer science—AI, machine learning, deep learning, unsupervised learning—you will create an explosion of opportunity and a real emergency. Over the next few years, as the cost of making something intelligent approaches zero, companies will succeed and fail based on their ability to translate data, including historical data, into insights and actions and products and services in real time. We like to think of ourselves as a company with big ears: We listen, we learn, we understand—and we create things.

Jeff Bezos, Founder, Amazon: We're in the midst of a gigantic transition, where customers have incredible power because of transparency and word of mouth. It used to be that if you made a customer happy, they would tell five friends. Now, with the megaphone of the internet, whether through online customer reviews or social media, they can tell 5,000 friends. In the old days, an inferior product could prevail in the marketplace with superior marketing. Today customers can tell whether a product or service is good because there's so much transparency. They can compare it to others very easily, and then they can tell all their friends—the customers will do part of the heavy lifting, marketing-wise. Rather than inferior products shouting louder, we have sort of a product meritocracy. It's very good for customers, it's very good for the companies that embrace it—and it's very good for society.

So what can we take away from these various reflections? We don’t have to be the visionaries who start a revolution, but if we want to succeed in an ever-evolving social and economic landscape, we need to be positioned for adaptation. Organizations that make the best use of data will recognize the signs of change first. And the organizations that design their product offering to fit the customer—rather than hoping that customers will change their behavior to fit the products—will have the competitive advantage regardless of the industry in question.

About The Author

Roger Gudobba
Roger Gudobba is passionate about the importance of quality data and its role in improving the mortgage process. He is vice president, mortgage markets at Compliance Systems and chief executive officer at PROGRESS in Lending Association. Roger has over 30 years of mortgage experience and has been an active participant in the Mortgage Industry Standards Maintenance Organization (MISMO) for 17 years. He was a Mortgage Banking Technology All-Star in 2005, and he received Mortgage Technology Magazine's Steve Fraser Visionary Award in 2004 and its Lasting Impact Award in 2008. Roger can be reached at rgudobba@compliancesystems.com.

Hurricanes Harvey, Irma, Etc.

Sunshine, Blue Sky, and Spectacular Sunsets: That’s why I moved to Florida. Oh, lest I forget, I need to mention some other important objectives: unlimited golf and no more snow or wind chills below zero. Then along came Hurricane Irma!

It's not like my wife Kristy and I weren't aware of the potential damage from hurricanes. We had just taken possession of a house we had built in Cape Coral when Hurricane Charley ($16.3 billion in damages) struck the area in 2004.

What follows are my observations on the analysis, preparation, and response by government agencies, businesses, and the general population to the hurricane threat, and on what the future holds.

It's all about the data: Mother Nature can be capricious, and even though predicting a hurricane's path and severity is not an exact science, the U.S. and European models for Hurricane Irma came very close, and forecasters were able to make adjustments along the way. The forecasting seems to get better every time, and I attribute that to the constant monitoring and analysis of a very complex data model. Every hurricane is different. Harvey, for example, stalled over Houston and dumped an enormous amount of rain. Irma, however, was originally anticipated to head to the east coast of Florida but shifted to the west after touching Cuba as a Category 3. It strengthened to a Category 5 and hit the west coast at Marco Island and Naples, then traveled north right up the center of the state. Irma was huge, wider than the state. Along the way it was downgraded to a tropical storm, yet Jacksonville was still hit hard by storm surge.

Hurricane forecaster Phil Klotzbach recently commented: "Harvey, as well as the damage that Irma had done in the Caribbean, caused people to take this storm very seriously." Those who didn't paid the price.

So, as of September 12, let's review what is happening with Hurricane Jose. Two of the most robust computer models meteorologists use to determine the odds of landfall—the GFS, the American forecast model, and the ECMWF, the European model—keep Jose over the ocean. But models can have trouble forecasting unusual tracks such as Jose's expected path. There is generally no dominant weather feature steering the storm, so model forecasts can vary widely from each other and from run to run. The National Hurricane Center recognizes this issue in its early forecasts for Jose, saying "there is a lot of uncertainty in the intensity forecast." Considering we are just at the peak of what has been predicted to be a very active hurricane season, we have to be diligent and prepared.

The takeaway for the mortgage industry is that it is crucial to have a comprehensive understanding of what data is important to your organization, to ensure that this data is well defined, and to have confidence that you are collecting it properly. In addition, your data model must be continuously monitored, and you need to be able to adjust it as necessary.
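To make that concrete, here is a minimal sketch of what field-level validation of incoming data might look like. The field names and rules are invented for illustration; they are not a standard mortgage data model.

```python
# A toy validator: each rule encodes one "well-defined" expectation
# for a field, and records that fail are flagged instead of flowing
# silently downstream. Field names and rules are illustrative only.

def validate_loan_record(record):
    """Return a list of problems found in one loan record (a dict)."""
    problems = []

    if not record.get("loan_id"):
        problems.append("loan_id is missing")

    amount = record.get("loan_amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        problems.append("loan_amount must be a positive number")

    state = record.get("property_state", "")
    if len(state) != 2 or not state.isalpha():
        problems.append("property_state must be a two-letter code")

    return problems

record = {"loan_id": "A-1001", "loan_amount": -5, "property_state": "FL"}
print(validate_loan_record(record))  # ['loan_amount must be a positive number']
```

Monitoring the data model is then largely a matter of tracking how often each rule fires over time and adjusting the rules as the model evolves.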

Proactive, reactive and inactive: Everybody in Florida fits in one of these categories.

Sometimes it is best to consider alternative strategies. Kristy and I were very proactive when looking at our options for Hurricane Irma. This looked like a monster storm, and it certainly turned out that way. We closed down the house and left on Wednesday, September 6. We avoided the I-75 parking lot and took the old way (US-41) north. This took us through lots of little towns, but there was little traffic and gas was readily available. We stopped for two nights in Albany, Georgia, and continued to Columbus, Indiana, where we plan to stay until power is restored. As they did in other areas of Florida, the police went through our neighborhood ordering the few remaining people to leave; they were not going to respond to 911 calls and put their officers in danger.

The reactive ones tried to wait it out because of the earlier forecasts that predicted an East Coast track for Irma. When they finally decided to leave, they encountered problems: finding gas and stop-and-go traffic on I-75. Hotel vacancies were basically nonexistent in northern Florida and across the southern parts of Alabama, Georgia, etc.

The inactive ones decided to hunker down, even as mandatory evacuation orders were issued across southern Florida. Many in this group did not have the means to leave, and some were forced to go to rescue centers.

How would you define your organization, especially as it pertains to technology? Are you ahead of the herd? Remember, if you are not the lead dog, the view never changes. Are you just a follower? Maybe you are waiting for other organizations to blaze the trail so you can follow their lead. Or are you just maintaining the status quo? If so, you may wake up one day and wonder what happened to your business.

At this point, I can't say enough about the unbelievable effort and collaboration of the federal, state, and local authorities in managing the preparation for Hurricanes Harvey and Irma and their aftermath. They are getting better with each hurricane. Texas and Florida were the latest beneficiaries. The aid from other states, which sent in personnel to restore power and clear debris, enabled people to get back to their homes, assess the damage, and begin the task of getting back to normal.

“The number of people killed in hurricanes halves about every 25 years, in spite of the fact that coastal populations have been increasing, because of what we’re doing with forecasting,” said Hugh Willoughby, a professor of meteorology at Florida International University in Miami. The modern science of hurricane monitoring and preparation, which has saved countless lives through forecasting, satellite monitoring, and government planning, has dramatically improved in recent decades.

The coverage from the major media stations was extraordinary and allowed the evacuees to monitor the storm from afar. The use of social media like Facebook, Twitter, and Instagram let everyone stay in touch with families and friends.

With estimates of 70 deaths and $180 billion in damage from Harvey, 68 deaths and billions more in damage from Irma, and two-thirds of Florida's 21 million residents without power, the road back to recovery will be challenging, but manageable.

The focus on data, the absolute necessity to be proactive, and the need to work collaboratively with customers, partners, and vendors should be top of mind for every mortgage lender today. Integrated technology is a necessity. Think differently.

I will leave you with one final thought. It might be time to go to the moon, retrieve the golf balls, and return the rocks. We have upset the whole balance of nature.


Augmented Intelligence

One of the first introductions of artificial intelligence to the general population came in 2011, when Watson competed on Jeopardy. Ken Jennings and Brad Rutter were arguably the best players the show had produced over its decades-long lifetime. In total, they had walked away with more than $5 million in prize winnings, a testament not only to the breadth and depth of their knowledge, but to their strategic savvy with category selection and wagering. Watson, a computer system developed by IBM, was capable of answering questions posed in natural language. Watson had access to 200 million pages of structured and unstructured content, but was not connected to the Internet. Watson consistently outperformed its human opponents on the game's signaling device, but had trouble in a few categories, notably those having short clues containing only a few words. The key here was the development of a natural language processor that would become the foundation for numerous future applications like Siri. But while its rapid responses to questions may have struck many as robotic, Watson was not a robot in the traditional sense. Robots are machines built to carry out physical actions and may or may not be designed to approximate the human form. I am sure many of you remember the TV series The Jetsons, with Rosie, the humanoid robot maid and housekeeper. Or maybe not, because that series aired in the early 1960s.

'In Search of a Robot More Like Us' was a 2011 New York Times science article written by John Markoff. He stated that:

The robotics pioneer Rodney Brooks often begins speeches by reaching into his pocket, fiddling with some loose change, finding a quarter, pulling it out and twirling it in his fingers. The task requires hardly any thought. But as Dr. Brooks points out, training a robot to do it is a vastly harder problem for artificial intelligence researchers than IBM's celebrated victory on Jeopardy…. Although robots have made great strides in manufacturing, where tasks are repetitive, they are still no match for humans, who can grasp things and move about effortlessly in the physical world. Designing a robot to mimic the basic capabilities of motion and perception would be revolutionary, researchers say, with applications stretching from care for the elderly to returning overseas manufacturing operations to the United States.

So, let's leave the discussion about robots for another time. Instead, I'll focus on defining augmented intelligence and differentiating it from artificial intelligence. It's more than a question of semantics. Artificial intelligence, perhaps because of its popular-culture use in general and its science fiction use in particular, can conjure up images of sentient machines with personal agendas. It suggests a culture where, at least in some part, humans are no longer required to make decisions. Some industry experts believe that the term artificial intelligence can create more negative speculation about the future than hope.

Whatis.com defines augmented intelligence as an alternative conceptualization of artificial intelligence that focuses on AI's assistive role, emphasizing that it is designed to enhance human intelligence rather than replace it. As an alternative label for artificial intelligence, it also reflects the current state of the technology and research more accurately.

According to an article by Athar Afzal, “We’ve transitioned from an agricultural-dominated society to the industrial revolution – and now to a more data-driven economy. What we’ve witnessed during each of these stages is some form of mechanics or machinery developed to augment our performance, thereby improving our outcome…. The world has a lot of opportunity to gain and make our lives better with augmented intelligence – it’ll make our lives far smoother and more enjoyable. I invite everyone to view Ginni Rometty’s speech at the World Economic Forum.”

Researchers and marketers hope the term augmented intelligence, which has a more neutral connotation, will help people understand that AI will simply improve products and services, not supplant the people who use them.

While a sophisticated AI program is certainly capable of making a decision after analyzing patterns in large data sets, that decision is only as good as the data that human beings gave the program to use. The choice of the word augmented, which means "to improve," reinforces the role human intelligence plays when using machine learning and deep learning algorithms to discover relationships and solve problems. I've summarized some definitions by Jean-Albert Eude below.

Machine learning is a type of artificial intelligence (AI) that allows software applications to become more accurate in predicting outcomes without being explicitly programmed. The basic premise of machine learning is to build algorithms that can receive input data and use statistical analysis to predict an output value within an acceptable range. The processes involved in machine learning are similar to those of data mining and predictive modeling: both require searching through data to look for patterns and adjusting program actions accordingly.
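As a toy illustration of that premise, the sketch below (in Python, using scikit-learn) fits a model to synthetic data and then predicts an output value without the underlying relationship ever being explicitly programmed. The data and numbers are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic input data: the "hidden" pattern is y = 3x + 2, plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.5, size=100)

model = LinearRegression()
model.fit(X, y)                # learn the pattern statistically from the data

print(model.predict([[4.0]]))  # ~14.0, predicted without the 3x + 2 rule
                               # ever being written into the program
```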

Deep learning is an aspect of artificial intelligence (AI) that is concerned with emulating the learning approach that human beings use to gain certain types of knowledge. At its simplest, deep learning can be thought of as a way to automate predictive analytics. While traditional machine learning algorithms are linear, deep learning algorithms are stacked in a hierarchy of increasing complexity and abstraction. Each algorithm in the hierarchy applies a non-linear transformation on its input and uses what it learns to create a statistical model as output. Iterations continue until the output has reached an acceptable level of accuracy. The number of processing layers through which data must pass is what inspired the label "deep." The advantage of deep learning is that the program builds the feature set by itself without supervision. This is not only faster, it is usually more accurate. In order to achieve an acceptable level of accuracy, deep learning programs require access to immense amounts of training data and processing power, neither of which was easily available to programmers until the era of big data and cloud computing.
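A minimal sketch of that stacking idea, again using scikit-learn for brevity: two hidden layers each apply a non-linear transformation to the output of the layer below. Real deep learning stacks many more layers over vastly more data; this example is only meant to show the structure.

```python
from sklearn.neural_network import MLPClassifier

# XOR is not linearly separable, so a single linear model fails on it;
# stacked non-linear layers can represent it.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

net = MLPClassifier(hidden_layer_sizes=(8, 8),  # two stacked hidden layers
                    activation="tanh",           # non-linear transformation
                    solver="lbfgs",              # suits tiny datasets
                    max_iter=1000,
                    random_state=1)
net.fit(X, y)
print(net.predict(X))  # ideally [0 1 1 0]
```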

The value of such augmented predictive analytics to a segment of the economy as dependent on data as the mortgage industry is obvious. What is also obvious, unfortunately, is that we may be among the last to seat ourselves at the technology table.

Often, an early title or tagline for a concept or theory evolves over time as others develop their ideas and work toward a solution. In the mortgage industry, the concept of paperless mortgages was proposed in the early 1990s to reduce or eliminate what some saw as unnecessary paper and to improve the overall experience for the consumer. Along the way we started referring to it as an electronic mortgage (e-mortgage), and now it is the digital mortgage: an all-inclusive package of data and documents in a format fit for both human and machine consumption. That will certainly achieve the initial objectives of eliminating paper and improving the consumer experience. The operational benefits extend from origination all the way through to the secondary market.

But going digital without building the internal architectures to capitalize on data-driven support technology is like going to a 3D movie, but not putting on the 3D glasses to watch it. If we don’t keep moving our own finish line, we risk being trampled by those with a longer view of the race.


An Interesting Look At The Future

If it is true that the only constant thing in life is change, then the twenty-first century is proving to be a predictably constant time in which to live and make a living. Our industrial revolutions have always been about dramatic changes in the scope and scale of the technology platform supporting societies: water, steam, electricity, electronics, and information technology—all have transformed our standard of living even as they have irrevocably altered the business landscape. And if we look at the timeline of human civilization, we can see that those revolutions are all grouped in the more recent past, with the time between sea changes becoming shorter as we get closer to the present.

And why not? There are mathematical laws about exponential growth that govern our understanding of everything from how an avalanche will cascade down a mountain to how a virus will spread through an unvaccinated population. Change won’t just keep coming; it will keep happening more quickly. Some are even referring to the new industrial revolution as the Exponential Age because of the exponentially accelerating technologies that have the potential to disrupt industries that seem isolated and protected from the trends affecting the more obvious “technology” sectors.

An interesting look at the future: Singularity University is a Silicon Valley think tank founded in 2008 at the NASA Research Park in California and is supported by NASA and Google. Udo Gollub, a German writer and entrepreneur, documented his thoughts after a Singularity University Summit. Some of his thoughts are included below to provoke thought and discussion within your organization.

Software will disrupt most traditional industries in the next 5-10 years. Every organization, big and small, in every conceivable line of business faces the daunting task of deciding how much time and effort to spend on its current marketplace and product line. My previous articles focused on looking at new opportunities. You need to ask yourself whether the business space in which you want to operate will exist in some form in the future. If you think that it will, what can you do to future-proof it from a fate like Kodak's?

In 1888, George Eastman founded Kodak. In 1998, Kodak had 170,000 employees and sold 85% of all photo paper worldwide. Within just a few years their business model had disappeared, and Kodak eventually went bankrupt. What happened to Kodak will happen in a lot of industries in the next 10 years, and most people don't see it coming. Did you think in 1998 that three years later you would never take pictures on paper film again? Yet digital cameras were invented in 1975. The first ones had only 10,000 pixels, but the technology followed Moore's Law. As with all exponential technologies, it was a disappointment for a long time before it advanced enough to gain mainstream acceptance in only a few short years.

Moore's law is directly related to computing. It is the observation, made by Gordon Moore, that the number of transistors in a dense integrated circuit doubles approximately every two years. More loosely, Moore's law here refers to the exponential growth of technology.
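Stated as a formula, the observation is easy to compute with. The sketch below projects transistor counts from the commonly cited 1971 baseline of roughly 2,300 transistors (Intel's 4004); the clean two-year doubling is, of course, an idealization rather than a measured law.

```python
# Moore's observation as arithmetic: counts double roughly every two
# years. The 1971 Intel 4004 baseline (~2,300 transistors) is a
# commonly cited reference point, used here purely for illustration.

def transistors(year, base_year=1971, base_count=2300):
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{transistors(year):,.0f}")
# Each decade multiplies the count by 2**5 = 32.
```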

What happened to Kodak will happen in a lot of industries: product and service leaders will be blindsided by revolutions that have been hiding in plain sight for some time, but are only now reaching critical mass. Let’s look at some examples and thoughts offered by Gollub.

>>Uber is just a software tool; it doesn't own any cars, yet it is now the biggest taxi company in the world.

>>Airbnb is now the biggest hotel company in the world, although it doesn't own any properties.

>>Bitcoin will become mainstream this year and might possibly become the default reserve currency in the future.

>>In 2018 the first self-driving cars will appear for the public. Around 2020, the entire industry will start to be disrupted.

1.) You won't want to own a car anymore. You will call a car with your phone; it will show up at your location and drive you to your destination. You will not need to park it, you will pay only for the distance driven, and you can be productive while riding. Our kids will never get a driver's license and will never own a car.

2.) It will change our cities, because we will need 90-95% fewer cars. We can transform former parking space into parks.

3.) Most car companies may go bankrupt. Traditional car companies take the evolutionary approach and just build a better car, while tech companies (Tesla, Apple, Google) take the revolutionary approach and build a computer on wheels.

4.) 1.2 million people die each year in car accidents worldwide. Insurance companies will have massive trouble, because without accidents insurance will become 100x cheaper. Their car insurance business model will disappear.

5.) Real estate will change, because if you can work while you commute, people will move farther away to live in more beautiful neighborhoods.

>>The price of the cheapest 3D printer came down from $18,000 to $400 within 10 years. Over the same period, it became 100 times faster.

>>Electricity will become incredibly cheap and clean: last year, more solar capacity was installed worldwide than fossil-fuel capacity. Solar production has been on an exponential curve for 30 years, but only now can you see the impact.

1.) This represents a smooth doubling every two years of the amount of solar energy we’re creating, particularly as we’re now applying nanotechnology, a form of information technology, to solar panels.

2.) The price for solar will drop so much that all coal companies may be obsolete by 2025.

>>A generation ago students at MIT all shared one computer that took up a whole building.

1.) The computer in your cellphone today is a million times cheaper, a million times smaller, and a thousand times more powerful.

2.) That’s a billion-fold increase in capability per dollar that we’ve experienced. And we’re going to do it again in the next 25 years.

Computers will become exponentially better at understanding the world. Let's take a moment to compare linear steps with exponential steps. When we take 10 linear steps (1, 2, 3, etc.), we get to 10. If we take 10 exponential steps (2, 4, 8, etc.), we get to 1,024. The difference between the two rates of growth becomes staggering in a relatively short period of time.
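The arithmetic is easy to verify; this short sketch reproduces the comparison and shows how quickly the gap widens.

```python
# Ten linear steps versus ten doublings, as described above.
steps = 10
linear = steps              # 1, 2, 3, ... reaches 10
exponential = 2 ** steps    # 2, 4, 8, ... reaches 1024
print(linear, exponential)  # 10 1024

for n in (10, 20, 30):
    print(f"{n} steps: linear {n} vs exponential {2 ** n:,}")
# By 30 steps the exponential path already exceeds one billion.
```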

The exponential growth of computing predates Gordon Moore and applies to any technology with measurable information properties. People have asked about what happens after Moore’s Law comes to an end. The answer, as always: we will then go to the next paradigm.

In the 1950s, engineers were shrinking vacuum tubes, making them smaller and smaller. They finally hit a wall: they couldn't shrink the vacuum tube any further and still keep the vacuum. That was the end of the shrinking of vacuum tubes, but it was not the end of the exponential growth of computing. We went to the fourth paradigm, transistors, and then to integrated circuits. When that comes to an end, we'll go to the sixth paradigm: three-dimensional, self-organizing, molecular circuits.

Our current generation of business leadership must be able to navigate these industry evolutions faster and more effectively than at any time in the past if their organizations are to survive and thrive.

Next month, we will examine the impact of Artificial Intelligence on technology innovation.


The Art Of Opportunity, Pt. 2

In my last article, we examined the business strategies described in The Art of Opportunity, by Marc Sniukas, Parker Lee and Matt Morasky.

Michael Porter's classic book, Competitive Advantage: Creating and Sustaining Superior Performance, described strategies for achieving competitive advantage by 1) becoming a cost leader, 2) differentiating your offering, or 3) focusing on a niche.

So how does The Art of Opportunity differ from these more traditional approaches championed by Porter? First of all, we see a shift from focusing on achieving competitive advantage by simply being cheaper or different to finding and seizing opportunities by creating value.

To be clear, the authors don’t suggest that the sort of traditional strategic management approaches described by Porter do not work. For some organizations and in certain industries, they work extremely well, if applied in the right way. And yet, a lot of companies nevertheless struggle when attempting to achieve their growth and innovation targets within these traditional frameworks. The following sections summarize some of the authors’ key points.

Where should you look for new growth opportunity? This statement from Professor David Bell says it best: "The first principle of finding new growth is that you're always better off going after customers who are underserved or neglected." Why is that? The authors state, "Only by gaining a deep understanding of customers, their true needs and expectations, as well as their satisfaction or dissatisfaction with current offerings, will you gain the insights needed to develop solutions that customers really want to buy. Most companies don't know why customers do or don't do business with them in the first place."

Understand customer needs, expectations and choices: The basic idea is that customers do not buy products because they want to own the product, but because they have an objective they would like to fulfill with that product…. So, should you just ask existing customers what they want and need? Henry Ford is credited with saying, "If I had asked people what they wanted, they would have said faster horses." Henry Ford's customers might have wanted faster horses, but they probably also wanted something that was a bit more comfortable and convenient, needed less maintenance, and possibly was cheaper. Understanding why and when customers buy a certain product opens new ways of segmenting your market. Opportunities are a function of the chosen customer segment, its needs, and its expectations toward the solution offering. Once you understand your customers' needs and the experiences they have trying to fulfill those needs, you can investigate what stands in the way of a satisfactory customer experience.

Understand your firm: Looking at your customers is an external search approach to uncovering growth opportunities. When looking at your company with an internal slant, the key is still to discover new opportunities for growth from existing and new customers. Once you understand why customers come to you, you will have a good sense of what you excel at doing. These are the underlying capabilities and competencies that make your company special. Think about the strengths, capabilities, and resources of your business that you could leverage to create new businesses. The book lists several questions to help you identify your valuable, rare, and costly-to-imitate resources and determine how to organize to exploit them.

Frame your growth opportunity: For starters, we are not focused on traditional forms of growth like the following:

  1. Selling more of the same: Market penetration occurs when a firm pushes further into a market where its current products or services are already offered, going head-to-head with incumbents.
  2. Growth through mergers and acquisitions: Mergers and acquisitions are often undertaken to increase the size of the firm. Some add capabilities, and some add product lines outside the current core business to diversify.

Instead the authors see three types of growth:

  1. Evolutionary growth: This type of growth is closest to your core. You evolve by removing hurdles to satisfaction and barriers to consumption for your existing offerings. This could include making your products more consumer-friendly or upgrading your services.
  2. Adjacent growth: This is expanding your offering to cover additional steps in the consumer experience or offering other similar products. It is closely related to and complements your core. Adjacent growth bears a little more risk than evolutionary growth, as you are venturing into slightly new territory. Yet, because you are staying close to your core, the risk remains manageable.
  3. Breakthrough growth: This is the type of growth that goes well beyond the limits of your current business. This entails not only the development and launch of a completely new strategy to market an offering outside of your company’s existing business definition, but also the design of a new business model and/or revenue model as part of the new strategy. Breakthrough growth is obviously not only the most difficult, but also the riskiest type to achieve. Yet it also bears the highest rewards, if successful.

What type of growth are you aiming at? Each growth model is appropriate for specific situations. There is no single prescribed model type and, in fact, you may benefit from combining them in order to adapt to your firm's individual situation. Clarifying your objectives and the type of growth you are aiming at will enable you to focus your subsequent strategy efforts, provide your team with guidance, and avoid pursuing opportunities your company might not be comfortable with at present.

Now that you’ve identified your opportunity, how are you going to seize it? While traditional strategy would have you focus on products and services, strategic innovation means you will focus on the following:

  1. Offering: The mix of products, services and the customer experience.
  2. Business model: The way you operate and the activities to do business.
  3. Revenue model: Where the money will come from, how you set prices, and how payment is made.

Although the three parts are presented in a sequential order, innovation can come from each of them. In practice, you are likely to cycle back and forth as each component informs the other. At the end of the day, you need to make sure all three parts are integrated and support each other.

The authors' research has shown that successful strategic innovators go through three phases.

(1) The inception phase, within which an opportunity for new growth is discovered.

(2) The evolution phase, during which the offering, business model, and revenue model are adapted.

(3) The diffusion phase, during which the focus of activities shifts from designing and crafting the strategy to scaling up the new business.

Summary: It isn't possible to give adequate emphasis to the depth and breadth of the authors' material here. There are many examples throughout the book of companies utilizing this process and methodology. The authors have skillfully employed visualizations, diagrams, and templates in support of their concepts. My articles have simply been an overview of this book; I hope they have piqued your interest in exploring it further.


The Art Of Opportunity

The 2016 Progress in Lending Innovation Award winners are presented in this issue. As was the case in the previous six years, this year's honorees are a mixture of well-established companies and first-time entrants. However, what is consistent in the applications is the detailed responses to the application criteria: significance, originality, positive change, intangible ROI, and hard-savings ROI. And the applications get better every year. The panel of judges, composed of members of the Progress in Lending Executive team, relies heavily on those responses, and the scores are weighted based on the category. It has truly been an honor over the years to recognize some outstanding innovative solutions for the financial industry.

Over the years, I have probably written over 100 articles on the mortgage industry. I certainly don't consider myself a journalist and sometimes find it very challenging to find something unique to write about that hasn't been presented many times before. My goal is simply to provide the reader the opportunity to explore ideas that might make a difference in their everyday life. I have primarily focused on how technology can be leveraged. I want the reader to be creative and innovative, to think outside the box, and to avoid the limitations of a thought process that begins with, 'We have always done it this way.' I am an avid reader, and many of my story ideas come from a multitude of articles and books that are not necessarily related to our industry. In keeping with the theme of innovation, the focus of this article is the book The Art of Opportunity, by Marc Sniukas, Parker Lee and Matt Morasky. This book lays out a roadmap and a collaborative process supported by visualizations, tools, and templates, as well as many real-world examples, to direct you in developing a business growth plan for your organization. Let's start with some excerpts from the Foreword:

"The difficulty lies not so much in developing new ideas as in escaping from old ones." – John Maynard Keynes

"Our industry does not respect tradition; it only respects innovation." – Satya Nadella, CEO, Microsoft

When many of today's leaders joined the workforce, "innovation" was synonymous with research and development or process efficiencies—the hallmarks of traditional competitive advantage. Little did any of us know then that in our lifetime an entire occupational discipline would emerge to keep companies "innovative," or continuously inventive….

But it did. And for good reason. The relatively short span of time in which we've seen some of the titans of industry displaced by "innovative" start-ups put the entire business world on notice. And the message is clear: merely maintaining your position is no longer sufficient. New growth, the kind associated with genuine innovation that will bring value to your customers, your business, and even the world around you, is the only way to ensure survival….

The problem is, finding and capitalizing on new growth opportunities is hard—especially for established organizations that are often hampered by outdated mindsets, legacy business models, or large-scale bureaucracies. Core competencies can morph into corporate rigidities if we're not strategically alert and careful. Under these circumstances, the ability to think outside the box and create new growth initiatives is difficult. But with increased urgency comes the need to find a new path to growth—one that isn't rocket science.

Over the years there have been numerous business books on how to improve your organization's innovation, strategy, and competitive advantage. So, what makes this book stand out? Mainly, it is that the authors focus on two major points: strategic innovation, which differs from traditional approaches by directing our focus to finding and seizing opportunities by creating value, and business design thinking, defined as a collection of principles to help understand, address, and develop solutions to business problems.

Traditional strategic management is fixated on where to play and how to win. You determine where to play by choosing your industry and a specific market/product offering. You determine how to win by setting your competitive advantage: being a cost leader, differentiating your offering, or focusing on a niche.

Strategic Innovation redefines where to play as finding new growth opportunities. The emphasis is on the customer, their needs, expectations, and experiences rather than the industry or competitors. Instead of simply addressing cost, pricing, and product/service differentiation with how to win, you focus on creating customer value by solving your customer’s needs better than anyone else. You create value for your firm with further opportunities. Inserted between the two is how to play, where you design the business required to seize these opportunities. Let’s examine this further.

Where to play: This is all about finding your new growth opportunities. Research has shown that organizations develop more successful and innovative offerings by starting with their customers. Opportunities are a function of the chosen customer segment, its needs, expectations toward the solution offering, and current barriers to consumption or hurdles to a satisfactory customer experience.

How to play: This is all about crafting strategy, which includes the mixture of products, services, and the customer experience with the manner in which you operate and the activities necessary to do business. This will define where the money will come from, how you set prices, and how payment is made.

How to win: This is all about creating value for the customer, your organization, and the ecosystem. Instead of competing on low cost and/or differentiation, the winners in today’s economies focus on creating value and benefits for multiple stakeholders.

Finally, the book illustrates how the process for strategy making and execution and for building the new growth businesses is neither entirely linear nor completely iterative. It provides examples of how companies go through an iterative process with phases that favor action over analysis and planning.

What is Business Design Thinking? If strategic innovation focuses on the content of your new growth strategy and the process of crafting that strategy, business design thinking focuses on the practices that enable your team to achieve success more effectively and efficiently. These are the five principles of business design thinking: 1) Keep a human-centered focus, 2) Think visually and tell stories, 3) Work and co-create collaboratively, 4) Evolve through active iteration, and 5) Maintain a holistic perspective.

The Art of Opportunity incorporates each of the five principles to represent how an organization can change its way of working. Executives applying business design thinking to their way of working will develop capabilities and practices that differentiate them from their peers.

I would encourage everyone to read this book. We will continue this exploration next month.


The AI Era Is Here – Pt. 2

In the latest issue of Fortune, Erin Griffith examines the investment trends in AI (artificial intelligence) technology and poses the question: is AI an overhyped fad or a revolution? She writes, "There's an easy way to tell when the hype around a technology trend has peaked. 1) Are the smartest venture capitalists complaining about valuations? 2) Are big tech companies snapping up start-ups so young they can barely be considered real businesses? 3) Are Fortune 500 executives talking about their [insert trend here] strategy? If the answer to any of these questions is yes, congratulations! You've identified a fad."

Of course, most revolutions look like fads in their early days—because they are. Distinguishing between those fads that will fade and those that will become the norm in the long term can be difficult, and as in all things, hindsight has a much higher success rate than foresight when it comes to identifying the winners and the losers. So what data might guide such an evaluation?

The research firm CB Insights recently reported that there were 658 venture capital deals in the AI sector in 2016, amounting to $5 billion in startup funding, a significant increase from $589 million in 2012. International Data Corporation projects worldwide revenue from artificial intelligence and cognitive systems to reach $47 billion in 2020, up from $8 billion in 2016.

CB Insights selected 100 of the most promising artificial intelligence startups globally from a pool of 1,650 candidates based on factors like financing history, investor quality, and momentum. A look at the top 50 shows that AI is surging worldwide with 20% located outside the United States. They cover a wide range of market segments: core AI, FinTech, auto, health care, commerce, CRM, cyber-security, robotics, business intelligence, and text analysis and generation.

Interestingly, the fact that AI does not necessarily intersect with established business cases has not proven to be a hurdle to investment. “These are not businesses,” says John Somorjai, executive vice president of corporate development at Salesforce, which has acquired a handful of AI companies. “These [deals] are about technology and talent.”

The development of artificial intelligence has inspired both fascination and dread.

In 1955, the term AI represented the concept of autonomous systems modeled on the structure of the human brain. At the same time, other researchers were tackling a different problem: finding patterns in what was then considered great volumes of data and making proper selections, or decisions, based on that data. In 1956 William Ross Ashby wrote in his Introduction to Cybernetics that "…what is commonly referred to as 'intellectual power' may be equivalent to 'power of appropriate selection'." This was not intended as "artificial intelligence" in the way we typically understand it, and in fact was labeled as the inverse: IA, or Intelligence Augmentation. If this model sounds suspiciously familiar, it is because today's AI systems are constructed on the IA paradigm. Our real-world applications, including language processing, machine learning, and human-computer interaction, are based on IA—data pattern recognition and appropriate decision making—and as such, they augment our capacity to understand what is happening in the complex world around us. While the term "AI" became the label of choice for such technology, it is an ironic misnomer.

Let's look at the article "Why You Should Let Artificial Intelligence Creep Into Your Business" in the March 2017 issue of Inc. magazine for some definitions:

How AI works: problem solving: Unlike traditional computing, which delivers precise solutions within defined parameters, AI, sometimes referred to as cognitive computing, teaches itself how to solve problems. “Instead of delivering specificity, AI-centric programming generates millions of solutions, evaluating each for efficacy and then choosing the most viable and optimal ones,” says Amir Husain, CEO and founder of SparkCognition.
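As a toy illustration of that generate-and-evaluate loop, the sketch below produces a large batch of candidate solutions, scores each for efficacy, and keeps the most viable one. The objective function is invented purely to show the pattern; it is not SparkCognition's actual method.

```python
import random

def score(candidate):
    # Efficacy of a candidate solution: higher is better, peak at x = 3.
    # This objective is illustrative only.
    return -(candidate - 3) ** 2

random.seed(42)
# Generate a large batch of candidate solutions...
candidates = (random.uniform(-10, 10) for _ in range(1_000_000))
# ...evaluate each one, and choose the most viable.
best = max(candidates, key=score)
print(f"best candidate: {best:.4f} (score {score(best):.8f})")
```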

What it does better: data diving: Manually finding your target customer, by searching and poring through income-level, interest-based, and geographical data, is labor-intensive and time-consuming. AI cuts to the chase. "For example, using a feed of three key pieces of information that the entrepreneur provides (a brief product description, images, and a price range), an AI system can zip through social media and other online outlets, looking for correlations between product and digital conversations," says Husain. If you give it the green light, AI's natural language processing technology then writes and sends a sales pitch, notes transmission times, and analyzes feedback. "You can almost hear an AI system going, Aha! I've cracked the code," says Husain, adding that AI constantly optimizes itself by making slight changes to the message.

Where it works: practical apps: One key reason for AI's upsurge is entrepreneurs' free or inexpensive access to libraries such as IBM Watson, Google TensorFlow, and Microsoft Azure. These application programming interfaces (APIs) allow coders to build AI apps without starting from scratch. Husain expects to see a proliferation of AI-centric marketing, sales, and other service startups focused on small and medium-size businesses.

Let’s look at some specific examples from the same article.

Call Centers: The biggest misconception about AI is that it's robots with human faces sitting at remote desks. "AI is nothing more than an add-on technology, spice and flair, to an otherwise conventional system, such as a traditional travel-reservation site that, because of AI, can now converse with a human," says Bruce W. Porter, an AI researcher and computer science professor at the University of Texas, Austin. Porter emphasizes that future breakthroughs will not be 100 percent AI. "AI will likely provide a 10 percent product or service performance boost," he says. That is, in fact, huge. Firms that fail to make the leap, he says, may fail to have customers.

Information Retrieval: Not all searches are as simple as typing a few keywords and having Google take over. Entrepreneurs often need more in-depth and complicated excavations, for patent and trademark data, for example, and that, in turn, often involves a hefty legal budget to pay a highly trained human. Porter foresees that within five years many companies will offer such services to consumers who have no experience in AI or the specific knowledge fields; they'll be able to conduct their own AI-based data retrieval. Count on industry disruption, he says, as this type of AI application will leapfrog current data-retrieval-service providers.

Contract Generation: Because it's able to generate natural language, AI is an exceptional tool for helping entrepreneurs assemble contracts, as opposed to buying them off the shelf at, say, LegalZoom. AI applications will converse with users (by text and, ultimately, voice) and tease out information that will become components of formal agreements, such as details about fee payments and product returns. Porter anticipates users will pay to access cloud-based AI computer systems to produce such documents. AI-centric startups, because they don't require a human in the loop and won't need to hire staffers, can offer their services at a very low cost, especially given an anticipated large volume of customers and business competition.

AI can displace humans, but it can’t replace them.

Leaders of every industry and institution are sprinting to become digital. Who will win? The answer is clear: It will be the companies and the products that make the best use of data. And the ones that make the best use of data will likely be the ones that use AI to gain efficiencies in data analysis and decision making.


The AI Age Is Here


Artificial intelligence has gained prominence recently due, in part, to big data, or the increase in the speed, size, and variety of data that businesses are now collecting. Artificial intelligence, or AI, can perform data-related tasks with great efficiency, and it can identify patterns in the data that often elude human analysis. As organizations strive to gain more insight from their data, it's not surprising that the business world is looking to AI for a competitive edge.

It's not like this is the latest and greatest innovation! The term artificial intelligence—an umbrella concept that encompasses everything from robotic process automation to actual robotics—was coined in 1955 by John McCarthy, an American computer scientist, and it gained traction in the academic community at the Dartmouth Conference the next summer. As Daniel Crevier describes it in his book AI: The Tumultuous History of the Search for Artificial Intelligence:

In the summer of 1956, ten young scientists, some barely out of their doctoral studies, sat down to consider the astounding proposition that ”every aspect of learning or any other feature of intelligence can, in principle, be so precisely described that a machine can be made to simulate it.” Armed with their own enthusiasm, the excitement of the idea itself, and an infusion of government money, they predicted that the whole range of human intelligence would be programmable within their own lifetimes. Nearly half a century later, the field has grown exponentially – with mixed results.

By the early 1980s, a consensus was forming that expert systems were the future of artificial intelligence. An expert system is a computer system that mimics the decision-making skills of a person. It makes sense in theory: feed the system enough data to build up the proficiency of a human expert, and you should get human-like decisions from it. Unfortunately, such systems are prohibitively expensive to develop and have proven useful only in targeted scenarios. In many respects AI has demonstrated wide scope but shallow influence: it has touched countless disciplines, but its impact has been limited to the simplest forms of call-and-response interactions.
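A toy example shows why such systems stay narrow. In the Python sketch below (a hypothetical illustration in the 1980s mold, with invented thresholds that are not real lending policy), every rule is written by hand, so every scenario the expert didn't anticipate falls through to a human.

```python
# A toy expert system: hand-written if-then rules that mimic one narrow
# slice of an expert's judgment (a hypothetical loan pre-screen). The
# thresholds are illustrative only, not actual underwriting policy.
def prescreen_loan(credit_score: int, debt_to_income: float) -> str:
    rules = [
        (lambda: credit_score < 580, "decline: credit score too low"),
        (lambda: debt_to_income > 0.43, "decline: debt-to-income too high"),
        (lambda: credit_score >= 740, "approve: strong credit"),
    ]
    for condition, conclusion in rules:
        if condition():                       # fire the first rule that applies
            return conclusion
    return "refer to human underwriter"       # the rules don't cover every case

print(prescreen_loan(credit_score=700, debt_to_income=0.30))
```

Capturing a real expert takes thousands of such rules, each elicited, coded, and maintained by hand, which is why expert systems proved so expensive and so confined to niches.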

Today's AI research and development focuses on artificial neural networks: systems modeled on the interconnected structure of the human nervous system. AI can combine the reasoning ability of the human mind with the processing power of computers, as in Apple's Siri personal assistant and Amazon's Alexa. A recent article in the Wall Street Journal stated, "Spending on AI technology is expected to grow to $47 billion in 2020 from a projected $8 billion this year, according to market-research firm IDC."
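To make the neural-network idea concrete, here is a minimal Python sketch of a two-layer network's forward pass. The weights are random stand-ins for what a real network would learn from data.

```python
# A minimal artificial neural network: two layers of weighted connections,
# loosely analogous to interconnected neurons. Random weights stand in for
# what a real system would learn from training data.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)   # 3 inputs -> 4 hidden units
W2, b2 = rng.normal(size=(4, 2)), np.zeros(2)   # 4 hidden -> 2 outputs

def forward(x: np.ndarray) -> np.ndarray:
    hidden = np.maximum(0, x @ W1 + b1)   # ReLU: each "neuron" fires or not
    return hidden @ W2 + b2               # raw output scores

print(forward(np.array([0.5, -1.0, 2.0])))
```

Stack more layers and train the weights on enough examples, and this same pattern scales up to the speech and language systems behind Siri and Alexa.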

Artificial intelligence is now on the syllabus at top-tier business schools, and some business executives are working to become familiar with methods of managing the development of applications and the design of algorithms across multiple lines of business. Brian Uzzi, a professor at Northwestern University's Kellogg School of Management, has co-developed three AI courses for M.B.A.s. In April 2017, Kellogg plans to introduce Human and Machine Learning, a 10-week elective course. The broader objective, according to Mr. Uzzi, isn't to create a cadre of engineer-executives but to introduce future corporate leaders to the idea of making decisions with the help of machines.

A recent MIT Technology Review article examined a major report from Stanford University, coauthored by more than twenty leaders in the fields of AI, computer science, and robotics, and concluded that AI looks certain to upend huge aspects of everyday life, from employment and education to transportation and entertainment. The analysis is significant because public alarm over the impact of AI threatens to shape public policy and corporate decisions.

The report predicts that automated trucks, flying vehicles, and personal robots will be commonplace by 2030, but it cautions that remaining technical obstacles will limit them to certain niches. It also warns that the social and ethical implications of advances in AI, such as the potential of unemployment in certain areas and likely erosions of privacy driven by new forms of surveillance, will need to be open to discussion and debate.

On December 10, 2016, Andrew Tonner published "9 Artificial Intelligence Stats That Will Blow You Away."

1.) Voice assistant software is the #1 AI app today: Many of these voice-powered AIs still leave something to be desired in terms of accuracy, and it was surprising that voice assistants outnumbered big data in overall popularity with businesses.

2.) AI bots will power 85% of customer service interactions by 2020: Bye-bye, call centers and wait times. According to researcher Gartner, AI bots will power 85% of all customer service interactions by the year 2020.

3.) Digital assistants will “know you” by 2018: Also from Gartner, digital customer assistants will be able to “mimic human conversations, with both listening and speaking, a sense of history, in-the-moment context, timing and tone, and the ability to respond, add to and continue with a thought or purpose at multiple occasions and places over time.”

4.) Amazon, Alphabet, IBM, and Microsoft to host 60% of AI platforms: These four tech giants already have significant cloud computing businesses, and researcher IDC expects that trend to continue; by the start of the next decade, they will host 60% of AI platforms and control most of the market for AI software applications.

5.) Get excited for self-driving cars: According to a study from leading consultancy McKinsey, the impact of self-driving cars will be tremendous, saving an estimated 300,000 lives per decade by reducing fatal traffic accidents. This is expected to save $190 billion in annual critical care and triage costs.

6.) 20% of business content will come from AIs by 2018: In a potentially apocalyptic turn for members of the media reading (or writing) this, AI-powered software will write as much as 20% of business content in a mere two years' time, according to Gartner.

7.) AI drives a $14-33 trillion economic impact: In a research report to its investors, Bank of America argued that the rise of AI will lead to cost reduction and new forms of growth that could amount to $14-$33 trillion annually, in what it calls “creative disruption impact,” and that’s just the tip of the iceberg in some experts’ view.

8.) Robots will be smarter than humans by 2029? According to Alphabet director of engineering Ray Kurzweil, machines will be smarter than us by 2029. Kurzweil doesn't necessarily see this as a negative, though. Among many other "bold" predictions about our AI-laden futures, he believes people will start living forever around the year 2029 as well. Whether that's the result of some Matrix-like scenario coming to fruition isn't immediately clear, but leading experts in the field clearly believe major changes to our social fabric are only a little more than a decade away.

9.) Zero people actually know how big an impact AI will have: While it's certainly easy to get wrapped up in the litany of predictions, it's perhaps most useful to simply keep in mind that AI should have a major economic impact, one that investors can begin to benefit from today.

The one concrete takeaway is that AI will contribute to the rapidly shifting technology landscape for our industry. Organizations that want to get or stay ahead will be flexible adapters who are willing to evolve their operations to take advantage of AI-based tools that enhance the customer experience, streamline internal processes, and feed the business pipeline.

Summary: Artificial intelligence (AI) is all around us – we encounter it in our daily tasks, such as talk-to-text and photo tagging, and it is contributing to cutting-edge innovations such as precision medicine, injury prediction, and autonomous cars. AI is the next big revolution in computing and holds the promise to provide insights previously unavailable while also solving the world’s biggest challenges.

About The Author

Roger Gudobba
Roger Gudobba is passionate about the importance of quality data and its role in improving the mortgage process. He is vice president, mortgage markets at Compliance Systems and chief executive officer at PROGRESS in Lending Association. Roger has over 30 years of mortgage experience and has been an active participant in the Mortgage Industry Standards Maintenance Organization (MISMO) for 17 years. He was a Mortgage Banking Technology All-Star in 2005. He was the recipient of Mortgage Technology Magazine's Steve Fraser Visionary Award in 2004 and the Lasting Impact Award in 2008. Roger can be reached at rgudobba@compliancesystems.com.

Analyzing The Election


A Forbes article by Jonathan Vanian said it best, “Prior to Trump’s upset win, virtually all national polls showed the businessman and reality television star trailing Democratic nominee Hillary Clinton. Her win was considered inevitable, with prominent pollsters and pundits merely arguing about how big her guaranteed victory would be. And then on Tuesday, voters proved the experts wrong.”

What did Trump do right? To say he ran an unconventional campaign would be a gross understatement. He methodically eliminated 16 primary contenders, countering their talking points with his bombastic personality. He captured an inordinate amount of free TV time by being outlandish. Trump had a sense of what people wanted to hear and recognized that anger among working class white voters ran deep. He played to emotion, not data points.

Steven Bertoni's article in the December 20, 2016, issue of Forbes magazine was headlined: "The secret weapon of the Trump campaign: his son-in-law, Jared Kushner, who created a stealth data machine that leveraged social media and ran like a Silicon Valley startup." Kushner, who had no political experience, committed to Trump's campaign in November 2015, after seeing a raucous Trump rally in Springfield, Illinois. On the trip home, Trump and Kushner talked about how the campaign might better use social media.

"At first Kushner dabbled, engaging in what amounted to a beta test using Trump merchandise. 'I called somebody that works for one of the technology companies that I work with, and I had them give me a tutorial on how to use Facebook micro-targeting,' Kushner says. The Trump campaign went from selling $8,000 a day worth of hats and other items to $80,000, generating revenue, expanding the number of human billboards—and proving a concept. By June, the GOP nomination secured, Kushner took over all data-driven efforts. Within three weeks, in a nondescript building outside San Antonio, he had built what would become a 100-person data hub designed to unify fundraising, messaging, and targeting."

In this case, a lack of awareness of traditional campaigning was an advantage. Kushner was able to look at the business of politics without the constraint of precedent.

Eric Schmidt, the former CEO of Google and one of the designers of the Clinton campaign's technology system, agrees about Kushner's impact: "Jared Kushner is the biggest surprise of the 2016 election. Best I can tell, he actually ran the campaign and did it with essentially no resources."

What did Clinton do wrong? According to The Washington Post, Clinton's campaign relied on a custom algorithm called Ada, a complex computer program that staff fed "a raft of polling numbers, public and private," and that played a role in most strategic decisions Clinton aides made. According to aides, Ada ran 400,000 simulations a day, generating a report that gave Robby Mook, the campaign manager, a detailed roadmap of which battleground states were most likely to tip the race in one direction or the other, allowing the campaign to decide where and when to send the candidate and her surrogates and where to air television ads.
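For readers curious what "400,000 simulations a day" means mechanically, the Python sketch below is a hypothetical Monte Carlo miniature: the polling margins and the three states are invented, and real models track far more inputs.

```python
# Hypothetical Monte Carlo miniature of an election model: draw random
# outcomes around each state's polled margin, then count how often
# candidate A carries a majority of these electoral votes (EVs).
# All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(2016)

states = {"PA": (20, 0.02), "MI": (16, 0.03), "WI": (10, 0.04)}
evs = np.array([ev for ev, _ in states.values()])
margins = np.array([m for _, m in states.values()])   # A's polled lead

def win_probability(n_sims: int = 400_000, poll_sd: float = 0.03) -> float:
    """Share of simulated elections in which A wins most of these EVs."""
    draws = rng.normal(margins, poll_sd, size=(n_sims, len(states)))
    ev_won = (draws > 0) @ evs            # EVs carried in each simulation
    return float((ev_won > evs.sum() / 2).mean())

print(f"Candidate A win probability: {win_probability():.1%}")
```

Note that this miniature draws each state's error independently. Correlated errors, where polls in demographically similar states all miss in the same direction, are exactly what a model built this way can fail to capture.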

Like much of the political establishment, however, Ada did not accurately predict the turnout of rural voters in Rust Belt states. Pennsylvania was correctly identified as a critical state early on, which explains why Clinton visited it often and closed her campaign in Philadelphia. Other states that Clinton would lose, like Michigan and Wisconsin, either were not identified as at-risk or were deemed so too late.

A number of election post-mortems indicate that Bill Clinton, a politician with proven fluency in reading and responding to voter emotion, advocated that his wife's campaign pay more attention to white working-class voters. Perhaps he reasoned that while that group as a whole was not within Clinton's reach, she might draw enough of its votes to win Michigan, Pennsylvania, and Wisconsin—states that Trump narrowly won instead.

Let’s look at data analytics as it pertains to a presidential election: An article in Wired by Cade Metz stated, “The lesson of Trump’s victory is not that data is dead. The lesson is that data is flawed. It has always been flawed—and always will be…. But this wasn’t so much a failure of the data as it was a failure of the people using the data. It’s a failure of the willingness to believe too blindly in data, not to see it for how flawed it really is.”

Summary: The use of data analytics by presidential campaigns did not begin in the 21st century. Clinton aides believed their work with data was the most sophisticated to date, and while this may be true, it did not translate to a strategic advantage over Trump when all other factors were accounted for. If Barack Obama’s 2012 presidential victory proved big data’s triumph for accurately predicting elections, Donald Trump’s 2016 presidential win could demonstrate the opposite.

According to Nik Rouda, senior analyst at the Enterprise Strategy Group, “Polls aren’t really big data. The sample sizes were certainly good enough for a poll, but maybe didn’t meet the definitions around volumes of data, variety of data, and historical depth contrasted against real-time immediacy, machine learning, and other advanced analytics. If anything, I’d argue that more application of big data techniques would have given a better forecast.”

Professor Samuel Wang, manager of the Princeton Election Consortium, which gave Clinton a 99 percent chance of winning as of the morning of Election Day, stated: "The incorrect forecasts don't appear to be a problem with the margin of error; the polling suffered from a systematic error. The entire group of polls was off, as a group. This was a really large error, around 4 points at presidential and Senate levels, up and down the ticket."

Wang went on to say he is still evaluating the results. Late candidate selection by undecided voters may have affected the predictions. Whether prediction models can better account for such last-minute decisions or changed minds remains unknown for now.
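A small numerical sketch (with invented numbers) makes Wang's distinction concrete: averaging many polls shrinks each poll's independent sampling noise, but a systematic error shared by every poll shifts them all the same way and survives the averaging.

```python
# Random sampling error averages out across polls; systematic error does
# not. All values here are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
true_margin = -0.01        # the candidate actually trails by 1 point
n_polls = 50

sampling_noise = rng.normal(0.0, 0.03, size=n_polls)  # independent per poll
systematic_error = 0.04                               # shared by every poll

polls = true_margin + sampling_noise + systematic_error
print(f"average of {n_polls} polls: {polls.mean():+.3f}")  # near +0.03
print(f"true margin:               {true_margin:+.3f}")    # wrong sign!
```

Fifty polls look far more certain than one, yet the shared 4-point error leaves the average on the wrong side of zero, which is essentially the "off as a group" failure Wang describes.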

As the world makes the Internet its primary means of communication, we will be confronted with even more data—so-called "Big Data." On the Internet, fact, opinion, idle chatter, and humor run together in a common sea where intent is difficult to ascertain and bots are becoming increasingly indistinguishable from humans. The old database maxim of "garbage in, garbage out" should guide all efforts to incorporate this data into prediction models.

And in the meantime, the even bigger promise is that artificial intelligence will produce more reliable predictions. But even the most sophisticated artificial decision engine remains dependent on imperfect data inputs. Neural networks can't forecast an election without data—data that is selected and labeled by humans. While such AI systems have become adept at object recognition because people have already uploaded millions of photos to places like Google and Facebook, we lack the same kind of clean, organized data on presidential elections with which to train neural nets.

Conclusion: Are your data analytics prediction models suffering from the same problems as the models that predicted Hillary Clinton would easily win the U.S. presidential election? Next month we will explore this further.

About The Author

Roger Gudobba
Roger Gudobba is passionate about the importance of quality data and its role in improving the mortgage process. He is vice president, mortgage markets at Compliance Systems and chief executive officer at PROGRESS in Lending Association. Roger has over 30 years of mortgage experience and has been an active participant in the Mortgage Industry Standards Maintenance Organization (MISMO) for 17 years. He was a Mortgage Banking Technology All-Star in 2005. He was the recipient of Mortgage Technology Magazine's Steve Fraser Visionary Award in 2004 and the Lasting Impact Award in 2008. Roger can be reached at rgudobba@compliancesystems.com.

Technology Innovations


At the end of the second quarter we got some good news. Black Knight reported that, at $518 billion, Q2 saw the highest volume of first-lien mortgage originations in a single quarter since Q2 2013. Purchase lending was particularly strong, making up 57% of all lending and seeing a 52% increase in volume from Q1. Purchase originations rose $102 billion from Q1 to a total of $297 billion, hitting their highest level in terms of both volume and dollar amount since 2007. Refinance originations rose by 8% from Q1 but fell slightly below last year's levels, despite lower rates and a larger population of refinance-able borrowers. In fact, refi lending has risen in each of the past three quarters, though primarily in the higher credit segments of the market. The industry was also able to put the burden of complying with TRID behind it, although new compliance burdens, including the GSEs' changing forms and the CFPB's changes to HMDA, still lie ahead. So, what does all of this mean? There is ample opportunity for success in mortgage lending if the industry adopts a culture that embraces both innovation and change. Roger Gudobba, Vice President, Mortgage Markets at Compliance Systems, talked to our editor about the impact of technology innovation in the mortgage industry. Here is what he said:

Q: Let’s start by talking about your first experience in computers.

ROGER GUDOBBA: In 1961 I was hired as a computer programmer by Dr. Beckett at the Lafayette Clinic, a research and training facility that was part of the Michigan Department of Mental Health. It was a little bit scary, since I had never seen a computer before. That led to 18 years of programming applications, in machine language on a Bendix G-15 paper-tape computer, to support the clinic's research. Over the years, there were far-reaching advances in computer hardware and software. I quickly realized that computers were just tools to enable you to do a job faster and easier, but that it was paramount to stay abreast of new technology.

Q: So, how did you first get involved in the mortgage industry?

ROGER GUDOBBA: I spent the next eight years developing software for a variety of small businesses, mostly on IBM computers in a variety of programming languages. For the last two years of that period I managed a mortgage loan origination system. I believe it was at the MBA Annual in Boston in 1986 that laser printers were the hot topic. The ability to bring in a form image and overlay the data, creating an electronic document, provided substantial cost and time savings over dot-matrix printers with pre-printed forms. At that time, VMP was the premier provider of mortgage forms, and they hired me in 1987 to help develop the laser form library. The library was huge, and I looked for ways to simplify it. Our compliance officer pointed out that some documents, like notes and security instruments, had only minor differences. I wasn't yet thinking about dynamic run-time forms; it was more about defining the creation and maintenance required for the source library. Like any new technology, there were some challenges, and adoption was slow. And, for the first time, I was frustrated by the industry's resistance to change and reluctance to embrace technology.

Q: Over the years, the phrase most people associate with you is "It's all about the data." How did that come about, and what led to your involvement with MISMO?

ROGER GUDOBBA: VMP started looking at other authoring technology. That's when I discovered XML, a subset of SGML. I attended an international conference on XML in Spain in 1997, where I had the opportunity to meet a number of leaders, including Charles Goldfarb, the father of SGML. I was convinced that XML was a way to exchange data. The key was that you had a start tag, the information, and an end tag. Compared to the two solutions in use at the time (fixed record layouts and comma-delimited files), XML was very flexible, less prone to errors, and, among other things, human readable. I met Gabe Minton at a meeting at the MBA in D.C., and Gabe and I were both in agreement about XML. In 1999, our respective companies, VMP and ULTAPRISE, formed a non-profit organization called XML Mortgage Partners to develop an XML data library for mortgage. Even though we had a nice cross-section of industry leaders, including lenders, LOS vendors, and consultants, it created some friction in the industry. Dave Matthews worked with the MBA to create MISMO in 1999, and Gabe was named the director. Both of us are still very active today and very proud of MISMO's progress in establishing standards for the mortgage industry.
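The appeal of "start tag, information, end tag" is easy to see in miniature. The Python sketch below contrasts a comma-delimited record with the same data expressed as XML; the tag names are illustrative, not actual MISMO elements.

```python
# The same loan data as a comma-delimited record and as XML. With the
# delimited record, meaning depends entirely on field position; the XML
# names each value, tolerates reordering, and is human readable.
# Tag names are illustrative, not actual MISMO elements.
import xml.etree.ElementTree as ET

csv_record = "285000,4.125,360"   # which field is which? you have to know.

loan = ET.Element("LOAN")
ET.SubElement(loan, "LoanAmount").text = "285000"
ET.SubElement(loan, "InterestRate").text = "4.125"
ET.SubElement(loan, "TermMonths").text = "360"
xml_record = ET.tostring(loan, encoding="unicode")

print(xml_record)
# <LOAN><LoanAmount>285000</LoanAmount><InterestRate>4.125</InterestRate>
# <TermMonths>360</TermMonths></LOAN>

# Reading a value back out is by name, not by position.
print(ET.fromstring(xml_record).findtext("InterestRate"))   # 4.125
```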

Q: What do you see as the major challenges facing the industry?

ROGER GUDOBBA: The U.S. financial industry is experiencing rapid evolution. While we are past the trial-by-fire days of the 2008 financial crisis, the consequences of that difficult time continue to affect the way we do business. Today, there is no doubt: increased regulatory oversight from federal and state agencies, as well as the uncertainty of regulatory changes yet to come, has created a hazardous business environment. The Federal Reserve has increased the imperative for financial institutions to implement an enterprise-wide risk management solution, one that effectively addresses operational risk, legal or compliance risk, reputation risk, and liquidity risk, among other risk types. These risks are ever present, and their management by financial institutions is closely monitored by the Consumer Financial Protection Bureau (CFPB) as well as other regulating agencies. The only way to address these problems is with technology that controls and validates your information flow.

Q: How does Compliance Systems' Transaction Risk Management Solution address this?

ROGER GUDOBBA: Financial institutions are becoming increasingly familiar with the demands of managing these various types of enterprise risk, and many are struggling to demonstrate that their specific policies, processes, and procedures are fairly and consistently applied across their customer communities. In the absence of a transaction risk management (TRM) solution, each institution is responsible for maintaining the integrity of the entire transaction risk system on its own. Institutions must ensure that their policy disclosures contain appropriate data based on state and federal regulatory requirements and applicable case law. They must verify that, in any given transaction, data is consistent across all of the documents required to memorialize it. Their staff must determine complex, state-specific entity types, such as limited liability companies and limited partnerships, and the correct organizational authorizations. Failure to do so exposes institutions to the risk of unenforceable transactions that impair liquidity, compromise their reputation, and result in legal and regulatory repercussions.

It is understood that certain data is required in transactions by federal and state regulation, and every institution must present that content at transaction time. However, because all institutions are unique and offer unique products, they need to have the ability to define and control the language used to represent those products.

The CSi Transaction Risk Management (TRM) Solution allows financial institutions to easily configure language that precisely defines their policy decisions and product definitions. The solution ensures that only compliant, validated language is applied at transaction time, mitigating their ongoing operational and compliance risk. Security features and audit trails help them control and track access to their data.
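As a hypothetical illustration of one such validation (not CSi's actual implementation), the Python sketch below checks that a single field agrees across every document that memorializes a transaction; the document and field names are invented.

```python
# Hypothetical transaction-risk check: verify that one field (here the
# loan amount) is consistent across every document in the transaction.
# Document and field names are invented for illustration.
def check_consistency(documents: dict, field: str) -> list:
    """Return names of documents whose value disagrees with the note."""
    reference = documents["note"][field]
    return [name for name, data in documents.items()
            if data.get(field) != reference]

transaction = {
    "note":                {"loan_amount": 285000},
    "security_instrument": {"loan_amount": 285000},
    "closing_disclosure":  {"loan_amount": 258000},   # transposed digits
}

mismatches = check_consistency(transaction, "loan_amount")
print("Inconsistent documents:", mismatches or "none")
```

Catching the transposed digits before closing is the kind of root-level check that, done manually across dozens of documents, is exactly where institutions take on risk.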

Q: What is the foundation for TRM?

ROGER GUDOBBA: Throughout its history, Dennis Adama, CEO and founder, has differentiated CSi from competitors with his willingness to conceive new solutions to the problems surrounding transactions. Starting in 1995, Compliance Logic Systems (CLS) was a standalone wizard application used by financial institutions to create and maintain their own ARM and TIL lending disclosures. That CLS was designed as an intelligent data collection and compliance validation tool proved prescient and reflected Dennis's focus on the data necessary to document financial transactions, rather than on the physical documents or forms that would ultimately present the data. This foresight to build a software foundation on the data established CSi's development model for the years ahead. Around the same time, the initial launch of Document Selection Logic (DSL) was underway. That technology component uses transaction data to determine the documents required to perfect financial transactions and relates entities and collateral to the relevant documents, helping institutions mitigate risk at the root level of the transaction. This was quickly followed by the release of dynamic documents: documents in the sense that they render in the familiar page format, but actually software applications that logically include or omit content based on transaction values.

All of these technological advances established the foundation for Transaction Risk Management. Transaction Risk Management (1) provides a warranted contract with a clearly enforceable promise to pay, (2) ensures the institution has an enforceable interest in any collateral on the transaction, (3) follows all governing-law language and regulations specific to the transaction, (4) correctly assembles and associates all relationships on the transaction with the appropriate documentation, (5) automatically configures any institution-specific language based on the institution's own selection criteria, and (6) clearly identifies any missing information.
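The "dynamic document" idea described above is easy to sketch: one source that includes or omits provisions based on transaction values, instead of a separate static template for every combination. The Python sketch below is a hypothetical illustration; the clause text and field names are invented.

```python
# Hypothetical "dynamic document": one source that logically includes or
# omits provisions based on transaction values, instead of one static
# template per combination. Clause text and field names are invented.
def render_note(txn: dict) -> str:
    paragraphs = ["PROMISSORY NOTE", f"Principal: ${txn['amount']:,}"]
    if txn.get("rate_type") == "ARM":
        paragraphs.append("Adjustable-rate provision: the rate may change "
                          "on each change date described below.")
    else:
        paragraphs.append(f"Fixed rate: {txn['rate']}% for the life of the loan.")
    if txn.get("prepayment_penalty"):
        paragraphs.append("A prepayment penalty provision applies.")
    return "\n\n".join(paragraphs)

print(render_note({"amount": 285000, "rate_type": "fixed", "rate": 4.125}))
```

With two optional provisions this saves little, but as options multiply, the count of static templates grows combinatorially while the dynamic source grows only linearly.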

I joined Compliance Systems in 2008 primarily because of their technology and their business model, which is based on offering strategic technology partnerships. We interact and exchange thoughts, ideas, and development plans with our partners in order to develop and support the best joint solutions possible.

The CSi Data Schema, launched in 2009, was the first technology component I saw built from the ground up at CSi and is another example of how the organization seeks to support partner integration efforts. Rather than simply working with MISMO data schema in the Mortgage market, CSi envisioned a complete data schema that supported Mortgage, Deposit, Consumer Lending, and Commercial Lending. Since then, CSi has only forged further ahead in developing technology solutions that reduce the risk exposure of financial institutions. We have Configurability, which allows lenders to modify, append, or replace text provisions within documents so that they can adapt to market, business, policy, and regulatory developments while maintaining warranted compliance with CSi. We have Simplicity, which expanded that configurable functionality and allows lenders to designate logos, bar codes, etc., to be applied to standard documents automatically.

When the TILA/RESPA Integrated Disclosures (TRID) Final Rule was announced in the fall of 2013, CSi immediately recognized that it was an excellent fit for dynamic document technology. Early estimates showed it could take potentially hundreds of static templates to produce all the variations, assuming that was even possible. We communicated throughout the process with our strategic partners so that they would understand what we would deliver, and we delivered it a full year ahead of the deadline. That allowed them to focus on the changes required on their end.

The Data Collection Logic (DCL) is one of our latest technology components. It provides CSi partners with information regarding the data that must be collected in their platform interface in order to fully document a transaction. That data collection is based on existing business and regulatory rules already integrated into CSi technology. CSi’s partners have traditionally struggled with the problem of data analysis in the development of their own systems, as their own data collection and application user interfaces also depend on an ongoing knowledge and analysis of regulatory compliance and business rules. The DCL can decrease the time and costs required for a business partner to get a quality platform solution on the market and to maintain that solution in the face of ongoing regulatory changes.
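The concept behind DCL can likewise be sketched as rules that map transaction characteristics to the fields an interface must collect. The Python sketch below is a hypothetical illustration; the rules and field names are invented, not CSi's actual logic.

```python
# Hypothetical data-collection logic: rules map characteristics of a
# transaction to the fields an interface must gather. Rules and field
# names are invented for illustration.
RULES = [
    # (does this rule apply to the transaction?, fields it then requires)
    (lambda t: True,                        {"borrower_name", "loan_amount"}),
    (lambda t: t.get("rate_type") == "ARM", {"index", "margin", "rate_caps"}),
    (lambda t: t.get("state") == "TX",      {"homestead_designation"}),
]

def required_fields(txn: dict) -> set:
    """Union of the fields demanded by every rule that applies."""
    fields = set()
    for applies, needed in RULES:
        if applies(txn):
            fields |= needed
    return fields

txn = {"rate_type": "ARM", "state": "TX"}
print(sorted(required_fields(txn) - txn.keys()))   # fields still to collect
```

A partner's interface can ask for exactly this list, so keeping up with a regulatory change means updating the rules in one place rather than re-analyzing every screen.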

Q: When did you become associated with Progress in Lending?

ROGER GUDOBBA: I probably met Tony Garritano sometime in the 1990s, when he was at Source Media. I served on the Advisory Board for Mortgage Technology magazine when Tony was the editor. I spoke with Tony on a weekly basis, and we lamented the fact that the industry seemed slow to change. He wanted to give everyone in the industry a chance to express their concerns. With encouragement from a number of us, he launched Progress in Lending in 2010. It is a central place for industry participants to discuss how technology can improve the process, and it provides a place for thoughts and ideas to flow freely. It's easier to move things forward when you're in a group. One of the monthly publications is Tomorrow's Mortgage Executive, to which I personally have contributed over 70 articles. There are now over 15,000 followers of the daily posts. A high point of the MBA Technology Conference is the PIL Innovations Awards, which have drawn over 150 applications over the last six years. Progress in Lending has been a resounding success, and I am proud to be on its Executive Committee.

Q: Lastly, what is the status of technology innovation in the mortgage industry in your opinion?

ROGER GUDOBBA: The use of technology has had a significant impact on many industries, and the mortgage industry is no exception in adopting it. But have we exploited technology to its fullest potential? The answer is no! I believe the industry has woefully underutilized technology. When you look at the loan process, not much has changed over the last 25 years; it is still a very document-oriented process. Certainly we have introduced technology solutions at certain points in the process, but we still have not seen a dramatic change in the process itself. The main objective of TRID was Know Before You Owe, so that there were no surprises at closing for the borrower (and sometimes the lender). It ensured that the Closing Disclosure (CD) was within tolerance levels of the Loan Estimate (LE), and it shifted that responsibility from the settlement agent to the lender. It provided a great opportunity to change the way we close loans. Instead, it is another opportunity missed: we are just getting around to standardized fee names. I am constantly both intrigued and mystified by the slow rate of technology adoption in our industry, but I'm also optimistic about the future.

INSIDER PROFILE

Roger Gudobba is passionate about the importance of quality data and its role in improving the mortgage process. He is vice president, mortgage markets at Compliance Systems and chief executive officer at PROGRESS in Lending Association. Roger has over 30 years of mortgage experience and has been an active participant in the Mortgage Industry Standards Maintenance Organization (MISMO) for 17 years. He was a Mortgage Banking Technology All-Star in 2005. He was the recipient of Mortgage Technology Magazine's Steve Fraser Visionary Award in 2004 and the Lasting Impact Award in 2008. Roger can be reached at rgudobba@compliancesystems.com.

INDUSTRY PREDICTIONS

Roger Gudobba thinks:

1.) Digital mortgages will mandate innovative changes to the loan process and move the industry to focus on data and not documents.

2.) Lenders will need to have an integrated central repository for all their data and have complete control over all their interactions.

3.) Consumers will demand multiple access points to communicate with lenders and it will be different for diverse consumer demographics.
