Why mortgage lenders should be extremely cautious with AI


While there is clearly tremendous excitement and optimism surrounding the use of AI in the mortgage industry, there are also significant concerns, particularly regarding customer privacy and lender exposure to regulatory, reputational and business risks.

This fear is not unfounded. A team of researchers at Carnegie Mellon University who studied actual AI privacy incidents across industries found that AI both exacerbated existing privacy risks and created entirely new ones, spanning 12 distinct categories.

The highest AI-related privacy risks for mortgage companies fall into two categories:

  1. Unintentional disclosure of consumer data. This can occur either directly, as the result of a breach or of the mortgage company's data being inadvertently included in widely used AI training sets, or indirectly, as a result of AI's ability to infer additional information about an individual from limited data points.
  2. Secondary use, in particular if customer data is used to train an AI application that is then used for purposes the consumer never consented to, such as affiliate marketing or the training of other financial products. For example, two years ago, an artist in California found private medical photos of herself in a set of images used to train many of the largest AI image generators. A resulting investigation identified thousands of other patients' medical photos in the same data set. Imagine that instead of photos, an individual's default history, credit report or an assessment of their likelihood of default found its way into a data set used to train a proprietary AI.

Regulatory Risk

From a regulatory standpoint, mortgage companies must comply with two primary federal privacy laws. The first is the Gramm-Leach-Bliley Act, which requires companies to have policies in place to keep their customers' nonpublic personal information (NPI) secure and to prevent unauthorized disclosure of NPI.

The second is the Fair Credit Reporting Act, which governs access to consumers' credit report information and imposes penalties to prevent unauthorized distribution of credit information.

There is also a third federal law that mortgage servicers, in particular, should take into consideration: the Fair Debt Collection Practices Act (FDCPA). While this law is not traditionally seen as a privacy regulation, many of its provisions govern when and under what circumstances a debt collector can disclose information about the debtor's default to third parties. How and to what extent the privacy protections of the FDCPA would apply to the use of debtor data in a training set is an open legal question.

In addition to these federal laws, state regulations can place additional requirements on mortgage companies to protect their borrowers' personal information. The California Consumer Privacy Act is the best known of these state regulations, but Colorado, Connecticut, Oregon, Texas, Utah and Virginia also have comprehensive data privacy laws, and 10 additional states have such laws that will take effect within the next two years.

Reputational Risk

Obviously, a major data breach can create reputational risk for servicers, but there are other, AI-specific reputational risks that are worth considering and planning for as well.

AI has an unprecedented ability to create realistic-seeming fake content. That ability can help hackers and other bad actors gain access to lenders' systems, and it makes it easier for those who have gained access to use that data to scam the company's customers.

AI can also infer the answers to sensitive questions based on limited data. It is possible to imagine how a well-meaning attempt to anticipate which borrowers may need loss mitigation assistance, for example, could result in invasive and inaccurate assumptions, such as flagging current, performing customers in your system as "at risk of default." These assumptions are reminiscent of the concept of "pre-crime" in the popular novella and movie Minority Report, in which authorities use psychic technology to accuse people of crimes before any crime is committed. Minority Report is a drastic science fiction parallel, but the comparison drives home the concerns raised by well-meaning anticipatory assumptions. Within the mortgage industry, acting on AI inferences about sensitive questions could easily result in fair servicing violations if, for example, those inferences prompted repeated outreach attempts to borrowers based on protected characteristics.

Business Risk

Many AI vendors today offer AI services that use one of the major AI tools (e.g., ChatGPT, Google Gemini) as the underlying system. Companies will want to ensure that their proprietary business data isn't used to train the model, unless they are comfortable with that data becoming available to their competitors.
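
One practical safeguard, regardless of vendor, is to scrub NPI from any text before it leaves your systems. The Python sketch below is a minimal illustration of that idea; the regex patterns and the guard_prompt() helper are hypothetical examples, not a production-grade redaction tool, and a real deployment would pair them with contractual no-training guarantees from the vendor.

```python
import re

# Illustrative NPI patterns only; real redaction needs far broader coverage.
NPI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "ACCOUNT": re.compile(r"\b\d{8,12}\b"),  # naive: long runs of digits
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact_npi(text: str) -> str:
    """Replace anything matching a known NPI pattern with a labeled token."""
    for label, pattern in NPI_PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text

def guard_prompt(prompt: str, vendor_excludes_training: bool) -> str:
    """Refuse to release text to a vendor that may train on our data."""
    if not vendor_excludes_training:
        raise PermissionError(
            "Vendor contract does not exclude our data from model training."
        )
    return redact_npi(prompt)

if __name__ == "__main__":
    raw = "Borrower John Doe, SSN 123-45-6789, is 60 days past due on loan 00012345678."
    print(guard_prompt(raw, vendor_excludes_training=True))
```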

Before employing AI, lenders and servicers should, at the very minimum, have clear and definitive answers to the following questions:

For your organization:

  • Does my organization have an internal AI governance framework in place to prevent unauthorized use of the AI model? 
  • What are our obligations with respect to the use of an AI model trained on our customers' data? 
  • Is privacy a priority in our organization's software development process?

For your AI vendor (one way to record these answers as structured data is sketched after the list):

  • Is the AI that we are considering a proprietary AI system or a third-party, bolt-on model?
  • What data was used to train the AI?
  • Will our company's data be used to train the AI model?
  • What are the vendor's information security policies? 
  • What options does the vendor offer to keep our customers' data segregated from the greater model?
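
For teams that want to make those answers auditable, the sketch below records them as structured data tied to a simple deployment gate. The AIVendorAssessment class and its field names are hypothetical, offered only to show how the checklist could be documented in code; they do not reflect any regulatory standard.

```python
from dataclasses import dataclass, field

@dataclass
class AIVendorAssessment:
    """One record per vendor review; field names mirror the checklist above."""
    vendor: str
    proprietary_model: bool            # proprietary vs. third-party bolt-on
    training_data_disclosed: bool      # did the vendor document its training data?
    uses_our_data_for_training: bool   # will our data be used to train the model?
    infosec_policy_reviewed: bool
    customer_data_segregated: bool
    notes: list[str] = field(default_factory=list)

    def approved(self) -> bool:
        """Illustrative gate: block deployment on the highest-risk answers."""
        return (
            self.training_data_disclosed
            and not self.uses_our_data_for_training
            and self.infosec_policy_reviewed
            and self.customer_data_segregated
        )

if __name__ == "__main__":
    review = AIVendorAssessment(
        vendor="ExampleAI",  # hypothetical vendor name
        proprietary_model=False,
        training_data_disclosed=True,
        uses_our_data_for_training=False,
        infosec_policy_reviewed=True,
        customer_data_segregated=True,
    )
    print("Cleared for pilot:", review.approved())
```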

Whether or not AI creates the new opportunities for the mortgage industry that some observers expect, it will certainly create new responsibilities for mortgage industry users, and it may well generate additional oversight and pushback from regulators. As an industry, we should be ready for both.

