WSJ article on FICO slipping

Today’s Wall Street Journal has an article entitled “FICO Score’s Hold on the Credit Market Is Slipping”. Here are a couple of paragraphs:

"Big lenders are moving away from FICO, according to people familiar with the matter. Capital One Financial Corp. and Synchrony Financial don’t use its scores for most consumer-lending decisions. They are becoming a smaller factor in some underwriting decisions at JPMorgan Chase & Co. and Bank of America Corp.

A key financial regulator, meanwhile, is encouraging banks to de-emphasize credit scores in an effort to expand access to affordable credit. And housing-finance giants Fannie Mae and Freddie Mac are considering allowing lenders to use other scores when evaluating mortgage applicants."



I just saw a PBS program on Independent Lens. The subject was AI, but it examined the numerous algorithm-based scoring systems used for all sorts of purposes.

They noted that scores from self-guided machine learning often end up blending in discriminatory traits, even when no one deliberately put them there.

Redlining is out, but once I know your home address I can easily make judgments about social class. Add in income, and the picture gets even sharper.

We have come a long way from the days when you did business with people you knew, or people who were introduced by someone you knew and trusted. I know people who like businesses that call them by name when they come into the store.

Today, with global internet business, that is much more difficult. Credit scoring is an attempt to deal with a wider audience of customers. But it definitely is not perfect.

And you might suspect that discriminatory elements are likely to be challenged. But then credit scoring is, by design, discrimination in deciding who deserves credit.

Looks like a Catch-22 to me.


Below is a link to how one company, Upstart, is proactively dealing with AI and discrimination. Here are a few highlights:

‘It’s very reasonable to be concerned about AI because it’s a very sophisticated system and it does have the potential to introduce bias or unfairness. Our answer to that concern is very rigorous testing.’
— Dave Girouard, CEO of Upstart

Upstart agreed last year to work with the SBPC and NAACP Legal Defense and Educational Fund Inc. on a review of its fair lending practices for possible improvements. The company also works with the Consumer Financial Protection Bureau (CFPB) in an effort to “build the most inclusive program possible,” according to a statement from Girouard emailed to MarketWatch.

“Upstart runs fairness tests on every applicant and every loan that goes through our platform,” he said in the statement. “Because these models are new, we share the test results with the government and consumer groups on a regular basis.”
As for industry monitoring, Girouard told the congressional task force that he believes a supervisory system is needed to ensure companies aren’t introducing bias into their AI models.…



To expand on your thought…

If the data coming in is already discriminatory, even if your model is not, you wind up with discrimination in your results, right?
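Exactly. Here is a minimal sketch of that idea, using entirely hypothetical data: a model that never sees a protected attribute can still reproduce historical bias through a proxy feature like zip code, because the proxy is correlated with group membership in the biased training data.

```python
# Toy illustration (hypothetical data): biased history in, biased decisions out,
# even though the "model" never looks at group membership directly.
import random

random.seed(0)

# Hypothetical historical records: (zip_code, group, approved).
# Past lending was biased: group B applicants were approved less often,
# regardless of actual creditworthiness.
history = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    zip_code = group  # zip code acts as a near-perfect proxy for group
    approved = random.random() < (0.8 if group == "A" else 0.4)
    history.append((zip_code, group, approved))

def approval_rate(zip_code):
    """Historical approval rate for a zip code."""
    outcomes = [a for z, g, a in history if z == zip_code]
    return sum(outcomes) / len(outcomes)

def model(zip_code, threshold=0.5):
    # The model only consults zip code -- never group membership --
    # yet it inherits the bias baked into the historical labels.
    return approval_rate(zip_code) >= threshold

print(model("A"))  # True: group A's neighborhood gets approved
print(model("B"))  # False: group B's neighborhood gets denied
```

The names and numbers here are made up for illustration; the point is only that "don't feed the model the protected attribute" is not enough when the training outcomes themselves carry the bias.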


Sure, credit scoring probably favors those with more assets. But the poor are disproportionately minorities. Hence, a tendency toward discriminatory outcomes.

It’s not surprising that facial recognition has more trouble with women. Most of the training photos used were of white men. (Of course, you would think plenty of minority photos would be available in the files of prisons or law enforcement.) Women are especially difficult due to changes in hair styles, wigs, makeup, hats, etc. More variations to work around.

Reported faults in facial recognition software have caused innocent people to be arrested and hassled. They need to do better.