UPST

Thanks to Saul, the other seniors, and the new members on the board. It's a great place to discuss. Like you, I'm thoroughly enjoying it.

I like UPST. Great product. Its AI seems to work. But the one issue holding me back from investing in it more aggressively is the (at least theoretical) risk of its AI being regulated, which would make its product less effective/desirable for banks and its other clients.

Of course none of us have crystal balls to look into the future - but what do you think is the likelihood that UPST’s AI could be targeted with claims of being discriminatory against certain subsets of the population?

Because FICO doesn’t seem to take into consideration factors other than things related to financial history (past debt, current debts, repayment history, etc.) - link below. Clean and simple (though less effective).

https://www.myfico.com/credit-education/whats-in-your-credit…
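
For reference, the category weights that FICO publishes on that page boil down to a simple weighted composite of financial-history buckets. Here is a toy sketch of that idea; the actual FICO formula is proprietary, and the per-category scores below are made up for illustration only.

```python
# Toy illustration of the published FICO category weights from the link above.
# The real FICO formula is proprietary; the per-category scores here are invented.
fico_weights = {
    "payment_history":          0.35,
    "amounts_owed":             0.30,
    "length_of_credit_history": 0.15,
    "credit_mix":               0.10,
    "new_credit":               0.10,
}

# Hypothetical per-category scores on a 0-100 scale for one borrower.
category_scores = {
    "payment_history":          90,
    "amounts_owed":             70,
    "length_of_credit_history": 60,
    "credit_mix":               80,
    "new_credit":               75,
}

composite = sum(fico_weights[k] * category_scores[k] for k in fico_weights)
print(f"Weighted composite (0-100 scale): {composite:.1f}")

# Note: every input above is a traditional credit-file variable; nothing like
# education, employment, or the other alternative data points UPST reportedly uses.
```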

Whereas UPST’s AI looks at 1,600 factors in assessing risk. I’m not sure what all of those factors are, or whether any of them could be targeted as unfair/discriminatory. Looking at GPA, place of work, profession, and the other ~1,500 data points might be great for assessing risk. But the question is how many of those 1,600 data points are acceptable to regulators. As the company gets bigger, one can be sure that competitors and regulators will take a closer look at those data points. And if regulators want certain data points removed, will UPST’s AI be as effective as it is now?

Thanks, John - Long UPST

3 Likes

Someone posted a note mentioning a “no action letter” from the Consumer Financial Protection Bureau (a government bureaucracy).

https://discussion.fool.com/upst-no-action-letter-34870065.aspx

It is something like a pre-approval of their algorithms: the CFPB saying it doesn’t have a problem with them. This kind of letter is issued to reduce regulatory uncertainty over exactly this sort of question.

The letter is good through 2023.

14 Likes

KuberB: what do you think is the likelihood that UPST’s AI could be targeted with claims of being discriminatory against certain subsets of the population?

I suspect almost all lenders have to deal with such claims eventually (just as nearly all drivers eventually have to deal with car accidents). I suspect they backtest their models for disparate impact against protected classes and also keep actual results disproving any discriminatory effect. (At least, that’s what I’d do.) Neither is a big deal; both should simply be factored in as part of the cost of doing retail lending.
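
For anyone curious what a disparate-impact backtest can look like in practice, here is a minimal sketch of one common check, the “four-fifths rule” adverse-impact ratio. The column names, toy data, and 0.8 threshold are my own illustrative assumptions, not Upstart’s actual fairness methodology, which isn’t public.

```python
# Minimal sketch: "four-fifths rule" adverse-impact check on approval decisions.
# Column names, toy data, and the 0.8 threshold are illustrative assumptions.
import pandas as pd

def adverse_impact_ratio(df: pd.DataFrame, group_col: str, approved_col: str,
                         reference_group: str) -> pd.Series:
    """Approval rate of each group divided by the reference group's approval rate."""
    rates = df.groupby(group_col)[approved_col].mean()
    return rates / rates[reference_group]

# Toy data: a real backtest would use historical applications and outcomes.
apps = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   0,   1,   1,   0,   0,   1],
})

ratios = adverse_impact_ratio(apps, "group", "approved", reference_group="A")
flagged = ratios[ratios < 0.8]   # ratios below 0.8 are conventionally flagged for review
print(ratios)
print(flagged)
```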

KuberB: But the question is how many of those 1,600 data points are acceptable to regulators. As the company gets bigger, one can be sure that competitors and regulators will take a closer look at those data points. And if regulators want certain data points removed, will UPST’s AI be as effective as it is now?

If removing an unacceptable attribute doesn’t affect the data-mining results, there’s no real impact. Otherwise, some competitive advantage might be lost, depending on whether an equivalent surrogate can be found/derived.
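
To make that concrete, here is a rough sketch of the experiment I have in mind: retrain with the questionable attribute dropped and compare predictive power on held-out data. The synthetic dataset, feature names, and model choice are all illustrative assumptions on my part, not Upstart’s actual variables or models.

```python
# Sketch: does dropping a disallowed attribute hurt predictive power?
# Everything here (features, coefficients, model) is synthetic and illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
income = rng.normal(60, 20, n)           # traditional variable
utilization = rng.uniform(0, 1, n)       # traditional variable
gpa = rng.normal(3.0, 0.5, n)            # hypothetical attribute a regulator might question

# Synthetic "true" default probability: higher income and GPA lower the risk.
risk = -0.03 * income + 0.8 * (4.0 - gpa) + 2.0 * utilization - 1.0
default_prob = 1.0 / (1.0 + np.exp(-risk))
y = rng.binomial(1, default_prob)

X_full = np.column_stack([income, utilization, gpa])
X_reduced = np.column_stack([income, utilization])   # same data, attribute removed

for name, X in [("with attribute", X_full), ("without attribute", X_reduced)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = GradientBoostingClassifier().fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")

# In this toy setup the remaining features carry no surrogate signal for GPA,
# so AUC drops when it is removed; with correlated features it might not.
```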

KuberB: what do you think is the likelihood that UPST’s AI could be targeted with claims of being discriminatory against certain subsets of the population?

KuberB: But the question is how many of those 1,600 data points are acceptable to regulators. As the company gets bigger, one can be sure that competitors and regulators will take a closer look at those data points. And if regulators want certain data points removed, will UPST’s AI be as effective as it is now?

Upstart was already unfairly targeted last year, and management proactively tweaked the AI model as a result. Please see what I wrote about this previously:

https://discussion.fool.com/tominvest83-wow-that-is-an-incredibl…

I would like to add: even an institutional giant like Goldman Sachs has trouble with its foray into credit underwriting, let alone being top dog in fair, non-discriminatory lending (https://discussion.fool.com/unfair-ai-underwriting-for-net39s-co…).

I don’t see that they have a CFPB No Action Letter like Upstart does, and I suspect they are afraid to pivot entirely away from FICO scores or to incorporate too many alternative variables into their AI/ML models due to the risk of a regulatory audit. As a result, their underwriting models are, quite frankly, terrible:

“The Apple Card debuted in 2019 to great fanfare. But two years after launch, Apple hasn’t managed to deploy the card outside of the U.S. … Some users have complained about Goldman’s sometimes lengthy approval process, low credit limits and human-designed, but computer-based, algorithms superseding the ability for customer service agents to resolve issues.

Some of the primary issues some prospective and current users have faced are likely due to Apple and Goldman’s focus on algorithms for application and credit limit decisions. That is exacerbated by the limited ability for Goldman support agents to intervene in and modify decisions.

For instance, some Apple Card users have seen their applications rejected because of incorrect or outdated information on credit reports, something as simple as an actual (or misreported) missed payment a couple of years ago, or for having too many, but a not specified number of, credit cards, loans or mortgages.

When reviewing applications, Goldman uses the FICO Score 9 system, a lender standard that takes into account more factors than the normal credit check. That means Goldman is making decisions based on credit scores that are often lower than what is typically reported by apps like Credit Karma, WalletHub or TransUnion. It’s also often a different score than the one used by banks for giving loans. This has caused confusion for people who have seen their applications denied.

Some users with credit scores in the high 700s or 800s have also reported rejections despite approvals from the likes of American Express and Chase. Those are sometimes due to high, but explainable, debt obligations — despite no history of missed payments.

Credit limits are likewise impacted. Some users have reported Apple Card credit limits as low as $250, which isn’t enough to buy the AirPods Pro with tax, let alone an iPhone.

But sometimes the low credit limits are actually the result of bad information from credit reports and decisions made by algorithms that can’t be easily changed by humans. The low credit limits also mean that users may quickly fill up their credit utilization ratio, potentially hurting their credit scores if they don’t immediately pay down balances.

Some users have reported unexpected credit limit increases in recent months (Apple also recently launched Apple Card Family, allowing family members to combine credit lines), but some users are still reporting an inability to increase their spending limit — even after showing an ability to pay off their balance in prior months.”

https://www.bloomberg.com/news/newsletters/2021-07-18/apple-…

14 Likes

"I would like to add: even the big institutional giant Goldman Sachs has trouble with its foray into credit underwriting - Let alone being able to be top dog in fair, non-discriminatory lending (https://discussion.fool.com/unfair-ai-underwriting-for-net39s-co…).

I don’t see that they have a CFPB No Action Letter like Upstart does, and I suspect they are afraid to pivot entirely away from FICO scores or to incorporate too many alternative variables into their AI/ML models due to the risk of a regulatory audit. As a result, their underwriting models are, quite frankly, terrible:"

FICO has been around for decades, and as we know, it is becoming less and less useful in assessing creditworthiness. Increased use of alternative assessment models such as UPST’s AI is inevitable, and I don’t think that trend is going away.

But just as Google and the social media platforms are regularly questioned by various governments about privacy and the types of data they collect, I think regulators will also ask companies like UPST to look into the variables they use in their AI/ML models. They will then have to address those concerns and make sure that any changes, deletions, or additions of variables don’t make their models less effective. That is going to be an ongoing back-and-forth process for many, many years. But I still think companies like UPST are in a better position to maintain that delicate balance.

John (long UPST)

2 Likes