KuberB: What do you think is the likelihood that UPST’s AI could be targeted with claims of being discriminatory against certain subsets of the population?
KuberB: But the question is how many of those 1,600 data points are acceptable to regulators. As the company gets bigger, one can be sure that competitors and regulators will take a closer look at those data points. And if regulators want certain data points removed, will UPST’s AI be as effective as it is now?
Upstart was already unfairly targeted last year, and management proactively tweaked the AI model as a result. Please see what I wrote about this previously:
I would like to add: even an institutional giant like Goldman Sachs has trouble with its foray into credit underwriting, let alone being top dog in fair, non-discriminatory lending (https://discussion.fool.com/unfair-ai-underwriting-for-net39s-co…).
I don’t see that they have a CFPB No-Action Letter like Upstart, and I suspect they are afraid to pivot entirely away from FICO scores or to incorporate too many alternative variables in their AI/ML models, due to the risk of a regulatory audit. As a result, their underwriting models are, quite frankly, terrible:
“The Apple Card debuted in 2019 to great fanfare. But two years after launch, Apple hasn’t managed to deploy the card outside of the U.S. … Some users have complained about Goldman’s sometimes lengthy approval process, low credit limits and human-designed, but computer-based, algorithms superseding the ability for customer service agents to resolve issues.
Some of the primary issues some prospective and current users have faced are likely due to Apple and Goldman’s focus on algorithms for application and credit limit decisions. That is exacerbated by the limited ability for Goldman support agents to intervene in and modify decisions.
For instance, some Apple Card users have seen their applications rejected because of incorrect or outdated information on credit reports, something as simple as an actual (or misreported) missed payment a couple of years ago, or for having too many, but a not specified number of, credit cards, loans or mortgages.
When reviewing applications, Goldman uses the FICO Score 9 system, a lender standard that takes into account more factors than the normal credit check. That means Goldman is making decisions based on credit scores that are often lower than what is typically reported by apps like Credit Karma, WalletHub or TransUnion. It’s also often a different score than the one used by banks for giving loans. This has caused confusion for people who have seen their applications denied.
Some users with credit scores in the high 700s or 800s have also reported rejections despite approvals from the likes of American Express and Chase. Those are sometimes due to high, but explainable, debt obligations — despite no history of missed payments.
Credit limits are likewise impacted. Some users have reported Apple Card credit limits as low as $250, which isn’t enough to buy the AirPods Pro with tax, let alone an iPhone.
But sometimes the low credit limits are actually the result of bad information from credit reports and decisions made by algorithms that can’t be easily changed by humans. The low credit limits also mean that users may quickly fill up their credit utilization ratio, potentially hurting their credit scores if they don’t immediately pay down balances.
Some users have reported unexpected credit limit increases in recent months (Apple also recently launched Apple Card Family, allowing family members to combine credit lines), but some users are still reporting an inability to increase their spending limit — even after showing an ability to pay off their balance in prior months.”