New Financial Planning Worry

… if you’re getting financial advice based on your zip code, from an AI trained on a historical database that reflects and reinforces redlining, discrimination, etc., is it really the best advice?

And a human advisor may not be any better, since their “training” reflects the same history.

Are Financial Industry Robo-Advisors Racist?…

“To use machine learning and AI, they’re feeding these algorithms with historical data,” said Berman. “The reasoning is that historical data on people could reflect discrimination that has occurred in the past, so the algorithm could become biased even if it was designed to be completely neutral.”
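To make that mechanism concrete, here is a toy sketch with entirely made-up numbers. The “model” never sees race at all, only zip code, yet because the historical approval rates already encode past discrimination by neighborhood, two identical applicants get different scores:

```python
from collections import defaultdict

# Hypothetical historical records: (zip_code, loan_approved).
# The numbers are invented purely for illustration.
history = [
    ("10001", 1), ("10001", 1), ("10001", 1), ("10001", 0),  # historically favored area
    ("60644", 0), ("60644", 0), ("60644", 1), ("60644", 0),  # historically redlined area
]

# "Training": learn the approval rate per zip code -- a stand-in for any
# model that uses zip code (or a correlate of it) as a feature.
totals, approvals = defaultdict(int), defaultdict(int)
for zip_code, approved in history:
    totals[zip_code] += 1
    approvals[zip_code] += approved

def score(zip_code):
    """Predicted approval likelihood for a new applicant from this zip code."""
    return approvals[zip_code] / totals[zip_code]

# Identical applicants, different scores, purely by zip code:
print(score("10001"))  # 0.75
print(score("60644"))  # 0.25
```

The algorithm is “neutral” in the sense that it applies the same rule to everyone, but the disparity in the historical data passes straight through to its outputs.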

Berman cited an SEC Investor Advisory Committee meeting on “Ethical AI and RoboAdvisor Fiduciary Responsibilities” earlier this month that veered into the possibility that biased historical data can influence AI and machine-learning platforms.
