AI, Inventing Its Own Culture, Passes It to Humans

If you haven’t read Mo Gawdat’s “Scary Smart,” you are missing out on a wonderful read about the utopia or dystopia that AI could bring us if we don’t pay attention to our human input now, today:


“Digital technology already influences the processes of social transmission among people by providing new and faster means of communication and imitation,” the researchers write in the study. “Going one step further, we argue that rather than a mere means of cultural transmission (such as books or the Internet), algorithmic agents and AI may also play an active role in shaping cultural evolution processes online where humans and algorithms routinely interact.”

The crux of this research rests on a relatively simple question: If social learning, or the ability of humans to learn from one another, forms the basis of how humans transmit culture or solve problems collectively, what would social learning look like between humans and algorithms? Considering scientists don’t always know and often can’t reproduce how their own algorithms work or improve, the idea that machine learning could influence human learning—and culture itself—throughout generations is a frightening one.

“There’s a concept called cumulative cultural evolution, where we say that each generation is always pulling up on the next generation, all throughout human history,” Levin Brinkmann, one of the researchers who worked on the study, told Motherboard. “Obviously, AI is pulling up on human history—they’re trained on human data. But we also found it interesting to think about the other way around: that maybe in the future our human culture would be built up on solutions which have been found originally by an algorithm.”