Out of the mix of commercial greed, politics, and genuine desires to make the world better, we'll try many ways to "fix" social media. But I think it may take a couple of generations, affected by what we do, for us to begin to agree about what's right and wrong.
Capturing and sharing what you already know is good; and with today's data and text analytics tools, it has become much easier than when we first began this journey.
Yet maybe the most glaring example of underestimating humans we encounter in our work is in the world of AI. It's partly the term "intelligence" in AI that misleads so many, as AI is not intelligent in the same way that humans are intelligent. Though powerful, AI ultimately matches patterns it has learned, and even the smartest of AI systems is limited in how many patterns it can match and make sense of.
As Gary Marcus says, a large language model is just a "spreadsheet for words": a massive autocompletion system that knows how words go together but has not the foggiest idea how those words connect to the world.
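The "spreadsheet for words" idea can be made concrete with a toy sketch: a bigram model that predicts the next word purely from a table of co-occurrence counts. This is an illustrative simplification (real language models are vastly larger and use learned representations, not raw counts), and the corpus and function names here are invented for the example; the point is that the prediction comes entirely from word-to-word statistics, with no grounding in what the words mean.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus, purely for illustration.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Build the "spreadsheet": for each word, count which words follow it.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def autocomplete(word):
    """Return the most frequent continuation of `word` in the corpus."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(autocomplete("the"))  # "cat" — it follows "the" most often
print(autocomplete("sat"))  # "on" — the only word ever seen after "sat"
```

The model happily completes sequences it has seen, yet it has no notion of cats, mats, or fish; scale the table up by many orders of magnitude and the same basic limitation, the pattern-matching described above, remains.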