At some point a platform is legally responsible for what is posted if they don't take it down. That's the reason (and the only reason) the new fringe platforms that promised a new freedom of expression, free from tyranny, in practice delivered the same old moderated thing. They don't have another option, because if they don't moderate, potential liability falls on them. That matters because moderation is not a choice, and the lack of it can't be excused away by claiming the platform is too big.
A platform is only legally responsible for what the law says they're legally responsible for. That's (at least under existing law) dependent on what the content is. Certain content is illegal (again, not a lawyer, but that category is mostly stuff involving criminal activity and not most of what would be called misinformation or hate speech) or otherwise subject to an alternative liability regime (copyrighted material, and let's not go down that hellhole today). If it's not outright illegal content (like CSAM or, more recently, content related to human trafficking) or copyrighted, then, in general, platforms do not
have liability for the stuff their users post. Misinformation and hate speech, for instance, are in the vast majority of cases (if not virtually all) entirely outside the realm of things platforms can be held legally liable for. Every platform moderates less out of fear of liability than out of fear of being overrun by unwanted crap. Every alternative platform that's sprung up, as you mention, with claims of freedom from tyranny has found itself needing to moderate heavily to deal with the spammers, grifters, and trolls who gleefully turn under-moderated sites into useless cesspools. (They also moderate to deal with the aforementioned criminal and/or copyrighted material they could actually be held liable for.)