Statistically, half of what anyone tells you is bullshit. As a species, we develop socially from children who obey authority figures unconditionally into (hopefully) independent, critical thinkers, and for the most part we make that transition only inconsistently, sometimes not at all. There is a slew of cognitive biases, rooted in how we reason, that seem sensible when examined consciously but are significant precisely because they operate on a purely subconscious level. Nor would anyone argue that a significant portion of what we're told isn't verifiably dishonest, even after we separate outright dishonesty from information that merely conflicts with our own biases.
On examination, this makes a fifty-fifty bullshit-to-truth ratio seem charitable.
The short answer is two-part: vet your information by cross-referencing it across multiple sources, and perform bias elimination regularly as due diligence. While this is no guarantee of accuracy (whether something is true or false has no relationship to how many people believe it, or how strongly they feel about it), it does at least allow us to separate our own reasoning process from one that predisposes us to uncritically consuming or circulating false information.
None of which speaks to the fact that people are inconsistently rational at best, and that the determining factor in their beliefs is how they feel about a subject rather than what they think about it. Changing what we believe is frequently downright unpleasant. Being wrong is often viewed as shameful; we're wired to change our beliefs only reluctantly, which keeps us from accepting any old thing anyone cares to tell us; and what we believe is closely tied to what makes us feel good about ourselves, since being "right" is a virtue to be admired.
In attempting to develop from an authoritarian paradigm into critical thinkers, we learn that critical thinking is at its most effective precisely when it has taken us completely out of our comfort zone. Consequently, not only is the authoritarian default more personally validating, but we continue to self-identify with it whenever we're persuaded that our default position already reflects the most consistent, factually objective one.
The saving grace in this equation is simply that as we engage in bias elimination, the cumulative emotional validation that results from applying it correctly gradually reinforces the practice.