A mass hijacking of our minds is currently underway. Perhaps you’ve felt it for a while now.
Underlying facts seem increasingly rare, “the truth” is debated back and forth without progress, and people’s behaviors continuously bend towards the irrational. It’s all so weirdly ironic — aren’t we supposed to be living in the so-called “Information Age”?
What the heck happened?
Well, we’re all human & sometimes believe in things that aren’t true. It’s a complicated psychological weakness — after all, “what is truth” is a hard question! As we seek truth as individuals, we’re lucky that most of our personal misjudgments are unique and relatively harmless. At scale, however, others have learned that our psychology becomes predictable. And exploitable.
Individuals and organizations — especially the media, political parties and politicians — above all seek to increase their power and status. Thus, in their communication, their primary goal is not to tell the truth or describe reality, but instead to win the game they are playing. This incentive leads them to build their audience by engaging in disinformation campaigns and “Narrative Warfare” tactics. They create stories & perceptions — some real, some fake — to manipulate our emotions and align us with their narrative & their goals.
Information, reality, and truth… none of them really matter as long as we end up joining their side and reinforcing their power.
Although mass manipulation is a trend throughout all of history, it turns out that giving everyone smartphones and instant access to every other human changes everything. The internet is jet fuel. Through Facebook and online media, disinformation now spreads to millions within minutes, is targeted to the specific people who will believe it, and pressures social groups to purge skeptics.
Over time, these attacks on our information ecosystem turn many people into fact-resistant conspiracy theorists and radicals — their identity and daily thoughts become dominated by the tribal battles they believe they are fighting. Once enough people radicalize, politicians need their votes and are forced to radicalize as well to get elected. Finally, the cycle completes: once in power, politicians opt to accelerate Narrative Warfare to reinforce their power instead of summoning the courage to lead. This is what we have seen play out in the US over the past decade, and it’s a downward spiral we must reverse. It threatens the core foundations of successful democracy: empathy, collaboration, useful debate, and truthful speech.
Unfortunately, I’ve met these politicians and observed these social patterns. From 2008 to 2010 I founded a political polling company and learned how political parties didn’t just want your data — they wanted to know who your family was, how to convince you of things, and which buttons to push. I got out of politics. Then, from 2011 to 2014 I was the first employee at a well-funded social media startup. We walked into the office every day desiring to “create community”, and even though some of us were educated in psychology, we were totally naive. Nowhere in our heads did we imagine the acceleration of misinformation and disinformation at scale. While we only reached millions of people, Facebook successfully spread to billions and doubled down on exploitable technologies.
So now we’re all living in manipulated realities, attacking each other on a daily basis, and often resorting to conspiratorial thinking. To get out of this, we all have a responsibility to grow wiser to “truth”, mitigate tribalism and take back control of our minds.
First, let’s start by better understanding the war we’re in.
Table of Contents
- Complexity of Truth
- Psychology of Truth
- Group Dynamics of Truth
- Societal Truth
- The Game of Belief
- Narrative Warfare
- Technological Acceleration
- What Next?
Complexity of Truth
We attempt to determine truth as individuals by making judgements, and these judgements are all based on information we encounter in various forms (e.g. school, books, Facebook posts, conversations, etc.).
Truth is complex and tricky, though — philosophers call its study “epistemology,” which roughly translates to “the study of knowledge and understanding”.
“Any fool can know. The point is to understand.”
- Albert Einstein
To know “the truth” you need not only good information, but all of the context required to create a full understanding. Without context, you are viewing only one piece of a larger puzzle. Most communication — especially short-form communication on the internet — is missing substantial context, which leads to misinterpretation.
Ask yourself two questions:
- How often are your words on the internet fully understood and interpreted correctly by other people?
- How often do you think, when you’re reading the news, you’re getting robust context?
The more complex a topic — like hedge fund market dynamics — the more context is necessary to create a full understanding. Because civilization is so advanced, the problems of the world are all quite complex, and people need to take extra care to gather context. With some topics, such as “building a growing economy”, the complexity is so staggeringly high that no single human can ever understand the entire problem. Literally no one knows the full truth.
History of epistemology:
300 BC: you can't know anything.
600: some stuff you can.
1600: no, not even that.
1800: what about this?
— Existential Comics (@existentialcoms) April 27, 2017
Even if you understand complexity and have “done your research”, when are you done? There are always “unknown unknowns” — context and perspectives out there that you didn’t even know you had to research in the first place. How unfair!
Thus, “the full truth” is never fully attainable by anyone. Nevertheless, at some point we accept the reality we perceive, make judgements, and take action anyway. There’s no other option — we must try to seek our personal truth over time. As is said, “the truth will set you free,” so we must learn to wade through the complexity and manage our personal biases.
Example: Recently GameStop has been in the news for its skyrocketing stock after a historic short squeeze, led by regular retail traders from Reddit, drove hedge funds to lose billions of dollars. Robinhood and other brokers limited $GME trading, and there was a massive backlash against the apps. The narrative? Robinhood was obeying orders from Billionaire Hedge Funds and the system was rigged. Both Ted Cruz and AOC seem to agree with that narrative. Context that came out later:
- Robinhood faced a liquidity crunch and proactively borrowed several hundred million dollars. They then limited trading so they could afford financial settlement with acceptable risk. It wasn’t that they wanted to help Billionaires, but instead survive as a business.
- Regular retail traders were the loud voices in the news, but other hedge funds aligned with them to exploit the stock. This wasn’t as clean as “Wall Street vs Main Street,” as many wanted to believe.
Psychology of Truth
We all have our flaws. Unfortunately — you and I — we’re absolutely terrible at determining what is real from what isn’t.
Some of our psychological vulnerabilities:
- Confirmation Bias: Once we make judgements, we stick with them, and any information that appears to support our worldview will be welcomed instantly — even if it’s a stretch. Evidence to the contrary is ignored, and even causes us, on average, to irrationally dig into our own beliefs instead of questioning them.
- Self-gain: We look for information to help us increase our status. We share information we think others will like, because then they’ll like us.
- Entertainment: The more sensationalist and crazy sounding the news, the more we are likely to engage with it, talk about it, and share it.
- Fear: If something makes us fearful, most people will first assume it is true to hedge risk. After all, if it’s not true then there’s nothing to worry about!
- Lack of Time & Laziness: It’s a luxury to have time to do your own research. Even if you have the time, do you want to read a 5 sentence summary or a 10 page paper? Short content captures attention more easily, but leaves no room for nuance or context. This is best shown by the fact that we only read headlines — 59% of all social media shares happen without reading the article.
- Repetition Reinforcement: if we hear something multiple times, we are more likely to believe it. This was the core propaganda technique of the Nazi party.
- Incorrect Focus: Too much information about the wrong thing distracts us. If others talk about a single aspect of a topic, we naturally assume other aspects are less meaningful. This happens in science, for example, where the topics discussed are only those which are well funded.
- Hero Effect: We are inclined to believe powerful people are out to get us. We actively want to be the protagonist against a planned, well-organized group of powerful, evil people. For example, 40-50% of less educated Americans think COVID was intentionally planned by powerful people.
- Short Working Memory: To hold nuanced views we have to be able to hold multiple topics in working, short-term memory. But when we are bombarded with hyper-active amounts of information, nothing stays in working memory long enough to create nuanced views.
And that’s just to name a few! There are hundreds more, such as this list of 100+ cognitive biases.
For most of us, what we believe is more predicated on our psychological vulnerabilities rather than what is true and real. This is why the scientific method was so revolutionary — it’s a framework for determining truth without requiring trust, faith, or belief in other people.
We all suck at determining truth.
Group Dynamics of Truth
Things get really chaotic once you move beyond the individual psychological level and into groups. From a very young age, we all seek to be accepted as part of an in-group — a community. It’s hard-wired into us.
In order to collaborate with others, we modify our behavior and change our opinions to gain favor, and this pattern of conformity leads us into what sociologists call “Groupthink” (Janis, 1972). Most people will align with their group’s decisions based on whatever narrative is hot, regardless of whether it’s true or whether anyone has taken the time to research it.
Some symptoms of groupthink:
- Critical Thinking Delegation: We listen to others and trust that they were able to do their own research. We offload our thinking to them so we don’t have to do it ourselves.
- Confidence Bias: If someone is confident and appears to be speaking honestly, we often choose to believe them. Narcissists and sociopaths are able to convince people of false narratives because they believe in them themselves.
- Inherent Morality: If a group is part of our identity, we believe we are part of “the good guys”. No one wants to believe they are “the baddies”.
- Self-censorship: We won’t say things if we don’t think our group will approve. We also censor negativity so as to avoid judgement of our personality.
- Illusion of Unanimity: Due to self-censorship, once an initial decision is made it can falsely appear as though everyone agrees.
- Pressure on Dissenters: If someone does dissent from the common judgement, others are incentivized to call them out and critique them.
In small groups these symptoms help us work together with less drama — we have evolved them for a reason!
“Every once in a while, I have my own feelings. Mostly I‘m having someone else’s. ‘You’re over-feeling this.’ needs to be a thing we can say as easily as we suggest ‘overthinking’ it. Yet, we talk about ‘Groupthink’ when ‘Groupfeel’ is the new wave transforming our public sphere.”
— Eric Weinstein (@EricRWeinstein) January 26, 2019
Unfortunately, as communities grow beyond our evolved norm (Dunbar’s Number), all of the symptoms become amplified and more dangerous. Why? The larger the community, the fewer people you know personally. Whereas telling an intimate friend a hard truth might result in a stronger friendship, telling an acquaintance a hard truth usually results in abandonment. Jan Breitsohl, a lecturer out of the University of Glasgow, studied communities online in 2015 and found Groupthink grows stronger at scale.
“So, in the interests of survival, they trained themselves to be agreeing machines instead of thinking machines. All their minds had to do was to discover what other people were thinking, and then they thought that, too.”
- Kurt Vonnegut, Breakfast of Champions
Thus, over time:
- Groups converge to have extremely similar beliefs. Individual, critical thinking vanishes into thin air.
- Incentives grow stronger to follow the group even if it adapts and changes into something bad.
- Larger groups are more likely to be misinformed and less likely to correct their mistakes.
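These convergence dynamics can be illustrated with a toy “voter model” — a standard simplification from opinion-dynamics research, not a claim about any specific platform. In this sketch, each step a randomly chosen member simply copies another random member’s opinion; opinion diversity steadily collapses until the group is unanimous, with no truth-seeking involved at all.

```python
import random

def simulate_conformity(group_size, num_opinions, seed=0, max_steps=500_000):
    """Toy voter model: each step, one member copies another's opinion.

    Returns (steps_until_unanimity, final_opinion)."""
    rng = random.Random(seed)
    opinions = [rng.randrange(num_opinions) for _ in range(group_size)]
    for step in range(max_steps):
        if len(set(opinions)) == 1:
            return step, opinions[0]  # unanimity reached
        imitator, role_model = rng.sample(range(group_size), 2)
        opinions[imitator] = opinions[role_model]  # conformity: copy, don't reason
    return max_steps, None

steps, consensus = simulate_conformity(group_size=50, num_opinions=10)
print(f"Unanimity after {steps} copy-events; everyone now holds opinion {consensus}")
```

The point isn’t the specific numbers — it’s that pure imitation is sufficient, on its own, to produce a group that looks unanimous.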
Societal Truth
Information spreads beyond just individuals and groups — it expands and adapts as it moves around our grand, interconnected social network of society.
This happens via what anthropologists call a “meme”, and no, they’re not only funny internet pictures. According to Richard Dawkins, “a meme is information that spreads across culture in the form of ideas, behaviors and skills transferred from one person to another by imitation.” Today, we’re exchanging culture faster than ever before. Memes include fashion, inventions, mannerisms, jokes, and even stereotypes of other people.
Examples of memes & their origins:
- Washing your hands to prevent the spread of germs & viruses. Now common, this practice started when a doctor named Ignaz Semmelweis discovered, in a single hospital, that doctors were carrying deadly germs from corpses to patients.
- QAnon is now a large movement of conspiracy theorists who claim Donald Trump is taking down pedophile rings, and protecting the USA from corrupt Democrats. It started as a single post on a message board called 4chan, likely as a joke, until it gained serious traction and became a source of power for its creators.
Our beliefs and world views are nothing more, really, than combinations of the memes we accept into our lives. We might change them a little and have unique thoughts, but by and large most beliefs you hold probably aren’t unique across a million people, let alone 8 billion people.
Once a meme is out there in the wild, it is like a living, breathing thing — you don’t fully control it anymore! It wants nothing more than to survive, and will adapt to ensure its survival. How does it live on? It plays a game.
The Game of Belief
You can imagine that ideas and behaviors, once communicated, are all playing a “game of belief”, in that they’re all vying for your attention and you get to pick the winner. When you choose to communicate memes to others, you are helping those memes win the game.
Memes play this game not only through validity, but also by building teams and rapidly evolving.
Memes combine forces to help one another on their path for self-preservation.
There are three types of memes on a team:
- Central: Memes that represent core beliefs.
- Defense: Memes that prevent you from leaving their Central Meme.
- Offense: Memes that spread their Central Meme.
Daniel Schemblachter uses the example of Religion to explain these types of memes. In Christianity, “Belief in Christ as Savior” is a Central Meme. It is the core belief. Defense Memes could be “God demands faith and doubt is a sin” or “If you stop believing, you will lose your community”. If you ascribe to either of those memes, you’re less likely to ever abandon Christianity.
Christianity’s offense Memes? Mission trips, “save the neighbors”, baptise your children, send kids to religious school, etc.
Like Christianity, these patterns are present in every belief system that lasts. Whether it’s capitalism, communism, veganism, or sports fandom. Every world view has its beliefs, and people naturally think of ways to defend and spread them. People fight for what they believe in. People create the offense and defense teams.
As memes spread and are passed to various people, they are changed and edited. This happens both intentionally and accidentally.
To use Christianity as the example again, consider first Christ’s original teachings. They centered on peace and unity. So how did Christianity somehow spawn violent crusades? Someone in power convinced others that crusades were the way to spread the teachings and take back holy land.
When offense memes encounter defense memes from another belief system, you have competition between the two, and they will evolve to figure out who wins.
The game of belief intensifies.
Narrative Warfare
All power structures have a foundation that is the belief in their validity. If enough people don’t believe in capitalism, it won’t work. If enough people don’t believe the church, it won’t be able to thrive.
All beginnings have an end — all power structures are inevitably replaced by something else — but most of them do not go quietly. Instead, some leaders of those power structures choose to go to war. They have a goal in mind, a worldview to spread, and their own power to defend. They also have influence and can make things happen in the world, which is why narrative warfare becomes so dangerous as it escalates.
What really sets Narrative Warfare apart from the Game of Belief is that the general population — you and I — aren’t really players. We’re the ones being played.
Our minds are the battlefield.
The goal of any power structure is to win the Narrative War by convincing people that you are on their side, believe what they believe, and will do what’s best for them. And instead of winning on merit, those fighting in Narrative War rely on outright manipulation.
“It would not be impossible to prove with sufficient repetition and a psychological understanding of the people concerned that a square is in fact a circle. They are mere words, and words can be molded until they clothe ideas and disguise.”
- Joseph Goebbels, Nazi Reich Minister of Propaganda
Narrative Warfare goes beyond political parties and history; it is also conducted by modern nation states (including Russia, China, and the United States), their militaries, and intelligence agencies.
“Cognitive scientists, cultural anthropologists, behavior scientists, and game theory experts must be included as professional meme-wielding-gunfighters on future battlefields. The US must recognize the growing need for emerging disciplines in ideological warfare by ‘weaponeering’ memes.”
More recently, Jeff Giesea — an alt-right organizer — penned an article “It’s Time to Embrace Memetic Warfare” in 2017 as a project of the NATO Strategic Communications Centre. Isn’t it telling that the same person “organizing the grassroots movement that helps elect Trump” is pitching the exact same tactics as useful for nation-to-nation combat? He even admits the risks, all while engaged in the corruption: “[my] biggest change of view, if I am being honest, is a nagging sense of dread. Memetic warfare is only going to get more intense… free societies must be concerned about the corrosive effect these battles may have on our sense of trust in each other and our institutions”. Well, Jeff, thanks for calling yourself out!
Own the narrative, via memes, and own the world. History has shown this is true, and so it is true today. Let’s cover some rules of Narrative War so we can better understand our present situation.
“Who controls the memes, controls the Universe”
— Elon Musk (@elonmusk) June 26, 2020
Rule #1: Start by co-opting an existing movement
Looking throughout history, you can spot a trend: memetic evolution related to existing power structures. Those with power are able to co-opt existing movements, edit memes to their liking and distribute their “Shiny, Brand New!” version to their followers. This helps them gain and reinforce power very rapidly.
Although QAnon was spawned by a random person on 4chan, it was later co-opted by people like Lin Wood, Rudy Giuliani, Jeff Giesea, and Donald Trump to help accelerate their fame and power.
Put simply, it’s a lot easier to convince people you’re on their side if you publicly pick their side.
Rule #2: Make people less unique
Once you have your foot in the door, you have to keep it going and convince more people. You have to make statements that people believe in. Unfortunately for you, many people have different beliefs.
How do you convince millions of people that “you believe what they believe” if they’re all unique?
You can’t. Instead, manipulate the narrative and divert people’s attention towards similar problems and similar beliefs. In effect, everyone becomes less unique — less of an individual. Once people have shifted their talking points to a topic you chose, align yourself with it.
Traditional democratic republicanism is about representatives learning what their community believes and fighting for the people they represent — politicians adapting to constituents. Modern politics, via Narrative Warfare, is the exact opposite. It’s about manipulating people to believe things so they naturally end up on your side — constituents adapting to politicians.
Manipulating people like this sounds a bit evil, right? Well…
Rule #3: Power games attract sociopaths
There are many “zero sum” games in the world, and hierarchical games such as politics filter for people who are good at competing. Unfortunately for us, sociopaths outperform others at playing these games, and thus more of them end up as leaders of power structures engaging in Narrative War.
Rule #4: Threatened power structures escalate manipulation
While all politicians lie, some of them will go much further than others. Truth is hard and some sociopaths feel the need to win no matter the cost.
If these people want to manipulate the narrative and convince people to believe in something, what is the easiest psychological vulnerability to prey upon? What is the fastest and most reliable way to build an audience?
Answer: Fear! Conspiracy! Evil people are out to get you!
And what if it’s hard to make people fear another side? What if they’re not all that threatening?
Answer: Lie! Create alternative facts and alternative realities!
Rule #5: Lie & distort reality to create a competitive advantage
It is much easier to imagine and create fake realities than it is to accurately describe reality itself — in the time it takes to investigate facts and uncover truths, you could have made 10 alternate realities instead.
Narrative Warfare combatants employ many strategies:
- Disinformation: a direct lie, usually reserved for the most extreme of politicians and media outlets.
- Distortion: lie via emphasis bias. Political “spin”.
- Omission: lie by leaving out important counter-points. The most common type of lie on political media, where you only hear one side of the story.
- Bias alignment: To maximize people’s attention and affinity, find out their pre-existing biases and align with them across the board. This works extremely well with “single issue voters”.
- Label generalization: make generalizations about another group of people to make them seem like one set of people who are all the same. Very easy in a two-party political system.
- Make an enemy: pitch the other side as the enemy, or against what you believe in. Do not let your followers believe the other side is interested in real debate. Be a victim.
- Conspiracy creation: beyond harmless enemies, convince your followers the other side is filled with a powerful, secretive elite plotting their destruction. This triggers their psychological “hero effect” to believe you’re the protagonist and cultivates an environment where they’ll see all actions from the other side as misdirection (automatic creation of defense memes).
- Hero alignment: brand yourself as the good guy. Take claim to irrefutably honorable qualities — you have no bias and are defending principles like “free speech” and “independent truth”.
- Over-simplification: spoon feed information that is simple so people get it, even if there is missing context. Leave out things people might disagree with or might make them think critically. Over time this dumbs down the population.
- Complexity obfuscation: make things sound more technically complex than they are, in hopes people view you as smart and off-load their critical thinking to you.
- Viral focus: Communicate ideas that make people click “like” and “share”. These are often lies or manipulated stories that prey upon our craving for sensationalism.
- Attention hijacking: overwhelm your followers with fear against a specific group so it seems as though your group is under attack. Exploit people’s psychology that defaults to trust you when under attack.
You might see a pattern here. Every single strategy used in Narrative Warfare preys upon a combination of our human, psychological vulnerabilities. This is why they’re super effective, and why so many fall to them.
Rule #6: Meme reinforcement creates a trap
As Narrative Warfare media begins to distort people’s perceptions of reality and radicalize them, they pick a side and entrench themselves in the narrative.
Once that narrative is set, it is very difficult to change. As mentioned above, people will invent new defense memes in order to protect their own beliefs, even if the underpinnings of those ideas are faulty.
If the power structure’s narrative was predicated on, for example, making immigration sound like a threat to the nation, the party cannot effectively backpedal. If they try to change course, people will then claim those very individuals are against the original movement and purge them from the ranks.
Rule #7: Power structures incentivize credible people to join
Within any political party or organization that has been caught doing terrible things throughout history, there have always been “good people” who just happened to fall under the spell.
They didn’t really believe it!
They didn’t think it would get that bad!
In a democracy, when your voters are becoming radicalized and beholden to the Narrative War, you too must adapt or quit. Radicalize or lose the votes! Most politicians choose to keep their power, and thus join the side of the manipulators. In doing so, they make the entire process credible and far more dangerous. Instead of bad actors coming in and tricking all of us by overthrowing a working system, the system itself begins to decay.
Rule #8: The Narrative War is bigger than the players
And so the decay continues on and on — the war is now out of any individual’s control. Those who set it into motion, perhaps fearful of their own creation, will find themselves far less powerful than before if they combat the narrative.
“A man may die, nations may rise and fall, but an idea lives on.”
- John F. Kennedy
Social movements and people’s beliefs are resilient. Once set into motion, they evolve and, as per Rule #1, become co-opted by others who are perhaps even more nefarious. It could be another political leader who dreams of power, or perhaps it’s a foreign nation-state seeking to sow chaos.
As our luck would have it, the internet massively accelerates Narrative Warfare.
Technological Acceleration
When I came to Silicon Valley in 2011, everyone was filled with hope and enthusiasm. Facebook was early in its mission to “connect everyone in the world” and few were very concerned about Google’s data collection.
Sophocles said “Nothing vast enters the life of mortals without a curse.” As it turns out, if you succeed at connecting everyone in the world, people communicate in really strange ways. Often dangerous. And if you collect all of someone’s entire search history, the power you have over them is too immense for any individual to responsibly hold.
I can attest — often engineers, product managers, and designers do not know what they are creating. While they believe they are on a personal mission to connect people across the globe, they likely do not realize the potentially evil tool they are handing to sociopaths. I used to think the same way.
From 2008 to 2010 I built a political polling company with the hopes of connecting voters to their politicians instantly. Our goal? Encourage politicians to actually talk with their voters more, learn what their needs were, and enable listening at scale. Guess how it was used? Ask your voters questions, slice and dice your data, and then target them specifically so you can use them. Top GOP pollsters asked us if we could “collect data on families” and offered to buy us out. We refused because we didn’t want to be partisan, and within 2 weeks they had forced all of their politicians off of our platform.
Then, from 2011-2014, I built a community network with Lady Gaga that raised $20 million and scaled to millions of users. Our goal? Bring communities together and create healthy relationships. With this in mind, we studied sociology to help communities form at a rapid pace and psychology to “increase engagement” – cough, make users addicted. Although we only reached a couple million people, Facebook now covers half of the global population.
We cannot change the past, but today we must assess, realistically and honestly, what role technology has played in amplifying Narrative Warfare. It’s not pretty.
Accelerant #1: Misinformation spreads within minutes to millions
When sensationalist information is shared by political manipulators, it spreads fast. Outlets claiming to have no bias spread misleading narratives and outright false information on an hourly basis. Millions of people engage with it within minutes.
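The speed is a simple consequence of branching math. As a back-of-the-envelope sketch (the follower count and reshare rate below are invented illustrative numbers, not platform data): if each share reaches f followers and a fraction p of viewers reshare, reach multiplies by R = p × f per “generation” of sharing — and whenever R > 1, cumulative reach grows exponentially.

```python
def cumulative_reach(followers_per_sharer, reshare_rate, generations):
    """Branching-process sketch of viral spread.

    Generation 0 is one original post reaching `followers_per_sharer` people;
    each generation, a fraction `reshare_rate` of viewers reshare to their
    own followers. Returns (total people reached, reproduction number R)."""
    r = reshare_rate * followers_per_sharer  # audience multiplier per generation
    reach, viewers = 0, followers_per_sharer
    for _ in range(generations):
        reach += viewers
        viewers *= r  # next generation's audience
    return int(reach), r

total, r = cumulative_reach(followers_per_sharer=400, reshare_rate=0.005, generations=12)
print(f"R = {r:.1f}; ~{total:,} people reached in 12 sharing generations")
```

With these made-up numbers, a reshare rate of just 0.5% clears a million people in a dozen generations; with R below 1, the same math dies out almost immediately — which is why design tweaks that nudge reshare rates even slightly upward matter so much.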
Accelerant #2: Groups grow large very fast and easily remove skeptics
You can spin up a new blog website or Facebook Group and get 50,000 members within days if you have the right message. As we learned earlier, big groups are terrible at communicating reality and facilitating nuanced discussion.
Worse, on internet platforms skeptics can be purged in one-click — you don’t even have to debate them any more. Instead, censor first and… well, never ask questions either.
Accelerant #3: Algorithms analyze our beliefs & let Politicians target us via advertising networks
Every word, every click, and every second that passes as you look at a picture online is tracked, recorded and logged. This data from billions of people is then fed into computers which categorize you across thousands of different variables.
If, for example, a politician wants to find people who might be afraid of immigration, they can target those who live in cities with changing demographics. Advertising networks, especially Facebook Ads, give politicians and organizations these super powers for pennies.
All hyper active politicians on Twitter are de facto ‘governing’ by algorithm.— Geoff Lewis (@GeoffLewisOrg) January 17, 2021
This isn’t a value judgement — I don’t have a blanket view on whether it is good or bad — but I do believe it’s what’s happening.
Accelerant #4: Targeting automatically exploits psychological weaknesses with “A/B tests”
Even if you can target someone who might be afraid of immigration, how do you really know they’re afraid? How do you know if you’re effectively making them afraid, or just shouting into the wind with your advertisement?
Facebook Ads and other advertising engines allow you to run multi-variable A/B tests. Imagine if you could easily make 100 different images and slogans, then test them within minutes to see which versions were getting the most clicks. That’s how Facebook’s systems work.
And once someone clicked, you know who they are! You now know with 100% confidence they are your future follower — someone you will “re-target” over and over again, until they are your pawn in the Narrative War.
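The mechanics are easy to sketch. Below is a minimal, hypothetical version of such a test — the click-through rates are invented, and real ad platforms use far more sophisticated bandit algorithms — but the principle is the same: show the variants, count the clicks, and pour the budget into whichever message “works”.

```python
import random

def pick_winning_ad(true_ctrs, impressions_per_variant=10_000, seed=42):
    """Simulate a crude A/B/n test over ad variants.

    `true_ctrs` are the (unknown-to-the-advertiser) click probabilities;
    the test estimates them by counting simulated clicks and returns
    (index of winning variant, click counts per variant)."""
    rng = random.Random(seed)
    clicks = []
    for ctr in true_ctrs:
        # Each impression clicks with probability `ctr`
        clicks.append(sum(rng.random() < ctr for _ in range(impressions_per_variant)))
    winner = max(range(len(true_ctrs)), key=lambda i: clicks[i])
    return winner, clicks

# Four hypothetical slogans; variant 2 happens to push the strongest fear button.
ctrs = [0.010, 0.012, 0.031, 0.015]
winner, clicks = pick_winning_ad(ctrs)
print(f"Variant {winner} won with {clicks[winner]} clicks — retarget everyone who clicked")
```

Note that the advertiser never needs to understand *why* a variant works — the test surfaces the most psychologically effective message automatically.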
This played out clearly in South Florida in the 2020 election. It turns out if you say “socialist plot” enough, Cuban Americans in Florida really get scared and believe they’re fighting U.S. Communism. Odds are the Trump campaign never thought about targeting Cuban Americans in Florida; instead, Facebook automatically showed the campaign that its message was working with those voters.
Accelerant #5: The internet has fragmented the media and demonetized trustworthiness
While people are subject to this manipulation, the foundation of trust is eroding around us. The internet has given people a lot of things to do besides read the news, so media outlets have become more sensational to drive more clicks. Fact-checking and editorial boards are no longer rewarded, so traditional outlets have destroyed their own credibility in their forced evolution.
Spinning up a new media blog now takes minutes and anyone can do it — all you need is an opinion. These new media blogs are, contrary to popular belief, even more willing to spread fake news and misinformation than traditional outlets. They have no serious reputation to protect and the money/engagement is of vital importance to their early growth. There is a lot of money in Narrative Warfare, so many decide to play the game.
So what do we do about it?
This is a historic moment where civilization is going to have to learn to adapt, or else our suffering may dramatically increase.
We have to get a lot wiser – there is no silver bullet. It’s a lot easier to identify the problem than recognize a solution, and most ideas fall into the bucket of “personal responsibility,” which is unreliable at scale. Still, I believe we must try, so here are some ideas.
- Fully understand the foundations of Narrative Warfare, Memetics, and personal incentives.
- Stop trying to find simple answers and perspectives — let reality be complex. Be less confident and more intellectually vulnerable.
- Default to “no” on all conspiracy theories. There is no personal benefit to believing them.
- Avoid believing other groups are your enemy. Instead, treat them as potential collaborators.
- Ask “why” a group or person is sharing what they’re sharing, and don’t assume they are being truthful.
- Be careful with what you share and understand its power to deceive.
- Seriously consider alternative perspectives to your beliefs. Join more groups.
- Steelman your arguments. Discuss tradeoffs between beliefs and multiple solutions.
- Apply the golden rule across your interactions with others. Do unto others as you would have them do unto you.
Moderate Your Groups
- First of all, do not encourage those who escalate the game — especially those who call for violence.
- Demand accountability from those who have nefariously played these games.
- Encourage alternative viewpoints and debate, even when you disagree with them.
- Encourage diverse communities where others aren’t censored easily.
- Encourage critical thinking, balancing of tradeoffs, etc.
- Point out deceptive media but acknowledge the core reasons why someone shared it.
- Demand advertising network changes at Facebook and other social networks. Do not allow them to categorize or target us based on beliefs, political look-alike audiences, or other exploitation vectors.
- Encourage social networks to experiment with fact-checking tooling and misinformation management. If you need to extend faith, put it towards fact-checkers more than your own biases. If you distrust a specific fact-checker, try multiple.
- Spread the lessons of Narrative Warfare. Although I’m biased, think about sharing this post.
If you have any more ideas, I’m very interested in hearing them over Twitter!
To end on a more optimistic note, I will state a core belief: we will evolve and learn. Humanity has always evolved to face its challenges, and this specific set of problems is so new it would be really remarkable if we already had the answers.
If you assume we will get through this, then we must come out the other end a far wiser civilization. That requires us to solve core questions about humanity and to become smart about our own biases in a complex search for truth. Just as the scientific revolution led to a world-changing 300-year spike in technological progress, perhaps a revolution stemming from Narrative Warfare will create a spike in cultural progress. I am hopeful.