Have you ever been completely certain you were right in an argument, only to realise later you weren't?


On March 24, India's Parliament passed the Transgender Persons Amendment Bill by voice vote.

It removed the right to self-identification that the Supreme Court had granted in its landmark 2014 NALSA judgment. 

It handed gender recognition to a government-appointed medical board instead.

Within hours, two versions of the same bill were circulating across Indian social media.

Version one: The bill protects vulnerable people from exploitation and fraud. It streamlines a broken system. Adds stronger penalties for crimes against transgender persons. Finally, proper governance.

Version two: The bill erases people. Amnesty International called it a serious setback for human rights. Two members of the National Council for Transgender Persons resigned. Trans communities protested. Lawyers started preparing constitutional challenges.

Two groups of people reading identical clauses and arriving at conclusions so different that each side genuinely cannot understand how the other side got there.

This was all over my WhatsApp groups.

People I respect on both sides. Smart, thoughtful people. And none of them were even fighting. They were just confused. Genuinely unable to see how the other group could possibly read the same bill and come away reading it so differently.

A football game in 1951 that two universities watched and saw two completely different things

In November 1951, Princeton and Dartmouth played a football game that got ugly. Lots of penalties. A Princeton player broke his nose. A Dartmouth player broke his leg.

After the game, both campuses were furious at each other.

Princeton students said Dartmouth had played dirty. Dartmouth students said Princeton started it.

Same film running through the same projector. But what each group noticed, what they flagged, and what they dismissed was completely different.

Their loyalties shaped what their eyes picked up.

Hastorf and Cantril published their findings in 1954. It became one of the earliest demonstrations of something psychologists would spend the next 70 years studying.

Nearly 30 years later, three researchers decided to test whether giving people better information could fix this.

In 1979, Lord, Ross, and Lepper gathered 48 people. Half of them supported the death penalty. 

And the other half opposed it.

They gave both groups two research papers to read. One paper had evidence supporting capital punishment. The other had evidence against it.

You'd expect people to read both and conclude somewhere in the middle, right?

The opposite happened.

Supporters read the paper that backed their position and found it well-researched and convincing. Then they read the paper that challenged their view and immediately found problems with it: weak methodology, small sample sizes, not enough nuance.

Opponents did the exact same thing in reverse.

After reading both papers, neither group moved toward the middle.

They moved further apart.

More information made each side more confident that they were right.

Lee Ross eventually gave this entire phenomenon a formal name: naive realism.

Three assumptions we all carry without realising it.

First, that we see the world as it actually is. Second, that anyone with access to the same facts should reach the same conclusion we did. Third, that if they don't, something is wrong with them. They're uninformed, irrational, or biased.

In 2010, the Handbook of Social Psychology recognised it as one of four foundational contributions to the field.

And it explains nearly every argument you've ever had where you walked away thinking "how can they not see this?"

They were thinking the same thing about you.

The transgender bill is just the most recent version.

But this is the same thing that happens with the reservation debate, with election results, with the farm laws.

Two sides reading the same information, arriving at completely opposite conclusions, each fully convinced the other is either not paying attention or arguing in bad faith.

India's information environment makes this worse than it already is.

Whether it's a forwarded WhatsApp message, an article the algorithm served up, or a dinner table argument, the logic is the same: I see it clearly, so if you disagree, the problem is on your end.

What Ross found actually breaks this

Ross didn't just study the problem. He tested solutions too.

In his conflict resolution experiments, regular dialogue between opposing sides usually made things worse. People just restated their positions louder.

But one technique produced 100% agreement.

He asked each side to name one point from the other side's argument that they believed had some legitimacy.

They didn't have to agree with the other side or change their position. Just find one thing in the opposing argument worth taking seriously.

That's it.

When you actively look for what's legitimate in the other view, your brain has to accept that a reasonable person could hold that position for reasons that aren't stupid. It breaks the assumption that anyone who disagrees must be ignorant or biased.

A conversation about the bill was getting heated in one of my WhatsApp groups. I asked one question: "What's the strongest version of the argument you disagree with?"

Nobody responded for a bit. Then someone on each side reluctantly named one thing.

The conversation didn't become agreement. But it stopped being two monologues.

Took about 30 seconds to shift the entire tone.

Since then, I keep coming back to a simpler version of what Ross was doing.

Before I react to any disagreement, I ask myself: what would my life have had to look like for the other person's reading to be the obvious one?

If I'd spent years watching welfare systems get exploited, a verification requirement would look like basic governance to me.

If I'd spent years watching people fight for the right to define their own identity, removing self-identification would feel like an attack on their dignity.

Both readings are internally consistent. Both are incomplete.

And naive realism is why neither side can see what's missing from their own.

Ross called it "a dangerous but unavoidable conviction about perception and reality."

Dangerous because it makes you certain. Unavoidable because it's how human perception works by default.

The only defence is knowing you have it.

Hit reply and tell me: what's a disagreement you're in right now where you feel completely certain? Run the question on it. What would your life have had to look like for the other side to seem obvious?

Tell me what you find.

I read every email.

Until next week,
Ritesh

P.S. The bill received Presidential assent. The legal challenges have begun. Both sides remain certain. Some things take longer than one conversation.
