Why Social and Analytical Intelligence Rarely Coincide

Or: Why Terrorists and Communist Leaders are often Engineers

Or: Why Autism Would Not Be A Disability If Presidents had Ph.D.’s

We all know the stereotype of the nerd with no social skills. The Big Bang Theory is an entire television show built around this trope: these guys are super smart, but they don't understand people. We also have the stereotype of the person who is extremely charismatic and socially intelligent, but not very analytically smart. The first character is the stereotypical engineer; the second is the stereotypical sales guy.

The stereotypes exist for a reason. I was once the stereotypical engineer. I told myself I couldn't learn social skills because I didn't have the innate talent – the way some people say programming isn't for them. I suffered for it, and eventually decided to learn social skills anyway. I've put a lot of effort into that area, and a lot of thought along the way.

I once worked an IT job where the CEO insisted that all of his passwords be his first name, because anything else was 'too complicated'. Everybody else in the company was required to have a real password and change it regularly. The one person who really shouldn't be compromised was allowed a total joke of a password that anyone could guess. This guy had social skills, which is what you need to get into a position of leadership – but he didn't have technical skills, which you now need to stay in one. With companies and organizations being hacked regularly, we need leaders who have both social skills and analytical intelligence. Why do the two seem to coincide so rarely?
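The policy everyone except the CEO had to follow can be sketched in a few lines. This is a hypothetical illustration, not that company's actual rules – the minimum length and character-class requirements here are assumptions:

```python
# Hypothetical sketch of a basic password policy: a minimum length
# plus a mix of character classes. A bare first name fails every check.
def meets_policy(password: str, min_length: int = 10) -> bool:
    if len(password) < min_length:
        return False
    has_upper = any(c.isupper() for c in password)
    has_lower = any(c.islower() for c in password)
    has_digit = any(c.isdigit() for c in password)
    return has_upper and has_lower and has_digit

print(meets_policy("steve"))          # False: too short, no mix
print(meets_policy("Tr0ub4dor-xyz"))  # True: long enough, mixed classes
```

The point of the anecdote stands either way: the rule existed, and the one account most worth protecting was exempt from it.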

This post is an attempt to explain the pattern. My hypothesis is this: the habits a person develops while building analytical intelligence can get them hurt when applied in social environments. Likewise, the habits developed while learning social intelligence can get a person hurt in analytical environments.

Ultimately, the social environment we live in is dependent on our culture. The mathematical environment we live in would be the same regardless of which physical universe we lived in – let alone which tiny rock we were hitching a ride on. For us to survive the future, we will need to construct a culture that does not pit these two forms of intelligence against each other.

The social environment we evolved in was one of ignorance. The assumption of universal ignorance is baked strongly into our culture, and this assumption is what gives rise to cultural mechanisms that are designed to punish people who cling strongly to a truth – unless that truth is one we happen to already agree is true.  When we don’t know what’s going on, someone insisting too strongly on a claim can cause serious damage.

Our Inflexible Relationship with the Truth

Social situations in our culture require flexibility. The requirement is so strong, you could say it's the only thing the culture is inflexible about: its demand that everyone be flexible when it comes to things which aren't agreed upon. At the same time, we are required to be entirely inflexible when it comes to things which are agreed upon. This rigid split – "you must not be flexible here! you must be absolutely flexible there!" – lies at the root of the problem.

For example, if everyone is convinced that some shape is a circle, and you say "but no! it's a square – see, it has four sides!", you are labeled a troublemaker. If you're already in charge, this works – people will listen to you – but they may still grumble behind your back unless they really respect and trust you. If you are not in charge, and you tell the leader they are wrong, you've set yourself back twofold: you have relegated yourself to an 'outsider' role, and your true claim is now seen as heresy. The group will ignore anyone who makes that claim in the future, and label them an outsider too.

I've been in a situation where one engineer would get very upset when management made bad decisions. I agreed with him – I thought they were bad decisions, too. But I understood that expressing anger, unless you are the leader, gets you laughed out of the room and ignored in the future. If you are the only one who is angry, you are either the leader or you are a loser; there is no in-between. And if everyone has agreed that the Bad Decision is actually a good one, and you say "this is a bad decision", they will blame your lack of cooperation for the failure the bad decision causes:
“Of course it failed! We didn’t work together!”

The best thing to do, if you find yourself in this situation, is: 1) insist on the leadership skills of whoever made the Bad Decision; 2) work tirelessly for the cause of the Bad Decision until it blows up – if it's bad, it inevitably will; 3) blame the leader for the Bad Decision, which you fully believed in – just like everyone else! – because the wise leader endorsed it, and perhaps he made a mistake, and maybe we should have a new leader.

I agree, that's a crappy way of doing things, which is why I've designed technologies to enable a smarter culture – but in the meantime, this is the world we live in. For those interested in Zen, the above technique is how Zen masters manage to befuddle their students: the student insists X is true, the master takes that conclusion to its paradoxical end, and the student goes, "huh, how did he do that?"

Notice I'm not talking about the truth here – I'm talking about what people agree on. If there's one thing our culture agrees on now, it's that we haven't agreed on anything which is totally false. That's absurd, but it's what our culture believes about itself: that it holds no false beliefs it is certain of. I know the truth exists, but I also know that whatever I believe has only some intersection with it. I am perfectly content to doubt my own thoughts, and I think this is essential for sanity in the modern world. I believe fully that the truth exists, but to claim that you have it, or know anything with certainty – I think that claim is the height of foolishness. A claim requires axioms, and axioms require a frame of perspective; to assert claim X as undeniable truth is also to assert the correctness, consistency, and validity of both an axiomatic system and a frame of reference – which is much trickier to understand and often impossible to prove. I'm not certain of that, though.

This mandatory flexibility – the willingness to accept something you don't agree with, because everyone around you believes it – is essential for survival in our political climate. A politician who came out strongly against slavery in the early 1800s would get nowhere. A politician who came out strongly in favor of gay marriage just 20 years ago would get nowhere. I don't believe for one second that Barack Obama was opposed to gay marriage until 2012, when he suddenly had a change of heart that coincided with opinion polls shifting to show support for it. He changed his mind when it was publicly safe to do so, because he was afraid of sacrificing his position of leadership.

Is that really leadership? It has been in the past. It won’t be forever.

Flexibility of belief – when it comes to things which aren't publicly agreed on – is essential for survival in our social culture. In analytical thinking, however, too much flexibility can prevent you from seeing the truth. If you know it's really a square, and you say it's a circle because that's what everyone else is saying, you can start to convince yourself it's a circle. If insisting "it's a circle!" is essential to your survival, then believing the truth can get you killed – unless you are constantly on guard against the truth coming out of your mouth.

If you want to know why politicians are liars, this is the reason. We push them into it with our culture. The official story in America is "You're overcomplicating it! Just give people freedom to choose and protection from bad guys, and they'll figure it out." Then we argue incessantly over who and what the bad guys are.

The nice thing about flexibility – when it's universally applied – is that it prevents us from fighting with each other. We aren't universal about it, though: if someone challenges our beliefs in an area "we have all agreed on" (depending, of course, on your definition of "we" and "all"), we are likely to become angry and attack them for Saying The Thing Which Is Not.

An autistic kid who freaks out because they are overstimulated and things aren't what they expected – it happens because they see more and hear more than most people do. They don't make eye contact because a fraction of a second for them is like ten minutes of eye contact for you. Our culture says autistic kids have a problem because they don't do well in our culture, but our culture isn't built for them; it's built for us. We are inflexible about enforcing the unwritten rules we have designed to make ourselves comfortable, and then we blame people capable of amazing things for not getting with a program we don't fully understand ourselves.

A complex set of axioms which, when adhered to rigidly, can produce great works – this could describe either the world of engineering, which allows us to build giant buildings, or a militant religion which inflexibly declares how the world works and gives us reasons to blow the buildings up.

The agreed-upon truths – the "official story" – in China and Iran is not "give people freedom and keep them safe from bad guys". They each have their own official stories, which are wildly complicated and full of axioms. Consequently, their leaders are often engineers – because you'd have to be able to deal inflexibly with a complex set of axioms to win an argument there. When they go behind closed doors to hash something out, they're probably talking about Marxist theory (in China) or some complex interpretation of the Qur'an (in Iran) – not because they really think these things are true, but because agreement is 1) very hard to accomplish and 2) necessary for society to function.

An agreed-upon lie is much safer socially than an ambiguous, unproven, possible truth. Until we advance as a culture – to privilege the truth above all else – we will have liars for leaders. A society that privileges truth above all else will not attack anyone for any belief – only for actions. In such a society, the weak can challenge the powerful by being right, and being the first to believe something eventually found to be right is rewarded more than any other position. A true leader changes their opinion before the masses do – not afterwards.

Share your thoughts!
