Wednesday, February 17, 2010

Self Deception Is Vague

From a forthcoming study [pdf] in Cognition (HT Vaughan Bell):
We hypothesized that such self-deception depends on imprecision in the environment that allows leeway to represent one’s own actions as either observations or interventions. Four experiments tested this idea using a dot-tracking task... Precision was manipulated by varying the vagueness in feedback about performance. As predicted, self-deception was observed only when feedback on the task used vague terms rather than precise values ...

Self-deception requires imprecise feedback on performance. Simply having vague terms ('fast', 'slow', or 'average') is sufficient to satisfy this requirement. Precise feedback makes it too obvious that the agent is intervening.

The more precision you seek, the more you honestly want to know the truth, instead of just wanting to seem like you want to know it, whether to others or to yourself. But before you go all rational-fu on me, recall Mezulis et al.:
The self-serving attributional bias has been associated with greater self-reported trait happiness, less depression, more positive mood states, better problem solving, better immune functioning, and lower mortality and morbidity longitudinally. By contrast, an attenuated or absent self-serving attributional bias has been associated with depression; worse physical health; and worse academic, work, and athletic performance.

It may or may not be worth it to excommunicate your self-serving biases. There's a good chance you won't be able to do so even if you try. If you can, you'd probably like to be able to turn on the precision-seeking for topics you care about and remain vague on topics where you don't particularly care about the truth. Unfortunately, it might not be that easy. As David Foster Wallace wrote, the truth will set you free. But not until it is finished with you.

Refs: doi:10.1016/j.cognition.2009.12.017, pdf here; doi:10.1037/0033-2909.130.5.711, abstract here.

Sunday, February 14, 2010

Testing Short Run / Long Run Coolness

It's time for me to put my theory on coolness to the test. I've tried to determine some traits / behaviors whose payoffs differ between the short run and the long run. Here are a few of them:
  • Flossing on a consistent basis
  • Extreme sports (e.g. bungee jumping)
  • Doing something else instead of doing your assigned school work
  • Opening a bottle with your teeth
I've tried googling "cool activities" to get ideas but it's mostly just propaganda: adults declaring that things are cool in the hope that they will magically become cool, when in reality the opposite is more likely. I'll hopefully end up with 5 that I think of generally as in the "cool" category and 5 that I think of generally as in the "not cool" category.

What I want to do is to run two tests. In the first one people will rate whether the given trait / behavior is more fun or smart in the short run or the long run, and to what degree. In the second one people will simply say how "cool" the given trait / behavior is. There should be a correlation between how fun / smart the trait / behavior is in the short run as opposed to the long run and how cool that trait / behavior is. Granted, it won't be causation but it'll be a start. E-mail me if you have any ideas for traits / behaviors you think might mess up the theory and that I should therefore test. I'm game for whatever.
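The analysis I have in mind is just a correlation between the two sets of ratings. Here's a minimal Python sketch of what that would look like; the traits and numbers below are entirely hypothetical, standing in for real survey averages:

```python
# Sketch of the proposed analysis (hypothetical ratings, not real data).
# Test 1: each trait rated from -3 (better in the long run) to +3 (better in the short run).
# Test 2: each trait rated from 0 (not cool) to 10 (cool).
# The theory predicts a positive correlation between the two.

from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length rating lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical averaged ratings per trait / behavior:
short_run_advantage = {
    "flossing consistently": -2.5,
    "bungee jumping": 2.0,
    "skipping schoolwork": 1.5,
    "opening bottles with teeth": 1.0,
}
coolness = {
    "flossing consistently": 2.0,
    "bungee jumping": 8.5,
    "skipping schoolwork": 6.0,
    "opening bottles with teeth": 5.5,
}

traits = list(short_run_advantage)
r = pearson_r([short_run_advantage[t] for t in traits],
              [coolness[t] for t in traits])
print(f"correlation between short-run advantage and coolness: r = {r:.2f}")
```

With real data I'd also want more traits than this and some check that the two surveys sampled comparable groups, but the basic computation is that simple.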

Saturday, February 13, 2010

Subsidize Earnestness

Incentives today pressure people to drastically downplay how hard they've worked on a given project. For example, I bet Malcolm Gladwell is vexed about telling FT columnist Gideon Rachman that he memorizes all of his speeches. It hurts his free-wheeling, effortless speaking persona. Another example is how Jack Kerouac's supporters love to play up how he wrote On The Road in just three caffeine-fueled weeks, even though that spontaneity is probably overstated.

But man, how counter-productive is all of this posturing? It confuses people, making their estimates of how long a given project will take less accurate. And it places an undue emphasis on intrinsic qualities like intelligence, instead of controllable things like the total number of focused minutes spent.

So to get away from our counter-productive status quo, let's subsidize earnestness, at the expense of mysteriousness. I see two paths towards a more earnest culture:

1) Glorify the revision process. The goal would be to illuminate the messy middle steps that underlie successful endeavors. For example, in his interview with Ben Casnocha, Colin Marshall suggested a museum of rough drafts that would emphasize how most everyone's first draft sucks. This applies particularly well to art but generalizes, as we could include first business plans, first lines of code, and first experimental designs. If these messy middle steps are glorified then people will be more willing to share them.

2) Shun those who act mysteriously. Mysteriousness is cool because it emphasizes the short run over the long run. In the short run your onlookers will think of your success as effortless, which will raise your status. But in the long run, nobody knows how to help you or whether they can offer you advice, because you haven't made your plans transparent. So we should punish mysteriousness and unabashedly pressure people to open up.

Friday, February 12, 2010

What Does Honesty Require?

Robin Hanson made a useful distinction in his interview with Colin Marshall (text here) between sincerity and honesty:
That's the hard thing, looking inside ourselves to realize that, even when we're very sincere, we're rarely honest, in the sense that we're not being very careful to be accurate. But it sure feels sincere to us....

Honesty is when you're really trying hard to be accurate. To be honest, you have to think about possible criticisms and take them seriously. You have to ask what the evidence on the other side would be. You have to wonder who has supported which sides of a position. If you're going to be honest about something, there's a set of considerations you're supposed to look at. We all pretty much know what they are. But when we're sincere, that doesn't mean we've done those sorts of things.
Do we all pretty much know what the requirements of honesty are? Just to be sure, let's spell a few of them out:

1) If there are "opponents" who ostensibly hold beliefs inconsistent with yours, you should seek them out and see where they falter. Be sure to "pick on people your own size"--don't focus on the weak arguments of the opposition or invent straw men. If you can't identify exactly how they falter but still have an intuition that they're wrong, then OK (see #11): admit that, and don't rationalize more complicated explanations. But if you can't point to the exact reason why your opponents are wrong and back up your claim, then be extra wary of your own unconscious biases.

2) At any given point, your honest beliefs should all be consistent with one another. This holds for analogous situations to an extent. For example, if you honestly believe that most people behave a certain way, then you need a good reason to argue that you yourself don't follow that trend too. Reversal: This doesn't mean that you can't change your mind on beliefs from one moment to the next! In fact that's the appropriate response when your knowledge of the facts changes. What it means is that at any one moment your honest beliefs must be consistent.

3) In order to be honest, you should consider what predictions your belief implies. If there is any way you can test those predictions at low cost, you should do so. If this is a belief that you profess to really care about, then you should be actively trying to test those predictions even at moderate cost (of your own time, for instance). And if there are no predictions of your belief, then you should admit that maybe your belief doesn't really matter.

4) ... ?

Sunday, February 7, 2010

Blog Slow Homie

Nick Carr points to a Pew Study on blogging trends:
[The study] put a big fat exclamation point on what a lot of us have come to realize recently: blogging is now the uncoolest thing you can do on the Internet. It's even uncooler than editing Wikipedia articles or having a Second Life avatar. In 2006, 28% of teens were blogging. Now, just three years later, the percentage has tumbled to 14%. Among twentysomethings, the percentage who write blogs has fallen from 24% to 15%. Writing comments on blogs is also down sharply among the young. It's only geezers - those over 30 - who are doing more blogging than they used to.
Let's analyze the coolness of blogging with my long run / short run theory. As more old people start to blog, blogging should become more socially acceptable, because those with more power tend to define social customs. This will emphasize the long run aspects of blogging, like personal development and positioning. It will also de-emphasize the short run aspects of blogging, like rebelling against those with power over you.

Cooler blogs should be ones where people eschew the long run attributes of blogging. Instead, cooler blogs should focus on short run activities and do less personal branding / marketing. If cool blogs do do personal branding, it should be in a roundabout, counter-signaling way. For example, it should be cooler to write under a pseudonym.

I admit my theory can't explain why Unhappy Hipsters is so cool. But I think it can explain a lot about the sphere.

(Thanks to Bonnie for the idea)

Monday, February 1, 2010

Eleven CMRHMOI Thoughts

Robin Hanson was the most recent guest of Colin Marshall's on the Marketplace of Ideas. You can find the dialogue mp3 here. It's fifty-six minutes and, unless you're like saving the world or something, worth every femtosecond of your time. Colin probes Robin on many of his major themes: disagreement, signaling, near-far, academia, the future, and the dearth of objectivity. He did his homework. Here are my thoughts:

1) One of Robin's creeds is that we should be able to take other people's opinions seriously. He strives for methods that will make this possible; thus, the fixation with prediction markets. To identify which people are actually experts, you must put a price tag on uninformed opinions. The cost need not necessarily be paid in currency. If pundits were pressured to make predictions consistent and objective, and the public cared, that too would subsidize a markedly less noisy marketplace of beliefs.

2) The two frequently engage in pretend-tongue-in-cheek meta talk, colloquially known as "going there," to great success. For example, at one point Colin talks about how he is showing off his impressive interviewing skills while simultaneously showing off his impressive interviewing skills.

3) Colin keeps offering Robin the opportunity to gloat for a moment about the blog he's built and the following he's developed. Robin parries these advances amicably, re-framing his success as a byproduct of the inevitability of niche markets on the web and the desires of competing groups to have new tools--cognitive biases--to accuse their opponents of falling victim to. In so doing, he places too much emphasis on intentions. Since he is a ruthless universalizer, he must now admit that his own intentions are probably not so noble. He is right to apply his principles freely to himself, but overly pessimistic in hardly allowing for good actions by humans. Is this professed pessimism about all people's intentions in part influenced by a desire to maintain his aura of humility? If it is, then it is actually an example of one way in which Robin does bias his opinions via signaling, because being humble is high status. So in this case his pessimism is on point. But if this pessimism doesn't impact his belief about people's intentions, then I sincerely can't identify anything he is biased about, in which case he's wrong about his own intentions. So, he's either right for the wrong reasons or wrong for the right reasons. I will now throw up on my keyboard from dizziness due to all this circular reasoning.

4) We're back.

5) In discussing academia, Robin mentions how sexy innovations are often not as important as we think they are. In fact our richness as a species probably has little to do with our capacity for abstract thought. This is my favorite quote by him, and one that I think about all the time: "The truth is that the artistic creations or intellectual insights we most admire for their striking 'creativity' matter little for economic growth. Instead, most of the innovations that matter are the tiny changes we constantly make to the millions of procedures and methods we use." True and immensely useful as a mental hack: Worshiping and waiting for the big idea only leads to deep procrastination. Instead, focus on the various puzzles you can solve now, cutting what you perceive as big, important tasks into smaller, less important ones. The small tasks are where you are more likely to actually make useful contributions, anyways.

6) There is a deficiency of neutral analysis for determining who exactly is rational and truth-seeking. Robin keeps commenting on LW about how gathering data and developing such a neutral test would be a very useful project for someone to undertake (see here, here, here, and here), so someone should get on that already!

7) Rationalist-oriented people on the internet are ultimately most interested in talking about rationality. This makes sense in terms of the relevant selection biases for ending up at Robin's blog or on Wikipedia's list of cognitive biases. Moreover, most RSS feeds are read by folks procrastinating at work, and OB is probably not much of an exception. So, there's a huge filter between passively reading about rationality and actually acting on it.

8) One's intellectual history should be composed of viewquakes, the ideas that change your conception of the world dramatically. Robin's intellectual viewquakes: relativity, quantum mechanics, thermodynamics, managing complexity in comp sci, supply and demand, incentives, rationality (!), and Aumann's theory of disagreement. Also, the idea that the future might be different from the present to a similar degree that the past is different from the present.

9) Robin says it's hard for him to specialize because he is naturally curious about lots of things, but he forces himself to do it anyway. He also says that most intellectual failures--people who are smart but still don't succeed--tend to be underspecialized. They can't figure out how to focus on just a few topics. This reminds me of some of Taleb's wisdom, sic: "Here is a quote by Paul Valery. He met Einstein at a party in the 1920s, and he asked him, 'Do you carry a notebook around?' Einstein asked, 'Why?', and Valery said, 'To write down your ideas, to put them down,' and Einstein said, 'I only have one idea.' To succeed, you only have to have one idea—two ideas, you're dead." The intense division of labor is a very new idea--for most of the world less than 100 years old--and it's one that conventional wisdom is so far from understanding that it's not even funny.

10) The main folks who will correctly apply the ideas they read about are those who care even more about the outcome of a particular event than their own status. But once you begin a quest for the truth on one particular subject and learn techniques to aid you in that quest, is it possible to turn your new skills "off" for less relevant subjects? Can you still let that which does not matter truly slide? David Foster Wallace argues no with respect to grammar. In his mind, once you learn grammar rules you are compelled to notice flaws in the grammar of others and be annoyed by them. I actually think that it is possible to learn about biases without overly applying them. But some of my friends might disagree! In fact I have been openly criticized for "talking about psychology too much" at least once. Good thing I have you guys.

11) I was going to end at ten thoughts but I didn't want to be yet another data point in Ben Casnocha's lonely crusade against round-numbered lists. So this is the bonus thought. Enjoy.