Sunday, November 30, 2008

Two theories of self-education

The first is by Paul Graham (HT: Charlie Hoehn), from his rejected high school graduation speech,
I propose instead that you don’t commit to anything in the future, but just look at the options available now, and choose those that will give you the most promising range of options afterward…. Suppose you’re a college freshman deciding whether to major in math or economics. Well, math will give you more options: you can go into almost any field from math. If you major in math it will be easy to get into grad school in economics, but if you major in economics it will be hard to get into grad school in math.
The second from Cal Newport, in an IM conversation with Ben Casnocha,
What about the big question of “what should I do with my life?” As you know, my approach is sort of “there is no wrong answer, choose something and focus on it so you’ll start reaping rewards, you can always change later.”... I’ve been a big believer in the 10,000 hour rule. Roughly, that being good at anything takes a long time. If you want to be good at something in your 20s, start in college.
These might at first look similar, but on closer inspection they are radically different. Newport is saying that you should choose one topic and commit to specializing in it, while Graham believes that you should pick your studies so as to keep your options open in case you choose to switch fields.

I have no way of knowing which tactic is actually more fruitful, but what I like about this dichotomy is that there is no free lunch. What you gain by specializing you give up in terms of ability to switch fields, and what you gain in keeping your options open you give up in terms of expected success in one particular field.

The only other option is a nihilistic stance towards the future, and although I would not prefer it personally, that is probably a rational choice for some people.

Friday, November 28, 2008

Comparing men's and women's taste in movies

One way to do this would be to compare IMDb's top 50 movies as voted on by females with its top 50 movies as voted on by males. You may complain about the difference in sample size, since there are about 10 times more votes by males than by females on IMDb (why is that?). But with 10,000+ female votes on most of these movies, I would still prefer these scores to any other measure.

Movies much higher on the female list: The Lord of the Rings (#4, 6, and 12 as opposed to #14, 20, and 31 on the overall list), Gone with the Wind (#11 vs. #167 on overall list), Amelie (#14 vs. not on the male list), The Pianist (#17 vs. #56 on overall list), Pan's Labyrinth (#22 vs. #61 on overall list), Finding Nemo (#24 vs. #152 on overall list), The Notebook (#24 vs. not on the overall list), Beauty and the Beast (#34 vs. not on the overall list).

Movies much higher on the male list: The Good, The Bad, and The Ugly (#4 on male list, not on female list), Seven Samurai (#11 on male, not on female list), The Matrix (#25 on male list, not on female list), Se7en (#27 on male list, not on female list), Apocalypse Now (#31 on male list, not on female list), City of God (#18 on male list, not on female list).
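The list-versus-list comparison above boils down to a rank-difference computation. A minimal sketch in Python, using invented toy lists rather than the real IMDb standings, and treating absence from a list as one rank past the bottom:

```python
def rank_gaps(list_a, list_b, depth=50):
    """For each title in list_a, how far it sits above its position
    in list_b; titles missing from list_b count as rank depth + 1."""
    pos_b = {title: i + 1 for i, title in enumerate(list_b)}
    return sorted(
        ((pos_b.get(t, depth + 1) - (i + 1), t) for i, t in enumerate(list_a)),
        reverse=True,
    )

# Invented toy lists, not the real IMDb data:
female = ["Gone with the Wind", "Amelie", "The Matrix"]
male = ["The Matrix", "Se7en", "Gone with the Wind"]

# Gone with the Wind: rank 1 on female, 3 on male -> gap +2.
# Amelie: rank 2 on female, absent from male -> gap 51 - 2 = 49.
gaps = rank_gaps(female, male)
```

The biggest gaps at the top of the sorted output are exactly the "much higher on the female list" entries above.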

I am truly amazed that The Matrix is not on the female list, since it probably carries a lot of vote weight given how widely watched it is. Plus... Keanu Reeves! Females give it only an 8.3, and females under 18 give it a 7.6, which is weird to me. But then again I'm not female.

If you are looking for biases in the top 250, this would be a good place to start, since there are so many more males voting on imdb than females. There is a clear trend on the male list toward movies with more violence, and a clear trend on the female list toward movies with more mushy stuff.

Wednesday, November 26, 2008

Objectively higher states of morality?

Lawrence Kohlberg's stages of moral development range from level 1 to 6. They increase in order from "obedience" to "individualism and exchange" to "interpersonal relationships" to "social order" to "the social contract" to "universal principles." The stages are meant to be invariant--one must first reach stage 2 before moving on to stage 3, and he found that older children were more likely to be at higher stages.

In evolutionary biology there is a movement to stop calling certain species "higher" than others just because they have more base pairs or greater divergence from common ancestors. I wonder if the same principle can be applied here too. Just because one must pass through stage 2 to reach stage 6, does that mean that stage 6 is more moral in some objective sense? What does it matter that older children are more likely to rate at higher levels? Judge for yourself, but I've yet to be sold on Kantian universalization.

Tuesday, November 25, 2008

Bad experiences and great stories

Paul Zak's article in Psychology Today describes how he fell victim to the "classic" Pigeon Drop con at the age of 16 and lost $100 in the process. He probably felt bad about it at the time, but having been conned has paid huge dividends as blog material. His post has been tagged 228 times on del.icio.us, by far the most of any article in the Psychology Today blog world. In the heat of the moment, loss aversion biases us against placing ourselves in risky situations. However, because of their potential utility as stories, it may be rational to expose yourself to situations whose bad outcomes would at least lead to a good story. Here are some thoughts:

1) The younger you are, the more risks you should take. The amount of time that you will be alive to tell the story is probably greater the younger you are, so the utility of going through a wild experience is higher.

2) Writers, or anyone with access to a large audience, should take more risks and be more careless than the general population.

3) We should feel bad in general when people have unlucky negative experiences, but especially bad if there is no interesting story behind the misfortune. It is better, story-wise, to be robbed at gunpoint than to misplace your wallet, and we should adjust our sympathy levels accordingly.

4) If you are going to take advantage of somebody, the thoughtful way to do it is to pull off a creative stunt so that your victim will have an engaging tale to tell. It's nicer to conduct an elaborate heist Danny Ocean-style than to simply steal a valued possession during a lull in attention.

Monday, November 24, 2008

How important are recommender systems?

Tom Slee thinks that recommender systems are the wave of the future. The prototypical example of these is Netflix, where you are alerted to movies that the algorithm thinks you will enjoy based on your vote history. He makes some interesting points about transparency in these systems:
Transparency matters. The unmarked presence of sponsored items in a recommendation list would be widely viewed as a corrupt set of recommendations, but just as bookstores charge for premium display sites within the store, so sites of recommendation lists may be sold. Recommendees have a right to know if payola is part of the system.
In recommender systems, transparency is crucial because otherwise people will think that your opinion has been bought. In a pure rating system, however, you can't have transparency because that would make it easier for firms to game the system and watch their product climb to the top. I am not as optimistic as he is about the future of recommender systems, although I am happy to be proven wrong.
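The Netflix-style step of being "alerted to movies that the algorithm thinks you will enjoy based on your vote history" can be sketched as item-item collaborative filtering. In this toy version (all names and ratings are invented), an unseen movie is scored by the cosine similarity of its rating vector to the movies the user has already rated:

```python
import math

# Toy ratings matrix: user -> {movie: rating}. All data invented.
ratings = {
    "ann": {"Amelie": 9, "The Matrix": 6, "Finding Nemo": 8},
    "bob": {"Amelie": 8, "The Matrix": 7, "Se7en": 9},
    "cat": {"The Matrix": 9, "Se7en": 8, "Finding Nemo": 5},
}

def item_vector(movie):
    """A movie's ratings, indexed by the users who rated it."""
    return {u: r[movie] for u, r in ratings.items() if movie in r}

def cosine(a, b):
    """Cosine similarity over the users who rated both items."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[u] * b[u] for u in common)
    na = math.sqrt(sum(a[u] ** 2 for u in common))
    nb = math.sqrt(sum(b[u] ** 2 for u in common))
    return dot / (na * nb)

def recommend(user):
    """Rank unseen movies by similarity-weighted sums of the
    user's own ratings: the core of item-item filtering."""
    seen = ratings[user]
    movies = {m for r in ratings.values() for m in r}
    scores = {}
    for m in movies - set(seen):
        scores[m] = sum(cosine(item_vector(m), item_vector(s)) * seen[s]
                        for s in seen)
    return sorted(scores, key=scores.get, reverse=True)
```

Note that nothing in the scores reveals *why* a movie was recommended, which is exactly the transparency gap Slee is worried about: a sponsored item slipped into the output would look no different from an organic one.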

Saturday, November 22, 2008

Increasing risk of nuclear weapon use

The Global Trends 2025 report was just released, and it contains some predictions about what the geopolitical structure will look like in 17 years. Although it is long, various keywords are bolded so you can home in on what interests you. What caught my eye was the section on nuclear proliferation:
The risk of nuclear weapon use over the next 20 years, although remaining very low, is likely to be greater than it is today as a result of several converging trends. The spread of nuclear technologies and expertise is generating concerns about the potential emergence of new nuclear weapon states and the acquisition of nuclear materials by terrorist groups. Ongoing low-intensity clashes between India and Pakistan continue to raise the specter that such events could escalate to a broader conflict between those nuclear powers. The possibility of a future disruptive regime change or collapse occurring in a nuclear weapon state such as North Korea also continues to raise questions regarding the ability of weak states to control and secure their nuclear arsenals.
Just as an individual is more likely to kill themselves than to be murdered or killed in war, I fear that the human race is more likely to destroy itself than to be destroyed by any outside threat.

And out of all of the doomsday scenarios we hear about on a regular basis--meteor strike, rampant disease, rapid climate change--I think that nuclear winter is by far the most likely and the one that deserves the most attention. We should celebrate having made it more than 60 years without destroying ourselves, although we should remember that we have come close.

Friday, November 21, 2008

Does primacy trump recency?

One of the more nuanced critiques of the literature on human cognitive biases is that some of them posit conflicting effects. Probably the most glaring discrepancy is the difference between the primacy and the recency effect in hypothesis formation. Which factor is more important? Marsh and Ahn take on the challenge in their 2006 paper,
Some studies have shown a recency effect: Information that is presented later in a sequence is more heavily reflected in judgments than is information that is presented earlier. Other studies have shown a primacy effect: Early-presented information is reflected in judgment more than is later information...

Throughout this study, we have maintained the position that the primacy effect is obtained because people form a hypothesis from earlier data and underadjust this hypothesis... The primacy effect was found to be moderated by the cognitive load required by the hypothesis-testing nature of the task and by the size of the verbal working memory capacity available to process information.
Unless the subject is overloaded with information, the primacy effect is dominant. A real world application can be found in the work of Trevon Logan, who analyzed college football votes in order to see which games had more of an impact on the rankings of AP voters. Consistent with the idea that primacy is more salient than recency, he found that it is better to lose later in the season than earlier.

In order to be rational, you must withhold your initial opinions and see the whole story before you begin to draw conclusions. If you must choose, err on the side of weighing the later data more heavily in order to compensate for your cognitive flaws.

Thursday, November 20, 2008

Neuroplasticity and the effect of brain games

Norman Doidge writing at The New Humanist passes along a hopeful anecdote about brain training,
Dr Stanley Karansky was 90 years old when we spoke. He was a medic at D-Day. He practised medicine until he was 80. When he turned 89, he told me, he began to have trouble remembering names; he couldn’t register their auditory impressions clearly. He had trouble communicating and withdrew socially. He became less alert and had trouble driving. Then he began The Brain Fitness Program, developed by Merzenich, which sharpened his auditory processing. In six weeks, with an hour a day of brain exercise, Stanley’s age-related cognitive decline was reversed.
Yes, there are some studies backing up many of his claims, and from the papers I have read there appears to be no fabrication going on. The idea of computer games actually improving mental health sounds so absurd that we should take these results all the more seriously.

But still, I do doubt that all of Stanley's age-related cognitive decline was reversed in just six weeks. This kind of embellished press might hurt the brain training industry in the long run, as consumers could expect too many immediate effects and be unwilling to commit to a serious program.

Wednesday, November 19, 2008

How to make YouTube rating better

Currently the rating system on YouTube is a disaster. About as many people rate videos as comment on them, which is ridiculous because commenting takes much more energy than the single mouse click it takes to rate something.

Their "top rated" video section is a joke too--it has 25 videos on it, all of which appear to have the same rating of 5/5 and none of which look particularly enticing. How should Google fix the current mess? I have a few suggestions:

1) Every time a user rates a video, it should be recorded on a list that can be made publicly accessible. IMDb has this kind of "vote history" list and it's awesome. Somebody can quickly scan my list and see if I am a big idiot or whatever. Alternatively, the ratings could be shown on the "my favorites" page as a similar kind of list. If you have a record of all the ratings a user has made, you give that user an immediate incentive to start rating videos.

2) They need to switch to a 10-point scale and show the actual average out of 10. Switch all of the current votes of "5" to "10", "4" to "8", etc., and then let users tell you which videos are the best. It's hard to tell exactly what the rating is with the status-quo 5-star metrics.

3) There should be a larger list of the top rated videos, perhaps organized by section. These should be computed using a formula in which videos with more votes are weighted higher, just as it is done on IMDb. They could even call it the Top 251; I don't see why they should be bashful about copying the success of others.
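The IMDb-style weighting is a Bayesian average: a video with few votes is pulled toward the site-wide mean, while one with many votes keeps close to its own score. A minimal sketch, where the minimum-vote threshold m and the site-wide mean C are assumed numbers, not YouTube's or IMDb's actual parameters:

```python
def weighted_rating(R, v, m=1300, C=6.7):
    """Bayesian weighted rating in the style of IMDb's Top 250.

    R: the video's mean rating, v: its number of votes,
    m: minimum votes to be listed (assumed value),
    C: mean rating across the whole site (assumed value).
    """
    return (v / (v + m)) * R + (m / (v + m)) * C

# A 9.0-rated video with only 50 votes ends up ranked below
# an 8.5-rated video with 100,000 votes:
obscure = weighted_rating(9.0, 50)
popular = weighted_rating(8.5, 100_000)
```

This is what keeps a video with three 5-star votes from outranking a classic, and it is why the current all-5/5 "top rated" page looks so broken.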

Why does this matter? Because right now the only criterion you can use to find good videos is view count, and watching videos based on views creates a nasty feedback loop. How do you take a view back if you don't like the video?

I think it would be cool if there was a way to aggregate the best videos on the site so that we would know what to watch instead of relying on random e-mails. Google, you have so much potential here with YouTube. Make it happen before somebody else does.

Sunday, November 16, 2008

Feynman on what it takes to be a scientist

In his famous 1974 Caltech commencement address, Richard Feynman explained his philosophy of science,
The first principle is that you must not fool yourself--and you are the easiest person to fool. So you have to be very careful about that. After you've not fooled yourself, it's easy not to fool other scientists. You just have to be honest in a conventional way after that.
Not fooling yourself is not trivial, and as soon as you are comfortable that you've succeeded, you run the risk of slipping. Plus, the pursuit of honesty sometimes runs against other human emotions, such as ambition, a need for security, and a desire for recognition.

Yet science is still among the most trusted professions, with scientists ranking above pollsters, athletes, and civil servants. I suppose there is an incentive to lie in almost every profession, but scientists are among the most introspective about that fact.

Saturday, November 15, 2008

Two short lists

Things we talk about too much: Global warming, Malcolm Gladwell, neuroscience, politics, altruism, nature vs. nurture, whether a "God" exists, daily fluctuations in the stock market, consciousness, Iran, The Vietnam War, David Brooks, cars, evolution vs. intelligent design, and productivity hacks.

Things we don't talk about enough: Patent law, water rights, to what degree the state should redistribute wealth, Robin Hanson, how dolphins sleep, monetary economics, whether the universe is random or ordered, the specifics of how we die, Oceania, the Spanish-American War, and statistics.

Note that being on the first list does not imply that the topic is completely unimportant. For example, I am studying neuroscience, I just think that most of the stuff written about it is either too sweeping in its conclusions or simply boring to the non-specialist. And by the way, this is not "just my opinion," these are the icy cold facts. What would you add, second, or take away from either list?

Friday, November 14, 2008

Why are smart people so old?

That's the question that this essay by Ian Deary poses in a recent issue of Nature. Deary notes that intelligence predicts mortality better than BMI, total cholesterol, or blood pressure, and at a similar level to smoking. He offers up four hypotheses, and I am partial to the idea that people with higher intelligence are more likely to engage in healthy behaviors. I believe that a lot of our success is based on an ability to project our current actions onto future states, and that this ability improves as intelligence increases.

It's a concise article, so check it out and come to your own conclusions. I would venture that if you are a self-selected blog reader, this correlation should probably come as welcome news.

Thursday, November 13, 2008

The passion of Tom Friedman

I've always pictured NYT columnist and author of The World is Flat Tom Friedman as quite level-headed, calmly and succinctly extolling the virtues of a carbon tax. But these past few days he has seriously amped up the intensity, much to my delight.

First you have his November 11 column, where he tells of the time that he began screaming at the TV screen while the CEO of Chrysler was being interviewed. Later in that column he reveals his plan for all current and past representatives of Detroit to be the "pallbearers" at the auto industry's funeral.

But it is in his own CNBC interview that he truly begins to unleash his fury. He offers a few hypotheses for Russia's recent hostile behavior, and explains that the United States has traded its trust for the Czech navy. The Czech Republic doesn't have a navy. You can always tell that somebody is really angry when observers begin to nervously chuckle yet he shows no signs of slowing.

Although his delivery may be funny, at least Friedman is showing his true colors. It is admirable that he feels no need to hide behind discussing "what the majority thinks," like some of his fellow NYT columnists.

Wednesday, November 12, 2008

How to lie with anecdotes

Osama Bin Laden fasts completely two days a week. Mao Zedong had an unhealthy obsession with green tea, even brushing his teeth with it. Hitler was a vegetarian. Therefore, you should watch out for people with unorthodox eating habits. Beware vegans.

Ever since Darrell Huff released his classic book How to Lie with Statistics, the American public has been told time and time again to be wary of numbers. And it is probably prudent to be somewhat wary of numbers, as long as you still prefer them to anything else. Because, easy as it may be to lie with statistics, it's surely much easier to lie with anecdotes.