Friday, August 6, 2010

NYT muddles through my field

The NY Times published a piece that tries very hard to view popular predictions about Greece's economy through the lens of cognitive-neuroscience decision-making theory. Two authors were involved: one works for an investment firm, the other is a neuroscientist.

According to this article, "[T]here are lessons to be learned from neuroscience on what distorts our cognitive abilities that are highly relevant to understanding the Greek crisis." WORD. The notion of rational choice used in a lot of classic economic theory has always struck me as a little silly, just because (I'm not sure if you noticed this) people are frequently irrational. Decision-making science totally has a lot to offer those who are looking at the neurological basis for our irrationality.

But not in this piece.

I think the economist was the primary author here, because the nuggets of neuroscience seem rather shoehorned in... and that's after they've been sanded into the right shape:
The neural system used when anticipating rewards is active long before the one in charge of evaluating risks and losses... Therefore, an outlandish prediction (albeit, perhaps, inadequately grounded) of a euro zone implosion is likely to be rewarded by editorial success and intellectual kudos; and by the time it may be proven wrong, it might well go unnoticed.
Wait, really? OK, yeah, I agree with the conclusion there, but that premise bugs me. What two systems are you talking about here? Because "anticipating rewards" and "evaluating risks" are 1) totally intertwined and 2) distributed across a bunch of systems. Both risk and reward anticipation are tangled up with how impulsive a person is, and there's no reason to think the balance couldn't tip either way. It's generally accepted in my lab that brains can certainly be overly sensitive to rewards (ahem, addiction), but they can also be overly sensitive to risks (ahem, anxiety).

This would be a lot better explained by something like selection bias: the bold predictions that happen to pan out get remembered, and the ones that don't quietly disappear. In fact, I think most of this piece would have come across as less muddled if the authors had stuck to explaining decision-making errors in cognitive terms instead of trying to make it all about the brain.
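If you want to see how little machinery selection bias actually needs, here's a toy simulation I threw together. Every number in it is made up for illustration; nothing comes from the NYT piece or from any real forecasting data.

```python
import random

random.seed(0)

# Toy setup: 1000 pundits, each making one prediction about some crisis
# or other. The probabilities below are invented purely for illustration.
N_PUNDITS = 1000
P_BOLD = 0.3             # fraction who make an outlandish, attention-grabbing call
P_DRAMATIC_EVENT = 0.05  # chance the dramatic outcome actually happens

remembered_bold = 0  # bold calls that happened to come true ("visionaries")
forgotten_bold = 0   # bold calls that were wrong and quietly forgotten

for _ in range(N_PUNDITS):
    made_bold_call = random.random() < P_BOLD
    event_happened = random.random() < P_DRAMATIC_EVENT
    if made_bold_call and event_happened:
        remembered_bold += 1
    elif made_bold_call and not event_happened:
        forgotten_bold += 1

print("bold calls that look brilliant in hindsight:", remembered_bold)
print("bold calls nobody ever circled back to check:", forgotten_bold)
```

Run it and the lucky handful dominate what you hear about later, even though the overwhelming majority of bold calls were wrong. You don't need two racing neural systems to get that result; you just need nobody keeping score.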
Although our brains do not function biologically in a dichotomous fashion, binary thinking is the brain’s favored method, as it is easier to categorize events in terms of success/failure, cooperation/competition, rational/irrational, etc.
This is a straight-up example of a false dilemma. And yes, it is an easy trap to fall into. I have no idea what neuroscientific evidence backs up the idea that "binary thinking is the brain's favored method". I mean, there have been a lot of two-choice tasks used to study decision-making... but that's because they're easy to analyze. The two-choice task is the researcher's favored method, not the brain's.

I suppose you can justify it -- I mean, our cognition is generated by our brains, right? So if we have a tendency toward some cognitive error, it must necessarily be something our brains are doing. But if your essay is supposed to be about "lessons to be learned from neuroscience on what distorts our cognitive abilities," then you've put the cart a couple of miles ahead of the horse.

My point is that the actual neuroscience has been mashed to a pulp, which I suppose makes it easier to swallow for the non-neuroscientists reading this. I guess. You really can't turn neuroscience into a soundbite; it just doesn't work, as this article plainly demonstrates. Which is why it cracked me up to read:
[R]egardless of the context, a careful “it depends” explanation will never be as convincing as a clear-cut opinion.
Hah! Look in the mirror, whydoncha?

P.S. The piece also ragged on building silos! Hahahahaha, one of my professors is really hung up on how we should be building "farmhouses", not "silos". I personally plan on building a freaking pyramid. THAT'S RIGHT. A PYRAMID.
