Saturday, July 16, 2005

The Experience Machine

Battlepanda brings up Nozick's Experience Machine case, which led Julian Sanchez to believe that happiness was not the only rational thing to pursue. (If you're not familiar with the case, follow the link to Battlepanda's site -- she has it there.) I agree with Julian on this. But I'm still a utilitarian, like Battlepanda. How can this be? It depends on a distinction between rationality and what contributes to well-being. I shall explain.

When people are presented with the opportunity to enter the Experience Machine, they often decline, because they don't desire the experiences it offers with any great intensity. Their most intense desire is, perhaps, to write a great novel. This is different from a desire to have the experience of writing a great novel, or to have the pleasure of writing a great novel. The Experience Machine offers the second and third, but not the first. While pleasure shows up whenever our desires are satisfied, this isn't because pleasure is the object of desire. The objects of our desires are many and varied. It's just a fact about desiring that when you attain the object of your desire, you feel some pleasure; that's different from pleasure being the object of desire. What it's rational to do, on my view, is what maximizes expected desire-satisfaction. Since people don't desire what the Experience Machine offers, it's rational for them not to plug into it. (It's possible to read Battlepanda's dialogue as making this point, but I doubt that's how she intended it.)

Well-being, or what's good for someone, is different from what it's rational for them to pursue. While expected desire-satisfaction is at the core of my views about rationality, I think pleasure is the currency of well-being. Suppose you met a person who felt intensely guilty about what he'd done in the past, and wanted to kill himself in some painful way as a punishment. Assuming that this was his most powerful desire, and that the sum of his other desires didn't go against it, it'd be rational for him to buy implements of torture for use against himself. But if you wanted to do what was good for him, you might kidnap him on the way to the torture store and inject him full of a euphoria-inducing drug that would also eliminate the memory of his past bad deeds.
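
To make the contrast explicit, here's a minimal formal sketch; the notation is mine, not Nozick's or anything from the original post. Write D(o) for the degree to which an outcome o satisfies the agent's desires, H(o) for the pleasure the agent experiences in o, and P(o|a) for the probability of o given action a. Then the two notions pick out (possibly different) actions:

    % Rational action: maximize expected desire-satisfaction.
    a_{\mathrm{rational}} = \arg\max_{a} \sum_{o} P(o \mid a)\, D(o)

    % Best action for the agent's well-being: maximize expected pleasure.
    a_{\mathrm{best}} = \arg\max_{a} \sum_{o} P(o \mid a)\, H(o)

The two recommendations come apart whenever D and H rank outcomes differently, as with the guilty man above: D favors the self-punishment, while H favors the euphoric drug.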

In designing public policy, our goal isn't to give people what it'd be rational for them to pursue. It's to set things up in the way that maximizes overall well-being. (Of course, we should keep in mind that a pretty large amount of freedom is useful in generating well-being.)

3 comments:

Anonymous said...

Well, to tell you the truth, I don't really know exactly what I'm trying to say with that dialog. Mostly I just wanted to highlight how much Nozick is relying on the 'Ick!' factor to make his point, I suppose. And how often do you get to write a hypothetical dialog between a caveman and a 31st-century visitor?

A much more cogent attack on the Experience Machine can be found on this page. I think you guys are coming from roughly the same place in your criticism of Nozick.

As for me, I think a big part of what makes us human is our ability to construct experience machines. A book of fiction is an experience machine. An alcoholic beverage is an experience machine. Our lives would be vastly less rich if they were always tied to reality.

Anonymous said...

Oh, I see. After reading Julian's comment and re-reading your post, I realize that I misconstrued the point of the experience machine. It has nothing to do with how far removed from physical reality one is.

But I'm still confused. As a utilitarian, you won't step into the experience machine yourself, but given the chance, you would stuff everybody else in, because pleasure is the coin of utility?

Neil Sinhababu said...

I think this is the way to put it:

For any given person, what's best for that person is stepping into the experience machine.

Nevertheless, some people can be making the right decision (taking into account what they want) by not stepping into the experience machine.

In other words, sometimes the right decision won't lead to what's best for you (and not because of a lack of information). The concepts of "the right decision" and "what's best" come apart that way.