January 13, 2003

One Hundred Interesting Mathematical Calculations and Puzzles, Number 11: The All-Knowing Alien Paradox

An all-knowing alien who has a perfect computer model of your mind lands on earth. Xhsbr (that's a pronoun, not a proper name) shows you a box with two compartments, one of which is clear and the other of which is opaque. Each compartment has a door. You can see $10 in the clear part of the box. The alien says that xhsbr has analyzed your psychology, and if you are the kind of human who would not take the $10, xhsbr has put $1,000,000 in the other, opaque compartment, which will be yours. But if you are the kind of human who would take the $10, xhsbr has put nothing in the other, opaque compartment. The alien says that you must first open the door to the clear compartment (and take the $10 or not) before the door to the opaque compartment will open. The alien says that the door to the clear compartment will only open once.

Xhsbr says that there will be no sanctions or negative consequences if you take the $10--that xhsbr will fly off and never return.

Xhsbr flies off. You are left with the box. You open the door to the clear compartment. You are completely certain that nothing the alien can do now affects how much money is in the closed, opaque compartment.

Do you take the $10 from the clear compartment before you open the other one? It is, after all, free money--either the $1,000,000 is there or it isn't, and whether you take the $10 has nothing to do with that. On the other hand, you know that the alien has been right in every single one of 1000 other experiments xhsbr has conducted around the galaxy in the past two years. So you know that the way to bet is that people who take the $10 find nothing in the opaque compartment, and people who leave the $10 find $1,000,000 in the opaque compartment.

What do you do?

Posted by DeLong at January 13, 2003 05:25 PM

Comments

Leave the money, open the opaque compartment. We know that the $1,000,000 is in there, because by definition anything I do is what he expected me to do. Since he expected me to open the compartment, the $1,000,000 will be there.

(Now if economists had that kind of modeling ability, maybe we could get somewhere...)

Posted by: jimbo on January 13, 2003 06:09 PM

As someone who left math behind in high school, what exactly is this supposed to demonstrate mathematically? It seems to me the question is, am I smarter than the alien?

If the alien is smarter than me, leave the 10 dollars and take the million in the other compartment. If I'm smarter than the alien, take the 10 dollars from the clear compartment, because either he accurately predicted my actions and there is no million dollars, or he was wrong about me and I can get a million and ten dollars. The only question I need to ask is "Is the alien really able to predict my behavior?" How is that math-related?

Or is this some of that game theory of economics that was in "A Beautiful Mind"?

Mike

Posted by: MBunge on January 13, 2003 07:53 PM

You're feeding Newcomb's problem to your 9 year old? Call Social Services! :)

Posted by: Kieran Healy on January 13, 2003 08:27 PM

I agree with MBunge--this is a philosophical puzzle, not a mathematical one.

If you think it's at all likely that this will occur, you should say, now, that you'll only take one box. Because the alien's prediction of what you'll do is no doubt based in part on the public statements you're making on this very blog.

(Yes, I'm dodging the question :-))

Posted by: Matt Weiner on January 13, 2003 08:40 PM

Take the $10 from the clear box.

By the time you come to MAKE that decision and follow that action, the all-knowing alien has already settled what's going to be in the other box. Nothing you do now (by assumption) will affect what you get from the opaque box.

(My $0.02, but no doubt there's some clever Bayesian refutation, which I shall await.)

Posted by: Michael Harris on January 13, 2003 08:48 PM

>>(My $0.02, but no doubt there's some clever Bayesian refutation, which I shall await.)

Only if there is some kind of uncertainty about what kind of person you are, which your actions help resolve...


Brad DeLong

Posted by: delong@econ.berkeley.edu on January 13, 2003 08:54 PM

>>You're feeding Newcomb's problem to your 9 year old? Call Social Services! :)

And what, pray tell, is the correct age at which to introduce Newcomb's Problem to a child?


Brad DeLong

Posted by: Brad DeLong on January 13, 2003 09:42 PM

Leave the $10. That's, what, £6 these days? Losing that to gain either £600,000, or the sure and certain knowledge that FTL travel is within the reach of the stupid, represents a rational choice to me - I'd certainly value knowing that I was smarter than a random representative of a star-faring civilisation at £6.

Of course, if there was a $100 note in the clear compartment, I'd have to think harder - that piece of knowledge isn't worth, to me, the price of a nice dinner for two*. If the alien _did_ have a perfect model of my mind, then I'd leave the $100 and take the $1,000,000. If it didn't, I'd have the $100 and go buy dinner.

So do I trust the alien? Well, does he have any reason to trick me? Not really.

On the other hand, he's just given me a box that he claims will resist all efforts to open it for the sake of $1,000,000 - the materials technology alone is worth more than that.

So the answer is: I check the opaque compartment to see if it's got the seed capital in it for the lab I need to set up to analyse this indestructible alien box. And I'm (probably) set for life!

*Not a _really_ nice dinner for two, and I'd have to spring for theatre tickets as well, but this world is not perfect.

Posted by: Andrew Dennis on January 14, 2003 01:38 AM

Dressing up Calvinistic predestination with space aliens, I see. :D

Posted by: Jason McCullough on January 14, 2003 02:28 AM

Give the box to a friend. Agree to split with him whatever money comes out. Take that, alien mind reader!

Posted by: Andrew Edwards on January 14, 2003 07:03 AM

>>You're feeding Newcomb's problem to your 9 year old? Call Social Services! :)

And what, pray tell, is the correct age at which to introduce Newcomb's Problem to a child?

Good point. Maybe you could integrate it into other popular childhood stuff --- perhaps replacing Xhsbr with Santa. This would give the kids an incentive to think about the problem seriously. :)

I remember David Lewis saying once that, in his experience, people's first reaction to Newcomb's problem tended to determine their long-term position.

As MBunge says, it's a philosophical rather than a mathematical puzzle.

Posted by: Kieran Healy on January 14, 2003 07:06 AM

Flip a coin.

Posted by: Scott Ferguson on January 14, 2003 09:13 AM

So the important question is: Mike, is that your real name, or did you take a pseudonym from Mario Bunge?

That said:
Usually when you do this problem you don't stipulate that the alien is all-knowing--omniscience and free will (in this case, yours) combine to produce some hairy paradoxes--but that you've watched a lot of trials, and the alien has got it right every time.

Michael Harris did a good presentation of the standard two-boxer's argument.

My argument wasn't the standard one-boxer's argument, which is, "If you're so smart, why aint'cha rich?"* That is, on repeated trials, the one-boxers get more money.

If you think that two-boxing is obviously correct--leaving aside the Andrews' considerations--you should consider that this is much like the Prisoner's Dilemma. (David Lewis thinks that the Prisoner's Dilemma just is two simultaneous Newcomb problems.) And the conclusion that it's rational to defect in Prisoner's Dilemmas is very disturbing to a lot of philosophers--it leads to the conclusion that we may all be better off if we all behave irrationally.

In bull sessions and blogs, my take on Newcomb's problem is that it's a matter of follow-through. After your bat has hit a baseball, it's to your advantage to stop swinging it as fast as possible (so you can run to first). But if you plan not to follow through after the bat hits the ball, you'll inevitably mess up your swing. Similarly, after the alien has left, it's to your advantage to take two boxes; but before xhsbr has left, it's to your advantage to be the sort of person who will take one box, so xhsbr will predict this and leave the million bucks.

You could, for instance, form the intention to take one box before you ever meet the alien, and if you're the sort of person who doesn't follow through on intentions, the alien will detect that and won't leave the million dollars anyway....

None of which addresses Michael Harris's point, but hey, if you're so smart, why aint'cha rich?

(These considerations also come up in questions of whether you should keep promises or carry out threats; see among others, David Gauthier's "Assure and Threaten," available on JSTOR for those with access.)

*David Lewis has a paper called "Why Aint'cha Rich," in which, IIRC, he admits the failure of a certain attempt to demonstrate the rationality of two-boxing without begging the question against the one-boxer. Since Lewis was a two-boxer, this was a fine demonstration of Lewis's intellectual integrity, which Kieran movingly blogged about on the occasion of Lewis's death.

Posted by: Matt Weiner on January 14, 2003 09:16 AM

Nope, Mike Bunge is what my Mom and Dad named me. Of course, I've never asked them where they came up with the name.

But again, could somebody explain to me (in the most remedial way possible) what sort of mathematical process, theory, practice or whatever this puzzle is supposed to deal in? I just don't see how math, at least my limited understanding of it, factors into this decision at all.

Mike

Posted by: MBunge on January 14, 2003 10:24 AM

Mike--In fairness, I should disclose that "Matt Weiner" is what my parents named me. (Mario Bunge is an Argentinian philosopher, but probably not that uncommon a name.)

Anyway, I don't think math is playing any big role in this puzzle, except insofar as $1m is a lot bigger than $10.

Maybe I should say a little about the connection to the Prisoner's Dilemma. (Hope everyone knows what that is.) Suppose, going into the Prisoner's Dilemma, your partner says "I'll cooperate if you do" (and you say the same). Now you're faced with the choice.

Well, if your partner decided you would cooperate, then he cooperated. If he decided you would defect, then he defected. You're better off if he thought you would cooperate. (That's the $1m in the opaque box.) But no matter what he did, you'll be better off if you defect. (That's the $10 in the transparent box.) A two-boxer will say, "Well, he's made his choice, and what you do won't change it--you should defect." A one-boxer may say, "Well, you'll be better off if you're the sort of person who cooperates (because he'll cooperate), and so you should cooperate."
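
To make the dominance structure concrete, here is the Newcomb payoff table written out as a quick sketch in Python (the dollar figures are just the ones from Brad's puzzle; the labels are mine):

```python
# Newcomb's problem tabulated. Rows: the alien's prediction (or, on the
# analogy, your partner's choice). Columns: your choice.
PAYOFF = {
    ("predicted one-box", "one-box"): 1_000_000,
    ("predicted one-box", "two-box"): 1_000_010,
    ("predicted two-box", "one-box"): 0,
    ("predicted two-box", "two-box"): 10,
}

for prediction in ("predicted one-box", "predicted two-box"):
    one = PAYOFF[(prediction, "one-box")]
    two = PAYOFF[(prediction, "two-box")]
    # Read across each row: two-boxing pays $10 more either way (dominance).
    print(f"{prediction}: one-box ${one:,} vs two-box ${two:,}")
```

The two-boxer reads across the rows; the one-boxer replies that an accurate predictor makes the off-diagonal cells nearly unreachable.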

Posted by: Matt Weiner on January 14, 2003 12:30 PM

Leave the $10. If the alien is correct that he understands your mind, there will be a million in the other box. If the alien was wrong, then you ask him for a million in return for keeping silent about his mistake . . .

Posted by: rea on January 14, 2003 01:08 PM

Considering the problem in advance, it is clear that, in the circumstances as described, those who are so constituted as to leave the $10 are better off than those so constituted as to take it. I therefore now make a public promise to always leave the $10 if given that choice. Violating such a commitment is not something I would do for $10, so when the time comes I will leave the $10 because of the commitment. If the circumstance is actually as stated, I will then collect $1 million, and feel grateful to Professor Newcomb.

Posted by: Mark Kleiman on January 14, 2003 03:07 PM

The version I read in some article on game theory was...

Suppose you have met an eccentric billionaire who wants to give you a million dollars if you can win his contest. He has found a foul-tasting, disgusting liquid. If you drink a glass of it, you will feel awful for 24 hours - imagine your worst hangover on top of the flu. But this drink will have no lasting effect after 24 hours.

The billionaire has also assembled a panel of expert psychologists, lie detector examiners, truth serum experts, etc. For the purposes of this question, assume the panel is infallible at determining your intentions and plans when you speak to them.

The rules of the contest are as follows:

o You are to meet the panel on Monday. If you can convince them by 6 PM Monday that you intend to drink a glass of the liquid the following day at 12 noon, you win the million dollars.

o The billionaire will then deposit a million dollars in your account such that it clears before 12 noon Tuesday. He will do this in such a way that he cannot retract the deposit in any way.

o You will then be handed the drink at 12 noon Tuesday to do with as you please - to either drink it or not.

Can you win the million dollars? Would you then drink the liquid?

Posted by: Not An Economist on January 14, 2003 03:27 PM

What's the market rate for two-compartment boxes of extra-terrestrial origin?

And if you meet an all-knowing alien, wouldn't there be a lot of much more interesting things to ask than trying to scam a bit of cash?

(In any case, I've spent the past several years slowly building up an immunity to iocaine, so I would put it in both glasses. Cheers)

Posted by: slacktivist on January 14, 2003 03:46 PM

I would toss a lit match into the box, record the destruction of the currency with a videocamera, and ask the Treasury Department to replace the $10 bill.

Posted by: Steven desJardins on January 14, 2003 05:11 PM

>>"None of which addresses Michael Harris's point, but hey, if you're so smart, why aint'cha rich?"

Heh. Mostly cos I don't meet enough aliens offering me such big prizes.

The money in box 2 is either in there or not. Taking the $10 from the clear box changes nothing. If you DON'T take it, you are not signalling anything to the alien -- you can only signal that BEFOREHAND. Once you're faced with the choice, the payoff matrix has already been settled. So NOT taking the money achieves nothing for you.

Only somehow ex-ante signalling that you are the type who WOULDN'T take the $10 will help you out, and even if you could do that somehow, by the time you come to MAKE the choice, the time for signalling is past. Hence my silly snipe about Bayesian refutations -- it's too late for resolving uncertainty, everything is set in place. (As opposed to, say, the Monty Hall conundrum from "Let's Make a Deal", where you pick one of three doors and then decide whether to change your choice once one of the others has been opened for you.)

I agree (I think, heh) with the parallels between this and the Prisoner's Dilemma. And the millionaire-foul drink version just firmed my resolve to take the damn money, heh.

Posted by: Michael Harris on January 15, 2003 12:28 AM

The problem is actually a little misleading; it's cast as a decision theoretic puzzle, but it's really just an illustration of the incompatibility of the idea of free will with the idea of a deterministic universe. The real issue becomes clearer if the problem is simply changed slightly: let the box be transparent. Now you can see exactly what the alien predicted, and you have no incentive to pass up the extra $10. Or do you?

Well, it depends on what you are, really. If you're possessed of free will, and can truly decide on the spur of the moment to take the $10, then you obviously would--precisely because the alien wouldn't be able to predict such an action in advance. If, on the other hand, you're a deterministic algorithm forced by the laws of nature to make a predetermined choice, then that choice could as easily be to forgo the $10 as to grab it--but in that case, there's no point asking "what would you do?", except as a purely empirical question (i.e., "what would you have no choice but to do?"). You can believe yourself to be one or the other--but Newcomb's problem asks you to believe both at the same time, and is therefore fundamentally self-contradictory.

Posted by: Dan Simon on January 15, 2003 01:59 AM

I meet the alien, not in a scientific forum, but doing a stage act in Vegas. I see him playing the game with a number of audience volunteers (presumably for much smaller stakes), and everyone who picks both boxes ends up only with the minimal prize. Now it’s my turn.

Since I assume he is doing a magic act, and I know there are a million ways for a magician to fiddle with boxes to get the desired result, at this point I feel no temptation at all to try to beat him at his game by taking both boxes. I am sure the rational decision (supposing that I actually do get to keep whatever money I ‘earn’) is to take only the one box, and I'm ready to do so.

Then the alien whispers in my ear: “actually, I’m NOT a stage magician; I really am an essentially omniscient being,” and then demonstrates this to my satisfaction.

At this point, I say to myself, “Damn! I wish he hadn’t told me that! Because now that I know the game is for real, I have to analyze this according to game theory logic. And game theory logic tells me that the reasonable choice is to take both boxes. But if I do that, it looks like I’m sure to lose.”

Where have I gone wrong?


Posted by: Jeffrey Kramer on January 15, 2003 02:26 AM

I look the alien square in the eye and say "Dominant strategy really isn't quite the hot equilibrium concept that you and John Nash thought it was, and it's misleading to assume by fiat that it is the definition of rationality"

and leave the $10 note.

Posted by: dsquared on January 15, 2003 04:45 AM

>>The problem is actually a little misleading; it's cast as a decision theoretic puzzle, but it's really just an illustration of the incompatibility of the idea of free will with the idea of a deterministic universe. The real issue becomes clearer if the problem is simply changed slightly: let the box be transparent. Now you can see exactly what the alien predicted, and you have no incentive to pass up the extra $10. Or do you?

Even if you don't agree that free will and determinism are compatible (which is a mistake IMHO), the puzzle can still arise.

A weaker form of the puzzle. Assume that the alien is right 75% of the time, whether making predictions about people who are going to take one box or about people who are going to take two. If you are tempted by the one-box reasoning, you're now going to think that taking one box gives you a 75% chance of getting the million, and taking two gives you just a 25% chance, so you'll still leave the $10 lying there. So there's still a motivation not to choose the dominant act even with a less impressive alien.
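
For concreteness, a sketch of the expected-value arithmetic in that 75% case (this is the evidential calculation, assuming the stated symmetry; whether these are the right probabilities to use is exactly what's in dispute):

```python
# Expected values when the alien is right 75% of the time about
# one-boxers and two-boxers alike.
p, M, small = 0.75, 1_000_000, 10

ev_one_box = p * M                # the million is there iff the alien was right
ev_two_box = small + (1 - p) * M  # $10 for sure, the million only if the alien erred

print(ev_one_box, ev_two_box)     # 750000.0 vs 250010.0
```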

One might try arguing that even a 75% success rate is incompatible with free will. But that would be very implausible, as can be shown by some easy experiments. Try making predictions about which of your friends will vote Republican at the next election, or which of them will eat any meat in the next 48 hours. I bet you'll do better than 75%, and no one thinks that shows your friends lack free will.

Posted by: Brian Weatherson on January 15, 2003 08:01 AM

Brian, this doesn't work at all.

>>A weaker form of the puzzle. Assume that the alien is right 75% of the time, whether making predictions about people who are going to take one box or about people who are going to take two. If you are tempted by the one-box reasoning, you're now going to think that taking one box gives you a 75% chance of getting the million, and taking two gives you just a 25% chance, so you'll still leave the $10 lying there. So there's still a motivation not to choose the dominant act even with a less impressive alien.

This still assumes the backward causation which is at the heart of Newcomb's Paradox. Whatever the alien's predictive ability, he's made the prediction now, and it was either right or wrong. That doesn't change the fact that, whether he was right or wrong, you can make either $1m or $1,000,010 here today.

IMO, the correct response to Newcomb's Paradox is to reject the intuition that it is always rational to choose a dominant strategy if you have one.

Posted by: dsquared on January 15, 2003 09:09 AM

I'm a one-boxer. This alien is from some advanced civilization. Could Xhsbr rig up the two boxes so that there's always $1,000,000 in the opaque compartment, but that removing the $10 sets off some kind of process that destroys the $1,000,000? Seems plausible to me.

The problem seems to ask whether we accept Xhsbr's infallibility, AND the explanation that it is based on psychological insight rather than engineering wizardry. Offhand, the answer is that the two-boxers obviously don't accept this and the one-boxers do.

But suppose the clear compartment contained not $10 but $999,990. How many one-boxers would remain? Not me, and not many others, I'd guess.

Doesn't that suggest that the one-boxers, in the $10 case, are not taking a deterministic view of the universe, but are rather making a rational strategic choice, possibly allowing for a bit of risk-aversion? That they're betting that Xhsbr is an engineer, not a prophet, and that the free-will/determinism conflict isn't there at all?

Posted by: Bernard Yomtov on January 15, 2003 09:55 AM

No paradox, just conflicting assumptions:

Say xhsbr had only a 0.1% chance of making 1000 correct predictions, that xhsbr predicts one-boxers and two-boxers equally well, and that xhsbr's chance of a correct prediction remained constant (to simplify calculations) - i.e., xhsbr got very lucky before meeting you. Xhsbr then had a 99.3% chance of success for each prediction (0.993 = 0.001^(1/1000)).
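
Checking that figure numerically (a sketch; the 0.1% prior is of course just an assumption to make the point):

```python
# If a run of 1000 correct predictions had only a 0.1% prior probability,
# the implied per-prediction accuracy q solves q**1000 == 0.001.
q = 0.001 ** (1 / 1000)
print(q)  # ~0.99312, i.e. about 99.3% accurate even on this skeptical prior
```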

So, xhsbr is absurdly accurate. So accurate that I have to figure either:
1) Xhsbr is a cheater.
2) I don't know what the hell is going on. Maybe xhsbr used magic, maybe this 'backwards-causality' deal isn't as impossible as we thought.

If xhsbr did use magic, leaving the $10 isn't such a bad idea.

Posted by: Grayson Calhoun on January 15, 2003 10:20 AM

I think what we are missing here is the assumption that the $1 million is of as much value to the alien as it is to us. If that is the case then it's just like the prisoner's dilemma, except that he gets to see our choice before we do (i.e. he only makes this offer to people who will take the $10). If the $1 million is meaningless to him (he can travel FTL - what does he need earth money for?) then you should open the opaque compartment. Your million will be there waiting for you (assuming that the alien is always right).

Posted by: Nader on January 15, 2003 11:48 AM

dsquared, I wasn't trying to invoke backwards causation.

The only point I had was that if you like the reasoning - most people who have chosen one box before me have got rich, so if I choose one box I'll get rich - in the more or less infallible case, there's little reason not to like it in the case where it's just a very good predictor. Taking that approach in either case means, I think, taking facts about the causal connection between your decision and the outcome to be less important than raw correlations between decisions like yours and the outcome. I think if you focus on the causal connection (as I think one should) you take the extra $10 in any case except where there really is backwards causation involved.

If I can make "either $1m or $1,000,010 here today" I'm taking the $1,000,010, although that's more likely evidence of pathological greed than basic rationality.

Posted by: Brian Weatherson on January 15, 2003 12:05 PM

US$10 isn't very much. I'd leave the US$10 so that future aliens and, for that matter, future citizens could see that I'm the leaving type. That way they'll be sure to leave the million next time, whether the alien is right this time or not.

But if the US$10 becomes US$500 000, I might be tempted to ignore what future aliens and citizens think.

Posted by: Brian on January 15, 2003 02:07 PM

I am confused. The only rational answer is to flip a coin. That gives the highest average return of any strategy. There isn't a point in debating past that. If determinism is bound to screw you, screw determinism.

Of course, if you were buddha, and knew yourself completely, this would all be sort of pointless, for a couple of reasons.

Posted by: Brennan on January 15, 2003 04:15 PM

As far as I'm concerned, this paradox (also known as "Newcomb's paradox", which has exhausted a shocking amount of the time of some grown-up philosophers who should know better) is ridiculously obvious: it's just another version of the "grandfather paradox" which proves that time travel into the past is impossible (if you can do so, then you can travel into the past and shoot Grandpa, thus keeping yourself from being born to travel back into the past and shoot Grandpa). Similarly, in Newcomb's case, the alien informs you that it knows what you would be certain to do in the future if it had NOT informed you of that fact -- which leaves you perfectly free to change your behavior in response to that information. The whole fake "paradox" is just the consequence of the logically self-contradictory idea of perfect precognition, as Jason McCullough says.

Posted by: Bruce Moomaw on January 16, 2003 10:29 AM

Let me elaborate: if your initial inclination is to take the $10, then you will think that the alien knew that and that you had better NOT take the $10. But then you will realize that the alien has taken into account that you would make that decision, so you HAD better take the $10. But then you will realize that the alien has taken into account that you would make THAT decision, so you had better NOT take the $10. But that means... and so on forever. It's exactly analogous to a machine whose only purpose is to travel back into the past and keep itself from ever switching on to travel back into the past -- it would exist in neither the "on" nor "off" state, which is impossible; and similarly you can never have any information on which you could validly base a decision as to whether or not to take the money. Brennan's right; just flip a coin, thus showing up the alien's boast about always being able to predict your decision through its knowledge of your "psychology" for the lie it is.

Posted by: Bruce Moomaw on January 16, 2003 10:56 AM

Let me modify that last comment:

Let me elaborate: if your initial inclination is not to take the $10, then you will think that the alien knew you'd do that, that the million bucks is safely in the box for you already, and that you should take the $10 as well and get an added $10. But then you will realize that the alien has taken into account that you would make that decision, so you had better NOT take the $10. But then you will realize that the alien knew that you'd make THAT decision and the million bucks is safely in the box, so you SHOULD take the $10 as well. But that means... and so on forever. It's exactly analogous to a machine whose only purpose is to travel back into the past and keep itself from ever switching on to travel back into the past -- it would exist in neither the "on" nor "off" state, which is impossible; and similarly you can never have any information on which you could validly base a decision as to whether or not to take the $10, because any decision you make provides you with a good reason to change your decision. Brennan's right; just flip a coin, thus showing up the alien's boast about always being able to predict your decision through its knowledge of your "psychology" for the lie it is.

Posted by: Bruce Moomaw on January 16, 2003 11:12 AM

I'd believe the alien and leave the ten dollars. If the alien knows how to travel to earth from light years away, which I think is impossible, then maybe it can see the future, which I also think is impossible. Besides, if there is not a million dollars there, it's worth losing ten dollars to prove that the alien is a liar. And if the money is there, then either the alien can see the future, in which case it was good I didn't take the ten dollars, or it can't, in which case ten dollars is insignificant anyway.

I can see how this might be a difficult decision if there was a lot more money in the clear box, but not enough to make extra money irrelevant, like say $50,000. But if it's ten dollars, why would you need to flip a coin, or even put much thought into the decision?

Posted by: Mitch on January 16, 2003 05:21 PM

Bruce,
I've got to disagree, both on your analysis and on the idea that it's a waste of philosophers' time to discuss it.

As far as your analysis goes, you're implicitly assuming backward causation. Michael Harris and other two-boxers can still say, either the million is there or it's not; either way, you're better off taking the $10. No point in trying to use your initial inclination to guess whether the alien's put the million in the box.

Brian W. is right that the problem can be posed without supposing that the alien is always right. Suppose that xhsbr is right 90% of the time; still 90% of the one-boxers get to say to 90% of the two-boxers, "If you're so smart, why aint'cha rich?" This doesn't require supposing backward causation, as the time-travel paradox does; we need only suppose that the alien is an extremely astute psychologist.
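
That 90% version is easy to simulate; a sketch, assuming the alien errs independently and at the same rate for both kinds of chooser:

```python
import random

def average_payout(strategy, accuracy=0.9, trials=100_000):
    """Mean winnings for someone who always plays `strategy` against an
    alien whose predictions are right `accuracy` of the time."""
    total = 0
    for _ in range(trials):
        predicted_right = random.random() < accuracy
        # The million is in the opaque box iff the alien predicted one-boxing.
        million_there = predicted_right if strategy == "one-box" else not predicted_right
        total += 1_000_000 if million_there else 0
        if strategy == "two-box":
            total += 10  # the two-boxer always pockets the visible money
    return total / trials

print(average_payout("one-box"))   # ~900,000
print(average_payout("two-box"))   # ~100,010
```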

As for why it's important: Consider promises. If it's rational to maximize your own interests, you may wonder why you shouldn't welch on contracts. Obviously it'll often be to your advantage to contract with people for services you'll perform later. And they won't contract with you if they think you won't fulfill the contract. So one reason to fulfill your promises is to enhance your reputation.

But by this reasoning, it's only rational to fulfill your contract when the benefit of welching is outweighed by the reputational costs. If your stated policy is to welch whenever you can get away with it, people still won't want to deal with you.

So people who contract with you are in the position of the alien: If they predict that you won't welch, they'll do something for you; if they predict that you will welch, they won't. So you want them to act as if you'll keep your contract; but at the time you have to decide whether or not to keep the contract, they've already done their part. What you do won't affect that. If there's no backward causation, why bother fulfilling the agreement?

Except, if everyone reasons like that, no agreements will be possible, and we'll all be worse off.

This case--known as Hume's farmer--is even more intractable than the Newcomb problem, because you already know whether your contractor has fulfilled their end of the bargain. It's like Newcomb's problem with two transparent boxes.

The way I'm inclined to think of it is analogous to my thoughts on skepticism about the external world: If I take it that I know only what cannot be doubted, then I don't know that there is an external world; I had better know that there is an external world; so the original conception of knowledge can't be right.

Similarly, if we take it that it is always rational to directly maximize your utility, it isn't rational to keep your promises; it had better be rational to keep your promises (if we're all better off acting irrationally, something's deeply wrong); so it can't always be rational to directly maximize your utility.

This may be what dsquared is getting at when he says it's not always rational to choose a dominant strategy.

Anyone who read through this long post deserves a prize; e-mail me with suggestions.

Posted by: Matt Weiner on January 16, 2003 08:17 PM

Actually the prisoner's dilemma version opens a more interesting question. Suppose two players are given this choice, and I am told I can either take the ten dollars or place the million in the other player's opaque box. I am quite certain that my decision and my opponent's are correlated, so I assess the probability that we will choose opposite strategies as much less than .5. Then cooperation is the utility-maximizing strategy, in the sense that if I form an expectation of my payout based on my action, I will assess a higher chance that my opponent will leave the money if I have chosen to do the same. I do not think it is irrational to believe that many players would choose to leave the $10, although it is a dominated strategy. I believe that I would make that choice, although I might change my mind if it were $1000. Again I would view my decision as the best proxy for predicting the behavior of my opposite. I recognize that at some point the lure of the sure thing will win me over and buying hope that my counterpart will reach the same "irrational" conclusion will be too expensive.
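
In expected-value terms the argument comes down to two conditional probabilities; a sketch with purely illustrative numbers (the 0.8 and 0.2 are just for illustration, not figures anyone here has defended):

```python
M, small = 1_000_000, 10

# q = P(counterpart leaves his $10 | I leave mine)
# r = P(counterpart leaves his $10 | I take mine)
# The correlation premise is just q > r; these values are illustrative.
q, r = 0.8, 0.2

ev_cooperate = q * M       # 800,000.0
ev_defect = small + r * M  # 200,010.0

# Cooperating wins whenever (q - r) * M > small, i.e. whenever the
# correlation is worth more than the sure ten dollars.
print(ev_cooperate, ev_defect)
```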

Of course my decision would be influenced by information that pertained to how closely correlated I thought our behaviors might be and if I knew that my opposite came from an unfamiliar cultural, social or ethnic background I would be less likely to cooperate.

Dave

Posted by: Dave Richardson on January 17, 2003 03:28 PM

The thing I don't like about the Newcomb paradox is that it's too damn deep. It brings up all kinds of knotty philosophical issues, and you can wind up dealing with those forever.

I like better another paradox, the so-called surprise exam paradox, because it is less encumbered philosophically.

The paradox, as best I can reconstruct it, goes like this.

A professor tells his class on Monday, "I'm going to give you an exam this week, but it's going to be a surprise which day it'll be -- you'll have no way of knowing in advance that it's going to be on that day. So be prepared."

The next day, in class, a student raises his hand and says,

"Professor, I thought about the suprise exam you said you're going to give us, and I realized that it was impossible for you to do it.

"You can't give it to us on Friday, because we would know the night before that it HAD to be on Friday, because Friday was the last day it could be. And it can't be on Thursday, because we would know it had to be on Thursday, since, as I just showed, it can't be on Friday. And if you keep up the same line of reasoning, you realize it can't be a suprise on ANY day of this week. That means that you CAN'T give a surprise exam like you said. So why should we prepare?"

The professor didn't answer the student. Instead, he simply handed out the exam.

And boy was the student surprised.

-------------------

So the question is, what if anything was wrong with the student's logic?

I'm not sure that I've ever seen an answer to this that is entirely satisfactory -- and yet it shouldn't require deep philosophy or logic to resolve it.
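
For what it's worth, the student's elimination argument can be written out mechanically (a sketch; spelling it out doesn't resolve anything):

```python
# The student's backward induction: keep striking the last surviving day,
# since an exam on the last possible day could be foreseen the night before.
days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
while days:
    struck = days.pop()
    print(f"strike {struck}; still possible: {days if days else 'none'}")
# Every day gets eliminated -- and yet the exam, when it comes, surprises.
```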

Posted by: frankly0 on January 18, 2003 09:29 AM

The puzzle is logically impossible to solve. On one hand, it can be said that by intentionally leaving the $10 so that you can get the $1m, the alien has depicted you as "the kind of" human who would take the money (therefore, no $1m). On the other hand, by not taking the $10, thinking that the $1m will be there, you are "not the kind of" person who would take it. In both cases you are "the kind of" human who wants to maximize your return. So, I would say that it comes down to how much you need the $10. If you do, take it; if not, have some fun.

Posted by: Alex on January 24, 2003 01:35 PM

There's either $1m in the second box or there isn't. Therefore, the money in that box has a 50% chance of being there (arbitrary, but rational, since the average of all possibilities that it will or won't be there for all cases where the boxes are handed out is 50%).

So, in expectation, you have $10 in one box and $500,000 in the other; you take the $500,000.

If there is $499,999.99 in one box, and maybe $1m in the other, you still go for the other box if there is an actual either/or choice to be made.

The alien is BY DEFINITION always right. The problem says that he is always right, so you have to believe that, it is one of the basic facts of the question. Therefore, it IS an actual either/or choice.

If there were $500,000 in box 1, then you would just take that.

Posted by: J. Goodwin on February 24, 2003 09:15 AM

>>An all-knowing alien who has a perfect computer model of your mind lands on earth. Xhsbr (that's a pronoun, not a proper name) shows you a box with two compartments, one of which is clear and the other of which is opaque. Each compartment has a door. You can see $10 in the clear part of the box. The alien says that xhsbr has analyzed your psychology, and if you are the kind of human who would not take the $10, xhsbr has put $1,000,000 in the other, opaque compartment, which will be yours. But if you are the kind of human who would take the $10, xhsbr has put nothing in the other, opaque compartment. The alien says that you must first open the door to the clear compartment (and take the $10 or not) before the door to the opaque compartment will open. The alien says that the door to the clear compartment will only open once.

This problem becomes easier if you cut out irrelevant information. It states the alien is all-knowing, but that touches on predestination, which is philosophical and theological, not analytically useful. What IS analytically useful is that xhsbr (read: he) has a perfect computer model of your mind. Box configuration is irrelevant also. Also, the alien leaving is irrelevant, since he is infallible by definition. To simplify:

Someone offers you $10 if you are the kind of person to take $10, $1,000,000 if you are not, and nothing if you are and claim you're not. If you are but take the $10 anyway, you get $1,000,010.

Remember the alien is infallible? Cut out the $1,000,010 option. What IS the kind of human who would not take $10? That's free money. For this experiment to make sense, we must cut out the free-money objection by understanding "the kind of human who would not take $10" to mean the kind who would not take $10 instead of a million. And that is everyone.

To summarise: The alien is infallible. All humans want to maximise profit. $1M is better than $10, which is better than $0.

So... $0? Not possible, since the alien is infallible.

$10 or $1,000,000? The ratio is also unimportant, as more money is always better than less and there is no probability involved.

Posted by: Sultan on March 21, 2003 10:50 AM

It seems like everyone here is assuming that the $10 goes away once the opaque box is opened. The problem specifies nothing of the sort; ergo the $10 is not just a sure thing whether or not the million is in the opaque box - it's a sure thing whether or not you open the opaque box. Opening the opaque box therefore has a penalty of zero and a potential reward of $1 million. So unless one is the sort of person who leaves the ten spot because he thinks the million is enough profit for a day's philosophizing, there couldn't be a million in the opaque box anyway. Of course, my answer sidesteps the problem's intent, but it's another angle, all the same.

Posted by: Dave on May 31, 2003 04:38 AM

thanks

Posted by: ranjeet kumar on July 14, 2003 04:30 AM
