Choosing a Game Show Briefcase

Let’s see who knows the answer to this one.

I’m sure everybody knows how “Deal or No Deal” works so I won’t take the time to explain the rules.

Let's assume that after knocking out 24 "bad" cases, you are left with \$1 and \$1,000,000 on the board. Knowing that you have been going for the \$1,000,000 prize all along, is it advantageous to switch cases at the end?

Only if there's a goat in one of them.

Take whatever the banker offers and go home. Yes, I wouldn't either… you either win or you lose… it's 50/50, so there is no advantage in changing your selection.

Wasn't there something about a goat behind door #3 some time ago… Why am I thinking of goats… must have read one of dk44's posts, that guy cracks me up ha-ha! (not really).

Let’s see… I’m not actually totally familiar with the game so I might be missing something…

There’s 26 cases total, correct?

The probability of you initially choosing the million dollar prize is 1/26, so that’s the probability that the case you hold contains the million dollar prize.

Since 24 of the possibilities have been eliminated, the only possibilities for the remaining case should be \$1 or \$1,000,000. So, since you are choosing a case that has only two possibilities, it seems to follow that the other case has a (1/2) chance of containing the remaining prize.

I’m not sure that is totally correct mathematically, but I believe common sense indicates that you should switch…

[quote]jtrinsey wrote:
Let’s see… I’m not actually totally familiar with the game so I might be missing something…

There’s 26 cases total, correct?

The probability of you initially choosing the million dollar prize is 1/26, so that’s the probability that the case you hold contains the million dollar prize.

Since 24 of the possibilities have been eliminated, the only possibilities for the remaining case should be \$1 or \$1,000,000. So, since you are choosing a case that has only two possibilities, it seems to follow that the other case has a (1/2) chance of containing the remaining prize.

I’m not sure that is totally correct mathematically, but I believe common sense indicates that you should switch…[/quote]

Following your “common sense” there… what chance does the other case have then?

If they offer anything under \$499,999, fuck it, I'll go for the \$1 million.

If I do end up with \$1 I’ll ask the host for my gas money and leave.

I can't see any way it would be advantageous to switch cases. Either you have it or you don't; it's the same logic from the other thread. Why is this dead horse getting beaten?

If one case has a 50% chance then so does your case. =P

The difference with this game is that there’s nothing preventing you from opening the million dollar case at any point during the game. Let’s reduce the problem to three briefcases containing \$1M, \$1 and \$5.

1. You have the \$1M case. You eliminate the \$1 case. If you stay you win.

2. You have the \$1M case. You eliminate the \$5 case. If you stay you win.

3. You have the \$1 case. You eliminate the \$5 case. If you stay you lose.

4. You have the \$1 case. You eliminate the \$1M case. You don’t have the option to stay or switch.

5. You have the \$5 case. You eliminate the \$1 case. If you stay you lose.

6. You have the \$5 case. You eliminate the \$1M case. You don’t have the option to stay or switch.

Of the 4 scenarios where the big prize is still in play, it is no more likely to be in your case than in the other case. Thus if you’re lucky enough not to have eliminated the million dollars during the course of the game, you have even odds.
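The six scenarios above can be checked with a quick Monte Carlo sketch in Python (the helper name `simulate` is mine, not from the thread): hold one of three cases at random, open one of the other two at random, and count only the rounds where the \$1M survives to the final two.

```python
import random

def simulate(trials=100_000, seed=0):
    """Simulate the 3-briefcase reduction: hold one case, open one of
    the other two at random. Among games where the $1M case survives
    to the final two, return how often it is the case you hold."""
    rng = random.Random(seed)
    survived = held = 0
    for _ in range(trials):
        cases = [1_000_000, 1, 5]
        rng.shuffle(cases)
        yours = cases[0]                 # the case you hold
        opened = rng.choice(cases[1:])   # you open one other case blindly
        if opened != 1_000_000:          # $1M still in play
            survived += 1
            if yours == 1_000_000:
                held += 1
    return held / survived

print(round(simulate(), 2))  # ≈ 0.5
```

The conditional frequency comes out around 0.5, matching the scenario count: of the four branches where the big prize is still in play, you hold it in two.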

[quote]LiveFromThe781 wrote:
I can't see any way it would be advantageous to switch cases. Either you have it or you don't; it's the same logic from the other thread. Why is this dead horse getting beaten?[/quote]

Not the same at all.

The difference is they always offer you something like \$300,000-400,000 in between.

A guaranteed 30-40k over a 50/50 shot at \$1M wins out in my book.

The point was just to prove that in this case, if you’re down to those two cases, the odds are indeed even.

[quote]wfifer wrote:
The point was just to prove that in this case, if you’re down to those two cases, the odds are indeed even.[/quote]

Wouldn't it be that your odds are out of 100 though, since that's how many cases there are initially?

And from the logic in the other thread, it seems that it'd be to your advantage to switch, because you're now switching in a 50/50 scenario from a 1/99 scenario.

lol, idk, I think this stuff is funny either way, don't take it too seriously.

The difference between this game and the other is that this time you’re the one opening stuff. And you have no idea where anything is. In the other game, the host is not allowed to open the door with the car.

You can think, “wait, what are the odds that I actually picked the million dollar case?” But then I’d ask you, “what are the odds that you just eliminated every case except the million dollar case?”

[quote]wfifer wrote:

The difference between this game and the other is that this time you’re the one opening stuff. And you have no idea where anything is. In the other game, the host is not allowed to open the door with the car.

You can think, “wait, what are the odds that I actually picked the million dollar case?” But then I’d ask you, “what are the odds that you just eliminated every case except the million dollar case?”[/quote]

In the scenario he presented (and in the game) you know the million dollar case is still left, you just don't know if it's in case X or case Y, which is exactly the same as the door game.

[quote]wfifer wrote:

The difference between this game and the other is that this time you’re the one opening stuff. And you have no idea where anything is. In the other game, the host is not allowed to open the door with the car.

You can think, “wait, what are the odds that I actually picked the million dollar case?” But then I’d ask you, “what are the odds that you just eliminated every case except the million dollar case?”[/quote]

Precisely 1/26 for both. Putting you at an even 50/50 if you get to that point.

It’s funny how much this can confuse some people. You nailed it when you said the difference was that the million could have been opened any time. In the other game there was a 100% chance of the host opening a door with a goat. That little detail changes everything.
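That "little detail" can be demonstrated directly. Here is a hedged sketch (the function name `switch_win_rate` is mine) comparing the two games on three doors/cases: in one, the host knowingly opens a dud; in the other, a case is opened at random and rounds that reveal the prize are thrown out.

```python
import random

def switch_win_rate(host_knows, trials=100_000, seed=1):
    """Return how often switching wins the big prize, counting only
    rounds where the prize was not accidentally revealed."""
    rng = random.Random(seed)
    wins = valid = 0
    for _ in range(trials):
        prize = rng.randrange(3)          # which of 3 doors holds the prize
        pick = rng.randrange(3)           # your initial pick
        others = [d for d in range(3) if d != pick]
        if host_knows:                    # Monty Hall: host always opens a dud
            opened = next(d for d in others if d != prize)
        else:                             # Deal or No Deal: opened blindly
            opened = rng.choice(others)
        if opened == prize:
            continue                      # prize revealed: no stay/switch choice
        valid += 1
        remaining = next(d for d in others if d != opened)
        if remaining == prize:            # switching would have won
            wins += 1
    return wins / valid

print(round(switch_win_rate(host_knows=True), 2))   # ≈ 0.67
print(round(switch_win_rate(host_knows=False), 2))  # ≈ 0.5
```

With a knowing host, switching wins about 2/3 of the time; with blind elimination, the surviving rounds split 50/50, which is exactly the distinction being argued here.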

I am doubting myself, so somebody smarter than me tell me where this line of mathematical logic falters:

1.) Your initial probability of choosing the \$1,000,000 case is (1/26)

2.) The probability of it being in either case at the end is 1, or 100%, however you prefer. It has to be in one of the two cases.

3.) If the probability of it being in your case is (1/26), and the probability it is in both cases combined is 1… then why is the probability of the other case not (25/26)?

But then I am confusing myself because then it seems that you could do the same thing for the \$1 case. That is why I am thinking it may not matter whether you switch or stay.

Oh, the host doesn’t open the case. Thus, in this situation the cases are all independent. Duh.

[quote]jtrinsey wrote:
I am doubting myself, so somebody smarter than me tell me where this line of mathematical logic falters:

1.) Your initial probability of choosing the \$1,000,000 case is (1/26)

2.) The probability of it being in either case at the end is 1, or 100%, however you prefer. It has to be in one of the two cases.

3.) If the probability of it being in your case is (1/26), and the probability it is in both cases combined is 1… then why is the probability of the other case not (25/26)?

But then I am confusing myself because then it seems that you could do the same thing for the \$1 case. That is why I am thinking it may not matter whether you switch or stay.[/quote]

Your initial probability is 1/26, but after you eliminate one case (and it isn't the million), it is then 1/25 that your case has the million. And so on and so forth, until it is 1/2. People are confusing this problem with the Monty Hall problem, but it's different because in that scenario, the host knows where the \$1 million is, whereas in this one, he has no idea. You could have opened the million dollar case at any point while eliminating cases; you just happened not to.
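That 1/26 → 1/25 → … → 1/2 update is just Bayes' rule, and it can be computed exactly. A sketch (the function `p_hold_million` is my name for it): each blind opening that misses the million slightly raises the chance that your case holds it.

```python
from fractions import Fraction

def p_hold_million(total=26, opened=24):
    """Posterior probability that your case holds the $1M after
    `opened` randomly chosen other cases were opened and all missed."""
    hold = Fraction(1, total)            # prior: 1/26
    others = total - 1                   # 25 other cases
    # chance the k blind openings all miss, given the $1M is elsewhere
    miss_given_not_hold = Fraction(others - opened, others)
    evidence = hold * 1 + (1 - hold) * miss_given_not_hold
    return hold / evidence               # Bayes' rule

for k in (0, 1, 12, 24):
    print(k, p_hold_million(opened=k))   # 1/26, 1/25, 1/14, 1/2
```

After all 24 eliminations miss, the posterior is exactly 1/2, confirming that staying and switching are equivalent here.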

[quote]wfifer wrote:

The difference between this game and the other is that this time you’re the one opening stuff. And you have no idea where anything is. In the other game, the host is not allowed to open the door with the car.

You can think, “wait, what are the odds that I actually picked the million dollar case?” But then I’d ask you, “what are the odds that you just eliminated every case except the million dollar case?”[/quote]

They know what's been eliminated though; he's referring to a show where the whereabouts of the cases are known. The "bankers" offer a wager to buy the briefcase based on the likelihood the player has it, because they open up so many cases. The difference is that the million dollars isn't always a million; there could be only a \$400,000 and a \$20,000 case. I'm basing this off my recollection of only seeing the show once or twice, but I'm pretty sure that's how they do it.

Your expected value is essentially 500k, so if that is all we're going on then you should reject any offer less than that. From an economic perspective, though, the question would center around expected utility rather than expected value. If a person's marginal utility for the millionth dollar is less than that of the 1st dollar (and we assume the curve is monotone for simplicity), then that person can be better off taking a sure thing that pays somewhat less than 500k. So the question is: does your satisfaction increase as much when someone goes from giving you no money to giving you one dollar as it does when you're being given \$999,999 and then an extra dollar on top of that?
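To make the expected-utility point concrete, here is a sketch assuming a square-root utility curve (a common textbook stand-in for diminishing marginal utility; the curve and the function name `certainty_equivalent` are my assumptions, not from this thread). The certainty equivalent is the sure amount whose utility equals the gamble's expected utility.

```python
import math

def certainty_equivalent(p_win=0.5, prize=1_000_000, u=math.sqrt):
    """Sure amount with the same utility as the 50/50 gamble,
    assuming u(x) = sqrt(x); the inverse u^-1(y) = y**2 is hard-coded."""
    expected_utility = p_win * u(prize) + (1 - p_win) * u(0)
    return expected_utility ** 2         # invert sqrt: x = utility**2

print(int(certainty_equivalent()))  # 250000
```

Under this curve, a risk-averse player should happily take any sure offer above \$250,000 even though the gamble's expected value is \$500,000, which is exactly why the banker's sub-500k offers get accepted.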

Reject anything less than 500k? Why?

I would accept 400k if it boiled down to a 50/50 with \$1M.

50% says I get nothing.
100% says I get 400k (or whatever it is after taxes).

You might feel shitty afterward if you had the \$1M in your briefcase, but the reality is you made the right choice, because that case could just as easily have held \$1.

Risk neutrality is all fine in theory, but when you have 3 kids to put through college, gas and electric bills, plus food to put on the table, how do you explain to them that your economics course said it'd be wiser to take the shot at \$1 million because they didn't offer you 10% more than they "should" have?