October 08, 2014
On 10/7/2014 6:18 PM, Timon Gehr wrote:
> I can report these if present.

Writing a strongly worded letter to the White Star Line isn't going to help you when the ship is sinking in the middle of the North Atlantic.

What will help is minimizing the damage that a detected fault may cause. You cannot rely on the specification when a fault has been detected. "This can't be happening!" are likely the last words of more than a few people.


> Can we at least agree that Dicebot's request for having the behaviour of
> inadvisable constructs defined such that an implementation cannot randomly
> change behaviour and then have the developers close down the corresponding
> bugzilla issue because it was the user's fault anyway is not unreasonable by
> definition because the system will not reach a perfect state anyway, and then
> retire this discussion?

I've been working with Dicebot behind the scenes to help resolve the particular issues with the code he's responsible for.

As for D, D cannot offer any guarantees about behavior after a program crash. Nor can any other language.
October 08, 2014
On 10/7/2014 3:54 PM, Nick Sabalausky wrote:
> It's a salesman's whole freaking *job* to be a professional liar!

Poor salesmen are liars. But the really, really good ones are able to match up what a customer needs with the right product. In doing that, they provide a valuable service to the customer.

Serve the customer well like that, and you get a repeat customer. I know many salesmen who get my repeat business because of that.

The prof who taught me accounting used to sell cars. I asked him how to tell a good dealership from a bad one. He told me the good ones have been in business for more than 5 years, because by then a dealership has run out of new suckers and has to rely on repeat business.

> But then again, slots and video poker aren't exactly my thing anyway. I'm from
> the 80's: If I plunk coins into a machine I expect to get food, beverage, clean
> laundry, or *actual gameplay*. Repeatedly purchasing the message "You lose"
> while the entire building itself is treating me like a complete brain-dead idiot
> isn't exactly my idea of "addictive".

I found gambling to be a painful experience, not entertaining at all.
October 08, 2014
On 10/7/2014 4:49 PM, Timon Gehr wrote:
> On 10/08/2014 12:10 AM, Nick Sabalausky wrote:
>> On 10/07/2014 06:47 AM, "Ola Fosheim Grøstad"
>> <ola.fosheim.grostad+dlang@gmail.com> wrote:
>>> Yep, however what the human brain is really bad at is reasoning about
>>> probability.
>>
>> Yea, true. Probability can be surprisingly unintuitive even for people
>> well-versed in logic.
>> ...
>
> Really?

Yes. See "Thinking, Fast and Slow":

  http://www.amazon.com/Thinking-Fast-Slow-Daniel-Kahneman/dp/0374533555

I'm working my way through it now. Very interesting.
October 08, 2014
On Wednesday, 8 October 2014 at 01:22:49 UTC, Timon Gehr wrote:
>> The secret behind the monty hall scenario, is that the host is actually
>> leaking extra information to you about where the car might be.
>>
>> You make a first choice, which has 1/3 chance of being right, then the
>> host opens another door, which is *always* wrong. This last part is
>> where the information leak comes from.  The host's choice is *not* fully
>> random, because if your initial choice was the wrong door, then he is
>> *forced* to pick the other wrong door (because he never opens the right
>> door, for obvious reasons), thereby indirectly revealing which is the
>> right door.  So we have:
>>
>> 1/3 chance: you picked the right door. Then the host can randomly choose
>> 	between the 2 remaining doors. In this case, no extra info is
>> 	revealed.
>>
>> 2/3 chance: you picked the wrong door, and the host has no choice but to
>> 	pick the other wrong door, thereby indirectly revealing the
>> 	right door.
>>
>> So if you stick with your initial choice, you have 1/3 chance of
>> winning, but if you switch, you have 2/3 chance of winning, because if
>> your initial choice was wrong, which is 2/3 of the time, the host is
>> effectively leaking out the right answer to you.
>>
>> The supposedly counterintuitive part comes from wrongly assuming that
>> the host has full freedom to pick which door to open, which he does not

But yes, he does. It makes no difference. If he ever opened
the right door, you would simply take that one instead. So if
the car is behind either of the two doors you did not choose
first, you always get it.

> The problem with this explanation is simply that it is too long and calls the overly detailed reasoning a 'secret'. :o)

So take this shorter explanation:

"There are three doors and two of them are opened, one by him
and one by you. So the chance to win is two out of three."
It doesn't matter whether he uses his knowledge to always open
a false door. It only matters that you open your door AFTER he
does, which allows you to react to the result of his door. If
you open your door first, your chance is only 1/3.
October 08, 2014
On Wednesday, 8 October 2014 at 07:00:38 UTC, Dominikus Dittes Scherkl wrote:
> On Wednesday, 8 October 2014 at 01:22:49 UTC, Timon Gehr wrote:

> It doesn't matter if he uses his knowledge to open always a
> false door.

It does. Actually, this is the single most important thing.

> It only matters that you open your door AFTER him,
> which allows you to react on the result of his door. If you
> open the door first, your chance is only 1/3.

If his choice were completely random, as you seem to suggest above (when in fact his choice is conditioned by your first choice), then even if you open a door after him, the only thing you would know is that you are now facing a problem with a 50% probability of winning.

If you remove the above piece of information, you could simply assume that there are only two doors and you are to open one of them. In that case it is just 50/50.
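
That conditional 50/50 claim can be simulated too. A sketch of the "ignorant host" variant, where the host opens one of the two unchosen doors at random and we discard the games where he exposes the car (again, the names and trial count are illustrative):

```python
import random

def random_host(trials=1_000_000):
    """Variant where the host opens one of the two unchosen doors at
    random; keep only the games where he happens to reveal a goat."""
    kept = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        host = random.choice([d for d in range(3) if d != pick])
        if host == car:
            continue  # host accidentally revealed the car; discard this game
        switched = next(d for d in range(3) if d != pick and d != host)
        kept += 1
        switch_wins += (switched == car)
    return switch_wins / kept

print(f"switch wins: {random_host():.3f}")  # ~0.5 in this variant
```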
October 08, 2014
On 10/07/2014 07:49 PM, Timon Gehr wrote:
> On 10/08/2014 12:10 AM, Nick Sabalausky wrote:
>> Ex: A lot of people have trouble understanding that getting "heads" in a
>> coinflip many times in a row does *not* increase the likelihood of the
>> next flip being "tails". And there's a very understandable reason why
>> that's difficult to grasp.
>
> What is this reason? It would be really spooky if the probability was
> actually increased in this way. You could win at 'heads or tails' by
> flipping a coin really many times until you got a sufficiently long run
> of 'tails', then going to another room and betting that the next flip
> will be 'heads', and if people didn't intuitively understand that, some
> would actually try to apply this trick. (Do they?)
>

I have actually met a lot of people who instinctively believe that getting "tails" many times in a row means that "heads" becomes more and more inevitable. Obviously they're wrong about that, but I think I *do* understand how they get tripped up:

What people *do* intuitively understand is that the overall number of "heads" and "tails" are likely to be similar. Moreover, statistically speaking, the more coin tosses there are, the more the overall past results tend to converge towards 50%/50%. (Which is pretty much what's implied by "uniform random distribution".) This much is pretty easy for people to intuitively understand, even if they don't know the mathematical details.

As a result, people's mental models will usually involve some general notion of "There's a natural tendency for the 'heads' and 'tails' to even out." Unfortunately, that summary is...well...partly true but also partly inaccurate.

So they take that kinda-shaky and not-entirely-accurate (but still *partially* true) mental summary and are then faced with the coin toss problem: "You've gotten 'tails' 10,000 times in a row." "Wow, really? That many?" So then the questionable mental model kicks in: "...natural tendency to even out..." The inevitable result? "Wow, I must be overdue for a heads!" Fallacious certainly, but also common and somewhat understandable.

Another aspect that can mix people up:

If you keep flipping the coin, over and over and over, it *is* very likely that at *some* point you'll get a "heads". That much *is* true and surprises nobody. Unfortunately, as true as it is, it's *not* relevant to individual tosses: Their individual likelihoods *always* stay the same: 50%. So we seemingly have a situation where something ("very, very likely to get a heads") is true of the whole *without* being true of *any* of the individual parts. While that does occur, it isn't exactly a super-common thing in normal everyday life, so it can be unintuitive for people.

And another confusion:

Suppose we rephrase it like this: "If you keep tossing a coin, how likely are you to get 10,000 'tails' in a row AND then get ANOTHER 'tails'?" Not very freaking likely, of course: 1 in 2^10,001. But *after* those first 10,000 'tails' have already occurred, the answer changes completely.

What? Seriously? Math that totally changes based on "when"?!? But 2+2 is *always* 4!! All of a sudden, here we have a math where your location on the timeline is *crucially* important[1], and that's gotta trip up some of the people who (like everyone) started out with math just being arithmetic.

[1] Or at least time *appears* to be crucially important, depending on your perspective: We could easily say that "time" is nothing more than an irrelevant detail of the hypothetical scenario and the *real* mathematics is just one scenario of "I have 10,001 samples of 50% probability" versus a completely different scenario of "I have 10,000 samples of 100% probability and 1 sample of 50% probability". Of course, deciding which of those problems is the one we're actually looking at involves considering where you are on the timeline.
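
That "no memory" property is cheap to verify numerically. A quick Python sketch (run length and trial count picked arbitrarily) that conditions on a run of tails and looks at the next flip:

```python
import random

def heads_after_tails_run(run_length=5, trials=1_000_000):
    """Flip a fair coin repeatedly; each time the previous `run_length`
    flips were all tails, record whether the NEXT flip is heads."""
    tails_run = 0
    heads_next = seen = 0
    for _ in range(trials):
        flip = random.random() < 0.5  # True = heads
        if tails_run >= run_length:
            seen += 1
            heads_next += flip
        tails_run = 0 if flip else tails_run + 1
    return heads_next / seen

print(f"P(heads | 5 tails in a row) = {heads_after_tails_run():.3f}")  # ~0.5
```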

>
> That said, it's just: When you first randomly choose the door, you would
> intuitively rather bet that you guessed wrong. The show master is simply
> proposing to tell you behind which of the other doors the car is in case
> you indeed guessed wrong.
>
> There's not more to it.
>

Hmm, yea, an interesting way to look at it.

October 08, 2014
On Wednesday, 8 October 2014 at 07:35:25 UTC, Nick Sabalausky wrote:
> On 10/07/2014 07:49 PM, Timon Gehr wrote:
>> On 10/08/2014 12:10 AM, Nick Sabalausky wrote:
>>> Ex: A lot of people have trouble understanding that getting "heads" in a
>>> coinflip many times in a row does *not* increase the likelihood of the
>>> next flip being "tails". And there's a very understandable

Of course it does not increase the probability of getting "tails". Actually, it increases the probability that you'll get "heads" again.

For the simplest explanation, see here:

http://batman.wikia.com/wiki/Two-Face's_Coin
October 08, 2014
On Tue, 7 Oct 2014 17:37:51 -0700
"H. S. Teoh via Digitalmars-d" <digitalmars-d@puremagic.com> wrote:

> told in a deliberately misleading way -- the fact that the host *never* opens the right door is often left as an unstated "common sense" assumption, thereby increasing the likelihood that people will overlook this minor but important detail.
that's why i was always against this "problem". if you're giving me a logic problem, give me all the information. anything not written clearly can't be part of the problem. that's why the right answer, when i'm not told that the host never opens the right door, is "50/50".


October 08, 2014
On Wednesday, 8 October 2014 at 07:00:38 UTC, Dominikus Dittes Scherkl wrote:
> On Wednesday, 8 October 2014 at 01:22:49 UTC, Timon Gehr wrote:

> If he would ever open the right door, you would just take it too.

Almost. If he opens the winning door, he gives you another very important piece of information: the correctness of your first choice. If you already knew whether your first choice was correct or wrong, then having the host open a door (no matter which of the remaining two, in this case) would solve the problem without ambiguity.

But when you make your second choice, you still do not know whether your first choice was correct or not. The only thing you know is that the chance your first choice was correct is half the chance that it was wrong.

So you bet that your first choice was wrong, and you move on to the next problem, which, under this bet, now becomes an unambiguous problem.

The key is this: "how would a third person bet on my first choice?" Reasonably, he would bet that the choice is wrong. So why wouldn't I do the same?
October 08, 2014
On Tuesday, 7 October 2014 at 23:49:37 UTC, Timon Gehr wrote:
> On 10/08/2014 12:10 AM, Nick Sabalausky wrote:
>> On 10/07/2014 06:47 AM, "Ola Fosheim Grøstad"
>> <ola.fosheim.grostad+dlang@gmail.com> wrote:
>>> On Tuesday, 7 October 2014 at 08:19:15 UTC, Nick Sabalausky

> What is this reason?

This one:

The result of a coin toss is independent at each attempt. It does not depend on past or future results. The probability is 0.5.

On the other hand, the probability of obtaining a series of 100 heads in a row is very small (exactly because of that independence).

The probability of obtaining a series of 101 heads in a row is even smaller, so people assume that the 101st toss will probably give "tails".

But they forget that the probability of a 101-toss series where the first 100 are heads and the 101st is tails is exactly *the same* as the probability of a series of 101 heads.

They compare the probability of a series of 101 heads against the probability of a series of 100 heads, instead of comparing it against the series where the first 100 are heads and the 101st is tails.

It is a choice bias (we have a tendency to compare the things that are easier, not more pertinent, to compare).
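
The equality of those two 101-flip sequence probabilities is exact, not just approximate. A tiny check with Python's `fractions` (variable names are mine):

```python
from fractions import Fraction

p = Fraction(1, 2)                    # fair coin
seq_101_heads = p ** 101              # HHH...H (101 heads)
seq_100_heads_1_tail = p ** 100 * p   # HHH...HT (100 heads, then tails)
run_of_100_heads = p ** 100           # just the first 100 flips all heads

# The two specific 101-flip sequences are exactly equally likely...
assert seq_101_heads == seq_100_heads_1_tail == Fraction(1, 2**101)
# ...while the mistaken comparison is against the shorter, twice-as-likely event:
assert run_of_100_heads == 2 * seq_101_heads
print("both 101-flip sequences:", seq_101_heads)
```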