Just a thought experiment. I’ve got a thousand people and I ask them all the following question:
“Choose a number between 0 and 100. The aim is to choose a number which is two-thirds of the average choice.”
A reasonable place to start would be to assume that everybody else picks a number uniformly at random between 0 and 100. Under that assumption, the average of all the numbers chosen would be 50, so you should choose two-thirds of that, which is roughly 33.
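If it helps to see that first step concretely, here's a quick simulation. It's only a sketch of the assumption above, namely that all thousand players really do pick uniformly at random:

```python
import random

# Simulate 1000 players each picking uniformly at random in [0, 100].
picks = [random.uniform(0, 100) for _ in range(1000)]

# The average of uniform picks should land near 50,
# so two-thirds of it should land near 33.
average = sum(picks) / len(picks)
print(f"average pick:            {average:.1f}")
print(f"two-thirds of the average: {2 * average / 3:.1f}")
```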
What’s the possible flaw?