Dr Ben O'Neill
This seminar examines some recent research on prediction problems involving sequences of values with a fixed finite range (e.g., coin flips, dice rolls, roulette wheel spins). Such sequences are commonly modelled in statistics as binomial or multinomial processes. Of particular interest are sequences designed to give equal likelihood to every outcome. For this kind of sequence we examine the "gambler's fallacy", which rests on the idea that outcomes must 'even out' over time, so that observed deviations in one direction should (allegedly) lead us to predict later deviations in the other direction. We examine a standard rebuttal to this argument and consider whether it is correct. We put forward an optimal prediction method for such sequences and examine its accuracy, comparing it both to the gambler's method and to an idealised method based on knowledge of the underlying bias in the process. Through this analysis we obtain an optimal method with sound foundations and good long-run properties, converging in accuracy to the idealised method. The seminar should elucidate the gambler's fallacy and give some insight into prediction problems.
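The comparison described above can be illustrated with a small simulation. The sketch below is not the seminar's actual method: the "empirical" predictor (guess the outcome seen more often so far) stands in as a simple proxy for an optimal frequency-based method, the "gambler" predictor encodes the fallacy (guess the outcome seen less often), and the "idealised" predictor knows the true bias `p`. All names and the bias value `p = 0.6` are illustrative assumptions.

```python
import random

def simulate(n=10000, p=0.6, seed=0):
    """Simulate n biased coin flips (1 = heads with probability p) and
    score three predictors, each guessing before the flip is revealed."""
    rng = random.Random(seed)
    heads = 0  # running count of heads observed so far
    correct = {"gambler": 0, "empirical": 0, "idealised": 0}
    for t in range(n):
        # Gambler's-fallacy predictor: expects deviations to 'even out',
        # so it predicts the outcome that has occurred LESS often so far.
        gambler = 0 if heads * 2 > t else 1
        # Empirical-frequency predictor (illustrative proxy for an optimal
        # method): predicts the outcome that has occurred MORE often so far.
        empirical = 1 if heads * 2 >= t else 0
        # Idealised predictor: knows the true bias p and always predicts
        # the more likely outcome.
        idealised = 1 if p >= 0.5 else 0
        flip = 1 if rng.random() < p else 0
        for name, guess in (("gambler", gambler),
                            ("empirical", empirical),
                            ("idealised", idealised)):
            correct[name] += (guess == flip)
        heads += flip
    return {k: v / n for k, v in correct.items()}

if __name__ == "__main__":
    print(simulate())
```

With a biased coin the empirical predictor's long-run accuracy approaches that of the idealised predictor, while the gambler's predictor, betting against the observed trend, does systematically worse, which mirrors the convergence result the abstract describes.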