tag:blogger.com,1999:blog-21043237351530930072017-07-25T19:17:16.300+02:00EconomatheekEconomics, math, statistics and other essential parts of life.Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.comBlogger18125tag:blogger.com,1999:blog-2104323735153093007.post-11572844191282142202010-02-19T01:02:00.003+01:002010-02-19T01:03:31.218+01:00Augmented-reality mapsI think you may find this interesting:<br /><br /><object width="446" height="326"><param name="movie" value="http://video.ted.com/assets/player/swf/EmbedPlayer.swf"></param><param name="allowFullScreen" value="true" /><param name="wmode" value="transparent"></param><param name="bgColor" value="#ffffff"></param> <param name="flashvars" value="vu=http://video.ted.com/talks/dynamic/BlaiseAguerayArcas_2010-medium.mp4&su=http://images.ted.com/images/ted/tedindex/embed-posters/BlaiseAgueraYArcas-2010.embed_thumbnail.jpg&vw=432&vh=240&ap=0&ti=766&introDuration=16500&adDuration=4000&postAdDuration=2000&adKeys=talk=blaise_aguera;year=2010;theme=the_creative_spark;theme=a_taste_of_ted2010;theme=new_on_ted_com;event=TED2010;&preAdTag=tconf.ted/embed;tile=1;sz=512x288;" /><embed src="http://video.ted.com/assets/player/swf/EmbedPlayer.swf" pluginspage="http://www.macromedia.com/go/getflashplayer" type="application/x-shockwave-flash" wmode="transparent" bgColor="#ffffff" width="446" height="326" allowFullScreen="true" 
flashvars="vu=http://video.ted.com/talks/dynamic/BlaiseAguerayArcas_2010-medium.mp4&su=http://images.ted.com/images/ted/tedindex/embed-posters/BlaiseAgueraYArcas-2010.embed_thumbnail.jpg&vw=432&vh=240&ap=0&ti=766&introDuration=16500&adDuration=4000&postAdDuration=2000&adKeys=talk=blaise_aguera;year=2010;theme=the_creative_spark;theme=a_taste_of_ted2010;theme=new_on_ted_com;event=TED2010;"></embed></object>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com1tag:blogger.com,1999:blog-2104323735153093007.post-31751978538954550392010-01-11T12:32:00.003+01:002010-01-11T12:40:03.819+01:00Architecture is artThis is a breathtaking video on architecture through the viewpoint of a photographer. Somewhat surrealistic and fully computer generated.<br /><br /><object width="400" height="225"><param name="allowfullscreen" value="true" /><param name="allowscriptaccess" value="always" /><param name="movie" value="http://vimeo.com/moogaloop.swf?clip_id=7809605&server=vimeo.com&show_title=1&show_byline=1&show_portrait=0&color=&fullscreen=1" /><embed src="http://vimeo.com/moogaloop.swf?clip_id=7809605&server=vimeo.com&show_title=1&show_byline=1&show_portrait=0&color=&fullscreen=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" width="400" height="225"></embed></object><p><a href="http://vimeo.com/7809605">The Third & The Seventh</a> from <a href="http://vimeo.com/user1337612">Alex Roman</a> on <a href="http://vimeo.com">Vimeo</a>.</p><br /><br />Although I've embedded this video, you really shouldn't watch it here. 
You should <a href="http://vimeo.com/7809605?hd=1">watch it in fullscreen on Vimeo</a>.Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-41644622740723111382009-09-20T15:49:00.009+02:002009-09-22T21:15:13.463+02:00Business Myths<a href="http://www.wisebread.com/">Wise Bread</a> is an interesting blog, which featured an article named <a href="http://www.wisebread.com/10-myths-non-business-people-believe-about-business">10 Myths Non-Business People Believe About Business</a> by Joshua Ritchie a few days ago. In this article, Joshua refutes some persistent ideas about business that are often viewed as old truths. My general opinion on the article is that it's good, but it loses credibility on some points. Let me give an example.<br /><br /><span style="font-weight: bold;">Myth number 4: "Prices are whatever businessmen arbitrarily decide to charge."</span><br /><br />In his article, Joshua turns against the myth that prices reflect greed. When prices rise, it's supposed to be because the selling company wants to squeeze an extra profit out of the product or service. While this certainly is a myth, the explanation he gives is only half the story and actually serves to reinforce another, related, myth.<br /><br />Joshua argues that, instead of reflecting businesses' greed, prices reflect the cost of producing their product or service. When demand for oil is high, Joshua argues, a gas company has to bid higher for the raw material for their product. Consequently, they have to raise gas prices in order to make a profit. This can be true sometimes, but an important piece of the puzzle is still missing.<br /><br />We have to remember that companies charge whatever they can charge for a certain quantity of goods or services. 
When our gas company charged $2.50 for a gallon of gas, it was because they expected to sell a certain quantity of gas at that price and that the expected quantity times that price would give them the best profit. If they lowered the price, they'd sell more, but they still wouldn't make the same amount of money (or at least their key figures wouldn't turn out as favorable), and if they raised the prices, they wouldn't sell as much even though each gallon would be more profitable.<br /><br />This relationship is easy to see if we imagine the extremes: giving out free gas on one end and charging infinitely high prices on the other end. Giving out free gas (charging $0) would make no profit at all even though they'd "sell" fantastic quantities. On the other hand, charging infinitely high prices wouldn't make a profit either, because no one would buy. The optimum is somewhere in between.<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://4.bp.blogspot.com/_cKzFMFmX45I/SrY891S4HVI/AAAAAAAAATU/M7HJo0-sRh8/s1600-h/Optimum+price.gif"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 236px; height: 61px;" src="http://4.bp.blogspot.com/_cKzFMFmX45I/SrY891S4HVI/AAAAAAAAATU/M7HJo0-sRh8/s320/Optimum+price.gif" alt="" id="BLOGGER_PHOTO_ID_5383557437543947602" border="0" /></a><center><span style="font-size:85%;">Prices on the X-axis, profit on the Y-axis</span></center><br />Now, businesses don't charge prices for the sole purpose of covering their costs. It's the other way around; they take on costs necessary to acquire or maintain sales. No sound company has ever had a business meeting where they've said "OK, we have all of those costs, now what should we do in order to get our money back?". 
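Going back to the inverted-U profit curve for a moment, the "optimum somewhere in between" claim is easy to illustrate with a toy calculation. This is only a sketch: the linear demand function and every figure in it are invented for illustration.

```python
# Toy illustration of the optimum price between the two extremes.
# The demand curve and all figures are made up; real demand has to
# be estimated from market data.

def quantity_demanded(price):
    """Hypothetical demand: gallons sold per day at a given price."""
    return max(0.0, 1000.0 - 200.0 * price)

def profit(price, unit_cost=1.50, fixed_costs=300.0):
    """Profit = margin per gallon * gallons sold - fixed costs."""
    return (price - unit_cost) * quantity_demanded(price) - fixed_costs

# Both extremes lose money, just as argued above:
print(profit(0.0))   # free gas: huge volume, big loss
print(profit(5.0))   # prohibitive price: no sales, still a loss

# Scan candidate prices for the profit-maximizing one.
prices = [cents / 100.0 for cents in range(0, 501)]
best = max(prices, key=profit)
print(best, profit(best))   # the optimum lies strictly in between
```

Notice that costs enter only as inputs to the profit calculation; no step in it asks "how do we cover our costs?".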
That cost-first reasoning is backwards.<br /><br />The real questions to ask yourself when setting the price are:<br /><span style="font-style:italic;"><br />-- How many units can we sell at a specific price?<br />-- How much money does that earn us?<br />-- What are the variable costs of producing that many units? <br />-- How much is left to cover our fixed costs after variable costs have been accounted for?<br />-- What are our fixed costs?<br />-- How much profit is left after fixed costs are accounted for?</span><br /><br />The costs that Joshua Ritchie talks about in his article are mainly the variable costs. Depending on the answers to the other questions, they may have a big or small impact. Sometimes they're almost irrelevant. In some markets, raising prices would yield such a drop in sales that this factor outweighs the cost factor. Rising costs may call not for raising prices, but rather for canceling the product if there's no longer room for good-enough profit.<br /><br />The main point is this: Whenever you feel like saying "This item is over-priced, it can hardly cost a tenth of this much to produce", remember that businesses don't set prices in order to cover costs. They set whatever prices they can in order to maximize profit, or actually to maximize utility (where profit is a main ingredient, but other key figures come into play as well). As long as people are buying and they make a good profit, prices are reasonable.Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-88195936957575651672009-03-20T19:28:00.002+01:002009-03-20T19:38:31.109+01:00Meta blog“Meta blogging” translates to something like “blogging about blogging”. I don't like it. The reason is that, when I read a blog, I'm not interested in the blog itself. I'm interested in the topic of the blog. 
I don't care what the author has been up to recently and I'm not interested in whatever adventures he pulls himself through in order to splash his letters onto my screen. I suspect that my readers share that indifference to information relating to my persona and, thus, I'd like to keep any such uninteresting details to a minimum (I'm not doing a particularly good job here).<br /><br /><span style="font-weight: bold;">Objective meta-meta blog</span><br /><br />Many meta blog entries are concerned with the responsibility of the author and the fact that the author hasn't managed to actually author anything for a while. The readers are supposed to have been subjected to great distress in the absence of quality content to digest off the pages of the blog in question, and therefore it lies within the author's responsibility to feed their hunger for information with a steady flow of well-thought-out entries.<br /><br /><span style="font-weight: bold;">Subjective meta blog</span><br /><br />My readers are a sparse collection of nerds, friends, acquaintances, my brother and hopefully a few others (please say “hi” in the comments). However, I believe that my responsibility as a writer is proportional to the size of my reader base. Because of the small size of that reader base, I believe my responsibility is limited to not lying and not deliberately spreading misinformation on these pages. I'm sure I don't have to produce a certain quantity of text in order to meet the expectations of my readers. And I haven't produced much of anything here lately. In the future, however, the updating of this blog will probably be just like my reader base – sparse. Maybe I'll post every two weeks or something along those lines.<br /><br />After all, I write for my own pleasure. This blog fulfills a need to discuss some topics that I have no other medium for, even if it means I'll discuss them with myself. 
And it helps me practice my English and general language skills.<br /><br />So, hopefully you'll hear from me in the future and I'll be able to sense your presence from the slow ticking of the stat counter or from the precious few comments to my entries. Until then, have a nice day.<br /><br />Oh, and I apologize for this entry.Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com2tag:blogger.com,1999:blog-2104323735153093007.post-30297065228253636942008-11-25T06:24:00.009+01:002015-07-30T17:28:49.709+02:00Conditional ProbabilityIn <a href="http://economatheek.blogspot.com/2008/11/probability-theory-for-dummies.html">my last entry on probability theory</a>, I promised to have a more detailed look at conditional probability. We need this for solving the <a href="http://economatheek.blogspot.com/2008/11/quiz-probability-of-having-disease.html">disease test problem</a> and for solving the Monty Hall problem mathematically.<br /><br />P(A | B) reads "the probability of A given B", and this is referred to as conditional probability. The formula for calculating this probability is<br /><br />P(A | B) = P(A AND B) / P(B)<br /><br />Why?<br /><br />We're trying to figure out the probability that the event A is also true, given that B is true. Sometimes A may be true even though B is not, but we're not interested in those instances.<br /><br /><span style="font-weight: bold;">Independent events</span><br /><br />Now, sometimes the probability of A is independent of B. For example, suppose we flip two coins [A, B], where each event A and B is true if the corresponding coin comes up heads. If B is true (that is, coin B comes up heads), the probability of A is still 50% (1). 
This means that<br /><br />P(A | B) = P(A)<br /><br />where P(A) is the a-priori probability and the conditional probability is unchanged due to the information that we gained from flipping the coin B.<br /><br /><span style="font-weight: bold;">Dependent events</span><br /><br />But what if<br /><br />P(A | B) != P(A)<br /><br />("!=" reads "does not equal".) In this case, A is dependent on B. What this means is that, as we gain information about B, the probability of A changes from the a-priori probability. In this case, we need to consider all the cases when B is true:<br /><br />P(A AND B) + P(NOT A AND B) = P(B)<br /><br />Those are all of the instances where B is true. Now, given that B is true, out of all the instances where B is true [P(B)], some of them are instances where A is also true [P(A AND B)]:<br /><br />P(A | B) = P(A AND B) / P(B)<br /><br /><span style="font-weight: bold;">An example</span><br /><br />Let's say we have drawn three cards from a deck: An ace, a king and a queen [A, K, Q]. We shuffle those three cards and draw two of them, trying to draw an ace. Let's say that the first card is not an A. What's the probability that the second one is?<br /><br />First, let's define two events:<br /><br />Card1 is the event that the first card is an A<br />Card2 is the event that the second card is an A<br /><br />P(Card1) = 1/3<br />P(NOT Card1) = 2/3<br />P(Card2 AND NOT Card1) = 1/3<br /><br />The last probability is easy to see from an a-priori standpoint: the probability that any one of the drawn cards will be an A is 1/3. 
So the probability that that one card is an A and the other one is not is also 1/3.<br /><br />P(Card2 | NOT Card1) = P(Card2 AND NOT Card1) / P(NOT Card1)<br />P(Card2 | NOT Card1) = (1/3) / (2/3) = 1/2<br /><br />More intuitively, this can be illustrated as follows:<br /><br /><table border="1"><tbody><tr><td colspan="2"><br /></td><td colspan="3" style="font-weight: bold;">Card 2</td></tr><tr><td colspan="2"><br /></td><td style="font-weight: bold;">A</td><td style="font-weight: bold;">K</td><td style="font-weight: bold;">Q</td></tr><tr><td rowspan="3" style="font-weight: bold;">Card 1</td><td style="font-weight: bold;">A</td><td>0</td><td style="color: red;">1/6</td><td><span style="color: red;">1/6</span></td></tr><tr><td style="font-weight: bold;">K</td><td><span style="color: #33cc00;">1/6</span></td><td>0</td><td><span style="color: #33cc00;">1/6</span></td></tr><tr><td style="font-weight: bold;">Q</td><td><span style="color: #33cc00;">1/6</span></td><td><span style="color: #33cc00;">1/6</span></td><td>0</td></tr></tbody></table><br />As we can see in this matrix, there are six possible combinations of two cards. [AA, KK, QQ] are not possible, since there's only one card of each rank. Each possible combination has a 1/6 probability of occurring. [AK, AQ] are the possible combinations where the first card is an A. In the matrix, we can find the probabilities stated earlier:<br /><br />P(Card1) = 2 * 1/6 = 1/3<br />P(NOT Card1) = <span style="color: #33ff33;">4 * 1/6</span> = 2/3<br />P(Card2 AND NOT Card1) = <span style="color: red;">2 * 1/6</span> = 1/3<br /><br />Asking what P(Card2 | NOT Card1) is, is the same as asking "how big a fraction of the times that we don't pick an A as our first card do we pick an A as our second card?". We can easily see in the matrix that there are 4 cases (marked as green) where we don't pick an A as our first card. In two of those cases our second card is an A. 2/4 = 1/2. 
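For readers who prefer code to matrices, the same numbers fall out of brute-force enumeration. A minimal Python sketch of the card example:

```python
# Enumerate all equally likely ordered draws of two cards from [A, K, Q]
# and condition on the first card not being the ace.
from itertools import permutations

draws = list(permutations(["A", "K", "Q"], 2))  # the 6 cases in the matrix

not_ace_first = [d for d in draws if d[0] != "A"]        # P(NOT Card1) = 4/6
ace_second = [d for d in not_ace_first if d[1] == "A"]   # P(Card2 AND NOT Card1) = 2/6

# P(Card2 | NOT Card1) = P(Card2 AND NOT Card1) / P(NOT Card1)
print(len(ace_second) / len(not_ace_first))  # 0.5
```

Conditioning is just throwing away the rows of the matrix that the evidence rules out and renormalizing over what's left.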
But also, 2*(1/6) / 4*(1/6) = 1/2.<br /><br /><span style="color: red;">2*(1/6)</span> / <span style="color: #33ff33;">4*(1/6)</span> = P(Card2 AND NOT Card1) / P(NOT Card1)<br /><br />Stated in words, in 2 out of 4 cases when the first card is not an A (out of a total of 6 possible cases, which includes draws where the first card is an A), the second card is an A. So, knowing that the first card is not an A, we can narrow the situation down to those 4 cases, giving us a probability of 2/4 = 1/2.<br /><br /><span style="font-style: italic;">I hope this entry has helped your understanding of how conditional probability works. It's not very formal, and it's not very extensive, but hopefully it's quite intuitive and at least free of any big gaps in its logic. However, it's late now, and I kind of just threw this one out there, because I haven't posted anything for a while.</span><br /><br /><span style="font-size: 78%;">__________<br /><span style="font-weight: bold;">Notes:</span><br />(1) If you think otherwise, you're subject to the gambler's fallacy, which we'll have a closer look at in a future post.</span>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com1tag:blogger.com,1999:blog-2104323735153093007.post-74052587056481871532008-11-19T10:07:00.006+01:002008-11-25T10:09:54.950+01:00Soccer Penalty Kicks Article Flawed?<div style="text-align: justify;">In <a href="http://economatheek.blogspot.com/2008/11/maths-of-soccer-penalty-kicks.html">my last post on this subject</a>, I made a quick reference to <a href="http://www.slate.com/id/2144182/pagenum/all/">an article</a> on a mathematical examination of soccer penalty kicks. In this article, Tim Harford gives a brief survey of the findings of a paper by <a href="http://www.econ.brown.edu/fac/ipalacios/">Ignatio Palacios-Huerta</a> of Brown University. 
The paper (<a href="http://www.econ.brown.edu/fac/ipalacios/pdf/professionals.pdf">pdf</a>) is quite an interesting read, and I really recommend that anyone with some knowledge of statistics and game theory read it. However, I do believe I've detected a flaw in it. Before proceeding any further, I should include all the standard disclaimers, including, but not limited to, the fact that I'm neither an authority nor an expert in this area, and that there is a chance that I've misunderstood things. I have all due respect for Mr Palacios-Huerta as a scientist and for Mr Harford as a writer, and I'm merely a layman myself.<br /><br />Anyway. After writing my first entry on the article by Tim Harford, I got to thinking. Quoting from Tim Harford's article:<br /><blockquote>Professionals such as the French superstar Zinédine Zidane and Italy's goalkeeper Gianluigi Buffon are apparently superb economists: Their strategies are absolutely unpredictable, and, as the theory demands, they are equally successful no matter what they do, indicating that they have found the perfect balance among the different options. These geniuses do not just think with their feet.</blockquote>At first, this seemed to be a good indication that Zidane and Buffon are indeed playing optimal strategies. But what hit me after writing my first entry is that their playing optimal strategies doesn't make <span style="font-style: italic;">them</span> indifferent between their strategy choices. That is, their playing optimally doesn't make them succeed equally often no matter what they do. It does, however, make <span style="font-style: italic;">their opponents</span> indifferent between their strategy choices.<br /><br />The optimal strategy is about making your opponent indifferent between his strategy choices. Recall, from my previous post on this subject, how the indifference equations for each player included the strategy choices for the other player, but not his own strategy choices. 
This relationship works two ways: Your playing optimally doesn't make you indifferent, and your indifference is not an indication that you're playing optimally.<br /><br />So the Harford article is wrong. The fact that Zinédine Zidane and Gianluigi Buffon seem to be indifferent does not indicate that they play optimal strategies. It does, however, indicate that <span style="font-style: italic;">their opponents</span>, on an aggregate level, are playing optimally.<br /><br /><span style="font-weight: bold;">Now, is this Tim Harford's or Ignatio Palacios-Huerta's </span><span style="font-weight: bold;">mistake</span><span style="font-weight: bold;">?</span> In order to find out, I read the original paper by Mr Palacios-Huerta.<br /><br />In the paper, Palacios-Huerta starts by formulating a hypothesis saying that professional players are indeed playing a minimax strategy. In order to test this hypothesis, he examines a sample of 1417 penalty kicks. I have no objections to his examination on all the players on an aggregate level. However, when testing the hypothesis for individual players, he seems to be looking at each individual player's strategy choices and their corresponding outcomes. Using Pearson statistics and p-values, based on those figures, the hypothesis is rejected for five players.<br /><br />On an aggregate level, we can look at the overall figures of both sides of the game. According to the hypothesis, both goalies and kickers should have equal success rates, no matter their choices. This can be tested and the hypothesis rejected with the tools used by Palacios-Huerta. But when testing the hypothesis for individual players, we should look at that individual player's aggregated opponents' success rates for their strategy choices, which, it seems to me, is not what he's done.<br /><br />So what hypothesis should we reject when the individual figures used by Palacios-Huerta don't give a good enough match with the hypothesis? 
Well, not the one that that particular player is playing minimax, but rather the one that <span style="font-style: italic;">his opponents</span>, on an aggregate level, play minimax. This is not, in itself, an uninteresting hypothesis to examine, but, as far as I can see, it's not the one intended by Palacios-Huerta.<br /><br />Unfortunately, the tables provided in the paper don't allow for the data to be rearranged so that we can perform this test on our own. There is no information on the strategy choices of the opponents of each individual player and their corresponding outcomes, so we can't examine the hypothesis that a specific individual player plays optimally, without accessing the underlying data.<br /><br /><span style="font-weight: bold;">So, what do we know about Zinédine Zidane and Gianluigi Buffon?</span> Not much, but it seems they've been playing against superb economists.<br /><span style="font-size:78%;">__________</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">Notes:</span></span><br /><span style="font-size:78%;">Again, let me remind you of the disclaimers. I'm really a layman, and I may very well be wrong. Either all wrong or just in my interpretation of the paper.</span><br /><br /><span style="font-weight: bold;font-size:78%;" >External links in this post:</span><br /><span style="font-size:78%;"><a href="http://www.slate.com/id/2144182/pagenum/all/">World Cup Game Theory</a> - What economics tells us about penalty kicks by Tim Harford. 
The quoted article in Slate Magazine.</span><br /><span style="font-size:78%;"><a href="http://www.econ.brown.edu/fac/ipalacios/">Ignatio Palacios-Huerta</a> at the Brown University website</span><br /><span style="font-size:78%;"><a href="http://www.econ.brown.edu/fac/ipalacios/pdf/professionals.pdf">Professionals Play Minimax</a> by Ignatio Palacios-Huerta of Brown University (pdf format)</span><br /><br /><span style="font-weight: bold;font-size:78%;" >Other resources:</span><br /><span style="font-size:78%;"><a href="http://timharford.com/">Tim Harford</a> - The Undercover Economist</span><br /></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-61344402800875727282008-11-15T13:11:00.005+01:002008-11-15T14:03:35.131+01:00The Monty Hall Problem, Part 4Hopefully, we all agree that we should switch when faced with the problem given in the basic formulation of the Monty Hall problem. However, in the alternative formulation given in <a href="http://economatheek.blogspot.com/2008/11/who-is-economatheek.html">my first entry</a>, there's one major difference. In the original game, the host was obliged to reveal a second door after watching you pick one. In my version of the game, I hadn't made any such commitment. So, why would I give you a chance to change your mind? Possibly out of generosity, sure, but most probably because I knew you had made the right choice and wanted you to switch to an empty cup.<br /><br />Put in game-theory terms, switching is a dominated strategy. Your strategy choices are to switch when given the opportunity or to never switch (switch / stay). I have more strategy choices than you do. This is a full payoff matrix of the game for all possible strategy choices, with your choices represented as columns and my choices as rows. 
The outcome values are the probabilities of you winning the bill.<br /><br /><table border="0"><br /><tr><td><br /></td><td>Switch</td><td>Stay</td></tr><br /><tr><td>No-No</td><td>1/3</td><td>1/3</td></tr><br /><tr><td>No-Yes</td><td>1<br /></td><td>1/3</td></tr><br /><tr><td>Yes-No</td><td>0</td><td>1/3</td></tr><br /><tr><td>Yes-Yes</td><td>2/3</td><td>1/3</td></tr><br /></table><br /><br />My strategy choice "Yes-No", for example, means that I offer you an opportunity to switch if you choose the right cup initially, but I don't offer you that opportunity if you choose the wrong one. So the first Yes or No refers to whether I offer you that opportunity when you choose the right cup, and the second one to the case when you choose the wrong one.<br /><br />Notice that all of my strategy choices but "Yes-No" (offering the opportunity to switch only when you've picked the right cup) are dominated. This means that they can never lead to better results, only worse, depending on your strategy choice. So there is no reason for me to choose any of those strategies. Thus removing those strategy choices, we get a much simpler payoff matrix:<br /><br /><table border="0"><br /><tr><td><br /></td><td>Switch</td><td>Stay</td></tr><br /><tr><td>Yes-No</td><td>0</td><td>1/3</td></tr><br /></table><br /><br />It should now be obvious that switching is a dominated strategy. So in a game-theory sense, switching is a bad strategy. However, game theory isn't everything. Maybe you have a "read" on me, making you believe that I want you to have the bill. Maybe you think that I intended to always give you the switching opportunity as in the original Monty Hall problem. So there may be reasons to deviate from game-theory optimal play. 
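Leaving such reads aside, the dominance argument can also be checked mechanically. Here is a sketch that scans the full payoff matrix above for dominated host strategies (remember that I, the host, want your winning probability to be low):

```python
# Winning probabilities for you, taken straight from the payoff matrix:
# host strategy -> (payoff vs. Switch, payoff vs. Stay).
from fractions import Fraction as F

payoffs = {
    "No-No":   (F(1, 3), F(1, 3)),
    "No-Yes":  (F(1),    F(1, 3)),
    "Yes-No":  (F(0),    F(1, 3)),
    "Yes-Yes": (F(2, 3), F(1, 3)),
}

def dominated(row):
    """A host strategy is dominated if some other strategy gives you a
    winning probability at least as low against both of your strategies,
    and strictly lower against at least one (the host prefers low)."""
    a = payoffs[row]
    return any(
        all(y <= x for x, y in zip(a, b)) and any(y < x for x, y in zip(a, b))
        for other, b in payoffs.items() if other != row
    )

for row in payoffs:
    print(row, "dominated" if dominated(row) else "undominated")
```

Only "Yes-No" survives the scan, which is exactly the reduced payoff matrix above.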
But lacking such guidance, you're probably better off resorting to game theory, in this case guaranteeing you a 1/3 chance to win the prize.Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-64172369346351997522008-11-14T07:00:00.001+01:002008-11-14T07:00:01.150+01:00Quiz: The Probability of Having a DiseaseA new super-resistant virus has emerged in Farawayistan. The Ministry of Health estimates that 1 out of 1000 people are infected. After immense efforts, a very accurate test has been developed. If the test subject is infected, 99.9% of the time the test will come out positive. But if the subject is not infected, 1 in 1000 times the test will yield a false positive result.<br /><br />A random person is tested, and the test comes out positive. What's the probability that he's infected?<br /><br />Please, answer in the comments.Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com2tag:blogger.com,1999:blog-2104323735153093007.post-88135285329161456082008-11-13T07:00:00.003+01:002008-11-25T09:00:26.851+01:00The Maths of Soccer Penalty Kicks<div style="text-align: justify;">In a very interesting article, titled <a href="http://www.slate.com/id/2144182/?nav=tap3">World Cup Game Theory - what economics tells us about penalty kicks</a>, Financial Times columnist Tim Harford gives a simple and intuitive introduction to how to apply game theory to soccer penalty kicks. I really recommend reading his article. (After writing this entry, I detected a flaw in this article. <a href="http://economatheek.blogspot.com/2008/11/soccer-penalty-kicks-article-flawed.html">More on that here</a>.)<br /><br />In this entry, I'll examine the game theory of penalty kicks in some more detail, and we'll arrive at an actual formula for each player of the game. 
Due to the mathematical nature of this examination, there will be some math that may look dense and off-putting, but it looks more complicated than it actually is. As an attempt to make it easier to follow, I've used some color coding. Maybe this serves only to make it look messier. Please tell me what you think in the comments.<br /><br /><span style="font-weight: bold;font-size:130%;" >Now, let's get started</span><br /><br />A penalty kick can be reduced into a simple grid game. There are two players: The penalty kicker, who has the choice of which direction to shoot, and the goalie, who has the choice of which way to throw himself.<br /><br />As Tim Harford points out, there is not enough time for the goalie to see which way the ball is going and to subsequently choose to go in that direction in order to intercept the ball. He must guess, risking going in the completely wrong direction. So the goalie's choice of direction is independent of the shooter's choice (1).<br /><br />Most shooters have a stronger and a weaker side, and should tend to favor their stronger side. But if they always choose to aim at the stronger side, the goalie can exploit this by always going in that direction. The shooter can then counter-exploit this by shooting at the other side, where he will almost certainly score a goal even though it's his weaker side, since the goalie will be going the other way. The goalie then reacts to this, and we have a never-ending loop of exploitations and counter-exploitations. To solve this problem, we need to find a game-theory optimal solution that offers an equilibrium to the game.<br /><br /><span style="font-weight: bold;font-size:130%;" >Game-theory optimal play</span><br /><br />If both players play game-theory optimally, neither one can improve his chances by altering his strategy. If he could, his strategy wouldn't have been optimal. 
In the same fashion, the optimal strategy is unexploitable, since the opponent can't gain an additional edge on it by switching from his optimal strategy. Remember how the shooter could elect to always shoot to his weaker side to exploit a strategy where the goalie always goes to the shooter's stronger side? That means that the goalie's strategy wasn't optimal.<br /><br />Now, if neither player can gain an edge by altering their strategy, then, by definition, they're <span style="font-style: italic;">indifferent</span> between their choices. If they weren't indifferent, one choice would be better than the other, and the player would gain an edge by opting for that choice. So, in order to find the game-theory optimal strategies, we should look for indifference points.<br /><br /><span style="font-weight: bold;font-size:130%;" >Strategy values</span><br /><br />This game can easily be summarized into a grid as follows:<br /><br /></div><table style="text-align: left; margin-left: 0px; margin-right: 0px;" border="0" width="200"><br /><tbody><tr><td><br /></td><td>GS</td><td>GW</td></tr><br /><tr><td>SS<br /></td><td style="color: rgb(255, 0, 0);">50%</td><td style="color: rgb(153, 102, 51);">95%</td></tr><br /><tr><td>SW<br /></td><td style="color: rgb(0, 0, 255);">80%</td><td style="color: rgb(0, 153, 0);">30%</td></tr><br /></tbody></table><div style="text-align: justify;"><br />where the rows represent the shooter's strategy choices and the columns represent the goalie's strategy choices. Each cell corresponds to a combination of the choices of both players, and the figure is the corresponding chance of a goal. GS means that the goalie throws himself in the shooter's strong direction, and GW that he opts for the shooter's weak side. SS and SW relate to the corresponding shooting strategies.<br /><br />Now, those figures are just made up for the purpose of illustration. I don't claim that they're realistic in any way. 
Notice, though, that I've taken into account the chance that the shooter misses the goal even if the goalie goes the wrong way, and that there is a bigger risk for this when he shoots to his weaker side. Notice, also, that the chance of a goal is greater if he opts for the stronger side and the goalie goes the right way, than if he opts for the weaker side and the goalie goes that way.<br /><br />So, we have 2 strategies for each player: SS and SW for the shooter, and GS and GW for the goalie. We have 4 strategy pairs, <span style="color: rgb(255, 0, 0);">SSGS</span>, <span style="color: rgb(153, 102, 51);">SSGW</span>, <span style="color: rgb(0, 0, 255);">SWGS</span> and <span style="color: rgb(0, 153, 0);">SWGW</span>, with corresponding outcome values. (2)<br /><br /><span style="font-weight: bold;font-size:130%;" >Calculating the goalie's strategy</span><br /><br />If the shooter chooses the strategy SS S% of the time, he will choose the strategy SW 1-S% of the time (3). Similarly, if the goalie chooses the strategy GS G% of the time, he will choose strategy GW 1-G% of the time. 
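These two mixing frequencies are exactly what the algebra below solves for. As a numerical cross-check, the equilibrium can also be computed directly from the four outcome values (a minimal sketch; the variable names mirror the color-coded grid, and the code itself is my own illustration, not part of the original analysis):

```python
# Outcome values from the grid: probability of a goal for each strategy pair.
SSGS, SSGW = 0.50, 0.95  # shooter aims strong; goalie goes strong / weak
SWGS, SWGW = 0.80, 0.30  # shooter aims weak;  goalie goes strong / weak

# Indifference frequencies: G is how often the goalie goes to the shooter's
# strong side, S is how often the shooter aims at his strong side.
denom = SSGS + SWGW - SSGW - SWGS
G = (SWGW - SSGW) / denom
S = (SWGW - SWGS) / denom
print(round(G, 3), round(S, 3))  # 0.684 0.526

# At these frequencies each player's two options have the same expected
# goal probability, so neither player can gain by deviating.
E_SS = G * SSGS + (1 - G) * SSGW  # shooter aims strong
E_SW = G * SWGS + (1 - G) * SWGW  # shooter aims weak
E_GS = S * SSGS + (1 - S) * SWGS  # goalie goes strong
E_GW = S * SSGW + (1 - S) * SWGW  # goalie goes weak
assert abs(E_SS - E_SW) < 1e-12
assert abs(E_GS - E_GW) < 1e-12
```

Plugging in different outcome values shows how the mix shifts: the bigger the penalty for being caught on the strong side, the less often it should be chosen.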
We should now solve for S and G, which are the strategy variables for the two players.<br /><br />The expected value for the shooter of strategy SS is:<br /><br />E(SS) = G * <span style="color: rgb(255, 0, 0);">SSGS</span> + (1 - G) * <span style="color: rgb(153, 102, 51);">SSGW</span><br /><br />In plain English, this means that the shooter will obtain an outcome value of <span style="color: rgb(255, 0, 0);">SSGS</span> (50% in our example) the G% of times when the goalie chooses the strategy GS, and he'll obtain an outcome value of <span style="color: rgb(153, 102, 51);">SSGW</span> (95% in our example) the 1-G% of times when the goalie chooses the strategy GW.<br /><br />Similarly,<br /><br />E(SW) = G * <span style="color: rgb(0, 0, 255);">SWGS</span> + (1 - G) * <span style="color: rgb(0, 153, 0);">SWGW</span><br /><br />Now, here comes the magic. The shooter is indifferent when E(SS) = E(SW), as explained above. To find this point, we'll insert the two equations above into that equation:<br /><br />E(SS) = E(SW)<br />G * <span style="color: rgb(255, 0, 0);">SSGS</span> + (1 - G) * <span style="color: rgb(153, 102, 51);">SSGW</span> = G * <span style="color: rgb(0, 0, 255);">SWGS</span> + (1 - G) * <span style="color: rgb(0, 153, 0);">SWGW</span><br />G * <span style="color: rgb(255, 0, 0);">SSGS</span> + <span style="color: rgb(153, 102, 51);">SSGW</span> - G * <span style="color: rgb(153, 102, 51);">SSGW</span> = G * <span style="color: rgb(0, 0, 255);">SWGS</span> + <span style="color: rgb(0, 153, 0);">SWGW</span> - G * <span style="color: rgb(0, 153, 0);">SWGW</span><br />G * <span style="color: rgb(255, 0, 0);">SSGS</span> + G * <span style="color: rgb(0, 153, 0);">SWGW</span> - G * <span style="color: rgb(153, 102, 51);">SSGW</span> - G * <span style="color: rgb(0, 0, 255);">SWGS</span> = <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span><br />G * (<span style="color: rgb(255, 0, 0);">SSGS</span> + 
<span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span>) = <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span><br /><span style="font-weight: bold;">G = <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span> / (<span style="color: rgb(255, 0, 0);">SSGS</span> + <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span>)</span><br /><br />And that's the formula for the indifference point for the shooter. When the goalie chooses his actions according to this formula, the shooter will be indifferent to his strategy choices. Plugging our example figures into the formula:<br /><br />G = <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span> / (<span style="color: rgb(255, 0, 0);">SSGS</span> + <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span>)<br />G = 0.3 - 0.95 / (0.5 + 0.3 - 0.95 - 0.8)<br />G ~ 0.684<br /><br />The goalie should go for the shooter's stronger side about 68.4% of the time.<br /><br /><span><span style="font-weight: bold;font-size:130%;" >Calculating the shooter's strategy</span></span><br /><br /><span>Similarly, we calculate the expected values of the goalie's strategy choices:</span><br /><br />E(GS) = S * <span style="color: rgb(255, 0, 0);">SSGS</span> + (1 - S) * <span style="color: rgb(0, 0, 255);">SWGS</span> = S * <span style="color: rgb(255, 0, 0);">SSGS</span> - S * <span style="color: rgb(0, 0, 255);">SWGS</span> + <span style="color: rgb(0, 0, 255);">SWGS</span><br />E(GW) = S * <span style="color: rgb(153, 102, 51);">SSGW</span> + (1 - S) * <span style="color: rgb(0, 153, 0);">SWGW</span> = S * <span 
style="color: rgb(153, 102, 51);">SSGW</span> - S * <span style="color: rgb(0, 153, 0);">SWGW</span> + <span style="color: rgb(0, 153, 0);">SWGW</span><br /><br />Notice that high outcome values are bad for the goalie, as that means a large probability of a goal. But that doesn't matter to us. We don't care about the exact values of the strategies, as long as they're equal. Now for the indifference equation:<br /><br />E(GS) = E(GW)<br />S * <span style="color: rgb(255, 0, 0);">SSGS</span> - S * <span style="color: rgb(0, 0, 255);">SWGS</span> + <span style="color: rgb(0, 0, 255);">SWGS</span> = S * <span style="color: rgb(153, 102, 51);">SSGW</span> - S * <span style="color: rgb(0, 153, 0);">SWGW</span> + <span style="color: rgb(0, 153, 0);">SWGW</span><br />S * (<span style="color: rgb(255, 0, 0);">SSGS</span> + <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span>) = <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span><br /><span style="font-weight: bold;">S = <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span> / (<span style="color: rgb(255, 0, 0);">SSGS</span> + <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(153, 102, 51);">SSGW</span></span><span style="font-weight: bold;"> - <span style="color: rgb(0, 0, 255);">SWGS</span></span><span style="font-weight: bold;">)</span><br /><br />Again, plugging in the example figures:<br /><br />S = <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span> / (<span style="color: rgb(255, 0, 0);">SSGS</span> + <span style="color: rgb(0, 153, 0);">SWGW</span> - <span style="color: rgb(0, 0, 255);">SWGS</span> - <span style="color: rgb(153, 102, 51);">SSGW</span>)<br />S = 0.3 - 0.8 / (0.5 + 0.3 - 0.95 - 0.8)<br />S ~ 0.526<br /><br /><span style="font-weight: 
Conclusion">
bold;font-size:130%;" >Conclusion</span><br /><br />The shooter should opt for his stronger side only 52.6% of the time, while the goalie goes that way 68.4% of the time. Both players are then indifferent to their choices. This means that either one of them could choose any action and still obtain exactly the same result. But biasing towards one decision invites the opponent to exploit that bias, and that's why they should stick with the frequencies prescribed by the solution (4). They can't do better by changing, but they can do worse if their opponent catches on.<br /><br /><span style="font-weight: bold;">Next entry on this topic: </span><a style="font-weight: bold;" href="http://economatheek.blogspot.com/2008/11/soccer-penalty-kicks-article-flawed.html">Soccer Penalty Article Flawed?</a><br /><span style="font-size:78%;">__________</span><br /><br /><span style="font-size:78%;"><span style="font-weight: bold;">Notes:</span></span><br /><span style="font-size:78%;"><span style="font-weight: bold;">(1)</span> Actually, some goalies have developed an ability to anticipate, based on the shooter's movements prior to impact with the ball, in which direction the shooter will aim. This gives the goalie more time to decide which way to go. On the other hand, some shooters have developed a counter-technique of bluffing about which way they're going to aim, so this boils down to a game of its own. For simplicity, we'll assume that there are no prior indications as to which way the shooter will aim.</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">(2)</span> Here, we consider the probability of a goal, for each strategy pair, an outcome. 
Even if the eventual, actual outcome of the kick is still uncertain, the probability of a goal, given a specific strategy pair, serves as a value of that strategy pair for each player.</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">(3)</span> As explained in the entry <a href="http://economatheek.blogspot.com/2008/11/probability-theory-for-dummies.html">Probability Theory for Dummies</a>.</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">(4)</span> If the opponent isn't playing the optimal strategy, however, one might consider making exploitative adjustments to the optimal strategy. Bear in mind, though, that this opens one up to counter-exploitation, and should thus only be done if one doesn't expect the opponent to catch on.</span><br /><br /><span style="font-size:78%;"><span style="font-weight: bold;">External links in this post:</span></span><br /><span style="font-size:78%;"><a href="http://www.slate.com/id/2144182/?nav=tap3">World Cup Game Theory</a> - What economics tells us about penalty kicks</span><br /><span style="font-size:78%;">by </span><span class="byline"><span style="font-size:78%;">Tim Harford, posted on the website of <a href="http://slate.com/">Slate Magazine</a></span></span><br /></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com2tag:blogger.com,1999:blog-2104323735153093007.post-40613694877420794422008-11-12T07:00:00.003+01:002008-11-12T07:42:34.154+01:00The Monty Hall Problem, Part 3<div style="text-align: center;"><span style="font-size:78%;"><span style="font-weight: bold;">Previous entries in this series:</span><br /><a href="http://economatheek.blogspot.com/2008/11/who-is-economatheek.html">Part 1 </a>| <a href="http://economatheek.blogspot.com/2008/11/i-suppose-one-comment-for-my-very-first.html">Part 2</a><br /></span></div><br /><div style="text-align: justify;">In this post, we'll examine the Monty Hall problem intuitively. 
First, we'll start with the problem in its basic formulation:<br /><br />You're on a game show. Here are the rules of the game: There are three doors, and behind one of them there's a brand new car. Behind the two other doors, there are goats. You'll choose one of the doors, and subsequently, the game show host, who knows where the car is, will open one of the other doors, revealing a goat. Then you are given a choice: Will you stay with your initial choice or will you switch to the other unopened door?<br /><br />Many answer initially that it doesn't matter. They reason that, as there are only two doors left, one of which includes the car and one of which does not, the chance that either one veils the car is 1/2. Sensible as this may seem, it's incorrect. In fact, the probability that the other door veils the car is 2/3, so you should switch.<br /><br />How can this be? Of course, when we first picked a door, there was a 1/3 chance that we picked the right one. But when the game show host opens another door, doesn't that additional information change the situation? In fact, it doesn't. We don't gain any additional information from his opening of a door, since he'd open a door to reveal a goat whether or not we picked the right door to start with. We knew from the beginning that the probability was 1/3, and we haven't gained any additional information to change things, so we're still looking at a 1/3 probability.<br /><br />Here's another way of looking at it: 1/3 of the time, you'll pick the right door to start with. Staying will grant you the car, and switching gives nothing. But 2/3 of the time, you'll pick the wrong door, and the host will open the other door that also veils a goat. Those times, staying gives nothing, and switching grants you the car. So switching gives you the car in 2/3 of instances.<br /><br />In the next part of this series, we'll have a look at the mathematics of this game, but for now, let's leave it at this. 
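The 2/3 figure is also easy to verify empirically with a quick simulation of the rules above (a minimal sketch; the function and variable names are my own):

```python
import random

def play(switch: bool) -> bool:
    """One round of the game described above; returns True if you win the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host, who knows where the car is, opens a goat door you didn't pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
wins_switch = sum(play(switch=True) for _ in range(trials)) / trials
wins_stay = sum(play(switch=False) for _ in range(trials)) / trials
print(wins_switch)  # close to 2/3
print(wins_stay)    # close to 1/3
```

Changing the host so that he opens a random door, and discarding the rounds where he accidentally reveals the car, reproduces the 1/2 case mentioned below.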
As a side note, if the host didn't know where the car is, but he still opened a door at random and just happened to reveal a goat, then the chances would, indeed, be 1/2 and you'd be indifferent to switching.<br /><br /><span style="font-weight: bold;">In the alternative formulation</span> that I gave in <a href="http://economatheek.blogspot.com/2008/11/who-is-economatheek.html">my first entry</a>, I said that the correct decision is to stay with your initial choice. The formulation was slightly different, with only one significant difference. But what? I've had a few friends asking me about this on MSN, and one of them actually came up with the correct answer by himself. Can anyone figure out what that difference is, and why it changes things so drastically? Please comment.</div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-36571551255435219292008-11-11T07:00:00.004+01:002008-11-25T07:41:01.641+01:00Probability Theory for Dummies<div style="text-align: justify;">Before we start examining the Monty Hall problem in any kind of detail, let's look at some probability notation and theory. This entry is intended for those readers who aren't already altogether familiar with basic probability theory. Those who are may read this as a review, or skip reading today's post.<br /><br />I believe that about 50% of my readers are going to read this specific entry (1). The others may not be too interested in this particular topic, they may be deterred by the mathematical notation (don't let that deter <span style="font-style: italic;">you</span>, though), or they may already be very familiar with the topics in this entry. However, taking a random reader, the probability that he'll read this entry, given that my assumption is correct, is 50%. Put another way:<br /><br />P(Read) = 0.5<br /><br />"P" is the notation for probability. 
"Read" is the event that he reads the entry, and 0.5 is the decimal form of the probability of that event. In English, P(Read) = 0.5 reads "the probability of the event that he reads this entry equals 50%".<br /><br /><span style="font-weight: bold;">Conditional probability (2)</span><br /><br />However, I feel that at least 75% of my math-inclined readers probably are familiar with those topics and, thus, will skip reading this entry. So,<br /><br />P(NOT Read | Math-inclined) = 0.75<br /><br />"P" and "Read" are the same as above."NOT" is an operator stating that the "Read" event is not true. "|" is the the operator of conditional probability, and it reads "given". "Math-inclined" is the event that the chosen reader is one of those math-inclined readers. So this whole statement reads "The probability of the event that he does not read this entry, given the event that he's math-inclined, equals 75%".<br /><br /><span style="font-weight: bold;">The NOT operator</span><br /><br />The chances that an event takes place, plus the chance that it does NOT, always adds up to 100%. Put differently, in 100% of cases, either the event takes place or it doesn't. So,<br /><br />P(A) + P(NOT A) = 1.<br /><br />With some very basic algebra, this can be turned into<br /><br />P(NOT A) = 1 - P(A)<br /><br />The probability that the math-inclined reader will read this entry equals 1 minus the probability that he will NOT read it, so<br /><br />P(Read | Math-inclined) = 1 - P(NOT Read | Math-inclined) = 1 - 0.75 = 0.25<br /><br /><span style="font-weight: bold;">Joint probability</span><br /><br />I believe that about 75% of my readers are math-inclined, so<br /><br />P(Math-inclined AND Read) = 0.75 * 0.25 = 0.1875<br /><br />There's an 18.75% chance that a randomly chosen reader is math-inclined but still reads this entry. "AND" is the operator of joint probability. This means that, for the statement "Math inclined AND Read" to be true, both events have to be true. 
The reason why we multiply the probabilities of both events can be illustrated as follows:<br /><br />P(Math-inclined) = 0.75 = 3/4<br />P(Read | Math-inclined) = 0.25 = 1/4<br /><br />as stated above. Out of 16 randomly picked readers, on average 12 readers will be math-inclined (3/4 * 16). Out of those 12 math-inclined readers, 3 will read this entry (1/4 * 12). So out of 16 initially chosen readers, 3 are math-inclined AND will read this entry. 3 out of 16 is 18.75% (3/16 = 0.1875). This is consistent with multiplying the probabilities of both required events: 3/4 * 1/4 = 3/16.<br /><br />Put differently, in 75% of cases the randomly chosen reader is math-inclined. In 25% of those cases, he'll still read this entry. As we all know, a certain percentage of a quantity equals that quantity multiplied by the decimal form of the percentage. So, 25% of 75% is 0.25 * 75%, which equals 18.75% (0.1875).<br /><br /><span style="font-weight: bold;">The OR operator and the Sieve principle</span><br /><br />P(Math-inclined OR NOT Math-inclined) = 0.75 + 0.25 = 1<br /><br />Put this way, this one seems pretty obvious. The probability that a reader is either math-inclined or not is obviously 1. But it may not seem equally obvious if we put it in more abstract terms. Formally,<br /><br />P(A OR B) = P(A) + P(B) - P(A AND B)<br /><br />Why is this? Well, let's resort to an intuitive explanation. Let's say we have six two-letter combinations, for example [AB, AC, AD, BC, BD, CD]. If we choose one of those combinations at random,<br /><br />P(A) = 3/6 (the probability of the event that the letter "A" is represented in the combination)<br />P(B) = 3/6 (the probability of the event that the letter "B" is represented in the combination)<br /><br />What's the probability that either A or B is represented in the combination? Well, from a quick look, we can see that it's 5/6, since only 1 in 6 combinations includes neither A nor B. 
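That quick look can also be automated by enumerating the combinations (a small sketch; the helper function is my own illustration):

```python
from fractions import Fraction

# The six equally likely two-letter combinations from the text.
combos = ["AB", "AC", "AD", "BC", "BD", "CD"]

def p(event):
    """Probability that a uniformly chosen combination satisfies `event`."""
    return Fraction(sum(1 for c in combos if event(c)), len(combos))

print(p(lambda c: "A" in c))              # 1/2, i.e. 3/6
print(p(lambda c: "B" in c))              # 1/2, i.e. 3/6
print(p(lambda c: "A" in c or "B" in c))  # 5/6
```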
But how do we arrive at this mathematically?<br /><br />Adding the number of combinations where A is represented to the number of combinations where B is represented seems like a good first step. But this leaves us with 6/6, which is clearly wrong. This is because we've double-counted the instance where both A and B are represented. So, having counted that instance twice, we need to subtract it once:<br /><br />P(A OR B) = P(A) + P(B) - P(A AND B) = 3/6 + 3/6 - 1/6 = 5/6<br /><br />This is called the Sieve principle, or the inclusion-exclusion principle. With more events, such as P(A OR B OR C), it gets a little bit trickier, but let's not bother with that right now.<br /><br />Returning to<br /><br />P(Math-inclined OR NOT Math-inclined) = 0.75 + 0.25 = 1<br /><br />didn't I forget to subtract the double-counted instances? No. Double counting isn't possible in this case, since one reader can't be both math-inclined and non-math-inclined. So<br /><br />P(Math-inclined AND NOT Math-inclined) = 0<br /><br />and so, actually,<br /><br />P(Math-inclined OR NOT Math-inclined) = 0.75 + 0.25 <span style="font-weight: bold; color: rgb(255, 0, 0);">- 0</span> = 1<br /><br />but we can just skip that step.<br /><br /><span style="font-style: italic;">That's a very brief look at probability theory. We'll expand on it later on, and we'll use it when examining the Monty Hall problem further. 
But first, we'll have a more intuitive look at the Monty Hall problem tomorrow.</span><br /><br /><span style="font-weight: bold;">Next entry on this topic: </span><a style="font-weight: bold;" href="http://economatheek.blogspot.com/2008/11/conditional-probability.html">Conditional Probability</a><br /><br /><span style="font-size:78%;">__________</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">Notes:</span></span><br /><span style="font-size:78%;">(1) This, and all of the other figures concerning the inclinations and tendencies of my readers, are completely made up. They are probably not even nearly accurate, and I use them solely for illustration purposes.</span><br /><span style="font-size:78%;">(2) In a <a href="http://economatheek.blogspot.com/2008/11/conditional-probability.html">future entry</a>, we'll have a more detailed look at conditional probability.</span><br /></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com2tag:blogger.com,1999:blog-2104323735153093007.post-61255269385558857432008-11-10T07:00:00.003+01:002008-11-12T07:43:29.666+01:00A Real-Life Application of the Bluff<div style="text-align: justify;">Many years ago, as a new student in a new city, I found that working for students' associations was a fast lane into student life. I took on many responsibilities, and a frequent problem was that of getting people to work. It's hard to convince people to work for sub-minimum wages when they have exams coming up. Needless to say, after I had ended my responsibilities at the students' associations, I felt for my poor successors trying to call for workers during exam periods. I didn't have much time to work, but I wanted to help. 
So, when they called me, asking me to work for them, I used to tell them to keep looking for someone else first, but to call me back if they couldn't find anyone, promising to think about it if they did call me back.<br /><br />Pretty soon, they learned that I'd never say no once they called me back. So they started calling me first, and I'd tell them to look for someone else first. Then, they'd just wait for a couple of hours and then call me back, knowing that I'd never say no. My strategy was highly exploitable, so to speak. Of course, I could have just quit playing the game, that is, quit working at all. But I wanted to help out. So instead, I introduced the bluff to my strategy.<br /><br />Sometimes, when they called me and I knew for a fact that I couldn't work that particular day, I'd still tell them to look for someone else and then call me back if they couldn't find anyone else, promising to think about it if they did. When they called me back, I'd say, "no, sorry, I can't". That way, they couldn't rely solely on exploiting me anymore, and they had to actually start calling other people when I told them to.</div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-20619708469255822132008-11-09T07:00:00.002+01:002008-11-12T07:45:23.578+01:00Efficient-Market Hypothesis Yielded $300 Million Donation<div style="text-align: justify;">The University of Chicago Graduate School of Business is being renamed the University of Chicago Booth School of Business. The reason is a $300 million donation from David G. Booth, the founder and chief executive of Dimensional Fund Advisors. Booth, who's been highly successful in his business, attributes much of his success to the university. His business is largely grounded in the efficient-market hypothesis, which was, to a large extent, developed at the U. 
of Chicago Graduate School of Business.<br /><br />Basically, the efficient-market hypothesis states that prices of traded assets reflect all available information. Thus, no piece of information can be used in order to gain an edge on the market, since that information is already accounted for in the price. No trader can outperform the market, other than by luck.<br /><br />There are some obvious problems with this hypothesis, and I'll use this piece of news as a starting point for a series of entries on them.<br /><span style="font-size:78%;">__________</span><br /><span style="font-weight: bold;font-size:78%;" >Resources:</span><br /><span style="font-size:78%;"><a href="http://www.nytimes.com/2008/11/07/us/07donate.html?partner=permalink&exprod=permalink">The New York Times</a> on the donation</span></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-12001183313684523732008-11-08T08:00:00.005+01:002008-11-12T07:45:07.147+01:00The Existence of God is 4 to 1?<div style="text-align: justify;">Bookmaker <a href="http://www.paddypower.com/">Paddy Power</a> are currently keeping a book on the existence of God. The odds are, according to the book, 4 to 1, and the odds of Russell Brand, English comedian, actor, author, columnist and television presenter, being God are, apparently, 500 to 1. On their website, Paddy Power state that "scientific proof must emerge by 31st Dec 2009, to confirm his omnipresence in order for bets to be deemed winners".<br /><br />It's not a coincidence that this bet emerges at this particular point in time. Recently, a <a href="http://www.justgiving.com/atheistbus">fund-raising campaign</a> started to put adverts saying "There's probably no God" on London buses. 
Professor Richard Dawkins, famous atheist and bestselling author of The God Delusion, supports the campaign by matching any donations up to £5,500.<br /><br />Also, recent developments in the Large Hadron Collider project seem to have given people hopes of obtaining proof of the existence of God. The odds of the bet have fluctuated in correlation with events in the collider project, maybe not too surprisingly, since the scientists of the project are hoping to find the so-called God particle. However, proving the existence of the God particle would not equal proving the existence of God, and therefore, anyone taking the Paddy Power bet, in the hopes of winning on such an event, is mistaken.<br /><br />So, without the hopes of finding a proof through the collider, we're back where we started in the ancient hunt for proof of God's existence. And I bet the odds against any proof emerging within the next year are significantly larger than 4 to 1, no matter how likely the existence of God is deemed to be.<br /><span style="font-size:78%;">__________</span><br /><span style="font-weight: bold;font-size:78%;" >External links in this post:</span><br /><span style="font-size:78%;"><a href="http://www.paddypower.com/">Paddy Power's website</a></span><br /><span style="font-size:78%;"><a href="http://www.justgiving.com/atheistbus">Donation page</a> for the Atheist Bus Campaign.</span><br /><span style="font-weight: bold;font-size:78%;" >Other references:</span><br /><span style="font-size:78%;">English news site <a href="http://www.telegraph.co.uk/news/newstopics/religion/3374240/Paddy-Power-offers-odds-of-4-1-that-God-exists.html">Telegraph.co.uk</a> on the bet</span></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com0tag:blogger.com,1999:blog-2104323735153093007.post-55937938412142060102008-11-07T00:00:00.010+01:002008-11-12T07:44:57.979+01:00A Quick Briefing on the Poker Bluff<div style="text-align: justify;">In poker, there are good hands, bad 
hands and hands in between. On the last betting round (<a href="#note0811061">1</a>), obviously, you'd like to bet with your good hands, hoping for your opponent to call with a lower hand, thus making you money. However, you can't bet all of your hands, so you will check some of your hands. But if you bet only with your best hands, your opponent can adjust to this, and call only with hands slightly better than your worst betting hand. It wouldn't make sense for him to call with hands lower than your worst betting hand, since that would lose him money.<br /><br />But if he wouldn't call with a hand lower than your worst betting hand, then why would you bet it in the first place? It doesn't win you any money when he has a lower hand, but it costs you money when he has a better hand. So, you wouldn't bet it. This narrowing of your betting range would lead your opponent to narrow his calling range as well, so that you'd have another worst betting hand that isn't worth betting, either. And so it goes on, until you don't bet any hand but the very best hand possible. To put it in game-theory terms, there is no equilibrium where you bet only a top range of your hands.<br /><br />In order to find an equilibrium, you need to introduce the <span style="font-style: italic;">bluff </span>to your strategy. By betting a top range of your hands and a bottom range, that is, some of your absolute worst hands, you'll make it worthwhile for your opponent to call with a hand lower than your worst value-betting hand (value betting being betting with one of the top range hands, as opposed to bluff betting, which you do with the bottom range hands). It follows that betting with that very worst value-betting hand will be worthwhile to you, since now it makes you money when your opponent calls with a lower hand.<br /><br />But do we really want to make it worthwhile for our opponent to call? 
That sounds beneficial to him, doesn't it? Well, if we bluff with the optimal frequency, that is, if our value-betting range stands in the right proportion to our bluffing range, he will break even from calling with those hands that can only beat a bluff. In game-theory terms, he's <span style="font-style: italic;">indifferent </span>to calling. But even if he could choose to fold instead of calling, he must call with a proper fraction of those hands, so as not to let you exploit his folding tendency by bluffing more.<br /><br />There, that's a quick briefing on the poker bluff.<br /><span style="font-size:78%;">__________</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">Notes:</span></span><br /><span style="font-size:78%;"><a name="note0811061">(1)</a> On earlier betting rounds, when the lower hand sometimes has a chance of winning, due to drawing possibilities, things are a bit different. But this is outside of the scope of this entry.</span><br /></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com1tag:blogger.com,1999:blog-2104323735153093007.post-67780796047691388802008-11-06T14:13:00.004+01:002009-09-22T21:16:56.217+02:00Responsible Gaming in the Swedish Monopoly<div style="text-align: justify;">Recently, the Swedish government-owned gaming company, <a href="http://www.svenskaspel.se/">Svenska Spel</a>, which enjoys a statutory monopoly on most forms of gambling services, including lotteries, online poker and casinos, was awarded the World Lottery Association's (WLA) Award for Responsible Gaming Excellence. The Swedish government has commissioned Svenska Spel to arrange lotteries and other gambling games in a responsible way. The rationale for this mission and the monopoly status is that this is deemed to have a positive effect on public health. 
Providing a more responsible service than foreign companies, Svenska Spel has a mission to keep Swedish gamblers from using those foreign services.<br /><br />However, many people suspect that the real rationale is purely fiscal - that the Swedish government wants to take a piece of the juicy gambling cake. Also, the responsibility seems, in many ways, to be only reluctantly introduced into the activities of the company, and only to the smallest extent possible in order to fulfill the requirements from the government.<br /><br />On their website, for instance, Svenska Spel claim to refrain from advertising for "hazardous" games. Poker, for example, is clearly considered such a hazardous game, but still, they do advertise for it. Not only do they target existing poker players with ads for upcoming tournaments and events, but they also engage in external marketing.<br /><br /><span style="font-weight: bold;">A peculiar instance that I saw today</span>, harmless as it may seem, relates to the Lotto game. In the Lotto, a series of 7 numbers ranging from 1 to 35 is drawn. Players choose their own string of numbers prior to the draw, and the number of matching numbers between the player's string and the drawn string is counted. 7 correct numbers gives the grand prize, often making the winners millionaires. Fewer correct numbers give smaller prizes, and usually, fewer than 4 correct numbers gives no return.<br /><br />On their website, statistics on the draws can be found. From their website:<br /><blockquote>In the following statistics, you can see which numbers have been drawn the most and the least frequently in the last half year. The statistics are updated continuously. [...]<br /><br />Isn't statistics fun? Remember, though, that the choice of numbers is always random. 
Good luck!</blockquote><span style="font-size:78%;"><span style="font-style: italic;">(My translation)</span></span><br /><br /><span style="font-weight: bold;">Now, why is this a problem?</span><br /><br />Svenska Spel responsibly added a note on the randomness of the draw. You cannot, by choosing specific numbers, affect your chances of hitting the jackpot. But that is not what the text at large implies; quite the contrary. Posting those statistics appeals to the Gambler's fallacy and the Texas sharpshooter fallacy (<a href="#note1">1</a>), encouraging people to tinker with their numbers in a vain attempt to improve their chances of winning. Encouraging irrational gambling and giving gamblers false expectations is not my idea of responsible conduct.<br /><br />Interestingly, a gambler free of those fallacies can put the statistics to constructive use. By using them contrary to the way a fallacy-prone gambler would, we can exploit those who fall for the fallacies. That is, we choose numbers that are less likely to be chosen by other, irrational players. Our winning chances won't increase, but on our occasional wins, we'll have fewer other winners to share the prize pool with.<br /><br />So, if there's a good use for the numbers, then there's no problem? Wrong. The good use is only possible to the extent that other players are tricked by the numbers.<br /><br /><span style="font-weight: bold;">For the sake of fairness</span>, let's note that Svenska Spel does, at least, take <span style="font-weight: bold;">some</span> measures for responsible gaming. They do not do all they can, but probably more than most other gaming companies. So the award is probably fair, and I don't mean to contest that. 
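As a side note, the sharing effect described earlier can be sketched numerically. The following is only a toy model with invented figures (the jackpot size and co-winner counts are my assumptions, not Svenska Spel data): every line has the same chance of winning, but a line that fewer people play splits the jackpot with fewer co-winners.

```python
# Toy model of prize sharing in a pari-mutuel lottery. All figures below are
# invented for illustration; the win probability is the same for every line,
# only the expected number of co-winners differs.

from math import comb

def expected_prize(jackpot, p_win, avg_cowinners):
    """Rough expected value of one line: chance of hitting, times the jackpot
    split with the co-winners. (Dividing by 1 plus the *average* co-winner
    count is a simplification, but it shows the direction of the effect.)"""
    return p_win * jackpot / (1 + avg_cowinners)

P_WIN = 1 / comb(35, 7)       # 7 correct out of 35 numbers: 1 in 6,724,520
JACKPOT = 10_000_000          # hypothetical prize pool

popular = expected_prize(JACKPOT, P_WIN, avg_cowinners=3.0)    # birthday-style line
unpopular = expected_prize(JACKPOT, P_WIN, avg_cowinners=0.5)  # rarely played line

print(f"popular line:   {popular:.2f}")
print(f"unpopular line: {unpopular:.2f}")
```

With identical odds of about 1 in 6.7 million, the unpopular line in this toy model is worth more than twice the popular one in expectation, purely through less sharing. None of this improves the chance of winning; it only changes what a win pays.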
On the contrary, I congratulate Svenska Spel on their award.<br /><span style="font-size:78%;">__________</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">Notes:</span></span><br /><span style="font-size:78%;"><a name="note1">(1)</a> In a future entry, there will be more on how the fallacies mentioned relate to the Lotto and those statistics.</span><br /><span style="font-size:78%;">__________</span><br /><span style="font-size:78%;"><span style="font-weight: bold;">External links in this post:</span></span><br /><span style="font-size:78%;">Website of the Swedish government-owned gaming company <span style="font-size:78%;"><a href="http://www.svenskaspel.se/">Svenska Spel</a></span><br /></span><br /><span style="font-size:78%;"><span style="font-weight: bold;">Other references:</span></span><br /><span style="font-size:78%;"><a href="http://www.world-lotteries.org/cms/index.php?option=com_content&task=view&id=3276&Itemid=30">Press release</a></span> <span style="font-size:78%;">on World Lottery Association's website</span></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com2tag:blogger.com,1999:blog-2104323735153093007.post-75621104973132018262008-11-05T17:13:00.001+01:002008-11-12T07:45:46.689+01:00The Monty Hall Problem, Continued<div style="text-align: justify;">I suppose one comment on <a href="http://economatheek.blogspot.com/2008/11/who-is-economatheek.html">my very first blog entry</a> is to be considered good. To recap, I gave a new wording to the famous Monty Hall problem. Three cups were placed upside down on a bar counter, with a $100 bill under one of them. You were told to choose one of the cups, and if that particular cup contained the $100 bill, the bill would be yours. After watching you choose, I lifted one of the other cups, revealing that there was no bill under that one. 
You were offered an opportunity to change your mind and go for the other unrevealed cup instead of the one you initially chose. The question is: should you switch, or should you stay with your initial choice?<br /></div><br /><span style="font-weight: bold;">Anonymous </span>commented as follows:<br /><div style="text-align: justify;"><blockquote>You should switch. The probability that you picked the right cup first is 1/3, so its 2/3 that the bill is in one of the other cups. Knowing that the cup is not in one of them doesnt change that. If you switch the probability is 2/3.</blockquote>This may seem counter-intuitive to many readers, but from a probabilistic point of view, it seems to make sense. This is the answer I expected from most readers already familiar with the Monty Hall problem, since it is the correct answer to the problem in its basic formulation. However, remember that I hinted at a twist to this one? There is one, and this answer is, in fact, wrong. You should stay with your initial choice.<br /><br />Now, I expect to be ridiculed by some of my readers. Please, shoot. I'm very confident that my solution is correct, and I will explain why in a later entry. In the meantime, can anyone come up with an explanation? Why should you stay? Or, if you don't agree with me, why am I wrong?</div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com2tag:blogger.com,1999:blog-2104323735153093007.post-7916210124000628102008-11-03T02:57:00.002+01:002015-07-30T17:42:51.450+02:00Who is the Economatheek?<div style="text-align: justify;">I suppose you've had some trouble interpreting my nom de plume, <span style="font-style: italic;">The Economatheek</span>. It's simple, though: it's a made-up word composed of three parts, <span style="font-style: italic;">economics, mathematics</span> and <span style="font-style: italic;">geek</span>. 
It says, if not everything, at least something, about who I am.<br /><br />Being a math-inclined law-and-business student among the polar bears and huskies of Sweden, I spend the long winters studying a wide range of sources, connected only by the fact that none of them are actually on the curriculum. I've developed an interest in the shortcomings of mainstream economic theory and the fallacies that affect much of our thinking. I find the inherent uncertainty that permeates every aspect of social and economic life, but which is widely assumed away in mainstream economic theory, quite intriguing, and I feel that this is where the focus needs to be set.<br /><br />Besides this, I also harbour a great interest in game theory, probability theory, statistics, and many other topics. There will be plenty of all of them here as time goes by.<br /><br />In order not to scare off new visitors, let's start out slow:<br /><br /><span style="font-weight: bold;">The Monty Hall problem</span>, popularized by columnist and author <a href="http://www.marilynvossavant.com/">Marilyn vos Savant</a>, proves an intricate problem to most on their first acquaintance with it. Expecting at least some of my visitors to have already read, solved and understood the problem in its basic formulation, I shall say that there is a twist to this one. So stay with me.<br /><br />It's Friday night, and you're standing by the bar in the local pub. From the shadows, a skinny, bearded man emerges, introducing himself as The Economatheek. Holding up a $100 bill, he asks the bartender for three coffee cups, and after receiving the cups, he asks you to look away. The bartender, whom you trust as an honest man, serves as a witness as The Economatheek puts the three cups upside down with the $100 bill under one of them.<br /><br />You are now told to choose one of the cups. If the bill is, indeed, to be found under that particular cup, it will be yours. 
You choose one of the cups, but just as you're about to lift it, the skinny man tells you to wait. He then lifts one of the other cups, revealing that there is no bill under that particular cup. Thereafter, he asks you whether you still want to go for the cup you initially chose, or whether you want to switch to the other one that has not yet been lifted.<br /><br />So, what is your choice, and why? Please share your thoughts in the comments.<br /><br /><span style="font-style: italic;">Also, please excuse my English. Brought up by polar bears, I did not learn English as my first language. I will not take offence at remarks on my language, rather the opposite. After all, life as a geek is all about learning.</span><br /><br /><span style="font-weight: bold;">The next entry on this topic: <a href="http://economatheek.blogspot.com/2008/11/i-suppose-one-comment-for-my-very-first.html">The Monty Hall problem, continued</a></span><br /><span style="font-size: 78%;">__________</span><br /><span style="font-size: 78%; font-weight: bold;">External links in this post:</span><br /><span style="font-size: 78%;"><a href="http://www.marilynvossavant.com/">http://www.marilynvossavant.com/</a></span><br /><span style="font-size: 78%;">Official website </span><span style="font-size: 78%;">of columnist and author Marilyn vos Savant.</span></div>Göranhttp://www.blogger.com/profile/03345144767062847676noreply@blogger.com1
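For readers who want to experiment with the classic formulation before tackling the twist, a small Monte Carlo sketch follows. Note the assumption built into it: the host knows where the prize is and always reveals an empty, unchosen cup. Whether the skinny man at the bar satisfies that assumption is another question entirely.

```python
import random

def monty_hall_trial(switch, rng):
    """One round of the classic game: the host knowingly opens an empty,
    unchosen cup, then the player either stays or switches."""
    cups = [0, 1, 2]
    prize = rng.choice(cups)
    choice = rng.choice(cups)
    # The host opens a cup that is neither the player's choice nor the prize.
    opened = rng.choice([c for c in cups if c != choice and c != prize])
    if switch:
        choice = next(c for c in cups if c not in (choice, opened))
    return choice == prize

rng = random.Random(1)
TRIALS = 100_000
stay = sum(monty_hall_trial(False, rng) for _ in range(TRIALS)) / TRIALS
switch = sum(monty_hall_trial(True, rng) for _ in range(TRIALS)) / TRIALS
print(f"stay wins:   {stay:.3f}")   # close to 1/3
print(f"switch wins: {switch:.3f}") # close to 2/3
```

In the classic setup, the switcher wins roughly twice as often as the stayer, matching the familiar 1/3 versus 2/3 answer quoted in the comment discussed above.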