Saturday, December 15, 2012

Do Incentives Imply Ability to Pursue Them?

So, I was playing Bejeweled 2 this morning, when I should probably have been sleeping, and something about the gameplay triggered an errant thought: do I assume that because I am rewarded for doing something, I have some control over achieving it?

In particular, B2 gives you a special explosive gem for connecting 4 gems in a row instead of the usual 3. These are very useful and carry over between levels, so they are the only way your current actions can affect the game beyond the current level. They are, obviously, very important and valuable, especially since you have relatively little control over the game.
In fact, you have so little control that getting those special gems is nearly random. Since you can only move gems when the move produces a breaking combo (3+ gems of the same color in a line), you have to plan very far ahead. Sometimes you get lucky and are essentially handed a board setup where you can make it happen easily, or it occurs on its own through lucky cascades of falling blocks. Usually, when it happens, it is due to those random blocks; planning is limited because you have no foresight of which gems you will get as you clear room, so even if you really work at it, your success rate is not highly correlated with your effort. At best you can break lots of gems and hope for the best.
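The "mostly luck, slightly skill" claim can be sketched with a toy simulation. All the numbers here are invented for illustration, not taken from the actual game: assume each move has a small base chance of producing a special gem, and that deliberate play nudges that chance up only slightly.

```python
import random

def play_level(effort, moves=100, base_p=0.03, effort_bonus=0.01, rng=random):
    """Hypothetical model of one level: each move has a small chance of
    yielding a 4-in-a-row special gem; effort (0 to 1) nudges it slightly."""
    p = base_p + effort * effort_bonus
    return sum(1 for _ in range(moves) if rng.random() < p)

rng = random.Random(0)  # fixed seed so the sketch is reproducible
lazy = [play_level(0.0, rng=rng) for _ in range(2000)]
tryhard = [play_level(1.0, rng=rng) for _ in range(2000)]

avg_lazy = sum(lazy) / len(lazy)           # expected value: 100 * 0.03 = 3
avg_tryhard = sum(tryhard) / len(tryhard)  # expected value: 100 * 0.04 = 4
spread = max(lazy) - min(lazy)             # per-level noise dwarfs the gap
```

Under these made-up numbers, maximum effort lifts the average from roughly 3 gems per level to roughly 4, but the level-to-level spread is far larger than that gap, which is the sense in which the outcome looks nearly random from the player's chair.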

So, OK, B2 rewards achievements that are largely random rather than the result of player behavior (at least in getting the gems; good players save those things forever). But B2 is a pretty easy game to analyze. What about bigger games like the Total War series? What about school? Work? Are we assuming we have control over good things happening simply because we get rewarded for them? Are we being rewarded for largely random, unpurposeful achievements?

I think yes. There is a fair bit of research suggesting that success is largely a matter of random interactions and associations (who you know, not what you know), and the same holds for stocks and investing (see Nassim Taleb). External rewards are often not as rules-based as people would like, and often we care about results as well as effort and inputs. If you think Olivia Wilde is hot, you don't care how much is genetics and how much is lifestyle choices.

We can't control that too well, so perhaps it isn't worth worrying about too much. What we can, and probably should, worry about is how we approach such things ourselves. If we don't realize that a certain incentive is tied to results we cannot directly control, we can spend enormous effort chasing a state we have little or no influence over. For example, cancer is bad, but whether or not we avoid it seems to be largely random for most people. Yet the belief that we can somehow avoid it if only we pick the optimal combination of eating only grapefruit and whole grains pounded by hand is likely to have costs far outweighing any benefit to our likelihood of getting cancer. We can spend our lives doing all manner of silly things believing that if only we practice the 7 Habits of Highly Effective People we will become hugely successful.

It isn't clear to me that this is a subject we are very good at analyzing. Maybe future research is required.

Friday, December 14, 2012

Changing Minds

I want to put forward a somewhat pessimistic model of people's changing views. I will call it the Cost/Company model for now.

First, I am going to assert that there are two types of people: those who have arrived at a conclusion with regards to a situation or belief, and those who have not.

The first group has, as stated, decided that they know what is right in some context, having arrived there through careful thought, study, indoctrination, what have you. How they got there is not important. What is important is that they got there and believe they are correct. In fact, they are so certain that nothing will change their mind short of a direct cost to themselves of exercising their belief. So, for instance, they might believe that the minimum wage has disemployment effects no matter the evidence you provide, but they will not believe that they can take a speeding cement truck to the chest.

The second group, on the other hand, has not. Usually through ambivalence, but perhaps through simple unawareness of the subject, they have no real belief attached to the matter.
Now, admitting that you don't know would generally cause some consternation; others are always ready to lecture you on why you should think one way or the other. As a result, members of this second group are likely to agree with whatever their current social group thinks on the matter, to a greater or lesser extent depending on the social stigma attached to disagreement.

To be clear, I suspect all humans belong to both groups simultaneously, depending on the subject and their interest in it. However, on any one subject, everyone falls into exactly one of the two groups.

Now that we have these models of people, let us pose a scenario. Suppose you meet a random person on the street and, while waiting for the bus, strike up a conversation about some event in the news that morning. He has a differing view, and since he is pleasant and the wait is long, you decide to discuss it with him. Let us call the positions 1 and 2, 1 being yours. Over time you will come to one of four general outcomes:

1: He will agree with you and you will not change, i.e., you will hold combination (1,1).
2: He will disagree with you and you will not change: (1,2).
3: He will agree with you, and you will change your mind: (2,2).
4: He will disagree with you, and you will change your mind: (2,1).
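The outcomes above can be sketched as a tiny program. This is my own formalization, with hypothetical rules the post doesn't spell out: a decided person never moves, and an ambivalent person simply echoes the present company.

```python
from dataclasses import dataclass

@dataclass
class Person:
    position: int   # 1 or 2
    decided: bool   # if True, only a direct personal cost changes this mind

def converse(you: Person, him: Person) -> tuple:
    """Return the (your position, his position) pair after the bus-stop chat."""
    if you.decided and him.decided:
        return (you.position, him.position)   # nobody moves: outcome 1 or 2
    if you.decided:                           # he is ambivalent...
        return (you.position, you.position)   # ...so he echoes you
    if him.decided:                           # you are ambivalent...
        return (him.position, him.position)   # ...so you echo him
    # two ambivalent people settle on something agreeable; say, his view
    return (him.position, him.position)
```

Notice that under these rules outcome 4, (2,1), can never occur, and any apparent persuasion is just the undecided party echoing the company, which is exactly the model's pessimistic claim.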

So, we can agree that option 4 is highly unlikely; you would have to be a really terrible interlocutor to convince someone of your position while simultaneously abandoning it yourself. That leaves two possibilities: you agree because one of you changed your mind, or you simply disagree and decide to talk about the local sports franchise until the bus gets there.

However, even if you get result 1 or 3, there is the possibility that you or he did not actually change your mind, but merely agreed with the other person because it was socially acceptable. That is to say, each of you was the other's only group at the moment, and so you said what you thought would make the group happy, not what you actually believed, and vice versa.

If that seems a little overly cynical, consider for a moment the last time you had a debate with someone and one of you actually, really, deep down changed your mind. Not in the "Huh, I never really thought of that, but I guess that is true" sense of learning physics for the first time, but in the "Wow, that completely opened my mind" kind of way, where one or both of you really thought you were right and then decided you were wrong. I am putting forth that after a certain point in someone's life, that NEVER happens, short of direct, personal cost. Any seeming change in someone's opinion is only due to their being ambivalent, and so speaking simply to please their current company.

Seem too extreme? Consider how easily most people ignore other people's points and evidence while latching on to anything that supports their own view, no matter how tenuous. Academics are (or should be) aware of this as confirmation bias: you instantly accept anything that effectively tells you what you want to hear, and go out of your way to poke holes in what doesn't. Even being aware of it, it takes conscious effort to look for data that does not support your position, and any such data can easily be explained away as mis-measurement or whatever.

Also, consider how often, and how easily, we paint those who disagree with us as almost subhuman: irrational, paranoid, or the like. It is difficult to acknowledge that those who disagree with us are real people with real reasons for disagreeing, and not just crazy outsiders we right-minded folks should avoid.

Consider that, and now tell me that anyone's mind can be changed. Not without direct Costs; the rest is just posing for Company.