Monday, December 7, 2009

Don't let models or correlations fool you.

In today's business world, regression-based models are used frequently. But what happens when those models are wrong, or are not used properly?
Gillian Tett and Peter Larsen write, "...the industry has pinned considerable faith - and business strategy - on a set of models that now seem less than fail-safe." In their article, "Market faith goes out the window as the 'model monkeys' lose track of reality," they explain how models have become central to the industry. Models are often used to predict situations, but what happens when what was predicted turns out to be the complete opposite of reality?
These "wrong" models can impact businesses tremendously, causing huge financial damage to a company. The article presents an example of why models have become so important. After the downfall of Ford and General Motors, banks began trading arbitrages between debt and equity products, and some moved into credit derivatives by relying on other untested models. Banks sold tranches of CDOs (collateralized debt obligations) to their clients while keeping the riskiest equity tranches for themselves, which might hurt the banks in the long run. And when this trading stops, it could have a very big impact on the banks, since it made up a large part of their investments; some analysts said their revenues could fall by a third. Some banks have already begun realigning their "models".
Models can be read in different ways, and people usually convince themselves that their reading is the right one. We need to accept that models can sometimes be wrong. Before fully relying on a model, a person should evaluate it and check that the right assumptions, strategies, and data range were used.
We cannot fully trust data to predict the future. In the article "Data Mining Isn't a Good Bet for Stock-Market Predictions," Jason Zweig explains why. He states, "The Super Bowl market indicator holds that stocks will do well after a team from the old National Football League wins the Super Bowl. The Pittsburgh Steelers, an original NFL team, won this year, and the market is up as well. Unfortunately, the losing Arizona Cardinals also are an old NFL team." He gives another example with the "Sell in May and go away" rule; since the market had gone up 17% since April, that rule isn't looking too good either. Zweig agrees that such assumptions are completely unrealistic. As Mr. Leinweber puts it, "they are one of the leading causes of the evaporation of money, especially in quantitative strategies." It is sad, but very real. This is what goes on in today's world: people try to predict things that aren't real, and others believe them.
One ridiculous example claims that a person can predict stock returns by tracking the number of nine-year-olds in the United States. Another says that stocks are more likely to go up on days when smog goes down. In both examples, the predictor and the thing being predicted have nothing to do with each other. How can people actually believe such things? The reality is that they do.
What can you do to avoid falling into this trap? In his article, Zweig tells us how. First, check that the results make sense. His first rule: "Correlation isn't causation, so there needs to be a logical reason why a particular factor should predict market returns." Just because two things correlate with one another does not mean one causes the other. It means only that they correlate, nothing more. In some cases they are causally linked, but as the examples above show, that is not true in every case.
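To see how easily correlation shows up without any causation, here is a minimal sketch using made-up data (the random seed and series length are arbitrary choices, not from the article). Two independent random walks, where neither series influences the other, can still produce a sizable correlation coefficient just because both happen to trend:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Two independent random walks -- by construction, neither causes the other.
walk_a = np.cumsum(rng.normal(size=500))
walk_b = np.cumsum(rng.normal(size=500))

# Trending series often correlate strongly by pure accident.
r = np.corrcoef(walk_a, walk_b)[0, 1]
print(f"correlation between two unrelated walks: {r:.2f}")
```

Rerunning with different seeds gives wildly different correlations, which is exactly the point: the number measures co-movement, not a logical link.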
The second rule is to break the data into pieces. Zweig states, "Divide the measurement period into thirds, for example, to see whether the strategy did well only part of the time. Ask to see the results only for stocks whose names begin with A through J, or R through Z, to see whether the claims hold up when you hold back some of the data." This way we can see whether the correlation holds across the whole data set or only in some parts of it.
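The divide-into-thirds check can be sketched in a few lines. The toy "strategy" below (hold the market only after an up day) and the simulated returns are purely hypothetical, invented to show the mechanics of the test, not anything Zweig proposes:

```python
import numpy as np

rng = np.random.default_rng(seed=7)
# Hypothetical daily returns -- pure noise, so there is no real edge to find.
returns = rng.normal(loc=0.0005, scale=0.01, size=900)

# Toy "strategy": hold the market only on days that follow an up day.
signal = np.concatenate(([0.0], (returns[:-1] > 0).astype(float)))
strategy = signal * returns

# Zweig's check: does the claimed edge show up in each third of the period,
# or only in one lucky stretch?
thirds = np.array_split(strategy, 3)
for i, chunk in enumerate(thirds, start=1):
    print(f"third {i}: mean daily return {chunk.mean():+.5f}")
```

A genuine edge should survive in all three slices; a data-mined fluke usually works in only one.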
He then says, "Next, ask what the results would look like once trading costs, management fees and applicable taxes are subtracted. Finally, wait." Taxes and other extra costs could be eating up the apparent results. As for waiting, we all know that time will tell. Mr. Leinweber says, "If a strategy is worthwhile, then it will still be worthwhile in six months or a year."
So remember, correlation does not always mean causation!