What is Evidence Based Investing?


A guest blog by Robin Powell, an award-winning journalist and financial educator. He blogs as The Evidence-Based Investor at evidenceinvestor.com.


A phrase you may have heard lately is evidence-based investing. You would have thought that in investing, as in, say, medicine or criminal law, everything is based on evidence as a matter of course. Alas, it’s not. Despite more than 60 years of academic research on investing and the financial markets, much of the advice on offer today is still based, strange as it may seem, on opinions and hunches.

Thankfully, things are changing. There’s a growing movement within the financial advice profession worldwide towards basing decisions on what has been shown to work in the past.

Nevertheless, there are plenty of financial professionals keen to denigrate evidence-based investing. Something they often point out is that “evidence” can be used to prove almost anything. They are, of course, perfectly correct; it can.

In 2015, for example, a journalist named John Bohannon and a couple of German TV producers set out to demonstrate how easy it is to turn bad science into the big headlines behind diet fads. Their challenge was to persuade as many media outlets as possible to publicise the results of “research” which “proved” that eating chocolate every day helps you lose weight.

So they spent a few thousand euros recruiting research subjects and a doctor to run the study. They intentionally used a paltry data set: 15 participants and just three weeks of data. Over a beer-fuelled weekend with a statistician friend, they tortured the data heavily enough to extract a technically accurate, if essentially meaningless, conclusion: that chocolate consumption had indeed contributed to weight loss.
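The arithmetic behind this kind of data torture is simple: if a study measures many different outcomes, the chance that at least one of them crosses the conventional p < 0.05 threshold by luck alone grows quickly. As a rough illustration — the figure of 18 measured outcomes is an assumption drawn from later accounts of the sting, not something stated here — a quick simulation:

```python
import random

random.seed(42)

ALPHA = 0.05          # conventional significance threshold
N_OUTCOMES = 18       # assumed number of outcomes measured (weight, sleep, etc.)
N_TRIALS = 100_000    # simulated null studies

# Under the null hypothesis (chocolate has no effect), each outcome's
# p-value is uniformly distributed on [0, 1]. A study "finds something"
# if any one of its outcomes happens to fall below ALPHA by chance.
false_positives = sum(
    any(random.random() < ALPHA for _ in range(N_OUTCOMES))
    for _ in range(N_TRIALS)
)

print(f"Analytic chance of a fluke 'finding': {1 - (1 - ALPHA) ** N_OUTCOMES:.1%}")
print(f"Simulated chance:                     {false_positives / N_TRIALS:.1%}")
```

With 18 outcomes, a study of pure noise will "find" a significant result roughly 60% of the time, which is why a single pre-specified outcome is one of the first things to look for in any study.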

They submitted their paper, “Chocolate with high cocoa content as a weight-loss accelerator”, to 20 different journals, and a few fake ones too. Within 24 hours the paper had been accepted for publication, and the story went on to appear in dozens of publications around the world. Among the victims of the sting were Bild, Europe’s largest daily newspaper, the Daily Mail, Daily Express and Daily Star in the UK, the Irish Examiner, the Times of India, the Huffington Post and Cosmopolitan. The study also made television news in the US and Australia.

Financial newsdesks are bombarded with “research” all the time, and very often it’s about as scientific as John Bohannon’s study on chocolate and weight loss. Almost invariably it has been produced or commissioned by a company with a commercial conflict of interest.

If, then, a firm that either provides or sells actively managed funds comes up with “research” that appears to endorse the use of particular active funds, or active funds in general, journalists should be naturally suspicious. Unfortunately, for whatever reason, much of this “evidence” does end up in print.

But the problem with bogus evidence in the investment industry runs far deeper, because the research on which fund managers base their decisions about which stocks to buy and sell is, to a large extent, itself conflicted. Companies struggling for attention have been known to pay for the production of what looks like research by their broker. Those brokers often take payment for their work in warrants, that is, the right to buy the stock they have promoted at a future date at a substantial discount.

For fund managers, like journalists, it can be very hard to work out which research is properly independent and which isn’t. It’s often only at the end of a report that the truth is revealed, with a disclaimer along the lines of, “This material is not investment research”.

The answer, then, whether you’re a journalist, an investor or a financial professional, is to be extremely wary of “research”. You need to set very high standards for any evidence on which you base decisions. What, then, should those standards be?

For me there are four important questions you should ask when presented with anything purporting to be research on how to invest.

Is it genuinely independent?

New studies are being published all the time which appear to support a particular investment strategy or course of action. But all too often these reports are produced, or else commissioned and paid for, by companies with a commercial interest in publicising the outcome. Most academics, on the other hand, are truly independent, in that they don’t have an agenda or a point to prove. Instead, they leave it to financial practitioners to act on their findings or not.

Is it based on robust data analysis?

We all know the old adage about lies, damned lies and statistics. It’s true: abused data can be very misleading. Often findings are based on too short a time period or too small a sample. Sometimes the fund industry ignores survivorship bias; in other words, it overlooks those funds which performed so poorly that they no longer exist. Other times it compares returns against the wrong benchmark, or quotes performance figures before the full impact of fees and charges is taken into account. Sometimes it simply gets its maths wrong.
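Survivorship bias in particular is easy to see with a toy simulation. The sketch below assumes a hypothetical universe of 100 funds whose returns are pure noise, with the worst performers quietly closed each year; every number in it (fund count, return distribution, cull rate) is illustrative, not drawn from real fund data:

```python
import random
import statistics

random.seed(1)

# Hypothetical universe: 100 funds tracked over 10 years. Returns are
# random noise, so no fund has any genuine skill.
N_FUNDS, YEARS = 100, 10
funds = [{"alive": True, "returns": []} for _ in range(N_FUNDS)]

for _ in range(YEARS):
    for f in funds:
        if f["alive"]:
            f["returns"].append(random.gauss(0.05, 0.15))  # ~5% mean, 15% vol
    # Quietly close the bottom ~5% of surviving funds each year.
    alive = [f for f in funds if f["alive"]]
    alive.sort(key=lambda f: f["returns"][-1])
    for f in alive[: max(1, len(alive) // 20)]:
        f["alive"] = False

all_returns = [r for f in funds for r in f["returns"]]
survivor_returns = [r for f in funds if f["alive"] for r in f["returns"]]

print(f"Average annual return, all funds:      {statistics.mean(all_returns):.2%}")
print(f"Average annual return, survivors only: {statistics.mean(survivor_returns):.2%}")
```

Even though no fund in this toy universe has any skill at all, the surviving funds’ track record looks better than the full universe’s, purely because the casualties have been removed from the sample. A study that only measures funds still trading today makes the same mistake.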

Has it been peer-reviewed?

To test whether their findings are reliable, academics publish their research in credible academic journals. This gives other academics the chance to agree or disagree on whether the results are sound. Again, caution is required — there are journals that are less credible than others — but evidence that has been properly peer-reviewed should carry far more weight with investors than evidence that hasn’t.

Have the findings been reproduced?

The fourth and final characteristic of findings you can depend on is that they’ve been tested across multiple environments and timeframes. There is some disagreement on the extent to which academic finance is properly scientific. Asset prices, for example, never move in exactly the same way as they have done in the past. However, there does need to be a strong element of repeatability to demonstrate that the findings of a particular study weren’t just down to random luck or else reached through “data mining”.

So, next time you read in the newspaper, or perhaps on a financial website, about new “evidence” on investing, ask yourself, does it pass those four tests? If it doesn’t, you’re better off not treating it as evidence at all.