is not a firm bid or offer that any market maker is obliged to honor. As one can imagine, then, during market meltdowns such valuations are basically meaningless: brokers have no obligation to trade and no incentive to add liquidity in these markets, whereas on organized exchanges market makers must do so. The implication for backtesting is that the trading/investment system should show returns at least 5–10% above what the team believes is the hurdle rate for acceptance.
Also, valuation data requires the team to mark-to-model the system's positions. We strongly recommend that the simplest industry-standard model be used in combination with whatever high-level proprietary model is being tested. Backtests should feed both models whatever cleaned data is available at the time of trade execution, and measure value and performance metrics using both the simple model and the high-level theoretical model. The difference between the two measures can be called the theoretical edge. The goal is to turn the backtested edge into real performance. A bad sign is when the edge keeps growing in the backtest, since that can mean the theoretical edge can never be captured.
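
To make the bookkeeping concrete, here is a minimal sketch of how a backtest might compute the theoretical edge at each mark. The two model functions are hypothetical placeholders for whatever simple industry model and proprietary model the team actually uses.

```python
# Minimal sketch of tracking the theoretical edge in a backtest.
# Both model functions are hypothetical stand-ins for the team's
# actual simple industry model and proprietary valuation model.

def simple_model_value(position, market_data):
    """Mark the position with the simplest industry-standard model."""
    # e.g., quantity times an observable mid-price for a liquid instrument
    return position["quantity"] * market_data["mid_price"]

def proprietary_model_value(position, market_data):
    """Mark the same position with the high-level theoretical model."""
    # placeholder: the real model would apply the team's own valuation logic
    return position["quantity"] * market_data["theoretical_price"]

def theoretical_edge(position, market_data):
    """Difference between the two marks at a point in the backtest."""
    return (proprietary_model_value(position, market_data)
            - simple_model_value(position, market_data))

position = {"quantity": 1_000}
market_data = {"mid_price": 99.40, "theoretical_price": 99.65}
print(theoretical_edge(position, market_data))  # ~250.0 of claimed edge
```

In a backtest loop, a cumulative edge that grows steadily but is never realized in the simple-model marks is exactly the warning sign described above.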

14.2.3. Fundamental, or Financial Statement, Data


Fundamental data consists of everything disclosed in 10-Q quarterly and 10-K annual reports, including key business items such as earnings, sales, inventories, and rents. These facts are certified by the company itself. Of course, not all companies follow the same accounting principles; in fact, every company applies them differently. Analysis of financial statements is a well-developed and important area of research.
Fundamental data should be updated almost continuously to fully reflect new corporate information. Updates include inserting data on newly listed companies, mergers, new financial data, etc. If the trading/investment system acquires real-time, streaming data and translates it into a usable representation, then the entire update process would be truly continuous. (Of course, automated tools cannot analyze most nonquantified information.)^1
Fundamental data should be normalized across the entire universe. Normalization reconstructs fundamental data according to identical accounting rules, so that EBITDA for one steel company is calculated exactly the same way as for every other steel company, and indeed for almost all other companies. Normalization is time-consuming, since it forces analysts to read footnotes and make adjustments. Normalization is the real service you pay a data vendor for.
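
As an illustration, a normalized metric can be rebuilt from raw line items under one fixed rule rather than taken from each company's self-reported figure. The field names and numbers below are hypothetical; the real adjustments come from reading the footnotes.

```python
# Illustrative normalization: rebuild EBITDA for every company from the
# same raw line items under one fixed rule, instead of trusting each
# company's self-reported figure. All field names here are hypothetical.

def normalized_ebitda(statement: dict) -> float:
    """EBITDA reconstructed with identical rules for every company."""
    return (statement["operating_income"]
            + statement["depreciation"]
            + statement["amortization"]
            # the kind of adjustment an analyst makes after reading the
            # footnotes, e.g., adding back a one-time restructuring charge
            + statement.get("one_time_charges", 0.0))

steel_co = {"operating_income": 410.0, "depreciation": 55.0,
            "amortization": 12.0, "one_time_charges": 30.0}
print(normalized_ebitda(steel_co))  # 507.0, by the same rule as every peer
```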

14.2.4. Calculated Data


Given fundamental data, the next question is whether the team should purchase calculated data (ROE, price-to-book, beta, forecasted dividends, free cash flow, etc.) or calculate this data in-house. Whatever the case, we recommend that all calculations first be prototyped in Excel to allow for proper cleaning and normalization of the numbers. This way the team can agree or disagree with the method the data provider uses.
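
One way to act on this advice, whether in Excel or in code as sketched below, is to cross-check each vendor-supplied figure against the in-house calculation. The figures and the one-cent tolerance here are purely illustrative.

```python
# Hypothetical cross-check of a vendor-supplied ROE against an in-house
# calculation; mirrors the Excel prototyping step described above.
# All figures and the one-cent tolerance are illustrative.

def roe_in_house(net_income: float, avg_equity: float) -> float:
    """ROE computed on average shareholder equity over the period."""
    return net_income / avg_equity

vendor_roe = 0.165                                   # as delivered by vendor
our_roe = roe_in_house(net_income=1_240.0, avg_equity=6_500.0)  # ~0.191

# A large gap flags a methodological difference worth investigating,
# e.g., the vendor may use ending equity rather than average equity.
if abs(vendor_roe - our_roe) > 0.01:
    print(f"Investigate: vendor {vendor_roe:.3f} vs in-house {our_roe:.3f}")
```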

^1 One data vendor we know rounds EPS data to two decimal places, which seems simple and intuitive enough. But assume a stock has an EPS of $.05 and then splits 2-for-1: EPS is now $.025, rounded to $.03. After another split, half of the rounded $.03 is $.015, which rounds to $.02. After a third split, EPS is $.01. Then comes a fourth split, and the vendor assumes that $.005 always rounds up to $.01. Under these types of assumptions, MSFT had no EPS growth through the decade of the 1990s, and the vendor is not likely to point this out.
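
The pathology is easy to reproduce. The sketch below, using the footnote's numbers, halves EPS at each 2-for-1 split and then rounds the already-rounded figure half-up to the cent, as the vendor in the anecdote does.

```python
# Reproducing the footnote's rounding pathology with Decimal arithmetic:
# halve EPS at each 2-for-1 split, then round the already-rounded figure
# to the cent, with exact halves rounded up as the vendor assumes.
from decimal import Decimal, ROUND_HALF_UP

def split_adjusted_eps(eps: Decimal) -> Decimal:
    """Vendor-style EPS after a 2-for-1 split: halve, then round to cents."""
    return (eps / 2).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

eps = Decimal("0.05")
for split in range(1, 5):
    eps = split_adjusted_eps(eps)
    print(split, eps)   # 0.03, 0.02, 0.01, 0.01: growth is rounded away
```

After four splits the vendor's figure is pinned at $.01 even though the unrounded per-share figure would be $.003125, so earnings growth that merely keeps pace with the splits shows up as flat EPS.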
