Exponential Crossover Reader Feedback
with comments by Ed

Reference Library

Sat, 22 Oct 2005
(n+1)/2 and DT
Dear Mr. Seykota,
Using your data, SPC1.csv:
And equation:
EL(t) = EL(t-1) + dt * (P - EL(t-1)) / TC
Where,
t = Aug 22, 1982
t-1 = Aug 21, 1982
EL(t-1) = 117.45
dt = 1
P = 117.90
TC = 150
EL(t) = 117.453

On your website it shows:
http://www.seykota.com/tribe/TSP/EA/2005_u_11_EA_System/Metrics_Log_00.txt
EL(t) = 117.456
I notice that if "TC" is added to "1" and then divided by "2" then the
calculation matches your metrics log but I don't see a point (related to
trading) in doing that.
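The discrepancy is easy to reproduce. Below is a minimal sketch of the arithmetic, using the values quoted above; the (TC + 1) / 2 adjustment is the reader's hypothesis about what the Metrics Log does.

```python
# Sketch of the reader's arithmetic: one step of the exponential lag,
# with and without the (TC + 1) / 2 time-constant adjustment.
# Values are the ones quoted above (Aug 21-22, 1982, TC = 150).

def el_step(prev_el, price, tc, dt=1.0):
    """One update of the lag: EL(t) = EL(t-1) + dt * (P - EL(t-1)) / TC."""
    return prev_el + dt * (price - prev_el) / tc

prev_el, price, tc = 117.45, 117.90, 150

plain = el_step(prev_el, price, tc)               # raw TC = 150
adjusted = el_step(prev_el, price, (tc + 1) / 2)  # TC -> (150 + 1) / 2 = 75.5

print(round(plain, 3))     # 117.453 -- the reader's value
print(round(adjusted, 3))  # 117.456 -- the value in the Metrics Log
```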
Furthermore, I think dt should not equal "1" over holidays and weekends
as your metrics log shows. I believe that just because it's not traded
publicly during weekends and holidays doesn't mean that the price
("value" of the asset) doesn't fluctuate.
Thanks for the education! 
See the
TSP resources for more information on the theory of lags and moving
averages and the (n+1)/2 factor.
Yes, this
particular system increments the lags on trading days only. If you
include increments for holidays and weekends, you get slightly different
values for the lags and, perhaps, slightly different trading results.
You might even get a different optimal solution. You might then use
that optimal to run the every-day-incremental system.
Mon, 24 Oct 2005
TSP Exponential Lag Match to the Penny
I match your run to the penny and narrowed the search using your results
as a starting point.
Slow Lag= 325
Fast Lag= 85
Bliss = 0.210
ICAGR = 0.112
Drawdown = 0.534
start bal = $1,000,000.00
end bal = $13,551,818.75
Slow:      300    305    310    315    320    325    330    335    340    345    350    355    360

Fast 75
 Bliss     0.175  0.174  0.173  0.166  0.169  0.176  0.180  0.189  0.180  0.198  0.193  0.194  0.148
 ICAGR     0.097  0.096  0.095  0.093  0.094  0.096  0.099  0.103  0.098  0.108  0.105  0.106  0.084
 Drawdown  0.555  0.553  0.549  0.562  0.556  0.546  0.547  0.543  0.545  0.546  0.543  0.544  0.568

Fast 80
 Bliss     0.174  0.173  0.173  0.177  0.187  0.191  0.195  0.208  0.199  0.196  0.188  0.148  0.148
 ICAGR     0.096  0.096  0.098  0.099  0.104  0.105  0.108  0.113  0.108  0.107  0.102  0.081  0.081
 Drawdown  0.554  0.552  0.564  0.561  0.555  0.551  0.552  0.542  0.544  0.545  0.546  0.548  0.546

Fast 85
 Bliss     0.180  0.181  0.178  0.191  0.193  0.210  0.192  0.192  0.191  0.152  0.149  0.149  0.153
 ICAGR     0.099  0.100  0.100  0.104  0.102  0.112  0.106  0.106  0.104  0.082  0.081  0.081  0.083
 Drawdown  0.549  0.554  0.563  0.543  0.532  0.534  0.551  0.553  0.547  0.541  0.544  0.544  0.543

Fast 90
 Bliss     0.177  0.182  0.179  0.188  0.199  0.192  0.192  0.183  0.145  0.148  0.152  0.149  0.145
 ICAGR     0.098  0.101  0.100  0.099  0.108  0.104  0.102  0.101  0.080  0.080  0.082  0.082  0.080
 Drawdown  0.552  0.553  0.560  0.523  0.540  0.542  0.530  0.554  0.554  0.543  0.542  0.547  0.548

Fast 95
 Bliss     0.180  0.178  0.178  0.207  0.199  0.190  0.152  0.154  0.146  0.144  0.144  0.142  0.141
 ICAGR     0.100  0.099  0.100  0.110  0.104  0.103  0.081  0.082  0.081  0.078  0.078  0.078  0.077
 Drawdown  0.552  0.553  0.562  0.533  0.522  0.544  0.532  0.534  0.551  0.542  0.542  0.546  0.545

Fast 100
 Bliss     0.184  0.186  0.190  0.197  0.162  0.160  0.156  0.151  0.147  0.148  0.150  0.145  0.148
 ICAGR     0.101  0.102  0.106  0.106  0.086  0.084  0.085  0.080  0.082  0.082  0.081  0.079  0.081
 Drawdown  0.549  0.549  0.559  0.538  0.531  0.527  0.541  0.531  0.555  0.551  0.544  0.544  0.546

Fast 105
 Bliss     0.181  0.185  0.189  0.155  0.158  0.160  0.155  0.161  0.154  0.155  0.155  0.148  0.150
 ICAGR     0.100  0.104  0.103  0.084  0.084  0.084  0.084  0.085  0.085  0.085  0.084  0.081  0.082
 Drawdown  0.551  0.562  0.547  0.540  0.530  0.523  0.539  0.528  0.551  0.551  0.545  0.548  0.546
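The three blocks of the table read as Bliss, ICAGR, and drawdown. Elsewhere on this page bliss (frequency) is described as ICAGR over maximum drawdown; a quick sketch checks the reported optimum against that ratio:

```python
# Quick consistency check: Bliss (frequency) = ICAGR / max drawdown,
# per the ICAGR / MAXDD definition cited elsewhere on this page.
# Numbers are the reported 325/85 and 150/15 results.

def bliss(icagr, max_dd):
    return icagr / max_dd

print(round(bliss(0.1121, 0.5335), 4))  # ~0.2101 for the 325/85 run
print(round(bliss(0.0514, 0.6090), 4))  # ~0.0844 for the 150/15 run
```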

Very nice!

Tue, 18 Oct
2005
Match
Hi, Ed. Using Excel, I have matched the ending equity for the 150/15
system to the penny, and also have matched the ICAGR: your number
.0514, mine .05139703, which is close enough for government work!
Still
working to figure out how to program Per Cent Drawdown, and how to
generate the Trade Log report from the Metrics Log. Also, am working
with [others] on duplicating the results in Traders
Studio. This is an excellent exercise and learning experience. 
For more information on draw down, see resources / system math.
The Trade Log and the Metrics Log are different views of the results, I
do not generate one from the other.

Mon, 17 Oct
2005
Off by 100x
Ed,
I'm confused. I checked the web site today
http://www.seykota.com/tribe/TSP/EA/2005_u_11_EA_System/00.htm
and the original equity specified is $100,000,000, but an Excel file
that matches results in the current Trade_Log and Metrics_Log starts
with
$1,000,000?
My TradersStudio code, which gets close but not yet exact results,
starts with $1M. I thought I got that number from the web site when the
project started, but I may have made a mistake.
If I change the starting equity ... in TradersStudio,
I
no longer match the log results.
Has the starting Equity on the web site changed from $1M to $100M since
the
project started? 
See Below.

Mon, 17 Oct 2005
Animated GIF for bliss;
Reproducing Results for Exponential Crossover System
Dear Ed,
I reproduced the results for the "Exponential Crossover System" for both
parameter sets (150/15) and (325/85) to the penny. I am using my own
backtesting engine written in C#. I had to change the type of some
variables
of my backtesting engine from float (4 byte) to double (8 byte).
I created an animated GIF for the "Exponential Crossover System". The
GIF
contains a graphic showing how bliss depends on the "Slow Averaging
Time",
the "Fast Averaging Time" and the "Heat". I simply generated different
frames for different values of heat, so the heat changes with the time
as
the animation runs. You can open the GIFs in a web browser or other
software
which is capable of playing animated GIFs.
I also wanted to make you aware of what I believe is a mistake. You
write
that the start equity is 100000000.00 instead of 1000000.00 in the
tables on
following pages:
http://www.seykota.com/tribe/TSP/EA/2005_u_11_EA_System/00.htm
http://www.seykota.com/tribe/TSP/EA/2005_u_11_EA_System/77.htm
Greetings!

Nice Graphics! For your other comments, see below.

Mon, 17 Oct 2005
Impossible Margin Requirements in the 150/15 EMA Crossover System
The total starting equity is $100,000,000, not the $1,000,000 you used in
your test.
More thoughts: if your test starts with $1M, then maybe you get a
signal to buy about 65 contracts.
The S&P today is roughly 10 times its 1982 level; maybe the margins are
now 10 times the size they were then.
A rough estimate of the total margin required ($100M account, buy signal
in 1982) is:
6500 contracts times $1,969 margin per contract = $12,798,500
This is under the $100M starting size.
The 100x factor stems from comparing pennies and dollars. For
information on contract sizing, see below.

Sun, 16 Oct
2005
Skid Problems
Ed,
I am not able to get Skid working in the commercial software packages.
Is there anyone out there who has matched your 150/15 EMA Crossover
system in TradersStudio, TradeStation, MetaStock, or some other
commercial software package?
I was working on the EMA Cross Over system today and have an Excel model
working that matches your 150/15 system, but am still not able to match
the
system in TradersStudio or TradeStation or MetaStock. I was able to get
around the initialization differences for the EMA because of MaxBarsBack
requirements in the commercial software, and was able to match the EMA
and
ATR by writing custom functions, but I am not able to get around the
SkidPrice calculation as of yet.
In the commercial software when an order is triggered, they can access
Open
and High for that day, but they are not able to access the Open and High
for
the next day when the order actually fills. So, I can't get the
SkidPrice to
match by one day.
Skid is defined as 50% of the distance between the Open and the High the
day
the limit order fills.
The TradersStudio code that is not working is:
Skid = 0.5
If CrossesOver(FastEMA, SlowEMA) Then
    SkidPrice = Open[0] + (Skid * (High[0] - Open[0]))
    Buy("Buy", PositionSize, SkidPrice, Limit, Day)
End If
In the code above, the entry price (SkidPrice) is calculated on the
range
between Open and High on the day the Buy signal is triggered, not on the
day
the order is actually filled in the market the next day.
Theoretically the following code would access tomorrow's SkidPrice:
SkidPrice = Open[-1] + (Skid * (High[-1] - Open[-1]))
But -1 is not allowed, I'm sure for good reasons.
Does anyone know how do I access the Open and High on the day the limit
order fills in commercial software packages to get the SkidPrice that
would
actually happen in real world trading that matches Ed's results? 
I do not know of commercial testers that allow you to access anything
about tomorrow, today.
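One vendor-neutral workaround is to iterate over the bars yourself: detect the signal on bar i, then price the fill from bar i+1's Open and High, so no forward indexing is ever needed. A minimal sketch (the bar data here are illustrative, not from SPC1.csv):

```python
# Vendor-neutral sketch of the skid fill: the signal is detected on bar i,
# and the fill price uses bar i+1's Open and High (the day the order fills).
# The bars and the signal location here are illustrative, not Ed's data.

SKID = 0.5  # fill 50% of the way from Open toward High

def skid_fill(next_open, next_high, skid=SKID):
    """Fill price on the day after the signal: Open + skid * (High - Open)."""
    return next_open + skid * (next_high - next_open)

# bars: (open, high); suppose a buy signal fires on bar 0
bars = [(100.0, 101.0), (100.5, 102.5)]

signal_bar = 0
fill_bar = signal_bar + 1  # the order fills on the next bar
next_open, next_high = bars[fill_bar]
print(skid_fill(next_open, next_high))  # 101.5
```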

Sun, 16 Oct 2005
Impossible Margin Requirements in the 150/15 EMA Crossover System?
Ed,
I was working on the 150/15 EMA Crossover system this weekend, and it
looks
to me like the current system places orders that are impossible because
of
margin requirements.
For example the first order for this system in your Trade_Log is on
09/01/1982 for 6500 Units at 119.525 when the Equity is $1,000,000.
My Refco account currently requires $19,688 entry margin for one S&P
Index
contract.
6500 contracts times $19,688 margin per contract = $127,972,000
Using the Heat factor of 0.10 on Equity of $1,000,000 then the money
available for risking on one trade would be $100,000.
$100,000 at risk, divided by $19,688 entry margin requirement = 5
Contracts
(rounded down to the integer) not 6500 contracts.
It seems like your system needs to calculate trade size also taking
margin
requirements into account.
Am I missing something? 
The system accounting uses lots that have a value of $1.00 per handle
and rounds to the nearest 50, which is one mini contract. In this
case 6,500 lots equals 130 contracts that have a gearing of $50.00 per
handle. The current margin on the mini is about $4,000, so the
total margin requirement is about $520,000.
The risk per trade is not the same thing as the margin per trade.
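Ed's lot accounting above can be sketched directly; the nearest-50 rounding and the approximate $4,000 mini margin are taken from the reply, while the function names are illustrative:

```python
# Sketch of the lot accounting in the reply above: lots are worth $1.00
# per handle, positions round to the nearest 50 lots, and 50 lots equal
# one mini contract with $50.00-per-handle gearing.

LOTS_PER_MINI = 50
MINI_MARGIN = 4_000  # approximate mini margin quoted above

def round_to_minis(lots):
    """Round a lot count to the nearest 50 lots (one mini contract)."""
    return LOTS_PER_MINI * round(lots / LOTS_PER_MINI)

lots = 6_500
minis = round_to_minis(lots) // LOTS_PER_MINI
print(minis)                # 130 contracts
print(minis * MINI_MARGIN)  # 520000 -- total margin requirement
```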

Tue,
13 Sep 2005
The
Lake Ratio
Greetings Ed,
I am a visual learner. If I see it, I understand it. I use the Lake
Ratio and graph it. I see and understand that I like swimming in lakes.
Before using the Lake Ratio, I worry about account balances. Now I look
to see what my lake looks like. It is a complete change, less stress and
more being. 
For
a real thrill, you might try swimming Lake Tahoe in the winter.

Tue,
13 Sep 2005
Still
Finding Differences:
Seeking
the limit as epsilon goes to zero
Ed,
When I format my log files exactly as yours are formatted and use the
Unix diff command on them I notice some small differences in the metrics
log. The other 2 logs are identical.
1) you have 2 entries for 05722F and I only have 1.
2) the thousandth position in our slow/fast/atr columns sometimes
differs by 1
As for difference 1 it seems useful to have a metric that contains the
final equity value so I might change my code to match your output.
I'm not sure difference 2 is worth the trouble of resolving but out of
curiosity I intend to take a look at it anyway.

Nice Discovery! I may have some ambiguity in my system in translating vendor
data formats (float) to (double). I use integer arithmetic for the
accounting, so I can get that to the penny.
The
double entry for the last day reflects that the back office does not know
the trade at the close, so it posts equity as if the trade is still
on. A while later, the back office knows the trade and posts the
adjustment.

Tue,
13 Sep 2005
CAGR
Computation
Ed,
I wonder if you've changed something in the computation of CAGR. I obtain
your results in the 200/50, 300/50, 400/50 tests (respectively 0.0512,
0.0936, 0.1424), but in the latest ones you published (150/15 and
325/85) they're different.
Does ICAGR have a different meaning than CAGR? It seems to me that the
formula you use in the C++ code, using the log function, is far from
the standard way to compute CAGR.
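The two conventions can be contrasted directly. A sketch, assuming ICAGR is the log (instantaneously compounded) rate and CAGR the discretely compounded rate; the roughly 23.25-year span (4/21/1982 to 7/22/2005) is an assumed input, so the last digit may differ from the published figures:

```python
import math

# Contrast of the two growth-rate conventions discussed above.
# ICAGR: instantaneously compounded (log) rate; CAGR: discretely compounded.
# The ~23.25-year span is an assumption, not a figure from the logs.

def icagr(start, end, years):
    return math.log(end / start) / years

def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1.0

years = 23.25
start, end = 1_000_000.00, 13_551_818.75

print(round(icagr(start, end, years), 4))  # ~0.1121, matching the 325/85 log
print(round(cagr(start, end, years), 4))   # ~0.1186 -- noticeably higher
```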

Yes,
see the Reference Library link, above.

Mon,
12 Sep 2005
Typos
Ed,
FYI,
1.
The copyright under your banners is still 2004.
2. At:
To
see how this works, scan down the metrics log to mid-September, 1982.
820827F slow=113.110 fast=112.055 
820830M slow=113.177 fast=112.817 
820831T slow=113.254 fast=113.590 +
820901W slow=113.306 fast=114.035 +
The fast average crosses above the slow average on 9/31/82 (See the +
sign).
"Mid-September" and the dates that follow (820827F
slow=113.110 fast=112.055) don't seem to align.
9/31/82: I hope that is just a typo, as I am having trouble recalling the
day.
Nice Catch!

Mon,
12 Sep 2005
Exact
Match
Ed,
At first I have trouble duplicating your growth rate and bliss values.
Then I notice that the new system rules use the instantaneously
compounded annual growth rate instead of the compound annual growth
rate. After making this substitution I am able to precisely duplicate
all of your results.
Slow  Fast  Heat  Bliss   ICAGR   MaxDD   End
150   15    0.10  0.0844  0.0514  0.6090  3303931.25
325   85    0.10  0.2101  0.1121  0.5335  13551818.75
Good Work!

Mon,
12 Sep 2005
ATR
Risk Multiplier Parameter
Ed,
On the TSP page Ed Writes: "You might also
notice that you make the most money with a slow system with a high heat
and low ATR multiplier, although this configuration also delivers very
large drawdowns."
Also, from your system math document you list the system parameters as:
Fast Lag Time Constant
Slow Lag Time Constant
ATR Time Constant
ATR Risk Multiplier
Skid Fraction
Heat
I suggest that ATR Risk Multiplier is not an actual parameter, but
rather just a convenient way to adjust heat. We can get the exact same
results by setting the multiplier to 1 and setting the heat higher.
Example:
ATR multiplier = 5, heat = 10% is equivalent to ATR Multiplier = 1 and
heat = 50%. When I test these two parameter sets the results are
identical.
Therefore, I recommend keeping the ATR Multiplier for the sake of
convenient bet sizing, but not using it as a parameter to optimize. We
accomplish that, in effect, by optimizing the heat parameter. 
Traders sometimes think in terms of 1 ATR, 2 ATRs, etc. when discussing
stop placement, as in, "I have my sell stops 2 ATRs below the market and
my buy stops every 1/2 ATR above."
Traders also think in terms of overall portfolio heat, particularly for
systems that have multiple entries or do not use ATR at all.
In the Simple EA System, these parameters combine as a quotient,
Q=(Heat/ATR), so, as you point out, you can get all test combinations by
varying only one of them.
Per your example:
h=0.1/A=5 ==> Q=.02
h=0.5/A=1 ==> Q=.50
These do not appear to be the same.
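The quotient check is one line of arithmetic; a sketch (the function name is illustrative, and the sizing role of Q is schematic, not the system's actual code):

```python
# Schematic check of the quotient Q = Heat / ATR_Multiplier from the reply
# above. The reader's claim is that the two parameter sets below are
# equivalent; their Q values say otherwise.

def quotient(heat, atr_multiplier):
    return heat / atr_multiplier

q1 = quotient(0.1, 5)  # heat = 10%, ATR multiplier = 5
q2 = quotient(0.5, 1)  # heat = 50%, ATR multiplier = 1

print(round(q1, 4))  # 0.02
print(round(q2, 4))  # 0.5
print(round(q1, 4) == round(q2, 4))  # False -- not the same parameter set
```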

Mon,
12 Sep 2005
Matches
to the Penny
Hi Ed,
I confirmed your results down to the penny. Included two screenshots of
the 15/150/0.1 and 85/325/0.1 run. I did have to modify my CAGR function
according to your math document. (I did my calculation based on years
without fractions.) Source code: form1.cs


Sun,
11 Sep 2005
TSP Results
Ed,
Here are my results for 150/15:
Equity 3,303,931.25
CAGR 0.0527
ICAGR 0.0514
Max DD 0.609
Bliss 0.0844
and for 325/85:
Equity 13,551,818.75
CAGR 0.1185
ICAGR 0.1121
Max DD 0.5335
Bliss 0.2101 

Sun, 11 Sep
2005
Results
Match
Ed,
My software is Mechanica. I feed the output into Excel when I want to do
some custom analysis of results.
The CAGR that my testing software computes varies from yours as follows:
1. It computes not the instantaneous CAGR, but the other version.
2. It uses 365 days for a year rather than 365.25.
3. It uses the date of the first trade entry as the beginning of the
measurement period rather than the data of the first available data.
I am now computing both CAGR and ICAGR per the calculations in the
spreadsheet. When I do this, my ICAGR matches yours. Thus, frequency (ICAGR
/ MAXDD) matches yours as well.
I suggest that CAGR, in either of its versions, ought to have a
start date of the end of the indicator priming period. In this case that
is 25 bars after the beginning of the data file, since the system is not
eligible to take trades until then. This is a nitpicky item, but it
ensures that we measure CAGR over the period when capital is at risk. 
Good Work!

Sun, 11 Sep 2005
Results Match
Ed,
This spreadsheet includes my verification of your 85/325 EA cross system.
My numbers agree with yours with the following exceptions:
My testing software displays trade P/Ls and equity amounts to the nearest
dollar, not to the penny. However, it does preserve trade P/Ls down to the
penny for internal calculations, only rounding the final result for
display purposes. Therefore, my spreadsheet shows some very slight
differences from your results. My ending equity is $13,551,819 compared to
yours of $13,551,818.75, for a difference of 25 cents. This is 100%
attributable to rounding the ending equity to the nearest dollar.
Additionally, although my trades and equity values match yours, my
software reports a compound annual growth rate of 12.10% over the test,
whereas yours is 11.21%. I confirm the way that my software calculates
CAGR. It is easy to get slightly different results depending on the method
for deciding how many annual periods are in the test, as well as when the
test starts and stops.
My software seems to measure from the time that the
first trade enters to the last day of the test. I find it more intuitive
to use the first day of the test period (4/21/1982), rather than the day
of the first trade inception.
However, in order to match your results, I find that I
need to input 24.53 as the number of annual periods in the test. I believe
that the testing period (from 4/21/1982 to 7/22/2005) is shorter than
this. You might want to double-check the method you use to figure the
growth rate and share it with us so we can match your calculations.
Other than these two relatively minor issues, I believe my results match
yours. 
Good Work!

Fri,
9 Sep 2005
Trade Station
and Exponentials
Ed,
I think there is more than rounding or Double, Float, or Long number
formatting going on here. What about the MaxDaysBack issue that starts
the Exponential Smoothing on different days (that I sent previous email
about). Your system probably does not have MaxDaysBack in the
programming and appears to get its first value on day 2. TradeStation,
TradersStudio, and other commercial software have this programming. The
minimum MaxBarsBack is 20 days because of the parameters of the system.
That makes the starting date for the Exponential Smoothing different by
19 days because it starts the EMA on day 20. That makes my first trade 6
days earlier than your first trade, and it takes 2 ½ years for your
Exponential Smoothing to converge with mine exactly because of the 20
day starting difference.
How close are others coming to your results? Are they getting it to the
dollar, but not the penny, or are they off by dollars? On the previous
version-one data I came within 15.3 points of your total, which rounds up
to 2/100ths of one percent. That amounts to $3,825 difference from your
results.
It sounds like your results posted on the web site are different than
the original and I need to compare my results to the version 2, correct?
Is there someone duplicating your results with TradeStation? If so, I
would like to talk with them about the EMA and MaxBarsBack issue. 
The
Exponential Average is not an average. It is a lag. The initialization
scheme is arbitrary. I initialize the lags on the first close. I do not
wait 20 days to initialize it. I wait 20 days before trading so
the lags can separate. In theory, exponential lags with
different initializations never converge. For practical purposes, they
converge at about three time constants. The purpose of the exercise is to
check the logic of the systems, to set a foundation for more complex
studies.
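The three-time-constant rule of thumb is easy to verify numerically: with a per-step retention of (1 - 1/TC), the gap between two differently initialized lags decays to about e^-3, roughly 5%, after 3 x TC updates. A sketch with made-up numbers:

```python
# Numeric check of "lags converge at about three time constants": each
# update multiplies the gap between two differently initialized lags by
# (1 - 1/TC), so after 3*TC updates about e**-3 (~5%) of the gap remains.

def lag_step(el, price, tc):
    return el + (price - el) / tc

tc = 50
a, b = 100.0, 120.0   # two different initializations, 20 points apart
price = 110.0         # a constant price is enough to show the decay

for _ in range(3 * tc):
    a = lag_step(a, price, tc)
    b = lag_step(b, price, tc)

remaining = abs(a - b) / 20.0  # fraction of the original gap left
print(round(remaining, 3))  # about 0.048, close to e**-3 ~ 0.0498
```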
Perhaps you can find a workaround for TradeStation.
I
have feedback that my latest version now agrees with others to the
penny. 
Fri,
9 Sep 2005
More
TSP Optimizations
Ed,
I ran some more tests on the S&P sample data file from your Trading
System Project page.
I include a spreadsheet in a Zip file because it is too large to email
otherwise. I am not sure if it will download from your site or not. I
know the last one did not.
There are two graphs, one 3D and one 2D. They both represent the same
data set. The 3D chart makes it easy to see where the peaks and valleys
are, especially when you grab one corner of the chart and rotate it
around to view from a few different angles. The 2D chart tends to be
good for finding the exact fast and slow EA values that correspond to a
point on the chart. These charts cover the parameter space that showed
the highest peak on my broader test from a few days ago. The Fast EA
time constant steps from 70 to 100 by 1s, and the Slow EA time constant
steps from 300 to 360 by 1s.
The neato thing about these charts is that by manipulating the pulldown
menu labeled "% Volatility" you change the heat level of the
results you are viewing on the chart. You can adjust it from 5% heat to
50% heat in 5% increments. As long as the heat level is set the same on
the 2D and 3D charts they use exactly the same color schemes and you can
easily see what EA values correspond to peaks and valleys on the 3D
chart.
I find it interesting to note that the charts retain the same general
shape across different heat levels, but the optimal parameter values
(the ones that produce the highest frequencies) do tend to shift a bit.
At the highest heat of 50%, the optimal parameters look to be about
97 and 315. At the lowest heat of 5% the optimal parameters are about 80
and 335.
Though these parameter sets are rather close to one another, it does
imply that perhaps the "right" parameters to trade vary with
heat.
Another take on it is that a system that shows much different shaped
graphs for different heat levels might be prone to more
"drift" of the optimal parameter set than a system where the
shape of the graph doesn't change much with heat.
It is also interesting to note how many variables there are and how much
computing power it takes to explore even relatively simple systems in
depth. They don't get much simpler than this one: 1 market EA crossover
system. However, this latest test to explore the sensitivity of a
combination of Fast EA, Slow EA, and Heat requires over 18,000
individual test runs. This is over a relatively small parameter space,
and we haven't even touched the other parameters of ATR averaging
period, ATR multiplier, or Skid assumption yet.
Slow_Fast_Heat_EA_Optimization.zip 
A synthesis of hunt-and-peck, steamroller, and thinking about the results is
likely the best approach. I get similar values by eyeballing the
chart on the EA System Page: 330 / 90. For a simple system (one
instrument, one system, long only) the optimal solution is fairly
insensitive to heat. If a trader notices a difference between
low-heat and high-heat formulations, it may indicate using a Bliss
Function that weighs drawdown more heavily.
Fri,
9 Sep 2005
Pushing
Heat to the Max
Dear Ed,
Ed Says: I'd like to know how to find a margin
clerk who is willing to go along with your "the more you bet, the
more you make" strategy. You might try for an optimal solution for
initial margin = 10% of value and maintenance margin at half that.
I'd like to find such a margin clerk too. Here is an optimal solution
subject to those margin constraints.
slow averaging time = 821
fast averaging time = 20
heat=0.80
end = 614977462.50
cagr = 0.318
drdn = 0.705
bliss = 0.4512
Consider that if I turn up the heat or if one of the trades performs
even slightly worse then the margin constraint is not met. The system is
so close to the edge that it's extremely likely to fail going forward.
These parameters might maximize frequency in this simulation and they
likely result in negative bliss for most people going forward.
This exercise has prompted a lot of self-examination regarding what my
own bliss function might look like.
I like to call high-heat approaches "Phoenix Systems." They
seem to keep rising again from their own ashes. Aggressive
entrepreneurs tend to favor Phoenix Systems.

Tue,
6 Sep 2005
More
EA optimization
Hi Ed,
Ed Writes:
Nice job. I like your insight that optimal
heat is likely above trader tolerance. That provides a kind of
mathematical proof that system trading is largely psychological.
I wonder if you have a theory to explain the relatively straight line
from (100,100) to (400,400) that seems to separate your chart into two
distinct regions.
You might consider using a color scheme that relates value to brightness
so you can see the peaks and valleys without having to refer to a
legend.
I include here new graphs of my EA optimization that incorporate your
suggestions. I provide both PDF and word formats in case one is easier
for you to work with than the other.
For these graphs, I go with a deep water / shallow water color scheme.
The darker blues represent lower bliss functions (deeper water) and the
lighter blues represent higher bliss functions (shallower water).
I also provide the graph in 3D form. This highlights the peaks and
valleys, but it is easier to refer back to the 2D graph when you want to
see exactly what the fast and slow EA time constants are that correspond
to points on the 3D graph.
The relatively straight line separating the chart into two regions
represents the point where the fast and slow EA time constants are the
same.
There
are no trades when the time constants are equal because no cross occurs.
The area to the lower right of this line represents tests where the fast
EA time constants are greater than the slow EA time constants. The
result is a kind of "countertrend" system that goes long when
the EA with the shortest time constant crosses *below* the longer EA. It
is a validation of trend following principles that such systems seem to
perform markedly worse than systems where the fast EA time constant is
less than the slow EA time constant.

Nice Graphics!

Mon,
5 Sep 2005
Going
for the Heat
Dear
Ed,
Thank you for your comments. You said "I wonder if you can get an
optimal solution for Bliss = f(slow, fast, heat)"
I did some further testing and found that slow EA of 180 is superior to
the alternatives regardless of heat. So I optimized for Bliss = f(fast
EA, heat) and for Drawdown = f(fast EA, heat). Please see the attachment
for results (charts 2 and 3).
I found that fast EA of 25 is always the best solution. Bliss and
drawdown increase as heat increases, however at heat of 0.8 and above
the bliss function turns flat (very little incremental increase). At
heat = 0.8 drawdown approaches extreme levels (greater than 0.7).
Max. drawdown is almost flat between heat = 0.6 and heat = 0.8, so if
one can handle this kind of volatility one may as well go for heat =
0.8. At levels above heat = 0.8 drawdown again increases at a higher
rate.
Thank you for helping me become a better trader. 
Yes,
managing the interface between the mind and the stomach is an essential
task of trading.

Mon,
5 Sep 2005
Reproduction
of Exponential Moving Average Values
from
Run 0.0
I reproduce the 50 day and 200 day exponential moving average numbers.
The initial EA value is the simple average of the prior 50 values:
EA = MA(50) = (P[0] + P[1] + P[2] + ... + P[48] + P[49]) / 50
or in my Microsoft Works 6.0 spreadsheet formula:
=(sum(e4:e53))/50
I use this formula to calculate 50 day exponential moving average
values:
EA = EA + (Close - EA) * 2/51
or in my Microsoft Works 6.0 spreadsheet formula:
=h53+((e54-h53)*2/51)
Spreadsheet is attached.
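The spreadsheet recipe translates directly into a short function: seed with the 50-bar simple average, then apply the 2/(n+1) smoothing. A sketch with a made-up price series:

```python
# Python translation of the spreadsheet recipe above: seed the 50-day EA
# with the simple average of the first 50 closes, then update each day
# with the 2/(n+1) smoothing factor. The prices here are made up.

N = 50

def seeded_ea(closes, n=N):
    """Seed with MA(n) over the first n closes, then smooth with 2/(n+1)."""
    ea = sum(closes[:n]) / n  # EA = MA(50) seed
    for close in closes[n:]:
        ea = ea + (close - ea) * 2 / (n + 1)  # EA = EA + (Close - EA) * 2/51
    return ea

closes = [100.0] * 50 + [110.0]  # flat series, then one higher close
print(round(seeded_ea(closes), 4))  # 100.3922, i.e. 100 + 10 * 2/51
```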
Thank You for providing this lesson in trading system design. 
You
might have a look at Reference Library, above, for more information on
Exponential "Averages."

Sun,
4 Sep 2005
Simulation
Results
Ed:
I am excited to have this opportunity to learn from you and other
traders about system building. Attached is my writeup and some analysis
results. AmiBroker 
Nice Job!

Sun,
4 Sep 2005
A
Little Off Using TradeStation
EMA start dates are problematic for me because of MaxBarsBack issues.
Basically
my EMA starts on day 19 because I have to have MaxBarsBack set to 20 for
the ATR part of the system. Your EMA and my EMA eventually converge 2 ½
years in but the first trade is off by 6 days. How do the people using
TradeStation handle matching your EMA?
They
would have the same MaxBarsBack issue. One way I thought about matching
was to manually change the raw data file’s Close on day 19 to be the
same as your EMA on day 19. This would force the EMAs to match from day
one, but manually modifying the raw data doesn’t seem right.
Doesn’t everyone who is duplicating your system use software that has
MaxBarsBack? If so, perhaps it would be better for your system to start
19 or 20 days later to match all the systems that use MaxBarsBack. This
could be done by simply providing the full raw data file to testers and
simply deleting the first 19 or 20 days of the raw data for use with
your system.
Because of this EMA and MaxBarsBack issue, and the fact that different
software systems use Double, Float, or Long number formats to calculate
their results, an exact match may be impossible. Perhaps we
should agree on a percentage match that is acceptable. Perhaps 99% of
ending NetProfit.
Who is duplicating your results with TradeStation? I would like to talk
with them about EMA and MaxBarsBack too. What is the reason for the
rounding to 50 contracts? I can understand rounding to individual
contracts, but doesn’t rounding to 50 affect system performance
results by an additional factor that isn’t necessary?
I’m pretty sure your Trade log is calculating the Net Profit
incorrectly or I’m doing something wrong. On your first trade I would
calculate Net Profit like this:
Entry Price     - 124.82
Exit Price      + 150.45
Points          = 25.63
Contracts       * 3050
Total Points    = 78,171.50
Dollars/Point   * $250.00
Total Dollars   = $19,542,875.00
Your Trade Log    $78,156.20
Delta             $19,464,718.80

You might check with the TradeStation people to find the workaround for
the MaxBarsBack issue. I am carrying the position in round lots of 250
shares.

Sat, 3 Sep
2005
Additional
Thoughts on Exponential Smoothing
Ed,
Changing my formulas in the Excel test file to:
ES(t) = ES(t-1) + (Close(t) - ES(t-1)) / ((Time_Constant + 1) / 2)
allows me to match your Log numbers within ±0.01 and makes the Excel math
match your Log file; I think the remaining differences are rounding issues
or something like that. I will try rewriting the function in TradersStudio
now. It looks like the default xAverage and MyEMA in TradersStudio do
NOT divide by two, because they give results close to my custom Function
without the divide by two.
Something you might make clear in your writing is that the
(n
+ 1) / 2 factor is an arbitrary convention used by some software (and not
by others) to make a 200 period Moving Average and a 200 period
Exponential Average track closely to each other and thus be similar to
each other with the Exponential Average tracking the latest numbers
better.
But using the (n + 1) / 2 convention in the formula means a 200 Exponential
Average is really a 100 Exponential Average, because it really moves 1/100th
of the way towards the latest Close.
I think the word "Average" should not be used with
"Exponential" at all, and that Exponential Smoothing should
stand alone as its own system and not be forced to behave like a Moving
Average by dividing by 2, but apparently the industry has already
adopted this standard.
Since your book uses Exponential Smoothing a lot, it might be a good idea to
explain why the (n + 1) / 2 factor was adopted (or falsely adopted) and that
some systems use it and some systems don't; since the actual math in
these software packages is not disclosed, the use or non-use of the
(n + 1) / 2 convention could give different results than expected when a
user does not know for sure which convention the software is using.

See
the Reference Library, above, for additional commentary on the EA / MA
comparison.

Sat, 3 Sep
2005
Exponential
Average Crossover Log Delta Questions
Ed,
I did notice the EA_Time_Constant = (MA_Averaging_Time + 1)/2 formula,
but it is not located with the Exponential Average material. The use of
“MA_Averaging_Time” is misleading. Also, it is in the Moving Average
section where you say “To acknowledge this effect, some versions of
testing software incorporate this formula. This formula brings
exponential averages in line with moving averages for tracking ramps
(linear markets). It does not bring MA and EA together for steps,
sinusoids or real markets.”
Because of
where the (n + 1)/2 formula was located and the text around it, I didn’t
think you were using it in the formula EAt = EAt-1 + (Closet - EAt-1) /
Time_Constant
I did some modeling (see attachment) (Ed: the winmail.dat file does not
download) about when the MA starts and the results of using 104 days or
200 days, or varying the start date. I notice that changing the days
calculated from 104 to 200 gets different results no matter when
each is started. I also notice that they eventually converge (with
rounding) at 379 days for the Fast EA and 1467 days for the Slow EA.
My confusion
about all this comes from the use of the word “Average” in
Exponential Average. You have said before that Exponential Average is
not really an Average, but the use of the word Average still hooks me as
a reader.
Perhaps the
word “Smoothing” is a better choice to eliminate reader confusion,
and that Exponential Average should be replaced with Exponential
Smoothing in your writing. You should say that you use the words “Exponential
Smoothing” when others use the words “Exponential Average”, but
then you should not use the words “Exponential Average” again in
your book. When I read in another book about Exponential Smoothing and
saw the (n + 1)/2 in that formula, I began to see where I might be in
error. Anyone who is thinking “Average” will not intuitively divide
the number of days by two.
Also, I do not intuitively understand why there is the “+ 1” in the
(n + 1) / 2 formula. In the other book the author gives both examples,
n / 2 and (n + 1) / 2, and says that different software calculates it in
these two different ways. Why is the “+ 1” in the formula?
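One way to see where the “+ 1” comes from (my sketch, not from the text above): on a steadily ramping price, an n-day simple moving average trails the price by (n - 1)/2 days, while the exponential lag EAt = EAt-1 + (Pt - EAt-1)/TC settles into trailing the ramp by TC - 1 days. Matching the two lags gives TC - 1 = (n - 1)/2, i.e. TC = (n + 1)/2. A quick numerical check:

```python
def steady_state_lag(time_constant, steps=5000):
    # Feed a linear ramp P_t = t into the exponential lag and
    # return how many days the lag trails the price at the end.
    ea = 0.0
    for t in range(1, steps + 1):
        ea += (t - ea) / time_constant
    return steps - ea

n = 11                    # moving-average length
print((n - 1) / 2)        # SMA lag on a ramp: 5.0 days
print(round(steady_state_lag((n + 1) / 2), 6))  # exponential lag with TC = 6: 5.0 days
```

Both averages trail the ramp by the same five days, which is all the (n + 1)/2 convention is designed to achieve.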
If the (n + 1)/2 is used, it should be in the formula for Exponential
Smoothing (ES) as something like:
ESt = ESt-1 + (Closet - ESt-1) / ((Time_Constant + 1) / 2)
The bottom line is that I should be using the formula above, correct?

You can easily argue that the terms Exponential Average and Moving
Average are both unfortunate choices. Pragmatically, you might
consider getting it absolutely straight in your own thinking so as to be
immune to the general confusion on the topic.

Sat, 3 Sep
2005
Processing
Metrics Files
Hi Ed,
I am pleased to see the Trading System Project.
I processed the metrics files (Metrics_00.txt, Metrics_11.txt and
Metrics_22.txt). There appears to be some minor discrepancies between
the results posted (Three Runs) and values derived from the metrics
files.
My drawdown calculation is a bit off.
Which entries in the metrics files are used to determine the drawdown
calculation?
Here are my results:
File: Metrics_00.txt
Maximum percent draw down:
High:
Line Number>1354
1987-08-26 Wednesday OHLC:[298.70 298.95 295.30 296.10] slow=252.999
fast=279.769 Atr=4.298 Eq=1808922.43
Low:
Line Number>1392
1987-10-20 Tuesday OHLC:[178.35 199.35 138.35 173.60] slow=258.389
fast=267.766 Atr=19.677 Eq=957057.46
End = 3181650.20
CAGR = 0.0510
Exponential Final Value = 9172467.3385
DrDn = 0.8901 = (High/Low) - 1 = (1808922.43 / 957057.46) - 1
Bliss = 0.0573 = CAGR/(DrDn)
Years to Recover = 17.4408 = 1/Bliss
Lake Ratio = 0.1845
File: Metrics_11.txt
Maximum percent draw down:
High:
Line Number>1354
1987-08-26 Wednesday OHLC:[298.70 298.95 295.30 296.10] slow=240.977
fast=279.769 Atr=4.298 Eq=2541318.80
Low:
Line Number>1392
1987-10-20 Tuesday OHLC:[178.35 199.35 138.35 173.60] slow=247.609
fast=267.766 Atr=19.677 Eq=1054046.37
End = 7946810.74
CAGR = 0.0932
Exponential Final Value = 9172467.3385
DrDn = 1.4110 = (High/Low) - 1 = (2541318.80 / 1054046.37) - 1
Bliss = 0.0661 = CAGR/(DrDn)
Years to Recover = 15.1339 = 1/Bliss
Lake Ratio = 0.1856
File:
Metrics_22.txt
Maximum percent draw down:
High:
Line Number>1354
1987-08-26 Wednesday OHLC:[298.70 298.95 295.30 296.10] slow=231.316
fast=279.769 Atr=4.298 Eq=3140484.79
Low:
Line Number>1392
1987-10-20 Tuesday OHLC:[178.35 199.35 138.35 173.60] slow=238.163
fast=267.766 Atr=19.677 Eq=1171419.87
End = 21863170.99
CAGR = 0.1419
Exponential Final Value = 9172467.3385
DrDn = 1.6809 = (High/Low) - 1 = (3140484.79 / 1171419.87) - 1
Bliss = 0.0844 = CAGR/(DrDn)
Years to Recover = 11.8485 = 1/Bliss
Lake Ratio = 0.1542

DD%
= largest of [drawdown / previous_peak]
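Ed's one-line definition can be sketched as a running-peak calculation (illustrative code, not from the site). Using the 1987 peak and trough equities quoted above, it gives a drawdown fraction rather than the (High/Low) - 1 ratio used in the letter:

```python
def max_drawdown(equity_curve):
    # DD% = largest value of (peak_so_far - equity) / peak_so_far.
    peak = equity_curve[0]
    worst = 0.0
    for eq in equity_curve:
        peak = max(peak, eq)
        worst = max(worst, (peak - eq) / peak)
    return worst

# Start balance, 1987 peak, 1987 trough, ending balance (Metrics_00):
curve = [1_000_000.00, 1_808_922.43, 957_057.46, 3_181_650.20]
print(round(max_drawdown(curve), 4))  # 0.4709
```

Dividing the drop by the previous peak keeps the drawdown between 0 and 1, whereas (High/Low) - 1 can exceed 1, as in the letter's Metrics_11 and Metrics_22 figures.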

Sat, 3 Sep
2005
Exponential
Average Crossover Log Delta Questions
Ed,
I was looking up Exponential Average in my books and just wanted to
confirm what you mean by "Time_Constant" in your formula. In one book,
Exponential Smoothing uses "2/DaysForEmCalculation".
That may be my problem, if "Time_Constant" in your formula is not equal
to DaysForEmCalculation, which would be 200 for the Slow Average and 50
for the Fast Average in Run 0.0.

Use:
Exponential_Lag = (Moving_Average_Time + 1) / 2. See the
Reference Library, above.

Sat, 3 Sep
2005
Exponential
Average Crossover Log Delta Questions
Ed,
I cannot get the same results as your Trade and Metrics Logs for Run
0.0.
I have tried the Trader's Studio standard code xAverage and MyEMA for
two different ways to calculate Exponential Average and have even
written a third calculation special function in Trader's Studio with the
following formula:
EAt = EAt-1 + (Closet - EAt-1) / Time_Constant
I call it EdSeykotaEMA, and I cannot match your Log results. All three
of these formulas get results that are close to each other, but all are
nowhere close to your Log results.
Proof of the mismatch is in the attached Excel file which uses the same
formula above.
Also, if I calculate the Exponential Average using the formula above I
can't match the Log file itself. So it appears to me that the results of
the log file are not from using the exact formula above. It doesn't
appear to be rounding or significant digits error.
I must be making a math error if other people are getting the same
result, but I am using your formula for the math both in Trader's
Studio and in Excel, so I don't know how I could be getting different
results.

Many
people are getting the same results I get, allowing for rounding errors,
likely on my side. You might check the Reference Library, above,
for information on the (N + 1)/2 adjustment factor. I consider it
an honor to have an equation bear my name, even if it is dysfunctional.

Fri, 2 Sep
2005
TSP
Optimization
Ed,
Here is a spreadsheet
(does not seem to download) that shows the results of my optimization of
the EA Crossover system for your sample data file. In this test I hold
all parameters constant at the same values you do in your first example
(heat, ATR averaging time, ATR multiplier, skid) except for the fast and
slow EA time constants.
The spreadsheet shows a graph of values of the bliss function (aka
frequency or MAR) against different combinations of fast and slow EA time
constants. It is labeled, "Bliss vs. EA Time Constants." The
highest frequency appears between fast values of 60 to 110 and slow
values of 280 to 380.
The chart labeled, "Detail Optimization Chart" shows this
parameter space more closely. The data for the previous graph represents
tests where the value of the fast EA steps 10 units between test runs and
the slow EA steps 20 units. This chart shows finer granularity with a fast
EA step of 2 and a slow EA step of 5.
The optimal combination of fast and slow EA time constants appears to
be about 80 and 335. There are several other values in the same
general area of the graph that give similar results.
Note that various time constant combinations might respond differently if
I vary any of the other system parameters that are unchanging in this
test. In particular, frequency tends to increase with heat up to a point.
The
optimal frequency generally occurs at a heat level that is greater than
most traders' heat tolerance.
Also,
slower systems that take fewer trades are typically less volatile (they
have lower returns and smaller drawdowns) for the same heat level as
faster systems that take more trades. Of course, adding more markets to
the simulation would likely alter the optimal parameter values as well.

PS  You can look at the two sheets labeled "Data" and
"Detail Data" for more information about the results for any
individual parameter pair.

Nice
job. I like your insight that optimal heat is likely above trader
tolerance. That provides a kind of mathematical proof that system
trading is largely psychological.
I
wonder if you have a theory to explain the relatively straight line from
(100,100) to (400,400) that seems to separate your chart into two distinct
regions.
You
might consider using a color scheme that relates value to brightness so
you can see the peaks and valleys without having to refer to a legend.

Wed, 31 Aug
2005
TSP
I'm looking at a matrix of results and notice that many of the short-term
systems lose serious money over the entire period. I reduce skid to
0 and notice that many of the short-term systems still lose money
(including the popular 5/20 combination). Even with no skid, short-term
systems are less profitable, and with 50% skid, short-term systems are
extremely hazardous. I decide to consider only systems with longer time
constants and I run a search for the parameters that achieve the highest
bliss.
I vary the parameters as follows:
slow averaging time: 100 to 900 in steps of 100.
fast averaging
time: 25 to less than the slow averaging time in steps of 25
heat: 0.05 to
0.25 in steps of 0.05
I find the most blissful system might be:
slow averaging time = 800
fast averaging time = 25
heat=0.25
end = 32308318.75
cagr = 0.161
drdn = 0.577
bliss = 0.2795
A local maximum in that neighborhood appears to be at slow=826 and
fast=20.
There are no losing trades in the results so the more you bet the more
you make ;)

I'd
like to know how to find a margin clerk who is willing to go along with
your "the more you bet, the more you make" strategy. You
might try for an optimal solution for initial margin = 10% of value and
maintenance margin at half that.

Wed, 31 Aug
2005
Exponential
Average Crossover Reproduction
Ed,
I continue to work on the Trader's Studio code to reproduce your
results. I find that my two different ways to calculate the EMA generate
the first trade on 9/14/1984 or on 10/12/1988, respectively.
When I look at the raw data file, the first date is 19820421, and your
first trade is 3050 units on 9/16/82; only 104 trading days from the
beginning of the data file. How is it possible to do this with a 200 day
EMA? Don't you need 200 days before the first Slow Average Time line can
correctly be used to take a trade?

You
can prime an exponential with any value. See the section on EA for
more information.

Wed, 31 Aug
2005
TSP
Tutorial Result
Hello Ed,
Here are my results from the TSP EA tutorial. I wrote a test program in
C# and confirmed your results. I do have a few
comments on your test results though.
I have a
problem getting the CAGR and the end result to compare 100% with your
runs. My results always come a few cents below or above yours. The
reason seems to be a strange rounding error in either my program or
yours. (Maybe not an error but at least an issue; I am not sure which
program does the calculation right.) Look at example 1.1 at date
820820, where the open price should be 122.475 (you round it to 122.48,
but I assume that is only for presentation). On 820821 the equity is
(123.35 - 122.475) * 6400 = 5600 and not 5999.99. When we close the
position on 840316 the profit is (153.2 - 122.475) * 6400 = 196640. You
show a profit of 192799.80. This might be the cause for the CAGR
calculation to not be 100% exact.
The program also brute-force optimizes the slow/fast averages and the
heat variable. I will start the testing today, but my first test showed
a good Bliss when using the following parameters: Fast: 44 / Slow: 410 /
Heat: 1.0 :) But I think that will not be suitable for the system …
You need a Windows machine with .NET framework to run this program.
Also, I really enjoyed your book. I am trying to get a few people to
start a tribe here in Sweden, but so far no luck ... I will report back
when I get a few people here.

Nice
presentation.

Tue, 30 Aug
2005
Equity
Column
Hi Ed,
I write a program to reproduce your results and compare your results
with mine. I think there might be a problem with the equity column in
your metrics log. For example, the equity values in your Metrics_00.txt
file appear to be for a "now" that is one day earlier than the other
metrics that appear on the same row. For the system rules you give, this
does not affect the trades, since the equity value stabilizes before the
system uses it to size the next position.
Except for this discrepancy the differences in the results of our
simulations are all due to different rounding. For position sizing, I
use an exact rounding to the nearest 50 units. Our (trivial) differences
might best be explained if you are using floating point for the number
of units and have inexact rounding.
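The exact rounding the contributor describes might look like this (an illustrative helper, not his actual code); dividing once and rounding to a whole number of lots avoids the drift that repeated floating-point arithmetic can introduce:

```python
def round_to_lot(units, lot=50):
    # Round a position size to the nearest whole lot multiple.
    return lot * round(units / lot)

print(round_to_lot(6424.9))  # 6400
print(round_to_lot(6425.1))  # 6450
```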
My ending values:
Run 0.0: 3194250.00
Run 1.1: 8001410.00
Run 2.2: 22098018.75

Per
your comments, I plan to clarify the definitions of the columns as to
which are morning values and which are evening values. I plan to
rerun the simulation at higher precision.

Tue, 30 Aug
2005
EA Trading
System
Dear Ed,
Thank you for the Trading System Project.
I was able to duplicate your results using Excel down to the third
decimal. From the fourth decimal on, my
results (4 MB Excel file) differ slightly, probably due to
rounding.
I also optimized for Bliss. My optimal solution for Heat = 0.15 is Bliss
= 0.255, at Slow EA = 800 and Fast EA = 25. However, CAGR is only
0.097. Also, this result is an outlier.
A better result may be Slow EA = 950 and Fast EA = 45 which gives Bliss
= 0.2322. Please see the attached file for a 3D model of the results. I
averaged the results from adjacent data points to come up with a more
meaningful solution.
How would a system like this hold up in actual trading? I mean, how much
should I expect CAGR and DD to differ from historical test results?
I would appreciate any feedback or comments you may have.

Per
your comment, I plan to rerun the simulation at higher precision. I
do not know what you should expect. Nice presentation of Bliss =
f(slow, fast). I wonder if you can get an optimal solution for Bliss
= f(slow, fast, heat).

Tue, 30 Aug
2005
FAQ
Dear Mr. Seykota,
I have a question regarding the EA crossover system in your TSP:
When do you reenter a position after your ATR exit stop has been
triggered, but the crossover system is still on a buy signal?

This
system does not use an ATR stop. It uses ATR to compute a virtual
stop in order to estimate entry risk per contract and position size.

Mon, 29 Aug
2005
EA Crossover System
Verification
Hi Ed,
I am including an Excel spreadsheet that contains my verification of the
test runs you recently post to the Trading System page.
Here are some things I have to figure out before I can duplicate your
results:
1. What kind of exponential lag math are you actually using, and how do
you "prime" the EA?
The webpage indicates you calculate the exponential lag by the formula:
EAt = EAt-1 + (Closet - EAt-1) / Time_Constant
My efforts indicate that a modification to this expression is necessary
for the time constants you use in your example runs to produce the results
you report. The modification is as follows:
EAt = EAt-1 + (Closet - EAt-1) / ((Time_Constant + 1) / 2)
When I put the time constant of 50, for example, into the first
expression and I assume a previous EA value of 117.45 and a current close
of 117.90 (the first data points in your example files), I get a value of
117.459. This does not match your output, nor do the other values that
result from applying this expression for each bar of the data file.
If I plug the same numbers into the second expression, I get a value of
117.467647. This matches your output. This holds for every value of EA
throughout the test.
Concerning the priming value: I match your test results when I use the
first closing price in the data file for the initial EA value.
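A sketch of the adjusted formula primed with the first close (the function name is mine, not from the site); it reproduces the 117.467647 figure quoted above:

```python
def exponential_lag(closes, nominal_days):
    # Exponential lag primed with the first close, using the
    # (n + 1) / 2 adjustment described in this letter.
    tc = (nominal_days + 1) / 2
    ea = closes[0]                 # prime with the first close
    for close in closes[1:]:
        ea += (close - ea) / tc
    return ea

# First two closes from the sample data file, 50-day fast lag:
print(round(exponential_lag([117.45, 117.90], 50), 6))  # 117.467647
```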
2. What kind of point values are you using for the S&P?
When
I use the actual "big point value" of the S&P of $250, my
results do not match yours at all. I have to assume that each point move
is worth $1 per contract in order to match your results.
3. How do you induce the 50% skid, exactly?
For buying a new long position, I calculate the fill price as: Open +
((High - Open) * .5). For selling to exit a position, I calculate the
fill price as: Open - ((Open - Low) * .5).
Sometimes this results in a value that is halfway between two ticks. I
have to decide whether to round up to the nearest tick, truncate to the
current tick, or something else.
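The skid arithmetic above can be sketched as follows (an illustrative helper, using the 298.70 / 298.95 / 295.30 bar from the metrics log earlier on this page):

```python
def fill_price(open_, high, low, skid=0.5, buying=True):
    # 50% skid: fill halfway between the open and the worst price
    # reachable in the trade direction (high for buys, low for sells).
    if buying:
        return open_ + (high - open_) * skid
    return open_ - (open_ - low) * skid

print(fill_price(298.70, 298.95, 295.30, buying=True))   # buy fill near 298.825
print(fill_price(298.70, 298.95, 295.30, buying=False))  # sell fill near 297.00
```

Note the buy fill of 298.825 falls between ticks, which is exactly the rounding ambiguity described above.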
I try several combinations, and each one gets very close to your
results, but some discrepancy remains.
Overall, I emulate your results for ending equity to within $19 for the
50/200 system, $12 for the 50/300 system, and $267 for the 50/400 system.
All trade entry and exit dates match, and all fills match to within 1
tick. This is pretty darn close, but not yet exact.
I have some preliminary results for an optimization of the long and
short EA values for frequency for this data file. I plan to send it to
you when it is complete. 
1. I use the (N + 1)/2 adjustment and I
prime the Lag with the first close.
2. I am using a $1 point value and buy
in 50-lot multiples. Per your comment I plan to rerun the simulation with
250-lot multiples.
3. I have the skid fraction as a test
parameter, so I can run sensitivity tests on skid. I expect
long-term systems to be insensitive to transaction costs and short-term
systems to be sensitive to transaction costs. I do not round up or
down on fills. In actual trading, I experience split fills with
average prices at non-tradable levels.
4. Per your comment, I plan to rerun my
simulation at a higher degree of precision, by replacing floats with
doubles. In C++, floats carry 4 bytes while doubles carry 8 bytes of
precision.
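The float-versus-double point can be illustrated by round-tripping a value through 4-byte storage (a sketch; Python's struct module stands in for the C++ float type, since Python's own floats are 8-byte doubles):

```python
import struct

def as_float32(x):
    # Store x in 4 bytes (a C++ float) and read it back, showing
    # what an 8-byte double preserves that a 4-byte float cannot.
    return struct.unpack('f', struct.pack('f', x))[0]

balance = 22_098_018.75                # Run 2.2's ending balance above
print(balance - as_float32(balance))   # the 4-byte float drops the cents

lag = 117.467647
print(as_float32(lag) == lag)          # False: a few millionths are lost
```

At equity levels in the tens of millions, a 4-byte float cannot even represent the cents, which is consistent with the penny-level discrepancies reported throughout these letters.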
