
# Genetic algos

Yeah, I had a look at the repo you shared once. Is that the one you'd like to continue with?

I wrote a basic REST client for the IG Markets broker when I was in the UK. You can see a few lines of Java there.

That's my username on GitHub if you want to grant me access to anything.

Last edited: May 11, 2022
#11     May 11, 2022
2. ### ph1l

Here are examples of how I've tried some ideas using genetic algorithms and genetic programming.

I currently use genetic optimization of floating point values for hand-coded rules like
Code:
```
# Count how many of the genetically optimized [lo, hi] interval tests
# pass ($R1) out of the tests performed ($R0); signal when all pass.
$R0 = $R1 = 0;
if ( $cyFitProp_lo >= 0.22716 ) { if ( $cyFitProp_lo <= 0.732629 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $cyFitProp_hi >= 0.29458 ) { if ( $cyFitProp_hi <= 0.904422 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $cphase0_lo >= 2.8355 ) { if ( $cphase0_lo <= 3.95609 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $cphase0_hi >= 2.80795 ) { if ( $cphase0_hi <= 3.76181 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $rlh_per0 >= 0.316866 ) { if ( $rlh_per0 <= 1.02584 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $rlh_phase0 >= 0.000702903 ) { if ( $rlh_phase0 <= 6.28273 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $detrPropNextBar_lo >= 2.92023e-09 ) { if ( $detrPropNextBar_lo <= 0.32465 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $detrPropNextBar_hi >= 8.10645e-06 ) { if ( $detrPropNextBar_hi <= 0.635021 ) { $R1 = 1 + $R1; } }
$R0 = 1 + $R0;
if ( $R1 >= $R0 ) { $return = 1; }
```
When the optimized rules perform well on both in-sample and out-of-sample data, I consider combining them with other (base) rules to try to improve on the performance of the base rules alone. The final combination would be forward tested and might be used for actual trades.
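This is not ph1l's actual optimizer; it's a minimal Python sketch of the general idea of genetically optimizing [lo, hi] interval bounds for rules shaped like the listing above. The feature names, random labels, and fitness function are all made up for illustration.

```python
import random

random.seed(42)

# Hypothetical features standing in for things like cyFitProp_lo, cphase0_hi, etc.
FEATURES = ["f_lo", "f_hi"]
# Toy dataset: (feature dict, binary label) pairs with random values.
DATA = [({f: random.random() for f in FEATURES}, random.choice([0, 1]))
        for _ in range(200)]

def rule_fires(bounds, sample):
    """Fire only when every feature lies inside its [lo, hi] interval,
    mirroring the all-conditions-must-pass structure of the posted rule."""
    return all(bounds[f][0] <= sample[f] <= bounds[f][1] for f in FEATURES)

def fitness(bounds):
    """Fraction of samples where the rule's 0/1 signal matches the label."""
    return sum((1 if rule_fires(bounds, s) else 0) == y
               for s, y in DATA) / len(DATA)

def random_individual():
    """One candidate: a random [lo, hi] interval per feature."""
    out = {}
    for f in FEATURES:
        a, b = sorted(random.random() for _ in range(2))
        out[f] = [a, b]
    return out

def mutate(bounds, sigma=0.05):
    """Nudge one endpoint of one interval, keeping lo <= hi."""
    child = {f: list(v) for f, v in bounds.items()}
    f = random.choice(FEATURES)
    child[f][random.randrange(2)] += random.gauss(0, sigma)
    child[f].sort()
    return child

def evolve(generations=30, pop_size=40, elite=5):
    """Simple elitist GA: keep the best, refill via mutated copies."""
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:elite]
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - elite)]
    return max(pop, key=fitness)

best = evolve()
print("best in-sample fitness:", fitness(best))
```

On real data you would, as the post says, judge such bounds on out-of-sample data too, since optimizing in-sample fitness alone invites overfitting.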

#12     May 11, 2022
3. ### ph1l

This is in response to something @stochastix posted but deleted (it's always possible I imagined that too).

That code listing described the genetic optimization after other software created functions that model price movement like:
Code:
```
y_hi = 110.319160461426 - 0.0421270467340946 * x
     + 7.39092588424683 * skewed_cos(twopi / 88.1613007177421, 0.646005928516388, -0.747746825218201, x, 100);
y_lo = 107.114105224609 - 0.042427085340023 * x
     + 7.60985136032104 * skewed_cos(twopi / 87.9424281352233, 0.619945704936981, -0.802092313766479, x, 100);
```
where:
- y_hi and y_lo are predicted high and low prices of a particular asset
- x is the number of calendar days past the start of a sampling period
- skewed_cos(freq, phase, skew, x, iter) = cos(freq * x + phase) when iter == 0
- skewed_cos(freq, phase, skew, x, iter) = cos(freq * x + phase + skew * skewed_cos(freq, phase, skew, x, iter - 1)) when iter > 0

The genetically optimized parameters are based on how well the functions fit prices, the current phase of the cyclic part of the functions, and the relationships between qualities of the two functions (period and current phase in the example). The same genetically optimized parameters apply to more than one asset, but the functions for any asset depend on that asset's prices and would differ from the functions for another asset. And the functions get recalculated daily to use the most recent sampling period.

As for determining resources to allocate to a trade, I try to keep individual trades at a similar risk level, as in this post.

#13     May 11, 2022
4. ### greejan

I worked with genetic algorithms at university to solve complex optimization problems, basically by brute force (that's what they do). My opinion is that in trading they will only lead you to extreme overfitting.

#14     May 12, 2022

That's right.

There is so much bullshit on the Web that has led people to believe all the strangest things.

Machine learning, TA, "signals", and similar "magical thinking" are all based on the nonsensical premise that there is some "underlying law" animating price moves.

#15     Jun 1, 2022
6. ### Kevin Schmit

Over-fitting may not be as bad as we thought it was:

https://www.stat.berkeley.edu/~bartlett/talks/201908Microsoft.pdf

Bartlett is one of the leading researchers in this emerging sub-discipline.

Try googling the phrase "benign overfitting in deep learning".

A recent paper by Hastie et al. gives some color on what may be going on, at least where the overfit is due to p >> n.
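A toy NumPy sketch (my own illustration, not taken from those papers) of the p >> n regime that work studies: with far more parameters than samples, the minimum-norm least-squares fit interpolates the training data exactly, yet its test error can remain finite rather than blowing up.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 200  # far more parameters than samples: p >> n
X = rng.standard_normal((n, p))

# Sparse ground truth: only the first 5 coefficients matter.
beta_true = np.zeros(p)
beta_true[:5] = 1.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# lstsq returns the minimum-norm solution for an underdetermined system,
# so with p > n it fits the training data exactly (interpolation).
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
train_err = np.mean((X @ beta_hat - y) ** 2)

# Evaluate on fresh data drawn from the same distribution.
X_test = rng.standard_normal((1000, p))
y_test = X_test @ beta_true
test_err = np.mean((X_test @ beta_hat - y_test) ** 2)

print("train MSE:", train_err)  # essentially zero: perfect interpolation
print("test MSE:", test_err)    # nonzero but finite despite zero train error
```

Whether the test error in this regime is "benign" depends on the data distribution and the p/n ratio; the cited work characterizes when interpolation does and does not hurt generalization.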

Last edited: Jun 2, 2022
#16     Jun 2, 2022