I was wondering if it is possible to measure how much faster a genetic algorithm is compared to something like brute-force optimization. Is there a ratio, or at least an approximate ratio, of one to the other in terms of speed? Along the same lines, is there a way to estimate how fast a computer processor can crunch these kinds of numbers? For example, take a fast computer with an Intel QX9650 at 3.0 GHz and 8 GB of memory running Microsoft Vista, optimizing a strategy that has 70,200,000 combinations of inputs. Is there a math formula that would help compare the two types of optimization techniques? I realize that there probably isn't a way to get an exact figure, but an approximation would be helpful.
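One way I thought a rough comparison might work is to count fitness evaluations rather than wall-clock time: brute force evaluates every combination, while a GA evaluates roughly population size × generations. Below is a minimal sketch of that idea on a made-up toy problem (the three-input fitness function, ranges, and GA parameters are all hypothetical, just for illustration; a real strategy's search space would of course differ):

```python
import itertools
import random

# Hypothetical toy problem: three integer inputs, maximize a made-up fitness.
def fitness(a, b, c):
    return -((a - 37) ** 2 + (b - 12) ** 2 + (c - 41) ** 2)

RANGES = [range(50), range(50), range(50)]  # 125,000 combinations

# Brute force: evaluate every single combination.
def brute_force():
    evals, best, best_fit = 0, None, float("-inf")
    for combo in itertools.product(*RANGES):
        evals += 1
        f = fitness(*combo)
        if f > best_fit:
            best, best_fit = combo, f
    return best, evals

# Simple GA: cost is roughly pop_size * generations fitness evaluations.
def genetic(pop_size=50, generations=100, mut_rate=0.1, seed=1):
    rng = random.Random(seed)
    pop = [tuple(rng.choice(r) for r in RANGES) for _ in range(pop_size)]
    evals, best, best_fit = 0, None, float("-inf")
    for _ in range(generations):
        scored = []
        for ind in pop:
            evals += 1
            f = fitness(*ind)
            scored.append((f, ind))
            if f > best_fit:
                best, best_fit = ind, f
        scored.sort(reverse=True)
        parents = [ind for _, ind in scored[: pop_size // 2]]
        pop = []
        while len(pop) < pop_size:
            p1, p2 = rng.sample(parents, 2)
            # Uniform crossover followed by per-gene random mutation.
            child = tuple(g1 if rng.random() < 0.5 else g2
                          for g1, g2 in zip(p1, p2))
            child = tuple(rng.choice(r) if rng.random() < mut_rate else g
                          for g, r in zip(child, RANGES))
            pop.append(child)
        pop[0] = parents[0]  # keep the elite individual
    return best, evals

bf_best, bf_evals = brute_force()
ga_best, ga_evals = genetic()
print("brute force evaluations:", bf_evals)   # 125,000
print("GA evaluations:", ga_evals)            # 50 * 100 = 5,000
print("evaluation ratio:", bf_evals / ga_evals)
```

With these (arbitrary) settings the ratio is 125,000 / 5,000 = 25×, but the GA only approximates the optimum, and the ratio depends entirely on the population size and generation count you pick, which is part of what makes a general formula hard.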