What is your favorite way to detect local minima / maxima on a noisy timeseries? For example, if you flagged every single bar that is above its two neighbors, that would create too many peaks.

You could smooth the timeseries using an N-minute centered average, or you could use an N-minute window min/max method: place a centered window around a bar and decide whether it's a peak within that window or not. With either of these methods, though, you would not be able to detect the peak until N/2 minutes later.

I suppose in discrete math the problem is defined by Nth-order centered differences, where you would compare with the +1 and -1 minute neighbors, or the +2 and -2 minute neighbors, etc. You could define a suitable timeframe for your problem (e.g., 5 minutes, 15 minutes, 30 minutes) and check if a bar is above its +/- Nth neighbors, (N-1)th neighbors, (N-2)th neighbors, and so on. I suppose it's not a peak if any single one of those neighbors is higher. I'm not sure about this though.

Another method is to use moving-average crosses, keeping track of the min / max in between the cross events. You could do this with a single moving average, using the crosses of the timeseries with the MA, or with two moving averages, again tracking the min / max between crosses.

All of these methods will suffer from delays, which is probably inevitable: a peak or trough is not defined until you wait N minutes to see that it really was a peak or trough. Any other ideas?
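For concreteness, here is a minimal sketch of the centered-window min/max idea described above, in plain NumPy. The function name `window_peaks` and the `half_width` parameter are my own choices, not anything standard; note that a peak at bar i is only confirmed once bar i + half_width has arrived, which is exactly the N/2-minute delay mentioned above.

```python
import numpy as np

def window_peaks(x, half_width):
    """Return indices i where x[i] is the strict maximum of the centered
    window x[i - half_width : i + half_width + 1].

    The peak at i can only be confirmed half_width bars later.
    """
    x = np.asarray(x, dtype=float)
    peaks = []
    for i in range(half_width, len(x) - half_width):
        window = x[i - half_width : i + half_width + 1]
        # strict max: x[i] equals the window max and no other bar ties it
        if x[i] == window.max() and (window == x[i]).sum() == 1:
            peaks.append(i)
    return peaks

# toy example: one clear peak at index 4 buried in mild noise
series = [0.0, 1.0, 0.5, 2.0, 3.0, 2.5, 1.0, 0.2, 0.8, 0.3]
print(window_peaks(series, 2))  # → [4]
```

For what it's worth, `scipy.signal.find_peaks` implements a more sophisticated version of this idea, with `distance` and `prominence` parameters that help suppress the noise-driven peaks.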
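And here is a rough sketch of the single-MA cross method: track the running max while price is above its moving average and the running min while below, and emit the extremum whenever the series crosses the MA. I've assumed a simple trailing (non-centered) moving average; the `ma_len` parameter and the tie-break of counting "at the MA" as above are arbitrary choices. As discussed above, the extremum is only reported at the cross, i.e. with a delay.

```python
import numpy as np

def ma_cross_extrema(x, ma_len):
    """Return (index, value, kind) tuples, kind being 'peak' or 'trough',
    where each extremum is confirmed at the next cross of x with its
    trailing ma_len-bar simple moving average.
    """
    x = np.asarray(x, dtype=float)
    events = []
    side = None  # +1 while x >= MA, -1 while x < MA
    ext_i, ext_v = None, None
    for i in range(ma_len - 1, len(x)):
        ma = x[i - ma_len + 1 : i + 1].mean()
        cur = 1 if x[i] >= ma else -1
        if side is None:
            side, ext_i, ext_v = cur, i, x[i]
            continue
        if cur == side:
            # still on the same side of the MA: update the running extremum
            if (side > 0 and x[i] > ext_v) or (side < 0 and x[i] < ext_v):
                ext_i, ext_v = i, x[i]
        else:
            # cross event: the extremum of the finished leg is now confirmed
            events.append((ext_i, ext_v, 'peak' if side > 0 else 'trough'))
            side, ext_i, ext_v = cur, i, x[i]
    return events

# triangle wave: peak at index 4, trough at index 8
events = ma_cross_extrema([1, 2, 3, 4, 5, 4, 3, 2, 1, 2, 3, 4, 5], 3)
print(events)  # → [(4, 5.0, 'peak'), (8, 1.0, 'trough')]
```

The two-MA variant is the same loop with `cur` driven by the sign of fast-MA minus slow-MA instead of price minus MA.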