Wayward asked if I might explain a bit about signal processing as it was not a TA technique he was familiar with. I've posted it here since it's a place where many TAs hang out, so it might benefit them as well. I'm new to the area, so read with care. It's a highly mathematical/technical approach, but the basic principles are quite simple. To explain, I will use pics and text from notes I have taken from various papers and sites I have uncovered during my research.
Share/index/forex prices are signals that behave in wave patterns, just like a brain wave, a heartbeat on an ECG, a voice wave or a radio wave. A price signal = trend component + cycle component + irregular (noise) component. Part of the signal processing challenge is to remove the noise and separate the trend and cycle components.
Conventional TA is based in the time domain, showing how the price signal changes over time. A signal has one or more frequencies in it, and can also be viewed from the frequency domain. From Fourier theory, time-domain financial data is made up of one or more sine waves, each with its own frequency, amplitude and phase angle. In other words, we can transform a time-domain signal into its frequency-domain equivalent. From the pic below, when we add these waveforms they may reinforce or cancel each other out, and their summation thus reflects the volatile stock market behavior they were derived from.
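To make the Fourier idea concrete, here's a minimal sketch in Python/NumPy (I use Python below purely because it's free to try; my own platform is Matlab, and all the numbers here are made up for illustration). A "price" is built from two sine waves, and the transform recovers each component's frequency and amplitude, including the dominant cycle:

```python
import numpy as np

# Hypothetical example: a "price" series that is the sum of two sine waves.
fs = 128                           # number of bars in the window
t = np.arange(fs) / fs
signal = 3.0 * np.sin(2 * np.pi * 4 * t) + 1.0 * np.sin(2 * np.pi * 20 * t)

# Transform to the frequency domain
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(fs, d=1 / fs)
amplitudes = 2 * np.abs(spectrum) / fs   # scale back to sine amplitudes

# The dominant cycle is simply the frequency with the largest amplitude
dominant = freqs[np.argmax(amplitudes)]
print(dominant)                    # → 4.0, the stronger of the two cycles
```

Real price data is of course not a clean sum of two sines, but the mechanics are the same: transform, inspect the spectrum, and read off how much of the signal sits at each frequency.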
Why frequency domain analysis? Three main reasons:
1. To decompose a complex signal into simpler parts to facilitate analysis. For example, to examine how much of the price signal lies within a given frequency band and to determine the dominant frequency.
"The root of the problem of time cycle is, that there always seems to be a current, or dominant cycle, that we can see on our chart right now. The rub is, another cycle is always about to become dominant, overpowering the one we have just located and invested in.” Larry R. Williams (2003): Long-Term Secrets to Short-Term Trading, p. 23
This area of signal processing is covered by literature on Empirical Mode Decomposition, Principal Component Analysis, Wavelet analysis, Fourier Analysis etc. From my limited reading so far, a key feature to watch for is whether you are applying a model to the data or allowing the data to determine the model.
2. Differential and difference equations and convolution operations in the time domain become much simpler algebraic operations in the frequency domain.
3. Ideal for filter performance assessment and design. I'll cover this in a bit more detail because most TAs use filters. The objective of filtering is to eliminate the unwanted frequency components. If you are interested in separating the trend component, you want to allow the low-frequency components of the signal to pass through to your output and you want to stop the high-frequency components. So you would use a low pass filter such as a moving average that smooths the data. Conversely, if your interest is the cyclical component, then apply a high pass filter that removes the low (trend) related frequencies.
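Here is that trend/cycle split as a small Python/NumPy sketch (illustrative numbers only, not real market data): a moving average acts as the low pass filter, and subtracting its output from price gives the complementary high pass (cycle) series.

```python
import numpy as np

# Made-up series: a linear trend plus a 20-bar cycle
n = 400
t = np.arange(n)
trend = 0.05 * t                            # slow, low-frequency component
cycle = 2.0 * np.sin(2 * np.pi * t / 20)    # 20-bar cycle
price = trend + cycle

# Low pass: a simple moving average whose window matches the cycle period
window = 20
kernel = np.ones(window) / window
trend_est = np.convolve(price, kernel, mode="valid")

# High pass: price minus the low pass output leaves the cycle
cycle_est = price[window - 1:] - trend_est

# Note: the moving average lags the true trend by (window - 1) / 2 = 9.5
# bars, which shows up as a constant offset when the trend is a straight
# line -- the lag problem the rest of this post keeps coming back to.
```

Because the MA window exactly matches the cycle period here, the cycle averages out to zero and `trend_est` comes out perfectly straight; on real data the separation is never that clean.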
But why is frequency analysis better for filter performance assessment and design than simply eyeing the filter on the chart? In the frequency domain, we can assess a filter's performance against:
1. Cutoff frequency sharpness: It is desirable to have a sharp or fast roll off, meaning that the amplitude drops as rapidly as possible through the transition band.
2. Passband and stopband ripple: how far the response deviates from flat in the band the filter should pass, and how much leaks through in the band it should stop.
3. Step response: how the filter behaves when price rapidly changes from one value to another. Factors include the degree to which it overshoots or undershoots, for example.
Source: Digital Filters with MATLAB by Ricardo A. Losada 2008
In the above example, the blue filter was designed to have a 0.4 cut-off, whereby low frequencies below 0.4 are passed and higher frequencies above 0.4 are stopped. Compared to the red "ideal" filter, the blue filter has a pretty good transition band and is not performing too badly.
These performance criteria really come into play when you want to assess a number of different adaptive filters to determine which is best. Alternatively, you may want to apply three different filters to achieve a single objective. Each filter might be exceptionally good at meeting a component of your performance criteria.
By an adaptive filter I mean, for example, a filter that responds to the periodicity of the price cycle, given that it is constantly changing. The 19 in a 19-period moving average would be a variable, rather than a fixed input, and it would be derived from your frequency analysis. A range of data pre-conditioning measures, filter applications and fast filter response design features can assist in creating a fairly smooth filter, with low lag, which aggressively follows the price signal. It will do this by applying a mixture of positive and negative weights to the data. To explain that, think about a moving average that is centered on the 10th bar in a 19-bar window. If the weights applied to the first 9 bars equal the weights applied to the last 9 bars, the average is unchanged. One way of putting more emphasis on the last 9 bars is to apply more weight to them. Alternatively, your filter might apply negative weight to some of the first nine bars. Clearly, the mix of positive and negative weights will have crucial implications for the performance of the filter and how aggressively it follows price.
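One way to see the effect of those weights is to measure a filter's lag on a straight-line trend. For a set of weights that sums to one (with index 0 as the newest bar), the delay in bars is just the weighted average of the indices. Here's a small Python/NumPy sketch; the low-lag weights are made up purely to illustrate the idea, not taken from any published filter:

```python
import numpy as np

def lag_in_bars(w):
    """Delay of an FIR filter on a linear trend, in bars.
    w[0] is the weight on the newest bar; weights must sum to 1."""
    w = np.asarray(w, dtype=float)
    assert abs(w.sum() - 1.0) < 1e-9        # unity gain at zero frequency
    return np.dot(np.arange(len(w)), w)

# All-positive weights: a 10-bar simple moving average
sma = np.ones(10) / 10                      # lags price by 4.5 bars

# Hypothetical low-lag filter: heavy positive weights on recent bars,
# small negative weights on the oldest bars, still summing to 1
lowlag = np.array([0.4, 0.3, 0.2, 0.1, 0.1, 0.0,
                   -0.02, -0.03, -0.03, -0.02])
```

The SMA's lag works out to 4.5 bars, while the mixed-sign weights cut it to well under one bar over the same 10-bar window. The price of that aggressiveness, as you'd expect, is a rougher output and more overshoot on turning points.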
For me, the major threshold issues in deciding whether to move in this direction were whether:
- it was making something quite simple complex, for little, if any, additional benefit;
- I was better off using a black box that shielded me from the complexity, given the certainty that I was only going to be able to move so far down this path (code for brain cell deficiency).
The answers were simple. I don’t like black boxes, the complexity was a challenge and the benefits were sufficient. The necessity to learn a programming language was not a major factor compared to the need to delve into some math that is quite troublesome for this non-mathematician.
In terms of a platform to conduct the analysis, I chose Matlab which is arguably the best and definitely the most expensive ($2,900 plus $1450 for each toolbox, of which a basic setup would require 2-3 toolboxes for real time data). I think there is a free equivalent of Matlab called “R”, but with much less documentation and support. Getting my TA package (Metastock) to talk to Matlab has been a nightmare of Metastock’s making.
John Ehlers' books (Rocket Science for Traders, MESA and Trading Market Cycles, and Cybernetic Analysis for Stocks and Futures) would be first ports of call for an exposure to signal processing. There are many downloadable presentations and papers on his site. My primary sources of learning and ideas regarding application come from papers in the fields of oceanography, voice analysis and human physiology. Yep – who would have guessed that studying signaling problems associated with the human brain could assist your stock market TA.
Anyway, hope that was of some use Wayward and to anyone else that bothered to read. It gives you a flavor of what is involved in the area and only touches upon some of the benefits and problems.
If Wayward pops in over the weekend could someone kindly direct him to my post. He's not exactly expecting it here.
Cheers
Bleasby