Author: Leonid Yaroslavsky

Series Title: Digital Signal Processing in Experimental Research

Fast Transform Methods in Digital Signal Processing

Volume 2

eBook: US $24 Special Offer (PDF + Printed Copy): US $101
Printed Copy: US $89
Library License: US $96
ISSN: 1879-4432 (Online)
ISBN: 978-1-60805-026-0 (Print)
ISBN: 978-1-60805-230-1 (Online)
Year of Publication: 2011
DOI: 10.2174/97816080523011110101


This ebook covers fast transform algorithms, their analysis, and their applications in a single volume. It is the result of the author's collaboration with colleagues in the worldwide university community, accumulated over the author's working lifetime of about 40 years, and has culminated in a balanced mix of theoretical development and practical uses of various fast transforms. Readers will thus find practical approaches, not covered elsewhere, to the design and development of fast transform methods.

Some of the most immediate applications, such as detection and analysis of periodicities in data, signal denoising and deblurring, signal resampling, and precise differentiation and integration, are covered and supported by concrete algorithms in this book. Other potential applications are supported by a tour of the theory and mathematical abstractions. The book is addressed to a broad circle of experimentalists, researchers and students who are not formally trained in signal processing and who work in various fields of the experimental sciences, ranging from physics to metrology to biomedical engineering.

Indexed in: EBSCO, Ulrich's Periodicals Directory.


The notion of signal transforms is of fundamental value in signal processing. Whatever signal processing is carried out, it is carried out in the domain of a certain signal transform. Integral transforms, specifically the convolution, Fourier and Laplace integral transforms, have been used in what we now call electronic and communication engineering since its very beginning in the 1920s-40s. It is apparently impossible to give credit to all the individuals who contributed to this process, but at least three names should be mentioned: Oliver Heaviside, Harry Nyquist and Norbert Wiener. In optical imaging, E. Abbe revolutionized the theory even earlier, when he suggested, in the 1880s, treating lenses as Fourier transformers.

In the 1940s-50s, signal processing emerged mainly from the demands of audio and video communication and radar. Being purely analog at the time, it was based on the same natural transforms, convolution and Fourier, implemented through analog low-pass, high-pass and band-pass filters and spectrum analyzers. Initially, integral transforms served only as instruments of signal theory. With the advent of computers, signal processing became digital, which opened a completely new option: making transforms powerful instruments of applied signal processing.

It is no exaggeration to assert that digital signal processing came into being with the introduction, in 1965 by James W. Cooley and John W. Tukey, of the Fast Fourier Transform (FFT) [1]. This publication immediately triggered rapid growth in all branches of digital signal processing and their applications.
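As an illustration (not taken from the book itself), the core idea behind the Cooley-Tukey FFT, recursively splitting an N-point Discrete Fourier Transform into two N/2-point transforms of the even- and odd-indexed samples, can be sketched in a few lines of Python. The function name `fft` and the radix-2 decimation-in-time formulation are the sketch's own choices, assuming an input length that is a power of two.

```python
import cmath

def fft(x):
    """Radix-2 decimation-in-time Cooley-Tukey FFT (illustrative sketch).

    Assumes len(x) is a power of two. Recursively combines the DFTs of the
    even- and odd-indexed samples using the "butterfly" relations, reducing
    the cost from O(N^2) for the direct DFT to O(N log N).
    """
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])  # DFT of even-indexed samples
    odd = fft(x[1::2])   # DFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        # Twiddle factor exp(-2*pi*i*k/N) applied to the odd half
        w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out
```

For example, `fft([0, 1, 0, 0])` reproduces the direct DFT of a shifted unit impulse, whose spectrum is the sampled complex exponential `1, -i, -1, i`.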

The second boom in this growth was associated with the introduction into communication theory and signal processing, in the 1970s, of the Walsh transform [2] and with the development of a large family of fast transforms with FFT-type algorithms [3]. Some of these transforms, such as the Walsh-Hadamard and Haar transforms, already existed in mathematics; others were invented “from scratch” to achieve better “energy compaction” while preserving the principle of fast algorithmic implementation. This development was driven mainly by the needs of data compression, though the usefulness of transform-domain processing for signal restoration, enhancement and feature extraction was also very quickly recognized. This period culminated in the acceptance of the Discrete Cosine Transform (DCT) as the best choice among the available transforms and resulted in the JPEG and MPEG standards for image, audio and video compression.

The next milestone in transform signal processing came in the 1980s with the introduction of a large family of transforms known as wavelets [4]. This development continued the invention of new transforms better suited to the purposes of signal processing. Specifically, the main motivation was to achieve a better local representation of signals, in contrast to the “global” representation characteristic of the Fourier, DCT and Walsh-Hadamard transforms. An important feature of wavelet transforms is also their low computational complexity.
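The locality and low complexity mentioned above can be seen in the simplest member of the wavelet family, the Haar transform: each level replaces pairs of neighbouring samples by their scaled sums (a coarse approximation) and scaled differences (local detail), at a total cost of O(N) operations. The following sketch is the author of this note's illustration, not code from the book, and assumes an input length that is a power of two.

```python
import math

def haar(x):
    """Full orthonormal Haar decomposition (illustrative sketch).

    Assumes len(x) is a power of two. At each level, pairs of neighbouring
    samples are replaced by their normalized sum (approximation) and
    normalized difference (detail); the sums are then decomposed further.
    The 1/sqrt(2) scaling makes the transform energy-preserving.
    """
    out = list(x)
    n = len(out)
    while n > 1:
        half = n // 2
        sums = [(out[2 * i] + out[2 * i + 1]) / math.sqrt(2) for i in range(half)]
        diffs = [(out[2 * i] - out[2 * i + 1]) / math.sqrt(2) for i in range(half)]
        out[:n] = sums + diffs  # details stay; sums are refined next pass
        n = half
    return out
```

A constant signal such as `[1, 1, 1, 1]` compacts into a single nonzero coefficient, which is the "energy compaction" property in its most extreme form; since the transform is orthonormal, the sum of squared coefficients always equals that of the input.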

Fast transforms with FFT-type algorithms and wavelet transforms constitute the basic instrumentation tools of digital signal processing. This volume addresses the properties and applications of these transforms and accordingly consists of two parts. The first part, Chapters 1 to 4, offers a tour of fast discrete transforms and their properties: the Discrete Fourier and Cosine Transforms treated as discrete representations of the integral Fourier Transform, binary transforms such as the Walsh-Hadamard and Haar Transforms, discrete wavelet transforms, transforms in a sliding window and signal “time-frequency” representation, and the energy compaction properties of transforms. The second part is devoted to applications and efficient computational algorithms. Chapters 5 to 8 address applications to signal spectrum analysis, restoration of distorted signals, signal re-sampling, and signal differentiation and integration. The concluding Chapter 9 describes efficient practical computational algorithms: algorithms for discrete signal convolution that are not vulnerable to edge effects, methods for computing scaled and rotated Discrete Fourier Transforms using fast convolution algorithms, and fast recursive algorithms for computing the Discrete Fourier and Discrete Cosine Transforms when processing is carried out locally in a sliding window.

Reading the book and applying it in practical work should not be difficult and, the author hopes, will be enjoyable. The book is practically self-contained: it provides all specific mathematical knowledge needed beyond the basics of calculus. The derivation of all formulas is given in full detail, without omission of any intermediate stages. Bulky formulas and derivations are placed in appendices to the chapters so as not to obstruct the main body with details unnecessary for understanding. The author will highly appreciate any remarks, comments and questions.


[1] J. W. Cooley and J. W. Tukey, “An algorithm for the machine calculation of complex Fourier series,” Math. Comput., vol. 19, pp. 297-301, 1965.

[2] H. Harmuth, Transmission of Information by Orthogonal Functions, Springer-Verlag, New York, 1971.

[3] N. Ahmed and K. R. Rao, Orthogonal Transforms for Digital Signal Processing, Springer-Verlag, Berlin-Heidelberg, 1975.

[4] I. Daubechies, “Where do wavelets come from? A personal point of view,” Proceedings of the IEEE, vol. 84, no. 4, pp. 510-513, Apr. 1996.