Abdelhak M. Zoubir and D. Robert Iskander

[from the GUEST EDITORS]

Bootstrap Methods in Signal Processing


Your first reaction when you read the title of this issue of IEEE Signal Processing Magazine could have been: "Why a special section on the bootstrap?" Or maybe "What is the bootstrap?" Maybe you did not ask yourself any questions and you are eager to discover what the authors of this special section report on. Regardless of which of the above groups you belong to, we will attempt to briefly answer these questions and, in particular, explain what led us to propose a special section on the bootstrap. More importantly, we would like to give you a little taste of what promises to be a collection of exciting tutorial articles with a wide range of real-life applications, and conclude with some suggestions for further reading.

THE RATIONALE

The use of more accurate models has become a fundamental requirement in signal processing applications such as wireless communications, radar, sonar, biomedical engineering, machine and car engine monitoring, and speech and image processing. With improved accuracy, the models have become more complex, and inferential statistical signal processing is often intractable. The signal processing practitioner requires a simple but accurate method for assessing errors of estimates and answering inferential questions. Asymptotic approximations are useful only when a sufficient amount of data is available, which is not always possible due to time constraints, the nature of the signal, or the measurement setting. Also, many of the theoretical derivations for assessing errors make assumptions relating to the probability distribution of the noise process. In today's applications, the Gaussian noise assumption is not always valid. It may hold for sensor noise, for example, but not necessarily for the environmental noise encountered in array signal processing. The signal processing practitioner thus requires methods that are simple and work under minimal assumptions. At the same time, these techniques should be robust against changes in environmental conditions and perform equally well under conventional assumptions, such as Gaussian noise.

What should we do if, in a practical situation, we cannot invoke procedures based on analytical mathematics and asymptotic approximations? One may think of Monte Carlo simulations, but how often do we face a situation where we cannot repeat the experiment or, even if we could, it would be unjustified due to changes in experimental conditions or due to the unaffordable cost? A real alternative to these problems is the bootstrap. It is the statistician's choice for answering inferential questions such as determining confidence limits for parameters of interest and testing hypotheses. In signal processing terms, the bootstrap could provide the tools for the design of a detector for signals buried in noise/interference of unknown distribution, or for model selection. The aim of this special section is to stimulate more interest among signal processing practitioners to discover the power of bootstrap methods and to advocate their systematic use in inferential signal processing.

WHAT IS THE BOOTSTRAP?

The bootstrap is a computer-intensive tool for answering inferential questions. In essence, the bootstrap does with a computer what the experimenter would do in practice if it were possible: they would repeat the experiment. The underlying idea of the bootstrap is based on the substitution and simulation principles. If you cannot rerun your experiment, use the already acquired data to create "new" data and recalculate your estimate. The "new" data are obtained through random sampling from the original data. Since random sampling is performed on the original data (the empirical distribution), you can draw as many data as you want and recalculate your estimate each time in the same way you obtained it with the original data. These many estimates of the parameter of interest, originating from resampling your original data, can be used to approximate measures of your estimator, such as its distribution function. The cost of resampling and recalculating estimates is being constantly reduced by the ever-increasing computational power of today's computers.

This intuitive bootstrap approach is backed up by a vast literature in mathematical statistics demonstrating its theoretical validity. Its applicability is far reaching, as one can infer from the articles of this special section, which are all backed up with real data analysis.
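To make the resampling recipe concrete, here is a minimal sketch of our own (not code from any article in this section): it bootstraps the sample mean of some hypothetical measurements to approximate the estimator's standard error and a percentile confidence interval. The data values and the choice of 1,000 resamples are arbitrary illustration.

```python
import random
import statistics

def bootstrap(data, estimator, num_resamples=1000, seed=0):
    """Draw `num_resamples` resamples of `data` with replacement and
    recompute `estimator` on each one. The returned list of estimates
    approximates the sampling distribution of the estimator."""
    rng = random.Random(seed)
    n = len(data)
    return [estimator([data[rng.randrange(n)] for _ in range(n)])
            for _ in range(num_resamples)]

# Hypothetical measurements; the estimator here is the sample mean,
# but any statistic (median, variance, ...) can be plugged in unchanged.
data = [2.1, 1.9, 2.4, 2.0, 2.6, 1.8, 2.2, 2.3, 2.5, 2.0]
estimates = sorted(bootstrap(data, statistics.mean))

std_error = statistics.stdev(estimates)  # bootstrap standard error
ci_90 = (estimates[49], estimates[949])  # 5th and 95th percentiles
```

Note that no distributional assumption (Gaussian or otherwise) enters anywhere: the empirical distribution of the observed data stands in for the unknown true distribution.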


Much statistical work is now performed with computers in ways that are too complicated for realistic analytical treatment. One then refers to computer-intensive statistical methods, and the bootstrap is only one of them. Other methods include the jackknife, cross-validation, and Markov chain Monte Carlo (MCMC) methods. Some readers may want to know more about the differences between these techniques, their advantages, and their shortfalls. This is beyond the scope of this special section, but the reader is referred to [1] and [4] for some answers.

CONTENT OF THIS SPECIAL SECTION

The articles in this special section are ordered based on complexity. The first three contributions are introductory tutorials with real-life applications, while the following four articles focus on the use of the bootstrap in advanced signal processing applications. All articles are written in a tutorial style.

We start with our tutorial article, which is written primarily for a bootstrap novice. At the same time, we believe that it will prove to be useful for all signal processing practitioners, as it promotes a pragmatic approach to the use of bootstrap methods.

Franke and Halim continue with a tutorial on bootstrap hypothesis testing and, in particular, on the so-called wild bootstrap method. Their article provides a tutorial on local linear fitting with bootstrap bandwidth estimation and covers the many intricacies that need to be taken into account when even basic bootstrap techniques for hypothesis testing are used in a practical application. They support the theory with applications to signal and image analysis, which include testing defects in texture analysis.

Foster and Zychaluk consider bootstrap techniques in the nonparametric modeling of biological transducer functions. The theory conveniently interleaves with practical applications of biological transducer function modeling in animals, ranging from turtles, through goldfish, to guinea pigs as well as humans.

This is followed by an article by Thomson who, for many years, has worked on and advocated another resampling technique, the jackknife, which is closely related to the bootstrap. In his article, the author considers the application of the jackknife to multitaper spectra, coherences, and frequency transfer functions in a range of diverse applications, ranging from dropped-call rates in cellular phones to 663-year level records of the river Nile.

Next, Wendt, Abry, and Jaffard use the bootstrap in multifractal analysis of hydrodynamic turbulence. They highlight the limitations of practical scaling analysis and show how, with the bootstrap, they can overcome this problem.

Particle filters have become a popular tool for many signal processing applications such as target tracking. Candy addresses the concept of bootstrap particle filtering and considers the application of tracking in synthetic aperture sonar.

Finally, Polikar reports on bootstrap variants such as bagging and boosting and their application in computational intelligence. His article will prove to be useful to all interested in emerging areas of machine learning.

The articles of this special section have been chosen with extreme care. The reviewers were very enthusiastic about each of them and praised the excellent technical content disseminated in a very accessible tutorial style. We are confident that you will share this view.

SUGGESTED FURTHER READING

Some readers will be inspired by the collection of articles in this special section and will want to apply bootstrap techniques to their own signal processing problems. Those readers are encouraged to consult statistical textbooks such as the one by Efron and Tibshirani [2], or the one specially dedicated to signal processing practitioners [6], for further reading. We also hope that readers will find inspiration in other tutorials on the bootstrap published earlier in this magazine [3], [5].

ACKNOWLEDGMENTS

We would like to thank all contributors, including those who submitted excellent articles that we could not consider for this special section. Space has been critical, and we needed to make many difficult decisions, but we are confident that those articles will find their place in other journals. We are grateful to all reviewers, who did excellent work in a timely fashion. Our thanks also go to Doug Williams, area editor for special sections, for his continued support and encouragement, and Geri Krolin-Taylor, senior managing editor of IEEE Signal Processing Magazine, for her timely coordination of this special section. [SP]

REFERENCES
[1] B. Efron, "The bootstrap and modern statistics," J. Amer. Statist. Assoc., vol. 95, no. 452, pp. 1293–1296, 2000.
[2] B. Efron and R. Tibshirani, An Introduction to the Bootstrap. New York: Chapman & Hall, 1993.
[3] D.N. Politis, "Computer-intensive methods in statistical analysis," IEEE Signal Processing Mag., vol. 15, no. 1, pp. 39–55, 1998.
[4] J. Shao and D. Tu, The Jackknife and Bootstrap. New York: Springer-Verlag, 1995.
[5] A.M. Zoubir and B. Boashash, "The bootstrap and its application in signal processing," IEEE Signal Processing Mag., vol. 15, no. 1, pp. 56–76, 1998.
[6] A.M. Zoubir and D.R. Iskander, Bootstrap Techniques for Signal Processing. Cambridge, U.K.: Cambridge Univ. Press, 2004.

IEEE SIGNAL PROCESSING MAGAZINE [8] JULY 2007
