Overview
ABSTRACT
The first part of this article introduces random functions and their main relevant features, expressed in terms of the concepts of stationarity and ergodicity. The second part is devoted to autocorrelation and cross-correlation functions, illustrated with a few practical examples, while the third part deals with the spectrum of random signals as deduced from the Wiener-Khinchine theorem. To conclude, a brief overview of Markov processes will be applied to discrete and continuous random data.
AUTHOR
Bernard DEMOULIN: Professor Emeritus - Lille 1 University, IEMN TELICE Group, UMR CNRS 8520
INTRODUCTION
Unlike ordinary analytical functions, whose values can be determined rigorously, the numerical value of a random function remains totally or partially unpredictable.
The most familiar example of a random function is undoubtedly the capture of signals that depend on the time variable. In this particular case, although the time instants can be fixed very precisely by periodic sampling, the amplitude of the signal remains subject to an uncertainty that constitutes a discrete or continuous random variable.
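As a minimal illustration of this point (a hypothetical Python sketch, not taken from the article), the snippet below samples a deterministic waveform at exactly known instants while the measured amplitude carries an additive random term:

import numpy as np

# Hypothetical sketch: periodic sampling of a signal whose amplitude
# includes a random component (here additive Gaussian noise).
rng = np.random.default_rng(0)
t = np.arange(0.0, 1.0, 1e-3)                # sampling instants, known exactly
x_true = np.sin(2 * np.pi * 5 * t)           # underlying deterministic waveform
x_meas = x_true + 0.2 * rng.standard_normal(t.size)  # amplitude at each instant is a random variable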
We shall see in the course of the article that the only rational description of random functions can be borrowed from the notion of covariance, itself derived from the correlation coefficient defined in
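For completeness, and independently of the article's own notation, the covariance of two random variables X and Y and the associated correlation coefficient are usually written as

\mathrm{Cov}(X,Y) = E\big[(X - E[X])(Y - E[Y])\big], \qquad \rho_{XY} = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \, \sigma_Y}

where \sigma_X and \sigma_Y are the standard deviations of X and Y, and -1 \le \rho_{XY} \le 1.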
The question of the autocorrelation function is therefore closely related to the general properties of stationary variables discussed in the first section. Stationarity will be illustrated by the spontaneous decay of atomic nuclei and the generation of electronic noise signals.
These two phenomena, well known to physicists, are governed respectively by the Poisson probability law and the normal probability law, both with stationary parameters.
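As a reminder (standard formulas, not quoted from the article), the Poisson law gives the probability of observing k events when the mean count is \lambda, and the normal law describes a continuous variable of mean m and standard deviation \sigma:

P(k) = \frac{\lambda^{k}}{k!} \, e^{-\lambda}, \qquad p(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left[-\frac{(x-m)^{2}}{2\sigma^{2}}\right]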
The second section is devoted entirely to the other properties involved in the calculation of autocorrelation functions, chiefly the ergodic principle used in calculating nth-order moments. Whether a moment is obtained directly, by taking the mathematical expectation of a variable, or indirectly, as the arithmetic mean over a large number of realizations, the two results frequently converge. The ergodic principle will be illustrated by two examples taken from the physics of matter and measurement analysis.
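The sketch below (a hypothetical Python illustration, not the article's own example) compares the two routes for the second-order moment of a stationary Gaussian noise: the ensemble estimate over many independent realizations at a fixed instant, and the arithmetic (time) average along a single long realization. For an ergodic process the two estimates converge to the same value.

import numpy as np

# Hypothetical sketch of the ergodic principle for stationary Gaussian noise.
rng = np.random.default_rng(1)
sigma = 1.5

# Ensemble route: E[x^2] estimated over many independent realizations at one instant.
ensemble = sigma * rng.standard_normal(100_000)
moment_ensemble = np.mean(ensemble**2)

# Time route: the same moment estimated as the arithmetic mean along one long realization.
realization = sigma * rng.standard_normal(100_000)
moment_time = np.mean(realization**2)

print(moment_ensemble, moment_time)  # both close to sigma**2 = 2.25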
Following on from the previous sections, the third section, which is specifically dedicated to autocorrelation functions, tackles the subject by studying signals treated as random functions. Using examples, it will be shown that autocorrelation functions...
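For background (standard definitions, the article's own development lying beyond this excerpt), the autocorrelation function of a stationary random signal x(t), and its time-average form valid under the ergodic hypothesis, read

R_{xx}(\tau) = E\big[x(t)\,x(t+\tau)\big] = \lim_{T\to\infty} \frac{1}{T}\int_{0}^{T} x(t)\,x(t+\tau)\,dt

and the Wiener-Khinchine theorem mentioned in the abstract relates it to the power spectral density through S_{xx}(f) = \int_{-\infty}^{+\infty} R_{xx}(\tau)\, e^{-j 2\pi f \tau}\, d\tau.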
KEYWORDS
Markov process | Theory and practice | autocorrelation functions | frequency spectra | measurements | signal processing
This article is included in: Instrumentation and measurement methods