This paper derives and analyzes the asymptotic performance of the maximum-likelihood (ML) estimator and the generalized likelihood ratio test (GLRT) derived under the assumption of independent and identically distributed (i.i.d.) samples, while in the actual model the signal samples are m-dependent. The ML estimator and GLRT under such a modeling mismatch are based on the marginal likelihood function, and they are referred to as the marginal maximum likelihood (MML) estimator and the "generalized (sum) marginal log-likelihood ratio test" (GMLRT), respectively. Under some regularity conditions, the asymptotic distributions of the MML estimator and the GMLRT are derived. These asymptotic distributions are analyzed in several signal processing examples, and simulation results support the theory.
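As a minimal numerical sketch (not taken from the thesis), the setup can be illustrated with an MA(m) process, since a moving-average process of order m is m-dependent: the MML estimator treats the samples as i.i.d., so under a Gaussian marginal model its estimate of the mean is just the sample mean, which remains consistent under m-dependence even though its asymptotic variance differs from the i.i.d. case. All parameter values below are hypothetical.

```python
import numpy as np

# Hypothetical example: generate m-dependent samples via an MA(2) process
# (an MA(q) process is q-dependent by construction).
rng = np.random.default_rng(0)

m, n, mu = 2, 200_000, 1.5
coeffs = np.array([1.0, 0.6, 0.3])             # MA(2) weights -> 2-dependent samples
e = rng.standard_normal(n + m)                 # i.i.d. innovations
x = mu + np.convolve(e, coeffs, mode="valid")  # length-n stationary, 2-dependent sequence

# MML estimate of the mean: the ML estimator derived under a (mismatched)
# i.i.d. Gaussian assumption, i.e., the sample mean of the marginal model.
mml_mean = x.mean()
print(round(mml_mean, 2))
```

The estimate converges to the true mean despite the dependence; what the mismatch changes is the asymptotic variance, which depends on the long-run (rather than marginal) variance of the process.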
|Date of Award||2007|
|Original language||American English|
- Ben-Gurion University of the Negev
|Supervisor||Joseph Tabrikian (Supervisor)|