Abstract
A K-user direct-sequence spread-spectrum code-division multiple-access (CDMA) system with q-bit (q ≪ log₂ K) baseband signal quantization at the demodulator is considered. It is shown that additionally quantizing the (K + 1)-level output signal of the CDMA modulator into q bits significantly improves the average bit-error performance in a non-negligible regime of noise variance, σ², and user load, β, under various system settings, including additive white Gaussian noise (AWGN), Rayleigh fading, single-user detection, multi-user detection, and random and orthogonal spreading codes. For the case of single-user detection in random-spreading AWGN-CDMA, this regime is identified explicitly in terms of σ², β, and a pre-factor γ(q) depending on q, and the associated BER improvement is derived analytically for q = 1, 2. For the other examined system settings, computer simulations are provided, corroborating this interesting behavior.
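The single-user-detection scenario described in the abstract can be illustrated with a small Monte Carlo sketch. This is an illustrative toy model, not the paper's derivation: the parameter values (K = 15, N = 31, σ = 0.2), the power normalization, and the use of q = 1 (sign) quantization at both ends are all assumptions made here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def ber_user1(K=15, N=31, sigma=0.2, n_bits=2000, quantize_tx=False):
    """Monte Carlo BER of user 1 in a random-spreading AWGN-CDMA toy model
    with 1-bit (sign) quantization at the demodulator.  If quantize_tx is
    True, the (K+1)-level modulator output is also sign-quantized before
    the channel.  K and N are kept odd so sign() never sees an exact zero.
    Power normalization and parameter values are illustrative assumptions."""
    errors = 0
    for _ in range(n_bits):
        b = rng.choice([-1.0, 1.0], size=K)        # user bits
        S = rng.choice([-1.0, 1.0], size=(K, N))   # random binary spreading codes
        x = b @ S                                  # chip-rate modulator output, K+1 levels
        if quantize_tx:
            x = np.sign(x)                         # extra 1-bit quantization at the modulator
        x = x / np.sqrt(np.mean(x ** 2))           # equalize average transmit power
        y = x + sigma * rng.standard_normal(N)     # AWGN channel, noise variance sigma^2
        z = np.sign(y)                             # 1-bit quantization at the demodulator
        if np.sign(z @ S[0]) != b[0]:              # single-user matched-filter detection
            errors += 1
    return errors / n_bits

ber_analog = ber_user1(quantize_tx=False)    # conventional multilevel transmission
ber_tx_quant = ber_user1(quantize_tx=True)   # modulator output quantized as well
print(f"BER (analog tx): {ber_analog:.3f}   BER (1-bit tx): {ber_tx_quant:.3f}")
```

Sweeping σ and the load β = K/N in this sketch is one way to probe the regime the abstract describes, in which the additionally quantized modulator output yields a lower BER.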
| Original language | English |
|---|---|
| Article number | 365004 |
| Journal | Journal of Physics A: Mathematical and Theoretical |
| Volume | 41 |
| Issue number | 36 |
| State | Published - 24 Jul 2008 |
Title: Can quantization improve error performance in CDMA?