Contents
Preface xv
Acknowledgments xvii
Chapter 1 Probability Concepts 1
1.1 Introduction 1
1.2 Sets and Probability 1
1.2.1 Basic Definitions 1
1.2.2 Venn Diagrams and Some Laws 3
1.2.3 Basic Notions of Probability 6
1.2.4 Some Methods of Counting 8
1.2.5 Properties, Conditional Probability, and Bayes’ Rule 12
1.3 Random Variables 17
1.3.1 Step and Impulse Functions 17
1.3.2 Discrete Random Variables 18
1.3.3 Continuous Random Variables 20
1.3.4 Mixed Random Variables 22
1.4 Moments 23
1.4.1 Expectations 23
1.4.2 Moment Generating Function and Characteristic Function 26
1.4.3 Upper Bounds on Probabilities and Law of Large Numbers 29
1.5 Two- and Higher-Dimensional Random Variables 31
1.5.1 Conditional Distributions 33
1.5.2 Expectations and Correlations 41
1.5.3 Joint Characteristic Functions 44
1.6 Transformation of Random Variables 48
1.6.1 Functions of One Random Variable 49
1.6.2 Functions of Two Random Variables 52
1.6.3 Two Functions of Two Random Variables 59
1.7 Summary 65
Problems 65
Reference 73
Selected Bibliography 73
Chapter 2 Distributions 75
2.1 Introduction 75
2.2 Discrete Random Variables 75
2.2.1 The Bernoulli, Binomial, and Multinomial Distributions 75
2.2.2 The Geometric and Pascal Distributions 78
2.2.3 The Hypergeometric Distribution 82
2.2.4 The Poisson Distribution 85
2.3 Continuous Random Variables 88
2.3.1 The Uniform Distribution 88
2.3.2 The Normal Distribution 89
2.3.3 The Exponential and Laplace Distributions 96
2.3.4 The Gamma and Beta Distributions 98
2.3.5 The Chi-Square Distribution 101
2.3.6 The Rayleigh, Rice, and Maxwell Distributions 106
2.3.7 The Nakagami m-Distribution 115
2.3.8 The Student’s t- and F-Distributions 115
2.3.9 The Cauchy Distribution 120
2.4 Some Special Distributions 121
2.4.1 The Bivariate and Multivariate Gaussian Distributions 121
2.4.2 The Weibull Distribution 129
2.4.3 The Log-Normal Distribution 131
2.4.4 The K-Distribution 132
2.4.5 The Generalized Compound Distribution 135
2.5 Summary 136
Problems 137
Reference 139
Selected Bibliography 139
Chapter 3 Random Processes 141
3.1 Introduction and Definitions 141
3.2 Expectations 145
3.3 Properties of Correlation Functions 153
3.3.1 Autocorrelation Function 153
3.3.2 Cross-Correlation Function 153
3.3.3 Wide-Sense Stationary 154
3.4 Some Random Processes 156
3.4.1 A Single Pulse of Known Shape but Random Amplitude and Arrival Time 156
3.4.2 Multiple Pulses 157
3.4.3 Periodic Random Processes 158
3.4.4 The Gaussian Process 161
3.4.5 The Poisson Process 163
3.4.6 The Bernoulli and Binomial Processes 166
3.4.7 The Random Walk and Wiener Processes 168
3.4.8 The Markov Process 172
3.5 Power Spectral Density 174
3.6 Linear Time-Invariant Systems 178
3.6.1 Stochastic Signals 179
3.6.2 Systems with Multiple Terminals 185
3.7 Ergodicity 186
3.7.1 Ergodicity in the Mean 186
3.7.2 Ergodicity in the Autocorrelation 187
3.7.3 Ergodicity of the First-Order Distribution 188
3.7.4 Ergodicity of Power Spectral Density 188
3.8 Sampling Theorem 189
3.9 Continuity, Differentiation, and Integration 194
3.9.1 Continuity 194
3.9.2 Differentiation 196
3.9.3 Integrals 199
3.10 Hilbert Transform and Analytic Signals 201
3.11 Thermal Noise 205
3.12 Summary 211
Problems 212
Selected Bibliography 221
Chapter 4 Discrete-Time Random Processes 223
4.1 Introduction 223
4.2 Matrix and Linear Algebra 224
4.2.1 Algebraic Matrix Operations 224
4.2.2 Matrices with Special Forms 232
4.2.3 Eigenvalues and Eigenvectors 236
4.3 Definitions 245
4.4 AR, MA, and ARMA Random Processes 253
4.4.1 AR Processes 254
4.4.2 MA Processes 262
4.4.3 ARMA Processes 264
4.5 Markov Chains 266
4.5.1 Discrete-Time Markov Chains 267
4.5.2 Continuous-Time Markov Chains 276
4.6 Summary 284
Problems 284
References 287
Selected Bibliography 288
Chapter 5 Statistical Decision Theory 289
5.1 Introduction 289
5.2 Bayes’ Criterion 291
5.2.1 Binary Hypothesis Testing 291
5.2.2 M-ary Hypothesis Testing 303
5.3 Minimax Criterion 313
5.4 Neyman-Pearson Criterion 317
5.5 Composite Hypothesis Testing 326
5.5.1 θ Random Variable 327
5.5.2 θ Nonrandom and Unknown 329
5.6 Sequential Detection 332
5.7 Summary 337
Problems 338
Selected Bibliography 343
Chapter 6 Parameter Estimation 345
6.1 Introduction 345
6.2 Maximum Likelihood Estimation 346
6.3 Generalized Likelihood Ratio Test 348
6.4 Some Criteria for Good Estimators 353
6.5 Bayes’ Estimation 355
6.5.1 Minimum Mean-Square Error Estimate 357
6.5.2 Minimum Mean Absolute Value of Error Estimate 358
6.5.3 Maximum A Posteriori Estimate 359
6.6 Cramer-Rao Inequality 364
6.7 Multiple Parameter Estimation 371
6.7.1 θ Nonrandom 371
6.7.2 θ Random Vector 376
6.8 Best Linear Unbiased Estimator 378
6.8.1 One Parameter Linear Mean-Square Estimation 379
6.8.2 θ Random Vector 381
6.8.3 BLUE in White Gaussian Noise 383
6.9 Least-Square Estimation 388
6.10 Recursive Least-Square Estimator 391
6.11 Summary 393
Problems 394
References 398
Selected Bibliography 398
Chapter 7 Filtering 399
7.1 Introduction 399
7.2 Linear Transformation and Orthogonality Principle 400
7.3 Wiener Filters 409
7.3.1 The Optimum Unrealizable Filter 410
7.3.2 The Optimum Realizable Filter 416
7.4 Discrete Wiener Filters 424
7.4.1 Unrealizable Filter 425
7.4.2 Realizable Filter 426
7.5 Kalman Filter 436
7.5.1 Innovations 437
7.5.2 Prediction and Filtering 440
7.6 Summary 445
Problems 445
References 448
Selected Bibliography 448
Chapter 8 Representation of Signals 449
8.1 Introduction 449
8.2 Orthogonal Functions 449
8.2.1 Generalized Fourier Series 451
8.2.2 Gram-Schmidt Orthogonalization Procedure 455
8.2.3 Geometric Representation 458
8.2.4 Fourier Series 463
8.3 Linear Differential Operators and Integral Equations 466
8.3.1 Green’s Function 470
8.3.2 Integral Equations 471
8.3.3 Matrix Analogy 479
8.4 Representation of Random Processes 480
8.4.1 The Gaussian Process 483
8.4.2 Rational Power Spectral Densities 487
8.4.3 The Wiener Process 492
8.4.4 The White Noise Process 493
8.5 Summary 495
Problems 496
References 500
Selected Bibliography 500
Chapter 9 The General Gaussian Problem 503
9.1 Introduction 503
9.2 Binary Detection 503
9.3 Same Covariance 505
9.3.1 Diagonal Covariance Matrix 508
9.3.2 Nondiagonal Covariance Matrix 511
9.4 Same Mean 518
9.4.1 Uncorrelated Signal Components and Equal Variances 519
9.4.2 Uncorrelated Signal Components and Unequal Variances 522
9.5 Same Mean and Symmetric Hypotheses 524
9.5.1 Uncorrelated Signal Components and Equal Variances 526
9.5.2 Uncorrelated Signal Components and Unequal Variances 528
9.6 Summary 529
Problems 530
Reference 532
Selected Bibliography 532
Chapter 10 Detection and Parameter Estimation 533
10.1 Introduction 533
10.2 Binary Detection 534
10.2.1 Simple Binary Detection 534
10.2.2 General Binary Detection 543
10.3 M-ary Detection 556
10.3.1 Correlation Receiver 557
10.3.2 Matched Filter Receiver 567
10.4 Linear Estimation 572
10.4.1 ML Estimation 573
10.4.2 MAP Estimation 575
10.5 Nonlinear Estimation 576
10.5.1 ML Estimation 576
10.5.2 MAP Estimation 579
10.6 General Binary Detection with Unwanted Parameters 580
10.6.1 Signals with Random Phase 583
10.6.2 Signals with Random Phase and Amplitude 595
10.6.3 Signals with Random Parameters 598
10.7 Binary Detection in Colored Noise 606
10.7.1 Karhunen-Loève Expansion Approach 607
10.7.2 Whitening Approach 611
10.7.3 Detection Performance 615
10.8 Summary 617
Problems 618
Reference 626
Selected Bibliography 626
Chapter 11 Adaptive Thresholding CFAR Detection 627
11.1 Introduction 627
11.2 Radar Elementary Concepts 629
11.2.1 Range, Range Resolution, and Unambiguous Range 631
11.2.2 Doppler Shift 633
11.3 Principles of Adaptive CFAR Detection 634
11.3.1 Target Models 640
11.3.2 Review of Some CFAR Detectors 642
11.4 Adaptive Thresholding in Code Acquisition of Direct-Sequence Spread Spectrum Signals 648
11.4.1 Pseudonoise or Direct Sequences 649
11.4.2 Direct-Sequence Spread Spectrum Modulation 652
11.4.3 Frequency-Hopped Spread Spectrum Modulation 655
11.4.4 Synchronization of Spread Spectrum Systems 655
11.4.5 Adaptive Thresholding with False Alarm Constraint 659
11.5 Summary 660
References 661
Chapter 12 Distributed CFAR Detection 665
12.1 Introduction 665
12.2 Distributed CA-CFAR Detection 666
12.3 Further Results 670
12.4 Summary 671
References 672
Appendix 675
About the Author 683
Index 685