

Thursday, 12 July 2012

Multi User CDMA System Model

Let's have a CDMA System with $K$ Users, indexed by $j = 1, \dots, K$. Let Each User Transmit a BPSK Modulated Signal Simultaneously. Then the Transmitted Signal for one such User $j$ can be Represented as:

$$S_j(t) = \sqrt{2P_j}\, c_j(t)\, b_j(t)\, \cos(\omega_c t + \theta_j)$$ Where,

$c_j(t) = \sum_{n=-\infty}^{\infty} c_j^{(n)}\, p_{T_c}(t - n T_c)$ is the $j$th User Signature Waveform, with the Signature Bits

$c_j^{(n)} \in \{-1, +1\}$

$b_j(t) = \sum_{n=-\infty}^{\infty} b_j^{(n)}\, p_T(t - n T)$ is the BPSK Modulated Signal for User $j$, with


$b_j^{(n)} \in \{-1, +1\}$ and

$T = N T_c$, Where $N$ is the Spreading Factor or Processing Gain. The Received Signal at the Receiver can be Represented as

$$r(t) = \sum_{j=1}^{K} \sqrt{2P_j}\, c_j(t - \tau_j)\, b_j(t - \tau_j)\, \cos(\omega_c t + \phi_j) + \eta(t)$$ Where $\tau_j$ is the Relative Time Offset with $\tau_j \in [0, T]$, $\phi_j$ is the Phase Offset such that $\phi_j \in [0, 2\pi]$ and $\phi_j = \theta_j - \omega_c \tau_j$, and $\eta(t)$ is a Zero Mean AWGN Process with PSD $N_0$.
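Not part of the original post, but a rough Python sketch may make the model concrete. The snippet below builds a synchronous, baseband, equal-power version of the received sum (carrier, time offsets, phase offsets and noise are all omitted, and random $\pm 1$ signatures are an assumption), then despreads one user:

import numpy as np

rng = np.random.default_rng(0)

K = 4            # number of users
N = 16           # spreading factor (processing gain)
num_bits = 100   # BPSK bits per user

# Random +/-1 signature chips c_j^(n) and data bits b_j^(n)
c = rng.choice([-1, 1], size=(K, N))
b = rng.choice([-1, 1], size=(K, num_bits))
P = np.ones(K)   # transmit powers P_j (all equal here)

# Synchronous chip-rate baseband sum over users: each bit is
# spread by its user's signature and scaled by sqrt(2*P_j)
r = sum(np.sqrt(2 * P[j]) * np.kron(b[j], c[j]) for j in range(K))

# Despread user 0: correlate each bit interval with its signature
decisions = np.sign(r.reshape(num_bits, N) @ c[0])
print("User 0 bit errors:", np.sum(decisions != b[0]))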



Monday, 2 July 2012

Gil-Pelaez Theorem

It Helps us to Find the CDF of a Random Variable Directly from the MGF or the Characteristic Function.

If $X$ is any Random Variable with Characteristic Function (CF) given by


$$\Phi_X(\omega) = \int_{-\infty}^{\infty} f_X(x)\, e^{j\omega x}\, dx,$$ Then


$$F_X(x) = \frac{1}{2} - \frac{1}{\pi} \int_0^{\infty} \frac{\mathrm{Im}\left[ e^{-j\omega x}\, \Phi_X(\omega) \right]}{\omega}\, d\omega$$
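As a quick illustration (added here, not part of the original post), the theorem is easy to apply numerically. The sketch below assumes the characteristic function of a standard normal and compares the recovered CDF with scipy's norm.cdf:

import numpy as np
from scipy import integrate
from scipy.stats import norm

# Characteristic function of a standard normal (example choice)
def cf(w):
    return np.exp(-0.5 * w**2)

# Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im[exp(-j*w*x)*cf(w)]/w dw
def cdf_gil_pelaez(x, upper=50.0):
    integrand = lambda w: np.imag(np.exp(-1j * w * x) * cf(w)) / w
    val, _ = integrate.quad(integrand, 1e-8, upper, limit=200)
    return 0.5 - val / np.pi

for x in (-1.0, 0.0, 1.5):
    print(x, cdf_gil_pelaez(x), norm.cdf(x))   # the two values should nearly match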

Friday, 22 June 2012

Definition of Hypergeometric Function of Matrix Argument

$${}_pF_q(a_1, \dots, a_p;\, b_1, \dots, b_q;\, X) = \sum_{k=0}^{\infty} \sum_{\kappa} \frac{(a_1)_\kappa \cdots (a_p)_\kappa}{(b_1)_\kappa \cdots (b_q)_\kappa}\, \frac{C_\kappa(X)}{k!}$$
where the Generalized Hypergeometric Coefficient is Given by

$(a)_\kappa = \prod_{i=1}^{m} \left( a - \tfrac{1}{2}(i - 1) \right)_{k_i}$, Where the Pochhammer Symbol

$(\alpha)_j = \alpha(\alpha + 1)\cdots(\alpha + j - 1), \quad (\alpha)_0 = 1$

Definition of Zonal Polynomial

Let $Y$ be an $m \times m$ Symmetric Matrix with Latent Roots (Eigenvalues) $y_1, \dots, y_m$.

Let $\kappa = (k_1, \dots, k_m)$

be a Partition of an Integer $k$ into not more than $m$ Parts.

The Zonal Polynomial Corresponding to $\kappa$, Denoted by $C_\kappa(Y)$, is a Symmetric, Homogeneous Polynomial of Degree $k$ in the Latent Roots $y_1, \dots, y_m$ Such that:

$$C_\kappa(Y) = d_\kappa\, y_1^{k_1} \cdots y_m^{k_m} + \text{Terms of Lower Weight},$$ $d_\kappa$ being a constant.

$C_\kappa(Y)$ is an Eigenfunction of the Differential Operator $\Delta_Y$ given by:

$$\Delta_Y = \sum_{i=1}^{m} y_i^2\, \frac{\partial^2}{\partial y_i^2} + \sum_{i=1}^{m} \sum_{\substack{j=1 \\ j \neq i}}^{m} \frac{y_i^2}{y_i - y_j}\, \frac{\partial}{\partial y_i}$$

As $\kappa$ varies over all Partitions of $k$, the Zonal Polynomials have Unit Coefficients in the Expansion of $(\mathrm{tr}\, Y)^k$, i.e.,

$$(\mathrm{tr}\, Y)^k = (y_1 + \cdots + y_m)^k = \sum_{\kappa} C_\kappa(Y)$$
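For concreteness (an addition, not part of the original definition), the Zonal Polynomials of Degree one and two in the standard normalization are:

$$C_{(1)}(Y) = \mathrm{tr}\, Y, \qquad C_{(2)}(Y) = \tfrac{1}{3}\left[ (\mathrm{tr}\, Y)^2 + 2\, \mathrm{tr}(Y^2) \right], \qquad C_{(1,1)}(Y) = \tfrac{2}{3}\left[ (\mathrm{tr}\, Y)^2 - \mathrm{tr}(Y^2) \right]$$

so that $C_{(2)}(Y) + C_{(1,1)}(Y) = (\mathrm{tr}\, Y)^2$, consistent with the expansion above.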

Friday, 11 May 2012

Definite Matrices

Can we Express an Indefinite Matrix as a Sum of Positive Definite/Semi-Definite and Negative Definite/Semi-Definite Matrices? I am working on it...

Conditional Gaussian Distribution

Let $X_1$ and $X_2$ be Jointly Gaussian Random Variables (Which Implies They are Individually Gaussian), i.e.,

$X_1 \sim \mathcal{N}(0, \sigma_1^2)$ and
$X_2 \sim \mathcal{N}(0, \sigma_2^2)$, with Correlation Coefficient $\rho$.

Then we Know that :

$$f_{X_1 X_2}(x_1, x_2) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\, \exp\left\{ -\frac{1}{2(1 - \rho^2)} \left( \frac{x_1^2}{\sigma_1^2} - \frac{2\rho x_1 x_2}{\sigma_1 \sigma_2} + \frac{x_2^2}{\sigma_2^2} \right) \right\}, \quad x_1, x_2 \in (-\infty, \infty)$$

Prove that:

$$E(X_2 \mid X_1 = x_1) = \rho\, \frac{\sigma_2}{\sigma_1}\, x_1$$

$$\mathrm{Var}(X_2 \mid X_1) = \sigma_2^2\, (1 - \rho^2)$$
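Not a proof, but a quick Monte Carlo sanity check of these two expressions in Python (the parameter values below are arbitrary assumptions):

import numpy as np

rng = np.random.default_rng(0)
sigma1, sigma2, rho = 2.0, 0.5, 0.7   # example parameters

# Draw jointly Gaussian (X1, X2) with the given covariance matrix
cov = [[sigma1**2, rho * sigma1 * sigma2],
       [rho * sigma1 * sigma2, sigma2**2]]
x1, x2 = rng.multivariate_normal([0, 0], cov, size=1_000_000).T

# Condition (approximately) on X1 being near a chosen value x1_0
x1_0 = 1.0
mask = np.abs(x1 - x1_0) < 0.02
print("E(X2|X1):  ", x2[mask].mean(), "vs", rho * sigma2 / sigma1 * x1_0)
print("Var(X2|X1):", x2[mask].var(),  "vs", sigma2**2 * (1 - rho**2))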

Thursday, 10 May 2012

Find the Digit

This is a Pretty Logical Question...

1216451*0408832000 is 19!, where ! Denotes Factorial. Find the Digit in the place of *.

Unitary and Orthonormal Matrix

Let the Matrix $U \in \mathbb{C}^{N \times N}$ be Unitary. Define $\tilde{U} \in \mathbb{R}^{2N \times 2N}$ as:

$$\tilde{U} = \begin{bmatrix} \mathrm{Re}(U) & -\mathrm{Im}(U) \\ \mathrm{Im}(U) & \mathrm{Re}(U) \end{bmatrix}$$

Prove that $\tilde{U}$ is Orthonormal.
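A quick numerical check of the claim (added here; the block sign convention is the one written above), using a random unitary matrix obtained from a QR decomposition:

import numpy as np

rng = np.random.default_rng(0)
N = 4

# Random unitary U from the QR decomposition of a complex Gaussian matrix
A = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
U, _ = np.linalg.qr(A)

# Real 2N x 2N representation with the block structure above
U_tilde = np.block([[U.real, -U.imag],
                    [U.imag,  U.real]])

# Orthonormality check: U_tilde @ U_tilde.T should be the identity
print(np.allclose(U_tilde @ U_tilde.T, np.eye(2 * N)))   # True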

Wednesday, 9 May 2012

Complex Gaussian Random Vector

Consider an $N$-Dimensional Complex Gaussian Random Vector $Z \in \mathbb{C}^N$, Such That

$Z = X + jY$, where $X \in \mathbb{R}^N$ and $Y \in \mathbb{R}^N$ are $N$-Dimensional Real Gaussian Random Vectors.

The Covariance Matrix $Q$ of $Z$ is Defined as:

$$Q = E\left[ (Z - E(Z))\, (Z - E(Z))^H \right]$$

So, in terms of the Auto Covariance and Cross Covariance Matrices of $X$ and $Y$:

$$Q = (\Sigma_{XX} + \Sigma_{YY}) - j\left( \Sigma_{XY} - \Sigma_{XY}^T \right)$$

Define the $2N$-Dimensional Gaussian Random Vector $\tilde{Z}$ as:

$$\tilde{Z} = \begin{bmatrix} X \\ Y \end{bmatrix}$$

The Covariance Matrix of $\tilde{Z}$ is Denoted by $\tilde{K}$, Defined as:

$$\tilde{K} = E\left[ (\tilde{Z} - E(\tilde{Z}))\, (\tilde{Z} - E(\tilde{Z}))^T \right]$$

So $\tilde{K}$, in terms of the Auto Covariance and Cross Covariance Matrices of $X$ and $Y$, is:

$$\tilde{K} = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{XY}^T & \Sigma_{YY} \end{bmatrix}$$

Now if Z is Circularly Symmetric, Prove That:

$$\tilde{K} = \frac{1}{2} \begin{bmatrix} \mathrm{Re}(Q) & -\mathrm{Im}(Q) \\ \mathrm{Im}(Q) & \mathrm{Re}(Q) \end{bmatrix}$$
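As a hint (added here, not part of the original post): for a zero-mean Circularly Symmetric $Z$, the pseudo-covariance $E(ZZ^T)$ vanishes, which in terms of the real and imaginary parts gives

$$\Sigma_{XX} = \Sigma_{YY}, \qquad \Sigma_{XY} = -\Sigma_{XY}^T.$$

Substituting these into the expressions for $Q$ and $\tilde{K}$ above yields the stated relation.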


Tuesday, 8 May 2012

Affine Transformation Does not Alter Correlation Coefficient

If $X$ and $Y$ are Two Random Variables with Correlation Coefficient $\rho$, Then the Correlation Coefficient of the Random Variables
$a_1 X + b_1$ and $a_2 Y + b_2$, Where $a_1$ and $a_2$ are Non-Zero Real Numbers with the Same Sign, is also $\rho$. If They have Opposite Signs, the Correlation Coefficient is $-\rho$.
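A one-line justification (added here for completeness): since $\mathrm{Cov}(a_1 X + b_1,\, a_2 Y + b_2) = a_1 a_2\, \mathrm{Cov}(X, Y)$ and the standard deviations scale by $|a_1|$ and $|a_2|$,

$$\rho_{a_1 X + b_1,\, a_2 Y + b_2} = \frac{a_1 a_2\, \mathrm{Cov}(X, Y)}{|a_1|\, |a_2|\, \sigma_X \sigma_Y} = \mathrm{sign}(a_1 a_2)\, \rho.$$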

Sunday, 6 May 2012

Laplacian Distribution

Consider the Laplacian Distribution whose PDF is Given by:

$$f_X(x) = \frac{1}{2}\, e^{-|x|}, \quad -\infty < x < \infty$$

If the Random Variable $Y$ is Defined as

$Y = |X| + |X - 3|$,  Find

$\Pr(Y \le 3)$

Friday, 27 April 2012

Expectation and Even PDF

By Definition of Expectation of Random Variable:

$$E(X) = \int_{-\infty}^{\infty} x\, f_X(x)\, dx$$

If $f_X(x)$ is Even, Then $E(X) = 0$, Provided the Integral Exists; an Exception is the Cauchy Distribution, Whose Mean Doesn't Exist.
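A short justification of this claim (added here): substituting $x \to -u$ in the integral and using $f_X(-u) = f_X(u)$ gives

$$E(X) = \int_{-\infty}^{\infty} x\, f_X(x)\, dx = -\int_{-\infty}^{\infty} u\, f_X(-u)\, du = -E(X),$$

so $E(X) = 0$ whenever the integral converges absolutely.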

Now the Question is: Will the PDF of a Random Variable be Even when $E(X) = 0$? I am Unable to Solve/Prove This. I Encourage all of you to Solve This...

Thursday, 26 April 2012

Conditional Distribution

Consider two Random Variables $X$ and $Y$ whose Joint PDF is:
$$f_{XY}(x, y) = \begin{cases} 2(x + y), & 0 \le y \le 1,\ 0 \le x \le y \\ 0, & \text{Else} \end{cases}$$

Find $f_{Y|X}(y|x)$

Independence and UnCorrelated

We Know that if Two Random Variables $X$ and $Y$ are Statistically Independent, Then

$E(XY) = E(X)\, E(Y)$, i.e., They are UnCorrelated. It Doesn't Mean that Dependent Random Variables are Always Correlated. Let's Think of Some Examples with Dependent Random Variables being UnCorrelated; one such example is sketched below.
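One well-known example (added here): take $X \sim \mathcal{N}(0, 1)$ and $Y = X^2$. They are clearly Dependent, yet

$$\mathrm{Cov}(X, Y) = E(X^3) - E(X)\, E(X^2) = 0 - 0 = 0,$$

so $X$ and $Y$ are UnCorrelated.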

Coin Tossing and Mutual Information

There are Two Coins, viz., a Fair Coin and a Two-Headed Coin. A Coin is Selected at Random (the Selection Denoted by the Random Variable $X$), the Selected Coin is Tossed Twice, and the Number of Heads is Recorded, Denoted by the Random Variable $Y$.

Find $I(X; Y)$

Wednesday, 25 April 2012

Log Normal Distribution

If $X$ is a Gaussian Distributed Random Variable, i.e., if
$X \sim \mathcal{N}(\mu, \sigma^2)$, and $Y = e^X$,
then $Y$ is said to be Log-Normally Distributed, i.e.,
$Y \sim \mathrm{LogN}(\mu, \sigma^2)$.
Suppose the MGF of $X$ is Given, i.e., $\Phi_X(s)$ is Given.

Find the $n$th Moment of $Y$, where $n \in \mathbb{Z}$, Without Finding the PDF of $Y$.

Thursday, 19 April 2012

A Very Interesting Probability Problem

Consider a Line Segment of Length $L$. Randomly Select a Point on Each Side of the Midpoint of the Line Segment. What is the Probability that the Distance Between the Two Points Randomly Selected is Greater Than $L/3$?

Testing Math

$$\sum_{k=1}^{N} k = \frac{N(N+1)}{2}$$

This Post is Basically to Test Usage of LaTeX in My Blog. Please Ignore it...