
Posts: 1,124
Joined: Jun 2010
24-11-2010, 04:43 PM


The Least Mean Square (LMS) algorithm, introduced by Widrow and Hoff in 1959 [12], is an adaptive algorithm that uses a gradient-based method of steepest descent [10]. The LMS algorithm uses estimates of the gradient vector computed from the available data, and incorporates an iterative procedure that makes successive corrections to the weight vector in the direction of the negative of the gradient vector, which eventually leads to the minimum mean square error. Compared with other algorithms, the LMS algorithm is relatively simple: it requires neither correlation function calculations nor matrix inversions.
6.2 LMS Algorithm and Adaptive Arrays
Consider a Uniform Linear Array (ULA) with N isotropic elements, which forms the integral part of the adaptive beamforming system as shown in the figure below. The output of the antenna array is given by,

x(t) = a(θ0)s(t) + Σi a(θi)ui(t) + n(t) (6.1)

where s(t) denotes the desired signal arriving at angle θ0 and ui(t) denotes the interfering signals arriving at angles of incidence θi, respectively. a(θ0) and a(θi) represent the steering vectors for the desired signal and the interfering signals, respectively. It is therefore required to reconstruct the desired signal from the received signal amid the interfering signals and additive noise n(t). As shown above, the outputs of the individual sensors are linearly combined after being scaled by the corresponding weights, such that the antenna array pattern is optimized to have maximum possible gain in the direction of the desired signal and nulls in the directions of the interferers. The weights are computed using the LMS algorithm based on the Minimum Squared Error (MSE) criterion. The spatial filtering problem therefore involves estimating the desired signal from the received signal (i.e. the array output) by minimizing the error between the reference signal d(t), which closely matches or has some extent of correlation with the desired signal, and the beamformer output y(t) = w^H x(t). This is a classical Wiener filtering problem, whose solution can be found iteratively using the LMS algorithm.
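The signal model above can be sketched numerically. The snippet below assumes half-wavelength element spacing and the standard narrowband ULA model; the function names `steering_vector` and `array_output` are illustrative, not from the text:

```python
import numpy as np

def steering_vector(theta_deg, n_elements, spacing=0.5):
    # ULA steering vector a(theta) for an arrival angle in degrees,
    # with element spacing given in wavelengths (half-wavelength by default)
    theta = np.deg2rad(theta_deg)
    k = np.arange(n_elements)
    return np.exp(1j * 2 * np.pi * spacing * k * np.sin(theta))

def array_output(s, theta0, interferers, noise_std, n_elements, rng):
    # x(t) = a(theta0)s(t) + sum_i a(theta_i)u_i(t) + n(t); returns an (N, T) array
    T = len(s)
    x = np.outer(steering_vector(theta0, n_elements), s).astype(complex)
    for theta_i, u_i in interferers:
        x += np.outer(steering_vector(theta_i, n_elements), u_i)
    noise = noise_std * (rng.standard_normal((n_elements, T))
                         + 1j * rng.standard_normal((n_elements, T))) / np.sqrt(2)
    return x + noise
```

For a broadside arrival (θ = 0) the steering vector is all ones, since every element receives the wavefront in phase.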

LMS algorithm formulation (All signals are represented by their sample values)
From the method of steepest descent, the weight vector equation is given by [10],
w(n+1) = w(n) + (1/2)μ[−∇(E{e²(n)})] (6.2)
where μ is the step-size parameter that controls the convergence characteristics of the LMS algorithm, and e²(n) is the squared error between the beamformer output y(n) and the reference signal, given by,
e²(n) = [d*(n) − w^H x(n)]² (6.3)
The gradient vector in the above weight update equation can be computed as
∇(E{e²(n)}) = −2r + 2Rw(n) (6.4)
In the method of steepest descent, the biggest problem is the computation involved in finding the values of r and R in real time. The LMS algorithm, on the other hand, simplifies this by using instantaneous estimates of the cross-correlation vector r and the covariance matrix R in place of their actual values, i.e.
R(n) = x(n)x^H(n) (6.5)
r(n) = d*(n)x(n) (6.6)
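The instantaneous estimate (6.5) is a rank-one, very noisy stand-in for the true covariance; averaged over many snapshots it approaches the true value, which is what makes the simplification workable. A quick numeric check (assuming unit-power white complex Gaussian snapshots, so the true R is the identity):

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 4, 20000
# Unit-power white complex Gaussian snapshots: the true covariance is R = I
x = (rng.standard_normal((N, T)) + 1j * rng.standard_normal((N, T))) / np.sqrt(2)

R_inst = np.outer(x[:, 0], x[:, 0].conj())   # Eq. (6.5): a single-snapshot, rank-one estimate
R_avg = (x @ x.conj().T) / T                 # average of the instantaneous estimates

print(np.linalg.matrix_rank(R_inst))         # 1: a single snapshot cannot capture full R
print(np.round(np.abs(R_avg), 1))            # close to the 4x4 identity
```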
Therefore the weight update can be given by the following equation,
w(n+1) = w(n) + μx(n)[d*(n) − x^H(n)w(n)] (6.7)
       = w(n) + μx(n)e*(n)
The LMS algorithm is initialized with an arbitrary value w(0) for the weight vector at n = 0. The successive corrections of the weight vector eventually lead to the minimum value of the mean squared error.
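The update (6.7) can be written as a short loop. The scenario below (a 4-element array, broadside desired signal with steering vector of all ones, reference d(n) equal to the transmitted symbols, μ = 0.01) is assumed for illustration; for it, the weights converge toward a0/N, the beamformer that coherently sums the desired signal across elements:

```python
import numpy as np

def lms_beamformer(x, d, mu):
    # Eq. (6.7): w(n+1) = w(n) + mu * x(n) * e*(n),
    # with e(n) = d(n) - w^H(n) x(n) and w(0) = 0 as the arbitrary start
    n_elements, T = x.shape
    w = np.zeros(n_elements, dtype=complex)
    for n in range(T):
        xn = x[:, n]
        e = d[n] - np.conj(w) @ xn      # error between reference and beamformer output y(n)
        w = w + mu * xn * np.conj(e)    # steepest-descent step with the instantaneous gradient
    return w

rng = np.random.default_rng(2)
N, T, mu = 4, 5000, 0.01
s = np.exp(1j * 2 * np.pi * rng.random(T))   # unit-modulus desired symbols
a0 = np.ones(N, dtype=complex)               # broadside steering vector
noise = 0.1 * (rng.standard_normal((N, T))
               + 1j * rng.standard_normal((N, T))) / np.sqrt(2)
x = np.outer(a0, s) + noise                  # received snapshots, one column per sample
w = lms_beamformer(x, s, mu)
```

Note that μ must be small enough for stability (0 < μ < 2/λmax, where λmax is the largest eigenvalue of R); here μ·λmax ≈ 0.04, well inside the stable range.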

Posts: 1
Joined: Apr 2011
18-04-2011, 12:16 PM

Hi,
I have read the entire post and I have a doubt in


K.Pradeep kumar

