0:00:13 Distributed Gaussian particle filtering using likelihood consensus. This is joint work with Ondrej Slučiak, Franz Hlawatsch, Petar Djurić, and Markus Rupp.
0:00:23 So first, let me summarize the contribution of this paper.
0:00:29 We propose a distributed implementation of the Gaussian particle filter, which was originally introduced in a centralized form in the paper of Kotecha and Djurić in 2003.
0:00:44 In this distributed implementation, each sensor computes a global estimate based on the joint likelihood function of all sensors.
0:01:00 The joint likelihood function, or its approximation, is obtained at each sensor in a distributed way using the likelihood consensus scheme, which we proposed in our previous paper at the 2010 Asilomar conference.
0:01:16 Here, we also use a second stage of consensus algorithms to reduce the complexity of the distributed Gaussian particle filter.
0:01:27 Here is a brief comparison with some other consensus-based distributed particle filters.
0:01:35 In the first one, from 2010, no approximations are used, so the estimation performance can be better; on the other hand, the communication requirements can be much higher than in our case.
0:01:50 In the second one, from 2008, only local likelihood functions are used; in contrast, we use the joint likelihood function at each sensor, so the estimation performance of our method is better.
0:02:05 Okay, so let's start with an overview of distributed estimation in wireless sensor networks.
0:02:12 We consider a wireless sensor network composed of K sensor nodes that jointly estimate a time-varying state x_n. Each of the sensors, indexed by k, obtains a measurement vector z_{n,k}.
0:02:31 An example is localization and/or tracking based on the sound emitted by a moving target.
0:02:39 The goals that we would like to achieve are the following: each sensor node should obtain a global state estimate x̂_n based on the measurements of all the sensors in the network; this may be important, for example, in sensor-actuator or robotic networks.
0:02:55 We would like to use only local processing and short-distance communications with neighbors; no fusion center should be used, and no routing of measurements throughout the network.
0:03:07 We also wish to perform sequential estimation of the time-varying state x_n from the current and past measurements of all sensors in the network.
0:03:19 We consider a nonlinear, non-Gaussian state-space model, but with independent additive Gaussian measurement noises.
0:03:27 Such a system is described by the state-transition pdf and the joint likelihood function, or JLF, where z_n is the collection of measurements from all sensors.
0:03:41 In this case, optimal Bayesian estimation amounts to the calculation of the posterior pdf, and sequential estimation is enabled by a recursive posterior update, where we turn the previous posterior into the current one using the state-transition pdf and the joint likelihood function.
0:04:00 The joint likelihood function is important if we want to obtain global results based on the measurements of all sensors.
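As a reminder, the recursive posterior update just described is the standard Bayesian filtering recursion (not specific to this paper), with z_{1:n} denoting all measurements of all sensors up to time n:

```latex
f(x_n \mid z_{1:n}) \propto f(z_n \mid x_n) \int f(x_n \mid x_{n-1})\, f(x_{n-1} \mid z_{1:n-1})\, dx_{n-1}
```

Here f(x_n | x_{n-1}) is the state-transition pdf and f(z_n | x_n) is the joint likelihood function.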
0:04:09 Okay, so now let's have a look at the distributed Gaussian particle filter.
0:04:14 It is well known that for nonlinear, non-Gaussian systems, optimal Bayesian estimation is typically infeasible, and a computationally feasible approximation is provided by particle filtering, or the sequential Monte Carlo approach.
0:04:29 One of the many particle filters is the Gaussian particle filter proposed in the paper mentioned above [Kotecha and Djurić, 2003], where the posterior is approximated by a Gaussian pdf, and the mean and covariance of this Gaussian approximation are obtained from a set of weighted samples, or particles.
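A minimal sketch of this moment-matching step (variable names are illustrative, not from the paper):

```python
import numpy as np

def gaussian_approximation(particles, weights):
    """Weighted sample mean and covariance of a particle set.

    particles: (J, d) array of state samples; weights: (J,) particle weights.
    """
    w = weights / weights.sum()          # normalize the weights
    mean = w @ particles                 # weighted sample mean
    diff = particles - mean
    cov = (w[:, None] * diff).T @ diff   # weighted sample covariance
    return mean, cov
```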
0:04:50 What we propose is a distributed implementation of the Gaussian particle filter, where each sensor uses a local Gaussian particle filter to sequentially track the mean and covariance of a local Gaussian approximation of the global posterior.
0:05:10 In this case, the measurement update at each sensor uses the global joint likelihood function, which ensures that global estimates are obtained.
0:05:20 The JLF is provided to each sensor in a distributed way using the likelihood consensus scheme that we propose in this paper.
0:05:29 Some advantages are that the consensus algorithms employed by the likelihood consensus require only local communications and operate without routing protocols, and also that no measurements or particles need to be exchanged between the sensors.
0:05:45 So here I'll show the steps that each sensor performs, i.e., the steps of a local Gaussian particle filter; a code sketch of one time step follows below.
0:05:54 First, at time n, each sensor obtains the Gaussian approximation of the previous global posterior.
0:06:01 Then it draws particles from this Gaussian approximation and propagates them through the state-transition model; basically, it samples new predicted particles from the state-transition pdf.
0:06:14 Then we need to calculate the joint likelihood function at each sensor, and to do this we use the likelihood consensus; this step requires communication between the neighboring sensors.
0:06:28 Afterwards, each sensor can update the particle weights using the obtained joint likelihood function. This is how it's done: we evaluate the joint likelihood function at the predicted particles. This is why we need the joint likelihood function at each sensor as a function of the state x_n, for pointwise evaluation.
0:06:50 Once we have the particles and weights, we can calculate again the mean and covariance of the Gaussian approximation of the global posterior, and the state estimate is basically equal to the mean calculated here.
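A minimal sketch of one such time step, assuming two hypothetical helpers: transition_sample draws from the state-transition pdf, and joint_loglik stands in for the (approximate) joint log-likelihood that each sensor obtains via likelihood consensus.

```python
import numpy as np

def local_gpf_step(mean_prev, cov_prev, transition_sample, joint_loglik, J=1000):
    """One time step of a local Gaussian particle filter (illustrative sketch)."""
    rng = np.random.default_rng()
    # 1) draw particles from the previous Gaussian approximation
    particles = rng.multivariate_normal(mean_prev, cov_prev, size=J)
    # 2) propagate: sample predicted particles from the state-transition pdf
    predicted = np.array([transition_sample(x) for x in particles])
    # 3) weight the predicted particles by the (approximate) joint likelihood
    logw = np.array([joint_loglik(x) for x in predicted])
    logw -= logw.max()                   # numerical stabilization
    w = np.exp(logw)
    w /= w.sum()
    # 4) moment matching: Gaussian approximation of the global posterior
    mean = w @ predicted
    diff = predicted - mean
    cov = (w[:, None] * diff).T @ diff
    return mean, cov                     # the state estimate is the mean
```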
0:07:04 So now let's have a look at how the likelihood consensus scheme operates.
0:07:10 In this paper, we consider the following measurement model. We have a measurement function h_{n,k}(x_n), which is in general nonlinear; it depends on the sensor index k and possibly also on the time n. And we have additive Gaussian measurement noise, which is assumed to be independent from sensor to sensor.
0:07:35 Due to this, we obtain the joint likelihood function as a product of the local likelihood functions, and therefore, in the exponent of the joint likelihood, we have a sum over all sensors; this is the expression S_n here.
0:07:49 For the purposes of statistical inference, this expression S_n completely describes the joint likelihood function, so we will focus on a distributed calculation of S_n.
0:07:59 It will be necessary to obtain S_n as a function of the state x_n; z_n is just the collection of measurements from all sensors, and it is observed and hence fixed.
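In symbols, a sketch consistent with this description (the noise covariances Q_{n,k} are my notation, not necessarily the paper's):

```latex
z_{n,k} = h_{n,k}(x_n) + v_{n,k}, \qquad v_{n,k} \sim \mathcal{N}(\mathbf{0}, Q_{n,k}),
```
```latex
f(z_n \mid x_n) = \prod_{k=1}^{K} f(z_{n,k} \mid x_n) \propto \exp\!\big(S_n(x_n)\big), \qquad
S_n(x_n) = -\frac{1}{2} \sum_{k=1}^{K} \big(z_{n,k} - h_{n,k}(x_n)\big)^{\top} Q_{n,k}^{-1} \big(z_{n,k} - h_{n,k}(x_n)\big).
```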
0:08:10 Now, a direct calculation of S_n would require that each sensor knows the measurements and also the measurement functions of all other sensors in the network. But we assume that, initially, each sensor only has its local information, so we would need to somehow route this local information from each sensor to every other sensor.
0:08:30 Instead, we choose another approach: we suitably approximate S_n by suitably approximating the sensor measurement functions locally, and we do the approximation in such a way that we can then use consensus algorithms to compute S_n.
0:08:49 Here, we use a polynomial approximation of the sensor measurement functions, so h̃ is the polynomial approximation. The functions p_r(x_n) here are basically the monomials of the polynomial, but in principle we could use other basis functions to obtain a more general approximation.
0:09:11 The coefficients α of this approximation are calculated using a least-squares polynomial fit, and as the data points for this least-squares fit we use the predicted particles of the particle filter.
0:09:25 It is important to note that the approximation, i.e., the α coefficients, is obtained locally at each sensor, so we don't need to communicate anything to do that.
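A minimal sketch of this local fitting step (monomial_basis and h_local are illustrative names; the paper's exact basis ordering may differ):

```python
import numpy as np
from itertools import combinations_with_replacement

def monomial_basis(X, degree):
    """Evaluate all monomials up to `degree` at the rows of X."""
    n, d = X.shape
    cols = [np.ones(n)]
    for deg in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), deg):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

def fit_measurement_function(h_local, predicted_particles, degree=2):
    """Least-squares polynomial fit of a sensor's local measurement function,
    using the predicted particles as data points (no communication needed)."""
    P = monomial_basis(predicted_particles, degree)            # design matrix
    y = np.array([h_local(x) for x in predicted_particles])    # h_k at particles
    alpha, *_ = np.linalg.lstsq(P, y, rcond=None)              # alpha coefficients
    return alpha
```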
0:09:38 Now, if we substitute the polynomial approximations h̃ for h in the expression for S_n, we obtain an approximation S̃_n.
0:09:49 Since the h̃ are polynomials, the sum over all sensors is also a polynomial, but of twice the degree, so we can write it as the polynomial we see here.
0:10:05 Its β coefficients contain, for each sensor, all of its local information, i.e., its measurement as well as the α coefficients of the approximation of its local measurement function.
0:10:19 What's important is that these coefficients are independent of the state x_n; the only way the state enters this expression is through these monomials, or more general basis functions.
0:10:31 And now, if we exchange the order of summation here, we get a polynomial whose coefficients t are obtained as a sum over all sensors, and therefore these coefficients contain information from the entire network.
0:10:50 So we can view them as a sufficient statistic that fully describes S̃_n, and in turn also the approximate joint likelihood function.
0:11:00 If each sensor knows these coefficients t, then it can evaluate the approximate joint likelihood function for more or less any value of the state x_n.
0:11:14 Since these coefficients are obtained, as already said, as a summation over all sensors, they can be computed at each sensor using a distributed consensus algorithm.
0:11:25 So this is basically how it operates: each sensor locally computes the coefficients β from its locally available data, and then the sum over all sensors is computed in a distributed way using consensus.
0:11:38 This requires only the transmission of some partial sums to the neighbors, so we don't need to transmit measurements or particles, and the communication load can therefore be much lower.
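A minimal sketch of such a distributed summation via average consensus (the synchronous update and the step size epsilon are my assumptions; any standard consensus variant would do):

```python
import numpy as np

def consensus_sum(beta, adjacency, n_iter=8, epsilon=0.1):
    """Network-wide sum of per-sensor coefficient vectors via average consensus.

    beta: (K, M) local beta coefficients, one row per sensor.
    adjacency: (K, K) 0/1 matrix of the communication graph.
    epsilon must be below 1/(max node degree) for convergence.
    Each iteration exchanges partial results only between neighbors.
    """
    K = beta.shape[0]
    t = beta.astype(float).copy()
    deg = adjacency.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        # each sensor moves toward the average of its neighbors' values
        t = t + epsilon * (adjacency @ t - deg * t)
    return K * t   # each row: that sensor's estimate of the network-wide sum
```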
0:11:51 Okay, I'll just briefly mention a reduced-complexity version of the distributed Gaussian particle filter.
0:11:59 In this reduced-complexity version, each of the K local Gaussian particle filters uses a reduced number of particles J', so we reduce the number of particles by a factor equal to the number of sensors, and we calculate a partial mean and a partial covariance of the global posterior, again using the joint likelihood function obtained via the likelihood consensus.
0:12:28 Afterwards, these partial means and covariances can be combined by means of a second stage of consensus algorithms.
0:12:36 If the second stage uses a sufficient number of iterations, then the estimation performance of the reduced-complexity version will be effectively equal to that of the original one. So we reduce the computational complexity, but of course we introduce some new communications, so it comes at the cost of some increase in communication.
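One way such a combination can be expressed (a sketch in my notation, with x_{k,j} and w_{k,j} the j-th particle and weight at sensor k; the paper's exact normalization may differ): the global mean is a ratio of two network-wide sums, each computable by the second consensus stage,

```latex
\hat{x}_n \approx \frac{\sum_{k=1}^{K} \sum_{j=1}^{J'} w_{k,j}\, x_{k,j}}{\sum_{k=1}^{K} \sum_{j=1}^{J'} w_{k,j}}.
```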
0:13:01 Okay, now I'll show you a target tracking example and some simulation results.
0:13:07 In this example, the state represents the 2-D position and the 2-D velocity of the target, and it evolves according to this state-transition equation.
0:13:20 We simulate a network of randomly deployed acoustic amplitude sensors that sense the sound emitted by the target.
0:13:32 The measurement model is the following: the sensor measurement function is basically given here, and it is the amplitude of the source divided by the distance between the target and the sensor.
0:13:46 In principle, the sensor positions can be time-varying, so we could apply this method also to dynamic sensor networks.
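In symbols, a sketch of the stated model (the symbol names are illustrative):

```latex
h_{n,k}(x_n) = \frac{A}{\lVert p_n - \xi_{n,k} \rVert}
```

where A is the amplitude of the source, p_n is the position part of the state x_n, and \xi_{n,k} is the (possibly time-varying) position of sensor k.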
0:13:56 This is the setting: we deploy the sensors in a field of dimension 200 by 200 meters, and the network consists of 25 acoustic amplitude sensors.
0:14:08 The proposed distributed Gaussian particle filter and its reduced-complexity version are compared with a centralized Gaussian particle filter.
0:14:17 We use 1000 particles, and to approximate the measurement functions we use polynomials of degree 2, which leads to 14 consensus algorithms that need to be executed in parallel; so basically, in one iteration of the likelihood consensus, we need to transmit 14 real numbers.
0:14:34 We compare the likelihood consensus using eight consensus iterations with the case where we calculate the sums exactly; the latter can be seen as the asymptotic case, i.e., more or less an infinite number of consensus iterations.
0:14:54 Okay, here, just as an illustration, we see that the green line is the true target trajectory and the red one is the tracked one.
0:15:02 This is just the result from one of the sensors, but in principle all sensors obtain the same result.
0:15:08 Here is the root-mean-square error performance versus time.
0:15:13 The black line is the centralized case, and as expected it is the best one.
0:15:17 If we look at the distributed case with exact sum calculation, that's the red line, there is a slight performance degradation. And of course, if we only use eight iterations of consensus, we get the blue line, which has again slightly worse performance. But even if we compare the blue and red lines to the black one, i.e., to the centralized case, the performance degradation is not so large.
0:15:42 Here is the average RMSE, averaged also over time, versus the measurement noise variance. As the noise variance rises, the error rises too, but more or less the comparison between the three methods is the same as in the first figure.
0:15:58 Here is the dependence of the estimation error on the number of consensus iterations, and of course, as the number of iterations increases, the performance gets better.
0:16:08 What's interesting is that when we compare the solid blue curve with the solid red one, i.e., the distributed Gaussian particle filter and its reduced-complexity version, for a lower number of iterations the reduced-complexity version has slightly better performance. We could explain this more or less in such a way that the second stage of consensus algorithms helps to further diffuse the local information throughout the network.
0:16:38 Okay, so to conclude: we proposed a distributed Gaussian particle filtering scheme in which each sensor runs a local Gaussian particle filter that computes a global state estimate reflecting the measurements of all sensors.
0:16:52 To do this, we update the particle weights at each sensor using the joint likelihood function, which we obtain in a distributed way via the likelihood consensus.
0:17:04 A nice thing about the likelihood consensus is that it requires only local communications of some sufficient statistics, so no measurements or particles need to be communicated, and it is also suitable for dynamic sensor networks.
0:17:17 We also proposed a reduced-complexity variant of the distributed Gaussian particle filter.
0:17:23 The simulation results indicate that the performance is good even in comparison with the centralized Gaussian particle filter.
0:17:31 Okay, so that concludes my talk. Thanks.
0:17:48 [Audience question, partly inaudible:] ...is it insensitive to K, the value you choose? You take a static number of polynomial terms, right?
0:17:59 Yeah, that's the order of the polynomial, yes. And whether this approximation is good for all K — you mean all sensors? Yes.
0:18:07 In this application, we use the same type of measurement function at each sensor, so we also used the same approximation for all sensors. But in principle, you could have different measurement functions at different sensors, and then you would need to use different orders of polynomials.
0:18:34 [Audience question, partly inaudible:] ...arrive at the same value of the global likelihood function?
0:18:48 Well, you can only guarantee this by using...
0:18:54 I think you cannot guarantee it; it depends on the size of your network, and the bigger the network, the more iterations you need.
0:19:07 Yes, there are slight differences, depending on the number of iterations.
0:19:16 [Inaudible audience question about resampling.]
0:19:19 Hmm, no, actually, in the Gaussian particle filter you don't need any resampling, because you construct the Gaussian posterior and then you sample anew.
0:19:27 But yes, if you have an insufficient number of consensus iterations, then, because each of the nodes operates separately, each node has its own set of particles and its own set of weights, and there will be slight differences.
0:19:43 [Inaudible audience exchange.]
0:20:00 No, no, that's not the case; it's just as you're saying.
0:20:10 Yeah, okay.