Oh, thank you.

The topic of my presentation is "A* Orthogonal Matching Pursuit: Best-First Search for Compressed Sensing Signal Recovery", and I will introduce the A*OMP algorithm; I will try not to repeat too much of what has already been said in this presentation.

To give an outline of the presentation: I will start with the compressed sensing reconstruction problem and matching pursuits, then try to give the motivation for incorporating an A* search into this problem. I will then introduce the A*OMP algorithm and demonstrate its reconstruction performance on 1D and 2D reconstruction problems. Finally, I will conclude the presentation with a short summary.

Well, we have already seen a number of good presentations about the compressed sensing reconstruction problem. Principally, we are interested in recovering a K-sparse signal x from a reduced set of measurements: say the signal has length N, and we take M measurements from it, where M is less than N. We would like to reconstruct this K-sparse vector x of length N.

The well-known L0 reconstruction problem can be written as shown here: we are interested in the x with minimum L0 norm that satisfies the observation. Obviously, this is intractable to solve directly, so a number of reconstruction approaches have appeared in the literature.
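To make the setup concrete, here is a minimal sketch of taking M random measurements of a K-sparse signal; the sizes N, M and K below are my own illustrative values, not necessarily the ones used in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: signal length N, M < N measurements, sparsity K.
N, M, K = 256, 100, 10
A = rng.standard_normal((M, N)) / np.sqrt(M)  # random measurement matrix

# A K-sparse signal x: K nonzero coefficients at random positions.
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

y = A @ x  # the reduced set of M observations we reconstruct x from
```

Recovering x from y is underdetermined in general; it is the sparsity constraint that makes the problem well-posed.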

We can categorize them into a few main categories, as listed here, but in this presentation we will mostly be interested in greedy pursuits, and on the next slide I would like to talk about some of them, the matching pursuits.

These are iterative algorithms that either build up the solution from an initial zero solution, like matching pursuit and orthogonal matching pursuit, or refine an initial sparse solution, as in the case of subspace pursuit or compressive sampling matching pursuit.

For our purposes, orthogonal matching pursuit is the most important one, as our algorithm is based on it. OMP identifies one nonzero coefficient per iteration: it selects as the next nonzero coefficient the dictionary atom that has the largest inner product with the residual, and then computes an orthogonal projection of the observation onto the subspace spanned by the set of selected atoms, so that the residual after the projection is orthogonal to the subspace spanned by the selected set.
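The two OMP steps just described, atom selection by maximum inner product and then orthogonal projection, can be sketched as follows; this is a minimal reference implementation of textbook OMP, not the speaker's code:

```python
import numpy as np

def omp(A, y, K):
    """Orthogonal matching pursuit (sketch): one atom per iteration."""
    M, N = A.shape
    support, residual = [], y.copy()
    for _ in range(K):
        # Select the atom with the largest inner product with the residual.
        atom = int(np.argmax(np.abs(A.T @ residual)))
        if atom not in support:
            support.append(atom)
        # Project y orthogonally onto the span of the selected atoms, so the
        # new residual is orthogonal to that subspace.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat
```

When an early atom choice is wrong, OMP cannot undo it; that failure mode is exactly what motivates the multi-path search the talk proposes.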

So this is a single-path strategy: it selects one vector after the other, one nonzero coefficient at a time. In this work, we would like to propose a multi-path strategy that instead searches for the solution among a number of candidates, as you can see here. We search for the solution among a number of dynamically evolving candidates, and provided we have a proper path selection algorithm to find the best, or most promising, path, we can follow the most promising candidates and finally reach the solution.

To solve this multi-path problem, I will introduce the A* orthogonal matching pursuit, in short A*OMP, algorithm, which combines the A* search with orthogonal matching pursuit.

So first, let's have a look at how we can represent this compressed sensing problem as a tree search. This is trivial, but I want to give it for the sake of completeness. The nodes represent dictionary elements, as you can see, and a dictionary atom may appear at different nodes of the tree, multiple times. Each path is then a candidate solution. Here, for example, we see a path of length four; as you can see, its residual carries a superscript and a subscript, where the superscript is the index of the path and the subscript, four here, is the length of the path. We compute a cost for each path based on this residual; I will come to this on the next slide.

So our aim is to build up and dynamically evaluate the search tree, as in A* search: at each iteration we first choose the best path in the tree, and then expand this path. Here is an overview of how this works. We first initialize the search tree with I nodes, that is, I initial paths of length one. We select the best path in the tree; initially, each path consists of one single node. We start the iterations by expanding this best path with its best children, where the number of extensions is restricted to B. We update the residues and the costs of the paths, add them to the tree, and once again select the best path. Now, say this second path turns out to be the best: it is expanded in the next iteration, and then once again the best path is selected, until convergence. The convergence criterion is that we stop when the best path has length K. Here, for example, K is four: we stop when this path is selected as the best path.

To understand this better, I would like to define three important stages of the algorithm. The first is the initialization stage, where we choose the I nodes as the atoms with the largest inner products with y; this is similar to the first OMP step. Second, we have the selection of the best path, and here there is the problem of comparing paths of different lengths. And third, there is the problem of how we expand the best path, or in other words, how we limit the number of paths in the tree so that the search stays tractable.

As for the best path selection, we have the so-called cost function of the A* search. Here you can see the problem: I have a path of length four, and I would like to compare it to a second path of length two, so I have to compensate for the nonexistent part of the shorter path in order to have comparable costs. The A* search introduces an auxiliary function to compensate for this nonexistent part.

So this auxiliary function should reflect how much the residual would decrease if the path were complete, that is, of length K, and for this we need to define cost models. It is not easy to find cost models that always hold exactly; what we want are models that hold generally.

As for the cost models, I would like to introduce three schemes. The first one is the additive scheme: here I have a path of length two with its corresponding residual, and I assume that each unexplored node would decrease the residual by a constant amount proportional to the L2 norm of the observation. You can also see the general form here, for a path of length l.

As a second example, for the adaptive scheme I once again have the same path, and I assume that the addition of a new node to the path decreases the residual by an amount proportional to the difference of the L2 norms of the residuals at the last two nodes; so here, that is the L2 norm of r1 minus the L2 norm of r2. We can also write this in the general form you see here, for any path of length l.

And finally, there is another model, called the multiplicative one. Here we assume that the addition of a new node reduces the residual by a constant ratio alpha, applied to the residual of the selected path. This can also be generalized as the form shown here.
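Under my own naming, with delta and alpha as the tunable constants (the talk's exact symbols may differ), the three cost models could be sketched like this; each returns the compensated cost of a length-l path with residual norm `res_norm`, for target sparsity K:

```python
def cost_additive(res_norm, l, K, y_norm, delta=0.4):
    # Each of the (K - l) unexplored nodes is assumed to shrink the
    # residual by a constant amount proportional to ||y||_2.
    return res_norm - (K - l) * delta * y_norm

def cost_adaptive(res_norm, prev_res_norm, l, K):
    # Each unexplored node is assumed to shrink the residual by the same
    # amount as the last node actually added to the path.
    return res_norm - (K - l) * (prev_res_norm - res_norm)

def cost_multiplicative(res_norm, l, K, alpha=0.8):
    # Each unexplored node is assumed to shrink the residual by a
    # constant ratio alpha < 1.
    return alpha ** (K - l) * res_norm
```

Note that a complete path (l = K) gets no compensation in any scheme, which is what makes paths of different lengths comparable.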

So after we apply these cost models, we can select the best path, the one with minimum cost, and now we face the problem of expanding it. A* search, in its original form, expands all the children of the selected path; applying this in our tree would result in a number of paths on the order of N to the power K, which would be intractable. So we have to apply some pruning.

The first pruning mechanism is the extensions pruning: we prune the children of the selected path beyond the best B, that is, we extend the path only with the B atoms that have the largest inner products with the residual of the path. The second pruning is the tree size pruning, which limits the number of paths in the tree to P: we prune the tree whenever the number of paths exceeds P, removing the paths with the worst costs.

We also have to take care of equivalent paths. Permutations of the nodes of a path exist in the tree, and we regard such permutations of the same nodes as equivalent paths. An example is here: we have path 1 and path 2, which contain the same nodes in different orders. As you see, the first three nodes of path 1 and path 2 share the same set of atoms. By the orthogonality of the projection, I know that the residual at this node of path 1 will be equal to the residual at this node of path 2, so I am sure that if path 2 were selected and expanded, the two would yield exactly the same result; hence path 1 and path 2 are equivalent. But the case for path 3 is different: it has node 5 in the middle, which is not among the first three nodes of path 1. This changes everything, and path 1 and path 3 are not equivalent; the residuals here and here would not be the same in general.
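Because the orthogonal projection makes a path's residual depend only on which atoms it contains, not on their order, the equivalence check reduces to a set comparison, sketched here:

```python
def equivalent(path_a, path_b):
    """Two paths are equivalent iff they select the same set of atoms:
    the orthogonal projection makes the residual order-independent."""
    return set(path_a) == set(path_b)
```

So a reordering of the same atoms can be pruned as a duplicate, while a path containing even one different atom, like path 3 above, must be kept.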

We can now illustrate a single iteration of A*OMP with one initial node, the tree limited to four paths, and the number of extensions restricted to three. We have the initial tree, and the best path is selected as path 4, with minimal cost. The best extensions turn out to be nodes 2, 8 and 9, that is, those with the largest inner products with the residual of the path. We first add node 2 to path 4; this does not change the number of paths in the tree, so we are fine. Then we have to add node 8; this creates a new path, so we now have five paths in the tree, and the worst one has to be removed. Last, we consider node 9: we see that if we add node 9, we will have a new path whose cost is higher than all the paths in the tree, so, as we already have our four paths, we ignore this path. That concludes one single iteration of the algorithm.
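Putting the pieces together, one iteration — best-path selection, B best extensions, tree-size pruning to P paths — might look like the following sketch, using the multiplicative cost and paths stored as tuples of atom indices; the structuring is my own, not the speaker's code:

```python
import numpy as np

def astar_omp_iteration(A, y, paths, K, B=3, P=4, alpha=0.8):
    """One sketched A*OMP iteration over `paths` (tuples of atom indices)."""

    def cost(support):
        # Orthogonal projection onto the selected atoms, then the
        # multiplicative compensation for the unexplored part of the path.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        res = np.linalg.norm(y - A[:, support] @ coef)
        return alpha ** (K - len(support)) * res

    # Best-path selection: the path with minimum compensated cost.
    best = min(paths, key=lambda p: cost(list(p)))
    paths = [p for p in paths if p != best]

    # Extensions pruning: keep only the B atoms most correlated with the
    # residual of the best path.
    coef, *_ = np.linalg.lstsq(A[:, list(best)], y, rcond=None)
    residual = y - A[:, list(best)] @ coef
    children = np.argsort(-np.abs(A.T @ residual))[:B]
    for atom in children:
        if atom not in best:
            paths.append(best + (int(atom),))

    # Tree-size pruning: drop the worst paths once more than P remain.
    paths.sort(key=lambda p: cost(list(p)))
    return paths[:P]
```

Iterating this until the cheapest path reaches length K reproduces the stopping rule described earlier; a real implementation would also cache residuals and apply the equivalent-path check rather than recomputing projections each time.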

Now I would like to demonstrate the performance of the algorithm in a 1D scenario. We have the reconstruction of 1D signals from M observations; the signal is of length 256, and the sparsity K varies between 10 and 50. The nonzero coefficients are drawn from the standard normal distribution. The red curve shows the performance of A*OMP with the multiplicative cost function, and as you can see, its normalized mean squared error lies below that of its competitors, namely subspace pursuit, basis pursuit and orthogonal matching pursuit. The same is also true for the exact reconstruction rates, so our algorithm beats the others.
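The error metric reported on these curves, together with a generator for test signals matching this setup (length 256, K standard-normal nonzeros), can be sketched as:

```python
import numpy as np

def nmse(x, x_hat):
    """Normalized mean squared error between target and reconstruction."""
    return np.sum((x - x_hat) ** 2) / np.sum(x ** 2)

def draw_test_signal(N=256, K=25, rng=None):
    """A length-N signal with K standard-normal nonzero coefficients."""
    rng = rng or np.random.default_rng(0)
    x = np.zeros(N)
    x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
    return x
```

Averaging `nmse` over many such draws, for each K, gives curves of the kind shown on the slide.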

The second case here is very similar to the first one, except that the nonzero coefficients are drawn from the uniform distribution between minus one and one. Here I plot all three cost models for A*OMP, that is, the adaptive, additive and multiplicative cost models. We see that in terms of normalized mean squared error, A*OMP still outperforms the others until somewhere where the error is already quite high. As for the exact reconstruction rates, the same also holds, except for the additive model, whose reconstruction rates fall short; so we can conclude that the adaptive and multiplicative strategies are far better for our purposes than the additive one.

So finally, we have some results on an image reconstruction problem. We block-process the images in 8-by-8 blocks; each block was assumed to be 14-sparse, and we take the same reduced number of observations from each block. These are well-known test images.
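The block processing described here, splitting an image into 8-by-8 blocks that are vectorized so each can be measured and reconstructed independently, can be sketched as follows (dimensions assumed divisible by 8):

```python
import numpy as np

def blocks_8x8(img):
    """Vectorize the 8x8 blocks of an image, one column per block."""
    H, W = img.shape
    cols = [img[i:i + 8, j:j + 8].reshape(64)
            for i in range(0, H, 8)
            for j in range(0, W, 8)]
    return np.stack(cols, axis=1)  # shape (64, number of blocks)
```

Each column would then be multiplied by the block measurement matrix and reconstructed on its own, and the reconstructed columns reshaped back into blocks to assemble the image.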

We see that the multiplicative A*OMP here results in better reconstruction PSNRs than all the other algorithms, and increasing the number of branches per iteration increases the performance further. Here we can see two images: this is the basis pursuit reconstruction of "lena", and this is the A*OMP reconstruction. I hope the differences are visible, for example here in the bottom regions, or in this textured region. Alright, this may be more visible now: here are the basis pursuit reconstruction, the target image, and the A*OMP reconstruction. Clearly, the bottom of the A*OMP image is much closer to the target image, and also here, for the hair region, it is much closer to the target. We can also compare directly the pixel errors of basis pursuit and A*OMP with respect to the target: as you can see, for example along the boundaries, the errors of basis pursuit are clearly larger, while those introduced by A*OMP are much smaller.

To conclude, I have presented a multi-path strategy that combines best-first search and orthogonal matching pursuit. In this algorithm, our aim is to build up and evaluate dynamically a search tree, while selecting the paths that minimize the cost function. As for this cost function, we introduced two dynamic schemes, the so-called multiplicative and adaptive ones; we also introduced an additive scheme, but the multiplicative and adaptive ones performed better in the examples. And we showed that A*OMP provides better reconstruction than OMP, SP and BP in the demonstrated examples. Finally, I would also like to note that the code of A*OMP is available at my website, where we also provide a real-time implementation of it. So thank you for your attention.

Oh, yes, okay. Well, the computation time depends on the number of nodes expanded in the tree, and I should say that it is much higher than for basis pursuit in this case, when you expand many nodes per iteration. I can say that, on my computer, it took about ten seconds for, for example, this experiment, for K equal to 25; the experiment ran over five hundred random instances, and each one took something like ten seconds. But in terms of a computational analysis, you have to find a limit on the number of paths in the tree. You can find very loose upper bounds, but these do not really represent the complexity of the algorithm, so we are working on tightening this bound; I cannot say anything more now.

Other questions?

One can show that A* will find the optimal solution only under some conditions on the future expansion cost, namely that it is a lower bound on the real cost. So how can you guarantee that your cost models, for example the adaptive model, really satisfy this lower-bound condition? Do you have a proof for that?

Um, we do not have a rigorous proof for that. But, as I said here, it is really hard to find exact models: a model of the residual decrease, a look-ahead of that sort, that would always hold exactly does not exist. And, as I said before, if we had known it, we would already have an exact reconstruction algorithm. So it is hard to prove. What one can probably do is show that these cost models tend to hold in general; I think that can only be done in a probabilistic sense, similar to the way A* itself has been analyzed, and we have ongoing work on that. There is also a trade-off here: if you select the parameter, say delta, very high, the model holds exactly, but then you have to expand too much of the tree to reach the solution. Other than that, I think one can attack the problem of finding a relation between these costs and the exact ones.

And just one more comment: there was a lot of work in the coding theory community on list decoding, and this falls into the category of stack or sequential list decoders. There was the list Viterbi algorithm, which works in a very similar way: partially constructing a list of candidates, pruning the list, and at the end you keep a list of K candidates, whereas in your case you keep only one path.

Yeah, they keep a list of candidates at the end, and you can then decide, with a further test, which one of these candidates to pick. In our case, we also keep a number of paths during the search, but finally it turns into the one path that we keep. If you instead keep a list at the end of your search, you have a small set of candidates, and you can play with that list, for example using side information to decide which one to output; the algorithm in our case returns only the one best path.

Okay, thanks.

Any other questions? If not, let's thank the speaker again.

Thank you.