0:00:13 | oh thank you |
---|---|

0:00:14 | the topic of my presentation is best-first search for compressed sensing signal recovery |

0:00:19 | and i will introduce the a star orthogonal matching pursuit algorithm in this presentation |

0:00:26 | to give an outline of the presentation i will start with the compressed sensing reconstruction problem and matching pursuits |

0:00:35 | then try to give the motivation for incorporating a multi path search into the problem |

0:00:41 | i will introduce the a star omp algorithm |

0:00:45 | and demonstrate its reconstruction performance on one and two d reconstruction problems |

0:00:52 | then i will conclude the presentation with a short summary |

0:00:57 | we have already seen a number of good presentations about the compressed sensing reconstruction problem |

0:01:04 | principally we are interested in recovering a k sparse signal x from a reduced set of measurements |

0:01:12 | say the signal has length n and we take m measurements from it where m is less than n |

0:01:17 | and we would like to reconstruct this k sparse vector x of length n |

0:01:23 | the well known l0 reconstruction can be written as here we are interested in the minimum l0 norm x that satisfies the observation |

0:01:32 | obviously this is directly intractable so a number of reconstruction approaches have appeared in the literature |
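The measurement setup just described can be sketched numerically; n = 256 matches the later experiments, while the values of m and k below are assumptions chosen only for illustration:

```python
import numpy as np

# Hypothetical compressed sensing setup: a K-sparse signal x of length N is
# observed through M < N linear measurements y = Phi @ x.
rng = np.random.default_rng(0)
N, M, K = 256, 100, 10

x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)             # K nonzero coefficients

Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # random sensing matrix
y = Phi @ x                                     # M measurements, M < N
```

Recovering x from y then amounts to the l0 problem above, which motivates the tractable pursuit algorithms discussed next.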

0:01:40 | we can categorise them into the main categories that i listed here |

0:01:45 | but in this presentation we will mostly be interested in the greedy pursuits |

0:01:52 | on the next slide i would like to talk about a certain family of them the matching pursuits |

0:01:59 | these are iterative algorithms that either build up the solution from an initially empty support like matching pursuit and orthogonal matching pursuit |

0:02:10 | or they refine a sparse solution as in the case of subspace pursuit or compressive sampling matching pursuit |

0:02:18 | for our purposes the orthogonal matching pursuit is the most important one as our algorithm is based on it |

0:02:24 | omp identifies one nonzero coefficient per iteration |

0:02:32 | it selects as the next nonzero coefficient the dictionary atom that has the largest inner product with the residual |

0:02:43 | and then we compute an orthogonal projection of the observation onto the subspace spanned by the set of selected atoms so that the residual after the projection is orthogonal to the subspace spanned by the selected set |
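The omp iteration just described, greedy atom selection followed by an orthogonal projection, can be sketched as follows; this is a minimal illustration, not the speaker's implementation:

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal matching pursuit sketch: at each iteration pick the atom
    with the largest inner product with the residual, then orthogonally
    project y onto the span of all atoms selected so far, so the new
    residual is orthogonal to that subspace."""
    M, N = Phi.shape
    support = []
    residual = y.copy()
    coef = np.zeros(0)
    for _ in range(K):
        idx = int(np.argmax(np.abs(Phi.T @ residual)))  # best-matching atom
        support.append(idx)
        # orthogonal projection onto the selected subspace
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat
```

With an orthonormal dictionary the inner products equal the remaining coefficients exactly, so the largest entries are identified one by one and the projection removes them from the residual.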

0:02:59 | so omp follows a single path strategy it selects one vector after the other or one nonzero coefficient after the other |

0:03:09 | in this work we would like to propose a multi path search that instead searches for the solution among a number of candidates as you can see here |

0:03:20 | so we try to search for the solution among a number of dynamically growing candidates |

0:03:26 | and provided a proper path selection algorithm to find the best path or the most promising one we can follow the most promising ones and finally reach the solution |

0:03:41 | in order to solve this multi path problem i will introduce the a star orthogonal matching pursuit in short a star omp algorithm |

0:03:48 | that combines the a star search with orthogonal matching pursuit |

0:03:53 | so first let's have a look at how we can represent this compressed sensing problem this sparse reconstruction problem with a search tree |

0:04:00 | all this is trivial but i want to give it for the sake of completeness |

0:04:04 | the nodes represent the dictionary elements as you can see and a dictionary atom may appear on different paths |

0:04:16 | each path then is a candidate solution |

0:04:19 | here for example we see a path of length four |

0:04:22 | as you can see it has superscripts and the superscript is the number of the path |

0:04:31 | each path also has its own residual |

0:04:35 | and we compute a cost for each path based on this residual i will come to this on the next slide |

0:04:41 | so our aim is to build up and dynamically evaluate the search tree the tree that will be searched by a star |

0:04:47 | at each iteration we will first choose the best path in the tree and then expand this path |

0:04:54 | so that we may have an overview of the algorithm here |

0:04:58 | we first initialize the search tree with i nodes here i is set to one so we have one initial path in the tree |

0:05:05 | we select the best path in the tree which is trivial with this one single path |

0:05:09 | and we start iterations by expanding this best path with its children |

0:05:15 | here the number of extensions is restricted to b we update the residues and the costs of the paths for each path added to the tree and then once again select the best path |

0:05:26 | and now we have these two paths but this one turns out to be the best so it is expanded once again in the next iteration |

0:05:32 | then once again the best path is selected until convergence |

0:05:36 | and the convergence criterion is that we stop when the best path has length k |

0:05:43 | so here for example where k is four we stop when this path is selected as the best path and the algorithm terminates at this point |

0:05:52 | to understand this better i would like to define three important stages of the algorithm |

0:05:58 | the first is the initialization stage where we choose the i nodes that have the largest inner products with y so this is the most straightforward step |

0:06:08 | then we have the selection of the best path and here there is a problem of comparing paths with different lengths |

0:06:14 | and the third problem is how we expand the best path or in other words how we avoid too many paths in the tree so that the search stays tractable |

0:06:23 | as for the best path selection we have the so called auxiliary function of a star |

0:06:28 | here you can see in this example i have a path of length four and i would like to compare it to the second path which has length two |

0:06:36 | so i have to compensate for this nonexistent portion in order to have comparable costs |

0:06:42 | and a star introduces this auxiliary function to compensate for this nonexistent part |

0:06:51 | so the function should reflect how much the residual would decrease if the shorter path were completed to length four |

0:07:00 | and we now need to define cost models |

0:07:02 | it is not easy to find cost models that always exactly hold what we want to find is something that holds generally though not always |

0:07:12 | so as for the cost models i would like to introduce three schemes |

0:07:17 | the first one is an additive scheme here i have a path of length two and the corresponding residual |

0:07:24 | and i assume we add a new node here |

0:07:28 | this new node would reduce the residual by a constant amount proportional to the l2 norm of the observation |

0:07:35 | so we can also see the general form here for a path of length l |

0:07:42 | as a second scheme the adaptive one i once again have the same path here |

0:07:49 | and i assume the addition of a new node to the representation |

0:07:55 | now the reduction in the residual is assumed proportional to the difference of the l2 norms of the residues at the last two nodes |

0:08:04 | so here it is the l2 norm of r1 minus the l2 norm of r2 |

0:08:12 | and we can also write this generally as you see for any path of length l |

0:08:17 | and finally there is another cost model called the multiplicative one |

0:08:21 | here we assume that the addition of a new node reduces the residual by a constant ratio alpha regardless of the selected path and its residual |

0:08:34 | and this can also be generalized as this form here |
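As a sketch, the three cost models might be written as follows, where l is the current path length, K the target sparsity, and delta, beta, alpha are assumed tuning parameters; the exact normalization in the published algorithm may differ:

```python
# Each function maps a path of length l (< K) to a cost that estimates the
# residual norm after completing the path to length K, so paths of different
# lengths become comparable. delta, beta, alpha are assumed parameters.

def additive_cost(res_norm, l, K, y_norm, delta=0.5):
    # assume each future node removes a fixed amount proportional to ||y||
    return res_norm - (K - l) * delta * y_norm

def adaptive_cost(res_norm, prev_res_norm, l, K, beta=0.5):
    # assume each future node removes as much as the last added node did,
    # scaled by beta
    return res_norm - (K - l) * beta * (prev_res_norm - res_norm)

def multiplicative_cost(res_norm, l, K, alpha=0.8):
    # assume each future node shrinks the residual by a constant ratio alpha
    return alpha ** (K - l) * res_norm
```

In all three cases a shorter path receives a larger compensation, reflecting the expected decrease over its remaining K - l selections.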

0:08:39 | so after we apply these cost models we can select the best path the one that has the minimum cost |

0:08:45 | and now we face the problem of expanding it |

0:08:47 | well a star in its original form expands all the children of the selected path |

0:08:52 | applying this directly in our tree would result in a number of paths on the order of n to the power k which would be intractable |

0:08:59 | so we have to apply some pruning |

0:09:01 | the first pruning mechanism is the extensions per path pruning we prune the children of the selected path and keep only the b best of them |

0:09:13 | so that the retained children are the ones with the largest inner products with the residual of the node |

0:09:20 | the second pruning is the tree size pruning it limits the number of paths in the tree to p |

0:09:25 | and we prune the tree if the number of paths exceeds p by removing the paths that have the worst costs |

0:09:33 | and we also have to take care of equivalent paths |

0:09:37 | as permutations of the nodes of a path may exist in the tree and we call paths that contain permutations of the same nodes equivalent |

0:09:46 | an example is here we have path one and path two of different lengths |

0:09:51 | and as you see the first three nodes of path one and path two share the same set |

0:09:57 | so by the orthogonal projection property i know that the residual here at this node will be exactly equal to the residual here at this node |

0:10:02 | so i am sure that when path two is selected node five will be added and the expanded path will be exactly the same hence path one and path two are equivalent |

0:10:12 | but here the case for path three is different it has node five in the middle which is not among the first three nodes of path one |

0:10:21 | so this changes everything and path one and path three are not equivalent |

0:10:24 | you can compare the residues here and here they would not be the same in general |
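A minimal sketch of this equivalence check, with hypothetical atom indices: since the orthogonal projection depends only on the set of selected atoms and not on their order, comparing paths as sets is enough.

```python
def equivalent(path_a, path_b):
    # Two paths are equivalent when they select the same atoms in any order,
    # because the residual depends only on the spanned subspace.
    return frozenset(path_a) == frozenset(path_b)

p1 = [3, 7, 1]   # hypothetical atom indices
p2 = [7, 1, 3]   # same atoms, different order -> same residual
p3 = [3, 5, 1]   # node 5 differs -> different subspace

print(equivalent(p1, p2))  # True
print(equivalent(p1, p3))  # False
```

Detecting such duplicates lets the search keep only one representative path and spend its tree budget on genuinely different subspaces.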

0:10:30 | so we can now illustrate a single iteration of a star omp with one initial node |

0:10:35 | four paths allowed in the tree and the number of extensions restricted to three |

0:10:41 | so we have the initial tree and the best path is selected as path four with minimal cost |

0:10:48 | and the best extensions turn out to be the atoms two eight and nine |

0:10:52 | which are picked by their inner products with the residual of the path |

0:10:56 | so we first add node two to path four |

0:10:59 | this does not change the number of paths in the tree so we are fine |

0:11:02 | then we have to add node eight |

0:11:04 | this creates a new path so we now have five paths in the tree and the worst one has to be removed |

0:11:13 | last we have to consider node nine |

0:11:16 | here we see that if we add node nine we will have a new path whose cost is higher than all the paths in the tree so as we already have the four paths we ignore this path |

0:11:25 | and that completes one single iteration of the algorithm |
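Putting the pieces together, one iteration-by-iteration sketch of the search might look like this; it is an illustration under assumptions (multiplicative cost with parameter alpha, simple list-based bookkeeping), not the speaker's implementation:

```python
import numpy as np

def astar_omp(Phi, y, K, I=3, B=2, P=4, alpha=0.8):
    """A*OMP sketch: keep at most P candidate paths, repeatedly expand the
    cheapest path with its B best children, stop when the best path reaches
    length K. Uses the multiplicative cost model."""
    N = Phi.shape[1]

    def project(support):
        # orthogonal projection of y onto the selected atoms
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        return y - Phi[:, support] @ coef, coef

    def cost(support, r):
        # multiplicative auxiliary cost for a path of length len(support)
        return alpha ** (K - len(support)) * np.linalg.norm(r)

    # initialization: I single-node paths with the largest |<phi_i, y>|
    paths = []
    for i in np.argsort(-np.abs(Phi.T @ y))[:I]:
        r, _ = project([int(i)])
        paths.append(([int(i)], r))

    while True:
        paths.sort(key=lambda p: cost(p[0], p[1]))
        best, best_r = paths.pop(0)
        if len(best) == K:                      # termination criterion
            _, coef = project(best)
            x_hat = np.zeros(N)
            x_hat[best] = coef
            return x_hat
        # extensions-per-path pruning: expand only the B best children
        corr = np.abs(Phi.T @ best_r)
        corr[best] = -1.0                       # do not reselect atoms
        for c in np.argsort(-corr)[:B]:
            child = best + [int(c)]
            # equivalent-path pruning: same atom set, same residual
            if any(set(child) == set(p[0]) for p in paths):
                continue
            r, _ = project(child)
            paths.append((child, r))
        # tree-size pruning: keep only the P cheapest paths
        paths.sort(key=lambda p: cost(p[0], p[1]))
        paths = paths[:P]
```

With an orthonormal dictionary the sketch recovers the true support exactly; on general dictionaries its behaviour depends on the cost model and the budgets I, B, P, as the talk's experiments illustrate.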

0:11:29 | now i would like to demonstrate the performance of the algorithm in a one d scenario |

0:11:34 | we have the reconstruction of one d signals from a reduced set of observations |

0:11:40 | the signal length n is two hundred and fifty six |

0:11:44 | the sparsity k varies between ten and fifty |

0:11:47 | and the nonzero coefficients are drawn from the standard normal distribution |

0:11:52 | here the red curve shows the performance of a star omp with the multiplicative cost function |

0:11:59 | and as you can see from the normalized mean squared error it is lower than that of its competitors namely subspace pursuit basis pursuit and orthogonal matching pursuit |

0:12:09 | and the same is also true for the exact reconstruction rates |

0:12:12 | so our algorithm also beats the others |

0:12:17 | the second case here is very similar to the first one except that the nonzero coefficients now stem from the uniform distribution between minus one and one |

0:12:27 | and here i compare the three cost models for a star omp that is the additive adaptive and multiplicative cost models |

0:12:34 | and we see that in terms of normalized mean squared error the adaptive and multiplicative ones outperform the additive one until somewhere where the error is already quite high |

0:12:44 | and as for the exact reconstruction rates the same also holds except that for the additive one the reconstruction rates fall down |

0:12:51 | so we can conclude that the adaptive and multiplicative strategies are far better for our purposes than the additive one |

0:13:03 | so finally we have some results on an image reconstruction problem |

0:13:08 | we block process the images in eight by eight blocks |

0:13:12 | and each block was preprocessed to be fourteen sparse and we took thirty two observations from each block |

0:13:24 | so these are the obtained images |
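The block-based setup can be sketched as follows; the toy image size and the per-block measurement count here are assumptions for illustration:

```python
import numpy as np

# Sketch of block-based 2D compressed sensing: split the image into 8x8
# blocks, vectorize each block (length 64) and measure it separately with
# the same sensing matrix.
rng = np.random.default_rng(1)
image = rng.random((32, 32))            # toy image (assumed size)
Phi = rng.standard_normal((32, 64))     # 32 measurements per 64-pixel block

measurements = []
for r in range(0, image.shape[0], 8):
    for c in range(0, image.shape[1], 8):
        block = image[r:r+8, c:c+8].reshape(64)  # vectorized 8x8 block
        measurements.append(Phi @ block)         # observations for one block
```

Each block is then reconstructed independently with the chosen pursuit algorithm and the blocks are reassembled into the image.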

0:13:26 | and we see that the multiplicative a star omp here results in better psnr than all the other algorithms |

0:13:33 | and increasing the number of expanded branches per iteration increases the performance further |

0:13:42 | so here we can see two images this is the basis pursuit reconstruction of lena and this is the a star omp reconstruction |

0:13:48 | i hope the differences are visible for example here on the bottom regions or in the details of the hair region |

0:13:57 | alright it should be more visible now |

0:14:00 | here is the basis pursuit reconstruction then the target image and the a star omp reconstruction |

0:14:07 | clearly at the bottom the a star omp image is much closer to the target image and also here for the hair region it is much closer to the target |

0:14:16 | we can also compare directly the pixel errors of basis pursuit and a star omp with respect to the target |

0:14:23 | and as you can see the errors in the hair region and at the boundaries which are dominant for basis pursuit are clearly much smaller for a star omp |

0:14:35 | to conclude i have presented here a multi path strategy that combines best first search and orthogonal matching pursuit |

0:14:43 | in this algorithm our aim is to build up and dynamically evaluate a search tree while keeping the paths that minimize the cost function |

0:14:52 | as for this cost function we have introduced three schemes two dynamic ones the so called multiplicative and adaptive ones |

0:14:58 | we also introduced an additive scheme but the multiplicative and adaptive ones performed better in the examples |

0:15:06 | and we showed that a star omp provides better reconstruction than omp subspace pursuit and basis pursuit in these examples |

0:15:13 | finally i want also to note that a matlab implementation is available at my website |

0:15:18 | and we will also introduce a real time implementation of it soon |

0:15:21 | so thank you for your attention |

0:15:33 | (inaudible question from the audience about computation time) |

0:15:40 | well the computation time depends on the number of nodes expanded in the tree |

0:15:46 | and i should say that it is much higher than for basis pursuit in the case when you expand too many nodes per iteration |

0:15:55 | i can say that in practice on my computer in matlab it takes about ten seconds |

0:16:02 | for example for this experiment for one vector with k equal to twenty five |

0:16:07 | this experiment has five hundred randomly drawn vectors and each one takes something like ten seconds |

0:16:13 | but in terms of a computational analysis you can find a limit on the number of paths in the tree |

0:16:20 | you can find a very loose upper bound but this does not really represent the complexity of the algorithm |

0:16:27 | so we are working on tightening this bound but i cannot say anything more now |

0:16:36 | are there other questions |

0:16:47 | you can show that a star will surely find the optimal solution only under some conditions on the future expansion cost |

0:16:53 | namely that it is a lower bound on the real cost |

0:16:57 | so how can you guarantee that your cost models say the adaptive model really satisfy this lower bound condition or did you check for that |

0:17:07 | we have no formal proof for that |

0:17:10 | but as i said it is really hard to find exact models a model that exactly predicts the decrease of the residual does not exist |

0:17:19 | and as i said before if we had known one we would have a new reconstruction algorithm so that would be hard to prove |

0:17:27 | what one can possibly do is to show that these cost models tend to hold in general |

0:17:36 | i think that can only be done when the search is constrained in this way similar to the way a star works though not in the strict sense that the theory requires |

0:17:46 | we have ongoing work on that |

0:17:49 | the issue is rather that if you select say alpha very high the cost model exactly holds but then you have to expand much more to reach the solution |

0:17:58 | so other than that i think it is an open problem to find let us say a relation between these costs and the exact cost |

0:18:13 | and just one more comment there was a lot of work in the coding theory community on list decoding |

0:18:18 | so this falls into the category of stack or partial list algorithms |

0:18:21 | there was a list decoding algorithm that works in a very similar way partially constructing lists of candidates and pruning the list |

0:18:29 | and at the end you keep a list of k candidates |

0:18:32 | in your case do you end up with only one path or do you also have a list of candidates that you can then test with a further test to see which one of these fits |

0:18:46 | so during the search we keep a number of paths but the algorithm finally returns one path we keep only one |

0:18:51 | because in list decoding you keep a list during your search and you keep a list at the end as well so you have a very small set of candidates |

0:19:00 | and you can then play with that list using say side information to decide which one to pick |

0:19:06 | well the algorithm in our case returns only the one path |

0:19:09 | okay thanks |

0:19:09 | okay thank |

0:19:12 | are there any other questions |

0:19:18 | so let's thank the speaker again |

0:19:20 | thank you |