0:00:14 Thanks first to the two organizers for putting together a very interesting session, which also gives us the opportunity to present this work.
0:00:23 So I will be talking to you about reconstruction algorithms under sparsity constraints, which also relates to the lots of activity going on in signal processing around compressed sensing, and I will give you some illustrations in MRI and in bioluminescence tomography.
0:00:42 And in doing so, I will also give you a kind of state of the art of iterative reconstruction algorithms that use regularization constraints by way of sparsity.
0:01:01 So first, I will briefly discuss the variational approach to image reconstruction, where we need to minimize some functional into which we introduce sparsity constraints. Then I will review some of the standard algorithms, in particular the iterative soft-thresholding algorithm, ISTA, and the fast version of it, FISTA, which is a kind of multi-step algorithm and gives you sort of the state of the art coming from convex analysis.
0:01:32 And then, in this talk, I want to discuss our variation of ISTA, which is sort of a new take on the problem and uses step adaptation to gain some speed, because the goal here really is to get to the speed of the conventional linear techniques that have been used for many years.
0:01:52 And then I will present applications to parallel MRI and to bioluminescence tomography.
0:01:58 So here is the setup. We have some object x that we would like to image, for example a proton density or some absorption map. Then we have the scanner here; we will just assume we have a linear measurement system, which is the usual assumption in MRI and X-ray. It is an integral operator, but since we have discretized the problem here, we can represent it by a big fat matrix H that represents all our physics knowledge of the problem.
0:02:31 And then at the output, as you can see here, we have some modeling imperfections and noise, all of which I represent by a noise term.
0:02:52 OK, so as I was saying, we have noise, and so our imaging model here looks very simple: it is just this imaging matrix applied to the object that we would like to recover, plus some noise.
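In symbols, the linear forward model just described is simply

$$ y = \mathbf{H}x + n, $$

where x is the object, H the big system matrix, and n the noise (my transcription of the notation used on the slides).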
0:03:04 Then, in the variational approach, you just set up this kind of criterion here, where you have a data-consistency requirement: you are kind of reconstructing x, you are simulating your measurements, and you would like this difference with respect to the true measurements to be very small. And then you add some regularization constraint, and this is the art of reconstruction here: to choose the right regularization.
0:03:34 Classically here, for example, if we put a quadratic term with, let's say, some operator, that is Tikhonov regularization.
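As a reference, the generic variational criterion being described can be written as

$$ \hat{x} = \arg\min_{x}\; \|y - \mathbf{H}x\|_2^2 + \lambda\,\mathcal{R}(x), \qquad \mathcal{R}(x) = \|\mathbf{L}x\|_2^2 \ \text{(Tikhonov)}, $$

with λ > 0 trading data consistency against regularity and L some linear operator; this is the standard form consistent with the talk, not copied verbatim from the slides.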
0:03:45 But now, with compressed sensing, all of this was actually started by the observation that many natural images are represented well, with very few large coefficients, in some basis, in particular wavelet bases. And in fact this happens very significantly with medical images as well: here is an MR image, and I am just showing different transformations, so that you can see you get a pretty good signal restitution by keeping only a very small percentage of the large coefficients.
0:04:20 And so this is really used to sort of justify the use of wavelets. So here the idea is to use a regularization where you impose an ℓ1 penalty on the wavelet coefficients; here W would represent, let's say, my wavelet transform. The ℓ1 norm promotes sparsity, and what makes it good is that it is convex, so that we have some guarantee of the existence of a solution, and it also favors sparse solutions.
0:04:55 OK, so here is sort of the grandfather of those sparse reconstruction criteria. What you want to do is minimize this: you have this ℓ2 term and this sparsity constraint. Here I am just assuming that x is the representation in the wavelet basis, and I am using an augmented system matrix that sort of contains the wavelet synthesis followed by the imaging device.
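Written out, this synthesis formulation reads

$$ \hat{w} = \arg\min_{w}\; \|y - \mathbf{A}w\|_2^2 + \lambda\,\|w\|_1, \qquad \mathbf{A} = \mathbf{H}\mathbf{W}^{-1}, $$

where w holds the wavelet coefficients, W is the wavelet transform, and the image is recovered as x = W⁻¹w (again a standard form consistent with the description, not a verbatim slide).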
0:05:25 Then you have two very special cases, very easy and completely classical. First, if you put λ equal to zero, it is just a least-squares problem, so the exact solution is a one-line calculation; unfortunately, it involves a matrix inverse that you would not even want to compute.
0:05:44 But this gives the grandfather of our algorithms here, which is plain gradient descent on this data term, and this is sort of the backbone of many iterative algorithms. It works, but you have to stop it after a certain number of iterations.
0:06:06 But now, the other interesting case is when H is unitary, like a wavelet transform. In that case you can in fact solve this problem exactly by just going to the wavelet domain, doing a soft-thresholding, and going back.
0:06:21 Now, the iterative soft-thresholding algorithm was discovered, well, by many people probably, but Figueiredo and Nowak were among the very first here, using a Bayesian formulation. And it came as a kind of surprise: if you were just to chain those two very simple operations, remarkably, this would solve this problem, which looks like a very difficult problem. Then Daubechies, Defrise and De Mol, in 2004, were actually able to prove convergence, and that was a little revolution in the field; although, as I said, the algorithm is slow, so maybe not so usable in that form.
0:07:04 Now, if you just look at the structure, it is actually a fixed-point algorithm: what you have is an update rule which is a gradient-descent step followed by a pointwise soft-thresholding that gives you the new update, and then there is this nice mathematics that guarantees convergence. Unfortunately, it is very slow: it has now been shown that the cost decreases like 1/n, where n is the number of iterations.
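For concreteness, here is a minimal sketch of the ISTA iteration just described: a gradient step on the data term followed by a pointwise soft-thresholding. This is illustrative code under my own naming, not the speaker's implementation:

```python
import numpy as np

def soft_threshold(v, t):
    """Pointwise soft-thresholding: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=200):
    """Minimize ||y - A w||^2 + lam * ||w||_1 by iterative soft-thresholding."""
    # 1/L is a valid step when L bounds the largest eigenvalue of A^T A;
    # this is what makes the quadratic bound a majorizer and gives convergence.
    L = np.linalg.norm(A, 2) ** 2
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ w - y)       # half the gradient of the data term
        w = soft_threshold(w - grad / L, lam / (2.0 * L))
    return w
```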
0:07:30 But then, building on many, many years of convex optimization, there was this proposition in 2009 of FISTA, which is like a controlled over-relaxation variation on this scheme. Essentially, what it is is an update that not only builds upon the previous estimate, but takes the two previous ones, x_n and x_(n-1); from these it constructs a kind of extrapolation of where the correct update should be, and then applies the ISTA step there, so that it keeps solving the same problem we are trying to solve.
0:08:12 And remarkably, this FISTA algorithm gives you a 1/n² rate, so it is one order faster, and that gives you a fast enough algorithm to apply this kind of technique.
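And a minimal sketch of the FISTA variant, which applies the same ISTA step at a point extrapolated from the two previous iterates (following Beck and Teboulle's 2009 scheme; soft_threshold is reused from the sketch above):

```python
import numpy as np

def fista(A, y, lam, n_iter=200):
    """FISTA: the ISTA step applied at an extrapolated point built from the
    two previous iterates; the cost decays like 1/n^2 instead of 1/n."""
    L = np.linalg.norm(A, 2) ** 2
    w_prev = np.zeros(A.shape[1])
    z = w_prev.copy()                  # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - y)
        w = soft_threshold(z - grad / L, lam / (2.0 * L))
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = w + ((t - 1.0) / t_next) * (w - w_prev)   # momentum/extrapolation
        w_prev, t = w, t_next
    return w_prev
```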
0:08:25 Now, what we are proposing here is a variation, and we call it FWISTA: the fast weighted iterative soft-thresholding algorithm. The idea is pretty simple: we had a step size, a scalar relaxation factor for the gradient; we replace that by a diagonal matrix. And actually we can also allow some flexibility on the side of the regularization, with a slightly more general regularization matrix.
0:08:57 All these ISTA-style algorithms can be derived using majorize-minimize (MM): you are sort of constructing a quadratic bound on the cost functional, and then you are always trying to go to the bottom of this quadratic bound. Now, if you have more flexibility here, by introducing this matrix, you can in fact construct a better quadratic bound that is tailored to your problem, and so hopefully this should allow us to go faster.
0:09:29 And then we can actually do some math here, and we obtain this kind of inequality on the cost. Now, if you look at it superficially, for a mathematician it does not look so much better, because we still have the 1/n or 1/n² rates; but the important thing is that instead of having an ordinary norm we have a weighted norm, and this is what drives the constant. It means that clever engineering can really bring those constants down quite significantly, and then the algorithm can be much faster.
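To show where the weights enter, here is a hedged sketch of a single weighted iteration: the scalar step 1/L becomes a per-coefficient vector d, and the thresholds scale with it. The actual weight design is the contribution of the talk and is not reproduced here; the condition in the docstring is the standard MM majorization requirement (soft_threshold as in the ISTA sketch):

```python
def weighted_ista_step(w, A, y, lam, d):
    """One ISTA step with a diagonal step-size vector d (one entry per
    coefficient). For the quadratic bound to remain a majorizer one needs
    diag(1/d) - A^T A to be positive semidefinite; the thresholds then
    scale with d component-wise."""
    grad = A.T @ (A @ w - y)           # half-gradient, as before
    return soft_threshold(w - d * grad, lam * d / 2.0)
```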
0:10:04 For the applications, I just want to show you two examples. By the way, I forgot to acknowledge my coauthors, who did all the work. The first one, Matthieu Guerquin-Kern, is working on parallel MRI, actually in collaboration with the person who invented parallel MRI, Klaas Pruessmann from ETH Zürich.
0:10:28 So here the idea is just to have a bunch of coils. First, a little basics about MRI: in MRI we essentially measure samples of the Fourier transform of the object. But if you have coils, in a multiple-coil setup, each coil also has a sensitivity function, so what you measure is your object x multiplied by a sensitivity function specific to your coil, with the Fourier transform applied to all of that. In parallel MRI you just put several coils, so you can do the measurements in parallel, and hopefully this allows you to reconstruct images using fewer samples.
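In equations, the parallel-MRI measurement model just described is, in standard SENSE-type notation (my transcription),

$$ y_c = \mathbf{F}\,\mathbf{S}_c\,x + n_c, \qquad c = 1,\dots,N_{\text{coils}}, $$

where F is the (possibly undersampled) Fourier transform and S_c the diagonal sensitivity map of coil c.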
0:11:06 So now, how can we exploit the structure of the problem? We really need to use those degrees of freedom that are left, which is this diagonal weighting matrix. If you look at parallel MRI, what we have for each coil is this sensitivity matrix, which is just a diagonal matrix, followed by the Fourier transform. And the problem here is that the sensitivity can vary a lot across your coils; it is pretty inhomogeneous, and that makes the problem not so well conditioned. Because for many years, MRI actually just meant an inverse Fourier transform, so a unitary operator, a very favorable inverse problem which imaging people loved; but with the coils it becomes much less well conditioned.
0:11:56 So now, the idea of sensitivity-based preconditioning here is to try to invert with respect to the sensitivity, to try to get a better-conditioned system matrix in these preconditioned variables.
0:12:17 And now, if you think of it in the wavelet domain, you can do almost the same: you just have to transpose your sensitivity into the wavelet domain. The intuition is that wavelets are more or less local, so you would just aggregate the sensitivity over the support of the wavelets, and then you get this kind of preconditioning matrix, which gives you a much better-conditioned, preconditioned version of the gradient.
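A hedged sketch of that aggregation step: one scalar weight per wavelet coefficient, obtained by averaging the total coil-sensitivity energy over (an approximation of) the wavelet's spatial support. Here wavelet_support_mask is a hypothetical helper, and the exact weight formula in the authors' paper may differ:

```python
import numpy as np

def sensitivity_weights(sens_maps, wavelet_support_mask, n_coeffs, eps=1e-12):
    """Diagonal step sizes from coil sensitivities, one per wavelet coefficient."""
    energy = sum(np.abs(s) ** 2 for s in sens_maps)    # sum of |S_c|^2 over coils
    d = np.empty(n_coeffs)
    for k in range(n_coeffs):
        mask = wavelet_support_mask(k)                 # boolean mask: support of wavelet k
        d[k] = 1.0 / max(energy[mask].mean(), eps)     # larger steps where sensitivity is weak
    return d
```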
0:12:47 So here are some examples. I could also have shown you results with real data, but here we are just using the Shepp-Logan phantom, just to compare the performance of the algorithms. We took a pretty realistic setup here, with four receiving coils, some noise, and a certain number of radial trajectories, and we are using a reconstruction in the wavelet domain with a two-level Haar transform. So this would be the traditional ISTA, the iterative soft-thresholding; this is the evolution of the cost function, and this is the error with respect to the solution, the actual exact solution of the optimization problem.
0:13:36 And so you see this kind of typical evolution of the classical algorithm; it is not very fast. Then, if you do FISTA, it is much faster: this is the 1/n² type of performance. But now, if you do this preconditioning, as we propose in this work, we get more or less the same slope, but we get there earlier. And actually, in practice, this makes a huge difference, because we may already be happy with this kind of performance here, and so it means we can get there much faster in terms of the number of iterations.
0:14:17 And the same here; maybe just to show you some examples of reconstruction, here I am showing the evolution of the solution with the number of iterations. So that is ISTA: the iterative soft-thresholding gets there very slowly; FISTA gets close very much faster; and if we have those appropriate weights, we get there even faster. Here we chose things such that we get this quality with this number of iterations, and what you see here is actually better quality than that, so for this particular case we can get there something like five times faster. And anyway, all those methods are optimizing the same cost function.
0:15:10 OK, so then the second application was bioluminescence tomography, and this is a much harder inverse problem; it is very, very badly conditioned. Here, what you have is luminescent markers within the specimen, and then you have light detectors all around the object of interest. So you have an optical setup, but here the light just diffuses; in fact you have the diffusion equation, so you can solve this diffusion equation using the Green's function, and this will give you a system matrix and, once more, a linear model, but one that is really badly conditioned because of the Green's function.
0:16:05 The Green's functions actually look like this: they decay like one over the distance to the power of two, and that turns out to make the system very badly conditioned.
0:16:15 So the idea now, pursued by my second coauthor, was to use this flexibility of the preconditioning to try to get a much better conditioning, by just multiplying by this diagonal matrix. The insight is that if you renormalize the columns of this system matrix, to make them roughly constant in energy, it produces a much better conditioning, and this results in this type of setup for the reconstruction algorithm.
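A minimal sketch of that column-energy normalization; the talk only states the principle of equalizing the column energies, so the details here are my own:

```python
import numpy as np

def column_normalize(A, eps=1e-12):
    """Rescale the columns of the system matrix to (roughly) unit energy,
    compensating the rapid decay of the Green's function."""
    d = 1.0 / np.maximum(np.linalg.norm(A, axis=0), eps)
    return A * d, d     # A @ diag(d), plus the diagonal weights
```

One then solves the ℓ1 problem in the rescaled variables and maps back by multiplying with d.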
0:16:49 And in this case we also have some experiments using synthetic phantoms, just to demonstrate that it really works. We just have a few bioluminescent point sources, and here we are imposing sparsity in the spatial domain, because the sources are very small. And here is a comparison of the three algorithms, by the way with physically realistic modeling of the different optical parameters: the number of sources, detectors, et cetera, and also some noise.
0:17:19 And so here, ISTA is of course slow; FISTA was much faster; and our algorithm decays at the same rate as FISTA but gets there much faster. In this case we have a tenfold speed-up, essentially using the same algorithm, just with this clever preconditioner.
0:17:39 Here are some results as well. Here is the source, and this is the evolution of the standard algorithm: since the problem is very badly conditioned, it is very, very slow, and it would take a long time to get there. This is FISTA, which goes faster, and this is the weighted FISTA. And essentially, here we also set it up so that you have a quality correspondence, and it is about a factor of ten, which is a lot, for each image.
0:18:10 So this makes it much more practical, and that really gets me to the end. The conclusions: in principle, we can hope to get better quality by taking advantage of all those methods based on sparsity; it has been demonstrated in the field that you get better reconstructions. Now we can also match the speed, and this is achieved thanks to two different acceleration strategies: the multi-step update and the adaptive weights. At present we are within a factor of two, or even a little less, of the best linear techniques, which are conjugate-gradient based, and so it now becomes very applicable. That said, as usual in signal processing, it is not something you can apply blindly: it requires some expertise to find the right step sizes and preconditioning so that your algorithm will work fast. And finally, generality: this is transposable to many modalities; as was said earlier, X-ray tomography, for example; in the lab we are also working on phase-contrast tomography, and we could process other modalities as well.
0:19:20 So that is it from me, and thank you for your attention.
0:19:28 We have time for maybe one quick question.
0:19:36 Just one quick question: you have shown wavelet sparsity; something like total variation is also out there. Can your technique be used with a total-variation reconstruction?
0:19:45 Yes, actually that is a very good question, and in fact I have a paper on exactly that, which hopefully should come out in a journal soon. There is a trick with wavelets, because wavelets are not completely shift-invariant. In fact, if you just use a conventional wavelet transform and compare with total variation, total variation works a little better. But now, if you allow the wavelets to shift, something that is called cycle spinning, we show that we can get as good as, or even a little better than, total variation. And it is much faster to iterate with the wavelets than with total variation, where you have all these inner iterations; but of course people are also working on total variation, and there are also fast versions of total variation. Still, I think it is easier to get fast algorithms with this sort of synthesis formulation rather than the analysis formulation. So the good news is that with wavelets we can get that same performance, and we have documented it for MRI.
0:20:57 OK, so once again, thanks.