Thanks first to the organizers for putting together this very interesting session, which also gives us the opportunity to present this work.

So I will be talking to you about reconstruction algorithms under sparsity constraints, which relates to a lot of the activity going on in signal processing around compressed sensing, and I will give you some illustrations in MRI and in bioluminescence tomography.

In doing so, I will also give you kind of a state of the art of the iterative reconstruction algorithms that use regularization constraints by way of sparsity.

So first I will briefly discuss the variational approach to image reconstruction, where we need to minimize some functional into which we introduce sparsity constraints. Then I will review some of the standard algorithms, in particular iterative soft thresholding, the algorithm called ISTA, and its fast version, FISTA, which is a kind of multi-step algorithm and gives you sort of the state of the art coming from convex analysis. Then I want to discuss our variation of FISTA, which is tailored to the problem and uses step adaptation to gain some speed, because the goal here is really to reach the speed of the conventional linear techniques that have been in use for many years. And finally, the applications to parallel MRI and bioluminescence tomography.

So here is the setup. We have some object x that we would like to image, for example a proton density or some absorption map. Then we have the scanner, and we will just assume that we have a linear measurement system, which is the usual assumption in MRI and X-ray imaging. It is really an integral operator, but since we discretize the problem, we can represent it by a big, fat matrix H that encodes all our physics knowledge of the problem. At the output we also have modeling imperfections and noise, which I represent by an additive noise term. So our imaging model looks very easy: it is just this imaging matrix applied to the object that we would like to recover, plus some noise.

In the variational approach, you then set up this kind of criterion, where you have a data-consistency requirement: you are reconstructing x, you simulate your measurements, and you would like the difference with respect to the true measurements to be small. And then you add some regularization constraint; this is really the art of reconstruction, choosing the right regularization. Classically, for example, if you put a quadratic term there, let's say with some suitable operator, that is Tikhonov regularization.
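Written out, with L denoting the regularization operator of the quadratic case, the criterion is:

```latex
\hat{x} \;=\; \arg\min_{x}\;
\underbrace{\|y - Hx\|_2^2}_{\text{data consistency}}
\;+\; \lambda\,\mathcal{R}(x),
\qquad
\mathcal{R}(x) = \|Lx\|_2^2 \quad \text{(Tikhonov)}.
```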

Now, with compressed sensing, all of this was started by the observation that many natural images can be represented well with very few large coefficients in a suitable basis, in particular a wavelet basis. In fact this happens very significantly with medical images as well: here is an MRI image, and I am showing different transformations; you can get a pretty good signal restitution by keeping only a very small percentage of the large coefficients. This is really what has been used to justify the use of wavelets.

The idea is then to use a regularization where you impose an ℓ1 norm on the wavelet coefficients; here W represents, let's say, my wavelet transform. The ℓ1 norm promotes sparsity, and what is good about it is that it is convex, so we have some guarantee of existence of the solution, and it also favors sparse solutions.

So here is sort of the grandfather of those sparse reconstruction algorithms. What you want to do is minimize this: you have the ℓ2 data term and this sparsity constraint. Here I am just assuming that x is the representation in the wavelet basis, and I am using an augmented system matrix that contains the wavelet synthesis followed by the imaging device.
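In this synthesis form over the wavelet coefficients w, and assuming an orthonormal wavelet transform W so that the augmented matrix is A = HWᵀ, the problem reads:

```latex
\hat{w} \;=\; \arg\min_{w}\; \|y - A w\|_2^2 + \lambda \|w\|_1,
\qquad
A = H W^{\mathsf{T}}.
```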

Then you have two very special, completely classical cases. First, if you put λ = 0, it is just a least-squares problem, so the solution is explicit, one linear calculation, although for a problem of this size you would not even want to compute that inverse directly; instead you use a gradient descent on this data term, which is really the backbone of many iterative algorithms. This works, but you have to stop it after a certain number of iterations. The other interesting case is when the system matrix is unitary, for example when it is just a wavelet transform. In that case you can in fact solve the problem exactly by going to the wavelet domain, doing a soft thresholding, and going back.
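To make the unitary case concrete, here is a minimal Python/NumPy sketch of the closed-form solution (an illustration with generic names, not code from the slides):

```python
import numpy as np

def soft(x, T):
    """Component-wise soft thresholding with threshold T."""
    return np.sign(x) * np.maximum(np.abs(x) - T, 0.0)

def unitary_solution(H, y, lam):
    """Closed-form minimizer of ||y - H x||^2 + lam*||x||_1 when H is
    unitary (H^T H = I): transform back to the wavelet domain, shrink."""
    return soft(H.T @ y, lam / 2.0)
```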

Now, the iterative soft-thresholding algorithm (ISTA) was probably discovered by many people, but Figueiredo and Nowak were among the very first, using an EM formulation. The idea is simply to chain those two very easy steps, and remarkably this solves the problem, which looks like a very difficult one. Then Daubechies, Defrise and De Mol, in 2004, were able to actually prove convergence, and that was a little revolution in the field, although, I must say, the algorithm in that raw form is maybe not so usable.

If you look at the structure, it is actually a fixed-point algorithm: the update rule is a gradient-descent step on the data term followed by a component-wise soft thresholding, which gives you the new update, and there is this nice mathematics that guarantees convergence. Unfortunately, it is very slow: it has been shown that the cost decreases like 1/n, where n is the number of iterations.
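As a reference point, a minimal ISTA sketch in the same vein (it reuses the soft function above; the step size 1/Lipschitz is the standard conservative choice):

```python
def ista(A, y, lam, n_iter=200):
    """Minimal ISTA sketch for min_x ||y - A x||^2 + lam*||x||_1:
    one gradient step on the quadratic data term, then soft
    thresholding.  The cost decays like O(1/n)."""
    x = np.zeros(A.shape[1])
    gamma = 1.0 / (2.0 * np.linalg.norm(A.T @ A, 2))  # 1 / Lipschitz const.
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)
        x = soft(x - gamma * grad, gamma * lam)
    return x
```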

But then, building on many years of convex optimization, there was this proposition in 2009 of FISTA, by Beck and Teboulle, which is like a controlled over-relaxation of ISTA. Essentially, it is an update that is built not only upon the previous estimate but takes the two previous iterates, x_n and x_{n−1}, constructs a kind of extrapolation of what the correct update should be, and then applies the ISTA step to it. Remarkably, this FISTA algorithm decreases the cost like 1/n², so it is one order faster, and that finally gives you a fast enough algorithm to apply this kind of technique.
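In the same sketch style, the FISTA update only adds the two-point extrapolation:

```python
def fista(A, y, lam, n_iter=200):
    """Minimal FISTA sketch: the ISTA step is applied at a point
    extrapolated from the two last iterates, which improves the
    cost decay from O(1/n) to O(1/n^2)."""
    x_prev = np.zeros(A.shape[1])
    z, t = x_prev.copy(), 1.0
    gamma = 1.0 / (2.0 * np.linalg.norm(A.T @ A, 2))
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ z - y)
        x = soft(z - gamma * grad, gamma * lam)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum step
        x_prev, t = x, t_next
    return x
```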

Now, what we are proposing here is a variation that we call FWISTA: the fast weighted iterative soft-thresholding algorithm. The idea is pretty simple: where we had a step size, a scalar damping factor for the gradient, we replace it by a diagonal matrix; and we also gain some flexibility on the side of the regularization, with a slightly more general regularization matrix.

Usually these ISTA-style algorithms are derived using a majorize-minimize (MM) argument: you construct a quadratic bound on the cost functional, and at each iteration you go to the bottom of this quadratic bound. Now, with the extra flexibility of this diagonal matrix, you can in fact construct a better quadratic bound, one that is tailored to your problem, and so hopefully this should allow us to go faster. We can then do some analysis and obtain this kind of inequality on the cost. If you look at it superficially, for a mathematician it does not look so much better, because we still have the same 1/n² rate.

But the important thing is that, instead of a plain Euclidean norm, we have a weighted one, and this enters the constant. It means that clever engineering can bring those constants down quite significantly, and then the algorithm can be much faster.
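Schematically, the weighted update looks as follows; d is an assumed per-coefficient step vector (the diagonal of D), and the majorize-minimize argument requires diag(1/d) to dominate the curvature 2AᵀA, so d may need a global rescaling:

```python
def fwista(A, y, lam, d, n_iter=200):
    """Sketch of the weighted variant (FWISTA): the scalar step gamma
    becomes a per-coefficient step d[i] (a diagonal matrix D), and the
    soft threshold becomes component-wise, lam*d[i]; the momentum step
    is unchanged from FISTA.  Validity of the quadratic bound requires
    diag(1/d) >= 2 A^T A (rescale d if necessary)."""
    x_prev = np.zeros(A.shape[1])
    z, t = x_prev.copy(), 1.0
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ z - y)
        x = soft(z - d * grad, lam * d)   # weighted step, weighted threshold
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x
```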

So for the applications, I just want to show you two examples. By the way, I forgot to acknowledge my collaborators, who did all the work. The first example is from Matthieu Guerquin-Kern, who is working on parallel MRI, actually in collaboration with the person who invented parallel MRI, Klaas Pruessmann from ETH Zürich.

Here the idea is just to have a bunch of coils. For those who need a little background on MRI: in MRI, we essentially measure samples of the Fourier transform of the object. But with coils, in a multiple-coil acquisition, there is also a sensitivity function: what you measure is your object x multiplied by a sensitivity function that is specific to your coil, and you take the Fourier transform of all that. In parallel MRI you simply put several coils, so you can do the measurements in parallel, and hopefully this allows you to reconstruct images using fewer samples.
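In discretized form, with S_c the diagonal matrix of the c-th coil sensitivity and F the (undersampled) Fourier matrix, the forward model stacks one block per coil:

```latex
y_c = F S_c\, x + n_c, \qquad c = 1,\dots,C,
\qquad
H = \begin{bmatrix} F S_1 \\ \vdots \\ F S_C \end{bmatrix}.
```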

Now, how can we exploit the structure of the problem? We really need to use the degrees of freedom that are left, namely this diagonal weighting matrix. If you look at parallel MRI, for each coil the system matrix is just a diagonal sensitivity matrix followed by a Fourier transform. The problem is that this sensitivity can vary a lot over the field of view and across your coils; it is pretty inhomogeneous, and that makes the problem not so well conditioned. For many years, MRI reconstruction just meant an inverse Fourier transform, a unitary operation, whereas with parallel MRI we are operating in the inverse-problem regime, and the problem becomes much less well conditioned.

So the idea of sensitivity-based preconditioning is to try to invert with respect to the sensitivity, in order to get a better conditioned system matrix in these preconditioned variables.

If you think of it in the wavelet domain, you can do almost the same: you just have to transpose your sensitivity into the wavelet domain. The intuition is that a wavelet is more or less local, so you aggregate the sensitivity over the support of the wavelet, and you get this kind of preconditioning matrix, which gives you a much better conditioned, preconditioned version of the gradient.
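A toy sketch of how such weights could be assembled, assuming the coil-sensitivity maps S (one row per coil) and, for each wavelet index, the list of pixels in its support are available; the exact aggregation rule in the actual implementation may differ:

```python
def sensitivity_weights(S, supports, eps=1e-6):
    """Toy sketch of sensitivity-based preconditioning in the wavelet
    domain: aggregate the coil-sensitivity energy over each wavelet's
    (local) spatial support and take its inverse as the step d[i].
    Strong sensitivity -> large curvature -> small step, and vice
    versa.  Rescale d globally so the majorizer condition holds."""
    energy = (np.abs(S) ** 2).sum(axis=0)        # per-pixel energy over coils
    d = np.empty(len(supports))
    for i, sup in enumerate(supports):
        d[i] = 1.0 / (energy[sup].mean() + eps)  # invert aggregated energy
    return d
```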

Here are some examples. I could also have shown you real data, but here we are just using the Shepp–Logan phantom in order to compare the performance of the algorithms. We took a pretty realistic setup with four receiving coils, some noise, and a certain number of radial trajectories, and we do the reconstruction in the Haar domain, using a two-level Haar transform. So this would be the traditional ISTA, iterative soft thresholding: this is the evolution of the cost function, and this is the error with respect to the exact solution of the optimization problem. You see the typical evolution of the classical algorithm; it is not very fast.

Then if you do FISTA, it is much faster; this is the 1/n² type of performance.

But now, if you do the preconditioning that we propose in this work, we get more or less the same slope, but we get there earlier. In practice this makes a huge difference, because we may already be happy with this kind of performance here, so it means we can get there much faster in terms of the number of iterations.

The same story here; maybe just to show you some examples of reconstruction, I am showing the evolution of the solution with the number of iterations. That is ISTA: iterative soft thresholding gets there very slowly. FISTA converges much faster, and with those appropriate weights we get there even faster. Here we chose the settings such that we reach this quality with this number of iterations, and for what is actually better quality we can get there four or five times faster in this particular case. In any case, all those methods are optimizing the same cost function.

OK, so the second application is bioluminescence tomography. This is a much harder inverse problem, very badly conditioned. Here you have some bioluminescent markers within the specimen, and then you have light detectors all around the object of interest.

You have an optical setup, but the light diffuses; in fact, you have the diffusion equation, and you can solve this diffusion equation using the Green's function. This gives you a system matrix and, once more, a linear model, but an extremely badly conditioned one, because of the Green's function.

three need badly condition because was green function

so here the green functions actually look like that so it is it K like one over

uh a a a normal hard to the power to

and it happens

be very well condition also

So the idea, due to my second collaborator, was to use the flexibility of the preconditioning to try to get a much better conditioning by just multiplying by this diagonal matrix. The insight is that if you weight the columns of this system matrix to make them roughly constant in energy, you end up with a much better conditioned problem.

And this results in this type of setup for the reconstruction.
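As an illustration of that column-equalization idea (the resulting step vector plugs directly into the weighted algorithm sketched earlier):

```python
def column_energy_weights(A, eps=1e-12):
    """Sketch of the diagonal preconditioner for the bioluminescence-
    tomography matrix: the Green's-function columns decay like 1/r^2,
    so their energies vary enormously.  Taking d[i] = 1/||a_i||^2,
    i.e. equalizing the column energies, evens out the curvature;
    rescale globally to keep the quadratic bound a majorizer."""
    return 1.0 / ((A ** 2).sum(axis=0) + eps)
```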

In this case we also have some experiments using synthetic phantoms, just to demonstrate that it really works: we simply have two bioluminescent point sources, and here we impose sparsity directly in the spatial domain, because the sources are very small. Here is a comparison of the three algorithms, by the way with physically realistic modeling of the optical setup, the number of sources, detectors, et cetera, and also some noise.

ISTA is, of course, slow; FISTA is much faster; and our algorithm converges at the same rate as FISTA but gets there much earlier. In this case we have a tenfold gain, essentially using the same algorithm, just with this clever preconditioning.

Here are some visual results as well: this is the source, and this is the evolution of the standard algorithm; since the problem is very badly conditioned, it is very, very slow, and it would take a long time to get there. This is FISTA, which goes faster, and this is the weighted FISTA. We set it up so that you have a correspondence here, and a factor of ten in time is a lot when reconstructing an image. So this makes the method much more practical, which really gets me to the end.

So, the conclusions. In principle, we can hope to get better quality by taking advantage of all those methods based on sparsity; it has been demonstrated in the field that you get better reconstructions. Now we can also increase the speed, and this is achieved through the two key ingredients of this next-generation strategy: the multi-step update and the adaptive weights. At present we are within a small factor, or even a little less, of the best linear iterative techniques, which are conjugate gradient, and so it becomes very applicable.

As usual in signal processing, it is not something you can run blindly: it requires some expertise to find the right step sizes and preconditioning so that the algorithm works well. Finally, generality: it is transposable to many modalities; as the previous speaker was saying, X-ray tomography; in the lab we are also working on phase-contrast tomography; and it carries over to microscopy and other modalities as well.

So that is it, and thank you for your attention.

We have time for maybe one quick question.

Question: Just one, if we have time: you showed wavelet sparsity; what about something like total variation? Can your technique be used with total-variation reconstruction?

Answer: Yes, that is actually a very good question, and in fact I have a paper on exactly that, which should hopefully come out in a journal. There is a trick with wavelets, because wavelets are not completely shift-invariant. If you just use a conventional wavelet transform, total variation works a little better. But if you allow for shifts of the wavelets, something that is called cycle spinning, we have shown that you can get results as good as, or even a little better than, total variation; and it is much faster to iterate with the wavelet formulation than with total variation. Of course, people are also working on total variation, and there are fast versions of total variation as well, but I think it is easier to get fast algorithms with this sort of synthesis formulation rather than the analysis formulation. So the good news is that with wavelets we can get there, and we have documented it with MRI. OK, so once again, thanks.