0:00:19 Thank you. Okay — so there are about fifteen slides separating us from the coffee break.
0:00:52 This work uses sparsity techniques in order to attack the problem of CT reconstruction from noisy data. It was done jointly with my two PhD supervisors, Michael Elad and Michael Zibulevsky.
0:01:10 A snapshot of the resulting technique is at the lower part of the slide. As opposed to previous work, where the training on the data is done online, here the training is offline; then we use a standard reconstruction algorithm, such as filtered back-projection, but before that, some noise reduction is performed in the sinogram domain, using learned sparse representations. And finally we get the image.
0:01:47 So that's a snapshot, and now I'm getting into the details.
0:01:52 But first, let me recall the basic model of CT imaging. We have, in this case, a two-dimensional slice of a head, which we scan with X-rays. Each ray starts with some initial intensity of I0 photons per unit time, and as the ray travels through the body, the photons are absorbed in the tissue. The final number of photons we count lets us estimate the line integrals of the attenuation map, and this is exactly the two-dimensional Radon transform of that attenuation map. So what we are measuring are approximations of the line integrals, up to photon noise.
0:02:47 Now, in a low-dose scan, we want to keep this photon number I, as well as I0, low. Then we have a deviation from the correct, ideal number of photons being counted. Therefore the measurements are modeled as random variables — instances of Poisson variables with a parameter λ, which is both the expectation of the variable and its variance. The higher the λ — that is, the number of photons counted — the better the SNR we get.
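[As an aside, here is a minimal numpy sketch of this measurement model — Beer-Lambert attenuation, Poisson photon noise with mean equal to variance, and the log transform that yields the sinogram. The toy line-integral values and the choice of I0 are illustrative placeholders, not numbers from the talk.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy line integrals of the attenuation map (one value per ray);
# in a real scanner these come from the 2-D Radon transform of the slice.
line_integrals = np.abs(np.sin(np.linspace(0, np.pi, 180)))  # placeholder values

I0 = 1e4                                         # source photons per ray (the dose knob)
expected_counts = I0 * np.exp(-line_integrals)   # Beer-Lambert attenuation
measured_counts = rng.poisson(expected_counts)   # photon noise: mean == variance

# The log transform recovers a noisy estimate of the line integrals (the sinogram).
noisy_sinogram = -np.log(np.maximum(measured_counts, 1) / I0)

print(np.max(np.abs(noisy_sinogram - line_integrals)))  # error shrinks as I0 grows
```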
0:03:24 So we have here a trade-off between the good image we want to get and the radiation dose the patient absorbs. The goal is to reduce the dose while still trying to improve the image.
0:03:41 So, a number of things can be done in order to reduce the dose in a CT scan. The first is to use algorithms which were specifically designed to acknowledge this statistical model. There, a MAP objective is minimized iteratively, and usually those algorithms are quite slow, despite the fact that they do truly improve on the performance of basic algorithms such as filtered back-projection or other similar fast methods, which are the ones currently used in clinical CT machines.
0:04:23 Another way to drastically reduce the amount of X-ray radiation is not to illuminate the entire head: if you only want to look at a small region in the head, we can get away with radiating only that region, plus some small additional amount of data. Theoretically, we must radiate the whole head in order to recover even one pixel, because the Radon transform is not local; but in practice we can get by with much less radiation and still get a good image of some region. There are special algorithms to do that.
0:05:06 We want to consider the scenario where we do scan the whole head, and in order to improve the result of filtered back-projection, we perform some sinogram processing before it is applied.
0:05:22 So, just to recall the images that we get along the way: from the scanned head we get the perfect sinogram; what we actually have is the sinogram computed from the photon measurements with a log transform, so it carries data-dependent noise, which is added because of the low photon count in the system; and from here we want to recover the final image.
0:05:52 Okay. Now that we have defined the goal, let's get to the method with which we are attacking the problem; the notation is defined by the symbols here.
0:06:11 You know that when we try to decompose natural signals in such frames — wavelets, or discrete cosines, or the Fourier transform — we have a rapid decay of the coefficients. Similarly, we want a model where we assume that only a few non-zero coefficients are needed to represent the signal well. Here the signal, in this case, is a small square patch of the image, which we stack as a vector; and we want to represent it as the product of a matrix D with a representation α, where D is redundant, so that α is much longer, but we only use a few atoms from D. So we want both a sparse vector — the L0 norm measures the sparsity, that is, the number of non-zero elements — and a small residual.
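[A small sketch of this patch model, x ≈ Dα with few non-zeros, using a bare-bones error-constrained orthogonal matching pursuit against a random redundant dictionary. All sizes and data here are illustrative, not from the talk.]

```python
import numpy as np

def omp(D, x, eps):
    """Greedy sparse coding: repeatedly add the atom most correlated with
    the residual until ||x - D @ alpha||^2 <= eps (error-constrained OMP)."""
    alpha = np.zeros(D.shape[1])
    support, coef = [], np.zeros(0)
    residual = x.astype(float).copy()
    while residual @ residual > eps and len(support) < D.shape[0]:
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    alpha[support] = coef
    return alpha

rng = np.random.default_rng(1)
D = rng.standard_normal((64, 256))     # redundant: 64-pixel patches, 256 atoms
D /= np.linalg.norm(D, axis=0)         # unit-norm atoms
x = D[:, [3, 70, 199]] @ np.array([1.0, -0.5, 2.0])   # an exactly 3-sparse patch
alpha = omp(D, x, eps=1e-10)
print(np.count_nonzero(alpha), np.linalg.norm(x - D @ alpha))
```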
0:07:11 Based on this principle, there is a noise reduction technique for standard signal and image processing, developed by Elad and Aharon in 2006. They define an objective function which contains three basic terms. The first term is a fidelity term, which compares the noisy image f̃ with the image f that we are trying to recover. The second one requires that all the representations be sparse: the index j runs over all the small, overlapping patches, and the operator R_j extracts a small patch from f, which is compared to a sparse encoding in the form of D times α_j. So α_j is a sparse representation whose L0 norm is small, and the residual — the L2 norm of the difference — is also required to be small.
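[Written out, the objective has the form below — my reconstruction of the Elad-Aharon formulation from the description; f̃ is the noisy image and R_j extracts the j-th patch.]

```latex
\min_{f,\,D,\,\{\alpha_j\}}\;
\lambda\,\|f-\tilde f\|_2^2
\;+\;\sum_j \mu_j\,\|\alpha_j\|_0
\;+\;\sum_j \|D\alpha_j - R_j f\|_2^2
```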
0:08:15 How is this equation solved? Online, after the noisy image is acquired: both the dictionary D and the sparse representations α are learned from the noisy image itself. There is also an offline version of the procedure, where training images are used.
0:08:34 So here is what's called the K-SVD algorithm. We minimize the second and third relevant terms over D and the α's, and there are two alternating steps to do it: we optimize for the α's and for the dictionary in turn. To compute the α's given a dictionary D, we perform what's called sparse coding: we want to find the sparsest α_j under the condition that the L2 norm of the residual is below some threshold ε_j. This is done using an approximate greedy algorithm — a pursuit algorithm such as Orthogonal Matching Pursuit (OMP), or another such algorithm. The second stage in the iteration is the dictionary update; it is not relevant to my talk, so I'll skip it.
0:09:29 Finally, when we have both the dictionary and the representations, we can compute the output image using the first and third relevant terms for the image. There is actually a closed-form solution to this equation, so it is quite quick.
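[The closed form referred to here, as I understand it: fixing D and the α_j, the quadratic terms in f give]

```latex
\hat f \;=\; \Big(\lambda I + \sum_j R_j^{T}R_j\Big)^{-1}
\Big(\lambda \tilde f + \sum_j R_j^{T} D\alpha_j\Big)
```

[and the matrix being inverted is diagonal — λ plus the number of patches covering each pixel — so this amounts to a per-pixel weighted average of the denoised patches with the noisy image.]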
0:09:47 Okay. Now, this technique — which is, by the way, quite efficient for noise reduction in images — was used a couple of years ago by Liao and Sapiro to produce a reconstruction of an image from CT data. It is basically the same equation as we saw a minute ago, except that the fidelity term compares the noisy sinogram, the given data, with the sought image f transformed by the Radon transform.
0:10:19um
0:10:20well first of should say that the uh this uh paper the shows some
0:10:25very impressive to the results
0:10:27a a a a on the yeah uh image which images to was uh region mention structure
0:10:31and there a severe conditions of of of the partial data of the used very few
0:10:36projections
0:10:37 But in principle there are a few problems with this equation, which we want to try and repair in a different setting.
0:10:50 First of all, notice the use of the L2 norm in the fidelity term, which actually means the assumption that the noise is homogeneous. However, we know that in the sinogram domain the noise does depend on the data; moreover, we know exactly how it depends — we know the variance of the noise — so this can be used.
0:11:13 The second problem with the fidelity term is that we actually want a low error in the image domain, since images are our target, instead of decreasing the error in the sinogram domain; and since the Radon transform is ill-conditioned, a small sinogram error does not tell us much about the image error we will be seeing.
0:11:37 The third problem — which is also the most important one — is that we cannot easily obtain these coefficients μ_j. They are related directly to the thresholds ε_j; we don't really use the μ_j's, but for each patch we need to know the expected error energy, in order to put the correct threshold here. If we don't know the noise statistics — and we do not know the noise statistics in the CT image domain, because after the reconstruction the noise is quite complicated — we cannot compute these thresholds correctly. There are some estimation techniques, but they don't give very good results.
0:12:26 So we are trying to solve these problems by shifting to a different setting, where we do have a filtered back-projection. Notice that the previous concept does not use any reconstruction algorithm: the image is just the minimizer of that equation. And we use an offline algorithm, which provides supervised learning with training images.
0:12:50 So: we shift from the image domain — the sparse coding is now done in the sinogram domain, and we sparse-code the patches of the sinogram instead of the image. The penalty that we are seeking, though, should be in the image domain, because there we can ask for the nice properties of the image that we want. And we are using an offline training stage, which on the one hand does require some training images, but on the other hand makes the algorithm very fast, because all the heavy work is done once, before the scanner is even initiated; in the reconstruction stage it is then almost as fast as the FBP itself.
0:13:38 So, the algorithm uses a set of reference images — high-quality CT images — and also the corresponding low-dose sinograms. Such data can be obtained, for instance, using cadavers or phantoms, which can be scanned without any hesitation as to the radiation dose.
0:14:00 Okay, so the algorithm goes as follows. We use the K-SVD algorithm to train on a very similar equation as before, but this time we extract the patches from the sinogram and not from the image. Apart from that, it is just the same equation, except for one very important difference: we don't need different coefficients for the different representations. Here, instead, there is a weighted L2 norm with a special matrix W, which I detail on the next slide; this helps us to normalize the noise over all the patches.
0:14:43 So we do the encoding with a fixed threshold Q, which is simply the size of the patch — the number of pixels — and the noise is normalized correspondingly, so that this works.
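[In code, this weighted, fixed-threshold coding reduces to ordinary OMP on reweighted data; a sketch reusing the omp() from earlier. The weighting by photon counts and the threshold Q = patch size follow the talk; the helper name is mine.]

```python
import numpy as np

def weighted_sparse_code(D, patch, photon_counts):
    """Error-constrained coding under the weighted L2 norm
    ||W^(1/2)(patch - D @ alpha)||^2 <= Q, with W = diag(photon counts):
    after reweighting, the noise has roughly unit variance per pixel, so
    the expected residual energy of a Q-pixel patch is simply Q."""
    w = np.sqrt(photon_counts)
    return omp(w[:, None] * D, w * patch, eps=patch.size)  # omp() as defined above
```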
0:14:55 Okay. Now, once that is done, we have a dictionary, and a set of sparse representations that help us produce a good sparse encoding of the sinogram. One could stop here and use this dictionary to improve sinograms in the future. But recall that we want the penalty to be in the image domain, and here we are not comparing to the — well, actually we are not comparing to anything; we are just requiring that each patch have a good sparse code.
0:15:38 So the second step is to take these representations and make the dictionary better, in the following sense. This expression here is actually the restored sinogram: I take the sparse-encoded patches, D times α_j, return them to their locations in the sinogram matrix, and finally the M⁻¹ accounts for the fact that the patches overlap. So this is the sinogram after I remove all the unnecessary noise.
0:16:04 And B is some fixed reconstruction algorithm. We want it to be linear, for these equations to be solvable, but it can also be nonlinear if the equations can still be solved. So it can be filtered back-projection or some other linear algorithm, like a regularized inverse transform — I mean, some linear algorithm for the inverse Radon transform.
0:16:33 So here the reconstruction is a linear function of the data plugged into it, and this L2 norm is therefore easily minimized for the dictionary, as a least-squares problem.
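[My reading of this second-stage objective, written out — B is the fixed linear reconstruction, M⁻¹ the patch-overlap normalization, and ‖v‖²_Q = vᵀQv the image-domain weighted norm discussed later. Since everything is linear in D₂, this is a weighted least-squares problem in the dictionary entries:]

```latex
\min_{D_2}\; \sum_{\text{train}}
\Big\| \,x \;-\; B\,M^{-1}\!\sum_j R_j^{T} D_2\,\alpha_j \Big\|_{Q}^{2}
```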
0:16:49 All of this is offline training, which prepares these two dictionaries, D1 and D2. In this second training stage we compare the reconstructed image with the original one, and we again use a weighted L2 norm, which allows us to demand specific things about the error we are observing — this is the reconstruction error, in this term — and the matrix Q allows us to do some things which I will also show in a couple of minutes.
0:17:23 Meanwhile, how do I use this trained data? Given a new noisy sinogram g̃, I compute the representations using sparse coding, with the same threshold Q and the dictionary D1; and then these representations are used to encode the restored sinogram, with the same formula as before. Finally, when I have the restored sinogram, I apply filtered back-projection to compute the image.
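[Putting the test-time stage together, a rough end-to-end sketch. The patch size, unit stride, and skimage's iradon as the FBP stand-in are my choices; weighted_sparse_code() and the two dictionaries come from the earlier sketches.]

```python
import numpy as np
from skimage.transform import iradon  # filtered back-projection

def reconstruct(noisy_sinogram, counts, D1, D2, theta, p=8):
    """Sparse-code each p-by-p sinogram patch with D1 (weighted OMP,
    threshold Q = p*p), synthesize the restored patch with D2, average
    the overlaps (the M^-1 step), then apply FBP."""
    restored = np.zeros_like(noisy_sinogram, dtype=float)
    cover = np.zeros_like(noisy_sinogram, dtype=float)
    H, W = noisy_sinogram.shape
    for i in range(H - p + 1):
        for j in range(W - p + 1):
            x = noisy_sinogram[i:i+p, j:j+p].ravel()
            w = counts[i:i+p, j:j+p].ravel()
            alpha = weighted_sparse_code(D1, x, w)
            restored[i:i+p, j:j+p] += (D2 @ alpha).reshape(p, p)
            cover[i:i+p, j:j+p] += 1.0
    restored /= np.maximum(cover, 1.0)   # average the overlapping patches
    return iradon(restored, theta=theta, filter_name='ramp')
```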
0:17:58 Now, what are the matrices W and Q I was talking about? In order to build a good W — one that normalizes the error in the sinogram domain, so that all these differences have the same variance — we need to recall what the statistics of the noise are.
0:18:23 Using the statistical model introduced at the beginning, one can deduce that the variance, at each location, of the sinogram difference — between the ideal sinogram and the measured one — is approximately inverse to the true photon count. We don't have the true count, but we can use a good approximation of it: the measured count. So when I multiply by one over this variance, I have uniform noise in the sinogram domain; then, summing over the Q pixels in a patch, I expect the error energy to be just Q. Therefore I can take the weight matrix to be a diagonal matrix containing the measured photon counts, in order to produce correct sparse coding with the uniform threshold.
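[The variance claim, written out under the usual first-order (delta-method) approximation for g̃ = −log(N/I₀) with N Poisson:]

```latex
\operatorname{Var}\big(\tilde g - g\big)
\;\approx\; \frac{\operatorname{Var}(N)}{\mathbb{E}[N]^{2}}
\;=\; \frac{1}{\mathbb{E}[N]}
\;\approx\; \frac{1}{N_{\text{measured}}},
\qquad
W = \operatorname{diag}\big(N_{\text{measured}}\big)
```

[so the reweighted residual W^{1/2}(g̃ − g) has roughly unit variance per pixel, and its expected energy over a Q-pixel patch is Q.]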
0:19:22 And now to the question of what kind of error measure I am using. In fact, with this slide I am rather hoping for your help — this is something I stumbled upon that wasn't really considered in the literature, because not much supervised learning in CT reconstruction has been done so far. I would like to think of an error measure which can be designed using a good-quality reference image, and which would help me promote good properties of the reconstructed image.
0:19:59 For instance, when I am looking at the head, I am seeing regions of bone and regions of air, which are not necessary for my reconstruction if I am interested only in the soft tissues. There is a great dynamic range of CT values: here it is about one thousand to fifteen hundred, here it is minus one thousand, and here plus or minus three hundred. So in the weighting matrix, by design, I remove those regions. It can be seen in this piecewise-constant phantom that those regions are completely removed from the map, so they do not count when I reduce my error over the rest of the image. Also, I would like to emphasize the edges of the soft tissues, so that all the fine details there will be of good quality.
0:21:02 So, overall, I propose to design such a weighting map — and maybe there are other designs that could be proposed — and with respect to this map I can obtain visually better images.
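[A possible numpy construction of such a map; the Hounsfield-unit window and the edge-boost factor are illustrative choices of mine, not values from the talk.]

```python
import numpy as np

def soft_tissue_weight_map(reference_hu, low=-300.0, high=300.0, edge_boost=4.0):
    """Hypothetical image-domain weighting map: ignore air/bone pixels
    (outside the soft-tissue HU window) and emphasize soft-tissue edges."""
    soft = (reference_hu > low) & (reference_hu < high)   # soft-tissue mask
    gy, gx = np.gradient(reference_hu.astype(float))
    edges = np.hypot(gx, gy)
    edges = edges / (edges.max() + 1e-12)                 # normalize to [0, 1]
    return soft * (1.0 + edge_boost * edges)              # zero weight on air/bone
```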
0:21:16 Finishing in just one minute: the last slides show some results. This is the piecewise-constant phantom, with random ellipses of strong contrast strewn about it. This is the reconstruction with standard filtered back-projection, where its parameter — the cutoff frequency — was optimally chosen. It is compared to our algorithm, and to the double-dose filtered back-projection result, which uses twice as many photons; by the signal-to-noise ratio measurements, one can observe that these two are more or less the same.
0:21:55 Also, there are some results on clinical images. This is a head section from the Visible Human source, and this again is the FBP algorithm, which is a little bit noisy; here I can recover the fine details much better, which again is roughly on par with what we can do with double-dose filtered back-projection.
0:22:19 So, to summarize: we see that sparse representations can work very well for computed tomography; they can produce very good results while not taking much computational effort; and they can be easily incorporated into existing clinical scanners, since it is just a matter of replacing the software.
0:22:43 So that's it for now, and thank you very much.
0:22:50 Time to return the microphone, then. Any questions?
0:23:00 [Audience question, partly inaudible] ...for reconstruction from partial exposure... Katsevich has done this: irradiating only a small neighborhood of the region of interest, plus a few global projections, so that the low frequencies can also be resolved... can your approach be combined with that?
0:23:59 You're asking if my approach can be used for reconstruction from partial data — if I only measure the rays that pass through the region of interest, can I still use this technique?
0:24:15 Yes — I would say yes, because my technique is local: I work locally in the sinogram, taking small patches and working on them. So even if I have partial data, and I have some method of dealing with it — like the extrapolation which is usually used — I can still work on the available partial data and do some preprocessing, in order to get a better result before I apply that algorithm. So yes, you're right, those things can be combined.
0:24:53 Okay, thank you. [Closing remarks inaudible.]