0:00:13 Good afternoon. My name is Dan Weller, and today I'll be talking to you about combining compressed sensing with parallel MRI, and we're going to compare uniform and random Cartesian undersampling patterns in k-space.
0:00:26 So first let me introduce magnetic resonance imaging. Magnetic resonance imaging is a very versatile and growing imaging modality, used in medicine, in spectroscopy, and elsewhere. In particular, we can get images of all kinds of organs of our bodies, et cetera. For example, we have a three-D volume of a brain, a T1-weighted image, here.
0:00:51 However, despite all the advancements in the last thirty-some-odd years in MRI acquisition, scan time remains an issue for many types of acquisitions. For instance, this image took between eight and ten minutes to acquire on a three-tesla magnet.
0:01:05 Now, if we can get a faster acquisition, we can lower the cost of imaging for our subjects/patients, we can increase the comfort of those subjects because they're not in the scanner as long, and we can possibly also improve the tradeoffs and increase the quality of our image reconstructions.
0:01:22 To that end, we propose a method called SpRING, which combines the parallel imaging method known as GRAPPA with compressed sensing/sparsity, and we use it to recover images from accelerated, that is, undersampled data.
0:01:37 Now, we can investigate several different undersampling strategies. In the compressed sensing world, the conventional wisdom is that random undersampling is required; however, we show that with the addition of parallel imaging, in certain situations uniform undersampling is sufficient.
0:01:56 So let's talk about k-space quickly. K-space basically refers to the Fourier transform domain in which the signals are actually acquired. We normally take an inverse Fourier transform, as shown here, to take the k-space data and recover our images, or in the three-D case, a volume.
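As a concrete illustration of this step, here is a minimal numpy sketch (synthetic data, not the talk's actual pipeline): with fully sampled Cartesian k-space, the inverse FFT recovers the image exactly, up to floating-point error.

```python
import numpy as np

# Fully sampled Cartesian k-space is just the 2-D DFT of the image,
# so an inverse FFT reconstructs it exactly. (Illustrative sizes.)
image = np.random.default_rng(0).standard_normal((64, 64))
kspace = np.fft.fft2(image, norm="ortho")        # what the scanner measures
recon = np.fft.ifft2(kspace, norm="ortho").real  # inverse Fourier reconstruction

print(np.allclose(recon, image))  # → True
```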
0:02:11 Now, we're going to sample k-space using a raster-scanning approach: we're going to get a Cartesian volume, where we essentially have a readout direction as the raster-scanning direction, sometimes called the frequency-encoding direction, and we're going to assume that this direction is transverse to the axial slice plane. In the axial slice plane, we're going to acquire, in two dimensions, a set of lines, which we can view as a bunch of points in the plane.
0:02:43 We're going to undersample those points because it takes too much time to acquire the whole volume. For instance, if we undersample by two in two directions, we can reduce the total amount of scan time by a factor of four. But we would get an aliased image if we just did an inverse Fourier transform, because the undersampling effectively reduces our field of view, so we get overlap in the reconstruction.
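That overlap can be demonstrated in a few lines of numpy (illustrative sizes, undersampling by two along one axis only): zeroing every other k-space line is equivalent to averaging the image with a copy of itself shifted by half the field of view.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((64, 64))
kspace = np.fft.fft2(image, norm="ortho")

# Uniform undersampling by R = 2 along the phase-encode axis:
# keep only the even lines, zero-fill the rest.
undersampled = kspace.copy()
undersampled[1::2, :] = 0
aliased = np.fft.ifft2(undersampled, norm="ortho").real

# The zero-filled reconstruction equals the average of the image
# and its half-FOV-shifted replica: the two halves overlap (alias).
expected = 0.5 * (image + np.roll(image, 32, axis=0))
print(np.allclose(aliased, expected))  # → True
```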
0:03:10 Now, to deal with this, there are two general methods that have been proposed over the years; there are probably others too, but I'm going to focus on these. First of all, there's parallel imaging. In parallel imaging, essentially, we can have a multiple-receiver setup such as the thirty-two-channel coil shown here. The thirty-two-channel coil basically gets images from a whole array of coils, and each coil gets a slightly differently weighted image, basically due to its spatial position in the array.
0:03:38 GRAPPA is a method that basically takes all this undersampled data in k-space and uses a small block of additional calibration lines, called ACS lines, to calibrate a kernel, which is then used to fill in all the missing k-space, and that undoes the aliasing.
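A toy sketch of this calibrate-then-synthesize idea (not the talk's implementation; the synthetic data are constructed so that an exact kernel exists): kernel weights are fit by least squares on the ACS lines, then used to fill a skipped line from its acquired neighbours.

```python
import numpy as np

rng = np.random.default_rng(0)
nc, ny, nx = 2, 16, 12          # coils, phase-encode lines, readout points

# Synthetic multicoil k-space in which every odd line is an exact linear
# combination of its even neighbours across coils, so a GRAPPA-style
# kernel can recover the skipped lines perfectly.
ks = np.zeros((nc, ny, nx), dtype=complex)
ks[:, ::2] = rng.standard_normal((nc, ny // 2, nx)) \
           + 1j * rng.standard_normal((nc, ny // 2, nx))
W_true = rng.standard_normal((nc, 2 * nc))
for y in range(1, ny - 1, 2):
    src = np.vstack([ks[:, y - 1], ks[:, y + 1]])   # (2*nc, nx) neighbours
    ks[:, y] = W_true @ src

# Calibration: slide the source/target pattern over a fully sampled
# ACS block and solve least squares for the kernel weights.
acs = ks[:, 6:11]                                   # ACS lines (rows 6..10)
A, B = [], []
for y in range(1, acs.shape[1] - 1, 2):             # skipped-type targets only
    A.append(np.vstack([acs[:, y - 1], acs[:, y + 1]]).T)  # (nx, 2*nc)
    B.append(acs[:, y].T)                                   # (nx, nc)
A, B = np.concatenate(A), np.concatenate(B)
W_hat = np.linalg.lstsq(A, B, rcond=None)[0].T      # estimated kernel

# Synthesis: fill in a "missing" line from its acquired neighbours.
y = 3
filled = W_hat @ np.vstack([ks[:, y - 1], ks[:, y + 1]])
print(np.allclose(filled, ks[:, y]))  # → True
```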
0:03:56 Now, compressed sensing works a little differently: instead of designing a specialized observation model, we're designing a prior for our image, namely that our image is sparse in some domain. For instance, the brain images I showed before could be considered approximately sparse, or compressible, in a domain like the four-level 9/7 DWT. The Fourier transform is nice because it provides incoherent sampling if we undersample in a random fashion. Therefore, using the sparsity, the incoherent/random sampling, and an iterative algorithm like l1-Magic, or others adapted to the complex data we have here, we can do compressed sensing reconstruction.
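A minimal compressed sensing sketch along these lines (plain ISTA with soft thresholding rather than l1-Magic itself; sizes, sparsity level, and the threshold are arbitrary choices of mine): recover a sparse signal from randomly chosen Fourier samples.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, m = 128, 4, 48                   # signal length, sparsity, measurements

# Ground-truth sparse signal and a random row-subset of the unitary DFT.
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k) * 5.0
F = np.fft.fft(np.eye(n), norm="ortho")
A = F[rng.choice(n, m, replace=False)]  # incoherent random Fourier sampling
y = A @ x_true

def soft(z, t):
    """Complex soft-thresholding: shrink magnitudes toward zero by t."""
    mag = np.abs(z)
    return np.where(mag > t, (1 - t / np.maximum(mag, 1e-12)) * z, 0)

# ISTA: gradient step on ||Ax - y||^2 / 2, then the l1 prox (soft threshold).
x = np.zeros(n, dtype=complex)
lam = 0.05
for _ in range(300):
    x = soft(x + A.conj().T @ (y - A @ x), lam)

zero_filled = A.conj().T @ y            # naive reconstruction for comparison
print(np.linalg.norm(x - x_true) < np.linalg.norm(zero_filled - x_true))  # → True
```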
0:04:38 Now, more abstractly, we can think about this problem, as I have in this simple presentation, as a multichannel sampling problem in which we essentially acquire incomplete data. We have the image that we desire; this image is multiplied in the image domain by each of these coil sensitivities, those weightings I mentioned in the parallel imaging setup, and the results are then sampled in the Fourier transform domain. Note that they are sampled at the same Fourier transform coordinates for all the coils. Then we assume the data is perturbed by some amount of additive white noise. Now, while we assume the noise is white across frequencies, there is actually some correlation across coils, so we can measure this noise covariance using a simple fast pre-scan.
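The forward model just described can be sketched as follows (the Gaussian-blob sensitivities are hypothetical stand-ins for real coil profiles, and the noise here is kept white per coil, ignoring the cross-coil correlation for brevity):

```python
import numpy as np

rng = np.random.default_rng(2)
nc, ny, nx = 4, 32, 32

x = rng.standard_normal((ny, nx))                  # the desired image
# Hypothetical smooth coil sensitivities: one spatial weighting per coil.
yy, xx = np.meshgrid(np.linspace(-1, 1, ny), np.linspace(-1, 1, nx),
                     indexing="ij")
centres = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
sens = np.stack([np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2))
                 for cy, cx in centres])

mask = np.zeros((ny, nx))
mask[::2, ::2] = 1                                 # same pattern for every coil
noise = 0.01 * (rng.standard_normal((nc, ny, nx))
                + 1j * rng.standard_normal((nc, ny, nx)))

# d_p = mask * F(S_p x) + n_p : one undersampled k-space per coil.
data = mask * np.fft.fft2(sens * x, norm="ortho") + noise

print(data.shape)  # (4, 32, 32)
```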
0:05:24 Now, let's look back at parallel imaging for a second to motivate the SpRING work. Parallel imaging methods such as GRAPPA work great at low undersampling factors; in fact, today, clinically, we can do acceleration factors between two and four and still get high-quality images in many applications. However, as we further accelerate our scanning process to, say, nine, sixteen, twenty-five, or thirty-six times undersampling, our GRAPPA results clearly degrade, and you see the noise essentially blows up. This noise amplification is a well-known phenomenon in parallel imaging.
0:05:58 Now, there's an additional form of error that I want to mention because it's important to this talk, and that is residual aliasing. You can't really see it in these images, but you can imagine that, due to the limitations of our coils, the un-aliasing from the GRAPPA kernel is not going to be exact if our coils don't have enough spatial variability to reproduce the high-frequency complex exponentials needed to synthesize the frequency shifts in k-space.
0:06:23 Now, we know the noise here is not sparse, and therefore promoting sparsity can denoise our images. Compressed sensing can also undo incoherent aliasing: if we have aliasing, not like what we saw before, but aliasing from random undersampling, compressed sensing can act on that as well. So that kind of motivates using SpRING to improve this GRAPPA result using compressed sensing, or sparsity.
0:06:49 So we have two possible general types of undersampling frameworks, among many others. Uniform undersampling we motivate by noting that the prototypical GRAPPA method expects uniform undersampling, and GRAPPA can take uniformly undersampled data and un-alias it. Basically, if you have capital-P coils, it can in theory un-alias up to an acceleration factor of R equals P; however, beyond that, you get coherent artifacts. Now, if we shift to a random undersampling setup, the application of GRAPPA isn't as straightforward, and we'll address that shortly, but we can still, in theory, un-alias frequency shifts up to capital P. Of course, with random undersampling we may have areas that are very sparsely sampled, and for those this may not be sufficient. However, the artifacts that cannot be reconstructed exactly typically end up as a non-sparse, incoherent part of the image.
0:07:46 Now, compressed sensing: if we apply it with uniform undersampling, we're violating one of the basic requirements of compressed sensing, at least in theory, and we get very limited un-aliasing for that reason. However, compressed sensing, or rather regularizing with sparsity, is still very capable of denoising images. And if we combine compressed sensing with random undersampling, we get the resolution of aliasing predicted by the theoretical bounds and so on. So then the question becomes: with SpRING applied with these different sampling patterns, what gives us the best combination of un-aliasing and denoising for our application?
0:08:19 Oh, and just to be complete, I want to mention that there are various other kinds of sampling patterns we could explore as well: there's jittered undersampling, Poisson-disc sampling, which is a favorite in some well-known state-of-the-art methods, other combinations of these; the list goes on and on.
0:08:39 Now, the SpRING method, just as a general overview: essentially, it penalizes fidelity to the GRAPPA solution (the first part of the equation below) and joint sparsity of the solution (the second part of the equation below), while preserving the acquired data. The reason we preserve our acquired data, rather than taking a Bayesian view of things, is that we don't want to remove any information or somehow "clean up" our data in some way that's not really realistic. So we're really only going to modify the unacquired k-space. The optimization problem here is over the full set of k-space, but we're going to preserve the acquired samples, only actually filling in the missing k-space, and we can use that to convert our optimization problem into one constrained to the null space of the sampling matrix K.
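The "preserve the acquired data" idea amounts to a simple projection; here is a sketch (the helper name `project_acquired` is mine, not from the talk):

```python
import numpy as np

# Hard data consistency: only the unacquired k-space is free to vary,
# so any candidate solution is projected back onto the measured samples.
def project_acquired(k_est, k_meas, mask):
    """Keep measured samples exactly; keep estimates everywhere else."""
    return np.where(mask, k_meas, k_est)

rng = np.random.default_rng(3)
mask = rng.random((8, 8)) < 0.5                 # True where acquired
k_meas = mask * (rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8)))
k_est = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))

out = project_acquired(k_est, k_meas, mask)
print(np.array_equal(out[mask], k_meas[mask]),
      np.array_equal(out[~mask], k_est[~mask]))  # → True True
```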
0:09:28 Now, just a little bit of other notation here: we have our acquired data and our GRAPPA solution, which basically can be precomputed ahead of time since it only depends on the data. We have a low-resolution coil combination that we use, in a somewhat heuristic way, to define the GRAPPA fidelity term, and we have a tuning parameter that we can set based on our confidence in the GRAPPA result. We also have a joint sparsity penalty function, which could be the L1 norm, or could be something else to approximate the L0 "norm" in our analysis.
0:10:02 Because we know the L0 penalty is very hard to compute with, we can instead use the convex L1 norm, which has nice properties; or, if we want to get a bit closer to the L0 penalty, we can use a homotopy continuation with a non-convex penalty function, such as the Cauchy penalty function, which has this kind of logarithmic increase as we increase the magnitude of the coefficients of our sparse transform representation.
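A quick numerical comparison of the two penalty choices (a unit scale parameter is assumed for the Cauchy penalty here):

```python
import numpy as np

# The convex l1 penalty grows linearly with coefficient magnitude, while
# a Cauchy-type penalty log(1 + t^2) grows only logarithmically, so large
# "genuine" coefficients are penalized far less: closer in spirit to l0.
t = np.array([1.0, 10.0, 100.0])
l1_pen = np.abs(t)
cauchy_pen = np.log1p(t ** 2)

print(l1_pen / cauchy_pen)  # the gap widens as the magnitude grows
```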
0:10:32 Now, we enforce joint sparsity over all the coils by applying the penalty function, whatever it is, not to the individual coefficients but to the L2 norm of the coefficients across all the coils. So we actually have this hybrid penalty function, and by applying it we can essentially enforce joint sparsity across all the coils: the sparsity is basically considered to come from the object we're scanning, and not from some artifact of the coil sensitivities, say, not having enough signal in an area.
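That hybrid joint-sparsity penalty can be sketched as below (function names are illustrative, not from the talk): the scalar penalty is applied to the coil-combined L2 magnitude of each transform coefficient.

```python
import numpy as np

def joint_penalty(coeffs, phi):
    """coeffs: (n_coils, n_coeffs) sparse-transform coefficients.
    Apply the scalar penalty phi to the l2 norm taken across coils, so a
    coefficient counts as "on" only if the shared object is nonzero there."""
    return float(np.sum(phi(np.linalg.norm(coeffs, axis=0))))

l1 = lambda t: t                        # convex choice
cauchy = lambda t: np.log1p(t ** 2)     # non-convex, closer to l0

# One jointly supported coefficient, seen with different coil weightings.
c = np.array([[3.0, 0.0, 0.0],
              [4.0, 0.0, 0.0]])
print(joint_penalty(c, l1))  # → 5.0  (one joint magnitude: sqrt(3^2 + 4^2))
```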
0:11:07 Now, various strategies exist for extending the SpRING method, which was originally proposed for the uniform undersampling case, and essentially we're going to look at extending the GRAPPA method and then using SpRING as is. There exist iterative methods, such as iterative GRAPPA or SPIRiT, that can handle arbitrary sampling, but they are iterative in nature and therefore computationally intensive. Other direct methods already exist for radial and spiral trajectories, and we draw our motivation from those to extend GRAPPA to arbitrary Cartesian undersampling using a direct method with locally varying kernels: basically, for each block of k-space we have separate GRAPPA kernels, derived using the ACS data. Because it's a direct method, we can still compute the GRAPPA result ahead of time and reuse it throughout our SpRING computations, so we're not really increasing the computational complexity of the SpRING method. However, because it is direct, and we're not folding the compressed sensing into the GRAPPA reconstruction as it happens, it may not be quite as robust as an iterative method; but at least for this demonstration, we're willing to take that tradeoff.
0:12:18 We tile our k-space into a bunch of blocks of some average acceleration size; we just have to choose R_y and R_z here appropriately. Then we reconstruct each block with a kernel defined for its local sampling pattern. For instance, if we look at the central block here, we reconstruct it using only the central block and its neighbors; you can imagine it has this many neighbors. We can take the point marked in red as the target that we want to reconstruct, and we can essentially look for the same pattern in the ACS region, using that pattern as a template and tracing it along the ACS region to get the fits to calibrate the kernel, as demonstrated here. Now, we can also take another target point, with a different pattern to the right and the left, and it will get another set of fits.
0:13:05 Now I'll show some reconstructions. To evaluate uniform versus random undersampling, we first turn to the extremely sparse yet popular Shepp-Logan phantom, and we "multichannelize" it, if that's a word: essentially, we construct an eight-channel Shepp-Logan phantom using a Biot-Savart-law-based B1 simulation. Then we add some noise; we actually simulate noise covariance using the measure for noise covariance proposed in Roemer's 1990 paper, and we synthesize white noise at minus thirty dB. Then, essentially, we apply the SpRING method, using both the L1 norm and the Cauchy penalty, with both uniform and random undersampling.
0:13:49 So here I'm showing what the Shepp-Logan phantom looks like once we multichannelize it, and what the multichannel data look like after undersampling. Now, if we undersample uniformly and take the GRAPPA result (this is with an eight-channel coil, where we're accelerating by a factor of approximately sixteen, in addition to the ACS data), then essentially, as we knew a priori, GRAPPA by itself is not going to work, and we can see significant coherent aliasing artifacts.
0:14:19 However, applying SpRING (this is with uniform undersampling, mind you), we are able to balance the GRAPPA result against some additional sparse regularization and actually get rid of a number of those coherent artifacts, just by virtue of the fact that the Shepp-Logan phantom is so sparse. Using the Cauchy penalty function, which enforces sparsity more strongly than the L1 norm by itself, we can further reduce the amount of coherent artifacts, and we actually get a reasonably good-quality image. Using random undersampling actually improves things a little bit further.
0:14:53 On the random undersampling side, if we look at the GRAPPA result, because of the incoherent artifacts it is visually very unpleasing; however, even with all that noise, adding the regularization/compressed sensing, that is, SpRING, does help matters, and we see again that with the Cauchy penalty we get a high-quality image.
0:15:14 Now, if we turn ourselves to real data: we acquired thirty-two-channel T1-weighted MPRAGE brain data at three tesla, and essentially we extracted only a small subset of the coils from that, really to bring out the aliasing. When we did the reconstructions, you can see that aliasing still remained in the SpRING reconstruction when we have uniform undersampling. However, when we turn to random undersampling, those coherent artifacts are no longer left, only noise and blurring, because we had to turn up the acceleration so much, to twenty-five, which is very interesting.
0:15:49 Just in conclusion, I want to point out that for the very sparse Shepp-Logan phantom, the Cauchy penalty function is effective at both denoising and undoing the aliasing, even with uniform undersampling. But for real data, sparsity with uniform undersampling doesn't really mitigate the aliasing that much; it only denoises the image. Random undersampling does increase the ability of compressed sensing to resolve aliasing, most of all for sparser real images, which correlates with the intuition. However, we have to consider whether aliasing is really significant in real images, and I just want to show one more slide on that, if that's okay. So basically, we go back and use all thirty-two channels of the data now, with a more reasonable but still aggressive undersampling, and we see that it's mainly noise amplification that we're seeing in the result, not really significant aliasing at this level.
0:16:34 So SpRING, even with uniform undersampling, is fairly helpful, and we actually don't really see that much additional improvement from random undersampling. So I'd just like to acknowledge my co-authors and funding, and also the provider of the B1 simulator, which you can download. Thank you very much.
0:16:56 Any questions? Yes.
0:17:03 Did the model take into account RF or B1 bias-field artifacts? Did you explicitly model them somewhere and try to deal with them, or not?
0:17:13 Okay, so we're operating at three tesla, and there the B1 bias is not as significant as we would see at seven tesla, for example, so for the most part we just ignored both types of artifacts in the model. We essentially used a set of coils with very good performance, both B1-plus and B1-minus, so our results were very clean to start out with, if you look at our original images, and after just some basic normalization to take care of the attenuation in the middle, we were able to deal with all of that just by using a high-quality coil. But we would expect that if our coils weren't quite as good and we did see those kinds of artifacts, we would have to adapt our method to account for them.
0:18:01 Okay, thank you; we need to keep to the time, so we'll move on to the next speaker.