A very good afternoon. My name is Dan Weller, and I will be talking to you today about combining compressed sensing with parallel MRI; in particular, I am going to compare uniform and random Cartesian undersampling patterns in k-space.

First, let me introduce magnetic resonance imaging. MRI is a very versatile and growing imaging modality, used in medicine, spectroscopy, and elsewhere, and in particular we can get images of all kinds of organs of our bodies. For example, here we have a 3-D volume of a brain, a T1-weighted image. However, despite all the advancements of the last thirty-some-odd years, MRI acquisition time remains an issue for many kinds of acquisitions; for instance, this image took between eight and ten minutes to acquire in a 3-Tesla magnet.

Now, if we can make acquisition faster, we can lower the cost of imaging for our subjects/patients, we can increase the comfort of those subjects because they are not in the scanner as long, and we can possibly also improve the trade-offs and increase the quality of our image reconstructions. To that end, we propose a method called SpRING, which combines the parallel imaging method known as GRAPPA with compressed sensing (sparsity), and we use it to recover images from accelerated, that is, undersampled, acquisitions. We investigate several different undersampling strategies: in compressed sensing, the conventional wisdom is that random undersampling is required, but we show that, with the addition of parallel imaging, in certain situations uniform undersampling is sufficient.

So let's talk about k-space quickly. K-space basically refers to the Fourier transform domain in which the signals are actually acquired; normally we can take an inverse Fourier transform, as shown here, to take the k-space data and recover our image, or, in the 3-D case, our volume. We are going to sample k-space with a raster-scanning approach: we acquire a Cartesian volume in which the readout direction is the raster-scanning direction, sometimes called the frequency-encode direction, and we will assume this direction is perpendicular to the axial slice plane. In the axial slice plane we acquire, in two dimensions, a set of lines, which we can plot as points in that plane.

We are going to undersample those points, because it takes too much time to acquire the whole volume. For instance, if we undersample by two in each of the two directions, we reduce the total scan time by a factor of four; but then we would get an aliased image if we just took the inverse Fourier transform, because the undersampling effectively reduces our field of view, and we get overlap in the reconstruction.

To deal with this, two general classes of methods have been proposed over the years; there are probably others, too, but I will focus on these. First, there is parallel imaging. In parallel imaging we have a multiple-receiver setup, such as the 32-channel coil shown here, which acquires images from a whole array of coil elements; each coil sees a slightly differently weighted image, owing to its spatial position in the array. GRAPPA is a method that takes all this undersampled multicoil k-space data and uses a small block of additional calibration lines, called ACS lines, to calibrate a kernel, which is then used to fill in all the missing k-space and thereby undo the aliasing.
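To make that calibration step concrete, here is a minimal numpy sketch of GRAPPA-style kernel fitting. Everything in it is an assumption chosen for illustration (four coils, Gaussian coil sensitivities, a kernel built from two ky-neighbors over a three-point kx window), not the configuration from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
ncoil, ny, nx, R = 4, 64, 64, 2

# Toy object and smooth, spatially varying coil sensitivities (assumed shapes).
yy, xx = np.meshgrid(np.linspace(-1, 1, ny), np.linspace(-1, 1, nx), indexing="ij")
img = np.zeros((ny, nx))
img[24:40, 20:44] = 1.0
sens = np.stack([np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2))
                 for cy, cx in [(-1, -1), (-1, 1), (1, -1), (1, 1)]])
ksp = np.fft.fft2(sens * img, axes=(-2, -1))     # fully sampled multicoil k-space

# A central block of fully sampled lines plays the role of the ACS region.
acs = ksp[:, ny // 2 - 8: ny // 2 + 8, :]

# Calibration: learn to predict each ACS point from its two ky-neighbors
# over a 3-point kx window, across all coils, by least squares.
src, tgt = [], []
for y in range(1, acs.shape[1] - 1):
    for x in range(1, acs.shape[2] - 1):
        src.append(acs[:, [y - 1, y + 1], x - 1: x + 2].ravel())
        tgt.append(acs[:, y, x])
kernel, *_ = np.linalg.lstsq(np.array(src), np.array(tgt), rcond=None)

# Application: drop every other ky line (R = 2), then synthesize it back
# from its sampled neighbors using the calibrated kernel.
recon = ksp.copy()
recon[:, 1::R, :] = 0
for y in range(1, ny - 1, R):
    for x in range(1, nx - 1):
        recon[:, y, x] = recon[:, [y - 1, y + 1], x - 1: x + 2].ravel() @ kernel

coil_images = np.fft.ifft2(recon, axes=(-2, -1))
combined = np.sqrt((np.abs(coil_images) ** 2).sum(axis=0))   # root-sum-of-squares
```

In practice the kernel geometry, regularization of the fit, and handling of block edges all matter; the point here is only the flow: fit the kernel on the ACS block, then reuse it to synthesize the skipped lines.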
Now, compressed sensing works in a different way: instead of designing a specialized observation model, we design a prior for our image, namely that the image is sparse in some domain. For instance, the brain images I showed before could be considered approximately sparse, or compressible, in a domain like the four-level 9/7 DWT. The Fourier transform is nice because it provides incoherent sampling if we undersample it in a random fashion, and therefore, using the sparsity, the incoherent (random) sampling, and an algorithm like ℓ1-Magic or others, adapted to the complex-valued data we have here, we can do a compressed sensing reconstruction.

More abstractly, we can think about this problem, as I have in this simple presentation, as a multichannel sampling problem with incomplete data. We have the image u that we desire; this image is multiplied in the image domain by the coil sensitivities (those weightings I mentioned in the parallel imaging setup) and then sampled in the Fourier transform domain; note that the same Fourier samples are taken for all the coils. We then assume the data is perturbed by some amount of additive noise. We assume the noise is white across frequency but has some correlation across coils, and we can measure this noise covariance using a simple, fast pre-scan.

Now let's look back at parallel imaging for a second, to motivate the SpRING work. Parallel imaging methods such as GRAPPA work great at low undersampling factors; in fact, clinically, we can today use acceleration factors between two and four and still get high-quality images in many applications. However, as we accelerate further, to say nine, sixteen, twenty-five, or thirty-six times undersampling, our GRAPPA results clearly degrade, and you can see the noise essentially blows up. This noise amplification is a well-known phenomenon in parallel imaging.

There is an additional form of error that I want to mention because it is important to this talk, and that is residual aliasing. You cannot really see it in these images, but you can imagine that, owing to the limitations of our coils, the unaliasing performed by the GRAPPA kernels is not going to be exact: if our coils do not have enough spatial variability to replicate the high-frequency complex exponentials we need, we cannot synthesize the frequency shifts in k-space.

Now, we know the noise here is not sparse, and therefore promoting sparsity can denoise our images. Compressed sensing can also undo incoherent aliasing: if we have aliasing like what we saw before, but arising from random undersampling, compressed sensing can undo that as well. That motivates using SpRING to improve this GRAPPA result using compressed sensing, or sparsity.

So we consider two general undersampling frameworks, among many possible others. Uniform undersampling is motivated by the fact that the prototypical GRAPPA method expects uniformly undersampled data, which it can unalias with up to P coils: basically, if you have P coils, GRAPPA can in theory unalias up to an acceleration factor of R = P. Beyond that, however, you get coherent aliasing artifacts.
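As a small illustration of why the two patterns behave so differently, here is a numpy sketch contrasting them. The matrix size, the acceleration factor, and the Gaussian test object are assumptions, and real compressed sensing acquisitions usually use variable-density rather than purely uniform-random line selection.

```python
import numpy as np

rng = np.random.default_rng(1)
ny, nx, R = 128, 128, 4

# Uniform pattern: keep every R-th phase-encode line.
uniform = np.zeros((ny, nx), bool)
uniform[::R, :] = True

# Random pattern: same number of lines, chosen at random.
random_mask = np.zeros((ny, nx), bool)
random_mask[rng.choice(ny, size=ny // R, replace=False), :] = True

# Gaussian test object and its k-space.
yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
img = np.exp(-(((yy - 64) / 12.0) ** 2 + ((xx - 64) / 20.0) ** 2))
ksp = np.fft.fft2(img)

# Zero-filled reconstructions: uniform skipping folds the object into R
# coherent, shifted replicas; random skipping smears the aliasing into
# incoherent, noise-like interference that sparsity can attack.
zf_uniform = np.fft.ifft2(ksp * uniform)
zf_random = np.fft.ifft2(ksp * random_mask)
```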
Now, if we shift to random undersampling, the application of GRAPPA is not so straightforward, and we will address that shortly; but we can still, in theory, unalias frequency shifts of up to P. Of course, with random undersampling we may have regions that are very sparsely sampled, for which P coils may not be sufficient; the shifts that cannot be reconstructed exactly typically end up as the non-sparse, incoherent part of the residual.

As for compressed sensing: if we apply it with uniform undersampling, we are violating one of the basic requirements of compressed sensing, at least in theory, so we get very limited unaliasing for that reason. However, compressed sensing, or rather regularizing with sparsity, is still very capable of denoising images. If instead we combine compressed sensing with random undersampling, we get the resolution of aliasing predicted by the theoretical bounds, and so on. So the question becomes: with SpRING applied to these different sampling patterns, what gives us the best combination of unaliasing and denoising for our application?

Just to be complete, I want to mention that various other kinds of sampling patterns exist, and we could explore those as well: jittered undersampling; Poisson-disc sampling, which is a favorite in ℓ1-SPIRiT, a well-known state-of-the-art method; and all sorts of other combinations. The list goes on.

Now, the SpRING method, as a general overview: essentially, it penalizes infidelity to the GRAPPA solution (the first part of the equation below) jointly with the sparsity of the solution (the second part of the equation below), while preserving the acquired data. The reason we preserve our acquired data, rather than taking, say, a Bayesian view of things, is that we do not want to remove any information or somehow clean up our data in a way that is not really justified; therefore we only modify the unacquired k-space. The optimization problem here is posed over the full set of k-space, but since we preserve the acquired samples and only actually fill in the missing k-space, we can use that to simplify the constrained optimization problem into an unconstrained one over the nullspace of the matrix K.

A little additional notation: we have our data d and our GRAPPA solution G(d), which can be precomputed ahead of time since it depends only on the data; we have some low-resolution coil-combination weights w, which we use in a somewhat heuristic way to weight the GRAPPA fidelity term; and we have a tuning parameter that we set based on our confidence in the GRAPPA result. We also have a joint sparsity penalty function, which could be the ℓ1 norm or something else that approximates the ℓ0 penalty. In our analysis, since we know the ℓ0 penalty is very hard to compute with, we either use the convex ℓ1 norm, which has nice properties, or, to get a bit closer to the ℓ0 penalty, use homotopy continuation with a non-convex penalty function such as the Cauchy penalty, which increases logarithmically with the magnitude of each coefficient of our sparse transform representation.

Finally, we enforce joint sparsity over all the coils by applying the penalty function, whatever it is, not to the individual coefficients but to the ℓ2 norm of each coefficient taken across all the coils, so we actually have a hybrid penalty function. By applying this, we essentially enforce joint sparsity across the coils, so that the sparsity is attributed to the object being scanned and not to some artifact of the coil sensitivities, say, not having enough signal in an area.
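A short numpy sketch of that hybrid penalty may help. The FFT below merely stands in for the sparsifying transform (the talk uses a four-level 9/7 DWT), and the Cauchy form log(1 + t²/σ²) is one common parameterization, assumed here rather than taken from the talk.

```python
import numpy as np

def joint_sparsity_penalty(coil_images, penalty="l1", sigma=1.0):
    """Apply a scalar penalty to the across-coil l2 norm of each coefficient.

    coil_images: complex array of shape (ncoil, ny, nx).
    The 2-D FFT is a placeholder for the wavelet transform.
    """
    coeffs = np.fft.fft2(coil_images, axes=(-2, -1))
    mag = np.sqrt((np.abs(coeffs) ** 2).sum(axis=0))  # l2 across coils
    if penalty == "l1":
        return mag.sum()                  # convex case: the mixed l2,1 norm
    # Cauchy-type penalty: logarithmic growth, closer to l0 than l1.
    return np.log1p((mag / sigma) ** 2).sum()
```

Because the penalty acts on the joint magnitude, a coefficient is only cheap when it is small in every coil at once, which is exactly what pushes the remaining sparsity onto the object rather than onto individual-coil effects.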
Now, various strategies exist for extending the SpRING method, which was originally proposed for the uniform undersampling case; essentially, we extend the GRAPPA method and then use SpRING as before. This has been done with iterative methods: approaches such as iterative GRAPPA, or related methods like SPIRiT, can handle arbitrary sampling patterns, but they are iterative in nature and therefore computationally intensive. Direct methods already exist for radial and spiral trajectories, and we draw motivation from those to extend GRAPPA to arbitrary Cartesian undersampling using a direct method with local kernels: for each block of k-space we have separate GRAPPA kernels, derived from the ACS data. Because this is a direct method, we can still compute the GRAPPA result G(d) ahead of time and use it throughout our SpRING computations, so we are not really increasing the computational complexity of the SpRING method. However, because it is direct, and we are not folding the compressed sensing into the GRAPPA reconstruction as it happens, it may not be quite as robust as the iterative methods; for this demonstration, we were willing to accept that trade-off.

We divide k-space into a set of blocks of some average acceleration size, choosing R_y and R_z here appropriately, and we reconstruct each block with a kernel defined for its local sampling pattern. For instance, if we look at the central block here, we reconstruct it using only the central block and its neighbors; you could imagine using many more neighbors. If we take the point marked with a red X as the target we want to reconstruct, we can look for the same sampling pattern in the ACS region: we essentially use that pattern as a template and slide it across the ACS region to collect the fits needed to calibrate the kernel. You can see that demonstrated here; and we can also take another target point, the different red X there on the left, and it likewise yields another set of fits.

Now I will show some reconstructions, to evaluate uniform against random undersampling. First, we turn to the extremely sparse yet popular Shepp-Logan phantom, and we "multi-channel-ize" it, if that is a word. Essentially, we construct each channel's Shepp-Logan phantom using Biot-Savart-law-based B1 sensitivities, and we add noise: we simulate the noise covariance using the measure of noise covariance proposed in Roemer's 1990 paper, and we set the white noise level at minus thirty dB. We then apply the SpRING method using both the ℓ1 norm and the Cauchy penalty, with both uniform and random undersampling.
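For the noise step, a standard recipe is to color white complex Gaussian noise with a Cholesky factor of the coil noise covariance; the sketch below does this with an arbitrary toy covariance Psi, whose values are assumptions and not the measured covariance from the talk.

```python
import numpy as np

rng = np.random.default_rng(2)
ncoil, ny, nx = 8, 64, 64
ksp = np.zeros((ncoil, ny, nx), complex)   # stand-in for the simulated k-space

# Toy coil noise covariance: equal variances plus mild inter-coil coupling.
Psi = 0.1 * np.eye(ncoil) + 0.02
L = np.linalg.cholesky(Psi)                # Psi = L @ L.conj().T

# Unit-variance complex white noise, then colored so its covariance is Psi.
white = (rng.standard_normal((ncoil, ny * nx)) +
         1j * rng.standard_normal((ncoil, ny * nx))) / np.sqrt(2)
noisy_ksp = ksp + (L @ white).reshape(ncoil, ny, nx)
```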
Here I am showing what the Shepp-Logan phantom looks like before we make it multichannel, and what the multichannel version looks like before undersampling. Now, if we undersample uniformly and take the GRAPPA result, this is what an eight-channel coil gives us when accelerating by a factor of approximately sixteen, in addition to the ACS data. We essentially know a priori that GRAPPA by itself is not going to work, and indeed we see significant aliasing artifacts. However, applying SpRING, with uniform undersampling mind you, we are able to balance the GRAPPA result with some additional sparse regularization and actually get rid of a number of those coherent artifacts, just by virtue of the fact that the Shepp-Logan phantom is so sparse. By using the Cauchy penalty function, which promotes sparsity more strongly than the ℓ1 norm by itself, we can further reduce the coherent artifacts, and we actually get a reasonably good-looking image.

Using random undersampling improves things a little bit further on the SpRING side. However, if we look at the GRAPPA result, the incoherent artifacts make it visually very unpleasing; still, even with all that noise, adding the regularization, i.e., SpRING, does help matters, and you can see that again with the Cauchy penalty we get a high-quality image.

Now let us turn to real data. We acquired 32-channel T1-weighted MPRAGE brain data at 3 Tesla, and we extracted a small subset of the coils from it, really to bring out the aliasing. When we did the reconstructions, you can see that aliasing still remained in the SpRING reconstruction with uniform undersampling. With random undersampling, those coherent artifacts are mostly gone, but we are left with noise and blurring, because we had to turn the acceleration up so much, to twenty-five, which is very aggressive.

In conclusion, I want to point out that for the artificial Shepp-Logan phantom, the Cauchy penalty function was effective at both denoising and undoing the aliasing, even with uniform undersampling. For real data, however, sparsity with uniform undersampling does not really mitigate the aliasing that much; it mainly denoises the image. Random undersampling does increase the ability of compressed sensing to resolve aliasing in most sparse real images, which matches the intuition. However, we have to consider whether aliasing is really significant in real images, and I want to show one more slide on that, if that's okay. Basically, we go back and use all thirty-two channels of the data, with a more reasonable but still aggressive undersampling, and we see that the residual error is mainly noise amplification, not really signal, at this level. So SpRING, even with uniform undersampling, is fairly helpful, and we do not really see that much additional improvement from random undersampling.

I would like to acknowledge my collaborators and funding, and also Fa-Hsuan Lin for providing the B1 simulator, which you can download. Thank you very much. Any questions? Yes.

[Audience question] Did you take into account the RF or bias-field artifacts? Did you explicitly model them somewhere and try to deal with them, or not?

Okay, so we are working at 3 T, where the bias is not as significant as what we see at 7 T, for example. So for the most part we ignored both types of artifacts in the model, and we used a set of coils with very good performance, both B1+ and B1-, so our results were very clean to start with. If you compare against our original image, after just some basic normalization to take care of the attenuation in the middle, we were able to deal with all of that simply by using a high-quality coil. But we would expect that if our coils were not quite as good, we would see those kinds of artifacts, and we would have to adapt our method to account for them.

[Session chair] Thank you; to keep to time, we will move to the...