If you take a TC such as the Image, Video, and Multidimensional Signal Processing TC, which is extremely broad, with a very wide set of interests, then trying to summarize all of the trends in such a broad area in the course of one session is an extremely challenging problem, and we would need to find somebody capable of doing it. So instead, what we chose to do is to sample some of the specific trends, tell you about those, and hopefully leave time for discussion at the end. We have two experts who will be speaking on behalf of specific areas within the TC. Obviously they cannot represent everything in the TC, but hopefully they represent a good portion of it. One of them is going to be talking on the multidimensional signal processing side, and the other is going to be talking on the image and video processing side. As session chair, I get to moderate.

Before we begin, I would like to start with some global trends that we see around us. On the application side, we see a lot more on the web and mobile side: people like to do things more and more in the cloud; people like to do augmented reality on their cell phones, using all sorts of processing; there is a big use of streaming video, and this is already happening. We heard in the plenary only yesterday that video traffic has, for the first time, exceeded peer-to-peer file sharing traffic on the Internet, and 3D TV is becoming more and more commonplace. That is what is pushing on the consumer side. On the capture side, we are looking at computational imaging, moving computation as close to capture as possible: rather than simply sensing
the image you are interested in, you look at how to compute what you are interested in. On the methods and techniques side, one of the biggest trends that I see is that the boundaries between traditional disciplines have become increasingly blurred. We used to think of ourselves as an image processing community quite separate from a number of other communities; today the boundaries between image processing, computer vision, communications, psychology, and machine learning would be very hard to draw, and this is in addition to the traditional disciplines we have always drawn on, such as mathematics and statistics. With that brief introduction, I will hand over to our first speaker, who will begin by talking about trends in multidimensional signal processing.

Thank you, and thanks for that introduction. I would agree with the previous comments that many of the trends we see, across not just this TC but the entire Signal Processing Society, involve a kind of convergence of what used to be separate disciplines, and also a pushing of computation further back toward the sensor, a kind of convergence of sensing and computation. What I am going to focus on for the next few minutes is specifically the multidimensional aspect of this. The one-line takeaway from this talk is that big data is finally here, the compute resources to work on big data are here, and it is time for us on the signal processing side to get in the game. That is the story, and you can see that the background of this first slide is a representation of the big data that I will talk about in a minute. Here is a fun exercise to think about while I tell you a little about what I am going to say. There have been a lot of water analogies and metaphors in the news in recent times, and
I have put a few of them on this slide. The question I want to pose for you is this: when it comes to vast quantities of data, and our job as signal processors of extracting the relevant, salient information from those vast quantities of data, are we swimming or are we drowning? Are we drinking or are we sinking? You can match these quotes. "Swimming in sensors and drowning in data" was a quote from a recently retired US Air Force general, whose basic complaint was that every time another UAV flies with more cameras on it, we get more data, but we do not necessarily get more information. A global flood of information was referenced by yesterday morning's plenary speaker. And the last quote, of course, is easy; it is a famous line from the English classic, the Rime of the Ancient Mariner. We have to ask ourselves whether this really is a case of "water, water, every where, nor any drop to drink": in other words, how do we take these massive quantities of data and exploit them in a useful way for various signal processing purposes?

The other thing it is worth saying a word about is this word "cloud". What is the cloud? I meant to put in a graphic of a cloud, but it is just water vapor anyway, right? So what is the story? The cloud is a convenient shorthand for data-intensive computing, and what that means is: how do we think about scaling up traditional signal processing algorithms from a single core, or from a given set of cores arranged into a cluster, to really touch data whose size surpasses terabytes? If you want to know much more about the cloud than that, then just earlier this week you would have seen someone from Facebook telling you precisely how some of this exciting work is being done. That leads me to my next point, which is evidence for the trend: it is fine for me to stand up here and talk about this, but where is the real evidence that this is an emerging trend?
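The "single core to a cluster" scaling idea above can be sketched in miniature as a map-reduce-style computation. This is a hypothetical illustration, not anything from the talk: the chunking scheme and the function names are mine, and threads stand in for cluster workers.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def partial_energy(chunk):
    # "Map" step: each worker computes the energy of its own chunk.
    return float(np.sum(chunk.astype(np.float64) ** 2))

def signal_energy_parallel(signal, n_workers=4):
    # Scatter the signal across workers, then "reduce" the partial sums.
    # On a real cluster the chunks would live on different machines.
    chunks = np.array_split(np.asarray(signal), n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(partial_energy, chunks)
    return sum(partials)
```

Because energy is a plain sum, the per-chunk results combine exactly; algorithms whose statistics decompose this way are the easy case for data-intensive scaling.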
Take a look at ICASSP last year: we had a tutorial on user dynamics of social networks, an entire special issue devoted to social networking, and a special session on signal processing for graphs. To me this is one manifestation of the broad trend that I have put at the top of the slide, which is to say that as data rates increase and outpace our ability to make sense of them, the task of extracting low-dimensional structure from high-dimensional data becomes ever more important. Everything fits under this rubric, from compressed sensing all the way back to principal components analysis. That is fitting because, as was mentioned, this field is increasingly touching on psychology, and it was the psychologists who came up with latent factor analysis in the first place. This year's ICASSP sees even more of the same: several of the tutorials touched on big data; we had one plenary yesterday, and we have another plenary on Bayesian nonparametrics, a very nice and scalable class of techniques for treating very large data corpora; and we have a special session this year on low-dimensional structure in high-dimensional data. So my question for you is: what are 2012 and beyond going to look like? Are we going to sink or swim? It is up to us. The data are already here, the computational resources are coming online, and we have to ask ourselves how we look at these trends taking place across disciplines and form some common signal processing framework around them. That is my challenge to you.

I have thought about this problem for a long time, and I have watched, and tried to take part in, some of these changes as they have happened, and this is my assessment of what is happening: we are converging around graphs, graph representations, as a kind of common framework.
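As a concrete miniature of "extracting low-dimensional structure from high-dimensional data", here is a minimal PCA sketch via the SVD. The synthetic data set (a noisy one-dimensional line embedded in ten dimensions) is invented purely for illustration.

```python
import numpy as np

def pca_embed(X, k):
    # Center the data and project it onto the top-k principal directions.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = axes
    return Xc @ Vt[:k].T, s

# 500 points scattered near a 1-D line embedded in 10-D space.
rng = np.random.default_rng(0)
t = rng.normal(size=(500, 1))
direction = rng.normal(size=(1, 10))
X = t @ direction + 0.01 * rng.normal(size=(500, 10))

Y, s = pca_embed(X, k=1)
# Nearly all the variance is captured by the first singular value,
# revealing the one-dimensional structure hidden in ten dimensions.
```

The ratio of the first singular value to the sum of all of them is a quick check on how low-dimensional the data really are.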
There are two key points. The first is that graph representations are a very handy way to think about data that are very high-dimensional but simultaneously very sparse: covariance or correlation structure among protein expressions in gene networks, word co-occurrence in documents, and I could draw examples from the other technical committees, speech processing for example, et cetera. The other key point is that they give us a common framework into which we can pack both traditional structured data (signals, images, speech, text, and so forth) and unstructured things like documents or collections of multimedia data; all of this can be put under one common framework. What I have shown at the bottom is the basic duality between writing down a picture of a graph, where you have nodes and you have edges, and turning that into a matrix-variate structure: the nodes become rows and columns of a matrix, and the edges become nonzero entries in the adjacency structure. There is very little truly new under the sun: if you look to the left and the right of this diagram, you will see the famous example of unrolling the Swiss roll. If you remember, this is an Isomap, or nonlinear PCA, example in which a graph representation plays a fundamental role. Taking a high-dimensional nonlinear structure, a plane that has been embedded in three-dimensional space, and unwrapping it into something flat requires building a graph on the set of points, sparsifying the graph, computing shortest paths across the graph, and then unrolling the whole thing into something flat. So you can start with structured data and end up with structured data, but you may well ask where the graphs are in the middle. Or you can start with purely unstructured data, like collections of text documents, and look for co-occurrences of words and other patterns.
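The Swiss-roll pipeline just described (build a graph on the points, sparsify it, compute shortest paths, flatten) can be sketched roughly as follows, assuming NumPy and SciPy are available; the neighbor count and sample size are illustrative choices, not values from the talk. Note that the sparse graph is exactly the node-and-edge-to-matrix duality from the slide: points become rows and columns, and each neighbor edge becomes a nonzero entry.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path
from scipy.spatial import cKDTree

def isomap(points, n_neighbors=12, n_components=2):
    n = len(points)
    # 1) Sparsify: keep only each point's k nearest neighbors as edges.
    dist, idx = cKDTree(points).query(points, k=n_neighbors + 1)
    rows = np.repeat(np.arange(n), n_neighbors)
    graph = csr_matrix((dist[:, 1:].ravel(), (rows, idx[:, 1:].ravel())),
                       shape=(n, n))  # nodes = rows/cols, edges = nonzeros
    # 2) Geodesic distances = shortest paths across the sparse graph.
    D = shortest_path(graph, directed=False)
    # 3) "Unroll" with classical MDS on the geodesic distances.
    J = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    B = -0.5 * J @ (D ** 2) @ J             # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:n_components]
    return V[:, top] * np.sqrt(np.maximum(w[top], 0.0))

# A sampled Swiss roll: a 2-D sheet rolled up in 3-D space.
rng = np.random.default_rng(1)
t = 1.5 * np.pi * (1.0 + 2.0 * rng.random(400))
y = 10.0 * rng.random(400)
roll = np.column_stack([t * np.cos(t), y, t * np.sin(t)])
flat = isomap(roll)  # 400 x 2: the sheet laid out flat
```

The Euclidean straight-line distance between two points on opposite turns of the roll is short, but the geodesic distance through the graph is long; that is why the graph in the middle is what makes the unrolling possible.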
So there is strong evidence that a key point of this framework is its ability to bring together structured and unstructured data in a common way; that is what makes it so handy. Once I turn my underlying objects into graph representations, I can compute with them according to the various rules of linear algebra. In the example that I have shown here, dealing with images from Flickr, face recognition apps are already being built that leverage the graph structure induced by these images: co-occurrence in photographs is predictive of ties in a social network such as Facebook. And I think it is clear how extensible these things are; for the reasons cited by the various communities, many of which were mentioned earlier, everyone is quietly coalescing around this type of representation.

So I will leave you with some very basic challenge problems; these are, if not the big three, then three of the biggest, to my mind. The first is the physics, the phenomenology: we do not really understand very well how massive graphs behave and what they look like, whether they are social media data or graphs derived from large corpora of images, and so on. We need to understand the physics in order to think about it in signal processing language. You cannot do detection and estimation theory for radar unless you understand the physics involved in a radar system, and I would argue that you cannot do graph-based signal processing unless you understand the driving phenomenology. That leads to the second point, which is that, basically, we do not have a theory.
Everything that you can answer by looking in an undergraduate statistics text or an entry-level signal processing text, matched filtering, analysis of variance, least squares and so on, we do not know for graphs. There is not the same natural vector space structure as there is when you talk about signals and images in Euclidean space, so we need the fundamental theories. When you look back at what the theory of information looked like before Shannon came along and unified it, it was spread out and decoupled; we need a similarly unifying theoretical framework to understand the fundamental limits of graph-based signal processing, detection, and estimation. The last one is this: if you step back and look at what we are really doing, we are adding this correlate of context, the structural context, on top of traditional signal processing content that we already, to a large extent, know how to deal with, and putting together context and content is, I think, another challenge for us. If you ask yourself what a general theory of signal processing for graphs would even look like, the answers are really not clear, but I have left you with a couple of examples at the bottom of the ways in which newly discovered mathematics can wait behind the scenes for quite a long time until it suddenly pops up and becomes relevant. A couple of past examples: Boolean algebra waited quite a long time before the advent of digital logic, and random matrix theory, the subject of another tutorial at this conference, had a huge impact on wireless communications in the late nineties and early two-thousands. So it is an open question which mathematical advances are going to drive signal processing frameworks for network data, and I urge you to consider taking up the challenge. That's it; thanks very much. Now I will turn it back over to our moderator. Should we do questions now, or leave the questions to the end?
Maybe the best approach would be this: if people have questions at this point, we can take some feedback now, and depending on how long things go, we can decide where to cut. So, is there a question people would like to ask? Maybe I will begin with one. What do you see in terms of education? You mentioned that the field has become very data-driven; what is your view on how we teach and educate the next generation of researchers and students in this area?

That is a very good point. The good news is that the data are here and the compute infrastructure is also coming online, so many of us at educational institutions probably have access to pretty good compute resources at this point. So I think the positive is that we can give our students access to these kinds of data early, in the course of their undergraduate education. The flip side, and this is something I did not talk about at all, but it is a very interesting area to get into, is the set of issues around privacy and security. I mentioned Facebook; I could have mentioned Twitter, and so on. Some of the most interesting social media data sets are also the ones we need to be very careful about in terms of privacy concerns, and at the policy level, in the United States at least, the National Science Foundation has been very involved right now in trying to figure out how to handle this. In other words, if I write a research grant application to do signal processing education on some kind of corpus of Facebook data, is that all right? Should the NSF even be funding that? What are the human-subjects requirements for experimenting with such data? And so on. For the moment, all we have managed to do is collect data where people have given consent and know that data are being collected about them. So, for instance, students at the MIT business school agreed to carry cellular phones with the understanding
that their proximity to other people in the study group would be measured. Those data are actually available from MIT; you can go online and use them; they are called the Reality Mining data. So this is very interesting, but it may help us understand the phenomenology only of the subset of people who choose to go to the MIT business school, and it is hard to know whether that is the same as the more general population. So I think that is one of the big challenges: there is a real opportunity to get people exposed to these kinds of data early in their education, but we have to work through some of these issues about data privacy, and you can read about this in the papers almost every day.

Thank you. Are there other questions for this speaker? If not, let us move on to our second speaker, on image and video processing.

Thank you. I am going to talk about 3D video processing, along the whole chain from capture to display. As a bit of background on how applications have evolved: in the nineteen-thirties we had black-and-white television; then we moved to color, at somewhat poor resolution; then to high-definition resolution with HDTV; and all of that was for 2D. Now we would like to build on the experience we have with 2D video, in the temporal and spatial dimensions, where we have a lot of expertise, and extend it to 3D. At the same time, there has been a growing move to small devices, which give us mobility but also limited resources. I can still spend two hours in front of my TV with professionally captured content, but increasingly users also want to capture content themselves and view it on their own devices, and for the ordinary user that can be a problem, because consumer equipment does not have the convenience of a professional studio. From our own experience, when we capture stereo content we have to set up and synchronize the cameras, and it takes time and care, because you easily end up with some mismatch between the views.
So we have the whole chain: we capture, we process and compute, and we go to the display. To name the main areas: on the capture side, for example, what type of cameras do we use and how do we use them; on the representation side, how do we represent the scene, whether as a few views, views plus depth, or whatever information is needed; and we need efficient solutions, certainly efficient compression solutions. Problems arise all along the chain: if the capture is not good, we have to correct it; the channel introduces its own impairments; and at the display, for example in a projection system, the left and right views can be misaligned, so corrections are needed there as well. Compared with conventional 2D video, there are extra dimensions to deal with, and we have to handle them with limited resources: the bandwidth is limited, so if people want 3D, it cannot come at a much higher rate. So some of the things we need are, first, 2D-to-3D conversion: there is a lot of existing 2D content, and a common complaint concerns not the systems but the lack of 3D content. If we look at a 2D sequence, we can try to extract from it the depth information of the 3D scene: we can look at how it was captured and use cues such as structure from motion, trying to extract depth from the motion, and occlusion cues as well. I will come back to the question of specific methods in a moment.
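For orientation, the stereo geometry underlying these depth cues is the classic pinhole relation Z = f·B/d, relating pixel disparity d to depth Z through the focal length f and the camera baseline B. The numbers in this sketch describe a hypothetical rig, not anything from the talk.

```python
def depth_from_disparity(d_pixels, focal_px, baseline_m):
    # Classic pinhole-stereo relation: Z = f * B / d.
    # Larger disparity means the point is closer to the cameras.
    return focal_px * baseline_m / d_pixels

# Hypothetical rig: 1000-pixel focal length, 6.5 cm baseline.
near = depth_from_disparity(20.0, 1000.0, 0.065)  # 3.25 m
far = depth_from_disparity(5.0, 1000.0, 0.065)    # 13.0 m
```

The inverse relation is why disparity errors matter most for distant points: a one-pixel mistake shifts far depths much more than near ones.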
For 2D-to-3D conversion, then, we need that depth information to be able to construct additional views. As I mentioned, if we have a couple of views, we can estimate the disparity between them. We can use global methods, which optimize a cost function over the whole image to compute the disparity, or local ones, which rely only on local evidence; the local methods are cheaper but limited in accuracy, so there is a trade-off between the accuracy of the disparity and the computational cost, and work goes into improving either one or the other. Once we go from the views to the depth map, the next topic is how we use it in view synthesis. In depth-image-based rendering, we use the depth information together with the image to construct views that were never captured, and there are other techniques as well, for example image-warping-based ones, where we have a warping function that we use to generate the views. We would also like efficient representations for transmission, including extensions of the existing coding standards, so that 3D video can be coded in a backward-compatible way. For example, to deliver a stereo sequence, the left view and the right view, we could code the two views as one sequence, or we could use a frame-compatible format, where you pack the two views into a single frame. But then the problem is that you have to down-sample, which is not an ideal solution, and when you display, you need a good interpolation method to get back to the original resolution.
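Returning to disparity estimation: a local method of the kind contrasted above with global optimization might look like this block-matching sketch, which compares a small window of the left image against horizontally shifted windows in the right image. The window size and search range are illustrative choices, and NumPy is assumed.

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=16):
    # Local stereo matching: for each block of the left image, slide
    # along the same row of the right image and keep the horizontal
    # shift (disparity) with the smallest sum of absolute differences.
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(ref.astype(int) - cand.astype(int)).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Each pixel's decision here is purely local; a global method would instead add a smoothness term over the whole disparity field and optimize everything jointly, at a higher computational cost.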
One can also use information from the other view and compensate for the mismatch, to recover good quality. And on the 2D-to-3D side, the converted views likewise need to be created with good enough quality to be usable. Finally, quality assessment: 3D video quality is not yet well understood, so we have to start with subjective experiments to establish measures of 3D video quality, and only then can we move to objective quality metrics. Before we move to objective metrics, we need good subjective scores as the base on which the metrics are built and validated; otherwise we cannot tell whether a metric is good, and a metric that does not agree with what viewers actually perceive will not help us fix anything. If you want to do some further reading, there is a special issue of the Proceedings of the IEEE on 3D displays. Thank you.

Thank you. We have run a little over time, but we can still allow one or two interesting questions from the audience. If anybody has questions, please use one of the microphones.

Yes, a question for the second speaker. You mostly discussed camera-based capture; what about using other types of modalities to create 3D, for example the Kinect, those types of devices that use structured light or something similar, fused
to create three-dimensional representations?

So the question is about other modalities for capturing 3D representations, such as the Kinect, which uses structured light. Yes, those kinds of depth sensors are certainly another way to obtain the depth information for 3D content, and much of the processing chain I described applies to that kind of input as well.

Thank you. In that case, in the interest of letting everyone get to the lunch break, let us stop here and thank the panelists again.