0:00:13 And I guess if you take a TC such as the Image, Video, and Multidimensional Signal Processing TC, that's a TC which is just extremely broad, with a very wide set of interests. So trying to sum up what the trends are in such a broad area, within the course of half an hour, would be an extremely challenging problem; you'd need to find somebody more capable of doing it.
0:00:35 So instead, what we decided to do is try and sample some of the specific trends in each area, talk about those, and hopefully leave time for discussion and questions at the end.
0:00:53 Okay.
0:00:53 So we have two experts who will be speaking on behalf of specific areas of the TC. Obviously they cannot present everything in the TC, but hopefully they can present a snapshot of the TC.
0:01:06 One of them is going to be talking on the multidimensional signal processing side, and the other is going to be talking on the image and video processing side.
0:01:15 Okay. I'm currently shown as the TC chair, so I get to play the moderator here.
0:01:20 Before we begin, I'd like to start with some global trends that we see around us, as a sampling of the trends crossing the scene, which will be more inclusive than what we can cover in the course of half an hour.
0:01:32 On the application side, we see a lot more on the web and the social side; people do things more and more in the cloud. People like to do augmented reality on their cell phones, offloading part of the processing.
0:01:46 There is a big use of streaming video; this is already happening. We heard in the plenary talk yesterday that video traffic, for the first time, has exceeded peer-to-peer traffic in the United States. And beyond people sharing video over peer-to-peer networks, video telemeetings are becoming more and more commonplace. That's what's pushing on the consumer side.
0:02:13 And on the capture side, people are looking at computational, task-based capture and imaging: rather than having sensors which directly give you the image you are interested in, you look at how you compute what you are interested in.
0:02:24 On the methods and techniques side, one of the biggest trends which I see is that the boundaries between traditional disciplines have become increasingly blurred. We used to think of ourselves as an image processing community quite separate from a number of other communities; today you have people for whom the boundaries between image processing, computer vision, communications, psychology, machine learning, and AI would be very hard to draw at all. And this is in addition to the traditional disciplines we already work with, such as mathematics and statistics.
0:02:54 So with that brief introduction, I'll give the floor to the speakers, and we begin with talking about trends in multidimensional signal processing.
0:03:14 Thanks for that introduction, and welcome, everybody. I would agree with Gaurav's comments that many of the trends that we see cut across not just our TC but TCs spanning the entire Signal Processing Society. They involve a kind of convergence of what used to be separate disciplines, and also pushing computation further forward toward the sensor: a kind of convergence of sensing and computation.
0:03:42 So what I'm going to focus on, for the purpose of the next few minutes, is specifically the multidimensional aspect of this. And a kind of one-line takeaway from this talk is that
0:03:56 big data is finally here, the compute resources to work on big data are here, and it's time for us on the signal processing side to get in the game. Okay, that's the story.
0:04:07 And you can see that in the background of this first slide is some representation of big data that I'll talk about in a minute.
0:04:17 So here's a fun exercise to think about while I tell you a little bit about what I'm going to say. There have been a lot of water analogies and metaphors in the news in recent times, and I've put a few of them on the slide.
0:04:34 So the question I want to pose for you is: when it comes to vast quantities of data, and our job as signal processors is to extract from these vast quantities of data the relevant, salient information, are we swimming or are we drowning? Are we drinking or are we sinking?
0:04:54 So you can match these quotes. "Swimming in sensors and drowning in data" was a quote from a recently retired US Air Force general. His basic complaint was that every time another UAV flies with more cameras on it, we get more data, but we don't necessarily get more information.
0:05:13 A global flood of information was referenced by yesterday morning's plenary speaker. And the last quote, of course, is easy: it's a famous line from the English classic "Rime of the Ancient Mariner," and we have to ask ourselves, really, is this a case of "water, water, everywhere, nor any drop to drink"? In other words, how do we take these massive quantities of data and exploit them in a useful way for various signal processing purposes?
0:05:39 Okay.
0:05:39 So the other thing, also, I might as well say a word about this word "cloud." What is the cloud? I meant to put in a graphic of a cloud, but you know, it's just sort of water vapour anyway, right? So what is the story? "Cloud" is kind of a convenient shorthand for data-intensive computing.
0:06:00 And what that is to say is: how do we think about scaling up traditional signal processing algorithms from a single core, or a given set of cores arranged into a cluster, to really touch data that's vaster, past the terabyte scale? Okay.
0:06:16 So you don't need to know much more about the cloud; but if you were at the plenary earlier this week, you will have seen someone from Facebook telling you precisely how some of this exciting work is being done.
0:06:28 So that leads me to my next point, which is evidence for the trend. It's fine for me to stand up here and talk about this, but where is the real evidence that this is an emerging trend?
0:06:39 So if you take a look at ICASSP last year: we had one tutorial on user dynamics of social networks, we had an entire special session devoted to social networking, and a special session on signal processing for graphs.
0:06:54 To me this is one manifestation of the kind of meta-trend that I've put at the top of the slide here, which is to say that as the data rates increase and outpace our ability to make sense of them, the theme of extracting low-dimensional structure from high-dimensional data becomes ever more important. Everything fits under this rubric, from compressed sensing all the way back to principal components analysis. That's kind of nice because, as Gaurav mentioned, this field is increasingly touching on psychology; it was the psychologists who came up with latent factor analysis in the first place, right?
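As a rough illustration of this extracting-low-dimensional-structure theme, here is a minimal PCA sketch, assuming synthetic data; the dimensions and noise level are made up for illustration.

```python
import numpy as np

# Synthetic high-dimensional data with hidden low-dimensional structure:
# 1000 observations in R^100 that really live near a 3-D subspace.
rng = np.random.default_rng(0)
latent = rng.standard_normal((1000, 3))            # hidden factors
loadings = rng.standard_normal((3, 100))           # mixing into 100 dimensions
X = latent @ loadings + 0.1 * rng.standard_normal((1000, 100))

# PCA via the SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# The singular-value spectrum reveals the structure: three large values,
# then a noise floor.
energy = s**2 / np.sum(s**2)
print("variance captured by top 3 components:", energy[:3].sum())

# Compact representation: project onto the top three principal directions.
X_reduced = Xc @ Vt[:3].T   # shape (1000, 3)
```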
0:07:30 So this year's ICASSP will see even more of the same. Several of the tutorials touched on big data; we had yesterday's plenary; we have another plenary on Bayesian nonparametrics, which are a very nice and scalable class of techniques for treating very large data corpora; and we have a special session this year on low-dimensional structure in high-dimensional data.
0:07:54 So my question for you is: what are twenty-twelve and beyond going to look like? Are we going to sink or swim? It's up to us. The data are already here, the computational resources are coming online, and we have to ask ourselves how we look at these trends that are taking place across disciplines and form some common signal processing framework around them. That's my challenge to you.
0:08:20 So, I've thought about this problem for a long time, and I've watched, and tried to take part in, some of these changes as they've happened. And this is what I think is happening; here's my assessment: we're converging around graphs, graph representations, as a kind of common framework. And there are two key points.
0:08:40 The first is that graph representations are a sort of handy way to think about data that are very high-dimensional but simultaneously very sparse. Okay.
0:08:50 So: covariance structures, or correlations among protein expressions; word co-occurrences in documents; and I could add examples from the other technical committees, speech processing for example, et cetera.
0:09:10 The other key point is that they give us a common framework into which we can put both sort of traditional structured data (signals, images, speech, text, and so forth) and kind of unstructured things like documents, or collections of stuff, like collections of multimedia data.
0:09:31 All of this can kind of be put under this common framework. So what I've shown at the bottom, in the box, is the basic kind of duality between writing down a picture of a graph, where you have nodes and you have edges, and turning that into a matrix-variate structure: the nodes become rows and columns in a matrix, and the edges become nonzero entries in the adjacency structure.
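A minimal sketch of that node/edge-to-matrix duality, using SciPy's sparse matrices; the toy graph is made up for illustration.

```python
import numpy as np
from scipy.sparse import csr_matrix

# A small undirected graph: nodes 0..4, edges as pairs of nodes.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
n = 5

# Duality: nodes index the rows and columns, edges become
# nonzero entries of the adjacency matrix.
rows, cols = zip(*edges)
A = csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(n, n))
A = A + A.T   # undirected graph: symmetric adjacency

# Once the graph is a matrix, ordinary linear algebra applies:
# (A @ A)[i, j] counts walks of length two from node i to node j.
print((A @ A).toarray())
```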
0:09:56 So there is very little really new under the sun, right? And if you look to the left and the right of this diagram, you'll see this famous example of unrolling the swiss roll. If you remember, this is an Isomap, or nonlinear PCA, example in which a graph representation plays a fundamental role.
0:10:19 Taking a high-dimensional nonlinear structure, like a plane that's been embedded in three-dimensional space, and unwrapping it into something that's flat: in this case that requires building a graph on the set of points, sparsifying the graph, computing shortest paths across the graph, and then unrolling the swiss roll into something flat. Right.
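A sketch of that pipeline using scikit-learn's Isomap, which packages exactly these steps (neighborhood graph, shortest paths, flat embedding); the sample size and neighbor count are arbitrary choices.

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Points sampled from a 2-D sheet rolled up in 3-D space.
X, t = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)

# Isomap: build a k-nearest-neighbor graph on the points (the
# sparsification), compute shortest-path distances across the graph,
# then embed those distances into the plane.
iso = Isomap(n_neighbors=10, n_components=2)
X_flat = iso.fit_transform(X)   # shape (1500, 2): the unrolled sheet
print(X_flat.shape)
```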
0:10:41 So you can start with structured data and end up with structured data, but you may well have graphs in the middle. Or you can talk about purely unstructured data, like collections of text and documents, and look for co-occurrences of words and other patterns. Right.
0:10:57 So I'd argue there is strong evidence that a key point of this framework is its ability to bring together structured and unstructured data within a common framework; that's what makes this sort of thing very handy. Once I've turned my underlying objects into graph representations, I can then compute with them according to the various rules of linear algebra.
0:11:17 And in the example that I've shown here, we're dealing with images from Flickr. Face recognition apps are already being built that leverage the graph structure in these images: whose co-occurrence in photographs is predictive of whose social network entries in something like Facebook. Okay.
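A toy sketch of the kind of co-occurrence graph such apps build; the photo metadata and names here are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Stand-in for photo metadata: each photo is the set of people tagged in it.
photos = [
    {"ana", "bob"},
    {"ana", "bob", "cruz"},
    {"bob", "cruz"},
    {"ana", "dia"},
]

# Weighted co-occurrence graph: edge (i, j) counts how often
# i and j appear in the same photograph.
cooccur = Counter()
for people in photos:
    for pair in combinations(sorted(people), 2):
        cooccur[pair] += 1

# High-weight edges are candidate social ties.
for (i, j), w in cooccur.most_common():
    print(f"{i} -- {j}: weight {w}")
```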
0:11:40 And it's clear, I think, to see how extensible these things are. So, for the reasons cited by the various communities (many of which Gaurav mentioned), people are kind of quietly coalescing around this type of representation.
0:11:53 So I'll leave you with some very basic challenge problems. I think these are, if not the big three, then three of the biggest, certainly in my view.
0:12:03 The first one is the physics, the phenomenology. We don't really understand very well yet the behavior of massive graphs: how massive graphs behave, what they look like, whether they are social media data or graphs derived from large corpora of images, and so on and so forth. We need to understand the physics to think about it in signal processing language. You can't do detection and estimation theory for radar unless you understand the physics involved in a radar system, and I would argue that you can't do graph-based signal processing unless you understand the driving phenomenology. Right.
0:12:40 So that leads to the second point, which is: basically, we don't have a theory. Everything that you can answer by looking in an undergraduate statistics or senior-level signal processing text, about matched filtering, analysis of variance, least squares, and so on and so forth: we don't know these things for graphs.
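For contrast, here is the classical Euclidean-domain version of one of those tools, a matched filter detecting a known template in white noise; the point of the remark is that no equally settled analogue exists for signals indexed by a graph. The template and threshold below are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known signal template in additive white Gaussian noise.
s = np.sin(2 * np.pi * 0.05 * np.arange(128))   # known template
x = s + rng.standard_normal(128)                # observation (signal present)

# Matched filter statistic: correlate observation with unit-norm template.
# For white Gaussian noise this is the likelihood-ratio detector.
T = x @ s / np.linalg.norm(s)
threshold = 3.0   # chosen for a small false-alarm probability
print("detect" if T > threshold else "no detect", round(T, 2))
```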
0:13:02 There's not the same natural vector space structure as there is when you talk about simple signals and images in Euclidean space. So we need the fundamental theories. I mean, when you look back at what the theory of information looked like before Shannon came along and unified it, it was sort of spread out and decoupled. We need a sort of unifying theoretical framework to understand the fundamental limits of graph-based signal processing, detection, and estimation.
0:13:40if you step back and look
0:13:41what we're really doing is were just adding this
0:13:44correlates of context or the structural context
0:13:48on top of
0:13:49traditional signal processing cards that we are ready to large extent don't how to deal
0:13:54and are getting together context and content i think is another
0:13:57another another challenge for us
0:13:59um
0:14:00 And if you ask yourself, well, what would a general theory of signal processing for graphs even look like, the answers are really not clear. But I've left you with a couple of examples at the bottom of the ways in which new or newly discovered mathematics can often kind of lurk behind the scenes for quite a long time, and then suddenly it pops up and becomes relevant. A couple of past examples: Boolean algebra waited quite a long time before the advent of digital logic; random matrix theory, which is the subject of another tutorial at this conference, had a huge impact on wireless communications in the late nineties and early two-thousands.
0:14:38 So it's an open question as to what mathematical advances are going to drive signal processing frameworks for network data, and I urge you to consider taking up the challenge. That's it, thanks very much, and now I'm going to turn it over to Gaurav.
0:14:57 Thank you, Patrick.
0:15:03 One option would be to defer questions to the end; the other would be to take questions now. Do people have any questions at this point? We could have some quick feedback now, and depending on how long things go, we can decide where to cut things short. So, is there a question that people would like to ask?
0:15:20 Maybe I'll begin with one. What do you see in terms of education? You mentioned that these changes hinge on very large data sets. What is your view in terms of how we teach and educate the next generation of researchers and students in this area?
0:15:43 So that's a very good point. The good news, as I said, is that the data are here, and the compute infrastructure is also coming online, so probably many of us at educational institutions have access to pretty good compute resources at this point. So I think the positives are that we can get our students access to these kinds of data sets early in the course of their undergraduate education.
0:16:09 The flip side, and this is something that I didn't talk about at all but is a very interesting area to get into, is issues about privacy and security. I mentioned things like Facebook; I could have mentioned Twitter and so on and so forth. Some of the most interesting social media data sets are also the same ones that we need to be very careful about in terms of privacy concerns.
0:16:32 And at the policy level, in the United States at least, the National Science Foundation has been very involved right now in trying to figure out how to handle that. In other words, if I write a research grant application to do signal processing education on some kind of corpus of Facebook data, you know, is that all right? Should the NSF even be funding that? What are the human subjects requirements to experiment with such data? And so on and so forth.
0:17:01as collected data
0:17:03when people sort of have you have and their can then know that there
0:17:07a a being collected on so for instance
0:17:10uh a student a mighty business school
0:17:13agree too
0:17:14you know you cellular phones with the understanding that
0:17:17uh
0:17:18their proximity to other people on the study group would be measured and these data are actually available from mit
0:17:24T
0:17:24uh uh you can go on and use them they're called the reality mining data
0:17:28right
0:17:28so this is a very interesting but
0:17:31my a us understand about better the phenomenology of the subset of people who choose to go to mit mighty
0:17:36in the school but it's hard to know all right if that's the same thing as as the more general
0:17:40population
0:17:42so i think that's that's one of the big challenges is of is
0:17:45there is a a real opportunity to get people exposed to these kind of data early and their education but
0:17:50we have to work through some of these issues about
0:17:53uh data privacy and uh you can read about this in the paper almost every day
0:17:58 Thanks.
0:17:59 Thank you. Do we have another question for this speaker? Alright, so then, Lena, over to you.
0:18:06 Okay, let me set things up here, because I want to bring up some slides. Thanks.
0:18:13 I am going to talk about 3D video processing, also along the lines of the data explosion.
0:18:21 So, for a bit of perspective on the application: video started in the nineteen-thirties with black-and-white, and then we went to color, with somewhat poor resolution; then the resolution improved, from SDTV to HDTV; and then 3D came into use.
0:18:39 If we look at how the viewing experience evolved, we used to improve the temporal and the spatial resolution; lately there has also been a growing move to small screens, small devices, which give the user mobility.
0:19:06 Still, what more can we do about the experience? I still sit a typical two hours in front of my TV.
0:19:20 And I would like it to be immersive. For the user to get that, the capture can be a problem because of the instruments: to give a good 3D experience we have to capture two or more views, and they have to be calibrated and synchronized, because if you have some mismatch between the views, it degrades the final quality.
0:20:04 And once we capture the views, we can process them and compute a good rendering, and we can go beyond the main camera positions: for example, I would like to be able to capture a scene and then render it back from whichever viewpoint the user likes.
0:20:35 So we need a format with which to represent the scene in all its dimensions: multiple views, or views plus depth, or whatever information is needed. And we need efficient solutions, but also functional solutions, for capture, representation, and delivery.
0:21:01 For example, if the geometry is not right, a bit off, you know, the projections, for example the projection to the left eye versus the right eye, cannot be fused properly in the head, and the quality score suffers.
0:21:53 There are two options for what I have been describing: you can transmit some of the information, or estimate what is not sent. Similarly, if I have only a conventional 2D camera, you can estimate the missing depth and do some spatial processing, but you have to pay for that in quality and in computation.
0:22:34 Then comes delivery: we need to compress and transmit over channels with limited resources, and 3D takes a lot of bandwidth. And users want 3D content, but most existing content is 2D, and not enough 3D is being produced. So one of the things that we need is 2D-to-3D conversion, ideally in real time.
0:23:03 And, as I said, there is also visual comfort to consider in 2D-to-3D conversion: viewers complain if the converted content is uncomfortable to watch, and that depends on the system as well as on the content itself.
0:23:34 If we look at the cues from which we can extract the 3D information, imagine we look at how the video was captured. I'm not sure if you can see the presentation well, but if we have motion, we can use structure from motion: we try to extract depth from the motion between frames. There are also occlusion cues, and other pictorial depth cues.
0:24:35 I will come back to the question of quality, but in 2D-to-3D conversion we take that depth information and use it to construct the additional views.
0:24:56 I mentioned that if we have, you know, a couple of views, we can measure the disparity between them, and there are local methods and global methods. The global methods optimize a global cost function to compute the disparity map; the local ones use local windows, and they trade off, you know, the accuracy of the result.
0:25:31that's not
0:25:33i
0:25:36good
0:25:38yeah
0:25:39though
0:25:40i the optimized in fact
0:25:42however
0:25:43i the trying to improve you the the either or the i Q
0:25:47one i'm going to one by i go back to an from the G
0:25:52to to the cup
0:25:54the
0:25:55the man
0:25:56one at the cost at the
0:25:59that's
0:25:59construct a but in the and so do not computation
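A minimal sketch of the local end of that spectrum: window-based block matching on a rectified stereo pair, with the block size and search range as arbitrary parameters. A global method would instead minimize a single cost (data term plus smoothness term) over the whole image, which is the accuracy-versus-computation tradeoff just described.

```python
import numpy as np

def block_match_disparity(left, right, block=7, max_disp=32):
    """Local disparity estimation: for each window in the left view, find
    the horizontal shift into the right view minimizing the sum of
    absolute differences. Assumes a rectified pair (matches share a row)."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1]
            costs = [np.abs(patch - right[y-half:y+half+1,
                                          x-d-half:x-d+half+1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))
    return disp
```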
0:26:04 The computed depth is then used in the projection. On this topic, I mentioned that once we have the depth image, we use that information to construct, to render, the other view. There are also other techniques, ones using image-warping-based approaches, where we project the pixels with a warping function that we use to generate the views.
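A minimal sketch of such a warping function for depth-image-based rendering, assuming a purely horizontal camera shift; the disparity is derived from depth as focal * baseline / depth, and occlusions are handled only crudely here.

```python
import numpy as np

def warp_view(image, depth, baseline, focal):
    """Synthesize a horizontally shifted virtual view: move each pixel by
    its disparity d = focal * baseline / depth. Pixels mapping outside the
    frame are dropped; uncovered regions stay zero (real systems inpaint
    these holes and resolve overlaps with z-buffering)."""
    h, w = image.shape
    out = np.zeros_like(image)
    disparity = np.round(focal * baseline / depth).astype(int)
    for y in range(h):
        for x in range(w):
            xs = x + disparity[y, x]
            if 0 <= xs < w:
                out[y, xs] = image[y, x]
    return out
```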
0:26:44 Coming to standards: simulcast does not exploit the redundancy between views, and we would like an efficient coding method that does, including the multiview extension of the existing coding standards. To code and deliver 3D video, we have formats that we have to make backward compatible. To do this, you could code each view as its own sequence, left and right as two sequences, or you could use the frame-compatible format, where you pack both views into a single frame.
0:27:29 But the problem is that you have to down-sample each view, losing resolution, and then, when you want to display, you have to up-sample back with an interpolation method.
0:27:40 Alternatively, you can use the frame-compatible format for the base layer and send additional information on top of it, so the decoder can get back to the original resolution: you compensate for the down-sampling and recover good quality.
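A sketch of the side-by-side flavor of frame-compatible packing just described; the decimation here is naive (no anti-alias pre-filter) and the up-sampling is simple pixel repetition, both stand-ins for the real filters.

```python
import numpy as np

def pack_side_by_side(left, right):
    """Halve each view horizontally and place the halves in one frame of
    the original width, so a standard 2-D codec can carry the stereo pair."""
    return np.hstack([left[:, ::2], right[:, ::2]])

def unpack_side_by_side(frame):
    """At the display: split the frame and interpolate each half back to
    full width (here by pixel repetition)."""
    w = frame.shape[1] // 2
    left = np.repeat(frame[:, :w], 2, axis=1)
    right = np.repeat(frame[:, w:], 2, axis=1)
    return left, right
```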
0:28:16 And also, from the 2D video plus a depth map, the other view can be created at the receiver; in each case, depending on the viewing conditions, the depth information allows the receiver to adjust the rendering.
0:28:53 Okay, the final agenda item: if we deliver the content, we need to be able to measure the quality of the content. For objective quality metrics, we first have to do subjective experiments, to understand the major factors of 3D video quality and visual comfort. So we look at those first, before we move into objective quality metrics.
0:29:43 In the past, objective metrics were based on PSNR and the like, known from 2D; but a view that looks okay by PSNR may still not give a good 3D experience, so these metrics do not always agree with the subjective scores, and you have to look out for that: you need metrics that track good subjective scores.
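For reference, here is the PSNR computation that such 2-D metrics are built on; as noted above, a stereo pair with decent per-view PSNR can still be uncomfortable to watch in 3-D, which is why subjective experiments remain the ground truth.

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between a reference image and a
    distorted version; the standard 2-D fidelity measure."""
    mse = np.mean((reference.astype(np.float64) -
                   test.astype(np.float64)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak**2 / mse)
```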
0:30:23 Studies have shown this for predicting performance in terms of comfort as well, and there are still open problems. For this I refer you to the Proceedings of the IEEE special issue on 3D displays, if you want to do some further reading. Thank you.
0:30:48 Thank you, Lena.
0:30:55 We've sort of overspent a little on time, but we could still allow for one or two interesting questions from the audience. So if anybody has questions, please: there is one at the microphone.
0:31:14 Yes, my question is about using other types of modalities to create 3D; for example, you know, the Xbox Kinect, those types of things, using lasers or something like that, to fuse and create three-dimensional representations.
0:31:40 The question is about using other modalities for capturing 3D representations, such as the sensors the Xbox Kinect uses.
0:31:51 Yes, those modalities can be used too; the cleaner the depth information you have, the less you have to compute, and such sensors can help in generating 3D content, and with comfort as well.
0:32:33 Any other question? I don't see one; in that case, in the interest of letting everyone have their lunch break, let's thank the speakers again.