0:00:18 Well, I will try to be fast. 0:00:27 It's my pleasure to introduce this project. It is a European project with unique properties; 0:00:36 its name is Robust and Safe Mobile Cooperative Autonomous Systems.
0:00:51 At the beginning I will introduce the project itself, its goals and structure, and the demonstrators. 0:01:00 Then I will present our scientific developments and achievements in this work at our university, 0:01:09 and finally I will describe how we integrated our results into the project 0:01:15 demonstrators.
0:01:17 The main project goal is the production of advanced, robust, and safe cognitive reasoning 0:01:23 robotic systems; delivering practical robotic systems at reduced cost is the most important thing. 0:01:31 The design of reusable building blocks collected in a knowledge base 0:01:38 is the main idea of the project: 0:01:42 a knowledge base with 0:01:45 independent building blocks that are reusable, so that any further development will be 0:01:55 cheaper than it is 0:01:58 at present.
0:02:01 So, based on the state of the art, 0:02:06 the project's goal is to specify, adjust, and develop novel solutions covering all 0:02:12 crucial robotics topics, 0:02:15 such as hardware performance, sensing and perception, 0:02:22 modeling, reasoning, and decision making, and validation and testing. 0:02:28 The designed solutions are used to develop the knowledge base 0:02:34 and tools for rapid development and standardised testing.
0:02:44 This is managed with a combined bottom-up and top-down approach. 0:02:50 One of the key outputs, 0:02:53 as I said, is the development of the knowledge base, or rather a knowledge-based framework. 0:03:02 First, state-of-the-art records 0:03:07 and new features are used to fill this knowledge base 0:03:14 and to design a methodology for working with this knowledge. 0:03:22 The modular 0:03:23 solutions will then be 0:03:26 integrated into the demonstrators of the different industrial partners.
0:03:44 These are examples of the application domains 0:03:50 that are covered by the industrial partners in this project. 0:03:57 Maybe I should also show you the partners: there are twenty-seven partners, 0:04:01 which is quite a large consortium. 0:04:05 The project is focused on cooperation with industrial partners, so besides universities and national research 0:04:11 centres, a high proportion of the 0:04:15 members of the project 0:04:17 are industrial representatives.
0:04:23 So, coming back to this slide: 0:04:29 autonomous aerial and ground robots, 0:04:34 operating either 0:04:38 independently or in cooperation, are imperative solutions for the accomplishment of surveillance, security, and inspection tasks 0:04:48 that would otherwise require much more extensive resources. 0:04:55 Examples might be monitoring the borders or the seashore, or inspecting infrastructure. 0:05:06 One of these domains I will talk about in detail later, because that is where we integrate our 0:05:12 solutions.
0:05:19 In the industrial manufacturing of wooden parts and products, the processes, 0:05:27 like cutting, surface machining, and assembly of the parts, are usually carried out 0:05:34 by CNC machines. 0:05:38 In current practice, 0:05:42 manual operation of the machines is necessary, because the machines are not able 0:05:49 to autonomously work with small 0:05:53 parts; the objective of this project is also to solve these problems. And then there 0:06:00 is a set of further demonstrator applications that I can skip, because they were covered 0:06:07 in detail in the previous 0:06:09 presentation.
0:06:16 So, where in this structure is our research at the university? 0:06:29 Our activities are driven by the requirements of the project's applications, 0:06:34 where online, real-time sensor data processing is the key requirement. 0:06:41 Our proposal includes the development of novel methods, the optimization of existing ones, and acceleration 0:06:49 in hardware.
0:06:59 The methods are designed as well-separated modules. 0:07:06 We have developed several methods, and the results of our research are 0:07:14 based on sensor fusion, usually for the tasks of robot localisation and object detection, 0:07:21 that is, perception.
0:07:25 For robot localisation, we developed a method that fuses two 0:07:32 existing methods. One is a mapping method that processes laser scans, but the model of the environment it produces, as you can 0:07:41 see, is quite sparse, 0:07:46 so it's harder to get any knowledge from it. 0:07:50 The other method is 0:07:53 based on data from a Kinect, and we have shown that such data 0:07:58 can create a more informative model, but the precision suffers. If we fuse those 0:08:06 two methods, we are actually able to significantly increase the precision, 0:08:20 so the 0:08:22 final solution is precise and the model is of higher quality.
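The talk does not spell out how the two pose sources are combined; below is a minimal sketch of one standard way to fuse two pose estimates, weighting each by its inverse covariance. All variable names and numbers are illustrative, not taken from the project.

```python
import numpy as np

def fuse_poses(pose_a, cov_a, pose_b, cov_b):
    """Fuse two 2D pose estimates (x, y, heading) by inverse-covariance
    weighting (product of Gaussians). Heading wrap-around is ignored
    here for brevity."""
    info_a = np.linalg.inv(cov_a)            # information matrix of estimate A
    info_b = np.linalg.inv(cov_b)            # information matrix of estimate B
    cov = np.linalg.inv(info_a + info_b)     # fused covariance
    pose = cov @ (info_a @ pose_a + info_b @ pose_b)
    return pose, cov

# Illustrative numbers: a precise laser-based pose and a noisier
# Kinect-based pose of the same robot.
laser_pose  = np.array([2.00, 1.50, 0.10])
laser_cov   = np.diag([0.01, 0.01, 0.005])
kinect_pose = np.array([2.10, 1.40, 0.15])
kinect_cov  = np.diag([0.20, 0.20, 0.050])

pose, cov = fuse_poses(laser_pose, laser_cov, kinect_pose, kinect_cov)
print(pose)  # the fused estimate stays close to the more certain input
```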
0:08:32 For object detection, 0:08:35 we experimented with and implemented several methods; I will briefly introduce two of them, 0:08:42 where we achieved nice results that we have published. 0:08:48 All those detectors are then used in our system and fused together for 0:08:57 improving the robustness of the final solution and for 0:09:03 more precise modeling of the environment.
0:09:08 The first one is 0:09:12 segmentation of the video signal by tracking of local features. 0:09:21 In comparison with existing methods, which focus more on 0:09:26 precision and stability, our method, as I said, focuses on 0:09:31 speed and also on the ability 0:09:33 to work in an online manner. 0:09:36 The existing methods are usually off-line: they use the whole video to produce the feature tracks and then 0:09:46 cluster them in a nice way. We cannot use data from 0:09:51 the future in on-line processing, so this was the problem that we needed 0:09:57 to solve. We actually came up with a 0:10:00 method that is able to run online and in real time with comparable precision but 0:10:10 many times higher speed, so the 0:10:14 computational cost is very low. 0:10:20 This method is able to segment, while the robot moves, 0:10:24 objects that move in the video stream relative to 0:10:31 each other. 0:10:32 So it doesn't necessarily mean that the object itself has to move: if there is an 0:10:36 object that stands out from the background, we are able to segment it. Actually, I shouldn't say "we": 0:10:42 this is computed automatically. In the picture you can see the resulting segmentation.
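The published algorithm itself is not described in the talk; the sketch below only illustrates the general idea of online motion segmentation, i.e. tracking sparse local features between consecutive frames and clustering them by displacement. It uses OpenCV; the simple two-cluster split and all thresholds are illustrative choices, not the project's.

```python
import cv2
import numpy as np

def motion_clusters(prev_gray, gray, max_corners=400):
    """Track sparse features between two consecutive frames and group
    them by displacement: a crude, online stand-in for segmenting
    objects that move relative to each other."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return None
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    ok = status.ravel() == 1
    pts, nxt = pts[ok].reshape(-1, 2), nxt[ok].reshape(-1, 2)
    if len(pts) < 2:
        return None
    flow = (nxt - pts).astype(np.float32)    # per-feature motion vectors
    # Features whose motion is consistent (e.g. background vs. one moving
    # object) fall into the same cluster.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 0.1)
    _, labels, _ = cv2.kmeans(flow, 2, None, criteria, 5,
                              cv2.KMEANS_PP_CENTERS)
    return pts, labels.ravel()               # feature positions and group ids
```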
0:10:50 The second method I want to introduce is a 0:10:54 processing method for depth data. 0:10:59 The idea is based on the observation 0:11:02 that 0:11:03 in indoor scenes many objects are planar or rest on planes, so 0:11:08 this method first segments the planes 0:11:14 and then segments the objects. 0:11:17 Again, the focus is on computational efficiency. Here we have 0:11:25 also slightly better precision than existing methods, but we are many times faster than the 0:11:32 others.
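Again, the details of this method are not given in the talk; below is a minimal sketch of the common planar-segmentation idea for depth data: RANSAC-fit the dominant plane in the point cloud, remove its inliers, and treat what remains as object candidates. Thresholds are illustrative.

```python
import numpy as np

def ransac_plane(points, iters=200, dist_thresh=0.02, rng=None):
    """Fit the dominant plane in an (N, 3) point cloud with RANSAC.
    Returns (normal, d) with normal . p + d = 0 and the inlier mask."""
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                  # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal.dot(sample[0])
        inliers = np.abs(points @ normal + d) < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane, best_inliers

# Peeling off the dominant plane (floor or table top) leaves the remaining
# points as candidate objects:
#   plane, mask = ransac_plane(cloud); objects = cloud[~mask]
```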
0:11:37 This is the last area of research that we do here for the project; 0:11:43 it belongs to the validation and verification part of the project: rendering images with realistic camera 0:11:49 imperfections. 0:11:53 Usually, when 0:11:55 robotic systems are developed, they need to be verified, so there is a need 0:12:02 for some simulation process, 0:12:05 and usually we generate an image, actually a 0:12:11 rendered image. So what we are trying to do is to somehow simulate the 0:12:15 real situation: 0:12:18 we are 0:12:20 introducing into the rendered images, 0:12:26 let's say, 0:12:27 distortions: noise and many other 0:12:33 distortions, such as 0:12:35 chromatic aberration 0:12:37 and lens flare, to basically make the rendered image 0:12:41 imperfect, like a real camera image.
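The talk only names the effects; the sketch below shows how a few of them might be applied to a clean render: a crude chromatic aberration (opposite shifts of the red and blue channels), vignetting, and additive sensor noise. All parameters are illustrative, not the project's.

```python
import numpy as np

def degrade(img, noise_sigma=5.0, ca_shift=2, vignette=0.4, rng=None):
    """Degrade a rendered uint8 RGB image so it looks more like a real
    camera frame: channel shift, vignetting, and Gaussian noise."""
    rng = rng or np.random.default_rng(0)
    out = img.astype(np.float32)
    # Crude chromatic aberration: shift red and blue channels apart.
    out[..., 0] = np.roll(out[..., 0], ca_shift, axis=1)
    out[..., 2] = np.roll(out[..., 2], -ca_shift, axis=1)
    # Vignetting: darken pixels with distance from the image centre.
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2) / np.hypot(h / 2, w / 2)
    out *= (1.0 - vignette * r ** 2)[..., None]
    # Additive Gaussian sensor noise.
    out += rng.normal(0.0, noise_sigma, out.shape)
    return np.clip(out, 0, 255).astype(np.uint8)
```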
0:12:46 So that was our research from the scientific point of view. Now, how do we apply and 0:12:52 integrate our solutions into the demonstrators? We cooperate with an industrial partner 0:13:05 that develops AGVs: laser-guided vehicles, 0:13:12 basically autonomously moving robots. 0:13:15 They act as a link between different machines in big warehouses, moving the pallets, and 0:13:23 they do it autonomously, controlled by one 0:13:29 central unit 0:13:31 that controls the whole 0:13:37 situation in the warehouse. So we have solved two tasks with them.
0:13:44 One is obstacle avoidance. This is crucial for their 0:13:51 solution, because when two AGVs meet somewhere because of an error, which can 0:13:59 happen for various reasons, the two AGVs then stand there 0:14:06 in the corridor and block each other. 0:14:10 So we need to solve how the AGV actually might detect 0:14:15 and avoid this obstacle.
0:14:22 The second 0:14:24 task is the 0:14:27 truck-loading application, which means that those AGVs need to get into the truck and load or 0:14:34 unload the products there. The problem is that 0:14:42 the proposed solution for this framework 0:14:47 is based on a laser triangulation positioning system that works only inside the 0:14:52 warehouse. The moment the robot leaves the warehouse, it loses its position; it doesn't know where 0:15:00 it is. So the goal is to somehow 0:15:06 measure and perceive 0:15:09 the localisation outside of the warehouse, inside the truck, or rather the truck trailer, where there are 0:15:16 also some constraints.
0:15:19 So these are the two tasks that we tried to solve. 0:15:27 In this example, you see a sequence of frames; the robot's primary sensor is a camera. 0:15:35 This sequence was driven manually, as an 0:15:38 example of how it works: 0:15:42 we first detect whether there is an obstacle, 0:15:47 and then we see what type it is: 0:15:51 whether it's a person, another robot, or a static 0:15:55 object, because according to this information we can decide whether we can avoid it or 0:16:01 not, for safety reasons. For example, a person is not 0:16:06 allowed to be avoided: 0:16:10 people are allowed to work in the warehouse, so the robot must stop and wait instead.
0:16:19 For this we designed a structure that I will present here. Its input data 0:16:26 are sensor observations of some sort, 0:16:32 in the image domain or other sensory data; 0:16:39 in our experiments we use 0:16:43 RGB and depth data. Given the constraints of the environment, we 0:16:51 detect and project the positions of the objects in the environment. 0:17:03 Then, having 0:17:05 information about the detected objects, our system classifies what type each one is, and whether it's a 0:17:12 dangerous situation or just a warning situation. 0:17:19 Based on this information, the AGV can decide whether it wants to avoid the 0:17:26 object; for that we have a planning module that computes the path 0:17:32 and provides the information to the AGV. Note that we only do the perception and the modeling; 0:17:44 we don't do any controlling of the AGV. 0:17:47 We only provide the measurements and, 0:17:53 let's say, analyses and proposals of what the AGV might do, but not any controlling.
0:18:04 Here is an example: 0:18:05 if there is an obstacle in front of the AGV, it is classified: the object type, its 0:18:13 position, and whether the situation is dangerous or just a warning. 0:18:19 According to this information, we use something like decision making: if it's a 0:18:25 person and it is too close, the AGV must stop, or if it's another robot 0:18:32 and it's far away, all we need to do is slow down. This is the 0:18:37 table describing the decision making. Our module also computes 0:18:43 the detour if the avoidance 0:18:47 procedure is needed.
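The full decision table is on the slide rather than in the transcript; the toy function below mirrors only the two rules actually stated (stop for a close person, slow down for a distant robot). The remaining class, the distances, and the action names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str        # "person", "robot", or "static"
    distance: float  # metres from the AGV along its path

# Illustrative safety distances, not project values.
STOP_DIST = {"person": 3.0, "robot": 1.0, "static": 1.0}

def decide(obstacle: Obstacle) -> str:
    """Rule-based reaction: stop for a close person (never drive around
    people), slow down for a distant robot, and propose an avoidance
    manoeuvre only for static obstacles."""
    if obstacle.kind == "person":
        return "stop" if obstacle.distance < STOP_DIST["person"] else "slow_down"
    if obstacle.distance < STOP_DIST[obstacle.kind]:
        return "stop"
    return "avoid" if obstacle.kind == "static" else "slow_down"

print(decide(Obstacle("person", 2.0)))  # -> stop
print(decide(Obstacle("robot", 4.0)))   # -> slow_down
```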
0:18:51 So this is our subsystem for the obstacle avoidance task. 0:18:56 The second task, as I said, is to get into the truck: 0:19:01 the AGV must localise itself in the truck using a different localisation method than the one used in the warehouse. As 0:19:08 you can see, there is not much usable visual information in the truck, so 0:19:14 we base our solution on laser measurements. 0:19:22 We designed a simple model of the trailer, and we measure the positions of reference points 0:19:28 of the trailer, 0:19:33 and we provide the measurements 0:19:36 to the AGV. The AGV then uses 0:19:40 the localisation information in a similar way as it uses the 0:19:46 warehouse system.
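How the trailer model and its reference points are defined is not stated; the sketch below only illustrates the basic geometry: fit a straight line to the laser points returned by a trailer side wall and read off the vehicle's heading error and lateral offset relative to it. Names and numbers are illustrative.

```python
import numpy as np

def wall_pose(scan_xy):
    """Total-least-squares fit of a line to laser points on a trailer
    side wall (scan_xy: (N, 2) points in the robot frame); returns the
    robot's heading error and its lateral distance to the wall."""
    centroid = scan_xy.mean(axis=0)
    # Direction of the wall = principal axis of the centred points.
    _, _, vt = np.linalg.svd(scan_xy - centroid)
    direction = vt[0]                            # unit vector along the wall
    normal = np.array([-direction[1], direction[0]])
    heading_error = np.arctan2(direction[1], direction[0])
    lateral_dist = abs(normal.dot(centroid))     # robot (origin) to wall
    return heading_error, lateral_dist

# Illustrative use: noisy points from a wall 0.8 m to the robot's left.
xs = np.linspace(0.5, 3.0, 50)
pts = np.c_[xs, np.full_like(xs, 0.8)]
pts += np.random.default_rng(1).normal(0, 0.01, pts.shape)
print(wall_pose(pts))  # heading error ~0 rad, distance ~0.8 m
```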
0:19:51 So, to summarise: we have developed methods 0:19:53 for sensor data processing, 0:19:56 and to meet the required computation speed, 0:19:59 we used optimization methods and 0:20:03 acceleration in hardware. 0:20:12 The results of those methods were integrated into the project's use cases. 0:20:20 We created an experimental platform that you can see here in our experiments, 0:20:31 and most of our results are integrated into the demonstrators for 0:20:36 these use cases. 0:20:37 Thank you very much for your attention.