Thanks a lot for coming along. I'm afraid I'm reusing some slides I gave earlier for a different presentation, an internal summit we have at Intel. I hadn't planned ahead to do this talk, but the opportunity came up since I was visiting, so I get to present this today. Apologies about the branding and the "Intel confidential" labels and suchlike; please ignore those.

Okay, so my name's Robert Bragg. I work for Intel's Open Source Technology Centre in London as a graphics engineer. I've enjoyed graphics for a long time, and I've had the opportunity to work on a number of layers of the graphics stack on Linux.

One of the first companies I worked for was Imagination Technologies, right at the bottom of the stack doing silicon enabling, working on the OpenGL driver, some of the kernel driver, that kind of low-level work. Then I moved to a small company where we did a UI toolkit, and gained some experience with high-level UI technologies, through to things like compositors, and also some work on Firefox.

As well as having this passion for graphics, I've been fortunate to work in an office which has dedicated designers, visual designers and interaction designers, as well as engineers, and I've been involved in creating applications designed by those guys. So I've seen the process by which we come up with ideas and create applications, and I hope I have some insight into where that process is not as effective as I think it could be.

So I'm here to talk about this project, Rig, which I started about a year ago and have been working on sort of part time; this is an old mockup of it. There are a few people missing from this slide: there's a guy I work with who co-maintains the Cogl project with me and has been helping with this all along; he's on the graphics team at the OTC. Chris Cummins, an intern who's been working with us for a while, has been helping out too, and we've also had a visual designer who did this mockup and helped with the visuals of the project, along with a few others.

I want to give a bit of background as to why I started this project. The broad aim is to drive innovation in the space of user interfaces, and in particular we're looking to explore further use of the GPU. Actually, fair warning: I'm going to be a little bit rough here because I haven't run through this recently.

During that experience, say five years of working on UI technologies, I've built up this feeling that we are really wasting the GPU hardware that's in our devices: mobile phones, tablets, laptops. With the state of UI technologies today, we use the GPU for basically shifting textured rectangles around. So I've had this mounting feeling that there's got to be something we can do to expose more creativity to the visual design process, to expose that GPU and fully utilise it, and that's what I want to go through a bit more.

Okay, so through that experience of working with visual designers: what can we do to optimise the workflow from coming up with an idea to creating an application? Let me just quickly recap the state of the art. State-of-the-art toolkits like GTK and Qt and the rest, and even the web technologies that we're all familiar with: all of these are actually extremely similar, technically speaking, if you look at how they work.

They basically all implement some form of the PostScript or PDF rendering model as the foundation of the vocabulary these technologies work with. That vocabulary, in essence, is three things. You have 2D shapes, which you can fill with images; the 2D shapes are basically just always rectangles, or rounded rectangles if you're getting pretty fancy, and theoretically you can use Béziers and all that fancy stuff, but nobody actually does. You fill those 2D shapes with images. And you have text. That's your vocabulary. These are all built up, you build your UI up, using the painter's algorithm, which just means you start with the background. We're all basically familiar with this: a background, maybe a gradient, and then your labels and foreground on top of that. That's the model.

They all employ some form of scene graph to organise the primitives, the components of the UI, so you have a hierarchy with the window at the top, and that defines how you implement the painter's algorithm: the stuff at the top of the hierarchy is the background and the leaves are the foreground. It's pretty simple, and it works to some extent. The cornerstone of that model is the quality of the static images that you get from your visual designers.
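To make that model concrete, a painter's-algorithm traversal of such a scene graph boils down to something like this little C sketch; the types are hypothetical, not any particular toolkit's API:

    /* A minimal sketch of a painter's-algorithm traversal over a scene
     * graph; hypothetical types, not any particular toolkit's API. */
    typedef struct _Node Node;

    struct _Node {
      void (*paint) (Node *node);   /* fills a rectangle, draws an image or text */
      Node **children;
      int n_children;
    };

    static void
    paint_recursive (Node *node)
    {
      /* Parents paint before their children, so the window background goes
       * down first and each widget's labels land on top: back to front. */
      node->paint (node);

      for (int i = 0; i < node->n_children; i++)
        paint_recursive (node->children[i]);
    }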

These technologies are pretty much all aimed at engineers, and that's part of the problem. They come as an imperative API, for C or Python and such, and they have markup languages: we're all familiar with HTML, but there are others as well. They all carry this sort of contract, this promise, that as you develop these technologies you will define an interface for engineers to work to, and you must keep it stable. There's a burden involved in maintaining a stable interface so that all of your existing applications don't break as you evolve the technology, and it's pretty darn hard to think ahead of time about what interface I'm going to have to stick to for the rest of time, so that I don't paint myself into a corner over how we can improve the flexibility of this technology visually and enable more creativity for designers. That's difficult.

So through this experience I've got this overwhelming feeling that we could do so much more with more creative use of the GPU.

Spelling the problem out, then. The first part is this limited vocabulary, the PDF and PostScript rendering model; it's something like twenty-five years old and hasn't changed. Then we've got the middleman toolkits. This is the layer that's a means to an end: these toolkits are just there for an engineer to recreate the vision of the designer, and there's a sort of language barrier between that vision and the technology. Designers tend to have very little insight into the workings of these technologies, so there's a really obvious middleman, and maybe we can think about ways of not having quite the same approach and avoiding that middleman translation we deal with today.

When we think about it from the designer's perspective as well, they're using tools like Photoshop, or Inkscape, and things like After Effects. These are intended for creating static images: they have all kinds of fancy effects available, but at the end you get a bitmap image that we slice up into rectangles and build a UI with. Or they're made for offline video processing, like After Effects, which uses things like ray tracing to come up with its effects, and there's no way those kinds of effects are going to run in real time on a phone any time soon. What these designers are actually trying to do is create interfaces that can run on a resource-constrained device, be responsive to input, and stay well within a power budget. They really don't, at the moment, have tools geared towards that specific problem, so maybe we can think about inventing new tools that improve that part.

This is the actual workflow, a very simplistic outline of how design goes at the moment. You have the visual designer, who sketches out their vision in something like Photoshop or Inkscape, and they hand it over the wall, if you like, to the engineer, who has to reinterpret that vision and translate it into an API sort of fashion. At this stage they're totally disconnected from the visual, creative side of things: they're opening up their reference manuals and implementing it, and that implementation is typically done on a desktop machine or a laptop. Even if they're trying to create something for a phone, you initially prototype the stuff on your laptop. Once you've kind of got it working, then you start integrating it with a build system, to be able to cross-compile or whatever it takes to get it onto the phone, and then start playing around with it: how does it perform, how does it respond?

At that point, when you're actually starting to test on the device, there's a pretty high chance you're going to realise there's some problem: the design doesn't fit the form factor the way the designer expected it to, or it doesn't respond as quickly as it needs to. There are so many possibilities that could just not be quite right, and you're going to have to go all the way back to square one, where the designer has to rethink the idea and the interaction model to work with the technology they have. The alternative is to go back to engineering and try to fix the technology, but it's an unknown quantity whether you're going to be able to fix the technology in time, if you've got the time at all. So your options are pretty slim at that point; you have to redesign things.

Another aspect of this is the ability to objectively measure certain qualities that are important for user interfaces. A consumer expects that their phone is going to respond in a timely manner when they touch it or swipe around the UI; they expect it not to drain the battery; and they expect a certain level of visual quality. Right now we don't have the integration in our tooling and processes to really make sure we stay on top of those qualities right through the whole design and development process; they tend to come at the end of the line, at the end of the pipeline of how we develop these things.

I guess I kind of hinted at some of the solutions as I was going through the problems, but obviously one of the main things we're interested in here is throwing away that limited PostScript model and seeing whether we can expose way more features of the GPU. There's a lot of flexibility in a GPU; what can we unlock there and give to the visual designers, to allow them to be more creative? The PostScript model is actually really difficult to implement efficiently on a GPU, just given the way GPUs work. It's hard to avoid overdraw in the PostScript model, because you go from background to foreground, that's the model by which you build up the scene, and it's hard to avoid colouring in all of those redundant fragments and pixels behind the stuff in front of them. That's just quite a tricky problem.

As you can probably guess from the screenshot at the beginning of this talk, a big aspect of how I see us being able to solve many of those problems is to develop bespoke tooling, which is what the Rig project is looking at doing. What we're doing is actually trying to learn from how game developers work, because game developers face so many of the same problems that we face as UI developers, except that they have done a much better job of taking advantage of GPU hardware than we have. They have to create something that can run in real time and respond interactively to the user, and we need to do the same kinds of things.

The way those guys work is that they have a rendering engine, and they often build up a lot of bespoke tools to help them create their worlds, create their games. They have to work with designers and engineers too, and if you look at projects like Unity 3D, they have really quite capable tooling to connect that whole workflow between the game designers and the engineers who create the logic and the game itself. So we're looking to sort of repurpose their ideas as much as possible, because we can build on their experience; they've already done a lot of work researching algorithms that let us do really interesting visual things on the GPU.

We want to create technology that is a UI rendering engine we can share between the thing we deploy onto a device to run a UI and the tool itself, so the design process is actually constrained by the capabilities of the technology. The visual designer uses specialised tools right from the beginning, during the visual design process. We also want to really connect that visual design and development stage with the hardware they're designing for, so what we have implemented is the ability to have a network connection between the designer tool and the device you're targeting. While you're working in the designer, as you play around with animations and things, it immediately updates what's on the device, so you can pick it up and check: what's the responsiveness like, what's the quality like? You're continually testing right through that early prototyping stage, and there's just no substitute for testing your ideas on real hardware. We see that as a pretty compelling way of improving the workflow and optimising the time it takes to develop applications.

The last one isn't necessarily something that will be done within the same tool, but we need to develop tools for measuring, say, the power usage while you're running your UI on the device. While you're testing, you want to be able to measure: what's the power usage while I'm doing certain things? Or measure the latency: when I touch it, how long does it take for the interface to start responding? We're even thinking of things like high-speed video cameras and processing the recordings, that kind of thing. We need to think through that problem as well.

I guess at this point feel free to just jump in and ask questions, especially now that I'm going to basically do a demo; just raise your hand and so on. I wasn't really expecting to be here and I've been hacking on this, so there's a chance it's a bit rough. Let me go through the layout a bit.

This area on the left is where you import assets to work with, things that were created elsewhere: it could be images made by designers in Photoshop, it could be models made in 3ds Max, it could be logic created by engineers that needs to be integrated, it could be audio, those kinds of things. There's a sort of search area, so we can search the images or videos or whatever we need.

whatever we need and then

people there is assets into the this centre area which is where we actually build

up the user interface itself we say we can we get immediate visual feedback about

the scenes that we're trying to create

on the right hand side here which is public to the moment but this bottle

would be where we would expect a more tooling capabilities and it's put there so

you can to the trends in the expand into the space of the interface to

give more detailed information about tutoring and when you're not using very spending can sort

of shift out of the way

Down at the bottom here is where we deal with controlling properties; actually, over here is where some of the properties for one of the objects appear. We have this idea of entities and components: an entity is essentially just something positioned relative to a parent in space, and it has a bunch of components associated with it that add more semantic information to the object. The more components you add, the more semantic meaning that thing has.
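Just to make that concrete, here's a rough C sketch of the kind of entity-and-components structure I mean; the names are made up for illustration rather than Rig's actual API:

    /* A rough sketch of the entity/component idea described above; these
     * names are made up for illustration, not Rig's actual API. */
    typedef enum {
      COMPONENT_MATERIAL,   /* how the thing should be lit and textured */
      COMPONENT_GEOMETRY,   /* a mesh, a rectangle, some text, ... */
      COMPONENT_SOURCE,     /* an image or video source */
      COMPONENT_LOGIC       /* behaviour triggered by input */
    } ComponentType;

    typedef struct {
      ComponentType type;
      void *state;          /* type-specific data */
    } Component;

    typedef struct _Entity Entity;

    struct _Entity {
      /* An entity is essentially just a transform relative to its parent... */
      Entity *parent;
      float position[3];
      float rotation[4];    /* quaternion */
      float scale;

      /* ...plus a bag of components; the more you attach, the more
       * semantic meaning the entity has. */
      Component *components;
      int n_components;
    };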

These red buttons here are about being able to control a particular property: if you click one of those, it brings the property into this area down here, which is called the controller. Initially, when we first started, this was just for animation management, but it's in the process of evolving beyond just keyframe-based animation; at the moment this example is built around keyframe animations, and it's just a toy demo. The idea in this case, I guess, is a phone application where we've got notifications bubbling up, something a bit closer to a real problem, and as it plays you can see a bit of the visual stuff.

You've got a depth-of-field effect, which shows things that are further away looking soft, modelling how a real-world camera works. We're also playing around with shadow mapping in this example, which is a way of having real-time shadows. And we've got real-time lighting, so there's a standard light in the scene which acts as the light source.

What you can see we're doing at the moment is very traditional, what you'd actually see in game engines. That's largely a practicality: we had to bring up a whole engine, and we didn't have the time to do research into new visual algorithms, whereas these ones are well understood and they work well together. But we'd be looking to do research into, say, more stylised rendering, which is not something you'd necessarily expect in a game but which might make more sense in a UI.

Even with these ones I think there's scope, as long as the designers use them tastefully and don't go overboard, because this does give you a lot of power: you could easily create a UI that's some massive 3D virtual world, since it is a 3D environment that you're modelling within, but you don't have to go overboard.

Let me quickly show some of the effects. For this one I haven't got assets that are really geared towards it, but it's a technique that uses the GPU a bit more than what we were doing before: bump mapping. You provide a texture, and that image represents a modulation in the shape of the surface; essentially it represents a height. You can't really see, when I apply it to the surface, that it's changed visually, but what's kind of interesting is when you select the light and change the direction the light comes from.

That's just a stock photo I found, nothing special about it, and the bump map in this case was just generated by running it through a program that essentially uses a Sobel filter to work out where there's contrast in the image, where the edges are, and it automatically generates the map from that; the idea is that the edges look raised.
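Roughly, a generator like that boils down to the following: treat the image's luminance as a height field and turn its local gradient into a normal. This is a simplified sketch (central differences rather than full Sobel kernels), not the actual program I used:

    /* Simplified sketch of generating a bump/normal map from an image:
     * treat brightness as height and turn the local gradient into a
     * surface normal. Illustration only, not the actual generator. */
    #include <math.h>
    #include <stdint.h>

    static float
    luminance (const uint8_t *rgb, int width, int x, int y)
    {
      const uint8_t *p = rgb + (y * width + x) * 3;
      return (0.299f * p[0] + 0.587f * p[1] + 0.114f * p[2]) / 255.0f;
    }

    void
    height_to_normal_map (const uint8_t *rgb, uint8_t *normals,
                          int width, int height, float strength)
    {
      for (int y = 1; y < height - 1; y++)
        for (int x = 1; x < width - 1; x++)
          {
            /* Horizontal and vertical gradients of the "height" field */
            float dx = luminance (rgb, width, x + 1, y) - luminance (rgb, width, x - 1, y);
            float dy = luminance (rgb, width, x, y + 1) - luminance (rgb, width, x, y - 1);

            /* A steeper gradient tilts the normal further from straight up */
            float nx = -dx * strength, ny = -dy * strength, nz = 1.0f;
            float len = sqrtf (nx * nx + ny * ny + nz * nz);

            uint8_t *out = normals + (y * width + x) * 3;
            out[0] = (uint8_t) ((nx / len * 0.5f + 0.5f) * 255.0f);
            out[1] = (uint8_t) ((ny / len * 0.5f + 0.5f) * 255.0f);
            out[2] = (uint8_t) ((nz / len * 0.5f + 0.5f) * 255.0f);
          }
    }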

If you watch while the light direction changes again, you can see the detail in the surface appears to change as the light moves. This asset wasn't handcrafted, but I think with a bit of effort from a visual designer, knowing what they want to achieve and with dedicated assets, there's room to make something a bit more dynamic and interesting. That light source doesn't have to move every time you move your finger around, which could look a little over the top, but maybe it moves as you transition between spaces on your phone.

There's another one; this one is a fun example of just using the GPU to stylise something. I'll show the video for this, just to show that it really is something that can be done in real time: it's a pointillism effect. First, here's the original video, scaled right up so you can see what's going on. Then we can stylise it with this pointillism effect, which I guess is used quite a bit on static images in Photoshop and the like, but we can do it dynamically: depending on the lightness of the pixels within the image, we're changing the size of some geometry to create a nice-looking effect.

You might want to have this as the background of some part of your home screen or device interface; again, that's something I'd have to leave to the designers, but I'd like to be able to provide them tools and freedom, with constraints as well, around things I know run efficiently on the GPU. It's almost cheaper to run this video effect than it would be to show the video normally, because in this case the only points where we do the colour-space conversion, the YUV to RGB, are the sample points we're actually showing, so in some ways we're doing less work than we would to show the image normally.
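For reference, the per-sample conversion I'm talking about is just the standard YUV-to-RGB transform, something like this sketch (BT.601-style constants; the exact numbers depend on the video's colour space):

    /* BT.601-style, full-range YUV to RGB; the exact constants depend on
     * the source video's colour space, but this is the per-sample work. */
    static void
    yuv_to_rgb (float y, float u, float v, float *r, float *g, float *b)
    {
      u -= 0.5f;
      v -= 0.5f;
      *r = y + 1.402f * v;
      *g = y - 0.344f * u - 0.714f * v;
      *b = y + 1.772f * u;
    }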

So that's kind of the state of the tool at the moment. Let me actually show the way the designer connects to a slave device. This window represents a device sitting on your desk, which might be a phone, say; I don't have anything here with this running on a real phone, but this is a build we did a while back, though it needs some updating. So here we're connecting the designer to that device.

If I scrub through this animation, you can see the effects are enabled on this one, whereas they're not in the editing area, so you can get a feel for what it's like on a device. Then if I go, say, to the end of the timeline, grab this and change the animation at the end, the device is immediately updated, and I can go and check the timing of that change, tweak the keyframe and iterate until I'm happy with it, while seeing it for real on the actual device. This is done using Avahi at the moment, the zeroconf stuff, so that it can just discover the device; that's what makes this possible.

Okay, so that's basically all the content I have at the moment to introduce Rig, but I'd be happy to take any questions.

Okay, thanks. Questions?

The pointillism effect you were running there, is that just fragment processing? In terms of GPU work, is something like that feasible?

Well, actually, it's not just that. Often video effects are just done as fragment programs: say you're doing colourising, or desaturation, or contrast or brightness, that kind of thing, you're just running over every single pixel in the image. In this case we actually have geometry for the cells, if you like, in that effect. It starts off as basically a tiny unit rectangle for each cell, which texture-maps a circle, and it only reads the video's data at these coarse grid points. So it really only does the YUV colour-space conversion at specific points, looks at the lightness of those, and then depending on that lightness it essentially scales up the geometry. Overall you're doing way less fragment processing than you often would for a video effect, and you're moving a bit of the work to the vertex side. It was really interesting to make that something in the tool that designers can use, something to explore.
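As a rough illustration, one way to do that per-cell work on the GPU is a vertex shader along these lines; this is just a sketch assuming vertex texture fetch is available, not Rig's actual shader:

    /* Sketch of the per-cell vertex work (not Rig's actual shader): each
     * cell is a tiny quad; we sample the video's luma at the cell's grid
     * point and scale the quad by that lightness before placing it. The
     * fragment shader then just textures a small circle onto the quad. */
    static const char *pointillism_vertex_glsl =
      "attribute vec2 cell_center;  /* this cell's grid point, in [0,1] */\n"
      "attribute vec2 corner;       /* which corner of the quad, -1..1   */\n"
      "uniform sampler2D luma_tex;  /* Y plane of the video frame        */\n"
      "uniform vec2 cell_size;      /* size of one cell in clip space    */\n"
      "varying vec2 circle_coord;\n"
      "void main () {\n"
      "  float lightness = texture2D (luma_tex, cell_center).r;\n"
      "  vec2 pos = cell_center * 2.0 - 1.0 + corner * cell_size * lightness;\n"
      "  circle_coord = corner;\n"
      "  gl_Position = vec4 (pos, 0.0, 1.0);\n"
      "}\n";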

What were the particular design decisions you made there?

It doesn't really scale to try to come up with a really modularised rendering system, like at the extreme we had with Clutter: I mean it's changed now, but with Clutter every single actor used to be able to write all kinds of imperative code to do its own drawing. That's very flexible, but if you want to optimise how you use a GPU you tend to have to do really clever things and make various assumptions, and it works a lot better if you just make that piece of rendering code a lot more monolithic. So in Rig it's much more like a game engine, a monolithic rendering engine, and if you want to extend it, then the engineers who have the specialised experience to know how to program a GPU efficiently should come and contribute to the project and add their effects and other features there. Because the number of people who specialise in GPU programming is very small compared to the number of people who write applications, we can't expect application developers to be writing GLSL and shaders and integrating their rendering ideas into a modular system and still get the performance we want out of it.

A couple of questions. The first is: what are you actually sending to the devices? Is it just a display list, basically, with animation descriptions?

Well, the whole rendering engine itself is running on the device, and what we send is the description of the scene. We have this scene graph of entities; entities just have a position in 3D, which can be expressed as a transform relative to a parent, and potentially we could be sending multiple scene graphs. Each entity has a bunch of components which add, say, the idea that you have a material, which says how it should be lit, or some sort of video source or image source, or some logic associated with that entity that triggers when you have input. It's that kind of high-level, semantic description that gets sent over the wire to the device.

Which leads into my second question. You talked about it being kind of like a game engine, where they've created their scene with Rig. So when it comes time to actually ship the application, what are they shipping? All the assets, plus this description that gets interpreted?

Well, the network protocol has to be able to serialise everything that relates to the description of the scene, including the assets themselves. We're using Protocol Buffers, which is a quite widely used way of serialising data; it gives you a nice markup for describing messages and serialising data, and we use the same mechanism for serialising over the network as we do for serialising to disk. At the moment, when we serialise to disk, we keep references to external assets, but if you're going to make it for distribution on a device then we would also serialise the assets themselves, the same way we do for the network, and you would just have this one Rig file which you put on the device along with the engine itself.
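To sketch what that looks like in practice: with Protocol Buffers the same packed message can either be streamed to the slave device or written out as the saved UI file. The message and file names below are hypothetical, protobuf-c style, not Rig's real schema:

    /* Sketch only: assumes a hypothetical "Scene" message compiled with
     * protobuf-c; Rig's real schema and message names will differ. The
     * same packed bytes either go over the wire or into the saved file. */
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include "scene.pb-c.h"   /* hypothetical generated header */

    static void
    send_or_save_scene (Scene *scene, int sockfd, const char *path)
    {
      size_t len = scene__get_packed_size (scene);
      uint8_t *buf = malloc (len);

      scene__pack (scene, buf);

      if (sockfd >= 0)
        {
          /* Editing: stream the packed scene description to the device */
          write (sockfd, buf, len);
        }
      else
        {
          /* Deployment: the same bytes become the saved UI file, with
           * assets serialised inline rather than referenced externally */
          FILE *f = fopen (path, "wb");
          fwrite (buf, 1, len, f);
          fclose (f);
        }

      free (buf);
    }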

The engine builds in three ways: it builds with all of the editing capabilities around it; it builds as a slave, the one that runs on a device, which has that network transparency; or it builds just for deployment on a device, where it doesn't need the network transparency or any editing capabilities and has basically no dependencies except GL at that point. You'd ship that to the device along with your Rig file, which has everything serialised together.

One final question: have you considered a web device?

I probably should have said this, actually: we're interested in supporting a bunch of platforms with this, and that includes the web. We haven't got the full thing working there, but at least we want to do a proof of concept to see whether it's feasible, and I'm kind of optimistic that it is, because we have the rendering side working with WebGL. This depends on GL to have access to the GPU; we're trying to expose the GPU, and the only way we can do that in the context of the web is WebGL, and it looks like that's basically going to be available across all the major browsers reasonably soon, including IE. That's where we're looking to go.

You mentioned the rendering model, that we should try not to build with the PDF model any more. Do you have specific examples of how the principles Rig uses could get rid of the PDF rendering model?

Well, even in this initial example, the way you build this particular scene out, there are intermediate renders: you have to first determine the depth of the geometry before you're able to compute the blurring based on that distance, and there are multiple stages involved in coming up with that final image. For the shadows there is a step where you look from the perspective of the light: you ask, from the light's perspective, what geometry can I see, so that I can determine what geometry is in shadow. There's a whole bunch more involved in that model of rendering, and it's completely different from the PostScript model where you just draw your background, draw other stuff on top of it in the middle, and then draw your labels and such. It's a completely different model; you couldn't achieve this kind of thing with the PostScript model in real time.
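For a flavour of how different that is, the shadow-mapping part alone needs two passes: first render the scene's depth from the light's point of view into a texture, then compare against it when shading from the camera. A minimal sketch of the second pass's fragment shader (not Rig's actual shader) looks something like this:

    /* Minimal sketch of the shadow-map comparison pass (not Rig's actual
     * shader): light_space_pos is the fragment's position as seen from the
     * light, and shadow_map holds the depths rendered in the first pass. */
    static const char *shadow_fragment_glsl =
      "precision mediump float;\n"
      "uniform sampler2D shadow_map;\n"
      "varying vec4 light_space_pos;\n"
      "void main () {\n"
      "  vec3 p = light_space_pos.xyz / light_space_pos.w;\n"
      "  p = p * 0.5 + 0.5;                     /* to [0,1] range */\n"
      "  float nearest = texture2D (shadow_map, p.xy).r;\n"
      "  float lit = (p.z - 0.005 > nearest) ? 0.3 : 1.0;  /* in shadow? */\n"
      "  gl_FragColor = vec4 (vec3 (lit), 1.0);\n"
      "}\n";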

There are other things too: I'd be interested in at least exploring some of the stylised or painterly rendering approaches, where we can make something look a bit like it was painted with oil paints, or like it was sketched, just something aesthetically interesting. Those would take completely different approaches to rendering again, even compared to what I just described. So these are all really quite far removed from the painter's algorithm and the vocabulary of 2D shapes, text and images. Does that make sense?

I'm thinking of something like a web browser, for example: it's very complex, you have to support a lot of different techniques for drawing, so probably we could not completely get rid of the PDF rendering model there, but maybe some of the principles you described could be applied; you probably have some specific experience with that.

Potentially the way you could do it is to add new rendering-model capabilities within the browser, but have them sandboxed to some region, and then you get this new model to use in a certain area. It would be challenging to fit the PostScript model inside a different model like that.

In a lot of these cases you're using depth to determine whether something is behind something else, so you'd either have to fight the idea of the PostScript layering or find a way of mapping it onto that; you might be able to retrofit it onto the depth handling that some browsers already have. But maybe we can chat afterwards about the specifics if you're interested.

Okay, thanks.