This is a work about perceptual constraints, following Weber's law, applied to side-informed data hiding systems. The work is by colleagues of mine who unfortunately could not be here, so I will do my best to convey it to you.

This is the outline of the presentation. First we will see a brief introduction about Weber's law and perceptual constraints. Then we define the perceptual constraints, together with the constraints coming from robustness considerations and from side-informed data hiding, and we derive the corresponding embedding equation, which turns out to be a kind of generalized logarithmic DM. Then we will see the analysis of the embedding distortion power and of the probability of decoding error, and finally some results.

In data hiding, over the last few years a lot of attention has been paid to issues like minimizing the probability of decoding error, maximizing the robustness against several attacks, lowering the detectability in a steganographic context (that is, keeping the covert channel hidden), and also to security. But the perceptual impact has usually been undervalued: the number of works dealing with perceptual impact is much lower than the number of works devoted to any of those other issues.

Of all the characteristics of the human visual system that we could think of, this work focuses on Weber's law. Weber's law is a rule that relates the magnitude of the host signal to the magnitude of the distortion we can impose on that signal without it being perceptually noticeable. The intuition behind Weber's law is this: if we have a one-kilogram parcel and we change its weight by two hundred grams, the change will be noticed; but if the parcel weighs fifty kilograms, the same change will be hardly noticeable.

So the perceptual impact of a modification to a low-magnitude signal is not the same as the perceptual impact of the same modification to a very high-magnitude signal. What Weber's law says is that the modification a signal must undergo in order to produce the smallest noticeable difference is proportional to the magnitude of the signal itself. So the higher the magnitude of the signal, the larger the modification we can apply to it while keeping it perceptually unnoticeable.
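The proportionality rule just stated can be written as a tiny sketch. The Weber fraction `k` below is an illustrative assumption, not a value from the talk (real Weber fractions depend on the stimulus).

```python
# Weber's law as a rule of thumb: the just-noticeable difference (JND)
# is proportional to the stimulus magnitude. k is an illustrative
# Weber fraction, not a value from the talk.

def jnd(magnitude, k=0.1):
    """Smallest change that is (roughly) perceptible."""
    return k * abs(magnitude)

def is_noticeable(magnitude, change, k=0.1):
    return abs(change) > jnd(magnitude, k)

# The parcel example: +200 g on a 1 kg parcel is noticed,
# the same +200 g on a 50 kg parcel is not.
print(is_noticeable(1.0, 0.2))    # True
print(is_noticeable(50.0, 0.2))   # False
```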

Weber's law is implicitly used by multiplicative spread-spectrum methods, in which the magnitude of the watermark that we add to each host coefficient is proportional to the magnitude of the host coefficient in which we embed it. But those methods are outperformed by side-informed data hiding schemes.
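As a point of contrast for what follows, here is a minimal sketch of the multiplicative spread-spectrum idea just mentioned; the host values, spreading sequence, and strength `alpha` are illustrative assumptions, not the paper's exact scheme.

```python
# Multiplicative spread-spectrum embedding: each watermark sample is
# proportional to the host coefficient it is added to, so large
# coefficients carry a proportionally larger watermark, as Weber's
# law suggests.

x = [12.0, -3.5, 0.8, 40.0]     # host coefficients (illustrative)
s = [1.0, -1.0, -1.0, 1.0]      # spreading sequence (illustrative)
alpha = 0.1                     # watermark strength
b = 1                           # embedded bit in {-1, +1}

w = [alpha * b * s_i * x_i for s_i, x_i in zip(s, x)]
y = [x_i + w_i for x_i, w_i in zip(x, w)]

# The relative modification |w_i| / |x_i| equals alpha for every i:
print(all(abs(abs(w_i / x_i) - alpha) < 1e-12 for w_i, x_i in zip(w, x)))  # True
```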

So the question we can ask at this point is: can we exploit the perceptual constraints given by Weber's law in side-informed data hiding systems? The answer is yes, we can. In this work those perceptual constraints are characterized through Weber's law, and Weber's law is used to derive a generalized version of a logarithmic embedding scheme for side-informed data hiding. We will also see several choices for the embedding and decoding regions as a function of the parameters of this scheme.

So, first of all, we define the constraints, coming from Weber's law, from robustness considerations, and from side-informed data hiding, that determine the embedding equation. From spread spectrum we take the perceptual constraint that the magnitude of each watermark coefficient is bounded by the magnitude of the corresponding host coefficient, times the magnitude of the corresponding spreading-sequence coefficient, times a coefficient that controls the watermark strength. We change this constraint into a double bound: considering the case where x_i is positive (for negative x_i everything is completely analogous), we upper-bound the watermark coefficient by beta_2 times x_i, where beta_2 is positive, and lower-bound it by beta_1 times x_i, where beta_1 is negative.
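The double bound just described can be sketched as a simple check for a positive host coefficient; the numeric values of `beta1` and `beta2` below are illustrative assumptions.

```python
# Perceptual double bound: beta1 * x_i <= w_i <= beta2 * x_i,
# with beta1 < 0 < beta2, for a positive host coefficient x_i.
# The default beta values are illustrative only.

def satisfies_weber_bound(x_i, w_i, beta1=-0.2, beta2=0.25):
    assert x_i > 0, "the negative case is completely analogous"
    return beta1 * x_i <= w_i <= beta2 * x_i

print(satisfies_weber_bound(10.0, 2.0))    # True: within +25% of the host
print(satisfies_weber_bound(10.0, -3.0))   # False: below beta1 * x_i
```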

From side-informed embedding we get the constraint that we have two sets of codewords, one per embedded bit; here we consider just binary embedding, but everything would be completely analogous for any alphabet size.

From robustness we get two constraints. First, for a given embedding distortion power, we have to minimize the centroid density. Second, the whole codebook can be determined from any one of its codewords: if we know one codeword for embedding a zero, then we know the whole codebook for embedding a zero, and also the whole codebook for embedding a one.

From all these constraints, the embedding equation that we derive is this one. This embedding equation resembles dither modulation: we have the dither coefficient and the embedded bit, but it operates in the logarithmic domain, so it is a logarithmic DM. There is also this term C, which makes it a kind of generalized logarithmic DM; we will see in the following slides what C means and what its function is.

This is the block diagram of the embedder: we take the input signal, remove its sign, move to the logarithmic domain, add the shift C, and apply normal DM with a dither sequence d and the input embedding sequence b; then we go back to the natural domain and restore the sign of the input signal.
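The pipeline just described (sign removal, log domain, DM with a per-bit dither, back to the natural domain) can be sketched as follows. This is a speculative illustration, since the exact slide equation is not in the transcript: the placement of the shift C inside each quantization cell and the dither convention are my assumptions.

```python
import math

# Sketch of a generalized logarithmic DM embedder following the block
# diagram: strip the sign, move to the log domain, quantize with step
# delta and a per-bit dither, place the embedding point at offset c
# (0 < c < delta) above the lower cell boundary, then exponentiate and
# restore the sign. c = delta/2 centers the point in the cell.

def embed_log_dm(x, bit, delta, c, dither=0.0):
    sign = -1.0 if x < 0 else 1.0
    log_mag = math.log(abs(x))                 # log-domain magnitude
    d = dither + bit * delta / 2.0             # per-bit dither shift
    cell = math.floor((log_mag - d) / delta)   # index of the host's cell
    return sign * math.exp(cell * delta + d + c)

def decode_log_dm(y, delta, c, dither=0.0):
    """Minimum-distance decoding against the two shifted point sets."""
    log_mag = math.log(abs(y))
    errs = {}
    for bit in (0, 1):
        d = dither + bit * delta / 2.0
        r = (log_mag - d - c) / delta          # offset from this bit's points
        errs[bit] = abs(r - round(r))
    return min(errs, key=errs.get)

# No-attack round trip: the embedded bit is always recovered.
delta, c = 0.4, 0.2
ok = all(decode_log_dm(embed_log_dm(x, b, delta, c), delta, c) == b
         for x in (0.3, 5.0, -42.0) for b in (0, 1))
print(ok)  # True
```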

In this scheme the parameter C defines the shape of the quantization regions; in this scalar case it sets the boundaries of the quantization regions, and C is bounded between zero and Delta. We can also define the quantization step lambda in the natural domain; its equivalent in the logarithmic domain is Delta, so that lambda is the exponential of Delta. C is defined as the logarithm of 1 + beta_2, where beta_2 is the bound we had before on the magnitude of the watermark coefficient, and C is what makes this a generalized logarithmic DM.
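The relations just stated, lambda = e^Delta and C = log(1 + beta_2), can be checked numerically; the step value and the particular choice of beta_2 below are illustrative.

```python
import math

# Relation between the log-domain step delta, the natural-domain step
# lambda = e^delta, and the shift C = log(1 + beta2). Values are
# illustrative; beta2 here is one admissible bound on |w_i| / x_i.
delta = 0.4
lam = math.exp(delta)            # multiplicative step in the natural domain
beta2 = (lam - 1) / (lam + 1)    # e.g. the multiplicative-DM choice below
c = math.log(1 + beta2)          # the shift of the generalized scheme
print(0.0 < c < delta)           # C lies between 0 and delta -> True
```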

We will see that different choices of C yield different boundaries for the quantization regions: the choice of beta_2 determines the choice of C. If we take beta_2 = (lambda - 1) / (lambda + 1), then the quantization boundaries lie midway between the centroids, i.e., at the arithmetic mean of consecutive centroids, and this is the same codebook as multiplicative DM. If we choose beta_2 = sqrt(lambda) - 1, we have the centroid at the geometric mean of the quantization interval, and this is equivalent to using logarithmic DM. Yet another choice is beta_2 = (lambda - 1) / 2, in which case we have the centroid at the arithmetic mean of the quantization interval. These three choices have something in common: if we take the first-order Taylor approximation of beta_2 as a function of lambda, all three share the same first-order approximation. This means that in the low-distortion regime, when Delta approaches zero and therefore lambda approaches one, all of them are asymptotically equivalent.
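The asymptotic equivalence of the three choices can be verified with a quick numerical check (an illustration, not the paper's code): all three share the first-order term (lambda - 1) / 2, so their ratios to it tend to one as Delta tends to zero.

```python
import math

# The three choices of beta2 as a function of lambda = e^delta
# described above, and a check of their common first-order behavior.

choices = {
    "multiplicative DM (boundary at arithmetic mean)": lambda lam: (lam - 1) / (lam + 1),
    "logarithmic DM (centroid at geometric mean)":     lambda lam: math.sqrt(lam) - 1,
    "centroid at arithmetic mean":                     lambda lam: (lam - 1) / 2,
}

for delta in (0.5, 0.05, 0.005):
    lam = math.exp(delta)
    first_order = (lam - 1) / 2
    # Each ratio approaches 1 in the low-distortion regime (delta -> 0).
    ratios = [f(lam) / first_order for f in choices.values()]
    print(delta, [round(r, 3) for r in ratios])
```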

To see it graphically: the yellow bars represent the centroids. For the first choice the quantization boundaries are located midway between two consecutive centroids, at their arithmetic mean. For the second choice the centroid is located at the geometric mean of the two boundaries, and for the third choice the centroid is located at the arithmetic mean of the two boundaries.

Regarding coding, the choice of C can be used at the encoder to define the quantization and embedding decision boundaries, and at the decoder to define the decoding-region boundaries. The choice of C at the embedder and the choice at the decoder, which we will call C', do not have to be the same: as we will see, the choice at the embedder is driven by the minimization of the embedding distortion power, while the choice of C' at the decoder is driven by the minimization of the probability of decoding error.

Here we have the formula for the embedding distortion power as a function of the host distribution. Under the low-distortion-regime assumption, this equation becomes independent of the host distribution and we get this approximation. The formula is symmetric with respect to C = Delta/2, and C = Delta/2 happens to be the minimum of the embedding distortion; the function grows toward the boundaries of the domain of C, reaching its maxima at C = 0 and C = Delta. For what can be called the high-distortion regime, i.e., when Delta is very large, we have this other approximation. That is not a realistic operating point, of course, because we will never work in the high-distortion regime, but it serves the purpose of checking how much we diverge from the low-distortion approximation when that assumption does not really hold.

If we plot these equations we get this representation. The solid lines represent the experimental results, and the dashed lines the low-distortion approximation; on this side of the plot, where we are in the low-distortion regime, we see that the approximation is really good. The dash-dotted lines represent the approximation for the high-distortion regime, which is what the experimental results tend to on the other side of the plot. Here we represent the document-to-watermark ratio, which is the inverse of the embedding distortion. We can see that if we choose values of C that are symmetric with respect to Delta/2, we get exactly the same curve, due to the symmetry of the formula we saw on the previous slide, and we get the maximum of the document-to-watermark ratio for the choice C = Delta/2, as predicted.

Moving to the probability of decoding error: if we take a minimum-distance decoder, which of course is not the optimal decoder but serves the purpose of giving an analytic, closed-form expression for this probability, then in the low-distortion regime we come up with this approximation, which depends on the choice of C at the embedder and the choice of C' at the decoder. We see that the formula is minimized when C approaches Delta and when C' approaches Delta/4. So we have a trade-off in C: the choice of C at the embedder that minimizes the embedding power is not the same as the choice of C that minimizes the decoding error, since one is Delta/2 and the other is Delta. In any case, due to the symmetry of the embedding distortion formula, if we are in the low-distortion regime the optimum C will lie in the second half of the interval, between Delta/2 and Delta. If that assumption is not true, the symmetry no longer holds and C can be chosen at any point between zero and Delta. It is also worth noticing that this formula is ill-defined for C' tending to zero or to Delta/2, the points at which the sine term vanishes, so around those points the approximation will be worse.

If we plot the formula, the continuous lines are the theoretical approximation from the previous slide and the dots represent the experimental results. We see that for C' equal to Delta/4 we get the minimum probability of decoding error, and for values symmetric with respect to Delta/4 we get exactly the same curve, again due to the symmetry of the formula. We also see that the approximation is not very good near the points where the formula is ill-defined. In any case, we see again that as we increase C and approach Delta we get a lower probability of decoding error, as expected.

Now, checking the robustness of this method against different kinds of attacks and comparing it to other side-informed data hiding methods: if we choose a JPEG attack and plot the bit error rate against the quality factor used for the attack, then when the attack is mild, i.e., for very high quality factors, logarithmic DM performs a bit worse than normal DM. That is because the probability of error of logarithmic DM is dominated by the small-magnitude coefficients, whose quantization centroids are very close to each other. But in the rest of the plot, for low quality factors, i.e., when the JPEG attack is strong, logarithmic DM performs much better than normal DM, because the robustness of the centroids used for the high-magnitude coefficients is much better for logarithmic DM than for normal DM.

Regarding another type of attack, the AWGN attack, we get the same kind of result: when the noise is mild, logarithmic DM performs worse than DM; but when the noise is strong, so that we have a low PSNR between the watermarked image and the watermarked-and-attacked image, the performance of logarithmic DM is much better than that of normal DM.

To conclude, we have seen in this work that Weber's law can be used to derive perceptual constraints for side-informed watermarking systems. A generalized version of logarithmic DM has been derived, and for this generalized version we have studied the embedding distortion power and the probability of decoding error, and the parameters that optimize these two figures of merit. We have also seen that the proposed scheme outperforms DM when we consider severe attacks: for strong attacks, both JPEG and AWGN, it clearly outperforms normal DM. That is the end. Thank you.

(Chair) There is time for questions; a microphone is available.

(Q) I assume that the error rate for normal DM [partly inaudible] which is much, much better. Which slide? This one? Yes. Is the embedding distortion, that is, the power of the watermark, fixed, so that the comparison is fair?

(A) Yes. I would have to check what exactly the authors used, but of course some measure of distortion must be used to have a fair comparison.

(Q) [Follow-up, partly inaudible.]

(A) Yes, for sure. But in this sense, using a normal, non-perceptual measure of distortion, we get a lower bound on the difference with respect to logarithmic DM; of course, if we used a perceptually aware distortion metric, the results would be even better. From the plot it is not clear; I would have to check, but I think they used a non-perceptual measure. Yes, you are right.

(Q) Could you go back a bit, to this slide? Thank you.

(Q) So apparently you choose on the fly the coefficients in which to embed. In my view this potentially introduces synchronization problems, and it seems to me that, since the error rate no longer tends to zero for high quality, you can no longer guarantee the efficiency of the scheme. Have you looked at which coefficients are modified at embedding time? Instead of selecting the coefficients on the fly, if at detection you used exactly the same coefficients, would you get a different curve? I mean, why this choice of the coefficients?

(A) Well, what I can say is that this choice does introduce a synchronization problem, I know.

(Q) So this scheme is kind of merging two problems into one, right? There is a new modulation scheme, and it induces this synchronization problem. I would try to separate the two aspects.

(A) No, here synchronization is not considered at all; we have completely decoupled the synchronization problem. Of course it would be a problem if this were strictly followed; if we embedded in every coefficient, then we would not have those constraints.

(Chair) OK, thank you very much.

a more thank you