0:00:14 This work is about perceptual constraints, following Weber's law, applied to side-informed data hiding systems.
0:00:22 The work is by colleagues who could not be here today, so I will do my best to present it to you.
0:00:32 This is the outline of the presentation.
0:00:35 First we will see a brief introduction to Weber's law and perceptual constraints.
0:00:42 Then we will define the perceptual constraints, and the constraints coming from robustness considerations and from side-informed data hiding, and we will derive the corresponding embedding equation, which follows a kind of generalized logarithmic DM.
0:01:00 Then we will see the analysis of the embedding distortion power and the probability of decoding error, and some results.
0:01:08 In data hiding, in the last few years a lot of attention has been paid to issues like how to minimize the probability of decoding error, how to maximize robustness against several attacks, how to lower the detectability in a steganographic context (that is, keeping the covert channel hidden), and also to security.
0:01:29 But the perceptual impact has usually been undervalued: the number of works dealing with the perceptual impact is much lower than the number of works on any of these other issues.
0:01:42 Of all the characteristics of the human visual system that we can think of, this work is focused on Weber's law, a rule that relates the magnitude of the host signal to the magnitude of the distortion we can impose on that signal without it being perceptually noticeable.
0:02:03 The intuition behind Weber's law is the following: if we have a one-kilogram parcel and we change its weight by two hundred grams, the change will be noticed; but if the parcel weighs fifty kilograms, that same change will be hardly noticeable.
0:02:17 So the perceptual impact of a modification to a low-magnitude signal is not the same as the perceptual impact of that same modification to a very high-magnitude signal.
0:02:32 What Weber's law says is that the modification a signal must undergo in order to produce the smallest noticeable difference is proportional to the magnitude of the signal itself: the higher the magnitude of a signal, the larger the modification we can apply to it while keeping it perceptually unnoticeable.
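The parcel example above can be written down as a one-line rule (my own illustration, not part of the talk; the Weber fraction `k` is an assumed value):

```python
# Weber's law: the largest perceptually unnoticeable change of a
# stimulus is proportional to the stimulus magnitude, |dx| <= k * |x|.
# The Weber fraction k = 0.02 used here is purely illustrative.

def max_unnoticeable_change(magnitude: float, k: float = 0.02) -> float:
    return k * abs(magnitude)

change = 0.2  # kg, i.e. two hundred grams
print(change > max_unnoticeable_change(1.0))   # True: noticed on a 1 kg parcel
print(change > max_unnoticeable_change(50.0))  # False: hidden on a 50 kg parcel
```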
0:02:53 Weber's law is implicitly used by multiplicative spread-spectrum methods, in which the magnitude of the watermark that we add to each host coefficient is proportional to the magnitude of the host coefficient in which we embed it.
0:03:08 But those methods are outperformed by side-informed data hiding schemes.
0:03:12 So the question we can ask at this point is: can we exploit the perceptual constraints implied by Weber's law in side-informed data hiding systems? The answer is yes, we can.
0:03:23 In this work those perceptual constraints are characterized through Weber's law, and Weber's law is used to derive a generalized version of a logarithmic embedding scheme, that is, of side-informed data hiding; and we will see several choices for the embedding and decoding regions as a function of the parameters of this scheme.
0:03:48 First of all we will define the constraints, coming from Weber's law, from robustness considerations and from side-informed data hiding, that define the embedding equation.
0:03:58 For spread spectrum, we had the perceptual constraint that the magnitude of each watermark coefficient is bounded by the magnitude of the corresponding host coefficient, times the magnitude of the corresponding spreading-sequence coefficient, times a coefficient that controls the watermark strength.
0:04:21 We will replace this constraint by a double bound that takes into account that the host sample x_i can be positive or negative: the watermark coefficient is upper-bounded by beta_2 times x_i, where beta_2 is positive, and lower-bounded by beta_1 times x_i, where beta_1 is negative.
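The double bound can be captured in a small helper (my own illustration; beta_1 and beta_2 are the negative and positive bounds from the constraint above):

```python
def within_double_bound(x: float, w: float, beta1: float, beta2: float) -> bool:
    """Check beta1 * x <= w <= beta2 * x, with beta1 < 0 < beta2.

    Sorting the two products handles the sign of the host sample x:
    for negative x the two products swap roles as lower/upper bound.
    """
    lo, hi = sorted((beta1 * x, beta2 * x))
    return lo <= w <= hi

# A watermark sample allowed on a positive host ...
print(within_double_bound(10.0, 1.5, -0.1, 0.2))   # True:  -1.0 <= 1.5 <= 2.0
# ... violates the bound on a negative host of the same magnitude.
print(within_double_bound(-10.0, 1.5, -0.1, 0.2))  # False: bound is [-2.0, 1.0]
```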
0:04:43 From side-informed embedding we get the constraint that we have two quantizers, one per value of the embedded bit; here we consider just binary embedding, but everything would be completely analogous for any alphabet size.
0:04:56 From robustness we get two constraints. First, for a given embedding distortion power we have to minimize the centroid density.
0:05:13 Second, the total codebook can be determined from any one of its codewords: if we know one codeword for embedding a zero, then we know the whole codebook for embedding a zero and also the whole codebook for embedding a one.
0:05:27 From all these constraints, the embedding equation we can derive is this one. This embedding equation resembles dither modulation: we have the dither coefficient and the embedded bit, but it operates in the logarithmic domain, so it is a logarithmic dither modulation; and there is also this term C, which makes it a kind of generalized logarithmic DM. We will see in the following slides what C means and what its function is.
0:05:56 This is the block diagram of the embedder: we take the input signal, remove its sign, go to the logarithmic domain, subtract C, and apply normal DM with the dither sequence and the input embedding sequence; then we go back to the linear domain and recover the sign of the input signal.
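A minimal sketch of that pipeline (my reconstruction from the description; placing the centroid at offset C inside each log-domain cell of width delta, and the binary dither of delta/2, are assumptions):

```python
import math

# Embedder sketch: drop the sign, quantize in the log domain with
# bit-dependent dither, place the centroid at offset C inside the
# cell, return to the linear domain and restore the sign.

def embed(x: float, bit: int, delta: float, C: float, dither: float = 0.0) -> float:
    assert x != 0.0 and 0.0 < C < delta
    sign = 1.0 if x > 0 else -1.0
    y = math.log(abs(x))                  # log-domain magnitude
    d_b = dither + bit * delta / 2.0      # bit-dependent dither (plain DM)
    k = math.floor((y - d_b) / delta)     # index of the quantization cell
    centroid = k * delta + C + d_b        # centroid at offset C inside the cell
    return sign * math.exp(centroid)      # back to the linear domain
```

With C = delta/2 the centroid sits mid-cell and this reduces to plain logarithmic DM; in all cases the output magnitude stays within a factor exp(delta) of the host magnitude, which is the Weber-type multiplicative bound.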
0:06:17 In this case the parameter C defines the shape of the quantization regions; it locates the boundaries of the quantization regions, and C is bounded between zero and Delta.
0:06:31 We can also define the equivalent quantization step in the linear domain; it is the exponential of the log-domain step Delta.
0:06:42 C is defined as the logarithm of one plus beta_2, where beta_2 is the bound we had before on the magnitude of the watermark coefficient, and C is what makes this a generalized logarithmic DM.
0:06:58 We will see that different choices of C give different boundaries for the quantization regions; equivalently, the choice of beta_2 determines the choice of C.
0:07:12 If we take beta_2 equal to (Lambda - 1)/(Lambda + 1), where Lambda is the exponential of Delta, then the quantization boundaries lie at the middle between centroids, i.e., at the arithmetic mean of two consecutive centroids; this is the same codebook as multiplicative DM.
0:07:26 If we choose beta_2 as the square root of Lambda, minus one, the centroid lies at the geometric mean of the quantization interval, and this is equivalent to using logarithmic DM.
0:07:36 Another choice is beta_2 equal to (Lambda - 1)/2, and in that case the centroid lies at the arithmetic mean of the quantization interval.
0:07:44 These three choices have something in common: they all have the same first-order Taylor approximation of beta_2 as a function of Delta. That means that in the low-distortion regime, when Delta approaches zero and therefore Lambda approaches one, all of them are asymptotically equivalent.
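These three expressions for beta_2 can be checked numerically (my own sketch; Lambda stands for the exponential of Delta, as above):

```python
import math

def beta2_multiplicative_dm(delta: float) -> float:
    """Boundaries at the arithmetic mean of consecutive centroids."""
    lam = math.exp(delta)
    return (lam - 1.0) / (lam + 1.0)

def beta2_log_dm(delta: float) -> float:
    """Centroid at the geometric mean of the quantization interval."""
    return math.sqrt(math.exp(delta)) - 1.0

def beta2_arithmetic_centroid(delta: float) -> float:
    """Centroid at the arithmetic mean of the quantization interval."""
    return (math.exp(delta) - 1.0) / 2.0

# All three approach the shared first-order Taylor term delta/2 as
# delta -> 0, so the schemes coincide in the low-distortion regime.
for delta in (0.5, 0.05, 0.005):
    print([round(f(delta) / (delta / 2.0), 4)
           for f in (beta2_multiplicative_dm, beta2_log_dm,
                     beta2_arithmetic_centroid)])
```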
0:08:07 To see it graphically: if the yellow bars represent the centroids, then for the first choice the quantization boundaries are located midway between centroids, at the arithmetic mean of two consecutive centroids; for the second choice the centroid is located at the geometric mean of the two boundaries; and for the third choice the centroid is located at the arithmetic mean of the two boundaries.
0:08:34 This choice of C can be made at the encoder, to define the quantization (embedding) boundaries, and at the decoder, to define the decoding-region boundaries.
0:08:51 The choice of C at the embedder and the choice of C at the decoder, which we will call C prime, do not have to be the same.
0:09:00 As we will see, the choice at the embedder is driven by the minimization of the embedding distortion power, and the choice of C prime at the decoder is driven by the minimization of the decoding error probability.
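A matching decoder might look as follows (again my sketch; it assumes decoder centroids at offset c_prime inside log-domain cells of width delta, with the same binary dither convention as the embedder):

```python
import math

# Minimum-distance decoder: for each candidate bit, find the nearest
# decoder centroid (at offset c_prime inside each cell) and pick the
# bit whose centroid is closest to log|z|.

def decode(z: float, delta: float, c_prime: float, dither: float = 0.0) -> int:
    y = math.log(abs(z))
    best_bit, best_dist = 0, float("inf")
    for bit in (0, 1):
        d_b = dither + bit * delta / 2.0
        k = round((y - c_prime - d_b) / delta)   # nearest cell index
        centroid = k * delta + c_prime + d_b     # nearest centroid for this bit
        if abs(y - centroid) < best_dist:
            best_bit, best_dist = bit, abs(y - centroid)
    return best_bit
```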
0:09:16 Here we have a formula for the embedding distortion power as a function of the host distribution. Under the assumption of a low-distortion regime, the equation becomes independent of the host distribution and we get this approximation.
0:09:32 This formula is symmetric with respect to C = Delta/2, and C = Delta/2 happens to be its minimum; the function grows toward the boundaries of the domain of C, reaching its maxima at C = 0 and C = Delta.
0:09:54 For what can be called the high-distortion regime, when Delta is very large, we have this other approximation. It is not a realistic regime, of course, because we would never operate at such high distortion, but it serves the purpose of checking how far we can diverge from the low-distortion approximation when that assumption does not really hold.
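The symmetry and the minimum at Delta/2 can be illustrated with a toy model (entirely my own assumption, not the paper's formula): if, in the low-distortion regime, the log-domain quantization error is uniform over a cell of width Delta and the centroid sits at offset C, the mean squared log-domain error works out to (C^3 + (Delta - C)^3)/(3*Delta).

```python
# Toy model (my assumption): E[(y - C)^2] for y ~ Uniform[0, Delta),
# i.e. the mean squared log-domain error with the centroid at offset C.
def log_domain_distortion(c: float, delta: float) -> float:
    return (c**3 + (delta - c) ** 3) / (3.0 * delta)

delta = 0.2
# Symmetric about C = Delta/2 ...
print(log_domain_distortion(0.05, delta), log_domain_distortion(0.15, delta))
# ... and minimized at C = Delta/2, growing toward C = 0 and C = Delta.
print(log_domain_distortion(0.10, delta) < log_domain_distortion(0.05, delta))  # True
```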
0:10:17 If we plot these equations we get this representation. The solid lines represent the experimental results, and the dashed lines represent the low-distortion approximation; on this side of the plot we are in the low-distortion regime, and we see that the approximation is really good.
0:10:38 The dash-dotted lines represent the approximation for the high-distortion regime, which is what the experimental results tend to on the other side of the plot.
0:10:48 The vertical axis is the document-to-watermark ratio (DWR), the inverse of the embedding distortion. We can see that if we choose values of C symmetric with respect to Delta/2, we get exactly the same curve, due to the symmetry of the formula in the previous slide, and we get the maximum DWR for the choice C = Delta/2, as predicted.
0:11:18 Moving to the probability of decoding error: if we take a minimum-distance decoder, which of course is not the optimal decoder but serves the purpose of giving a closed-form analytic expression for this probability, then in the low-distortion regime we arrive at this approximation, which depends on the choice of C at the embedder and on the choice of C prime at the decoder.
0:11:42 We see that this formula is minimized when C approaches Delta and when C prime approaches Delta/4.
0:11:50 So we have a trade-off in C: the choice of C at the embedder that minimizes the embedding power is not the same as the choice that minimizes the decoding error; one is Delta/2 and the other is Delta.
0:12:06 In any case, due to the symmetry of the embedding distortion formula, if we are in the low-distortion regime the optimum C will lie in the second half of the interval, between Delta/2 and Delta. If that regime does not hold, the symmetry no longer applies and C can be chosen at any point between zero and Delta.
0:12:27 It is also worth noticing that this formula is ill-defined for C prime tending to zero or to Delta/2, the points at which the sine term becomes null, so around those points the approximation will be worse.
0:12:43 If we plot the formula, the continuous lines are the theoretical approximation from the previous slide and the dots represent the experimental results.
0:12:58 We see that for C prime equal to Delta/4 we get the minimum probability of decoding error, and for values of C prime symmetric with respect to Delta/4 we get exactly the same curve, as with the previous formula.
0:13:15 We also see that the approximation is not very good around the ill-defined points of the formula. In any case, we see again that as we increase C toward Delta we get a lower probability of decoding error, as expected.
0:13:35 Let us now check the robustness of this method against different kinds of attacks, comparing it to other side-informed data hiding methods.
0:13:45 If we choose a JPEG attack and plot the bit error rate against the quality factor used for the attack, then when the attack is mild, i.e., for very high quality factors, logarithmic DM performs a bit worse than normal DM. That is because the error probability of logarithmic DM is then dominated by the small-magnitude coefficients, whose centroids are located very close to each other.
0:14:18 But over the rest of the plot, for low quality factors, i.e., when the JPEG attack is strong, logarithmic DM performs much better than normal DM, because the robustness of the centroids used for the high-magnitude coefficients is much better for logarithmic DM than for normal DM.
0:14:44 For another type of attack, the AWGN attack, we get exactly the same behavior: when the noise is mild, logarithmic DM performs worse than DM, but when the noise is strong, so that we have a low PSNR between the watermarked image and the watermarked-and-attacked image, the performance of logarithmic DM is much better than that of normal DM.
0:15:11 To conclude, we have seen in this work that Weber's law can be used to derive perceptual constraints for informed watermarking systems.
0:15:21 A generalized version of logarithmic DM has been derived, and for this generalized version we have studied the embedding distortion power and the probability of decoding error, and the parameter values that optimize these two figures.
0:15:39 We have also seen that the proposed scheme outperforms DM when we consider severe attacks: for strong attacks, of either JPEG or AWGN type, it clearly outperforms normal DM.
0:15:51 Thank you.
0:16:00 We have time for questions; the microphone is available.
0:16:04 [Question, partly inaudible]
0:16:25 Which one, this one?
0:16:29 Yes; this is for a fixed embedding distortion, and the power of the attack is also fixed, right.
0:16:47 Yes, I would have to check with the authors what measure they have exactly used, but of course some measure of distortion must be used in order to have a fair comparison.
0:17:07 [Inaudible question]
0:17:27 Yes, for sure; but in this sense, using a normal, perceptually unaware measure of distortion, we get a lower bound on the difference that we obtain with respect to DM; of course, if we used a perceptually aware distortion metric, we would get even better results.
0:17:49 From here it is not clear; I would have to check, but I think they have used a non-perceptual measure.
0:18:07 Yes, you are right.
0:18:16 Could we go to this slide? Thank you.
0:18:20 So apparently they chose only the high-magnitude coefficients for embedding. In my view this potentially introduces synchronization problems, and it seems to me that, since the error rate no longer goes to zero at high quality, you can no longer guarantee the efficiency of the scheme: you do not have one-hundred-percent embedding efficiency.
0:18:47 Have you looked at knowing which coefficients were modified at embedding time? Instead of selecting the high-magnitude coefficients again at detection, if you used exactly the same coefficients, would you get different curves? I mean, why this choice of the coefficients?
0:19:04 Well, I can answer that this choice does introduce the synchronization problem, I know.
0:19:09 So this scheme is kind of merging the two problems into one, right? There is a new modulation scheme and it also introduces this synchronization problem. Did you try to separate the two aspects?
0:19:25 No; here synchronization was not considered at all, so we have set the synchronization problem completely aside, and of course it would be a problem if this scheme were followed strictly. If we embedded in any coefficient, then we would not have those constraints.
0:19:41 So, thank you very much.
0:19:43 Thank you.