Hi, I'll be presenting our paper, titled "The Group Two-Sided Orthogonal Procrustes Problem." This is joint work with my adviser, Brian, and with Professor Peter.

Okay, so a well-known problem that's been around a while is the orthogonal Procrustes problem, where we seek to identify an orthogonal transformation between two data matrices. In this context we seek to minimize the squared Frobenius norm of the difference, in this case matching the rows of B with the rotated rows of A:

$$\min_{R : R^\top R = I} \; \|A R - B\|_F^2.$$

This is useful because R serves as a geometric rotation that corrects for rotational distortions between A and B.

Going to a second-order version, there is the two-sided orthogonal Procrustes problem, where you have two symmetric similarity matrices, C and C_R, with the subscript R standing for reference. In this problem, rather than the rotation acting on one side, we have it acting on both sides, in the form $R^\top C R$, and again we are trying to match up this two-sided rotation with some reference similarity matrix:

$$\min_{R : R^\top R = I} \; \|R^\top C R - C_R\|_F^2.$$

uh

This has a bunch of applications. One is covariance or correlation matrix matching: if we rotate a data matrix A to AR, we have induced a double rotation on the covariance matrix, since $(AR)^\top (AR) = R^\top (A^\top A) R$.

A more interesting problem is related to graph matching. In isometric graph matching we are given two adjacency matrices, again C and C_R, which can be weighted, and the aim is to find a permutation matrix which serves to reorder the nodes of one graph onto matching nodes in the other graph, again matching in the Frobenius-norm sense.

This is a hard problem given its combinatorial nature. One popular method, Umeyama's method, involves relaxing the problem to an orthogonal constraint and then finding the closest permutation matrix with the Hungarian algorithm.
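As a rough illustration of that last projection step, here is a minimal sketch (not the paper's code) of snapping an orthogonal relaxation to the nearest permutation with SciPy's Hungarian solver:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def closest_permutation(R):
    """Project an orthogonal relaxation R onto the permutation matrices
    by maximizing <P, R>, which is a linear assignment problem."""
    rows, cols = linear_sum_assignment(-R)  # negate: the solver minimizes
    P = np.zeros_like(R)
    P[rows, cols] = 1.0
    return P
```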

Okay, so if we go back to the one-sided orthogonal Procrustes problem: it has a known solution, and under the right circumstances it is unique. It is given by the singular value decomposition of $A^\top B$, where the singular values are replaced by all ones: if $A^\top B = U \Sigma V^\top$, then $R = U V^\top$.
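In code, that solution is a couple of lines; a minimal sketch assuming A and B are NumPy arrays of matching shape:

```python
import numpy as np

def orthogonal_procrustes(A, B):
    """Solve min ||A R - B||_F^2 over orthogonal R: take the SVD of A^T B
    and replace the singular values with ones, giving R = U V^T."""
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt
```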

For the two-sided version, however, it is a bit more complicated, because we have a closed-form solution, but it is really a family of solutions. If the eigendecomposition of C is $V \Lambda V^\top$ and that of C_R is $V_R \Lambda_R V_R^\top$, then a solution to this problem is given by $R = V D V_R^\top$, where D is a diagonal matrix whose every diagonal entry is plus one or minus one. So right away we see that we can have $2^n$ possible solutions, at least.
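A sketch of one member of that family, assuming the eigenvectors of C and C_R are sorted consistently (here by descending eigenvalue):

```python
import numpy as np

def two_sided_solution(C, C_ref, signs):
    """Return R = V diag(signs) V_ref^T, one member of the 2^n-element
    solution family; signs is a vector with entries in {+1, -1}."""
    lam, V = np.linalg.eigh(C)
    lam_r, V_ref = np.linalg.eigh(C_ref)
    V = V[:, np.argsort(-lam)]            # descending eigenvalue order
    V_ref = V_ref[:, np.argsort(-lam_r)]
    return V @ np.diag(signs) @ V_ref.T
```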

So that was a bit of history. In this talk we are going to consider a scenario where, for example, graph vertices don't just have similarity measures between nodes, but there is also some side information. This paper considers the case where the vertices are partitioned into groups.

So for example, say we are modeling a face: certain nodes belong to the eye and certain nodes belong to the nose. Or we can think of a spatial configuration: in fMRI analysis we might have certain regions of interest, so we have connectivities between voxels, but we also know which region each voxel belongs to.

Okay, so suppose our dataset is partitioned into M groups, that is, for each point i there is some group that i belongs to, which we will denote g(i). The goal, again, is to estimate R in the two-sided orthogonal Procrustes problem, but here R is going to be block diagonal, and those diagonal blocks correspond to the group structure of the problem.

We call this the group two-sided orthogonal Procrustes problem, and of course when M equals one we recover the original, standard two-sided problem.

So, the block-diagonal constraint essentially introduces linear constraints: the elements of this large matrix R are such that the entry $R_{pq} = 0$ if nodes p and q belong to different groups.
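Concretely, R is assembled from one orthogonal block per group; a small illustrative example with hypothetical sizes:

```python
import numpy as np
from scipy.linalg import block_diag

# Hypothetical example: n = 5 nodes, M = 2 groups, g = [0, 0, 0, 1, 1].
# R[p, q] = 0 whenever g(p) != g(q), so R is block diagonal:
R1 = np.linalg.qr(np.random.randn(3, 3))[0]  # random orthogonal 3x3 block
R2 = np.linalg.qr(np.random.randn(2, 2))[0]  # random orthogonal 2x2 block
R = block_diag(R1, R2)                       # zeros outside the blocks
```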

Some benefits of this approach: these constraints regularize the problem, so we might be reducing overfitting, and we are using prior knowledge that we have about the graph. Also, we have fewer parameters to estimate, so we are dealing with a lower-dimensional problem.

Okay, so for M equal to one we saw that the problem has a closed-form family of solutions, but for M greater than one we are going to have to look at an iterative technique.

So let $C_{ii}$ and $C_{ii}^R$ denote the within-group similarity structures, and let $C_{ij}$, for i not equal to j, denote the cross-group similarity structures.

Again, given that this is going to be an iterative solution, we would like a good starting point, and a logical one is to just consider the within-group similarities. So we can initialize each diagonal block of our rotation matrix to $R_m = V_{mm} D_m (V_{mm}^R)^\top$, where $V_{mm}$ and $V_{mm}^R$ come from the eigendecompositions of $C_{mm}$ and $C_{mm}^R$.
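A minimal sketch of that initialization, with the sign matrices $D_m$ left as the identity (the sign assignment comes later, from the cross-group step):

```python
import numpy as np
from scipy.linalg import block_diag

def init_rotation(C_blocks, C_ref_blocks):
    """Within-group initialization: R_m = V_mm V_mm^R.T for each group m,
    using the eigenvector bases of C_mm and C_mm^R (D_m = I here)."""
    blocks = []
    for C_mm, C_mm_ref in zip(C_blocks, C_ref_blocks):
        lam, V = np.linalg.eigh(C_mm)
        lam_r, V_ref = np.linalg.eigh(C_mm_ref)
        V = V[:, np.argsort(-lam)]            # match eigenvalue ordering
        V_ref = V_ref[:, np.argsort(-lam_r)]
        blocks.append(V @ V_ref.T)
    return block_diag(*blocks)
```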

From this starting point, our goal is to improve on the initial estimate by looking at the cross-group similarities. This breaks out into a two-step process. The first step is to assign appropriate plus-one or minus-one entries to the D matrices, and it turns out that, when you expand the problem, this reduces to a quadratic binary optimization problem, so we can get an approximate solution by using algorithms that solve max-cut.
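As a rough stand-in for a max-cut style solver, a greedy sign-flip local search on a quadratic binary objective looks like this (how the matrix W is built from the cross-group terms follows the paper and is not shown here):

```python
import numpy as np

def greedy_signs(W, n_passes=10):
    """Approximately maximize d^T W d over d in {+1, -1}^n by greedily
    flipping one sign at a time while the objective improves."""
    n = W.shape[0]
    d = np.ones(n)
    for _ in range(n_passes):
        improved = False
        for i in range(n):
            # change in d^T W d if we flip d[i]
            delta = -4 * d[i] * (W[i] @ d - W[i, i] * d[i])
            if delta > 0:
                d[i] = -d[i]
                improved = True
        if not improved:
            break
    return d
```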

The second step, after we have our D matrices, is to refine the orthogonal estimates with a greedy strategy. The way we do that is by selecting pairs of elements that are within groups and applying appropriate Givens rotations to decrease the objective, so this is a form of block coordinate descent.

To expand a bit on the second step: we can view the matrix C as a noisy, rotated version of C_R, and because of the noise, the eigenvectors of the within-group similarities are perturbed. So in the factorization we had for our initial estimate, $R_m = V_{mm} D_m (V_{mm}^R)^\top$, we introduce another matrix $U_m$ to get $R_m = V_{mm} U_m D_m (V_{mm}^R)^\top$, and we view the $U_m$ and the $D_m$ as a corrective measure to decrease the objective.

So the $U_m$'s and the $D_m$'s are there to incorporate the cross-group similarities and correct the perturbed eigenvectors.

If we let U be the block-diagonal matrix built from $U_1$ through $U_M$, then what we do in our greedy approach is generate Givens rotations: we pick indices $p_t$ and $q_t$ that belong to the same group, and we successively apply Givens rotations at iteration t to decrease the objective. The pairs $(p_t, q_t)$ are found using an approximate greedy selection, and the angle is chosen by a sampled line search.
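A minimal sketch of one such step, applying the rotation directly to R rather than to U (equivalent up to the factorization, under the assumption that p and q are in the same group so the block pattern is preserved):

```python
import numpy as np

def best_givens_step(R, C, C_ref, p, q, n_angles=32):
    """Rotate coordinates p and q (same group) by the sampled angle that
    most decreases the objective ||R^T C R - C_ref||_F^2."""
    def objective(R_):
        return np.linalg.norm(R_.T @ C @ R_ - C_ref) ** 2

    best_R, best_f = R, objective(R)
    for theta in np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False):
        G = np.eye(R.shape[0])
        c, s = np.cos(theta), np.sin(theta)
        G[p, p] = G[q, q] = c
        G[p, q], G[q, p] = -s, s
        R_try = R @ G  # G is block diagonal, so R_try stays block diagonal
        f = objective(R_try)
        if f < best_f:
            best_R, best_f = R_try, f
    return best_R
```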

So, let's look at some results. This was tested on two datasets: the first was the MNIST handwritten digits, and the second was the Yale B face database. For MNIST we have ten groups for the ten digits, and for the face database we have fifteen groups for fifteen people. Each group was transformed by its own random orthogonal matrix $R_m$ and then corrupted with noise.

What you see before you: the diagonal blocks of this matrix are the within-group similarity structures, and the off-diagonal blocks are the cross-group ones. So you will see, for example, the covariance of digits three and seven here.

Okay, so to assess accuracy, we looked at our estimate, denoted $\hat{R}$, where each block of the rotation matrix is updated, and we computed the inner product of $\hat{R}$ with R and took the average, i.e., $\langle \hat{R}, R \rangle / n = \operatorname{tr}(\hat{R}^\top R) / n$, so this is a figure between minus one and one.
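In code, that similarity score is one line; a sketch under the assumption that both matrices are n x n and orthogonal:

```python
import numpy as np

def match_score(R_hat, R_true):
    """Normalized inner product tr(R_hat^T R_true) / n; for orthogonal
    n x n matrices this lies in [-1, 1], with 1 an exact match."""
    return np.trace(R_hat.T @ R_true) / R_true.shape[0]
```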

We see that iteration zero is the result if we take into account only the within-group similarities; then, as t, the number of iterations, increases, our knowledge of the cross-group similarities helps us obtain a better match. This is true for both MNIST and Yale B.

Moving to the more interesting problem of graph matching: each group in this case was transformed by some random permutation matrix and then perturbed with noise. Here, rather than stopping at the orthogonal estimate, after solving for each $\hat{R}_m$ at every time step you then use the Hungarian algorithm to get the corresponding permutation matrix, and we plot the accuracy as the fraction of correct permutation matches.
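Reusing the closest_permutation sketch from earlier on each block, the accuracy could be scored as follows (a hypothetical helper, not the paper's code):

```python
import numpy as np

def permutation_accuracy(P_hat, P_true):
    """Fraction of nodes whose estimated assignment matches the true one."""
    return np.mean(P_hat.argmax(axis=1) == P_true.argmax(axis=1))
```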

For MNIST, there is a steeper improvement between looking only at within-group structure and also looking at cross-group structure.

In conclusion, we posed the group version of the two-sided orthogonal Procrustes problem; we derived an algorithm that is computationally efficient, thanks to the Givens rotations, and simple to implement; and our results show that effectively utilizing both within-group and cross-group structure allows the estimation of the unknown orthogonal transformation. Thanks.


Question: I guess the obvious question is, what can you say about convergence of this scheme?

I would say that there are probably local minima, given the constraints, and so the key is starting from that good initial point. Beyond that, I can't say, no.