0:00:13 | yes, and i am one of the lucky guys who has a full-time research position |
---|---|

0:00:17 | i don't have to teach |

0:00:20 | the first author is one of my colleagues |

0:00:33 | and this paper is the result of his work |

0:00:40 | okay |

0:00:43 | the previous papers were speaking about receiver problems |

0:00:47 | we will be speaking about a different kind of problem |

0:00:50 | how can you directly obtain the iterative algorithms, just like in turbo codes or in iterative decoding of BICM, from maximum likelihood estimation |

0:01:08 | this topic has been of interest for many years |

0:01:13 | the analysis of iterative decoding has been done with tools such as EXIT charts and density evolution, and there has been quite a lot of analysis using factor graphs and belief propagation |

0:01:27 | interestingly enough, a very basic understanding of the process was obtained using information geometry, and in fact it was a first attempt to address this problem |

0:01:38 | and there were also first attempts using optimization: how could we obtain an iterative algorithm performing something close to maximum likelihood, using optimization? this is exactly what we will be doing |

0:01:53 | so the outline is as follows: first, we will write a maximum likelihood estimate |

0:02:06 | then we will translate it, using a few very simple tricks, into a very simple estimate, in such a way that it can be handled in terms of optimization |

0:02:20 | then we will show that, using a single approximation, we can obtain exactly the algorithm which is used for decoding BICM |

0:02:33 | but this approximation will cost us something |

0:02:38 | and we are going to use this to check whether the result of BICM decoding is efficient or not |

0:02:46 | meaning that, from the algorithm itself, we can get information about the efficiency of the algorithm, so that's cool |

0:02:59 | so this is the situation we will be considering: we have a convolutional code, the information bits here, the codewords here, which are interleaved, mapped to symbols and then sent through the channel |

0:03:12 | i will not be working with specific interleavers or specific mappings; everything which is given in the paper is valid for any kind of interleaver and any kind of symbol mapping |

0:03:25 | of course the performance will be different, and that will be seen later |

0:03:30 | okay, a word on notation: bold letters here are vectors, so that we can distinguish vectors from individual bits |

0:03:44 | the binary message is encoded into coded bits, which are interleaved into a sequence |

0:03:53 | okay, so the maximum likelihood sequence detector is as usual, it's very basic |

0:04:03 | you can perform the maximization either over the binary messages or over the coded bits, provided that you discard every combination of coded bits which does not belong to the code |

0:04:21 | okay, this is a compact notation for saying that: it is the indicator function of the code, whose value is one for a codeword and zero for a combination of bits which does not belong to the code |
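The maximum-likelihood criterion with the code indicator, as described here, can be written out; the symbols below are a reconstruction from the spoken description, not copied from the slides:

```latex
\hat{\mathbf{b}} \;=\; \arg\max_{\mathbf{b}\in\{0,1\}^{n}}
  p(\mathbf{y}\mid\mathbf{b})\,\mathbb{1}_{\mathcal{C}}(\mathbf{b}),
\qquad
\mathbb{1}_{\mathcal{C}}(\mathbf{b}) =
\begin{cases}
  1, & \mathbf{b}\in\mathcal{C} \text{ (a codeword)},\\
  0, & \text{otherwise.}
\end{cases}
```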

0:04:37 | okay |

0:04:38 | since this indicator is not really easy to manipulate from the optimization point of view, we use this very small trick, which i have not seen used before but which may well exist somewhere |

0:04:51 | the idea is that you can maximize over the values of the probability p here, and then take the maximum of the argument of this function |

0:05:04 | the advantages of doing that: the first one is that this probability can be fully factorized, meaning it's a product of individual probabilities for each bit |

0:05:19 | and the second advantage is that this probability is continuous, which, for an optimization problem, is much better than a non-continuous quantity |

0:05:30 | okay |

0:05:30 | in this state the problem is still intractable: why? because you have many bits in the vector, and if you want to compute this quantity for every possible combination, it costs a lot, okay |

0:05:46 | the first trick was this probability; the second trick is that, in this problem, you have two kinds of information |

0:05:56 | one piece of information is coming from the channel, because you have a measurement; the other is coming from the fact that you are looking for a candidate codeword, and this is a second source of information |

0:06:12 | so in this formula here, we just factorize it into p and q: p takes care of one of the pieces of information, q takes care of the other |

0:06:25 | okay |

0:06:26 | and instead of working with the full joint probabilities, we will be working on bit marginals, meaning we will have 2n variables instead of 2^n, obviously |

0:06:40 | maybe we cannot go very far using exactly this procedure, but up to here, in this process, we do not have any kind of approximation: the bit marginals appear, and this equation here is exactly equivalent to the previous one |

0:07:01 | the single approximation we make in this talk is this one: the bit marginal of the product is replaced by the product of the bit marginals |

0:07:12 | okay, so the previous equation is replaced by this one |

0:07:16 | okay |

0:07:17 | it obviously seems to be a coarse approximation, but it happens that if you choose the interleaver correctly, and if you choose the mapping correctly, it's not that bad an approximation |

0:07:29 | okay |
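The single approximation just mentioned (replacing the bit marginal of a product of distributions by the product of the individual bit marginals) can be illustrated on a toy two-bit example; the two pmfs and their values below are made up for illustration, not taken from the talk.

```python
# Two toy pmfs over 2-bit words (say, one channel-side, one code-side).
p = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}
q = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}  # only 00 and 11 are "codewords"

def marginal(dist, k):
    """Bit marginal P(bit_k = 1) of a (possibly unnormalized) pmf."""
    z = sum(dist.values())
    return sum(w for b, w in dist.items() if b[k] == 1) / z

# Exact: bit marginals of the (normalized) product distribution p*q.
prod = {b: p[b] * q[b] for b in p}
exact = [marginal(prod, k) for k in range(2)]

# Approximation: product of the individual bit marginals, renormalized per bit.
approx = []
for k in range(2):
    m1 = marginal(p, k) * marginal(q, k)              # factorized P(bit = 1)
    m0 = (1 - marginal(p, k)) * (1 - marginal(q, k))  # factorized P(bit = 0)
    approx.append(m1 / (m0 + m1))

print(exact)   # marginals of the true product distribution
print(approx)  # factorized approximation -- close, but not equal in general
```

The two outputs differ, which is the "coarse" part of the approximation; how close they stay depends on the joint structure, which is where the interleaver and mapping come in.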

0:07:32 | okay |

0:07:33 | but the decoding is now tractable, because the computation of the marginals is tractable: this one is computed by one block, and this one is computed by the other |

0:07:48 | and now we have a problem which is different for each bit; so we have changed the problem: the criterion here depends on the location k of the bit |

0:08:05 | so the original global problem is replaced by a distributed optimization strategy, with one problem and one cost function per bit |

0:08:17 | okay |

0:08:22 | this criterion here is relevant for the bit at position k |

0:08:30 | and it can be shown that if you solve it for the k-th bit, the solution must be such that the estimate coming from one set of information and the estimate coming from the other set of information must agree for the maximization |

0:08:52 | here, they should agree on whether this bit should be a zero or a one; okay, so they have to agree on the value of this bit |

0:09:02 | okay |

0:09:04 | so in this context, the joint maximization of all the criteria C_k must provide an agreement between the decoder estimate and the demapping estimate for the bit at position k |

0:09:21 | but since these are independent optimization problems, they could disagree implicitly on the estimates of the other bits |

0:09:30 | okay |

0:09:34 | and you will see in the next few slides that the exact iterative algorithm used in the decoding of BICM is exactly the solution of this kind of problem |

0:09:48 | okay |

0:09:50 | so what i have done up to this point is to derive the BICM iterative decoding algorithm, using a single approximation, from maximum likelihood estimation |

0:10:08 | now we will be building a bit more on that, meaning that if you take the global sum of these criteria, you can no longer have inconsistency between the estimates of the bits at the various positions k |

0:10:32 | okay, so everything has to agree in some way, meaning that the value taken by this quantity will be an indication of the agreement between the coder and the demapper for the whole sequence |

0:10:51 | okay, if they work individually, they must agree bit by bit; but if they work on the whole sequence, they all agree |

0:10:58 | and if there is absolutely no error anywhere, the estimate provided by this criterion will be exactly the maximum likelihood one; this is proved in the paper |

0:11:15 | okay, so that was, given in words, what is written here in equations; this is only outlined here |

0:11:24 | now we must come back to the individual criteria C_k: if you just optimize these individual criteria, you just find that this quantity here is the demapper, and this quantity here is something which may look strange, but it is exactly what is computed by the BCJR algorithm |

0:11:50 | okay |

0:11:52 | so we will be fixing one set of the quantities and perform an iterative optimization; we have a standard situation here, where one set is fixed to some previous value, and we compute the next quantity based on these solutions: this one here for computing q, and this one here for computing p |

0:12:25 | if you just look at what everything is doing, this is the demapper, classical in BICM decoding, and this is exactly what is computed by the BCJR algorithm, meaning the channel decoding algorithm |

0:12:40 | it's a compact notation, and you can numerically check that this is what the BCJR computes; maybe it's not that evident |

0:12:48 | if you provide to this procedure the a priori probabilities as they should be, if you compute the probabilities of each possible word as the product of these quantities, kill all the words which do not belong to the code, and then compute the marginals, which is exactly what is written here, if you do that, you obtain exactly the output of the BCJR |
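The numerical check described here (enumerate all words, weight each by the product of per-bit a-priori probabilities, zero out non-codewords, then re-marginalize bit by bit) can be sketched as follows; the 3-bit even-parity code and the prior values are placeholders, not the code or numbers from the talk.

```python
import itertools

# Toy linear code: 3-bit words with even parity (a stand-in for the real code).
codewords = {w for w in itertools.product([0, 1], repeat=3) if sum(w) % 2 == 0}

# A-priori probabilities P(bit_k = 1) fed to the decoder (placeholder values).
prior1 = [0.9, 0.2, 0.6]

def word_weight(w):
    """Product of per-bit a-priori probabilities for word w."""
    weight = 1.0
    for k, b in enumerate(w):
        weight *= prior1[k] if b == 1 else 1 - prior1[k]
    return weight

# Keep only the codewords ("kill" the rest), then re-marginalize bit by bit.
weights = {w: word_weight(w) for w in itertools.product([0, 1], repeat=3)
           if w in codewords}
z = sum(weights.values())
app = [sum(wt for w, wt in weights.items() if w[k] == 1) / z for k in range(3)]

print(app)  # a-posteriori bit marginals, constrained to the code
```

This brute-force version costs 2^n and is only usable for tiny codes; the point of the BCJR algorithm is to compute the same marginals efficiently on the code trellis.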

0:13:14 | and it turns out that these quantities, which were derived here by maximizing some criterion, are exactly what people in the coding field call the extrinsic information |

0:13:30 | and now there is no magic in propagating the extrinsic information rather than the a posteriori probabilities: it is the natural outcome of the optimization procedure |
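The extrinsic information just mentioned is, in LLR form, the a-posteriori value with the a-priori contribution subtracted out (equivalently, in probability form, the APP divided by the prior and renormalized); a minimal sketch with made-up numbers, not values from the talk:

```python
import math

def llr(p1):
    """Log-likelihood ratio log(P(b=0)/P(b=1)) of a bit probability P(b=1)."""
    return math.log((1 - p1) / p1)

prior_p1 = 0.2  # a-priori P(b=1) fed into the decoder (placeholder)
app_p1 = 0.9    # a-posteriori P(b=1) produced by the decoder (placeholder)

# Extrinsic in LLR form: remove the prior's contribution from the APP.
ext_llr = llr(app_p1) - llr(prior_p1)

# Same thing in probability form: divide out the prior, then renormalize.
num1 = app_p1 / prior_p1
num0 = (1 - app_p1) / (1 - prior_p1)
ext_p1 = num1 / (num0 + num1)

print(ext_llr, llr(ext_p1))  # the two forms agree
```

Propagating `ext_llr` instead of the full APP is exactly the classical rule in iterative decoding, and the derivation in the talk recovers it as a consequence of the distributed optimization rather than as a heuristic.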

0:13:44 | so, to summarize so far: we had an optimization problem which was formulated from maximum likelihood; we could obtain a single form exactly equivalent to maximum likelihood detection |

0:14:00 | we had an approximation which was obtained by fully factorizing the probability mass functions; then we obtained a distributed algorithm, because we had individual quantities which were optimized through a distributed optimization strategy |

0:14:19 | and if we now come back to the global criterion, the sum of all the C_k, we have a way of evaluating whether the approximation was good or not |

0:14:32 | because with the individual criteria alone, we would not be able to check whether the demapper and the decoder were agreeing on the values for the whole sequence |

0:14:51 | so that provides us a way of evaluating the quality of the solution of the BICM iterative decoder |

0:14:58 | okay, we have covered test problems which appear in another part of the paper |

0:15:04 | so let's see: what is new here? nothing but the fact that we rederived BICM decoding, or rather the fact that we now have a way of checking whether the result is correct or not |

0:15:19 | and you can see here the cost function of the global maximization versus, let's say, the bit error rate, the number of errors |

0:15:32 | and you can see that there is a huge correlation between them |

0:15:42 | and as another example i show you this kind of result: we have a constellation of 16-QAM with set-partitioning mapping, the convolutional code (5,7), and we are mixing the values of Eb/N0 from 5 to 12 dB with a uniform distribution, so it is a very complex situation |

0:16:05 | if we choose the threshold, in dB, equal to minus twenty here, then look at the bit error rate for the frames above the threshold, which are assumed to be good, because the function we are trying to maximize is above the threshold |

0:16:26 | you see that above the threshold the bit error rate is quite stable, but for the frames below the threshold, you have a much higher bit error rate |
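The frame-validation rule described here (accept a decoded frame when the maximized global criterion is above a threshold, flag it as likely erroneous otherwise) can be sketched as follows; the threshold value matches the -20 dB figure from the talk, but the per-frame scores are illustrative placeholders.

```python
# Per-frame values of the global criterion after decoding (illustrative numbers).
frame_scores = [-3.2, -55.0, -0.7, -31.4, -12.0]
threshold_db = -20.0  # frames below this are flagged as likely erroneous

accepted = [i for i, s in enumerate(frame_scores) if s >= threshold_db]
rejected = [i for i, s in enumerate(frame_scores) if s < threshold_db]

print("accepted frames:", accepted)  # expected to show a low bit error rate
print("rejected frames:", rejected)  # much higher bit error rate; candidates for retransmission
```

A rejected-but-actually-correct frame corresponds to the false-alarm event discussed next in the talk.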

0:16:41 | ideally it would be zero here and one here; okay, but we are close to that |

0:16:51 | the percentage of rejected frames for which the decision would in fact be correct is very small |

0:16:59 | and what is shown here is the probability of false alarm, meaning the percentage of sequences which have been rejected, because the criterion was below the threshold, but which were in fact correct |

0:17:14 | and this is the basis of the work |

0:17:18 | okay |

0:17:19 | okay, let me summarize what we obtained: iterative decoding obtained from maximum likelihood |

0:17:26 | we had no specific assumption about the interleaver, the mapping or whatever; these will obviously impact the performance, through the quality of the approximation which has been made here |

0:17:44 | we obtained clearly that we should propagate extrinsic information between both blocks, rather than a posteriori probabilities |

0:17:56 | and we have a convergence test which has been established, so that we can know whether the goal has been reached |

0:18:02 | and we have a process for evaluating the efficiency of the result, because of this analysis |

0:18:11 | it is likely, though we are not sure yet, and simulations are running to check it, that most of the errors we have in this kind of scheme are not due to the approximation, but that most of them are due to convergence to local minima |

0:18:27 | which makes sense, but we have to prove that, and this work is currently ongoing; thank you |

0:18:54 | do you think the same kind of analysis can be applied to other configurations? |

0:19:02 | well, yes: this applies to any kind of configuration; BICM here is just a toy example; for example, it applies to situations in which you have several sets of information on the same quantity |

0:19:17 | if you had here three sets of information, with a code on top of them, this would apply also; okay, it's quite generic |

0:19:37 | are you aware of any results from information theory which would justify your approximation? |

0:19:46 | no; i can send you the PhD thesis; we were not specifically working on information theory, but on how information geometry applies to this kind of work |

0:19:57 | it's tricky, and we could not really justify the approximation |