| 0:00:13 | So, good morning everyone, I'm Marc, |
|---|
| 0:00:17 | and I am going to present |
|---|
| 0:00:19 | joint work with my co-authors. |
|---|
| 0:00:28 | So, the title of my talk is "An extension of the ICA model |
|---|
| 0:00:33 | using latent variables". |
|---|
| 0:00:36 | So here is the outline of my talk. I will first introduce the problem, |
|---|
| 0:00:40 | then I will present the extension that I propose of the independent component analysis model. |
|---|
| 0:00:47 | I will start by recalling the classical linear mixture of independent sources, and then |
|---|
| 0:00:52 | I will present what I consider as dependent sources and the latent variables used |
|---|
| 0:00:59 | in this context. I will then |
|---|
| 0:01:02 | consider the problem of parameter estimation |
|---|
| 0:01:05 | in incomplete data, |
|---|
| 0:01:06 | and this will be based on a so-called iterative conditional estimation method. |
|---|
| 0:01:11 | So please note that there are two acronyms: ICA, |
|---|
| 0:01:15 | which you all know, and |
|---|
| 0:01:16 | ICE, which is iterative conditional estimation. |
|---|
| 0:01:20 | Then I will present the separation method, |
|---|
| 0:01:23 | and I will show you a few simulations, in the case where the latent process is |
|---|
| 0:01:28 | i.i.d. and in the case where it is not, |
|---|
| 0:01:30 | and finally I will conclude my talk. |
|---|
| 0:01:34 | So, |
|---|
| 0:01:35 | you probably all know blind source separation, or independent component analysis, |
|---|
| 0:01:39 | which is well known, |
|---|
| 0:01:42 | and the usual assumptions are that |
|---|
| 0:01:44 | the sources should be statistically mutually independent, |
|---|
| 0:01:47 | they should be non-Gaussian, |
|---|
| 0:01:49 | and the observations are |
|---|
| 0:01:52 | given by a linear instantaneous mixture of the sources. |
|---|
| 0:01:56 | However, |
|---|
| 0:01:58 | it is known that |
|---|
| 0:02:00 | independence is not always required. |
|---|
| 0:02:03 | For example, you may |
|---|
| 0:02:06 | rely only on decorrelation in the case of second-order methods, in case |
|---|
| 0:02:11 | you use non-stationarity or the colour of the sources, |
|---|
| 0:02:14 | and in the case of sources which belong to a finite alphabet, you |
|---|
| 0:02:20 | may also |
|---|
| 0:02:21 | relax the independence assumption; there are other very specific cases that were studied. |
|---|
| 0:02:28 | The objective of this talk is clearly to relax |
|---|
| 0:02:32 | the independence assumption in blind source separation, |
|---|
| 0:02:35 | and for doing this I propose a new model |
|---|
| 0:02:38 | where separation is possible |
|---|
| 0:02:41 | and where the dependence is controlled by a latent process. |
|---|
| 0:02:46 | So let me give you the notations; this is classical. I consider that I have capital T samples; |
|---|
| 0:02:53 | the observations are denoted by X(t), with capital N observations and capital N sources given |
|---|
| 0:02:59 | by S(t). |
|---|
| 0:03:00 | A is the unknown mixing matrix, a square invertible matrix, and at each time instant the observations X(t) |
|---|
| 0:03:07 | are given by the product A times S(t). |
|---|
| 0:03:11 | I am in the blind context, that is, only X, |
|---|
| 0:03:14 | which consists of the samples, |
|---|
| 0:03:16 | is available. |
|---|
| 0:03:18 | The objective of ICA is to estimate the sources; |
|---|
| 0:03:21 | in other words, it is equivalent to estimate B, |
|---|
| 0:03:24 | which is |
|---|
| 0:03:25 | the inverse of A, or an inverse of A up to a scaling and permutation ambiguity. |
|---|
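As a quick illustration of this notation, here is a minimal numerical sketch; the matrix and values below are made up for illustration and are not taken from the talk.

```python
# Minimal sketch of the notation: T samples, N sources S(t), square invertible
# mixing matrix A, observations X(t) = A S(t); only X is observed in the blind context.
import numpy as np

rng = np.random.default_rng(0)
T, N = 1000, 2                       # number of samples and of sources (assumed values)
S = rng.uniform(-1, 1, size=(N, T))  # independent, uniformly distributed sources
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])           # unknown square invertible mixing matrix
X = A @ S                            # linear instantaneous mixture

# The goal of ICA is to estimate B, an inverse of A up to scaling and permutation,
# so that B @ X recovers the sources.
B_ideal = np.linalg.inv(A)
print(np.allclose(B_ideal @ X, S))   # True: perfect separation with the exact inverse
```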
| 0:03:32 | So here is a figure that probably most of you have already seen |
|---|
| 0:03:37 | as a way to introduce |
|---|
| 0:03:40 | independent component analysis. |
|---|
| 0:03:43 | I consider the classical case where there are two sources, S1 and S2. S1 is on |
|---|
| 0:03:48 | the horizontal axis and S2 is on the vertical axis; this is the left |
|---|
| 0:03:52 | figure, |
|---|
| 0:03:54 | and they are independent and uniformly distributed. |
|---|
| 0:03:57 | When you mix them |
|---|
| 0:03:58 | with the matrix A which is given here, |
|---|
| 0:04:02 | then you have the observations on the right. I plot here the different samples of |
|---|
| 0:04:07 | X1 and X2 |
|---|
| 0:04:10 | at different time instants, and you see basically that |
|---|
| 0:04:13 | you can separate the sources because the different slopes of the edges of the parallelogram give |
|---|
| 0:04:24 | you the columns of the mixing matrix A. |
|---|
| 0:04:26 | Okay, this is well known, and now comes |
|---|
| 0:04:30 | the central idea of my talk. |
|---|
| 0:04:33 | What I am saying is that if ICA succeeds in |
|---|
| 0:04:36 | separating sources which have this distribution, |
|---|
| 0:04:39 | I should be able to do something and to perform separation if the distribution looks something like that. |
|---|
| 0:04:47 | So the idea is to consider sources which have |
|---|
| 0:04:52 | a distribution which is partly independent and partly dependent. |
|---|
| 0:04:59 | And, in order to |
|---|
| 0:05:00 | try to find an algorithm, the idea is to say: okay, |
|---|
| 0:05:04 | if I am able to erase |
|---|
| 0:05:06 | the points in green, |
|---|
| 0:05:07 | or to find |
|---|
| 0:05:08 | the points which are dependent in some sense, |
|---|
| 0:05:12 | then I should be able to use |
|---|
| 0:05:17 | any classical ICA algorithm in order to perform |
|---|
| 0:05:22 | blind source separation. |
|---|
| 0:05:25 | Okay, so the question is how can I model such a |
|---|
| 0:05:28 | type of distribution, such a set of sources. |
|---|
| 0:05:32 | Well, if you observe it carefully, you see that |
|---|
| 0:05:36 | actually you have a mixture of two probability densities, |
|---|
| 0:05:40 | and with a probability P |
|---|
| 0:05:43 | between zero and one the sources are independent, and with probability one minus P the sources are |
|---|
| 0:05:48 | dependent. |
|---|
| 0:05:50 | And one possible way, |
|---|
| 0:05:52 | which is |
|---|
| 0:05:52 | classical, |
|---|
| 0:05:54 | to model this is to introduce a latent, |
|---|
| 0:05:57 | hidden process R(t), |
|---|
| 0:06:00 | and to say: if R(t) is equal to zero, then |
|---|
| 0:06:03 | the sources are independent; if R(t) equals |
|---|
| 0:06:06 | one, then the sources are dependent. |
|---|
| 0:06:10 | So here I am writing the same idea in more mathematical terms. |
|---|
| 0:06:14 | I consider that I have a hidden process R(t) such that, conditionally on R, |
|---|
| 0:06:20 | the distribution of the components of S, |
|---|
| 0:06:21 | S1 to SN, |
|---|
| 0:06:23 | is given by the joint probability distribution which is written |
|---|
| 0:06:28 | on the screen. |
|---|
| 0:06:31 | R(t) takes two values, |
|---|
| 0:06:33 | either zero or one. If R(t) is equal to zero, then |
|---|
| 0:06:37 | the usual assumptions |
|---|
| 0:06:39 | you make in ICA apply, that is, the components are |
|---|
| 0:06:44 | independent |
|---|
| 0:06:45 | and non-Gaussian, |
|---|
| 0:06:47 | and if R(t) equals one, then you have the dependence, which |
|---|
| 0:06:51 | you can choose as you want and which I will specify later. |
|---|
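To make the latent-variable idea concrete, here is a small generative sketch; the particular densities (uniform components when independent, a shared factor when dependent) are illustrative assumptions only, since the talk leaves the dependent distribution free.

```python
# Sketch of the latent-variable source model (illustrative densities, not the talk's exact choice):
# R(t) = 0 -> sources independent (non-Gaussian); R(t) = 1 -> sources dependent.
import numpy as np

rng = np.random.default_rng(1)
T, N = 2000, 2
P = 0.7                                    # probability that R(t) = 0 (independence)

R = (rng.uniform(size=T) > P).astype(int)  # i.i.d. latent process: 0 with prob P, 1 with prob 1-P
S = np.empty((N, T))
for t in range(T):
    if R[t] == 0:                          # independent, uniformly distributed components
        S[:, t] = rng.uniform(-1, 1, size=N)
    else:                                  # dependent components (here: a common factor plus small noise)
        common = rng.uniform(-1, 1)
        S[:, t] = common + 0.05 * rng.standard_normal(N)

A = np.array([[1.0, 0.6], [0.4, 1.0]])     # mixing matrix as before
X = A @ S                                  # observations; R and S stay hidden in the blind context
```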
| 0:06:56 | Okay, that is the model I consider. |
|---|
| 0:06:58 | Now I am moving to the problem of parameter estimation. |
|---|
| 0:07:03 | Well, first let me say that |
|---|
| 0:07:06 | I did not say anything about R; |
|---|
| 0:07:09 | you have several possible choices. |
|---|
| 0:07:11 | Two possible choices are to model R as an i.i.d. Bernoulli process, |
|---|
| 0:07:15 | that is, |
|---|
| 0:07:16 | equal to zero with probability P or equal to one with probability one minus P, and this is |
|---|
| 0:07:22 | the i.i.d. case; you can also consider that R is given by a Markov chain. |
|---|
| 0:07:27 | This also works. |
|---|
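For the second option, a two-state Markov-chain latent process could be simulated as follows; the transition probabilities are assumed values, only meant to show what "R given by a Markov chain" means here.

```python
# Sketch of a two-state Markov chain for the latent process R (transition values are assumptions):
import numpy as np

rng = np.random.default_rng(2)
T = 2000
# Transition matrix: row i gives P(R(t+1) = j | R(t) = i); chosen so that runs of 0s and 1s persist.
Pi = np.array([[0.95, 0.05],
               [0.10, 0.90]])

R = np.empty(T, dtype=int)
R[0] = 0
for t in range(1, T):
    R[t] = rng.choice(2, p=Pi[R[t - 1]])  # draw the next state given the current one
```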
| 0:07:28 | Anyway, the objective is to estimate |
|---|
| 0:07:30 | a set of parameters which consists of the inverse of the mixing matrix and |
|---|
| 0:07:34 | the parameters of R. |
|---|
| 0:07:37 | As I told you before, if I am able to erase the green points, that is if R is known, |
|---|
| 0:07:44 | well, I can perform ICA, which is quite |
|---|
| 0:07:48 | easy, and I can estimate the parameters of R just by counting the number of occurrences |
|---|
| 0:07:53 | of zero: the number of times |
|---|
| 0:07:55 | R is equal to zero gives me an estimate of P. |
|---|
| 0:07:59 | I can select the time instants where independence of the sources holds, |
|---|
| 0:08:03 | consider only a restricted sub-sample of the |
|---|
| 0:08:07 | observation samples, and perform my favourite ICA algorithm. |
|---|
| 0:08:11 | In other words, |
|---|
| 0:08:13 | I have a complete data estimator: that is, if R and X are both available, I have an estimator |
|---|
| 0:08:20 | of the parameters I am interested in. |
|---|
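A hedged sketch of this complete-data estimator, assuming R is observed; scikit-learn's FastICA stands in for any classical ICA algorithm (the talk's simulations use JADE).

```python
# Sketch of the complete-data estimator: R is assumed known here.
import numpy as np
from sklearn.decomposition import FastICA

def complete_data_estimator(X, R):
    """X: (N, T) observations, R: (T,) labels (0 = independent instants)."""
    P_hat = np.mean(R == 0)               # estimate P by counting occurrences of zero
    X_indep = X[:, R == 0]                # keep only the instants where independence holds
    ica = FastICA(n_components=X.shape[0], random_state=0)
    ica.fit(X_indep.T)                    # scikit-learn expects (n_samples, n_features)
    B_hat = ica.components_               # estimated separating matrix (inverse of A up to scale/permutation)
    return B_hat, P_hat
```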
| 0:08:23 | Unfortunately, or fortunately maybe, because otherwise |
|---|
| 0:08:26 | I could not give this talk, |
|---|
| 0:08:27 | we are in an incomplete data |
|---|
| 0:08:31 | context, that is, |
|---|
| 0:08:33 | only X is available, and R and S are unknown. |
|---|
| 0:08:35 | At this point, please note that I distinguish two different |
|---|
| 0:08:40 | ideas: the idea of incomplete data refers to the fact that R is not |
|---|
| 0:08:45 | known, and the fact that |
|---|
| 0:08:48 | S is unknown |
|---|
| 0:08:52 | is what I mean when I say that I am in a blind |
|---|
| 0:08:55 | context. |
|---|
| 0:08:58 | Okay, so here is the second important idea of my talk. |
|---|
| 0:09:02 | For incomplete data estimation there is a |
|---|
| 0:09:07 | quite efficient |
|---|
| 0:09:08 | algorithm to perform |
|---|
| 0:09:09 | estimation, |
|---|
| 0:09:11 | and this algorithm is called ICE: iterative conditional estimation. |
|---|
| 0:09:16 | It consists in the following: you first initialize the parameter value, |
|---|
| 0:09:22 | then |
|---|
| 0:09:24 | you compute the expectation of the complete data estimator |
|---|
| 0:09:28 | given X, |
|---|
| 0:09:29 | and you iterate |
|---|
| 0:09:30 | this step |
|---|
| 0:09:32 | until you obtain convergence. |
|---|
| 0:09:36 | Note that if |
|---|
| 0:09:37 | some |
|---|
| 0:09:38 | of the components of the estimator are not computable, that is, if you cannot compute the |
|---|
| 0:09:45 | posterior expectation, |
|---|
| 0:09:47 | then you have another possibility: |
|---|
| 0:09:49 | you can just draw random labels R according to the law of R given X, |
|---|
| 0:09:54 | and then replace |
|---|
| 0:09:56 | the expectation |
|---|
| 0:09:57 | by a sum of the estimates divided by K, which is the number of realizations |
|---|
| 0:10:03 | you draw. |
|---|
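In generic terms, one ICE loop with the stochastic variant (drawing labels instead of computing the exact conditional expectation) might look like the sketch below; `draw_R_given_X` and `complete_data_estimator` are placeholders for the model-specific pieces, and the parameter is taken to be the pair (B, P).

```python
# Generic ICE loop, stochastic variant (a sketch with placeholder callables):
import numpy as np

def ice(X, theta0, draw_R_given_X, complete_data_estimator, n_iter=20, K=1):
    """theta is the pair (B, P): separating matrix and probability of independence."""
    B, P = theta0
    for _ in range(n_iter):
        Bs, Ps = [], []
        for _ in range(K):
            R_sim = draw_R_given_X(X, (B, P))             # draw labels from the law of R given X
            B_k, P_k = complete_data_estimator(X, R_sim)  # complete-data estimator on (X, R_sim)
            Bs.append(B_k)
            Ps.append(P_k)
        # Replace the conditional expectation of the estimator by the average over K realizations.
        B, P = np.mean(Bs, axis=0), float(np.mean(Ps))
    return B, P
```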
| 0:10:05 | So, |
|---|
| 0:10:05 | if you have a look at ICE, it has quite weak requirements: |
|---|
| 0:10:11 | you need a complete data estimator, and here just an ICA algorithm |
|---|
| 0:10:15 | will do the job, |
|---|
| 0:10:17 | and you need to be able to calculate the posterior expectation or, |
|---|
| 0:10:21 | much weaker, |
|---|
| 0:10:22 | to draw R according to |
|---|
| 0:10:24 | the law of R given X. |
|---|
| 0:10:27 | And here, in the model |
|---|
| 0:10:29 | I propose, the problem is to calculate the probability of R given X. |
|---|
| 0:10:34 | Well, if R is i.i.d., it is just given by the Bayes rule which is written here; |
|---|
| 0:10:39 | if R is a Markov chain, you can also do it, you just need to |
|---|
| 0:10:46 | resort to a |
|---|
| 0:10:47 | forward-backward algorithm, which allows you to compute |
|---|
| 0:10:51 | the probability of R given X. |
|---|
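For the i.i.d. case, the Bayes rule applied at each time instant could be sketched as follows; the densities p0 and p1 (source vector under independence and under dependence) are left as inputs, since the slide's exact formulas are not reproduced here.

```python
# Posterior of R(t) given X(t) in the i.i.d. case, by the Bayes rule (densities are left abstract):
# P(R=0 | x) = P * p0(Bx) / (P * p0(Bx) + (1-P) * p1(Bx)), applied independently at each instant.
# The |det B| Jacobian is the same under both hypotheses, so it cancels in the ratio.
import numpy as np

def posterior_R_given_X(X, B, P, p0, p1):
    """X: (N, T) observations, B: current separating matrix, P: current P(R=0).
    p0, p1: densities of the source vector under independence / dependence."""
    Y = B @ X                                   # current source estimates
    like0 = np.array([p0(Y[:, t]) for t in range(Y.shape[1])])
    like1 = np.array([p1(Y[:, t]) for t in range(Y.shape[1])])
    return P * like0 / (P * like0 + (1 - P) * like1)   # P(R(t) = 0 | X(t)) for every t
```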
| 0:10:54 | Okay, |
|---|
| 0:10:56 | so let me warn you about a problem that you may encounter. Here |
|---|
| 0:11:01 | is the distribution of the sources. |
|---|
| 0:11:03 | However, |
|---|
| 0:11:04 | you all know that ICA has some ambiguities, so |
|---|
| 0:11:09 | when you perform ICA |
|---|
| 0:11:13 | at a given |
|---|
| 0:11:14 | iteration, you do not know |
|---|
| 0:11:17 | which permutation you obtain, |
|---|
| 0:11:19 | so whether the inverse matrix you found corresponds to |
|---|
| 0:11:23 | the left or to the right distribution. |
|---|
| 0:11:25 | So to avoid this problem, |
|---|
| 0:11:27 | we decided |
|---|
| 0:11:29 | to consider, for the ICE |
|---|
| 0:11:31 | algorithm, that |
|---|
| 0:11:34 | the distribution |
|---|
| 0:11:35 | of the sources looks |
|---|
| 0:11:38 | like this; this is |
|---|
| 0:11:39 | invariant |
|---|
| 0:11:40 | under permutation and it avoids any problem. Of course this is not the true distribution; the true distribution is |
|---|
| 0:11:48 | at the top, and |
|---|
| 0:11:50 | for calculating |
|---|
| 0:11:52 | the law of R given X we chose to consider |
|---|
| 0:11:55 | that the distribution is this one. |
|---|
| 0:11:59 | So, |
|---|
| 0:12:00 | I just wrote this in more |
|---|
| 0:12:03 | mathematical terms: |
|---|
| 0:12:05 | you have the joint density of R and S given by the |
|---|
| 0:12:10 | top equation, |
|---|
| 0:12:12 | and the simulated data is as written in the |
|---|
| 0:12:16 | column in the |
|---|
| 0:12:17 | middle, and the assumed distribution for ICE is written on the right. You see you |
|---|
| 0:12:24 | have, |
|---|
| 0:12:25 | well, N is the normal law and L is the Laplace, that is, a double exponential law, |
|---|
| 0:12:30 | and we just symmetrized it |
|---|
| 0:12:33 | in the case of |
|---|
| 0:12:34 | the assumed distribution. |
|---|
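One possible reading is that the density assumed inside ICE is made invariant under permutation of the components, so the label posterior does not change if the ICA step returns the sources in a different order. Below is a sketch with assumed densities: a product of Laplace laws under independence, and a permutation-averaged (symmetrized) alternative under dependence; these exact choices are illustrative, not necessarily the ones on the slide.

```python
# Permutation-invariant assumed densities for the ICE step (illustrative choices):
import numpy as np
from itertools import permutations

def p0(s):
    """Under R = 0: independent components, each with a unit Laplace (double exponential) density."""
    return np.prod(0.5 * np.exp(-np.abs(s)))

def p1_asymmetric(s):
    """An example dependent density: first component Laplace, second Gaussian centred on the first."""
    return 0.5 * np.exp(-np.abs(s[0])) * np.exp(-0.5 * (s[1] - s[0]) ** 2) / np.sqrt(2 * np.pi)

def p1(s):
    """Symmetrized version: average over all permutations of the components,
    so the value does not change if ICA returns the sources in a different order."""
    s = np.asarray(s)
    perms = list(permutations(range(len(s))))
    return np.mean([p1_asymmetric(s[list(p)]) for p in perms])
```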
| 0:12:38 | Okay, so now comes |
|---|
| 0:12:40 | my combined ICA-ICE estimation algorithm. |
|---|
| 0:12:44 | You first initialize the parameters, |
|---|
| 0:12:49 | B0 and P0, |
|---|
| 0:12:51 | and then you iterate |
|---|
| 0:12:53 | the following steps. |
|---|
| 0:12:54 | You first calculate the posterior of R given X; having this law, |
|---|
| 0:13:02 | you can draw labels |
|---|
| 0:13:03 | R |
|---|
| 0:13:04 | according to the posterior law of R given X, |
|---|
| 0:13:08 | with the current value of the parameter. |
|---|
| 0:13:10 | Then |
|---|
| 0:13:11 | you select the time |
|---|
| 0:13:13 | instants where R is equal to zero, |
|---|
| 0:13:16 | and you perform an ICA algorithm |
|---|
| 0:13:20 | considering only these samples where R is equal to zero; you just ignore the other part. |
|---|
| 0:13:25 | You also |
|---|
| 0:13:27 | update the parameter P, which is the probability that |
|---|
| 0:13:31 | the sources are independent |
|---|
| 0:13:34 | at a given |
|---|
| 0:13:36 | time instant. |
|---|
| 0:13:38 | In the case of a Markov chain, it is almost the same thing; you just have |
|---|
| 0:13:43 | a slightly more complicated step in order to estimate |
|---|
| 0:13:45 | the parameters, but it is |
|---|
| 0:13:47 | basically the same. |
|---|
| 0:13:49 | And you can use |
|---|
| 0:13:50 | your favourite ICA algorithm; in our simulations we used JADE, but other |
|---|
| 0:13:55 | algorithms would work as well. |
|---|
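Putting the pieces together, the combined ICA-ICE loop could be sketched as below; it assumes the hypothetical helpers from the earlier sketches (posterior_R_given_X, complete_data_estimator and the densities p0, p1) are in scope, and the iteration count is arbitrary.

```python
# Sketch of the combined ICA-ICE algorithm (i.i.d. latent process), reusing the earlier helpers:
import numpy as np

def ica_ice(X, B0, P0, p0, p1, n_iter=30, rng=np.random.default_rng(0)):
    B, P = B0, P0                                        # initial parameter values
    for _ in range(n_iter):
        post0 = posterior_R_given_X(X, B, P, p0, p1)     # P(R(t) = 0 | X(t)) at the current parameters
        R_sim = (rng.uniform(size=X.shape[1]) > post0).astype(int)  # label 0 drawn with probability post0
        if np.sum(R_sim == 0) > X.shape[0]:              # need enough "independent" samples for ICA
            B, P = complete_data_estimator(X, R_sim)     # ICA on the R=0 samples + counting for P
    return B, P
```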
| 0:13:58 | So here are a few simulations. |
|---|
| 0:14:02 | You have here |
|---|
| 0:14:04 | the average mean square error on the sources depending on P, the probability that the sources are |
|---|
| 0:14:11 | independent at a given instant. P equal to one is the case where |
|---|
| 0:14:14 | the sources are effectively independent: |
|---|
| 0:14:17 | the assumptions of ICA hold. |
|---|
| 0:14:19 | We have in green |
|---|
| 0:14:20 | the case of complete data, that is, R is known: |
|---|
| 0:14:24 | I just perform ICA on the |
|---|
| 0:14:28 | restricted set consisting only of the |
|---|
| 0:14:32 | samples where independence holds, so it is a lower bound of the MSE that you can attain. |
|---|
| 0:14:38 | You have in blue the case where you just ignore |
|---|
| 0:14:42 | the dependence and act as if the sources were |
|---|
| 0:14:45 | independent although they are not, and just apply JADE, |
|---|
| 0:14:48 | and you can see that |
|---|
| 0:14:49 | when P decreases, that is, when the sources are no longer independent, |
|---|
| 0:14:54 | of course JADE |
|---|
| 0:14:55 | fails to separate the sources. |
|---|
| 0:14:58 | And in red |
|---|
| 0:15:00 | you have the results given by my algorithm, |
|---|
| 0:15:04 | which performs incomplete data estimation, |
|---|
| 0:15:08 | and you can see that we have |
|---|
| 0:15:12 | quite interesting performance |
|---|
| 0:15:14 | in the case where P is greater than, say, |
|---|
| 0:15:17 | 0.4 to 0.5, |
|---|
| 0:15:19 | depending on what you |
|---|
| 0:15:20 | expect. |
|---|
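As a side note on how such curves are computed: comparing separation quality through a mean square error requires resolving the scale, sign and permutation ambiguities first. A simple greedy way to do this (an assumption for illustration, not necessarily the speaker's exact protocol) is sketched below.

```python
# Sketch of an MSE evaluation that resolves scale, sign and permutation before comparing:
import numpy as np

def separation_mse(S_true, S_est):
    """Average per-source MSE after matching each estimated source to the best true source."""
    N = S_true.shape[0]
    # Normalize every row to unit variance so the scale ambiguity disappears.
    S_t = S_true / S_true.std(axis=1, keepdims=True)
    S_e = S_est / S_est.std(axis=1, keepdims=True)
    errors, used = [], set()
    for i in range(N):
        best = None
        for j in range(N):
            if j in used:
                continue
            # Allow a sign flip as well (another classical indeterminacy).
            err = min(np.mean((S_t[i] - S_e[j]) ** 2), np.mean((S_t[i] + S_e[j]) ** 2))
            if best is None or err < best[0]:
                best = (err, j)
        used.add(best[1])
        errors.append(best[0])
    return float(np.mean(errors))
```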
| 0:15:22 | Here you have basically the same results; |
|---|
| 0:15:25 | the only thing you can see is that the results remain true |
|---|
| 0:15:30 | whatever the sample size may be. |
|---|
| 0:15:33 | And finally, let me add that you can use the same method |
|---|
| 0:15:38 | in the case where R is a Markov chain, |
|---|
| 0:15:41 | and in this case |
|---|
| 0:15:43 | you can see that if you take into account the fact that R is a Markov chain, you are |
|---|
| 0:15:48 | able to improve |
|---|
| 0:15:49 | the performance. |
|---|
| 0:15:50 | Here we generated the latent process according to a Markov chain |
|---|
| 0:15:56 | and separated the sources either ignoring the Markov property or |
|---|
| 0:16:01 | using it, and we obtain better performance when |
|---|
| 0:16:06 | using the Markov property. |
|---|
| 0:16:08 | So, as a conclusion, I proposed a new model which consists of a linear mixture of sources; |
|---|
| 0:16:13 | the sources are dependent, |
|---|
| 0:16:17 | and there exists a latent, or hidden, process which controls the dependence of the sources. |
|---|
| 0:16:24 | And the separation method |
|---|
| 0:16:25 | that was proposed |
|---|
| 0:16:28 | relies on two different |
|---|
| 0:16:31 | methods: first ICA, which |
|---|
| 0:16:34 | is applied to the independent part of the sources' distribution, |
|---|
| 0:16:38 | and ICE, |
|---|
| 0:16:39 | which estimates |
|---|
| 0:16:40 | the parameter, or the parameters, |
|---|
| 0:16:43 | in an incomplete data |
|---|
| 0:16:46 | context. |
|---|
| 0:16:47 | I thank you for your attention. |
|---|
| 0:17:05 | Do you have a use case for this model? |
|---|
| 0:17:07 | That is a very good question. I have no application at this time, but I am interested; tell me if |
|---|
| 0:17:13 | there is an interesting application. |
|---|
| 0:17:16 | And if I understand well, |
|---|
| 0:17:19 | you want to estimate |
|---|
| 0:17:21 | just from the independent instants? |
|---|
| 0:17:23 | Oh, okay, |
|---|
| 0:17:24 | so when you want to estimate the mixture, |
|---|
| 0:17:29 | you only use the independent samples? |
|---|
| 0:17:33 | That is what you do? |
|---|
| 0:17:34 | Yes, I find |
|---|
| 0:17:37 | the time instants where independence holds, |
|---|
| 0:17:40 | and I ignore the instants, |
|---|
| 0:17:42 | the samples, where independence does not hold. |
|---|
| 0:17:45 | But you still mix |
|---|
| 0:17:47 | the others? |
|---|
| 0:17:49 | I mean, when you have dependence, |
|---|
| 0:17:53 | you still have to estimate the mixture. |
|---|
| 0:17:59 | I am not sure I understand your question. Okay, you have two underlying processes? |
|---|
| 0:18:04 | Yes, |
|---|
| 0:18:05 | and I think in one you have independence of the sources |
|---|
| 0:18:08 | and in the other you have dependence, |
|---|
| 0:18:10 | but you still have the mixture |
|---|
| 0:18:12 | in both cases. |
|---|
| 0:18:14 | Yes, but I did not quite understand. |
|---|
| 0:18:17 | I have just one vector process, |
|---|
| 0:18:20 | say S1(t) up to SN(t); |
|---|
| 0:18:22 | at the different time instants either they are independent or not, and I mix them with the |
|---|
| 0:18:27 | same matrix. |
|---|
| 0:18:30 | Really, even in the case where they are dependent? |
|---|
| 0:18:32 | Yes, I mix them. |
|---|
| 0:18:34 | I have S1(t), S2(t), |
|---|
| 0:18:36 | and I mix them, |
|---|
| 0:18:37 | whether they are dependent or not. |
|---|
| 0:18:39 | Okay, yeah, I was just wondering |
|---|
| 0:18:42 | whether it could be possible to do or learn something from the case |
|---|
| 0:18:46 | where they are dependent, because |
|---|
| 0:18:49 | you also have the mixture in that case, so maybe some of |
|---|
| 0:18:52 | this information |
|---|
| 0:18:54 | could be exploited. Yes, of course, if you assume something, then yes; |
|---|
| 0:18:58 | I agree, if you know more about the distribution then you can |
|---|
| 0:19:04 | probably perform better; here I just ignore a part of the information. Yes, that's it. |
|---|