12.05.2024 VIBS 675 29 Feb 2024 B



Okay, we'll just give people a couple of minutes to come in. In the meantime you can upgrade your toolbox: use the code from the quick installation in the Teams channel, and make sure you're in the right folder — I hope I was in the right folder. Here is our agenda. Chata is going to show us some demonstrations, because he talked about neural networks last time and I assume he's going to give some examples. After that I'll walk through the neural network / deep neural network framework provided by the MATLAB toolbox. Then we'll look at an example data set, the built-in data set called patients, and at the MATLAB app called Classification Learner. This is an interface that gives you access to many of the classifiers predefined in MATLAB; as long as you know how to organize your input data set — which variables are the inputs and which is the response variable you want to predict — you define X and y, and then you can use Classification Learner.

After that, I believe we need to get familiar with comparing two data sets: you process the two data sets independently, get the SCE variables, and save them to sce1 and sce2. We need to learn how to merge them so they're ready to compare — for example, one may be from wild type and the other from a knockout, i.e., the treatment groups. Then we'll look at how you can annotate cell types using your own user-defined marker-gene database. Right now, when you click the automatic annotation, the program uses the PanglaoDB marker-gene database to annotate your cells; PanglaoDB provides 148 predefined cell types, and those marker genes come from the literature and from user input. But you can define your own marker-gene database, and I'll show you how to use that.

If we still have time at the end, I'll show you how to set up the Python environment. The toolbox is maybe 99% developed in MATLAB, but some functions call Python libraries; in that case you can expand the functionality of the toolbox by calling external functions written in other languages. So we need to install Python on this machine, and then you can call those functions and get the results back on your screen.

Okay, Che is going to come up and show us.

[Che] Hello everyone. In the last lecture I talked to you about neural networks from the beginning, the end, and the future, so I prepared some examples for you. First of all, I'll demonstrate what a perceptron is and the bottleneck of perceptrons back then, which I told you about with the XOR problem. I'll demonstrate all of that, and talk about how we train neural networks and what the parameters are for a proper design, or combination, I could say. Let's start with the perceptron. I'll simply run this code — I'll share the code in Teams, by the way, if you'd like to look at it and play with it.

When we start training, we see a neural-network training window. As you can see there is an epoch counter; epoch means the iteration. In this training process we have to set some parameters, which we can call stopping criteria, and at least one of them has to be satisfied to end the training. It can be the epoch number — when training hits the maximum number of epochs, it stops. Or we can set a time limit on the training process, or we can set a performance goal; whichever is satisfied first ends the training. The best option is performance: when we set the performance goal to zero, it means the error is zero at the end of training. That's a beautiful solution and we'd like it every time, but it doesn't always happen, as you'll see.
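The three stopping criteria described above (epoch limit, time limit, performance goal) can be sketched as a plain loop. The lecture's training window is MATLAB's; this is a language-neutral Python toy where all names, the fake loss, and its decay rate are mine, for illustration only:

```python
import time

def train(max_epochs=1000, max_seconds=5.0, goal=0.0, shrink=0.9):
    """Toy training loop with three stopping criteria: epoch limit,
    time limit, and performance goal (whichever is hit first)."""
    loss = 1.0              # pretend starting error
    start = time.time()
    for epoch in range(1, max_epochs + 1):
        loss *= shrink      # stand-in for one real training step
        if loss <= goal:
            return epoch, loss, "goal"
        if time.time() - start >= max_seconds:
            return epoch, loss, "time"
    return max_epochs, loss, "epochs"

# With goal=0 the loss 0.9**n gets tiny but never reaches exactly zero,
# so training stops on the epoch limit instead -- the situation in the demo.
epoch, loss, reason = train(max_epochs=200, goal=0.0)
```

Setting the goal to zero is the "beautiful solution we want every time"; in practice the epoch or time limit usually fires first, exactly as in the demo.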

This time the epoch limit ended our training process, because the network isn't able to fit the task we've given it. As you remember, this is what we call the XOR problem: there are two classes, the pluses and the circles, and it's impossible to separate them with a single straight line. A single-neuron classifier — a perceptron — can only draw a single line to classify the data, so it's impossible for it to complete this task. Our network consists of only one neuron: the two inputs represent the X and Y coordinates, they're multiplied by the weights, we add the bias, and the threshold function produces the output. That output corresponds to the line you see plotted here. So it didn't do the job; it simply cannot.

This is the main point that was criticized back then, and what led some people to write a book arguing the potential was so limited. But it was really all about the bad design and limited generalization of those perceptrons, because, as you'll see, when we add multiple neurons it becomes easy to solve the problem. Let's proceed. This is another example; I'll simply run it. When our problems are suitable for perceptrons, they can be solved easily with multiple lines, but it all depends on the allocation of the classes. Here there are four classes, and we assigned them positions in the coordinate system deliberately, so that a perceptron with multiple neurons can solve the problem.
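The XOR failure above is easy to reproduce. The demo is MATLAB, but here is a minimal Python sketch of the classic perceptron learning rule on the XOR truth table (weights, learning setup, and epoch count are my choices); because no single line separates XOR, the per-epoch error count provably never reaches zero:

```python
def step(x):              # hard-limit (threshold) activation
    return 1 if x >= 0 else 0

# XOR truth table: not linearly separable
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w1 = w2 = b = 0.0
for epoch in range(100):                  # perceptron learning rule
    errors = 0
    for (x1, x2), target in data:
        y = step(w1 * x1 + w2 * x2 + b)
        e = target - y
        if e != 0:                        # misclassified: nudge the line
            errors += 1
            w1 += e * x1; w2 += e * x2; b += e
    if errors == 0:                       # would mean a separating line exists
        break

# One neuron draws one line, so `errors` never hits zero on XOR.
```

This is exactly why the training window in the demo runs until the epoch limit rather than the performance goal.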

Now there are two neurons and two outputs, which can draw two lines, and these two lines successfully classify our data. When we look at the output, the `ans` value is the performance — the error of the perceptron during training: it starts at eight, then four, then zero. So this time the training process ended by satisfying the error criterion. Here is the output matrix, or vector, and this is the target; when they match exactly, or nearly so, the training is done and it was successful.

But there's a catch. Remember I said we have to define our targets in the coordinate system properly: when we just shift them and run it again, we see something not so good. This is the bottleneck, the main problem of the perceptron I've been telling you about: it can never draw this boundary. Even though we have four classes, the alignment is similar to XOR, so it cannot classify them properly, and as you can see the error is never going to reach zero; in the end it will just stop and fail, and we don't want to wait for that. Maybe you can dig in to understand why this isn't working while the previous code block is — I'll leave that shift in for you, and I'll be sharing this code if you want to investigate it. Let's proceed. This is the main thing we'll spend some time on: we move from the perceptron to what we now call the multilayer perceptron, by adding an extra layer.

This added layer is called the hidden layer. When we add a hidden layer — instead of just an input and an output — the network becomes an MLP, a multilayer perceptron. What we call the architecture looks like this: all the neurons in each layer are connected. They don't have to be fully connected, but this is the most basic form, the standard version of this class of neural networks. When we connect them all together and the information flows from the input layer to the output layer, we can call it a feedforward neural network. The main type is the multilayer perceptron — whenever we have at least one hidden layer it's a multilayer perceptron — but we can specialize it with other properties. As you can see there are no loops or backward connections; those would make a different, special type of neural network that I'll tell you about in a bit. So this is the feedforward neural network, feedforward NN; you'll often see the shorthand FFNN if you search papers on the web. Basically, a feedforward neural network is a multilayer perceptron, and a multilayer perceptron is a perceptron with an extra layer and multiple neurons. That's basically it.

Now, why am I telling you this? Let's shift to the feedforward network, set the number of neurons to two, but keep the perceptron properties: hardlim, the threshold function, as the activation function, and trainc as the training function.
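To see why one hidden layer changes everything, here is a feedforward pass of a 2-2-1 MLP with hand-picked weights that solve XOR (the weights are my choice for illustration — the demo learns them instead); two threshold units computing OR and AND, combined at the output, give exactly the region one perceptron could never draw:

```python
def step(x):
    return 1 if x >= 0 else 0

def mlp_xor(x1, x2):
    """Feed-forward pass of a 2-2-1 multilayer perceptron with
    hand-picked weights (illustrative, not learned)."""
    h1 = step(x1 + x2 - 0.5)        # hidden unit 1: logical OR
    h2 = step(x1 + x2 - 1.5)        # hidden unit 2: logical AND
    return step(h1 - 2 * h2 - 0.5)  # output: OR and not AND = XOR

outputs = [mlp_xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
# → [0, 1, 1, 0]
```

Each hidden unit draws one line; the output unit combines the two half-planes into the XOR region — the geometric picture behind the "two lines" plots in the demo.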

trainc is the function for classic perceptron learning. So even though we've advanced the architecture, if we keep the same properties — the same learning rule and activation function — we still have some problems, the problems behind what we call the winter of AI. Let me run this code with only two neurons and see what we can do. By the way, we're still trying to solve the XOR problem, and I'm pretty sure it will hit the maximum epoch again, because it's challenging for this setup... okay, that was good. It isn't that easy every time; we have to run it multiple times. But as you can see, it successfully drew two lines to separate the two classes. This is still basically a perceptron; we just used a slightly more advanced form of the architecture.

So with this little improvement we can solve the XOR problem. This is our network... okay, I can't find the eraser. But it's not only that this simple setup can solve XOR: we cannot trust a machine-learning algorithm based on only one run. We are trying to approximate a function, so we have to run it multiple times to trust the result — it must not be a coincidence; we're not applying an exact classical or parametric method. Let me try another time. There's also an improvement window to watch: last time we stopped by hitting the maximum epoch with no performance improvement. We got very close to zero error, but as you can see it never reached it.

Oh, it did it again — it took me at least ten tries to see this failure when I was preparing the examples. Can I run it one more time? Normally, when you propose a method or try to publish anything, you have to run this kind of trial maybe a thousand times, with different setups and different parameters, to convince people it really works. But here we're working with a very simple design. Okay, do it wrong this time, please... no, it succeeded again; it's ruining my plan. Just one more time. It all depends on the initial weights, by the way. Let me label them: w11, w12, w21, w22. We want to reach the optimal value of each weight, and during training what we're doing is updating them according to the error at the end, as I told you on Tuesday. Whenever we find the optimum, or get close enough to it, we end up with successful predictions. Okay, it succeeded again — sorry I couldn't make it fail for you, but believe me, it doesn't work like this every time; it fails a lot. It's just coincidence, because the initialization of these weights is random: each time I hit the button and training starts, the weights are initialized randomly, and when you're lucky they start in a good region of the solution space.

The shape of our solution space is not convex, not linear, nothing of the sort — it's unknown territory. When we initialize the weights, the search starts somewhere, say here. These are our optima, but there are many gaps and traps in this solution space, so it's all about where you start: if you're lucky, your starting point — the initialized weights — will be near an optimum. That is the coincidence, and to avoid relying on it we would have to run the whole thing over and over, a thousand times; but I won't pursue that now.
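The "run it again from different starting points" strategy can be made concrete on a toy non-convex loss. Everything here is my own invented example — a one-dimensional surface with a good basin and a bad basin — showing that gradient descent lands wherever its random-looking start puts it, and restarting lets you keep the best:

```python
def f(w):                       # a toy non-convex "solution space"
    return (w * w - 1) ** 2 + 0.3 * w

def grad(w):                    # derivative of f
    return 4 * w * (w * w - 1) + 0.3

def descend(w, lr=0.01, steps=500):
    """Plain gradient descent from a given starting weight."""
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# "Run it all over again": several starting points instead of one
starts = [-2.0, -1.0, 0.0, 1.0, 2.0]
finals = [descend(w0) for w0 in starts]
best_w = min(finals, key=f)     # keep the run that found the lowest loss
best_loss = f(best_w)
```

Starts near +1 get trapped in the shallower right-hand basin, while the restart strategy finds the deeper left-hand one — the same luck-of-initialization effect seen in the repeated demo runs.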

Okay, let's proceed to the backpropagation algorithm. As you may remember, researchers introduced a new, more intelligent training function together with a much softer, nonlinear activation function. Let's see what happens when we change them. When I change hardlim to, say, the tangent sigmoid (tansig) and run it, you'll instantly see how the shape changes, and maybe a little more improvement — but I'm not promising anything; I believe it will hit the epoch limit again with no performance improvement... As you can see, when we change the activation function — okay, let's check whether it succeeded: the actual outputs and the targets should be the same. It didn't succeed this time, thank goodness: it should have produced zero instead of 1.5, and the same for this one, so it failed; it actually should have failed more under the previous setup, but let's see. Still, there is an improvement you can see: there are no hard edges anymore, because we changed our mapping — the activation function is essentially a mapping, and now it's a nonlinear one. We no longer have hard edges; instead we get very soft boundaries that can adapt to real-world, nonlinear problems. The boundary becomes blurred, and that's exactly what we wanted.

Let's proceed and change one more thing to make this a proper multilayer perceptron. Our training function, trainlm, is basically a backpropagation algorithm: it uses the same error, but relies on gradient descent and derivatives, so it's more intelligent. Let me run it — and by the way, don't forget we still have just two neurons. All right, we have a different training window now. This time it ran for only four iterations, because it satisfied the gradient criterion; it reached the minimum gradient. What does that mean? When we express the error as a sum of squares, it looks something like this curve.

In just four iterations it started somewhere around here and jumped with the help of the derivatives, dE/dw for each weight w: it propagates the errors back and then moves along the curve — one, two, three, four. At the fourth iteration it found the optimum, and that optimum satisfies our stopping criterion here. So with a proper nonlinear activation function, the sigmoid, we get this beautiful plot. Let's look at the outputs: precisely right — the error is basically zero; it classifies perfectly. And no matter how many times I run it — thousands of times, in my experience — it will give you this result in a very short time, because the network is now really capable and intelligent.
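The dE/dw quantity doing the work above is just the chain rule. Here is a Python sketch for a single weight of a 2-2-1 sigmoid network (the example input, targets, and weight values are mine); the backpropagated derivative is verified against a finite-difference estimate, and a small step along it is shown to reduce the error:

```python
import math

def sig(z):
    return 1.0 / (1.0 + math.exp(-z))

# one training example of the XOR task, 2-2-1 network, arbitrary weights
x1, x2, t = 1.0, 0.0, 1.0
wh1, wh2, bh1 = 0.5, -0.4, 0.1      # hidden unit 1
vh1, vh2, bh2 = -0.3, 0.6, -0.1     # hidden unit 2
u1, u2, bo = 0.2, -0.5, 0.3         # output unit

def error(u1_):
    """Sum-of-squares error as a function of the single weight u1."""
    h1 = sig(wh1 * x1 + wh2 * x2 + bh1)
    h2 = sig(vh1 * x1 + vh2 * x2 + bh2)
    o = sig(u1_ * h1 + u2 * h2 + bo)
    return 0.5 * (o - t) ** 2

# backpropagation: chain rule gives dE/du1 = (o - t) * o * (1 - o) * h1
h1 = sig(wh1 * x1 + wh2 * x2 + bh1)
h2 = sig(vh1 * x1 + vh2 * x2 + bh2)
o = sig(u1 * h1 + u2 * h2 + bo)
analytic = (o - t) * o * (1 - o) * h1

# finite-difference check of the same derivative
eps = 1e-6
numeric = (error(u1 + eps) - error(u1 - eps)) / (2 * eps)
```

Because the two estimates agree, stepping `u1 -= lr * analytic` descends the error curve — the "jump" per iteration described above; trainlm accelerates this idea with second-order (Levenberg-Marquardt) information.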

Let me try another thing: what happens if I keep the previous perceptron setup but increase the number of neurons? I'm so curious right now. By the way, can everyone hear me? Okay, we're good, I guess. Now it's ten neurons; we still have a hidden layer, but we're still in the perceptron setup — the perceptron properties, I mean. Looking at the result: it did a decent job, but we're still in the hard-threshold zone. As you can see it can only draw lines — multiple lines — but there's no nonlinear mapping, no nonlinear relationship it can learn. It's still a linear form, and that's the main disadvantage of the perceptron: it cannot go beyond that.

So let's do it one more time: what happens when I use a proper neural network — I mean the feedforward neural network with the properly updated parameters — and ten neurons? Let's see where we end up... yes, as I expected, in three iterations. Let me switch windows again. In our solution space it now carves out much better regions, because it's a more advanced form and can create very good decision boundaries that classify successfully. There are only four points here, but when we increase the number of observations, this particular network — the advanced form with multiple neurons — will perform better every time. So now we have a successful neural network. Let's proceed.

Sorry that I didn't have time to translate the Turkish notes. Okay, this is a similar type of problem... good. These are our outputs, and as you can see, it failed. We still have two classes, but it failed, and it's all about choosing the parameters, so let's see what we used. As I said, when we increase the number of observations, the coordinate system becomes much more challenging for a given neural network to train on. What did we use? Just two neurons with the default parameters — you don't see any activation-function or training-function parameter here because this is the default configuration, which uses the sigmoid function and the backpropagation algorithm. I don't want to mess with those; I just want to increase the neuron number, and let's see what happens.

All right: as you can see, just by adding more neurons it's now capable of doing the classification successfully even when it's challenging — this time it classifies precisely. One more trial: I'll increase it to 20. As you can see it's messy now; it's starting to look like single-cell data, though we're still far from that. But even though it's really challenging, it somehow managed to classify all the points successfully, because there's no region that traps points inside in a cluster-like manner. It doesn't have to act like a clustering algorithm — trying to squeeze the data or draw a ring around it. With the activation function it's free to create whatever region accomplishes the task; there are no constraints on the shape.

So just by increasing the neuron number, we did the job. These, then, are the hyperparameters: the number of neurons, the choice of activation function, the choice of learning algorithm. There are tons of choices, but with time you gain the experience to understand what needs to be done — I mean, to find the proper selection for the parameter design. Let me proceed; sorry, I have to clean up this mess.

Now I want to talk a little about my special function. It's a really primitive, specialized function that I used and published in an article back then — like a little toolbox, like Dr. Cai's, but in a really basic form. It's just a nested, specialized function, and it basically lets you run an experiment design. In neural networks, as I said, you have to try many, many combinations of hyperparameters to reach a trained network with the minimum achievable error, and that's all it does with its default parameters. It's built particularly for time-series forecasting, by the way. The input number means the lags of a vector: in time-series analysis and forecasting we use the past values of a vector of anything. Let me give an example — financial data, like a daily exchange rate or the stock-market value of a company.

When you have that data, it looks like a series of observations along a time axis — something changing over time. To forecast the values after a given point, we use the past values, which we call lags. One lag means my next prediction is based only on the last day; lag 10 means I grab the last 10 days of observations to predict the next day. That is called one-step-ahead forecasting. This is basically what time-series forecasting is: you use the past values because, theoretically, the vector has a property we call autocorrelation — it is correlated with itself over time — and we don't have to expose or unveil this relationship to the neural network ourselves.
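The lag idea above is just a reshaping of the series into input/output pairs. Using the same 1..100 toy series as the demo, a Python sketch of the lag-10 construction (variable names are mine):

```python
series = list(range(1, 101))   # the 1..100 toy series from the demo
lags = 10                      # "lag 10": use the last 10 values

X, y = [], []
for i in range(lags, len(series)):
    X.append(series[i - lags:i])   # the 10 past values (the lags)
    y.append(series[i])            # the next value to predict

# X[0] is [1..10] and its target y[0] is 11; 90 training pairs in total.
```

Each row of X is one sliding window over the past; the network learns the mapping from window to next value, which is exactly the one-step-ahead setup.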

The network is a kind of black box: you just feed it the data and it learns the behavior — how the series relates to its past values. That becomes the network's experience of the data itself, so it can produce predictions, what we call forecasts, because they are predictions of the future — of how things will end up. So let me run a simple code example... yes, here's the sample code I edited. Let me plot it — sorry about that — it's basically a single line: 1, 2, 3, and so on up to 100. We use that data from 1 to 100 and ask the neural network for a 10- or 20-step-ahead forecast. So what do you think: when we train a network on a vector running from 1 to 100 and ask for, say, a 10-step-ahead forecast, what should we expect at the end? Naturally, the first forecast should be 101, right?

That's if you're lucky, because maintaining a forecast over a long horizon is hard: one or two steps ahead is okay, but imagine using the past 100 days of data of anything and trying to predict 10 more days ahead. As we extend the forecasting range, it becomes much more challenging for the network to maintain that power. I'm talking too much — let's run it. Yes, it's busy, it's running. I set the parameter that shows the training window to zero, to speed things up.

All right, sorry. Okay, it's started; it will run exactly 100 times, because I set the default parameters for the input layer and the hidden layer to 1 through 10. What I mean is: the number of neurons in the input layer runs from 1 to 10, and the number of neurons in the hidden layer also runs from 1 to 10. That's the basic design — you don't have to stick to it. You can increase it to, say, 1 to 20, or 10 to 20; it's all up to your data and your experiment, and the same for the hidden layer. Sometimes you already know at the start that 1 to 10 hidden neurons won't do anything because your data is so challenging, so in the function's parameters you can simply set the minimum hidden number to 10 and search from 10 to 20, or you can add a second hidden layer.

That starts to resemble what we'd call a deep network, but it simply isn't one: deep networks are a much more advanced and complex form, and just adding a second hidden layer doesn't make a deep network. The same idea applies here. At the output layer, sorry, we have only one neuron, because we use all this past data to predict a single value. This kind of design should remind you of a specific type of analysis: it's basically regression — we use multiple inputs to predict a single output. At each step of training it produces one prediction, but it can do so many times, so I don't mean it's only capable of a single prediction; that's just the design.
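The 100-run search above is simply the Cartesian product of the two ranges. A Python sketch of the enumeration order the demo describes — input lags in the outer loop, hidden neurons in the inner loop, one output neuron (names are mine):

```python
# Enumerate the 100 architectures the search runs: input lags 1..10
# (outer loop) by hidden neurons 1..10 (inner loop), one output neuron.
architectures = [(lags, hidden, 1)
                 for lags in range(1, 11)
                 for hidden in range(1, 11)]

# In a real search, each tuple would configure and train one network,
# and you would keep the architecture with the best validation error.
```

So the search begins at 1-1-1, walks through 1-10-1, shifts to 2-1-1, and ends at 10-10-1 — 100 trained networks in total.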

It's about to finish. Any questions so far? I just keep talking and talking. I've also got some fun stuff for afterwards. Yes — it started with architecture 1-1-1: one neuron in the input layer, one in the hidden layer, one in the output layer, and goes on — 1-2-1, 1-3-1, up to 1-10-1. Then it shifted to two in the input, meaning it started using two lags: 2-1-1, 2-2-1, all the way down to 2-10-1, and so on until 10 here and 10 here. Since my hyperparameter for the input-layer range is 10, the last architecture to train is 10-10-1, so 100 neural networks are trained in total — and it's about to finish. I just wanted to show you what training a neural network with a proper parameter design involves, and by parameters I mean the hyperparameters: neuron numbers, activation-function selection, learning-parameter selection. These are all choices you have to assign, and you build up the experience over time to choose the proper ones. I could set up the proper design in just one architecture, but in order to show how this algorithm and this function work, I had to run the whole search for you. It's not this laborious every time; meanwhile you develop the intuition to somehow find the right parameter design to try. Okay, it's finished — let me stop talking.

This algorithm is also capable of using different criteria — what we call performance criteria — such as RMSE.

RMSE, root mean square error, is well known and widely used; it's just one way to measure the error. We measure our error by RMSE, find that our best error is this value, and that this architecture gives us that particular performance. But with a more complex data set and a different performance measure, the result will change. What I mean is that each score handles, or calculates, the error in its own manner: mean absolute error is a different kind of performance measure, and it can choose a different architecture. So in the end, choosing a proper performance criterion is itself a hyperparameter, because it will end up giving you a different architecture for making your future predictions on new data.
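The point that RMSE and MAE can crown different architectures is easy to show with two invented error lists (my example): one model that is usually perfect but makes one large miss, and one that is uniformly a little off.

```python
import math

def rmse(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def mae(errors):
    return sum(abs(e) for e in errors) / len(errors)

# model A: mostly perfect but one large miss; model B: uniformly off
errs_a = [0.0, 0.0, 3.0]
errs_b = [1.2, 1.2, 1.2]

# RMSE squares the errors, so it punishes A's single large miss;
# MAE treats all misses linearly, so it prefers A.
```

Here RMSE ranks B best while MAE ranks A best — so the criterion you optimize genuinely changes which architecture the search selects.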

So it finished: the 10-1-1 architecture gives the best result — just a little visualization for you. I prepared a more complex example too, but it appears we don't have time; sorry. What we're seeing is that the training part of the data is blue and the red part is the forecast, and it's exactly what we wanted: it learned the behavior of the data. Even though this example is so simple, it can learn much more complex patterns — I had planned to show you, but later on, because the training time would be too long. And these are all the results you get after everything finishes — the array — and when you choose the proper line, here is our forecast: it forecasts the 10 steps ahead precisely, very beautifully; the one-step-ahead is 101, then 102, 103, 104.

About how that works: when it forecasts the first step ahead, the input window shifts to that point. For the first forecast it uses the last 10 observations of the original data; once that forecast is produced, the window shifts, so it starts using its own forecasts to predict further future steps. By the time it gets here, it's using nine of its own forecasts and only one last observation of the original data to predict the next point. So when you set the step-ahead parameter to 20, 30, or 40, it becomes very challenging for exactly this reason: the network has to forecast very precisely to maintain that performance over time. It's a really difficult task, but it can be done, as you can see. I believe that if I set this to 50 instead of 10 it would still do the job beautifully, because it learned so well — and the data is so simple; that's part of it.

Any questions? All right, I want to shift quickly to convolutional neural networks — I'm so bad at time management, I'm sorry for that; I could talk for hours and hours. Let me check the error. Convolutional neural networks, as you remember, are special neural networks used for image recognition. This time, instead of numbers, we have pictures. This set of images is very well known — famous, in fact: it's MNIST, and you can find it anywhere on the web. It consists of the digits zero through nine — so no two-digit numbers, though you could combine them. The data set consists of 1,000 images of each digit, in different positions and different sizes.

MNIST is, for image recognition, what the XOR problem was for the perceptron: when you come up with an algorithm, you have to prove it on this data set, and successfully validating your method on it means it's ready to publish, or ready for other people to use. Okay, this is our training process. As we can see, it runs perfectly and fast. This is the loss function — as you can see it decreases over time — and this is our gradient, which should end up close to zero, as it did. But you may notice some jumps over there; it isn't going smoothly, and that points to a crucial part of training a network.

What we see there is a kind of regularization. Regularization deliberately makes it harder for the network to learn too easily: it creates challenges in the training process through strategies and tricks, like cutting off neurons during training. In a convolutional network we have tons of neurons across many layers, on and on — so while it's learning, imagine the regularization says: let's randomly cut this neuron, and this one too, and see if training can still proceed. It becomes a challenge that forces the network to learn properly, in a form that generalizes. And here are the accuracy and the loss function: as you can see it reaches 99%, very fast — only 48 seconds.
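The "randomly cut this neuron and see if training can still proceed" trick described above is dropout. A minimal Python sketch of the standard inverted-dropout form (function and variable names are mine; real frameworks implement this inside their layers):

```python
import random

def dropout(activations, p, rng):
    """Inverted dropout: zero each activation with probability p and
    rescale the survivors by 1/(1-p) so the expected sum is unchanged."""
    if p == 0:
        return list(activations)
    return [0.0 if rng.random() < p else a / (1 - p) for a in activations]

acts = [0.2, 0.9, 0.4, 0.7]
kept_all = dropout(acts, 0.0, random.Random(0))   # p=0: nothing dropped
dropped = dropout(acts, 0.5, random.Random(0))    # roughly half zeroed
```

Because different neurons vanish on every pass, the network cannot lean on any single one — which is also why the training-loss curve shows those non-smooth jumps.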

Seconds uh so now it can. learn I mean it can predict any uh. number given from the M data. set so this particular one yes I've. chosen. four let's say it gives the uh true or. not okay as you can. see it's because it's starts from zero. this is the zero class so it's uh this. these are the pro probabilities by the. way so uh the. prediction uh belong to uh zero class. with the probability of nearly zero so. that means it's for is for 99% like not. perc 99 of probability you know uh it. just classified as a four so I just fit. the network with the image of four and. it gives me okay this is. four let show you again. yes after training I just fit the. network I just give the example okay let. me classify this one so it classified at. the fifth class which is for four number. four so this is basically. it uh can I be late like five minutes.
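The "randomly cutting neurons" trick described above is dropout. Here is a minimal pure-Python sketch of inverted dropout (the layer values and drop probability are made up for illustration; real frameworks apply this per mini-batch on tensors):

```python
import random

def dropout(activations, p=0.5, rng=None):
    """Randomly zero each activation with probability p (training-time dropout).
    Surviving activations are scaled by 1/(1-p) ("inverted dropout") so the
    expected layer output stays the same."""
    rng = rng or random.Random(0)
    return [0.0 if rng.random() < p else a / (1 - p) for a in activations]

layer = [0.2, 0.9, 0.5, 0.7, 0.1, 0.4]   # hypothetical activations of one layer
dropped = dropout(layer, p=0.5)
# Roughly half the units are silenced; the survivors are scaled up by 2x,
# which is what forces the network to keep learning without relying on
# any single neuron.
```

At test time dropout is switched off and all neurons participate.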

Sir, is it okay to extend by four or five minutes? Thank you. So, about these convolutional networks — I am talking fast, sorry — you are probably asking how a network can use images as input. These lines of code read the image and convert its pixels into a 28-by-28 matrix, because the network only accepts data in matrix form. There is nothing complex about it: a simple 2D, very low-resolution image, 28 by 28 pixels, which is exactly the matrix size, and each cell of the matrix holds one pixel value.

Let me show you — let me unlock it. This is the 28-by-28 matrix; can you guess which number it is? This is literally what the convolutional network sees as its input. Let me plot it... it is a four, the same four we just fed to the neural network. So it really is a four, just not the way we see it — this is how it works, in pixel-matrix form. It is that simple. When you have high-resolution data, say 1,000 by 1,000 pixels, the image can be a much more complex form — it can be 3D, with color channels — and you just add more layers after the input to handle it, but the principle is the same.

When I was first learning about convolutional networks I was curious how they could "read" an image, and it turns out to be this simple. The real problem is how you define your input: whether it is text, an image, or anything else, you must convert it into a meaningful data set — a meaningful matrix format. After that you just hand it to the neural network, and it does the job. Everything reduces to matrices. The matrix is the key; matrix multiplication, as Dr. Kai says, is everything. You just have to know how to convert your data into that form and how to use it in this multiply-and-add kind of process. Once you do, you understand very well what is happening in there.

The next thing — not the last, but next — is the LSTM. LSTM networks are used especially for text-related tasks: classification, or generative AI. I do not want the demo to run too long, so I will set the maximum number of epochs to just three. As you can see here, I did the proper run earlier, with the epoch number set to 300, and it took seven to eight minutes to train. It is only 100 neurons, but the LSTM neuron is such a complex form that it creates many challenges for the training process — there are a huge number of computations involved.
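To make the "everything is a matrix" point concrete, here is a tiny Python sketch: a hypothetical 3×3 "image" is flattened into a vector and pushed through one fully connected layer as a plain matrix-vector multiplication (all weights and pixel values are invented for illustration):

```python
def matvec(W, x):
    """Multiply weight matrix W (one row per output unit) by input vector x."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

# A tiny hypothetical 3x3 "image" (grayscale pixel intensities in [0, 1]).
image = [[0.0, 1.0, 0.0],
         [0.0, 1.0, 0.0],
         [0.0, 1.0, 0.0]]
x = [p for row in image for p in row]   # flatten 3x3 -> length-9 vector

# One fully connected layer with 2 output units (weights are made up).
W = [[0.1] * 9,
     [0.0, 0.5, 0.0, 0.0, 0.5, 0.0, 0.0, 0.5, 0.0]]
out = matvec(W, x)   # two output activations, computed by multiply-and-add
```

A real CNN interposes convolution and pooling layers before such a dense layer, but at the bottom it is the same multiply-and-add over a matrix of pixel values.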

With so many calculations and so many epoch runs, the accuracy bounces down and up during training, but in the end the network learns and can produce new text afterward. This is the actual source text, which I just pulled from the web: Alice in Wonderland — a book, essentially a script. You can feed the network any text you want. (Sometimes the demo pulls in many toolboxes and files and gets confused — it ran last time... okay, there it goes.)

After seven to eight minutes of training over 300 epochs, the network learned Alice in Wonderland: it read the whole text, learned everything in it, and then started producing output — I asked it for about 500 characters. Note that the blanks — the spaces — are also predictions here. Everything you see is generated purely by the LSTM; it is not existing, written text. After reading Alice in Wonderland, it produced this. I read the book years and years ago, so maybe there are meaningful things in here — the Mock Turtle doing something with the King and Queen, something like that — but the point is that the network has captured the properties of the sentences and which words tend to follow which. The output makes sense; it is properly formed rather than nonsense. When the predictions make sense, it means the model is creating something successfully. As I told you in my previous presentation, some short films have even been shot from scripts generated this way, based on LSTMs. That is how powerful it is.

My final thing will be a little demonstration — I am running so late, I am sorry for that. With just about ten lines of code you can do something like this: I connect my camera to MATLAB. You can do that too. But first let me close everything. I will connect to AlexNet. AlexNet is one of the first of its kind: back in 2012 it was the first huge convolutional neural network trained on an enormous number of images, and now it just sits out on the web, in the cloud somewhere. It started the direction AI is going today, because it takes so much time and effort to train a large network like this. It began with AlexNet back in 2012, I believe.
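A real LSTM is far more involved, but the core idea of character-level generation — every next character, spaces included, is a prediction conditioned on what came before — can be sketched with a much simpler stand-in, a character bigram model in plain Python (the corpus line is a short paraphrase, not the actual training text):

```python
import random

def train_bigrams(text):
    """Count, for each character, which characters follow it in the text."""
    follows = {}
    for a, b in zip(text, text[1:]):
        follows.setdefault(a, []).append(b)
    return follows

def generate(follows, start, n, seed=0):
    """Sample n characters, one prediction at a time, from the bigram counts.
    Spaces are ordinary characters, so word boundaries are predicted too."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n - 1):
        nxt = follows.get(out[-1])
        if not nxt:          # dead end: the last char never had a successor
            break
        out.append(rng.choice(nxt))
    return "".join(out)

corpus = "alice was beginning to get very tired of sitting by her sister"
model = train_bigrams(corpus)
sample = generate(model, "a", 40)
```

An LSTM replaces the one-character memory here with a learned hidden state spanning the whole context, which is why its output reads like sentences rather than letter soup.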

They then provided the trained network to the public — essentially saying, "We trained this huge network; just use it whenever you like." That is what we are going to do right now: we connect to AlexNet, so our net is now AlexNet, and I run this while loop. Okay — sorry, I am fumbling — there, you see the label; it works nicely. I also brought some items with me. Let me try — there, it caught it: "matchstick," something like that. It is not really a matchstick, and holding objects up to a webcam is definitely not the proper way to feed a neural network. Let me grab something else — I brought a banana. Come on... "green snake," something like that; if it were fully yellow it would catch "banana" easily. As I said, this is not the proper way; it is basically guessing. Here is a toy of my kid's — "toy poodle." That is cool. What else have we got? What about the water bottle — you see that? "Pop bottle"... no — yes, "water bottle," you see. Anything else? Ah, it can definitely do that one: "tennis ball." And finally — yes, "ping-pong ball," though for a moment it saw the orange ball as a goldfish. Did you catch that?

So this is not the proper way to use it, but the network knows thousands and thousands of images, and if you feed it properly it can recognize almost anything you want. It is also upgradeable. [A student asks why the label keeps changing.] Yes — it is because there are so many possibilities: the output is a vector of a thousand classes with a probability mass spread across it, and you are just trying to pick the most probable one out of that huge vector of probabilities.
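Picking the label out of that "huge vector of probabilities" is just a top-1 (argmax) lookup. A sketch in Python, using an invented 5-class slice rather than AlexNet's real 1,000-class output:

```python
def top_label(probs, labels):
    """Return the label with the highest probability (the top-1 prediction)."""
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

# Hypothetical scores for a 5-class slice of an ImageNet-style output vector.
labels = ["matchstick", "banana", "water bottle", "tennis ball", "ping-pong ball"]
probs  = [0.02, 0.91, 0.03, 0.01, 0.03]
label, p = top_label(probs, labels)   # -> the single most probable class
```

With a live camera feed the probabilities shift frame to frame, which is why the displayed label flickers between similar classes.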

Building on that, we can do labeling — like the feature in your phone that detects your face. It can be made more advanced: if you write a few more lines of code to capture a particular object and stabilize the input as an image, it becomes more precise. I did not have time to extend the code that far, and I have also run long, so I am sorry for that. That is basically all — a little example of what neural networks can do. There is so much going on inside them that it is better to understand them and learn how to use them rather than to create them, because nowadays they have become so complex, so advanced in form, that fully understanding them is a really hard task even for an expert. So I recommend learning prompting — how to use the AI — instead of building it. But that is basically it: these neural networks are what became "AI," so every time you see that word, remember there are neurons connecting to each other to create it. That is all. I am sorry for taking so long, but I wanted to share so much. Thank you.

Okay, now we are moving on to the next topic: a built-in app in MATLAB. First we will load the example data — you may want to copy these three lines of code. We are just preparing the input data to give to the classifier. This pre-provided data file contains several variables under different names: diastolic and systolic blood pressure, height, age, and weight. We take those five example variables and combine them into one matrix, X, and then we want to predict the gender. In other words, we will use the information collected in the big X matrix to predict the label stored in y.

If you look at y, it is simply a list of 100 labels, male or female. X holds the corresponding 100 samples, with five columns of variables: age, diastolic, height, systolic, and weight. We are going to use this information to predict each label. This is our training data: we use it to train the classifier so that it is ready to predict in the future — you give it five numbers and it gives you a label, male or female. Is that clear? Okay — so this is a very typical classification problem.
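The data preparation described above — five measurement columns combined row-wise into X, with y holding the labels — can be sketched in Python (the numbers are invented; the real patients data has 100 rows):

```python
# Hypothetical measurements for 4 of the 100 patients (values are made up).
age       = [38, 43, 38, 40]
diastolic = [93, 77, 83, 75]
height    = [71, 69, 64, 67]
systolic  = [124, 109, 125, 117]
weight    = [176, 163, 131, 133]

# Combine the five variables column-wise into X: one row per sample,
# one column per variable.
X = [list(row) for row in zip(age, diastolic, height, systolic, weight)]

# The response variable: one categorical label per sample.
y = ["Male", "Male", "Female", "Female"]
```

Whatever tool you use, this shape — samples as rows of X, labels in y — is what a classifier expects.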

We are not going to start from scratch — you could, if you know the command-line functions — but here, click the Apps tab, open the drop-down menu, and under Machine Learning click right here; can you see it? Clicking it brings up this interface. There are other apps too, such as the Regression Learner and the Neural Network Designer, but we are going to use this very classic one, the Classification Learner.

Once the app has started, click New Session and choose to import your training data from the workspace, because we are ready — we prepared X and y. Here you select where the data set variable comes from: we select X. Then you select the response variable from the workspace, and we want to predict y. Is that clear? We prepared a table, X, containing 100 samples with five variables, and gave it to the program as the data set; the response variable we want to predict is y, which is just the label, male or female.

Now we are ready to do the training. I want cross-validation, five-fold; you can increase or decrease this number. Then you can start the session, because we are ready. In this tab there are some summary statistics, and there is a scatter plot that depends on which two variables you choose — here, height and weight — with the points colored by the male and female labels, so you can see the two groups indicated differently. Then you can start to select your favorite classification method: it is here, in the Models gallery.
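The five-fold cross-validation chosen above works by splitting the samples into five folds and holding each one out in turn. A minimal Python sketch of the index bookkeeping (not the app's actual implementation):

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k folds. Each fold is held out once as the
    validation set while the remaining folds are used for training."""
    folds = [list(range(i, n, k)) for i in range(k)]
    splits = []
    for i in range(k):
        valid = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((train, valid))
    return splits

# 100 patients, five-fold: each split trains on 80 samples, validates on 20.
splits = kfold_indices(100, 5)
```

The reported accuracy is the average over the five held-out folds, which gives a fairer estimate than testing on the training data itself.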

Depending on which model — which algorithm — you want to use, you can select naive Bayes, linear discriminant analysis (LDA), linear SVM, or KNN in its different flavors; in KNN, the K is simply the number of nearest neighbors the algorithm considers. For a small data set you can run them all. Here we will select just what we discussed in the lecture: this is the decision tree, which I will delete; I will keep linear discriminant analysis, keep the Gaussian naive Bayes, and select fine KNN. Those are the algorithms we mentioned in Tuesday's lecture, so I keep three of them, and after selecting them I run the model training. For a small project like this it is very easy to build the models, train them, change their internal parameters, and then look at the performance — the accuracy — using cross-validation.

Linear discriminant analysis gives 100% accuracy: all 53 males have been classified correctly into the male group, and the 47 females likewise, with no errors, so this confusion matrix looks perfect. But if you look at the KNN model, two samples have been misclassified: their true class is male, but the algorithm called two of them female — those are the two counts you see here, and there is nothing wrong with the rest of the matrix. That is the way you can try different models and different algorithms. Once you finish, you can export the model: the trained algorithm, with its parameters, is saved to the workspace and becomes a function. That function takes five numbers and gives you a label, so you can use it on new input data to make predictions. That is all.
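The confusion-matrix read-out above — e.g. two true males predicted as female by KNN — is just a table of counts. A small Python sketch with invented labels:

```python
def confusion(y_true, y_pred, classes):
    """counts[i][j] = number of samples of true class i predicted as class j.
    A perfect classifier has all counts on the diagonal."""
    idx = {c: i for i, c in enumerate(classes)}
    counts = [[0] * len(classes) for _ in classes]
    for t, p in zip(y_true, y_pred):
        counts[idx[t]][idx[p]] += 1
    return counts

# Hypothetical run of 9 samples: two true males are predicted as female,
# mirroring the kind of error seen in the KNN model.
y_true = ["M"] * 5 + ["F"] * 4
y_pred = ["M", "M", "M", "F", "F", "F", "F", "F", "F"]
cm = confusion(y_true, y_pred, ["M", "F"])
```

Row "M" of `cm` reads [3, 2]: three males classified correctly, two misclassified as female.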

All right — so you just need to know how to organize your input matrix: which variables you use to predict which response variable, that is all. The response variable should be categorical — 0/1, or maybe 1, 2, 3; in this case the label is male or female. Because this is a classification problem, we use categorical response variables. Okay, any questions? Is that okay? I hope you can find applications for this in your future studies: once you have an Excel spreadsheet with a lot of samples, and you want to learn the relationship between your columns and the labels of the samples, you can do it with this app.

Okay, we are going to move to the next topic: using the toolbox. As you know, before, we tried importing two accession numbers from the GEO database — GSM1234, GSM3456 — and the two data sets merge automatically when you import them that way. There are many other ways to do this, but once you have the data, you can save it to the workspace. For example, I will create an empty sce — you know you can create an SCE (single-cell experiment) object — so I have created one; this single-cell experiment is empty, and I just use it as a stand-in for the two of your own. So, for example, in the workspace you have two such variables. Then, in the toolbox, you can merge these two variables and make them one — concatenate them. (I will still load the example data, just to let me move on to the next step.) Because you know how to prepare these two variables — if you know how to process an individual data set — you can save both of them in the workspace.
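The merge step can be pictured with a toy stand-in for SCE objects: each data set as a map from gene name to per-cell counts, merged on the shared genes, with a batch label kept per cell. This is a simplified analogue for illustration, not the toolbox's actual merge logic, and the gene names and counts are invented:

```python
def merge_sces(sc1, sc2):
    """Merge two toy 'single-cell' data sets (gene -> per-cell counts) on
    their shared genes, concatenating cells and recording which batch
    (e.g. wild type vs. knockout) each cell came from."""
    genes = sorted(set(sc1) & set(sc2))
    n1 = len(next(iter(sc1.values())))
    n2 = len(next(iter(sc2.values())))
    merged = {g: sc1[g] + sc2[g] for g in genes}
    batch = ["batch1"] * n1 + ["batch2"] * n2
    return merged, batch

sc1 = {"Actb": [5, 3], "Gapdh": [2, 4], "Ins1": [0, 7]}   # 2 cells
sc2 = {"Actb": [1, 0, 6], "Gapdh": [3, 3, 2]}             # 3 cells
merged, batch = merge_sces(sc1, sc2)
```

Keeping the batch vector alongside the merged matrix is what later lets you compare the two conditions cell by cell.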

Now, once you are ready to merge, you select Edit, then Merge SCE Variables; you select the two variables in the workspace, and the data will be merged. That is what I am doing here. These two empty example data sets make no sense on their own, but they show you that when you have the two variables in the workspace, this is how you merge them. (Since they are empty, it will not actually produce anything.)

Let us get back here and load the example raw data without annotation. This data set has been prepared for you: it is merged data containing two samples. If you look at the batch ID you can see there are cells from 8 weeks and from 14 weeks — two batches. As I said, we normally just run the embedding — a 3D embedding using 2,000 highly variable genes. I run this embedding on all the cells together, not independently for batch one and batch two. If the batches separate into different clusters, that is fine — they come from two different conditions, so they may be separable — but some cell types will still fall into the same cluster, and that is fine too.

Remember, the next step is clustering — k-means — using the embedding positions of the cells instead of the whole matrix. We only embedded the cells using tSNE in 3D, so there is only one option here; if you had also tried UMAP or other methods, you would have multiple selections. So we select the one embedding variable we have, and then k-means. This cluster number is predefined, and you can change it: make the number bigger if you want more, smaller clusters, or smaller to get big, coarse clusters. The predefined value depends on the number of cells — on average we expect about 100 cells in each small cluster, and we use those ~100-cell clusters to determine the cell type.

Okay — let us try it. I am going to use a smaller number of clusters: I will recompute and partition the cells into 15 clusters, because I want to do the manual selection, and I do not want to answer 80 separate prompts. If you use 80 clusters and allow the automatic annotation to run, that works quite well, but now I want to show you the manual annotation for each individual cluster, which means running the annotation 15 times. So I group the cells into 15 groups and select Assign Cell Type to Groups. This dialog asks you about the algorithm, the tissue, the data source, and the reference.
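The predefined cluster number described above ("on average about 100 cells per cluster") suggests a simple heuristic like the following — an assumption for illustration, not the toolbox's documented formula:

```python
import math

def default_k(n_cells, target_per_cluster=100):
    """Pick a k-means cluster count so each cluster averages roughly
    target_per_cluster cells. A smaller k gives bigger, coarser clusters;
    a larger k gives smaller, finer ones."""
    return max(1, math.ceil(n_cells / target_per_cluster))

k_auto   = default_k(1500)        # ~15 clusters for 1,500 cells
k_coarse = default_k(1500, 300)   # aim for bigger clusters instead
```

Whatever the exact rule, the trade-off is the one described in the lecture: finer clusters mean more annotation prompts, coarser clusters risk mixing cell types.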

The dialog shows the reference ID, and since this is mouse data, I choose Mouse. Now — you remember how to do the automatic annotation, but this time I will show you the manual way. If you answer no, you will be asked about each cluster in turn: I have 15 clusters, so I will be asked 15 times what I believe each cluster should be.

Okay, this is cluster one — the first popup window. The top option, determined by the algorithm, is endocrine cells; but if you believe the cluster is really delta cells, you can select the second option, or if you believe they are interneurons, you can pick that, and so on. This step lets you bring in your own understanding of your cell system rather than automatically accepting the algorithm's best-scoring option. So if my hypothesis is hepatocytes, I can select that, and the first cluster will be annotated as hepatocytes. Then comes the next popup — remember, I have 15 different clusters — and this is my second one: if I believe these are monocytes, or say fibroblasts, I select accordingly. (A student asks whether the clusters can be labeled with numbers — yes, you can; for now the display just cycles through different colors.) So in my case I have to decide 15 times, because I have 15 different clusters.

At the end, the algorithm will ask whether you want to merge sub-clusters: if clusters 9 and 11 have both been classified as fibroblasts, it asks whether you want to merge those two clusters into one. If you decline, the algorithm keeps all the sub-clusters separate. So this gives you some flexibility to decide the cell types based on your understanding of the cell types in the tissue sample you worked with. Is that okay? Okay, good.

Now we can label the cells by class ID instead — forget about the annotation for a moment; I just put the class ID in, so the labels run 1, 2, 3, 4, and so on. Next I want to show you what to do if you have your own marker gene database and do not want to use the built-in one. You select Edit — let me see where it is; it should be under Annotate.

Okay. What we have just done was annotate based on the marker gene database provided by PanglaoDB, but now you have a chance to supply your own marker gene database as input, so you select this option. I prepared a marker gene list from another database, called ScType; if you do not want to build your own list, you can select that one here. For now I will say no, because I want to show you a completely customized cell type database.

This is the format you have to input for your customized database. You need a cell type name — in this example I just put "cell type 1" — and then you need a tab character; it is invisible here, but it has to be there. After the tab, you put your list of marker genes — the genes you believe mark that cell type — separated by commas. In this example I also have "cell type 2", then a tab, then some random gene names. This example does not make biological sense, because the gene list was randomly generated, but it shows the idea: using a customized marker list you can start to predict. So you say okay, and if you do it manually the algorithm will still run, but the output can only ever be "cell type 1" or "cell type 2" — that is based on your definition, and only those markers are used to tell it anything. The algorithm then decides, say, that these cells look more like cell type 2, so some clusters are labeled cell type 1, some cell type 2, and one is completely Unknown — not a single marker gene from my list is expressed in that cluster of cells, which is no surprise given that I used a very small, random gene list. Then comes the same merging question as before, and your customized cell type annotation appears here.

Next, I provide a more realistic example of a customized database. You can click this, and I hope you can select it all, copy it, and paste it into that small dialog to replace the hypothetical annotation. So if I say no to the built-in option, I get the window, cut in the long list I just prepared, and let it run automatically. Let us do that: first I need to label the cells with the cluster ID, and then I can put in the customized database and let the algorithm decide the cell type for each cluster.

Okay — any questions? So you need to know how to prepare this — the cell types and their genes in the correct format — and paste it into that dialog window; then you have this customized annotation of your cell types. You can name the cell types whatever you like; the algorithm does not care, but you have to care yourself. Okay — what is left on our agenda? Let me close this. The next topic is installing the Python environment.
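The custom database format described above — cell type name, a tab, then comma-separated marker genes — can be parsed and applied with a few lines of Python. The scoring here (count of overlapping markers, "Unknown" when none are expressed) is a simplified stand-in for the toolbox's actual algorithm, and the gene names are only examples:

```python
def parse_marker_db(text):
    """Parse 'CellType<TAB>gene1,gene2,...' lines into {cell_type: set(genes)}."""
    db = {}
    for line in text.strip().splitlines():
        cell_type, genes = line.split("\t")
        db[cell_type] = {g.strip() for g in genes.split(",")}
    return db

def annotate(expressed_genes, db):
    """Label a cluster with the cell type sharing the most markers,
    or 'Unknown' when no marker gene is expressed at all."""
    best, hits = "Unknown", 0
    for cell_type, markers in db.items():
        n = len(markers & expressed_genes)
        if n > hits:
            best, hits = cell_type, n
    return best

db_text = "Cell type 1\tAlb,Ttr,Apoa1\nCell type 2\tCd68,Lyz2,Csf1r"
db = parse_marker_db(db_text)
label_a = annotate({"Alb", "Ttr", "Xyz"}, db)   # two cell-type-1 markers hit
label_b = annotate({"Foo", "Bar"}, db)          # no markers expressed
```

The "Unknown" branch is exactly the behavior seen in the demo when a random marker list matched nothing in a cluster.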

In this menu I have incorporated several Python-based and also R-based programs. For example, pseudotime analysis using Monocle 3 is a very common approach when people try to build a pseudotime, so I would rather call that package from here — but first we have to install Python. Say I want to use datamapplot: there is a "py" label indicating that this function requires Python. If I click it right now — well, this function also requires a 2D embedding, so I have to prepare a 2D embedding for the algorithm first and then call it. Let us just do the 2D embedding; that is easy. I use tSNE, select 2D, and the embedding is prepared for the algorithm.

Okay, so this is my 2D embedding. Now I try the function, and it asks me to select a working folder for its temporary output. We could use a temporary folder; I will just select Documents — that is fine. And now we run into the problem, because I have no Python installed here. I have to check Set Up Python Environment. The program tries to find python.exe — a suitable program for the Python environment — and I have to tell it where that is, but because I have not installed Python, there is no way to find it. So we are not ready, and the environment is not set. How do we fix that? Today we will just install Python, get everything ready, and then we are done for this week, okay?

The distribution I want to install is Miniconda. Miniconda provides the Python environment for you. We just found the installation page — Installing Miniconda — and this is for Windows: you download the installer. Basically it is like installing any other package: you download the installation file, run it, click through, and install it into your local account; that is all. I choose "Just Me" because I want a local conda install. Remember where you install it, because you will want to find it later: it goes under your user folder, in AppData\Local, in the miniconda3 directory. That is all — and once it is finished I can provide this link to the channel for the setup.

After the installation I come back here and select Set Up Python Environment, and then we search for that python command. It should be under my user folder — but it looks like it is hidden. I am having trouble finding it because the folder is invisible, so we have to locate it somehow. I switch to the folder, go into the path here... something is going wrong, and I will come back next week and see what happened, okay? But basically, you need to give the instruction that lets the system know where this python.exe file is located.

All right — any questions? Okay, so I cannot set up the Python environment on this machine today, but I will let you know what happened next week. Okay, all right — see you next Tuesday.
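Since the hidden Miniconda folder was the sticking point, here is a small Python sketch of locating python.exe by walking a directory tree; the demo builds a throwaway folder that mimics the AppData\Local\miniconda3 layout (the path names are just the conventional defaults):

```python
import os
import tempfile

def find_python(root):
    """Walk a directory tree looking for the Python interpreter executable —
    e.g. under ...\\AppData\\Local\\miniconda3 after a 'Just Me' Miniconda
    install. Hidden folders are still visited by os.walk."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower() in ("python.exe", "python"):
                return os.path.join(dirpath, name)
    return None

# Demo on a throwaway directory that mimics the Miniconda layout.
root = tempfile.mkdtemp()
conda_dir = os.path.join(root, "miniconda3")
os.makedirs(conda_dir)
fake_exe = os.path.join(conda_dir, "python.exe")
open(fake_exe, "w").close()

found = find_python(root)
```

The same idea works interactively: point the search at your user profile folder and it will turn up the interpreter even though Explorer hides AppData by default.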
