ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
xiaohong has joined #mlpack
xiaohong has quit [Ping timeout: 260 seconds]
xiaohong has joined #mlpack
Yashwants19 has joined #mlpack
< Yashwants19>
Hi zoq: Can we take multiple input models (DecisionTree<>, SoftmaxRegression, etc.) from a single PARAM_MODEL_IN()?
< Yashwants19>
In CLI bindings.
Yashwants19 has quit [Remote host closed the connection]
xiaohong has quit [Ping timeout: 260 seconds]
KimSangYeon-DGU has quit [Remote host closed the connection]
< jeffin143>
Working at a new PC today and I don't know what the issue is
vivekp has joined #mlpack
jeffin143 has quit [Remote host closed the connection]
< lozhnikov>
jeffin143: Looks like you didn't install armadillo.
KimSangYeon-DGU has joined #mlpack
< akhandait>
sreenik[m]: Let's talk at 11 tonight if that's okay with you.
jeffin143 has joined #mlpack
< jeffin143>
lozhnikov: yes, I didn't, but didn't we have a plan that if it's not found, it would automatically download ensmallen/armadillo? Because I am not sure that I installed armadillo on my old PC either! :)
< zoq>
Yashwants19: Are you talking about models that are stored as files? In that case you could use PARAM_VECTOR_IN(string, ...)
< zoq>
jeffin143: ensmallen is downloaded if it's not available; since it's header-only this is easy, but for armadillo there is no such feature.
KimSangYeon-DGU has quit [Remote host closed the connection]
< sreenik[m]>
akhandait: Ok, that will be fine
Yashwants19 has joined #mlpack
< Yashwants19>
Hi zoq: Yes, I'm talking about pre-trained models that are saved in .bin or .xml files, but different model types.
Yashwants19 has quit [Ping timeout: 260 seconds]
< zoq>
Yashwants19: So the input is a string (path/filename); in this case you could use PARAM_VECTOR_IN as mentioned above.
< zoq>
Yashwants19: If you like I could provide a simple example.
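(A rough sketch of the kind of binding zoq is describing, not an official mlpack example: several serialized model files are accepted through one vector-of-strings parameter and loaded one by one. The parameter name "input_model_files", the choice of SoftmaxRegression, and the omission of the usual PROGRAM_INFO/binding boilerplate are purely illustrative; CLI::GetParam, PARAM_VECTOR_IN, and data::Load are used as in the mlpack 3.x bindings of that time.)

    #include <mlpack/core.hpp>
    #include <mlpack/core/util/cli.hpp>
    #include <mlpack/methods/softmax_regression/softmax_regression.hpp>

    using namespace mlpack;

    // One parameter holding any number of model file names (hypothetical name).
    PARAM_VECTOR_IN(std::string, "input_model_files",
        "Files containing pre-trained models to load.", "m");

    static void mlpackMain()
    {
      const std::vector<std::string>& files =
          CLI::GetParam<std::vector<std::string>>("input_model_files");

      for (const std::string& file : files)
      {
        // Each file is deserialized into the appropriate model type; a
        // SoftmaxRegression model is assumed here purely for the sketch.
        regression::SoftmaxRegression model;
        data::Load(file, "softmax_regression_model", model);
      }
    }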
vivekp has quit [Ping timeout: 245 seconds]
< akhandait>
sreenik[m]: You there?
< sreenik[m]>
Yes
< akhandait>
So, are you well now?
< sreenik[m]>
Yes, absolutely
< akhandait>
That's great, did you read the evaluation message?
< sreenik[m]>
Yes, I have read that
< akhandait>
Nice
< sreenik[m]>
Thank you for putting down your thoughts. I will give it some practice for a couple of days this week
< akhandait>
Sure, I think once you get a little comfortable with the structure and testing, it should be a mostly smooth ride
< sreenik[m]>
Yes, I hope so
< akhandait>
So, now, did you make any more commits to your translator repo?
< sreenik[m]>
No, but there has been some work since the last commit. I will commit it right after this chat
< akhandait>
Okay
< akhandait>
In the Examples/ directory, the xml files are empty right now.
< akhandait>
Can you commit some examples with the linear layers you created?
< sreenik[m]>
Yes, that's somewhat the proposed structure
< sreenik[m]>
Yes, I will commit those
< sreenik[m]>
But there is an issue I am facing with the convolution layers
< akhandait>
Okay, tomorrow, I will spend some time on it and give some detailed feedback. If you have made some progress, it will be good if you can commit whatever you can tonight
< akhandait>
What is it?
< sreenik[m]>
ONNX is not ideal. It does awkward conversions. I will upload a Jupyter notebook; you can get the structure of the network from there, and you will also understand what I am talking about
< akhandait>
Hmm, okay
< sreenik[m]>
We can then discuss that after tomorrow night when you have gone through it
< akhandait>
Sure
< akhandait>
About the problem with some onnx layers having extra parameters
< akhandait>
I think as long as we don't remove any existing parameters from the corresponding layers in mlpack and give proper defaults for whatever parameters we add, we should be fine.
< akhandait>
So, I think we can go ahead and add those extra params
< sreenik[m]>
Okay
< akhandait>
So, according to the list in the readme, they are: BatchNorm, MaxPool, Convolution and SELU
< sreenik[m]>
Yes
< akhandait>
Have you checked how much these parameters change the layers mathematically?
< sreenik[m]>
Not really
KimSangYeon-DGU has joined #mlpack
< akhandait>
Can you check the formulae for these 4 layers by tomorrow night?
< akhandait>
both in ONNX and mlpack, and check how much we would have to add?
< sreenik[m]>
Okay I'll try and check them
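(For reference when making that comparison, the standard definitions are worth writing out; if I read the ONNX spec correctly, Selu exposes alpha and gamma, BatchNormalization exposes epsilon and momentum, and for Conv/MaxPool the differences are mostly attributes such as pads, strides, and dilations rather than the formula itself. Roughly:

\[
\operatorname{SELU}(x) = \lambda \begin{cases} x, & x > 0 \\ \alpha\,(e^{x} - 1), & x \le 0 \end{cases}
\qquad \alpha \approx 1.6733,\ \lambda \approx 1.0507,
\]
\[
\operatorname{BatchNorm}(x) = \gamma\,\frac{x - \mu}{\sqrt{\sigma^{2} + \epsilon}} + \beta .
\]

The question for the translator is which of these parameters mlpack's layers already expose and which would need to be added with sensible defaults.)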
< akhandait>
About softmax: that's a very common layer and we would need some conversion from ONNX softmax to mlpack
< akhandait>
Did we decide something about this the last time we discussed softmax?
< sreenik[m]>
Yes, that won't be much of an issue, as we have LogSoftMax. The gradient mainly needs to be changed
< sreenik[m]>
Yes, we decided to probably add a new layer
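(A minimal sketch of the difference, assuming Armadillo and mlpack's one-column-per-sample convention; the function names are made up and this is not mlpack's actual layer interface. The forward pass is just the exponentiated log-softmax; the backward pass is where the two layers differ:)

    #include <armadillo>

    // Forward: y = exp(x - max(x)) / sum(exp(x - max(x))), i.e. exp(logsoftmax(x)).
    void SoftmaxForward(const arma::mat& input, arma::mat& output)
    {
      // Subtract the column-wise maximum for numerical stability, then exponentiate.
      output = input;
      output.each_row() -= arma::max(input, 0);
      output = arma::exp(output);
      // Normalise each column so that it sums to one.
      output.each_row() /= arma::sum(output, 0);
    }

    // Backward for softmax: dL/dx = y % (dL/dy - sum_j(dL/dy_j * y_j)), column-wise,
    // whereas log-softmax uses dL/dx = dL/dy - exp(y) * sum_j(dL/dy_j).
    void SoftmaxBackward(const arma::mat& output, const arma::mat& gy, arma::mat& g)
    {
      g = gy;
      g.each_row() -= arma::sum(gy % output, 0);
      g %= output;
    }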
< akhandait>
Okay
< akhandait>
So, let's meet again tomorrow night and decide on a good plan for the next week. By then, I will have checked and tried the samples in the repo
< sreenik[m]>
Nice idea. We'll meet tomorrow then
< sreenik[m]>
I will keep the repo updated in the meantime
KimSangYeon-DGU has quit [Remote host closed the connection]
ImQ009 has quit [Quit: Leaving]
travis-ci has joined #mlpack
< travis-ci>
georgedouzas/mlpack#2 (master - 5fcbe45 : Shikhar Jaiswal): The build has errored.