ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
dendre has joined #mlpack
< Param-29Gitter[m>
@zoq also take a look at #2169 once you have a chance.
< Param-29Gitter[m>
Also, if there is another program on which I can check my performance, please let me know.
dendre has quit [Quit: Leaving]
< jeffin143[m]>
Zoq: found a weird behaviour on Mac.
< jeffin143[m]>
If you hit make and then install it
< jeffin143[m]>
And then change your branch
< jeffin143[m]>
And hit make mlpack_test, it would throw an error:
< jeffin143[m]>
Precompiled header has changed, recompile it
< PrinceGuptaGitte>
Hi @zoq, you asked me to implement the ISRU function as a layer instead of a class with static functions. I understand that all activation functions that have parameters are implemented as layers so that their parameters can be used. But then, wouldn't they become inaccessible to the BaseLayer class? Because BaseLayer calls its activation function's methods assuming they are static.
< PrinceGuptaGitte>
Doesn't that make using different types of activation functions really inconsistent? Am I missing something that ties it all together?
< kartikdutt18Gitt>
Hi @prince776, I think the activation functions that are implemented as layers aren't present in base_layer. base_layer only includes the activation functions that are present in the activation_functions folder; this is done to serialize them as layers. The issue that @zoq mentioned was that if I declare a layer with parameter alpha = 0.5, then I can't change it again for that layer, i.e. it's not accessible since it is a
< kartikdutt18Gitt>
function parameter. What I think @zoq wanted implemented was: if I declare a layer as layer = ISRU(0.5), then to change alpha I can simply do layer.alpha() = 0.7.
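(For reference, a minimal sketch of the accessor pattern described above, in the usual mlpack get/modify style; the `ISRU` class and its members here are hypothetical, not the final implementation:)

```c++
// Hypothetical ISRU layer holding alpha as a member, so it can be
// read and modified after construction via an accessor.
class ISRU
{
 public:
  ISRU(const double alpha = 1.0) : alpha(alpha) { }

  //! Get the alpha parameter.
  double Alpha() const { return alpha; }
  //! Modify the alpha parameter (e.g. layer.Alpha() = 0.7).
  double& Alpha() { return alpha; }

 private:
  //! The tunable alpha parameter.
  double alpha;
};
```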
< PrinceGuptaGitte>
Then every activation function that has parameters needs to be implemented as a layer class. I was just wondering, is that a good thing? Because now we have two ways to make a layer:
< PrinceGuptaGitte>
1) BaseLayer<ActivationFunction, Inp, Out> layer; // only activation functions which are in the activation_functions folder
< kartikdutt18Gitt>
Hi @zoq, when you get a chance, have a look at the new benchmarks in #2178.
< PrinceGuptaGitte>
Hi @kartikdutt18, thanks for your input earlier. I have implemented the ISRU function as a layer (like ELU). I've also run the tests with no errors. Before pushing the code, I was wondering: should I also remove the earlier implementation where alpha was unusable?
< GauravSinghGitte>
Hey everyone, I am done with the implementation of the CELU activation function, but I have doubts about how to cite the paper in the code. The link to the original paper is https://arxiv.org/pdf/1704.07483.pdf. Can somebody tell me how to cite it?
< kartikdutt18Gitt>
@prince776, I will take a look.
< kartikdutt18Gitt>
@gaurav-singh1998, take a look at the Mish activation function; the paper for Mish is cited there.
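(For reference, a sketch of what the citation comment could look like, modeled on the style used for Mish; the BibTeX fields below are filled in from the linked arXiv page and should be double-checked against the paper:)

```c++
/**
 * For more information, see the following paper.
 *
 * @code
 * @article{Barron2017,
 *   author = {Jonathan T. Barron},
 *   title  = {Continuously Differentiable Exponential Linear Units},
 *   year   = {2017},
 *   url    = {https://arxiv.org/abs/1704.07483}
 * }
 * @endcode
 */
```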
< PrinceGuptaGitte>
Thanks, I've pushed it. I hope the implementation is complete.
< Saksham[m]>
<pickle-rick[m] "Saksham: You should check out ht"> Should I look forward towards working on specific project related task(the project I am interested in) or should I resolve general issues for the start ?
< pickle-rick[m]>
In my opinion, general issues would be a great way to get familiar with the codebase, but it's up to you really. You could open up issues / pull requests for the projects you're interested in as well.
< zoq>
jeffin143: I wonder if this has something to do with cotire (caching); we could disable cotire and see if that helps.
< zoq>
PrinceGupta: You could use BaseLayer<MishFunction, InputDataType, OutputDataType>, and I agree that this isn't the same as using ELU, but a user probably never uses BaseLayer directly and rather uses the alias defined at the end of the file: https://github.com/mlpack/mlpack/blob/master/src/mlpack/methods/ann/layer/base_layer.hpp#L208 -> MishFunctionLayer, so the interface is the same. Using the BaseLayer
< zoq>
avoids code duplication.
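(Roughly, the alias pattern zoq is pointing at looks like this; a simplified sketch of base_layer.hpp, not the exact code:)

```c++
// BaseLayer wraps any activation function that exposes static
// Fn()/Deriv() methods (simplified; see base_layer.hpp for the
// real definition).
template<class ActivationFunction = LogisticFunction,
         typename InputDataType = arma::mat,
         typename OutputDataType = arma::mat>
class BaseLayer { /* ... calls ActivationFunction::Fn(), etc. ... */ };

// Alias defined at the end of the file, so users never touch
// BaseLayer directly:
template<typename InputDataType = arma::mat,
         typename OutputDataType = arma::mat>
using MishFunctionLayer =
    BaseLayer<MishFunction, InputDataType, OutputDataType>;
```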
wiking_ has joined #mlpack
wiking_ is now known as wiking
< jeffin143[m]>
Zoq: if cotire (caching) helps a user of mlpack speed things up, I don't mind as a developer hitting make again. I agree it makes a developer's life hard, since he has to run make again and that could mean an hour of time, but I'm not sure if we should remove caching.
< jeffin143[m]>
I should probably go through it once.
< PrinceGuptaGitte>
Hi @kartikdutt18, I've tried doing it the way you suggested, but I can't seem to figure out a way around it. You suggested to use:
< PrinceGuptaGitte>
`x = (x != 0) * arma::pow(y / x, 3) + (x == 0) * DBL_MAX;`, but this still generates NaN values, since we are still adding the term `(x != 0) * arma::pow(y / x, 3)` even when x == 0, which is 0 * 0/0 = NaN. I tried some alternate routes, but they weren't working either.
< jeffin143[m]>
rcurtin: do you work on developing Julia?
< jeffin143[m]>
Also, I am so happy to see mlpack grow by leaps and bounds :)
< jeffin143[m]>
Saturday or Sunday I can probably finish up a PR or some issue, or help with a review.
< PrinceGuptaGitte>
Like, since y = 0 only for x = 0, I tried to replace the 0s in x and y with 1s, so that when we divide we get 1 (the required derivative), but this doesn't work either since y is const, and copying the matrix would be very slow.
< PrinceGuptaGitte>
For the same reason, the border value checking in the Inverse function is also not working.
< kartikdutt18Gitt>
Ohh, I understand: even where the boolean mask is 0, the division by zero still gets evaluated, which gives the NaN.
< kartikdutt18Gitt>
I think rather than this you can use a for loop to set y(i) = Deriv(x(i))
< kartikdutt18Gitt>
and include this edge-case condition in Deriv, in case you haven't already done so.
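(A minimal sketch of that loop-based approach, assuming the ISRU derivative (y / x)^3 from the snippet above, with the x == 0 case, where the derivative is 1, handled explicitly; the names here are illustrative:)

```c++
#include <armadillo>
#include <cmath>

// Element-wise derivative: handle the singular point x == 0 explicitly
// instead of masking, so 0 * (0 / 0) is never evaluated.
void Deriv(const arma::mat& x, const arma::mat& y, arma::mat& dy)
{
  dy.set_size(arma::size(x));
  for (arma::uword i = 0; i < x.n_elem; ++i)
    dy(i) = (x(i) == 0) ? 1.0 : std::pow(y(i) / x(i), 3);
}
```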
< PrinceGuptaGitte>
I considered that, but won't that be slow?
< kartikdutt18Gitt>
Quite the contrary.
< kartikdutt18Gitt>
Refer to #2178.
< PrinceGuptaGitte>
Thanks
< PrinceGuptaGitte>
I thought Armadillo vectorized matrix operations like NumPy.
< kartikdutt18Gitt>
I tested both just in case.
< PrinceGuptaGitte>
@kartikdutt18 I updated the code with the for loop and all the edge case detection in the Deriv and Inverse functions. All tests ran perfectly, and there were no warnings during the build. Please take a look when you get time.
< PrinceGuptaGitte>
Sorry, I think I might have pinged you too much this time; I'll try to do it less often so as not to disturb you.
togo has joined #mlpack
< kartikdutt18Gitt>
No worries, I will take a look.
togo has quit [Remote host closed the connection]
togo has joined #mlpack
togo has quit [Quit: Leaving]
togo has joined #mlpack
UmarJ has quit [Ping timeout: 260 seconds]
wiking has quit [Remote host closed the connection]
< pickle-rick[m]>
Hey, I'm having a little trouble building a test executable... How exactly do you modify and run the CMake files to build the tests? Thanks.
< zoq>
pickle-rick: Hello, are you trying to add a new test?
< zoq>
pickle-rick: Btw. nice picture :)
< pickle-rick[m]>
No, I just want to run the existing q_learning_test file. Glad you liked the pic :)
< zoq>
pickle-rick: I see, in this case you can just build mlpack; that will produce an executable called mlpack_test.
< zoq>
To run all tests of the QLearningTest test suite you could use: bin/mlpack_test -t QLearningTest, or to run only one test: bin/mlpack_test -t QLearningTest/CartPoleWithDQN
< pickle-rick[m]>
Oh cool. Thanks
< pickle-rick[m]>
I'd like to know how to go about adding new tests as well. If you could point me towards some resources, that'd be helpful.
< pickle-rick[m]>
Or should I just look at the tests folder, and figure out the general pattern?
< zoq>
To add a new test, you have to update tests/CMakeLists.txt and add the new test file; for the test file itself, you can take a look at the existing tests for an example.
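(A minimal sketch of what a new test file could look like, following the Boost.Test pattern the existing tests use; the suite and file names below are hypothetical, and the new file would also need an entry in the source list in tests/CMakeLists.txt:)

```c++
// new_feature_test.cpp (hypothetical name), modeled on the existing tests.
#include <mlpack/core.hpp>
#include <boost/test/unit_test.hpp>

BOOST_AUTO_TEST_SUITE(NewFeatureTest);

// A trivial sanity check; a real test would exercise mlpack functionality.
BOOST_AUTO_TEST_CASE(SimpleSanityCheck)
{
  BOOST_REQUIRE_EQUAL(1 + 1, 2);
}

BOOST_AUTO_TEST_SUITE_END();
```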
< pickle-rick[m]>
Ah I see. Appreciate the info!
ocelaiwo[m] has joined #mlpack
wiking has joined #mlpack
ImQ009 has quit [Quit: Leaving]
wiking has quit [Remote host closed the connection]