verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Ping timeout: 256 seconds]
govg has joined #mlpack
Sayan98 has joined #mlpack
Sayan98 has quit [Quit: Hermes - Material IRC Client - https://numixproject.org/]
sshkhrnwbie has joined #mlpack
< sshkhrnwbie> @zoq I have finished up the variance scaling initializer tests. If you could have a look that would be great :)
< sshkhrnwbie> A lot of students interested in the DL project for GSoC are looking forward to implementing different initializers. I saw a PR for Glorot opened recently.
ank04 has joined #mlpack
< sshkhrnwbie> Once the variance scaling initializer is merged they can get started with the Glorot, He, LeCun initializers etc. using it
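A minimal sketch of the idea behind a variance-scaling initializer, assuming the Glorot, He, and LeCun variants differ only in the choice of fan and scale; the helper name and signature below are illustrative, not the API from the PR under discussion:

    // Illustrative only: fill a weight matrix with Gaussian values whose
    // variance is scale / fan. Glorot uses the average of fan-in and fan-out,
    // He uses fan-in with scale = 2, LeCun uses fan-in with scale = 1.
    #include <armadillo>
    #include <cmath>

    void VarianceScalingInit(arma::mat& W,
                             const size_t rows,  // fan-out
                             const size_t cols,  // fan-in
                             const double fan,
                             const double scale)
    {
      const double stddev = std::sqrt(scale / fan);
      W.randn(rows, cols);
      W *= stddev;
    }

For example, Glorot initialization of a 100x50 layer would use fan = (100 + 50) / 2.0 and scale = 1.0.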
sshkhrnwbie has quit [Ping timeout: 260 seconds]
ank04 has quit [Quit: Page closed]
rajeshdm9 has joined #mlpack
< caladrius[m]> zoq: I have implemented the FReLU activation function. Can I submit a PR for that? Also, has any work been done on atrous convolution?
rajeshdm9 has quit [Ping timeout: 260 seconds]
navabhi has joined #mlpack
navabhi has quit [Client Quit]
daivik has joined #mlpack
rajeshdm9 has joined #mlpack
Trion has joined #mlpack
Trion has quit [Client Quit]
Trion has joined #mlpack
Trion has quit [Client Quit]
Trion has joined #mlpack
ank04 has joined #mlpack
ank04 has quit [Client Quit]
rajeshdm9 has quit [Quit: Page closed]
Trion has quit [Remote host closed the connection]
Trion has joined #mlpack
rajeshdm9 has joined #mlpack
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
rajeshdm9 has quit [Ping timeout: 260 seconds]
rajeshdm9 has joined #mlpack
robertohueso has joined #mlpack
rkaushik15 has joined #mlpack
witness has quit [Quit: Connection closed for inactivity]
< dk97[m]> zoq: There is some conflicting documentation in the dropout layer implementation.
< dk97[m]> Also, I closed the PR of SELU layer as there are some mistakes that need to be corrected. I will do so and submit a PR again.
rkaushik15 has quit [Ping timeout: 260 seconds]
manthan has joined #mlpack
samidha has joined #mlpack
< samidha> Hello, I'm completely new to mlpack and started following the tutorials on the website. I've been trying to 'load' a CSV file with the command "data::Load("data.csv", data, true);" after defining a matrix using arma. When I compile, I get an error such as "undefined reference to `bool mlpack::data::Load<double>............" which didn't make much sense to me.
< samidha> I checked the documentation and realised that the variable types for loading a CSV file as a matrix and Load<double> are the same.
< samidha> Can someone help me out?
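For context, an "undefined reference to `bool mlpack::data::Load<double>..." error at this stage usually means the program compiled but was not linked against the mlpack library, rather than anything being wrong in the code itself. A minimal sketch, assuming mlpack and Armadillo are installed in standard locations (the file name and compile line below are illustrative):

    // load_csv.cpp -- load a CSV file into an Armadillo matrix with mlpack.
    #include <mlpack/core.hpp>

    int main()
    {
      arma::mat data;
      // The third argument 'true' makes Load() treat a failed load as fatal.
      mlpack::data::Load("data.csv", data, true);
      data.print("loaded data");
      return 0;
    }

Linking against mlpack (and Armadillo) is the part that is easy to forget, e.g. g++ load_csv.cpp -o load_csv -lmlpack -larmadillo.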
rajeshdm9 has quit [Ping timeout: 260 seconds]
samidha has quit [Quit: Page closed]
ImQ009 has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> mlpack/mlpack#4111 (master - 16d2c10 : Ryan Curtin): The build has errored.
travis-ci has left #mlpack []
manish7294 has joined #mlpack
< manish7294> samidha: Could you please provide the error log or the code you are using to load data? Maybe through pastebin.
daivik has joined #mlpack
Trion_ has joined #mlpack
Trion has quit [Ping timeout: 260 seconds]
witness has joined #mlpack
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
Trion_ has quit [Quit: Entering a wormhole]
govg has quit [Ping timeout: 245 seconds]
travis-ci has joined #mlpack
< travis-ci> yashsharan/models#7 (master - abe5bfa : yoloman): The build has errored.
travis-ci has left #mlpack []
ank04 has joined #mlpack
ank04 has quit [Client Quit]
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
AD_ has quit [Ping timeout: 260 seconds]
< rcurtin> firewall change for the build systems is scheduled for 1000-1200 PST, which is now, so hopefully the systems will shortly be accessible if there are not other problems...
jack_ has joined #mlpack
jack_ has quit [Client Quit]
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#84 (ResizeLayer - 99ec020 : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
manish7294 has quit [Ping timeout: 276 seconds]
daivik has joined #mlpack
daivik has quit [Client Quit]
daivik has joined #mlpack
caladrius has joined #mlpack
caladrius has quit [Client Quit]
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#85 (GAN - c3fc5c3 : Shikhar Jaiswal): The build has errored.
travis-ci has left #mlpack []
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
< dk97[m]> @zoq I had some questions regarding the documentation of the dropout layer.
< zoq> dk97[m]: Like?
< dk97[m]> The documentation in the dropout layer says that if deterministic is true, rescale is disabled.
< dk97[m]> However, the code applies rescale even when deterministic is true
< dk97[m]> @zoq
AD_ has joined #mlpack
< rcurtin> http://masterblaster.mlpack.org/ --- finally up
< zoq> dk97[m]: if deterministic = true and rescale = true, we set output = input;
< zoq> did I miss something?
< dk97[m]> Yes, that should be the case
< dk97[m]> But the code says otherwise
< dk97[m]> @zoq
< zoq> ah, I see, either we have to say false, or use if (rescale) instead of if (!rescale)
< zoq> rcurtin: awesome
< zoq> dk97[m]: do you agree?
robertohueso has quit [Quit: Leaving.]
Prabhat-IIT has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> ShikharJ/mlpack#86 (master - 3c9a3ee : Ryan Curtin): The build has errored.
travis-ci has left #mlpack []
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
< dk97[m]> sorry for the delayed response zoq
< zoq> dk97[m]: No worries.
< dk97[m]> I would say remove the rescale factor
< dk97[m]> during training we are scaling the inputs accordingly
< zoq> dk97[m]: Okay, do you like to open a PR or should I?
< dk97[m]> during testing, we act as if the dropout is not there, and set output = input
< zoq> dk97[m]: Btw. thanks for pointing that out.
< dk97[m]> if it is alright, may I do it?
< dk97[m]> also, we should keep a check for whether ratio is between 0 and 1
< dk97[m]> i forgot to do that in the alpha dropout, so I will add that to the PR I have opened for it.
< dk97[m]> zoq:
< zoq> But during testing we set output = input * scale, if the user asked for it?
< zoq> We usually don't implement parameter checks, at least not inside the methods. There is an open issue; let's see if I can find it.
< dk97[m]> wouldn't that throw off the dropout implementation, if the user is allowed to rescale the output during the test phase?
< dk97[m]> zoq:
manish7294 has joined #mlpack
< zoq> dk97[m]: The scale is somewhat independent, it allows a user to scale the input at test time.
< dk97[m]> I am sorry, but I don't understand why the user should be allowed to scale the input at test time...
< dk97[m]> We are scaling the input during train time so that the expected sum remains the same....
< dk97[m]> Ideally it should be done at test time, but it is easier to code for train time; that is why scaling is done at train time and not at test time
< dk97[m]> zoq:
manish7294 has quit [Ping timeout: 268 seconds]
manish7294 has joined #mlpack
< zoq> dk97[m]: you are right, I was thinking that it would make sense to rescale even during test time.
< dk97[m]> So, should the rescale parameter be removed?
< dk97[m]> zoq:
< zoq> dk97[m]: Yes
< zoq> Btw. no need to put my name at the end :)
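A minimal sketch of the forward logic being agreed on here, not the actual mlpack source: with the rescale parameter removed, training drops units and scales the survivors by 1 / (1 - ratio) so the expected activation matches test time, and the deterministic (test) pass simply copies the input.

    // Illustrative (inverted) dropout forward pass; names are hypothetical.
    #include <armadillo>

    void DropoutForward(const arma::mat& input,
                        arma::mat& output,
                        const double ratio,        // probability of dropping a unit
                        const bool deterministic)  // true at test time
    {
      if (deterministic)
      {
        // Test time: act as if dropout is not there.
        output = input;
      }
      else
      {
        // Train time: drop each unit with probability `ratio` and scale the
        // kept units by 1 / (1 - ratio), so E[output] equals the input.
        const double scale = 1.0 / (1.0 - ratio);
        arma::mat mask = arma::conv_to<arma::mat>::from(
            arma::randu<arma::mat>(input.n_rows, input.n_cols) > ratio);
        output = (input % mask) * scale;
      }
    }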
travis-ci has joined #mlpack
< travis-ci> yashsharan/models#8 (master - 3550028 : yoloman): The build has errored.
travis-ci has left #mlpack []
< dk97[m]> okay!
< dk97[m]> Also, should I put the parameter check or not?
< zoq> I don't think we should add the check right now; we should discuss how we want to handle the checks first.
< dk97[m]> Yeah, I could not find any conclusive comments in the issue.
< zoq> If we handle parameter checks, we should do it for each method not just inside the dropout layer, so there is a unified way.
< dk97[m]> Yes, the same convention should be followed throughout the library
< dk97[m]> I will have a look, and let you know if there can be a way to implement parameter checks so that it is easy for the user to debug.
< zoq> Sounds good.
< dk97[m]> Meanwhile, can I open an issue for the dropout issue, and submit a PR correcting it?
< zoq> No need to open an issue first.
< dk97[m]> okay! :)
daivik has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
daivik has joined #mlpack
daivik has quit [Client Quit]
daivik has joined #mlpack
travis-ci has joined #mlpack
< travis-ci> yashsharan/models#9 (master - a2705c1 : yoloman): The build has errored.
travis-ci has left #mlpack []
< manish7294> rcurtin: I request you to please shed some light on the initialization of the DenseA matrix in the GaussianMatrixSensingSDP test. I went through the paper but couldn't work out how it's being done. I am trying to correlate the way different LRSDPs are implemented; in the case of LMNN we only have its SDP form and will need to generate the LRSDP from it.
< manish7294> Please also tell me whether I am on the right path or not?
daivik has quit [Client Quit]
< rcurtin> manish7294: I am actually not sure on this one, Stephen Tu wrote that test: https://github.com/mlpack/mlpack/commit/e0ef1046133befd2adab352d8dbd24a169d07051#diff-19f28dfe8628cd86d2e2d6b791c4b456
< rcurtin> I thought that the constraint matrices from the regular SDP form applied directly to the LRSDP form and did not need to be changed
< manish7294> rcurtin: yes, that's the way. But I think the initialization needs some more thought.
robertohueso has joined #mlpack
< rcurtin> so, I am not familiar with the matrix sensing SDP problem you are referencing
< rcurtin> so unfortunately I am not sure I am able to provide much useful input for that one
< manish7294> rcurtin: I was referring to the general procedure where we are just given the SDP and need to decide the C, A, and B matrices.
< manish7294> Thanks! I will look more into it.
< rcurtin> ah, sorry, I guess the typical way I have seen these encoded is like in the MVU paper: http://www.aaai.org/Papers/AAAI/2006/AAAI06-280.pdf
< rcurtin> in this case, the constraint matrices A and B are determined by the constraints of the problem on page 1684: "Maximize trace(K) subject to..."
< rcurtin> and then constraints (1), (2), and (3) get re-expressed in the right form
< rcurtin> I guess, the question you are asking is, how does one go from the formulation on pg. 1684 to a constraint of the form needed by the mlpack SDP...
< manish7294> Right! This is what is bothering me.
< rcurtin> in this case I'd suggest looking at the Monteiro and Burer paper, eq. (1)
< rcurtin> they express an SDP as a minimization of C * X (for * denoting the inner product of matrices) plus lots of constraints A_i * X = b_i
< rcurtin> so, I seem to remember, the first constraint of MVU (K_ii - 2K_ij + K_jj = || x_i - x_j ||^2) results in an A matrix with only four nonzero entries (either ones or negative ones) at the locations (i, i), (i, j), (j, i) and (j, j)
< rcurtin> and the b_i is just || x_i - x_j ||^2
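A minimal sketch of how one such MVU constraint turns into an (A_i, b_i) pair; this only builds the sparse matrix with Armadillo and does not touch mlpack's SDP or LRSDP classes, so how it is registered with the solver is left open:

    // One constraint K_ii - 2 K_ij + K_jj = ||x_i - x_j||^2 written as
    // <A, K> = b, with A having exactly four nonzero entries. Illustrative only.
    #include <armadillo>

    void MVUConstraint(const size_t i, const size_t j,
                       const arma::vec& xi, const arma::vec& xj,
                       const size_t n,    // K is n x n
                       arma::sp_mat& A,   // output constraint matrix
                       double& b)         // output right-hand side
    {
      A = arma::sp_mat(n, n);
      A(i, i) = 1.0;
      A(j, j) = 1.0;
      A(i, j) = -1.0;
      A(j, i) = -1.0;

      const double d = arma::norm(xi - xj);
      b = d * d;
    }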
< manish7294> Yes, using that format we can convert any SDP to the LRSDP standard form. But that conversion holds the trick.
< manish7294> You are absolutely right about MVU
< rcurtin> right, so I think basically for each SDP that you want to convert to the form above, it takes a little reverse engineering to figure out the right representation
< rcurtin> unfortunately like I said earlier I am not fully familiar with the SDP in the test case you are mentioning, so I don't know how the conversion is done there
< rcurtin> it might be worth sending an email to Stephen Tu and asking if he can shed any light on it
< manish7294> Absolutely. Thanks for helping me. I will further try to investigate.
< rcurtin> and I also wouldn't rule out the possibility that there is a bug there, so if you have an alternate derivation that leads to a different result, it's possible that is correct and the one currently implemented is not
< rcurtin> heh, I am not sure how much help I was this time :)
manthan has quit [Ping timeout: 260 seconds]
< dk97[m]> zoq: PR has been submitted for the drop-out fix. 🙂
< manish7294> rcurtin: Thanks, you always help me get out of the hole :)
< rcurtin> sure, but like I said, I don't think I even answered your question, I just said a bunch about SDPs that I think you already knew then pointed you to Stephen :)
< manish7294> One more silly query: I tried subscribing to mail-list but never got any confirmation mail. Could you help me with that? :)
< zoq> dk97[m]: Thanks, Can you take a look at the style guidelines, looks like the indentation is off.
< dk97[m]> The indentations are four spaces instead of two like the rest of the code...
robertohueso has quit [Quit: Leaving.]
< dk97[m]> zoq: saw the style checks failing, modifying the code for the same.
< zoq> dk97[m]: Great!
travis-ci has joined #mlpack
< travis-ci> yashsharan/models#10 (master - 186d997 : yash sharan): The build has errored.
travis-ci has left #mlpack []
< rcurtin> manish7294: did you subscribe at lists.mlpack.org/mailman/listinfo/mlpack/ ?
< manish7294> yes, I have tried many times over the past 3 months
< rcurtin> that's strange, with what email?
< manish7294> manish887kr@gmail.com
ImQ009 has quit [Quit: Leaving]
< rcurtin> and what happens when you try to subscribe?
< manish7294> A confirmation message is shown saying that I may get a confirmation mail, depending on the subscription settings
< manish7294> Your subscription request has been received, and will soon be acted upon. Depending on the configuration of this mailing list, your subscription request may have to be first confirmed by you via email, or approved by the list moderator. If confirmation is required, you will soon get a confirmation email which contains further instructions.
< rcurtin> can you give me the exact URL you are subscribing to?
< rcurtin> or, the exact URL to the page you are subscribing on
< rcurtin> have you possibly checked your spam folder?
< manish7294> yes
< rcurtin> Feb 26 16:48:52 knife postfix/smtp[6117]: 73CE6700225: to=<manish887kr@gmail.com>, relay=gmail-smtp-in.l.google.com[74.125.138.26]:25, delay=0.27, delays=0.01/0.01/0.07/0.17, dsn=2.0.0, status=sent (250 2.0.0 OK 1519681732 d1-v6si1665696ybn.562 - gsmtp)
< rcurtin> the message goes to gmail's servers successfully
< manish7294> but I haven't received anything
< manish7294> :(
< rcurtin> I can see that other gmail users are able to subscribe
< rcurtin> but mailservers and spam filtering are weird so let me just manually add you
< manish7294> Thanks that will be great help :)
< rcurtin> now you are subscribed... you should get a welcome email
< manish7294> Not yet
< rcurtin> I think there may be something in your settings that is filtering out the emails from mlpack.org
< manish7294> Is it possible for you to mail me the password for my account?
< rcurtin> I don't know what it is; it's set by mailman automatically
< manish7294> Thanks! It can't be helped. I will stick to IRC then :)
Prabhat-IIT has quit [Ping timeout: 260 seconds]
< manish7294> Making a new yahoo account worked :)
robertohueso has joined #mlpack
< dk97[m]> zoq: I am not sure why a style check error is still coming...
< zoq> dk97[m]: The logistic regression and decision tree issues are unrelated, we will fix those.
< dk97[m]> Okay, do let me know if you want anything else changed! 🙂
< zoq> yeah, I'll comment on the PR once I get a chance to take a closer look.
< dk97[m]> cool! Thanks a lot.
AD_ has quit [Ping timeout: 260 seconds]
manish7294 has quit [Remote host closed the connection]
Prabhat-IIT has joined #mlpack