verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
travis-ci has joined #mlpack
< travis-ci>
mlpack/mlpack#2586 (master - 24f3127 : Marcus Edel): The build was fixed.
< rcurtin>
hmm, maybe I should give carbon copy invoices to my coworkers :)
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Client Quit]
mikeling has joined #mlpack
< ironstark>
rcurtin: I will write the blog today. I am trying to fix the shogun problem; in case that doesn't happen, I'll use someone else's machine to use shogun and make the contributions.
< sgupta>
rcurtin: I have installed apache2 on masterblaster. Now, how can I redirect some URL on masterblaster to Apache's index.html file in /var/www/html?
< sgupta>
since localhost redirects to the Jenkins home page
< sgupta>
I found a link describing how to run Apache and Jenkins together: https://yakiloo.com/setup-apache-and-jenkins/ . But this seems to mess up the current configuration, so I think we can just put all the tarballs in a local directory.
< sgupta>
rcurtin: also, I'm trying to make a shell script which asks the user for library versions and then installs those specific versions.
govg has quit [Ping timeout: 240 seconds]
govg has joined #mlpack
kris1 has quit [Quit: kris1]
< sgupta>
rcurtin: As far as I can tell, the boost and armadillo versions we need have the same build instructions. That's a good thing when it comes to automation.
< sgupta>
rcurtin: for now, my plan is to generate a Dockerfile with the shell script; then one can run a docker build using that file. Is this good? If you have something different in mind, let me know.
sheogorath27 has quit [Remote host closed the connection]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Client Quit]
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack
< rcurtin>
sgupta: I think it's reasonable to just run apache on a different port
< rcurtin>
then there will be no overlap with Jenkins and we can easily use iptables to restrict access
< sgupta>
rcurtin: any specific port number?
< sgupta>
rcurtin: I set Listen to port 5005 and restarted Apache
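(For context: on a stock Debian/Ubuntu apache2 install, the Listen directive lives in /etc/apache2/ports.conf, and the matching <VirtualHost> port in the site config under /etc/apache2/sites-enabled/ may also need updating before a restart takes effect on the new port; if an existing firewall rule filters the port, the server won't be reachable externally even when Apache is listening. This is a note about the default layout, not a claim about masterblaster's configuration.)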
sheogorath27 has joined #mlpack
< kris1>
Just a quick question: since the weights of the layer reference the parameter matrix (which in SGD is called iterate and is passed by reference), why do we pass the iterate/parameter matrix to the Gradient() function in FFN? It's used only by the Evaluate() function, which I think could also have used the locally available parameter matrix. Is it to maintain consistency with some other functions that use SGD?
< kris1>
Sorry for the long question. Please let me know if you don’t understand it
< zoq>
kris1: yes, for consistency; we have to make sure that every function that can be optimized, e.g. by SGD, implements the same interface.
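(Concretely, the decomposable-function interface that mlpack's SGD expects looks roughly like the sketch below; the signatures are paraphrased from the optimizer documentation of the time, and ExampleFunction and its placeholder objective are made up for illustration:)

    #include <mlpack/core.hpp>

    // Sketch of the interface every SGD-optimizable function implements.
    class ExampleFunction
    {
     public:
      // Number of separable terms in the objective (e.g. training points).
      size_t NumFunctions() const { return 1; }

      // Objective value of the i'th term at the given parameters.
      double Evaluate(const arma::mat& parameters, const size_t /* i */)
      { return arma::accu(parameters); /* placeholder objective */ }

      // Write the gradient of the i'th term into 'gradient'.  The parameters
      // are passed in even when the class could read its own copy, so that
      // FFN and every other optimizable class satisfy the same signature.
      void Gradient(const arma::mat& parameters, const size_t /* i */,
                    arma::mat& gradient)
      { gradient.ones(parameters.n_rows, parameters.n_cols); }
    };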
< rcurtin>
sgupta: sounds good, we can add the iptables rules in a few days, there's no huge hurry on that part
< sgupta>
rcurtin: I can't access the Apache server on that port. Need help with this one.
< sgupta>
rcurtin: Also, if you get time, give your feedback on the other things I mentioned.
< rcurtin>
sure, the shell script idea seems reasonable; I'd suggest not asking for user input but just taking it from the command line
< rcurtin>
i.e.
< rcurtin>
$ ./generate-dockerfile armadillo-x.y.z boost-x.y.z or something like that
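(The idea being that the script interpolates the requested versions into a Dockerfile template: a FROM line for a base image, then RUN steps that fetch and build the given armadillo and boost tarballs, after which a plain 'docker build .' in that directory produces the image. The script name and argument format above are rcurtin's suggestion, not an existing tool.)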
mikeling has quit [Quit: Connection closed for inactivity]
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack
< ironstark>
I decided to delete all the libraries and install them once again, because the installations I had were not consistent with the current benchmark implementation versions. Now I am facing errors while installing mlpack itself; I need help. This is my CMakeError.log file: https://paste.ubuntu.com/24882616/
< rcurtin>
!~.
< rcurtin>
oops :)
< rcurtin>
ironstark: is your c++ compiler working?
< rcurtin>
you could try compiling a simple hello world program, i.e. 'g++ main.cpp' where main.cpp is some simple program
< rcurtin>
is the 'build-essential' package installed?
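(A minimal test program for that compiler check could be, e.g.:)

    // main.cpp -- compiler sanity check: 'g++ main.cpp && ./a.out'
    #include <iostream>

    int main()
    {
      std::cout << "hello, world" << std::endl;
      return 0;
    }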
< ironstark>
g++ works
< rcurtin>
can you get any more output from what the actual failure is?
< rcurtin>
I see a couple of things here, it looks primarily like boost isn't installed though
< rcurtin>
Could not find the following Boost libraries:
< rcurtin>
boost_program_options
< rcurtin>
boost_unit_test_framework
< rcurtin>
boost_serialization
< rcurtin>
if you are lazy you can do 'apt-get install libboost-all-dev'; if you are not lazy, you can install only the necessary boost libraries :)
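(On Debian/Ubuntu, the non-lazy variant for the three libraries named above would likely be 'apt-get install libboost-program-options-dev libboost-test-dev libboost-serialization-dev'; libboost-test-dev is the package that provides boost_unit_test_framework. Package names assume the usual Debian naming scheme.)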
< kris1>
Just a question: the parameters in the FFN are a row vector. Should the computed gradients also be a row vector
< kris1>
or a column vector
< kris1>
since when I reset the layer weights and everything, I get the gradients as a matrix while the parameters are a row vector
< lozhnikov>
kris1: gradients have the same dimensions as the corresponding parameters (for example FFN creates an arma::mat object which has the same dimensions as the parameters matrix and each layer resets its gradient matrix according to the dimensions of the parameters)
< kris1>
Oh, I see, you mean after the gradients are computed the resetGradient() function is called
< kris1>
lozhnikov:
< lozhnikov>
kris1: I think the particular form of the gradient matrix is not important if you don't compute matrix operations
< kris1>
Hmmm, I don't get what you mean by that
< lozhnikov>
I mean any matrix operations with the gradient matrix
< kris1>
right now, my solution to the training was to flatten the gradient matrix and subtract it from the parameter/iterate matrix; I do not reset the gradient matrix
< lozhnikov>
I think you do that only in the CDk optimizer (iterate -= stepSize * gradient)
< lozhnikov>
So, the dimensions of the gradient matrix should correspond to the dimensions of the parameters matrix
< kris1>
Hmmm yes
< kris1>
okay got your point
< kris1>
I am doing exactly that; I am actually using vectorise on the gradient matrix to make it a column vector, the same as the parameters
< kris1>
I was just wondering whether I should do it the other way, i.e. make the parameter matrix's shape equal to the gradient matrix's
< lozhnikov>
I don't think that operator- works properly if the dimensions of the two matrices are not equal
< kris1>
Yes, that's why I said I am vectorising the gradient matrix to make its size equal to the parameter matrix's
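(The update under discussion, as a minimal self-contained Armadillo sketch; the names iterate, gradient, and stepSize follow the conversation, and the shapes are made up for illustration:)

    #include <armadillo>

    int main()
    {
      // Parameters stored flattened, as a column vector.
      arma::mat iterate(6, 1, arma::fill::randu);
      // Layer gradient computed as a matrix with the same number of elements.
      arma::mat gradient(2, 3, arma::fill::randu);

      const double stepSize = 0.01;

      // vectorise() flattens the gradient column-by-column into a 6x1 column
      // vector, so the dimensions agree with iterate and operator-= is valid.
      iterate -= stepSize * arma::vectorise(gradient);

      iterate.print("updated iterate:");
      return 0;
    }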
govg has quit [Ping timeout: 240 seconds]
kris1 has quit [Quit: kris1]
kris1 has joined #mlpack
sumedhghaisas_ has joined #mlpack
sumedhghaisas_ has quit [Ping timeout: 260 seconds]
sumedhghaisas has joined #mlpack
< kris1>
I can't read the mnist dataset from outside the mlpack folder
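(If the problem is a relative path, loading by absolute path is independent of the working directory; a minimal sketch, assuming a CSV copy of the data at a hypothetical location:)

    #include <mlpack/core.hpp>
    #include <iostream>

    int main()
    {
      arma::mat mnist;
      // The path below is hypothetical; with fatal = true, data::Load()
      // throws a std::runtime_error (instead of returning false) if the
      // file cannot be read, which makes path problems easier to spot.
      mlpack::data::Load("/home/kris/data/mnist_train.csv", mnist, true);

      std::cout << mnist.n_rows << " x " << mnist.n_cols << std::endl;
      return 0;
    }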