verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
virtualgod has joined #mlpack
agobin has quit [Quit: Connection closed for inactivity]
ftuesca has quit [Quit: Leaving]
Neuron1k has joined #mlpack
Neuron1k has quit [Ping timeout: 250 seconds]
Nilabhra has joined #mlpack
kirizaki has joined #mlpack
Mathnerd314 has quit [Ping timeout: 240 seconds]
mentekid has joined #mlpack
ranjan123 has quit [Ping timeout: 250 seconds]
ank_95_ has joined #mlpack
ranjan123_ has quit [Ping timeout: 250 seconds]
ranjan123 has joined #mlpack
kirizaki has quit [Ping timeout: 250 seconds]
Neuron1k has joined #mlpack
KeonKim has joined #mlpack
Nilabhra has quit [Remote host closed the connection]
agobin has joined #mlpack
Neuron1k has quit [Quit: Page closed]
ftuesca has joined #mlpack
Mathnerd314 has joined #mlpack
agobin has quit [Quit: Connection closed for inactivity]
ank_95_ has quit [Quit: Connection closed for inactivity]
Neuron1k has joined #mlpack
skon46 has joined #mlpack
uzipaz has joined #mlpack
< uzipaz> zoq: hi, I just started compiling programs using mlpack, I am using the first example code, http://www.mlpack.org/docs/mlpack-2.0.1/doxygen.php?doc=sample.html
< uzipaz> but I am getting a bunch of undefined references when I compile; I use g++ test.cpp -o test.o -std=gnu++11 -Wfatal-errors
< uzipaz> can you help me with this?
< uzipaz> I think it's a linking error...?
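For context, the sample program at that doc page is, roughly, the small covariance computation below (paraphrased, not a verbatim copy). The undefined references are mlpack symbols that live in libmlpack.so, which is why the program has to be linked against it, as worked out later in the discussion.

    #include <mlpack/core.hpp>

    using namespace mlpack;

    int main()
    {
      // Load the dataset from data.csv; 'true' makes a load failure fatal.
      arma::mat data;
      data::Load("data.csv", data, true);

      // Compute the covariance of the (column-major) data.
      arma::mat cov = data * trans(data) / data.n_cols;

      // Save the result.
      data::Save("cov.csv", cov, true);
    }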
Keon has joined #mlpack
KeonKim has quit [Ping timeout: 244 seconds]
< zoq> sorry for the short response, heading home
< uzipaz> zoq: I tried linking the library on the command line by using -l/usr/local/include/libmlpack.so but that didn't work
< uzipaz> zoq: I also have NetBeans installed with C++ support, so I made a project there and specified libmlpack.so in the /usr/local/lib/ directory, and it works now
< uzipaz> *sorry, I used -l/usr/local/lib/libmlpack.so
Neuron1k has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
< uzipaz> zoq: anyway, I guess I'm finally ready to apply an ANN to my dataset :D
agobin has joined #mlpack
KeonKim has joined #mlpack
Keon has quit [Ping timeout: 240 seconds]
ranjan123_ has joined #mlpack
< ranjan123_> uzipaz:
< ranjan123_> can you paste the error here?
< ranjan123_> I think I got the same error
< ranjan123_> but resolved it
< ranjan123_> I can help you
skon46 has quit [Ping timeout: 268 seconds]
skon46 has joined #mlpack
< uzipaz> I was trying to compile on the command line; I think I just need to specify the mlpack lib file, but it didn't work for me... I am using NetBeans with C++, and it works for now
< ranjan123_> yes! same thing I did
< ranjan123_> :)
< ranjan123_> ooo
< ranjan123_> sorry
< ranjan123_> "it didn't work"
< ranjan123_> if you paste the error I can tell you; else you can go ahead with whatever works. :)
< uzipaz> In function `__static_initialization_and_destruction_0(int, int)': test.cpp:(.text+0xb34): undefined reference to `mlpack::util::CLIDeleter::CLIDeleter()' test.cpp:(.text+0xb43): undefined reference to `mlpack::util::CLIDeleter::~CLIDeleter()' collect2: error: ld returned 1 exit status
< uzipaz> probably I need to link to libmlpack when compiling... I tried using -l/usr/local/lib/libmlpack.so but it didn't work
< uzipaz> zoq: I would like to ask you some questions about building ANNs in mlpack; I am going through the test code, and there are different types of layers that I am not sure how to differentiate
Neuron1k has joined #mlpack
< skon46> uzipaz, try running this:
< skon46> ldconfig -v
< skon46> the list of libraries should be updated after this.
< skon46> I guess I faced a similar problem
tsathoggua has joined #mlpack
< uzipaz> skon46: Thanks for the suggestion, I will try it and let you know
< ranjan123_> hello uzipaz
< ranjan123_> have you exported the lib path?
< ranjan123_> then do ldconfig
< ranjan123_> g++ your_prog.cpp -lmlpack -lboost_serialization -larmadillo -std=c++11
< uzipaz> ranjan123_: I didn't export the lib path
< ranjan123_> ok
< ranjan123_> I guess you have compiled mlpack
< ranjan123_> then you get the *.so file at build/lib/
< ranjan123_> right?
< ranjan123_> if yes then do LD_LIBRARY_PATH=".parent_path"/build/lib/:$LD_LIBRARY_PATH
< ranjan123_> then export LD_LIBRARY_PATH
< ranjan123_> then ldconfig
< ranjan123_> then try to compile---> g++ your_prog.cpp -lmlpack -lboost_serialization -larmadillo -std=c++11
< ranjan123_> uzipaz
< uzipaz> ranjan123_: Thanks, I will try this and let you know
< ranjan123_> sure
< ranjan123_> LD_LIBRARY_PATH=".parent_path"/build/lib/:$LD_LIBRARY_PATH <---- same as doing ----> LD_LIBRARY_PATH=`pwd`/build/lib/:$LD_LIBRARY_PATH (when run from the parent directory)
< ranjan123_> ;:P
< ranjan123_> :P
< uzipaz> ranjan123_: thanks for the help, appreciate it :)
< ranjan123_> works ?
< uzipaz> zoq: I have a question about line 109 in recurrent_network_test.cpp, I cannot find this constructor in SGD.hpp
< uzipaz> ranjan123_: I have not tried this yet; right now I'm using mlpack in NetBeans
< zoq> uzipaz: There is only one constructor in sgd.hpp, so I'm not sure which part I should clarify.
< zoq> uzipaz: Maybe decltype(net)?
< uzipaz> zoq: I was just confused about the parameters passed in the constructor at line 109 in recurrent_network_test.cpp
KeonKim has quit [Ping timeout: 246 seconds]
< zoq> uzipaz: So to create the optimizer object you have to specify the type of the function and the function itself that you would like to optimize.
< zoq> uzipaz: To get the type of the function, use 'decltype(net)', which returns the type of the network.
< zoq> uzipaz: Depending on the model structure it looks different, e.g. RNN<LinearLayer<arma::mat, arma::mat>, BiasLayer<arma::mat, arma::mat>, ...>
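A minimal sketch of that idea, assuming the mlpack 2.0.x optimizer interface: `net`, `trainData`, and `trainLabels` stand for an already-built network (see the layer sketch further down) and its column-major training data, and the numeric values are placeholders, not the values used in the test file.

    #include <mlpack/core/optimizers/sgd/sgd.hpp>

    // decltype(net) expands to the full network type, e.g.
    // RNN<LinearLayer<arma::mat, arma::mat>, BiasLayer<arma::mat, arma::mat>, ...>,
    // so the optimizer can be instantiated without spelling the type out.
    mlpack::optimization::SGD<decltype(net)> opt(
        net,
        0.01,     // step size
        100000,   // maximum number of iterations
        1e-5,     // tolerance
        true);    // shuffle the order in which points are visited

    net.Train(trainData, trainLabels, opt);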
< uzipaz> zoq: I was concerned about the other parameters, such as the step size and # of iterations passed to the SGD constructor, but I just passed my own values after studying the constructor in sgd.hpp
< zoq> uzipaz: Yeah, it's always a good idea to specify your own parameters, especially if you use SGD
< uzipaz> zoq: I have studied artificial neural networks a bit, on the web and in my course, but I have never encountered terms such as LinearLayer and BaseLayer; can you please explain the purpose of these?
virtualgod has quit [Quit: Connection closed for inactivity]
< uzipaz> damn, I am getting more linking errors while compiling a basic RNN with SGD
< zoq> uzipaz: Sure, BaseLayer is just a wrapper class that can be used with various activation functions. So instead of writing BaseLayer<LogisticFunction> you can just use SigmoidLayer<>, TanHLayer<>, ReLULayer<>, etc. So the BaseLayer basically acts as a transfer function to introduce a non-linearity after a parameterized layer.
< uzipaz> /usr/bin/ld: build/Debug/GNU-Linux/main.o: undefined reference to symbol 'wrapper_dgemm_'
< zoq> So the LinearLayer applies a linear transformation to the incoming data. It's sometimes called a fully-connected layer, because it's basically a full connection between two layers (# of units).
< uzipaz> //usr/lib/libarmadillo.so.4: error adding symbols: DSO missing from command line
< zoq> uzipaz: you need to link against armadillo
agobin has quit [Quit: Connection closed for inactivity]
< zoq> uzipaz: We basically use the same names as torch, caffe, tensorflow, etc.
< uzipaz> zoq: thank you, linking against armadillo solved the error
< zoq> uzipaz: Just curious: which names are you using to model your networks?
< uzipaz> zoq: I'm sorry, what do you mean by which names I'm using?
< zoq> uzipaz: or terms: 'I have studied artificial neural networks a bit, on the web and in my course, but I have never encountered terms such as LinearLayer and BaseLayer'
< uzipaz> zoq: so far I've only seen input layers, then hidden layers, and then finally an output layer... also, you can have bias layers connecting to each of the layers that you have
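To make those terms concrete, here is a rough sketch of how the layer types compose in the mlpack 2.0.x-era ANN code, in the style of the network tests discussed above. inputSize, hiddenSize, and outputSize are placeholders, and exact header paths and template parameters may differ between versions, so treat this as a sketch rather than copy-paste code.

    #include <tuple>
    #include <mlpack/core.hpp>
    #include <mlpack/methods/ann/ffn.hpp>
    #include <mlpack/methods/ann/layer/linear_layer.hpp>
    #include <mlpack/methods/ann/layer/bias_layer.hpp>
    #include <mlpack/methods/ann/layer/base_layer.hpp>
    #include <mlpack/methods/ann/layer/binary_classification_layer.hpp>
    #include <mlpack/methods/ann/init_rules/random_init.hpp>
    #include <mlpack/methods/ann/performance_functions/mse_function.hpp>

    using namespace mlpack::ann;

    // input -> hidden: a LinearLayer holds the fully-connected weights, a
    // BiasLayer adds the bias term, and the activation layer adds the
    // non-linearity (SigmoidLayer<> is shorthand for BaseLayer<LogisticFunction>).
    LinearLayer<> inputLayer(inputSize, hiddenSize);
    BiasLayer<> inputBias(hiddenSize);
    SigmoidLayer<> inputActivation;

    // hidden -> output.
    LinearLayer<> hiddenLayer(hiddenSize, outputSize);
    BiasLayer<> hiddenBias(outputSize);
    SigmoidLayer<> outputActivation;

    // Output layer that turns the final activation into a class decision.
    BinaryClassificationLayer classOutputLayer;

    // The layers are tied together; their combined type is carried by
    // decltype(modules), just like decltype(net) for the optimizer earlier.
    auto modules = std::tie(inputLayer, inputBias, inputActivation,
                            hiddenLayer, hiddenBias, outputActivation);

    FFN<decltype(modules), decltype(classOutputLayer), RandomInitialization,
        MeanSquaredErrorFunction> net(modules, classOutputLayer);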
mentekid has quit [Ping timeout: 252 seconds]
< uzipaz> zoq: I am getting a segmentation fault at runtime when I call the Train function on the RNN; I have double-checked my parameters to the Train method and I have verified my matrices
palashahuja has joined #mlpack
< zoq> uzipaz: Can you show me the structure of the model, pastebin or GitHub gist?
< uzipaz> zoq: can you see this? http://pastebin.com/sAiJfhJA
< uzipaz> zoq: I used the same model as done in recurrent_network_test.cpp
< zoq> uzipaz: Okay, the model looks good; can you check that the input data and labels look as expected? I guess I can also check that if you send me the dataset or parts of the dataset.
< zoq> But it might take some time before I can look at the data.
< uzipaz> zoq: the data matrix is 1123 x 1586 and the labels matrix is 1123 x 1; how do I share the dataset with you?
< zoq> uzipaz: Can you provide a public link?
< zoq> uzipaz: You can also send me the dataset via mail.
< uzipaz> zoq: I prefer by email, address?
< zoq> uzipaz: Okay, https://github.com/zoq
< uzipaz> zoq: I just sent you the email, I will be back here in 20 min
< zoq> uzipaz: Okay, I'll have to finish some other things first, but I'll look into it tonight.
< palashahuja> zoq, hi
< zoq> palashahuja: Hello
< palashahuja> is there any way I could see the call trace in the machine learning benchmarks library?
< zoq> palashahuja: There is no built-in feature for that, but I can show you the line where you could print it. If you like, you can implement that feature :)
< palashahuja> Sure, if you could provide more insight into how to go about the code, I'd be glad to help :)
uzipaz has quit [Ping timeout: 250 seconds]
< zoq> palashahuja: So e.g. in https://github.com/zoq/benchmarks/blob/master/methods/mlpack/allknn.py at line 129 we use the subprocess.check_output function to make the call, and if you print 'cmd' it should contain the command used to make that call
< palashahuja> zoq: ok, got it ..
skon46 has quit [Read error: Connection reset by peer]
< palashahuja> will start working on it ..
< zoq> palashahuja: so I guess the best solution would be to check the value of 'verbose', and if it's true you could print out the command
agobin has joined #mlpack
< palashahuja> hmm ..
< palashahuja> will look into it ASAP ..
< zoq> palashahuja: Sure no rush, and don't feel obligated to do that.
< zoq> Maybe it's a good idea to introduce some verbosity levels, so instead of verbose being just true or false it could take some value.
< palashahuja> so the verbosity levels should be numeric?
< zoq> palashahuja: Yeah, I guess that's a good idea
palashahuja has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
uzipaz has joined #mlpack
< uzipaz> zoq: in the call NN.Train(data, labels, opt), data is 1123 x 1586 and labels is 1123 x 1; this is giving me a segmentation fault
< uzipaz> zoq: I just tried NN.Train(data, trans(labels), opt) and it worked (no runtime error)
< zoq> uzipaz: sounds great :)
< uzipaz> zoq: Do I need to pass the transpose of both the data and labels matrices to Train? I'm confused...
< zoq> uzipaz: No; note that mlpack uses Armadillo matrices, which are stored in a column-major format. Take a look at http://www.mlpack.org/docs/mlpack-2.0.1/doxygen.php?doc=matrices.html for more information. So each column contains a label: col(0), col(1), col(2), ...
< zoq> uzipaz: The same applies to the input data: each column should contain one sample
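A small illustration of that column-major convention, assuming mlpack's data::Load() and placeholder file names: after loading, each column of the data matrix is one sample, and the labels should likewise end up with one label per column, which is why trans(labels) fixed the crash above.

    #include <mlpack/core.hpp>

    int main()
    {
      arma::mat data, labels;

      // data::Load() transposes CSV input by default, so each *column* of
      // `data` holds one sample after loading; 'true' makes failures fatal.
      mlpack::data::Load("dataset.csv", data, true);
      mlpack::data::Load("labels.csv", labels, true);

      // If the labels came out as an n x 1 column vector, transpose them so
      // there is one label per column (1 x n), matching data.col(i).
      if (labels.n_cols == 1)
        labels = labels.t();

      // net.Train(data, labels, opt);  // with a network `net` and optimizer `opt`
      return 0;
    }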
ank_95_ has joined #mlpack
< uzipaz> zoq: can you please look into the dataset I sent you... I am unable to load the dataset into an arma::mat object using either the arma::mat::load() or data::Load() functions
< uzipaz> the dimensions are loaded correctly but all the values in the matrix are 0
< zoq> uzipaz: remove the '"' (quote) characters from the file
< uzipaz> zoq: thank you, that fixed it
< uzipaz> zoq: Everything seems to be in order now: dimension of the data 1123x1586, total data points 1.78 million, binary classification, running an RNN with SGD with 1 hidden layer
< uzipaz> zoq: it's been running for about 10 min now, no idea how long it will take to finish
mentekid has joined #mlpack
uzipaz has quit [Ping timeout: 250 seconds]
ank_95_ has quit [Quit: Connection closed for inactivity]
agobin has quit [Quit: Connection closed for inactivity]