verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas__ has joined #mlpack
kris1 has left #mlpack []
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Client Quit]
chenzhe has quit [Ping timeout: 255 seconds]
chenzhe has joined #mlpack
chenzhe has quit [Ping timeout: 268 seconds]
kris1 has joined #mlpack
sumedhghaisas__ has quit [Ping timeout: 260 seconds]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
kris1 has quit [Quit: kris1]
mentekid has quit [Quit: Leaving.]
govg has joined #mlpack
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
mentekid has joined #mlpack
sumedhghaisas has joined #mlpack
shikhar has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
shikhar has quit [Quit: WeeChat 1.7]
sgupta has quit [Ping timeout: 260 seconds]
sgupta has joined #mlpack
kris1 has joined #mlpack
mikeling has joined #mlpack
kris1 has quit [Quit: kris1]
govg has quit [Ping timeout: 268 seconds]
kris1 has joined #mlpack
< rcurtin> lozhnikov: hi there, I have a random question... :)
< rcurtin> I have recently come across the opportunity to obtain an Elektronika DVK-3M (I guess the correct spelling is Электроника ДВК-3М):
< rcurtin> since these are not typically known in the US, I wanted to know if you have ever seen one, know anything about them, or have any idea how easy they are to get in Russia
< lozhnikov> rcurtin: Hi Ryan, I haven't seen one before; I am not familiar with such ancient computers at all :)
< lozhnikov> Why do you want to obtain it? I think this computer would take up a lot of space (I don't know its dimensions, but I think it is really big)
< lozhnikov> rcurtin: I looked through 3 pages of Google search results and found only 2 advertisements. The first seller wants about $650; the second one already sold (in May 2017) for ~$250
< kris1> Why are the resetGradients functions useful? I know they set the dimensions of the gradients for whatever the gradient was computed for. But since the main use of the gradients is in the SGD function, where the parameters are a column vector and we subtract the gradients (also a column vector) from them, what is the need to reset the gradients at all?
< kris1> Also, lozhnikov, I am not able to understand the cross-entropy evaluation metric used by the deeplearning.net guys; I don't understand how they get y' (y predicted) from the RBM.
< lozhnikov> kris1: gradients are used for the backpropagation technique
< kris1> Do you think it is required in the case of RBMs?
< lozhnikov> kris1: If I remember right, they don't use the cross-entropy at all. They evaluate it but don't use it.
< kris1> Yes, but they show that the cross-entropy is going down at each iteration.
< lozhnikov> kris1: I think that is not needed in the case of RBMs.
< kris1> Okay, I have tried the RBM on the r10 dataset; the pseudo-likelihood was going up and down a lot.
< kris1> I tried different learning rates
< kris1> but that didn't help.
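For context on the pseudo-likelihood kris1 mentions: a minimal sketch of the stochastic proxy used in the deeplearning.net RBM tutorial, written here with Armadillo. FreeEnergy and PseudoLikelihood are illustrative names (not mlpack's API), and the sketch assumes a binary RBM whose weight matrix W has shape hiddenUnits x visibleUnits.

    #include <armadillo>
    #include <cmath>

    // Free energy of a binary RBM: F(v) = -b'v - sum_j log(1 + exp(c_j + (W v)_j)),
    // with visible bias b, hidden bias c, and weight matrix W (hidden x visible).
    double FreeEnergy(const arma::vec& v, const arma::mat& W,
                      const arma::vec& visibleBias, const arma::vec& hiddenBias)
    {
      const arma::vec preActivation = W * v + hiddenBias;
      return -arma::dot(visibleBias, v) -
          arma::accu(arma::log(1.0 + arma::exp(preActivation)));
    }

    // Stochastic pseudo-likelihood estimate: flip one visible bit and compare free
    // energies; the tutorial rotates flipIndex over the visible units each step.
    double PseudoLikelihood(const arma::vec& v, const arma::mat& W,
                            const arma::vec& visibleBias, const arma::vec& hiddenBias,
                            const size_t flipIndex)
    {
      arma::vec vFlip = v;
      vFlip(flipIndex) = 1.0 - vFlip(flipIndex);

      const double fe = FreeEnergy(v, W, visibleBias, hiddenBias);
      const double feFlip = FreeEnergy(vFlip, W, visibleBias, hiddenBias);

      // N * log(sigmoid(F(vFlip) - F(v))), where N is the number of visible units.
      return v.n_elem * std::log(1.0 / (1.0 + std::exp(-(feFlip - fe))));
    }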
< lozhnikov> Have you fixed the Forward() function yet? If I remember right, you multiply the input by the same matrix in both layers
< lozhnikov> however, one of them should multiply the input by the transposed matrix
< kris1> Yes, I have fixed that now; I updated the PR yesterday, you can have a look. I think I have not renamed BinomialRandom to BernoulliRandom.
kris1 has quit [Quit: kris1]
shikhar has joined #mlpack
< lozhnikov> kris1: No, you didn't, look at binary_rbm_impl.hpp:58
< lozhnikov> LogisticFunction::Fn(weight * input + otherBias, output);
< lozhnikov> one layer (I don't remember which one, the visible or the hidden) should multiply the output by the transposed weight matrix
kris1 has joined #mlpack
< kris1> Sorry, my net got interrupted. But if you look at binary_rbm_impl line 44, I changed the initialisation to insize * outsize for the hidden layer
< kris1> so no need to transpose.
< lozhnikov> kris1: No, you didn't, look at binary_rbm_impl.hpp:58
< lozhnikov> LogisticFunction::Fn(weight * input + otherBias, output);
< lozhnikov> one layer (I don't remember which one, the visible or the hidden) should multiply the output by the transposed weight matrix
< lozhnikov> [I am not sure that you received these messages]
< kris1> So the weight matrix still points to the parameters from 0 to insize * outsize, but its shape is now (insize, outsize), and the shape of the weight matrix for the visible layer would be (outsize, insize)
< kris1> I did receive the message; look at line 44. I think that answers why I didn't transpose.
< lozhnikov> kris1: If you just change the shape of the data you will not get the transposed matrix
< kris1> Ohhhh, sorry, you are right.
< kris1> that was pretty silly.
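To make the transpose point concrete: a minimal sketch of the two directions of the RBM forward step, assuming a weight matrix of shape hiddenUnits x visibleUnits and mlpack's LogisticFunction as in the quoted line. The include path, helper names, and signatures below are illustrative, not the actual binary_rbm_impl.hpp code.

    #include <armadillo>
    #include <mlpack/methods/ann/activation_functions/logistic_function.hpp>

    using mlpack::ann::LogisticFunction;

    // Visible -> hidden: h = sigmoid(W * v + c).
    void VisibleToHidden(const arma::mat& weight, const arma::vec& hiddenBias,
                         const arma::vec& visible, arma::vec& hidden)
    {
      LogisticFunction::Fn(weight * visible + hiddenBias, hidden);
    }

    // Hidden -> visible: v = sigmoid(W^T * h + b).  Note the explicit transpose:
    // reinterpreting the same parameter block with a different shape does not
    // transpose the matrix.
    void HiddenToVisible(const arma::mat& weight, const arma::vec& visibleBias,
                         const arma::vec& hidden, arma::vec& visible)
    {
      LogisticFunction::Fn(weight.t() * hidden + visibleBias, visible);
    }

Reshaping a column-major parameter block only reinterprets the same memory with different dimensions; calling .t() is what actually swaps rows and columns, which is the mistake discussed above.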
kris1 has quit [Quit: kris1]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas has joined #mlpack
vivekp has quit [Ping timeout: 258 seconds]
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
kris1 has joined #mlpack
shikhar has quit [Quit: WeeChat 1.7]
vivekp has joined #mlpack
< rcurtin> lozhnikov: yeah, it is a strange hobby I have I guess :)
< rcurtin> the prices you saw sound about like what I am being offered, I dunno about shipping though
< rcurtin> I think the DVK-3M is not so huge, just the size of a regular desktop plus monitor as far as I understand
< zoq> Reminds me of this guy: https://www.youtube.com/watch?v=45X4VP8CGtk the presentation is really entertaining/interesting.
< lozhnikov> rcurtin: I didn't find any info about the size but according to the picture you are right
< lozhnikov> To me it's huge; however, I've got 3 acoustic guitars... so I think your hobby is not so strange :)
kris1 has left #mlpack []
< sgupta> rcurtin: hey! I am installing boost libraries from source. Just want to know whether they go to /usr/local or somewhere else?
< sgupta> zoq: can you answer this one?
sumedhghaisas__ has joined #mlpack
< zoq> sgupta: If you install libboost-all-dev via apt-get they go in /usr/lib/, so you can do that, but you can always use another directory and specify the boost path during the mlpack build process, if cmake can't find the libs.
sumedhghaisas__ has quit [Ping timeout: 260 seconds]
< sgupta> zoq: okay will check.
< zoq> sgupta: I think I would go with /usr/lib
< sgupta> zoq: while installing boost from source?
< sgupta> zoq: yes, I will try building some version of mlpack and see what works.
< zoq> sgupta: yes, or do you install multiple boost versions on the same host?
< sgupta> zoq: No, just a single boost version.
< zoq> okay, in this case /usr/lib is probably the best option
< sgupta> zoq: okay :)
sumedhghaisas__ has joined #mlpack
kris1 has joined #mlpack
mikeling has quit [Quit: Connection closed for inactivity]
sumedhghaisas__ has quit [Ping timeout: 260 seconds]
sumedhghaisas has joined #mlpack
kris1 has left #mlpack []
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas__ has joined #mlpack
mentekid has quit [Quit: Leaving.]
sumedhghaisas has joined #mlpack
sumedhghaisas has quit [Quit: http://www.kiwiirc.com/ - A hand crafted IRC client]
sumedhghaisas__ has quit [Ping timeout: 260 seconds]