ChanServ changed the topic of #mlpack to: Due to ongoing spam on freenode, we've muted unregistered users. See http://www.mlpack.org/ircspam.txt for more information, or join #mlpack-temp and chat there.
wenhao has joined #mlpack
wenhao has left #mlpack []
wenhao has joined #mlpack
< wenhao> lozhnikov: Hi Mikhail! About the work product submission for GSoC, how about I write it in the form of a blog post and send you the link to preview before submission?
cjlcarvalho has quit [Quit: Konversation terminated!]
cjlcarvalho has joined #mlpack
caiojcarvalho has joined #mlpack
cjlcarvalho has quit [Ping timeout: 244 seconds]
wenhao has quit [Quit: Page closed]
< zoq> wenhao: If you like, you can write the report in the form of a blog post; examples from previous years:
< zoq> rcurtin: Any ideas regarding the failing LMNNAccuracyTest test case? Not sure; should we disable the test for now?
caiojcarvalho has quit [Quit: Konversation terminated!]
cjlcarvalho has joined #mlpack
< akhandait> zoq: Sorry for asking about this again, but I am still a little confused about why we do this:
< akhandait> sorry, that's wrong, this:
< akhandait> how is mappedError (which comes from the next layer) related to padW and padH of the current layer? We do take padding into consideration here:
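For context, since the code links above are elided from the log: the error map a convolution layer receives has the dimensions of that layer's output, and padding enters only through the output-size formula. A minimal sketch of that bookkeeping, assuming standard convolution arithmetic and hypothetical parameter values; the names (mappedError, padW, padH) follow the conversation and this is not the actual mlpack source:

    #include <armadillo>
    #include <iostream>

    int main()
    {
      // Hypothetical layer parameters (illustration only).
      const arma::uword inputWidth = 28, inputHeight = 28;
      const arma::uword kW = 5, kH = 5;      // kernel size
      const arma::uword dW = 1, dH = 1;      // stride
      const arma::uword padW = 2, padH = 2;  // padding

      // Padding is already accounted for in the output-size formula, so
      // the error coming back from the next layer has exactly these
      // dimensions; no further division by padW/padH should be needed.
      const arma::uword outputWidth = (inputWidth - kW + 2 * padW) / dW + 1;
      const arma::uword outputHeight = (inputHeight - kH + 2 * padH) / dH + 1;

      // The backward pass reshapes the incoming error gy to the output size.
      arma::vec gy(outputWidth * outputHeight, arma::fill::randn);
      arma::cube mappedError(gy.memptr(), outputWidth, outputHeight, 1);

      std::cout << "mappedError slice: " << mappedError.n_rows << " x "
                << mappedError.n_cols << std::endl;  // 28 x 28
      return 0;
    }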
cjlcarvalho has quit [Quit: Konversation terminated!]
< zoq> Just checked the code, and there is no need to cut the error if padding > 1. The DCGAN test should fail, and I think this is a result of a wrong kernel size.
< akhandait> zoq: To be clear, did you mean that the DCGAN test passes because of a wrong kernel size, or did you mean that the current code (dividing by padding) was a result of a wrong kernel size?
< zoq> akhandait: No, I guess it slipped through the last modifications.
< akhandait> zoq: Okay, I tried removing /padH and /padW, but the DCGAN test still passes, and so does the convolutional network test (which I modified to use non-zero padding).
< akhandait> How are they both passing either way?
< zoq> akhandait: Did you build with DEBUG=ON? The test failed on my side.
< akhandait> zoq: Oh, I think I built with the default configuration
< zoq> in this case DEBUG is set to OFF
< zoq> "If you run CMake with no options, it will configure the project to build with no debugging symbols and no profiling information"
< akhandait> I will try it again with debug on
< akhandait> zoq: Yes, it failed now. So, with DEBUG off, what source code are the tests compiled with?
< zoq> With DEBUG=OFF, the optional Armadillo bounds checking is disabled at compile time to get more speed.
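As a quick illustration of what that bounds check does (a sketch assuming the release build defines ARMA_NO_DEBUG, which compiles the check out):

    #include <armadillo>
    #include <iostream>

    int main()
    {
      arma::mat m(3, 3, arma::fill::zeros);

      try
      {
        // Out of bounds for a 3x3 matrix: with the check enabled
        // (debug build), operator() throws std::logic_error; with
        // ARMA_NO_DEBUG defined, the check is compiled out and the
        // read is silent undefined behaviour, so a buggy test can
        // appear to pass.
        const double x = m(5, 5);
        std::cout << "no check: read " << x << std::endl;
      }
      catch (const std::logic_error& e)
      {
        std::cout << "bounds check caught: " << e.what() << std::endl;
      }

      return 0;
    }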
< akhandait> zoq: Oh, now I understand.
< akhandait> zoq: Thanks for reviewing the PR.
ImQ009 has joined #mlpack
< akhandait> zoq: I think there has been some confusion here. The DCGAN test fails when we DON'T divide by padding. So, were you saying that we don't need to divide by padding, or that we DO need to divide by padding? Just to note, the ConvolutionalNetworkTest (I modified the first layer to have padding 2) passes either way (whether we divide by padding or not).
< zoq> akhandait: I haven't looked into the DCGAN test issue; my first guess is that it uses a wrong kernel/padding/stride parameter, which causes the issue. There should be no need to divide by the padding value.
< zoq> ShikharJ: Do you see any reason to do so?
< akhandait> zoq: Okay, got it.
ShikharJ_ has joined #mlpack
< ShikharJ_> zoq: Sorry, I didn't follow the context of the conversation. akhandait, can you send me a link to the changes that you're making? Then maybe I can try to provide an explanation.
< akhandait> ShikharJ: I am not making any changes here; I couldn't understand why we divide by padding here:
< ShikharJ_> zoq: Also, please let me know if any changes are being made to the convolutional infrastructure of mlpack. I'd like to review them before we merge anything new. I stopped following the repository because my e-mail was flooded with conversations.
< zoq> ShikharJ_: For sure, you are the expert, so any input is much appreciated.
vivekp has joined #mlpack
< ShikharJ_> akhandait: I think it was done to reduce the error map, in order to leave out the area of the error map that was convolved out of the additional padding. But now that I think of it, I do see an issue here. Let me check that again.
< ShikharJ_> akhandait: I think this was only done for the speedup benefits. Ideally, there should be no reduction whatsoever, even if it gives a speedup, because we're deliberately losing a bit of information and restricting the gradient computation to only the input cells.
< akhandait> ShikharJ_: That's what I was thinking; we are losing gradients due to this. I will remove it in my next PR.
< akhandait> ShikharJ_: As Marcus said, do check the DCGAN test. It fails when we remove the division by padding.
< ShikharJ_> akhandait: That is expected, because we'll also have to remove the reduction that is happening later in the routine. But I'm also sensing a problem with removing the reduction.
< ShikharJ_> akhandait: Let me make the changes and add a few comments.
< akhandait> ShikharJ_: Sure, I will just make the changes locally and work with them until your PR gets merged.
< ShikharJ_> akhandait: I'll have to reduce the error mapping in a way that the matrix multiplication doesn't produce zeros (imagine a 1x1 input with padding 2 and a 2x2 kernel: the first few convolutions would only produce zeros in the output matrix, and no matter how you update the gradients, the error will stay the same because it would always lead to zeros).
< ShikharJ_> akhandait: I think it was because of this philosophy that this reduction had been implemented, and I think zoq had also mentioned this once to me.
< ShikharJ_> akhandait: The padded zeros are hardly ever useful for generating the gradients.
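A small numeric check of the 1x1-input example above, sketched with arma::conv2 rather than mlpack's convolution rules:

    #include <armadillo>
    #include <iostream>

    int main()
    {
      // 1x1 input with value 3, zero-padded by 2 on each side -> 5x5.
      arma::mat padded(5, 5, arma::fill::zeros);
      padded(2, 2) = 3.0;

      arma::mat kernel = {{0.5, -1.0},
                          {2.0,  0.25}};  // arbitrary 2x2 kernel

      // Full 2-D convolution of the padded input gives a 6x6 output,
      // but only the four positions whose window overlaps the single
      // input cell are nonzero, whatever the kernel weights are; the
      // rest come purely from padded zeros and can never contribute
      // to the kernel gradient.
      arma::mat out = arma::conv2(padded, kernel, "full");
      out.print("output:");
      std::cout << "nonzero outputs: " << arma::accu(out != 0)
                << " of " << out.n_elem << std::endl;  // 4 of 36
      return 0;
    }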
ShikharJ_ has quit [Quit: Page closed]
< zoq> ShikharJ_: Thanks for looking into the Gradient method.
vivekp has quit [Ping timeout: 268 seconds]
ImQ009 has quit [Ping timeout: 256 seconds]