verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
< manish7294>
In all the main.cpp files, random.hpp (which uses BINDING_TYPE) is included well before mlpack_main.cpp (which defines BINDING_TYPE), and this is the case in almost all of the methods.
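As a generic illustration of the ordering problem described above (the file names and the Mode() function below are hypothetical, not mlpack code): a header that branches on a macro must be included after the macro is defined, otherwise the preprocessor silently selects the other branch.

    // feature.hpp -- hypothetical header that branches on BINDING_TYPE.
    #ifndef FEATURE_HPP
    #define FEATURE_HPP

    #if defined(BINDING_TYPE) && BINDING_TYPE == 1
    inline const char* Mode() { return "binding-specific behaviour"; }
    #else
    inline const char* Mode() { return "default behaviour"; }
    #endif

    #endif // FEATURE_HPP

    // program_main.cpp -- the header is pulled in before the macro is
    // defined, so the #else branch above has already been selected.
    #include <cstdio>
    #include "feature.hpp"

    #define BINDING_TYPE 1  // Too late: feature.hpp is already preprocessed.

    int main() { std::printf("%s\n", Mode()); }  // Prints "default behaviour".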
< Atharva>
zoq: The `Backward()` function in the FFN class doesn't go over the first layer of the network, but when the first layer is a Sequential layer, it should. Otherwise, the layers inside the Sequential object are left with empty errors.
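A schematic sketch of the issue, not mlpack's actual FFN implementation: LayerLike, ContainerLayer, and BackwardPass() below are simplified assumptions standing in for the real layer types, the Sequential layer, and FFN::Backward(). The point is only that a backward loop stopping at index 1 never calls Backward() on layer 0, so a container sitting there never hands errors down to its children.

    #include <memory>
    #include <vector>

    // Simplified stand-in for a network layer (hypothetical interface).
    struct LayerLike
    {
      virtual ~LayerLike() { }
      // Propagate the error term into this layer.
      virtual void Backward() = 0;
    };

    // A container layer in the spirit of Sequential: its children only
    // receive their error terms when the container's Backward() runs.
    struct ContainerLayer : LayerLike
    {
      std::vector<std::unique_ptr<LayerLike>> children;

      void Backward() override
      {
        for (auto& child : children)
          child->Backward();
      }
    };

    // Typical backward pass: the loop stops before index 0, so if
    // network[0] is a ContainerLayer, its children keep empty errors.
    void BackwardPass(std::vector<std::unique_ptr<LayerLike>>& network)
    {
      // Walk from the last layer down to (and including) index 1 only.
      for (size_t i = network.size(); i-- > 1; )
        network[i]->Backward();
      // network[0]->Backward() is never called here.
    }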
vivekp has quit [Ping timeout: 240 seconds]
vivekp has joined #mlpack
travis-ci has joined #mlpack
< travis-ci>
manish7294/mlpack#54 (evalBounds - 35af793 : Manish): The build was fixed.
< zoq>
Right, in most cases the backward call of the first layer isn't needed; the Sequential layer is an exception. I think there are two solutions here: either we add an identity layer, or we check inside the FFN class whether the layer implements the Model function.
< Atharva>
zoq: Even if we add an identity layer, we will have to check if the layer implements the Model function. Instead, in that case, we can just call the BackwardVisitor once more. Am I right here?
< zoq>
If we add an Identity layer before the Sequential layer, we will call the backward of the Sequential layer, since it's then the second layer and not the first. Perhaps I missed something?
< zoq>
If we check for the Model function, which acts as an indicator, we don't have to insert an extra identity layer.
< Atharva>
zoq: Yes, you are right. My doubt is: do we ask the users to add the identity layer before the Sequential layer, or do we add it ourselves? In the latter case, we would have to check for the Model function anyway, right?
< zoq>
Atharva: Right, I guess the second idea might be the way to go; less user interaction. What do you think?
< Atharva>
zoq: I think that's better too. So, while adding a layer, we would have to check whether it has the Model() function and whether it is the first layer of the network. If yes, we add an Identity layer before it.
< Atharva>
Or, another option can be to check if the first layer has the Model() function and just run the BackwardVisitor() on it if it does. In this case, we don't have to add an extra layer, as only the backward function is concerned with it.
< zoq>
Agreed, that's easier.
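A sketch of the option agreed on here, written against the toy classes from the earlier snippet rather than mlpack's real visitor machinery: after the usual loop, check whether the first layer is a container that exposes sub-layers (the Model() check discussed above) and, if so, give it one extra backward call instead of inserting an Identity layer.

    // The dynamic_cast stands in for the "does this layer implement
    // Model()?" check; mlpack would express this through its visitors.
    void BackwardPassFixed(std::vector<std::unique_ptr<LayerLike>>& network)
    {
      for (size_t i = network.size(); i-- > 1; )
        network[i]->Backward();

      if (!network.empty() &&
          dynamic_cast<ContainerLayer*>(network[0].get()) != nullptr)
        network[0]->Backward();  // Extra visit so its children get errors.
    }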
< Atharva>
zoq: Okay then, I will make these changes in one of my PRs.
< zoq>
Great, but don't feel obligated; we could use the identity solution for now, if you like.
< Atharva>
zoq: It's not a problem; I have already made a lot of changes locally, and these additional ones are not much.
cjlcarvalho has joined #mlpack
ImQ009 has joined #mlpack
cjlcarvalho has quit [Remote host closed the connection]