ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
< AlexNguyenGitter> Just a small question: does our library make use of the computer's GPU?
< NitishChaturvedi> Yes same doubt
< abernauer[m]> mlpack doesn't use the GPU currently, though there is a backend in development that will support GPU acceleration
The_LoudSpeaker has quit [Quit: Leaving bye!]
The_LoudSpeaker has joined #mlpack
The_LoudSpeaker has quit [Changing host]
The_LoudSpeaker has joined #mlpack
ImQ009 has joined #mlpack
< zoq> AlexNguyenGitter: NitishChaturvedi: At this point, all you can do is link against NVBLAS instead of BLAS or OpenBLAS.
< rcurtin[m]> right, GPU support is still in development via a GPU-based drop-in replacement for Armadillo: https://gitlab.com/conradsnicta/bandicoot-code
< rcurtin[m]> progress is slow there, but moving along little by little :) contributions are welcome if you are interested
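For readers following along: NVBLAS is NVIDIA's drop-in BLAS that intercepts level-3 BLAS calls (e.g. dgemm) and runs them on the GPU, so an Armadillo/mlpack program needs no source changes, only a different link line. A minimal sketch under that assumption; the exact link order and the nvblas.conf setup are system-dependent, so treat the compile command in the comment as illustrative rather than authoritative:

```cpp
// Plain Armadillo code: any GPU offload comes entirely from linking NVBLAS
// ahead of the CPU BLAS, roughly along the lines of:
//   g++ gemm_demo.cpp -o gemm_demo -lnvblas -lopenblas
//   NVBLAS_CONFIG_FILE=./nvblas.conf ./gemm_demo
#include <armadillo>
#include <iostream>

int main()
{
  arma::mat A(2000, 2000, arma::fill::randu);
  arma::mat B(2000, 2000, arma::fill::randu);

  // The matrix product is dispatched to BLAS (dgemm); with NVBLAS linked in
  // front of OpenBLAS, that call is offloaded to the GPU.
  arma::mat C = A * B;

  std::cout << "trace(C) = " << arma::trace(C) << std::endl;
  return 0;
}
```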
< zoq> I'm almost sure we are the first ones to try to use cooperative groups with NVRTC.
< zoq> On the positive side, I can pass the correct GPU arch, so there's no need to hardcode compute_xx.
< rcurtin[m]> nice, well that is an improvement regardless
< zoq> Also, I was wondering: is there any benefit to defining the kernels as a string, instead of putting them in a file and reading the file(s)?
< rcurtin[m]> yeah, I'm not sure, I was thinking that standalone kernel files might be better
< rcurtin[m]> that would also mean I could modify the kernels without needing to recompile my bandicoot program...
< zoq> I was talking about the kernel source: instead of doing std::string source = "....", having them in a file so I don't have to use " and \n etc. all the time.
< rcurtin[m]> right, exactly, agreed
< zoq> and just read the file as string
< rcurtin[m]> if they're in standalone files, then I can modify, e.g., accu.kernel.h or whatever we call it, and the accu program doesn't need to be recompiled, since accu.kernel.h is read and compiled by NVRTC or OpenCL at runtime
< zoq> I miss my syntax highlighting :)
< rcurtin[m]> same...
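To make the file-based idea above concrete, here is a rough sketch (not bandicoot's actual code) of loading a kernel from a standalone file, arbitrarily named accu.kernel.cu for the example, and compiling it with NVRTC at runtime, so editing the kernel does not require rebuilding the host program. The arch flag is only a placeholder for the value zoq mentions detecting at runtime:

```cpp
#include <nvrtc.h>
#include <fstream>
#include <sstream>
#include <stdexcept>
#include <string>

// Slurp a kernel source file into a single string.
std::string ReadKernelFile(const std::string& path)
{
  std::ifstream f(path);
  if (!f)
    throw std::runtime_error("cannot open kernel file: " + path);
  std::ostringstream buf;
  buf << f.rdbuf();
  return buf.str();
}

int main()
{
  const std::string src = ReadKernelFile("accu.kernel.cu");

  nvrtcProgram prog;
  nvrtcCreateProgram(&prog, src.c_str(), "accu.kernel.cu", 0, nullptr, nullptr);

  // The architecture can be filled in at runtime instead of hardcoding
  // compute_xx; "compute_70" here is just a placeholder.
  const char* opts[] = { "--gpu-architecture=compute_70" };
  const nvrtcResult res = nvrtcCompileProgram(prog, 1, opts);

  if (res != NVRTC_SUCCESS)
  {
    size_t logSize = 0;
    nvrtcGetProgramLogSize(prog, &logSize);
    std::string log(logSize, '\0');
    nvrtcGetProgramLog(prog, &log[0]);
    throw std::runtime_error("NVRTC compile failed:\n" + log);
  }

  // The resulting PTX would then be loaded with the CUDA driver API
  // (cuModuleLoadData) and launched; omitted here.
  size_t ptxSize = 0;
  nvrtcGetPTXSize(prog, &ptxSize);
  std::string ptx(ptxSize, '\0');
  nvrtcGetPTX(prog, &ptx[0]);

  nvrtcDestroyProgram(&prog);
  return 0;
}
```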
< RishabhGarg108Gi> Hey all, I was recently looking at issue #1254, where we are discussing a CLI for ANN. There we have to make a parser for building a model from the configuration file. My doubt here is: do we need to implement, let's say, a JSON parser, or can we use `boost`'s JSON parser? I am asking this because we are trying to limit our dependence on boost libraries, so I wanted to know what others think about it.
< zoq> RishabhGarg108Gi: I guess you can also just use a super simple format like: layername param1 param2 param3
< zoq> RishabhGarg108Gi: That one could be parsed pretty easily.
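As a rough illustration of how easily that whitespace-separated format parses (the layer names and parameter layout below are made up for the example, not a proposed mlpack format):

```cpp
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

struct LayerSpec
{
  std::string name;                 // e.g. "convolution"
  std::vector<std::string> params;  // e.g. {"16", "3", "3"}
};

// Parse lines of the form "layername param1 param2 param3".
std::vector<LayerSpec> ParseConfig(std::istream& in)
{
  std::vector<LayerSpec> layers;
  std::string line;
  while (std::getline(in, line))
  {
    if (line.empty() || line[0] == '#')  // Skip blank lines and comments.
      continue;

    std::istringstream tokens(line);
    LayerSpec layer;
    tokens >> layer.name;
    std::string param;
    while (tokens >> param)
      layer.params.push_back(param);
    layers.push_back(layer);
  }
  return layers;
}

int main()
{
  std::istringstream config("convolution 16 3 3\nrelu\nlinear 10\n");
  for (const LayerSpec& l : ParseConfig(config))
    std::cout << l.name << " (" << l.params.size() << " params)\n";
  return 0;
}
```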
< RishabhGarg108Gi> @zoq, we can do that and it would work just fine. But I think it would be nicer to have a structured way to define the configuration. That way we can provide a template that users can simply copy and fill the details into.
< RishabhGarg108Gi> In this way, users can easily enter the parameters against their names, for example `{ "layer": "convolution", "filters": "16", "kernel size": "3X3"}`, rather than writing something like `convolution 16 3X3`
< RishabhGarg108Gi> Please correct me if I am overlooking something :)
< zoq> Yeah, that would be nice; maybe we can reuse cereal to do the parsing for us.
< RishabhGarg108Gi> I am not familiar with cereal. Can you direct me to some links where I can see more about cereal's parser. Thanks :D
< zoq> It's not a parser; we use it for serialization/deserialization, but it can output and read JSON.
< zoq> So my idea would be to just write a simple class with some parameters and see if the output looks easy enough.
< RishabhGarg108Gi> Sorry I didn't fully understand. What do you mean by "output" here ? Do you mean the output that I get after parsing ?
< zoq> I was talking about the output file of the serialization, so something like class Test { int param; }; Test test; data::Save("test.json", test);
< RishabhGarg108Gi> Oh, Okay. I will try it out. Thanks!
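For reference, a minimal sketch of what zoq describes, using cereal directly rather than mlpack's data::Save wrapper (the class and field names are invented for the example). Serializing a small class to a JSON archive shows what the resulting configuration file would look like:

```cpp
#include <cereal/archives/json.hpp>
#include <cereal/types/string.hpp>
#include <iostream>
#include <sstream>
#include <string>

class Test
{
 public:
  std::string layer = "convolution";
  int filters = 16;
  int kernelSize = 3;

  // cereal's serialization hook; CEREAL_NVP keeps the member names in the
  // JSON output so the file stays human-readable and editable.
  template<typename Archive>
  void serialize(Archive& ar)
  {
    ar(CEREAL_NVP(layer), CEREAL_NVP(filters), CEREAL_NVP(kernelSize));
  }
};

int main()
{
  std::ostringstream out;
  {
    cereal::JSONOutputArchive archive(out);  // Flushes on destruction.
    Test test;
    archive(CEREAL_NVP(test));
  }
  std::cout << out.str() << std::endl;
  return 0;
}
```

Reading the same JSON back with cereal::JSONInputArchive is symmetric, which is what would let one file format serve both as saved model metadata and as a user-written configuration.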
theloudspeaker_ has joined #mlpack
The_LoudSpeaker has quit [Read error: Connection reset by peer]
theloudspeaker_ has quit [Client Quit]
The_LoudSpeaker has joined #mlpack
The_LoudSpeaker has quit [Changing host]
The_LoudSpeaker has joined #mlpack
ImQ009 has quit [Quit: Leaving]