ChanServ changed the topic of #mlpack to: "mlpack: a fast, flexible machine learning library :: We don't always respond instantly, but we will respond; please be patient :: Logs at http://www.mlpack.org/irc/"
UmarJ has quit [Ping timeout: 265 seconds]
UmarJ has joined #mlpack
amanpratapsingh has joined #mlpack
amanpratapsingh has quit [Quit: Connection closed]
ImQ009 has joined #mlpack
amanpratapsingh has joined #mlpack
< amanpratapsingh> Hello everyone
amanpratapsingh has quit [Quit: Connection closed]
Abhi22 has joined #mlpack
Abhi22 has quit [Client Quit]
abhi_22[m] has joined #mlpack
ib07 has quit [Ping timeout: 256 seconds]
ib07 has joined #mlpack
Abhi22 has joined #mlpack
Abhi22 has quit [Client Quit]
gaulishcoin has quit [Ping timeout: 256 seconds]
The_LoudSpeaker has quit [Quit: Leaving bye!]
The_LoudSpeaker has joined #mlpack
< AyushSingh[m]> Thinking of adding BERT tokenization to mlpack, as it is the foundation block upon which further NLP models can be built. Should I continue?
< AyushSingh[m]> For reference, I shall use <https://arxiv.org/abs/1810.04805> and their GitHub repo.
< AlexNguyenGitter> wow that's so nice! I have used BERT but have not had a chance to dig into the algorithm
< zoq> AyushSingh[m]: Would be great to have it; that said, there is already a tokenizer at https://github.com/mlpack/mlpack/blob/bf837f950b728bf08b80ccc696c6bc0d296e54af/src/mlpack/core/data/string_encoding.hpp that could be useful.
< AyushSingh[m]> Thanks zoq, string_encoding.hpp seems useful.
< AyushSingh[m]> Yes Alex Nguyen, I'm hoping to gain more insight into BERT while working on this task.
ib07 has quit [Ping timeout: 260 seconds]
ib07 has joined #mlpack
ImQ009 has quit [Quit: Leaving]
ib07 has quit [Ping timeout: 264 seconds]