verne.freenode.net changed the topic of #mlpack to: http://www.mlpack.org/ -- We don't respond instantly... but we will respond. Give it a few minutes. Or hours. -- Channel logs: http://www.mlpack.org/irc/
< uzipaz>
zoq: thank you. About the OutputClass function in BinaryClassificationLayer, would you like to add a 'double' argument for confidenceThreshold?
< zoq>
uzipaz: sure
tsathoggua has joined #mlpack
tsathoggua has quit [Quit: Konversation terminated!]
Mathnerd314 has joined #mlpack
mrbean_ has joined #mlpack
uzipaz has quit [Quit: Page closed]
Nilabhra has joined #mlpack
ranjan123 has joined #mlpack
< ranjan123>
hello! Anybody up there?
< ranjan123>
Can anybody please clarify this?
< ranjan123>
< ranjan123_> wasiq: I mean, if I wish to write a new optimization technique, should I assume that the function I want to optimize is a sum of other functions?
< ranjan123_> i.e. min(f(x))
< ranjan123_> where f(x) = f1(x) + f2(x) + f3(x) + ...
ranjan123 has quit [Ping timeout: 250 seconds]
kronos has joined #mlpack
kronos has quit [Quit: leaving]
Mathnerd314 has quit [Ping timeout: 264 seconds]
alpha has joined #mlpack
awhitesong has quit [Ping timeout: 260 seconds]
alpha__ has joined #mlpack
alpha has quit [Ping timeout: 250 seconds]
ranjan123 has joined #mlpack
alpha__ has quit [Ping timeout: 250 seconds]
Nilabhra has quit [Remote host closed the connection]
mrbean_ has quit [Ping timeout: 250 seconds]
alpha has joined #mlpack
alpha__ has joined #mlpack
< alpha__>
ranjan123: hi
< alpha__>
did your query get resolved?
< ranjan123>
na !
< ranjan123>
I think they are sleeping! :P The time there is 5:47 AM! :P
alpha has quit [Ping timeout: 250 seconds]
< alpha__>
ranjan123: see, there are two properties that the cost function must satisfy .. I don't know if it's a convention or a condition ..
< ranjan123>
what are the 2 properties ?
< alpha__>
One of the properties is that the cost function over the overall inputs should be expressible as the sum of cost functions over the individual inputs
< alpha__>
the benefit of this is ..
< alpha__>
(i) we sometimes choose a random set from the given inputs to get an idea of the cost or error
< alpha__>
so if the training set is huge, to save computation time we choose a random subset and compute the cost function on it ..
< alpha__>
getting me?
< ranjan123>
yes ! you are right !
< alpha__>
I think that is why it is called stochastic gradient descent .. because we select a subset of the inputs randomly ..
< ranjan123>
I am thinking of different types of cost functions!
< alpha__>
When we use all the inputs given in the training set, it is simply called gradient descent ..
< alpha__>
I might be wrong here ..
< alpha__>
different types of cost functions as in ?
< ranjan123>
no no, you are right!
< alpha__>
cool ..
< alpha__>
any other query that I can help you with ?
< ranjan123>
like say ! f(x)=f1(x)+f2(x)+f3(x)
< ranjan123>
where f1(x)=(x-3)^2
< ranjan123>
f2(x) = e^|x-2|
< ranjan123>
f3(x) = something
< alpha__>
ohh.. I think it's something else ..
< ranjan123>
it may not happen that f1==f2==f3
< alpha__>
because I'm talking about the case where F(x) = F(x1) + F(x2) ..
< ranjan123>
in the case of regression, f1 == f2 == f3 == f4 == f5 ..
< ranjan123>
Evaluating a function with a function index
< alpha__>
I'll look into it .. will let you know if I find something :)
< ranjan123>
hmmm !.
< ranjan123>
:)
< alpha__>
ranjan123:
< ranjan123>
yes
< alpha__>
I think so, it's just a test function where the objective function can be decomposed into different functions .. in this case the 3 mentioned functions ..
< ranjan123>
ahh ok !
< ranjan123>
so, let us say f(x) = x^2 - 4x + 4
virtualgod has quit [Quit: Connection closed for inactivity]
< ranjan123>
we want to minimize f(x)
< alpha__>
yes ..
< ranjan123>
so the minimum is at x = 2
< ranjan123>
so: f1(x)=x^2
< ranjan123>
f2(x)=-4*x
< ranjan123>
f3(x)=4
< ranjan123>
f(x)=f1+f2+f3
< alpha__>
I think so .. yes
< ranjan123>
I think there are some other conditions
< ranjan123>
because the gradient of f3 is always zero
< ranjan123>
whatever !
< ranjan123>
let us see what they say
< ranjan123>
:)
< ranjan123>
thanks alpha__ . nice talking to you!
< ranjan123>
:)
< alpha__>
ranjan123: same here :)
mrbean has joined #mlpack
alpha__ has quit [Ping timeout: 250 seconds]
alpha has joined #mlpack
alpha has left #mlpack []
wasiq has joined #mlpack
awhitesong has joined #mlpack
ranjan123 has quit [Ping timeout: 250 seconds]
Nilabhra has joined #mlpack
Mathnerd314 has joined #mlpack
uzipaz has joined #mlpack
< uzipaz>
zoq: the dataset I'm using the FFN on has nominal attributes only, with only three values each ... if I convert my attributes to binary, will using a binary step function as the activationFunction help?
ftuesca has joined #mlpack
virtualgod has joined #mlpack
Mathnerd314 has quit [Ping timeout: 268 seconds]
Mathnerd314 has joined #mlpack
ftuesca has quit [Quit: Leaving]
Mathnerd314 has quit [Ping timeout: 276 seconds]
Mathnerd314 has joined #mlpack
tsathoggua has joined #mlpack
uzipaz has quit [Quit: Page closed]
tsathoggua has quit [Quit: Konversation terminated!]
Nilabhra has quit [Read error: Connection reset by peer]