beneroth changed the topic of #picolisp to: PicoLisp language | The scalpel of software development | Channel Log: https://libera.irclog.whitequark.org/picolisp | Check www.picolisp.com for more information
seninha has quit [Quit: Leaving]
isaneran has joined #picolisp
pablo_escoberg has joined #picolisp
<pablo_escoberg> Hey all, so I made this:
<pablo_escoberg> (de lsearch @
<pablo_escoberg>   (make (for (Q (apply 'search (rest)) (search Q)) (link @))))
<pablo_escoberg> Is there a performance issue with this?
<pablo_escoberg> Or will it crap out on really large result sets?
<isaneran> why not try it?
<pablo_escoberg> I tried it.  It works.  But I don't have any large datasets to play with.
<isaneran> you could maybe generate one
<pablo_escoberg> Yeah, I guess I could.  But I think it's the kind of thing abu[7] would probably be able to answer off the top of his head.
<isaneran> possibly, but you'd be in a great position by learning techniques to quickly generate large data sets and testing your algo and analyzing how it performs
<isaneran> so that's just my suggestion
<pablo_escoberg> good point.  I guess I could just generate large amounts of randomish data.
<pablo_escoberg> I'll report back if it works out.  Seems like a useful thing if it doesn't compromise performance, even if it does crap out on large datasets.
<isaneran> if you're on linux (or posix) here is a suggestion for a way to do it
<pablo_escoberg> I am
<isaneran> seq low high | shuf
<isaneran> so you could put 1 as low and 9000000 as high or whatever
<isaneran> and you know you have a number in that range that you can search for
<pablo_escoberg> I was going to just use `for` but I'll look into that.
<isaneran> ah yeah, that's fine too, whatever comes quickly!
<pablo_escoberg> thanks for the advice.
<isaneran> no problem, hope it goes well
<isaneran> if you just put parentheses around the output of that command sequence above you'd have a list in picolisp :P
<pablo_escoberg> which command?
<isaneran> seq 1 100 | shuf
<pablo_escoberg> oh, I see.  Yeah, that would work.
<isaneran> for a language with comma separated elements you could generate them like this
<isaneran> seq 1 10 | shuf | tr -s '\n' ' ' | sed 's/ $//' | sed 's/ /, /g'
<isaneran> or even pipe it to fmt if you wanna limit line length
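A minimal sketch of pulling that pipeline straight into a PicoLisp list, assuming a POSIX "sh" is available; 'in' with a command list reads from the command's stdout, and 'read' pulls one number per line until end of file:
   (in '("sh" "-c" "seq 1 1000000 | shuf")   # read the pipeline's output
      (make
         (while (read)                        # 'read' returns NIL at EOF
            (link @) ) ) )                    # collect each number into the list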
<pablo_escoberg> right, but I'm realizing I need to use `create` rather than what I'm doing now.  So far, no performance issues with my `lsearch` but creation is taking ages.
<abu[7]> Hi all! I would not 'collect' or 'make' lists longer than a few thousand items
<abu[7]> pablo_escoberg, btw, (apply 'search (rest))
<abu[7]> You don't need to quote here
<abu[7]> And (apply ... (rest)) is better replaced by 'pass'
<abu[7]> So (make (for (Q (pass search) ...
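Putting abu[7]'s two suggestions together, the full definition would presumably read:
   (de lsearch @
      (make
         (for (Q (pass search) (search Q))   # 'pass' hands the remaining arguments to 'search'
            (link @) ) ) )                   # '@' holds each hit returned by (search Q)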
<pablo_escoberg> Great.  That looks quite a bit cleaner.  And yes, I figured a few thousand is probably the limit, which brings me to a different question:  Is there any way to specify offsets and/or limits on query results?
<abu[7]> I would do (for ((I . Q) (search ..) (search Q)) (doSomething @) (T (== Limit I)))
<abu[7]> An offset does not make sense I think, but you can of course skip items
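A hypothetical variant along the lines of abu[7]'s snippet, with the name 'lsearchN' and the 'Limit' parameter standing in for his placeholders; it collects hits and stops after 'Limit' of them:
   (de lsearchN (Limit . @)
      (make
         (for ((I . Q) (pass search) (search Q))
            (link @)                 # collect the current hit
            (T (= Limit I)) ) ) )    # stop once 'Limit' iterations are done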
<pablo_escoberg> Right, ok.  Still getting used to this whole non-relational thing.  Do you not use paging in the gui?
<abu[7]> Not in reports (HTML), but of course in printing (SVG -> PDF)
<pablo_escoberg> I see.  OK, I'll figure something out.
<abu[7]> I found some old generator for test data
<abu[7]> addresses
<abu[7]> from 2006 ;)
<pablo_escoberg> LOL.  TY
<abu[7]> It uses some data files though
<abu[7]> but you get the idea
<pablo_escoberg> well, I pretty much have my answer, so I won't really use it.
<abu[7]> only dummy articles
<pablo_escoberg> nice, thanks.
<abu[7]> yeah, not useful as it needs data files
<abu[7]> As you said, for really long data 'create' is to be recommended
<pablo_escoberg> Yeah, but I'm not going to bother with those tests.
<abu[7]> Yeah, not necessary
<abu[7]> The GUI also does paging, in search dialogs
<abu[7]> Displays the first 20 items
<abu[7]> then can scroll single-lines or pages
rob_w has joined #picolisp
<pablo_escoberg> ah, so then how does it retrieve 20 items at a time?
<abu[7]> (do 20 .. (search Q) .. ?
<pablo_escoberg> right, but what about the second page?
<abu[7]> The same again
<abu[7]> It is like CURSOR in SQL
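A rough sketch of that cursor idea, assuming Q is a query handle created earlier and kept around between calls; each call returns the next page of up to 20 hits:
   (de nextPage (Q)
      (make
         (do 20
            (NIL (search Q))   # stop early when the query is exhausted
            (link @) ) ) )     # otherwise collect the hit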
<pablo_escoberg> right, but there's no going backwards, is there?
<abu[7]> In the GUI you can go backwards, because the +Chart takes care of it
<pablo_escoberg> ah, ok, you probably keep track in the session.
<abu[7]> yes, a +Chart keeps a list, not only +DbChart
<pablo_escoberg> Ah, ok.  I'll look at the code.
<abu[7]> I don't remember well myself ;) Did not touch the basic chart code for a while
<abu[7]> +DbChart is new though, it uses 'search'
<abu[7]> (replaces +QueryChart which expects Pilog)
<pablo_escoberg> Great.  I'll look at that, then.
<abu[7]> ok, I have a vid conf now
<abu[7]> back in 3 hours
seninha has joined #picolisp
pablo_escoberg has quit [Quit: Client closed]
<abu[7]> done
rob_w has quit [Quit: Leaving]
msavoritias has joined #picolisp
pablo_escoberg has joined #picolisp
isaneran has quit [Remote host closed the connection]
hrberg has quit [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
hrberg has joined #picolisp
msavoritias has quit [Ping timeout: 245 seconds]
pablo_escoberg has quit [Quit: Client closed]
msavoritias has joined #picolisp
msavoritias has quit [Read error: Connection reset by peer]