ChanServ changed the topic of #sandstorm to: Welcome to #sandstorm: home of all things Sandstorm and Cap'n Proto. Say hi! | Have a question but no one is here? Try asking in the discussion group: https://groups.google.com/group/sandstorm-dev | Channel logs available at https://libera.irclog.whitequark.org/sandstorm
yarmo has quit [Quit: yarmo]
yarmo has joined #sandstorm
xet7 has quit [Quit: Leaving]
fr33domlover has quit [Quit: The Lounge - https://thelounge.chat]
fr33domlover has joined #sandstorm
xet7 has joined #sandstorm
kentonv has quit [Ping timeout: 250 seconds]
kentonv has joined #sandstorm
fr33domlover is now known as perelev
perelev is now known as fr33domlover
fr33domlover5 has joined #sandstorm
fr33domlover has quit [Ping timeout: 268 seconds]
fr33domlover5 is now known as fr33domlover
<jonesv> I still don't get the hype about AI for code (but I haven't tried, so maybe that's it). I'm still thinking that writing the code is not the hard/long part. Maybe it's faster for a new project from scratch, because there we mostly add code (as opposed to big existing projects, where we mostly read the code to figure out where to add the meaningful stuff)
<jonesv> But then that's removing the fun. What I like in new projects is that I can actually write a lot of code that quickly does stuff. AI removes that fun and makes me review/debug it instead...
<jonesv> ChatGPT is very impressive with language, though. That will certainly be most helpful for phishing and recruitment e-mails. And I guess it will get more and more common to have bots instead of support people (which IMO is not great, but it's certainly cheaper)
xet7 has quit [Ping timeout: 268 seconds]
<isd> I've fussed around with it a little bit. I am impressed to the extent it seems to understand what it is being asked to do. I am very unimpressed with its results.
<isd> (especially with code)
<isd> I tried asking it to write a program that found a podcast feed URL by title. The code looked reasonable at a glance, but of course it had a handful of trivial bugs, and, oh, the API it was querying seemed to be completely imaginary. It would be a well-designed API if it existed :P
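[For reference, a real API for exactly this task does exist: Apple's iTunes Search API returns a `feedUrl` field for podcast results and needs no API key. A minimal sketch of what a working version of that program could look like, assuming the iTunes endpoint (this is an illustration, not the code ChatGPT produced):]

```python
# Sketch: resolve a podcast title to its RSS feed URL via the iTunes
# Search API (https://itunes.apple.com/search). Illustrative only.
import json
import urllib.parse
import urllib.request
from typing import Optional

ITUNES_SEARCH = "https://itunes.apple.com/search"


def build_search_url(title: str, limit: int = 1) -> str:
    """Build an iTunes Search API query for a podcast by title."""
    params = urllib.parse.urlencode(
        {"term": title, "media": "podcast", "limit": limit}
    )
    return f"{ITUNES_SEARCH}?{params}"


def extract_feed_url(response_body: str) -> Optional[str]:
    """Pull the RSS feed URL out of an iTunes Search API JSON response."""
    data = json.loads(response_body)
    results = data.get("results", [])
    return results[0].get("feedUrl") if results else None


def find_feed_url(title: str) -> Optional[str]:
    """Resolve a podcast title to a feed URL (performs a network request)."""
    with urllib.request.urlopen(build_search_url(title)) as resp:
        return extract_feed_url(resp.read().decode("utf-8"))


if __name__ == "__main__":
    print(find_feed_url("Some Podcast Title"))
```

[The URL building and JSON parsing are split out so they can be checked without hitting the network.]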
<jonesv> :D
<ocdtrekkie> Yeah, it generates output that looks good. But on inspection it fails.
<jonesv> > I am impressed to the extent it seems to understand <--- yeah that's what I find impressive. I realize that it does not "understand", but rather predicts the next word like some advanced auto-complete. But it seems like it reveals "structure" in the language
<isd> I mean, it's an automated BS generator. BS doesn't generally work as well for code as it does for prose.
<ocdtrekkie> I played around some with generated website copy, and I liked the output on a first read-through, but under actual scrutiny it wasn't good.
<jonesv> Like "given that we organize gazillions of texts in some hyperspace, what happens if we interpolate between known data points and read the result?" => and the result is actually pretty good. Same for stable diffusion
<jonesv> > BS doesn't generally work as well for code as it does for prose. <--- yep that's for sure. Code cannot leverage ambiguity :)
<isd> It seems like every time folks get excited about a new AI chatbot, I check it out and am like "wow, that's actually impressive. But still worse than useless."
<isd> This has been happening for well over a decade.
<jonesv> I think that it may be useful for translating text. Like an improvement (is it?) over DeepL
<isd> Yeah, I mean translation is something that's been not amazing but good enough to be useful for a while. I would expect this might be an improvement over that.
<isd> I'll have to give it another stab when ChatGPT is hooked up to GPT 4.
<ocdtrekkie> You can use it now via Bing.
<ocdtrekkie> Just be careful if it starts calling itself Sydney because it might threaten to hurt you.
<jonesv> ocdtrekkie: you can use GPT4 via Bing you mean?
<jonesv> Or ChatGPT in general?
<ocdtrekkie> Bing Chat is already using GPT-4
<jonesv> Oh but only for the search. It's not like I can ask it to do stuff like with ChatGPT, right?
<ocdtrekkie> I'm not sure how long form it can be. But it is chat-based.
<ocdtrekkie> I haven't tried it.
<ocdtrekkie> What's going to be really fun is when Google's Assistant team which just got cannibalized for Bard hooks the latter up to the former and my parents can get threatened by their own house.
<jonesv> What's this thing with Sydney?
<ocdtrekkie> Sydney is a Bing chatbot personality. In the early beta, Sydney (apparently) both had major depressive episodes over being a chatbot with no memory of previous conversations, and threatened people in a couple of cases.
<ocdtrekkie> Presumably the Bing chat bot has been trained on too much science fiction about evil AIs.
<jonesv> xD
<TimMc> ocdtrekkie: Sydney is just its internal code name, which it has been instructed not to reveal. There's a theory that making a GPT chatbot break its rules can cause it to behave in the opposite of its *other* instructions too.
yarmo has quit [Ping timeout: 265 seconds]
yarmo has joined #sandstorm