SiFuh changed the topic of #crux-social to: Offtopic Talks | Project https://crux.nu/ | Logs: https://libera.irclog.whitequark.org/crux-social/
zorz has quit [Quit: WeeChat 4.1.2]
<remiliascarlet> I do SQL, but barely use it these days, because managing backups is a pain in the ass.
<remiliascarlet> In most of the programs I make, I don't even need a database. Simple text files are all I need most of the time.
<remiliascarlet> Things like auto increment, types, character encoding, default values, nullable, and so on can be done in a more efficient way in any good programming language anyway.
ppetrov^ has joined #crux-social
ppetrov^ has quit [Quit: Leaving]
zorz has joined #crux-social
ppetrov^ has joined #crux-social
<zorz> bonjour!
<SiFuh> Tek'ma'te
<ppetrov^> moro zorz
SiFuh has quit [Remote host closed the connection]
SiFuh has joined #crux-social
lavaball has joined #crux-social
<SiFuh> farkuhar: I think I will order a wired keyboard and mouse
ppetrov^ has quit [Quit: Leaving]
<farkuhar> zorz: I need some advice about web-scraping. Is there a way to force the server to generate a "Content-Length" header, if by default it doesn't include one?
<farkuhar> From that command I get a HTTP/2 response with the line "content-disposition: attachment", but nothing about content length.
<zorz> farkuhar: why u use curl? you want to scrap the page ?
<SiFuh> zorz: It's called scrape not scrap ;-)
<farkuhar> For the record, I'm trying to modify /usr/bin/pkgsize so that it doesn't depend on wget. curl has an option to download only the HTTP header, but not every server is configured to reply with a "Content-Length" field.
<farkuhar> I actually haven't tried installing wget to see if its "-S --spider" options still work. Maybe neither wget nor curl can reliably obtain the information sought by pkgsize, given how web servers are configured these days.
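The header-only probe farkuhar describes can be sketched as a small filter over curl's `-I` output; `extract_length` is a made-up helper name for illustration, not anything from pkgsize itself:

```shell
#!/bin/sh
# Read HTTP headers on stdin and print the Content-Length value, if any.
# Header names are case-insensitive (per RFC 9110), and header lines end
# in CR, so both are handled here. Prints nothing when the server omits
# the field (e.g. chunked transfer encoding).
extract_length() {
    awk 'tolower($1) == "content-length:" { v = $2 }
         END { if (v != "") { sub(/\r$/, "", v); print v } }'
}

# Intended usage (network required, so not run here):
#   curl -sIL "$url" | extract_length
```

With `-L`, redirects produce several header blocks; keeping only the last match means the final response wins.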
<zorz> nothing
<farkuhar> SiFuh, are you running wc on CRUX or OpenBSD? wc complains: invalid option -- 'h'
<SiFuh> OpenBSD
<SiFuh> Looks like in Linux you only have -c
<SiFuh> So the output will be in bytes
<farkuhar> Yeah, but I'm not trying to count the characters in the HTTP header. The goal is to determine how many bytes would be downloaded, when requesting the actual file.
<zorz> farkuhar: curl -s -I -o /dev/null -L -w '%{size_download}' https://gitlab.freedesktop.org/upower/upower/-/archive/v1.90.2/upower-v1.90.2.tar.bz2
<zorz> it's 0
<SiFuh> farkuhar: I have a feeling it isn't the file size being reported
<zorz> fuck guys, the bank charged me twice for the car insurance... i am on phone calls.
<farkuhar> In that case, maybe I should drop the -w option from /usr/bin/pkgsize, if we can't rely on webservers to report Content-Length correctly.
<SiFuh> farkuhar: "Chunked transfer encoding"
ppetrov^ has joined #crux-social
<zorz> cache-control: max-age=60, public, must-revalidate, stale-while-revalidate=60, stale-if-error=300, s-maxage=60
<zorz> content-disposition: attachment; filename="upower-v1.90.2.tar.bz2"
<zorz> content-security-policy:
<zorz> etag: "289b6617a2d385f99d4a3024d11c863f"
<zorz> permissions-policy: interest-cohort=()
<zorz> there is no content length man.
<SiFuh> zorz: Duh, that is the problem
<farkuhar> zorz: Chunked transfer encoding is the problem, as SiFuh pointed out. Web servers are no longer required to provide a "content-length" header, in an era of streaming media.
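The case farkuhar describes can be confirmed straight from the same header dump; a small sketch (the helper name is invented here):

```shell
#!/bin/sh
# Exit 0 if the HTTP headers on stdin declare chunked transfer encoding,
# the case where no Content-Length will be sent: with chunking the body
# length is only known as it streams, so the server cannot state it
# up front.
is_chunked() {
    awk 'tolower($0) ~ /^transfer-encoding:.*chunked/ { found = 1 }
         END { exit !found }'
}

# Intended usage (network required, so not run here):
#   curl -sI "$url" | is_chunked && echo "no Content-Length expected"
```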
<zorz> c'est la vie :)
<SiFuh> c'est la Big Tech
<zorz> of course, 30 min on hold so far on the phone. Big Tech
<farkuhar> Support for "content-length" among all the hosts appearing in our Pkgfile sources is not quite 100%, but how low does it have to be before the -w option of /usr/bin/pkgsize is no longer useful? For the time being, that flag might still be valuable enough to retain.
<zorz> i even tried with python requests. nothing. req.py says: Content-Length unavailable in the response headers.
<SiFuh> zorz: curl -vvv will tell you that. If it doesn't exist on the server and curl can't find it, why would python find it?
<zorz> i was just curious.
<SiFuh> "Curiosity killed the cat"
<zorz> cats have 7 lives :)
<zorz> SiFuh: this is nice trick too ---->>>>
<zorz> curl -s "https://gitlab.freedesktop.org/api/v4/projects/upower%2Fupower/releases" | jq '.[] | select(.tag_name == "v1.90.2") | .assets.links[] | select(.name == "upower-v1.90.2.tar.bz2") | .size'
<zorz> but it's empty
<SiFuh> zorz: Why seven? Did you use two already?
<farkuhar> zorz: nice trick! Unfortunately it introduces a dependency on jq. I think it looks cleaner if prt-utils has no dependencies outside core.
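For the record, the field zorz queried can be fished out without jq, at the cost of robustness; a deliberately fragile sed sketch (the helper name is invented, and it assumes the `"size"` key it finds is the one wanted):

```shell
#!/bin/sh
# Pull a numeric "size" value out of a JSON blob on stdin.
# This is NOT a JSON parser: it breaks on nested objects, reordered
# fields, or unexpected whitespace -- which is exactly why jq exists.
json_size() {
    sed -n 's/.*"size":[[:space:]]*\([0-9][0-9]*\).*/\1/p' | head -n 1
}

# Intended usage (network required, so not run here):
#   curl -s "$api_url" | json_size
```

The trade-off is the one farkuhar raises: core-only tools, but a parser that only works while the payload happens to look the way you expect.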
<zorz> aaaaaaaaaaaaaa
<SiFuh> zorz: Did you just use a third life now?
<zorz> did not know it's for prt-utils.
<zorz> SiFuh: relax :)
<SiFuh> Fucking De Niro... What a mother-fucking retard https://www.rt.com/pop-culture/589458-de-niro-biden-gurney-trump
<SiFuh> Hahahahaha
<SiFuh> Sad there is no video
<zorz> Sad there is no rt :-)
<SiFuh> Is it blocked there?
<SiFuh> The article is: ‘Captain America’ arrested at US military base
<zorz> yes SiFuh, rt is blocked here, i told you the other day. I don't know if it is only in Greece or in the EU; most probably I imagine it's the EU
<SiFuh> Firefox blocked it for a while as well
<SiFuh> Actually let me rephrase
<SiFuh> Firefox blocked it
<SiFuh> I had to modify my browser to ignore fucking firefox's block. It's a fucking web browser. Why the fuck is it pushing its political opinions onto me?
<zorz> really?
<zorz> let me check
<SiFuh> It reminds me of this. Why the hell would my software or anything else in my life that is not political have an opinion and therefore force me to accept their opinion and their ideas? https://twitter.com/damonimani/status/1708630140681408712
<SiFuh> zorz: It was a while ago but I think it was a security addition
<SiFuh> Something like it spies on me, is untrusted, gives me malware or some other shit
<zorz> no , i opened lynx, 403 forbidden
<zorz> SiFuh: did i tell you what firefox shows when i disable tcp listening in startx?
<zorz> did i tell you? :PPpp this python messed my already-messed head :)
<SiFuh> I think it was this Deceptive Content and Dangerous Software Protection
<SiFuh> zorz: If you change your DNS server?
<zorz> SiFuh: yes, suddenly all the companies dealing with the www are concerned for us :)
<SiFuh> Try this
<SiFuh> An error occurred during a connection to 95.181.181.73. SSL peer has no certificate for the requested DNS name.
<SiFuh> If you get this error then they are DNS blocking
<SiFuh> So you can bypass it with your own DNS server or use a custom open-source DNS server
<zorz> An error occurred during a connection to 95.181.181.73. SSL peer has no certificate for the requested DNS nam
<SiFuh> Cool, most ISPs are pretty stupid and only block DNS
<SiFuh> I use my own DNS on all my machines.
<zorz> lynx: Can't access startfile https://95.181.181.73/
<SiFuh> Ignore lynx
<zorz> SiFuh: that's nice... i shall set it up in the future too.
<SiFuh> Bind on Linux
<zorz> ?
<zorz> aaaaa
<zorz> ok
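The DNS-bypass setup SiFuh describes boils down to pointing the system away from the ISP's resolver; a minimal /etc/resolv.conf fragment (the loopback address assumes a local BIND `named` instance is actually listening there -- without one, resolution simply fails):

```
# /etc/resolv.conf -- bypass the ISP's resolver entirely.
# 127.0.0.1 assumes a local caching name server (e.g. BIND's named)
# is running on this machine.
nameserver 127.0.0.1
```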
<zorz> ok, going back to work.... last night i did a homemade id_description thing... so i can make my life easy in sql. homemade barcodes, only the codes HAHA
lavaball has quit [Remote host closed the connection]
<SiFuh> zorz: I was watching Xeason 2 Episode 10 of Joe Pickett and one of the Grim brothers took his sock off and put a big stone in it to take down two guys. How is that for coincidence?
<SiFuh> Xeason/Season
<zorz> HAHA
<zorz> SiFuh: YOU GET, let's say, frequency :)
<zorz> SiFuh: look here re.search(r'(\d+)\s*\b\s*([a-zA-Zα-ωΑ-Ω]+|gr|g|kg|lt|Lt|ml)', unit.text.strip().split()[-1]).groups() it is not only the regexp.... i have to deal with greek..... HAHA, irony. you should see me trying to load it into sql; did not know i have to set it for greek utf. caca, poutana maria.
ppetrov^ has quit [Quit: Leaving]