<Lynne>
incomplete, only wrote what I could think of at the moment
<Lynne>
oh, there's also timestamp precision, defined latency, synchronized playback, latency estimation
<kierank>
what is the difference between defined latency and latency estimation
<Lynne>
latency estimation is that very optional feature which depends on NTP
<Lynne>
defined latency is the amount of time between a packet entering a buffer to be decoded and the packet being decoded
* kierank
confused
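A minimal sketch of the distinction, assuming a hypothetical receiver that records when a packet enters its decode buffer and when decoding finishes; the struct and field names are illustrative, not from the avtransport draft:

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical packet record; times are local wall-clock microseconds. */
    typedef struct PacketTiming {
        int64_t enqueue_us;  /* packet entered the buffer to be decoded */
        int64_t decoded_us;  /* decoder finished with the packet */
    } PacketTiming;

    /* "Defined latency": purely local, no NTP needed. */
    static int64_t defined_latency_us(const PacketTiming *p)
    {
        return p->decoded_us - p->enqueue_us;
    }

    /* "Latency estimation" would instead compare a sender timestamp against
     * the receiver's clock, which only works if both are NTP-synchronized. */

    int main(void)
    {
        PacketTiming p = { .enqueue_us = 1000000, .decoded_us = 1033000 };
        printf("defined latency: %" PRId64 " us\n", defined_latency_us(&p));
        return 0;
    }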
<another|>
huh? mkv not streamable?
<Lynne>
nope
<Lynne>
only streamable if the receiver listens for the first bytes
<Lynne>
if it missed them, it's over
<Lynne>
if you cannot tune to a stream in progress, it's not streamable, is my definition
AMM has joined #ffmpeg-devel
<another|>
hmm.. pretty sure I've watched mkv/webm live streams
<jamrial_>
afaik every vp9 youtube livestream is webm
<Daemon404>
some are h264
<jamrial_>
yes, but h264 is in mp4
<another|>
I think YT only does VP9 live for head channels
<another|>
how is a 32B header medium for RTMP but a 36B header for avtransport is low?
<nevcairiel>
YT probably uses dash for webm streaming, so you can resume at any new segment anyway
<JEEB>
^
<JEEB>
different from an A->B ongoing live stream
<JEEB>
that of course is fixable by defining a structure that lets you periodically broadcast the decoder initialization data
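A rough sketch of the fix JEEB describes, assuming a hypothetical sender that re-emits the decoder initialization data (extradata) at a fixed interval so a receiver tuning in mid-stream only has to wait for the next init packet rather than the very first bytes; the transport stub and timing values are illustrative:

    #include <stdint.h>
    #include <stdio.h>

    /* Stub transport; a real sender would write to a socket. */
    static void send_packet(const char *what, const uint8_t *data, size_t size)
    {
        printf("sent %s (%zu bytes)\n", what, size);
        (void)data;
    }

    int main(void)
    {
        uint8_t extradata[64] = {0};   /* pretend codec init data */
        uint8_t frame[1024]   = {0};   /* pretend coded frame */
        const int64_t interval = 2000; /* re-send init every 2000 ms of stream time */
        int64_t last_init = -interval; /* force an init packet up front */

        /* Simulate 10 seconds of frames at 40 ms each. */
        for (int64_t t = 0; t < 10000; t += 40) {
            if (t - last_init >= interval) {
                send_packet("init data", extradata, sizeof(extradata));
                last_init = t;
            }
            send_packet("frame", frame, sizeof(frame));
        }
        return 0;
    }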
<another|>
was not talking about YT
<Daemon404>
it doesn't use any manifest for live
<Daemon404>
"manifestless" in their debug terms
Krowl has joined #ffmpeg-devel
<Lynne>
another|: I do mention it's supported with chunking
<Lynne>
but chunking is... no doubt a hack
<JEEB>
Daemon404: yea calculate'able from out of band data
<Lynne>
another|: also, rtmp's header is quite variable and quite large, 32 is just what I got out of counting in flvenc.c
<Lynne>
I haven't done exact calculations for that one, it's not as if there's a spec
<Lynne>
Daemon404: doesn't it do the dash thing, where it relies on the real-time clocks of receivers to let them know when to download new segments?
<JEEB>
dash is usually PTS based in the template
Mikhail_AMD has joined #ffmpeg-devel
<JEEB>
although webm-dash might be different
<Lynne>
I think webm-dash predates that, and isn't actively maintained anymore
<JEEB>
(but dash does have the thing where you can point at a time server so that buffering related etc things could be attempted to be synchronized between media origin and the client)
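For reference, a simplified sketch of how a live DASH client with a duration-based SegmentTemplate picks the newest segment from the wall clock (ideally the clock the MPD's UTCTiming element points at); the numbers and the exact availability formula here are illustrative, DASH-IF spells out the precise rules:

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* From the MPD (times in seconds for simplicity). */
        double availability_start = 1700000000.0; /* MPD@availabilityStartTime */
        double period_start       = 0.0;          /* Period@start */
        double seg_duration       = 2.0;          /* SegmentTemplate@duration / @timescale */
        int64_t start_number      = 1;            /* SegmentTemplate@startNumber */

        /* "Now" ideally comes from the UTCTiming source the MPD points at,
         * so client and origin agree on the clock. */
        double now = 1700000123.4;

        /* Newest fully available $Number$: the segment currently being
         * produced is not counted, hence the trailing -1. */
        int64_t newest = start_number +
            (int64_t)((now - availability_start - period_start) / seg_duration) - 1;

        printf("newest fully available segment: $Number$ = %" PRId64 "\n", newest);
        return 0;
    }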
mkver has quit [Ping timeout: 255 seconds]
<Daemon404>
Lynne, hmm maybe, YT labels whatever they do "manifestless".
<Lynne>
anyway, if anyone's got suggestions for avtransport, I'd like to hear them
<Lynne>
I still have to implement IAMF, so disregard the lack of any signalled audio layout for now
<durandal_1707>
who will use that?
dellas has joined #ffmpeg-devel
MrZeus_ has joined #ffmpeg-devel
MrZeus has quit [Ping timeout: 248 seconds]
MrZeus__ has joined #ffmpeg-devel
MrZeus_ has quit [Ping timeout: 264 seconds]
<Lynne>
I will
<Lynne>
and any user who curses at why it's so difficult to just stream video between two machines
<courmisch>
so now we have Zvl1024b announcements, but it's all NPUs
<courmisch>
when does the hype cycle end
<another|>
Lynne: then the text should reflect that. `32 < 36` but `medium < low` is weird
<another|>
also: `ISOBMFF/MP4: Very rigid. Must join and organization.` s/and/an/
darkapex has quit [Ping timeout: 258 seconds]
darkapex has joined #ffmpeg-devel
<JEEB>
mp4-sys has so far been pretty alright. if you think of having to start emails with "Dear experts," as just a mannerism
<JEEB>
I got even surprisingly honest responses from David Singer, who no longer is the chair of mp4-sys
<JEEB>
and I definitely am not part of any org :D
<Lynne>
you do need connections, though
<Lynne>
another|: sure, fixed, thanks
<JEEB>
which connections? I just registered with my gmail on the mailing list and fired up some questions when I had some
<JEEB>
kind of like how I was on the hevc mailing list (jct-vc) for years until they closed it
Krowl has quit [Read error: Connection reset by peer]
<JEEB>
oh, there's now a document regarding > New Film Grain Material based on a Ground Truth approach
<Lynne>
JEEB: were you trying to propose some new extension, or?
<Lynne>
devinheitmueller: yeah, let's keep using printing presses
<Lynne>
no point in trying to invent something better
<JEEB>
Lynne: for me it was initially verifying how the specification should be interpreted, and then afterwards I was raising the issue of how ISO doing their crap with the rules is doing quite a bit of harm for people understanding the format and all its extensions/forks
<Lynne>
unless it's by PhDs, and it's closed-source, sold as a service, with official support
<Lynne>
FFv1 is a shitty useless codec because it tries to compete with PNGs
<devinheitmueller>
I think a lot of people naively think they are inventing something better, especially when they don’t know enough about the history of the existing standards to know why they work the way they do.
<devinheitmueller>
Of course I’m not suggesting that’s what is happening here.
<Lynne>
devinheitmueller: the official standards can't be updated endlessly
<JEEB>
like, not being able to just give someone the most recent version of the spec any more (since 2017 or so) is a major PITA. esp. since they did write a lot of good stuff in post-2015 editions that define things that were left ambiguous before
<JEEB>
plus stuff relating to recent addition specs like raw PCM in mp4
<Lynne>
it's an open standard that anyone can contribute to, so if you think it can be improved, sure, it can be changed
<Lynne>
to talk about how it's repeating mistakes and that established standards are better and more reliable while being in a position where you cannot talk about this is a bit discomforting
<JEEB>
so I'd consider ISOBMFF/MP4 one of the less annoying working groups to deal with
<JEEB>
although I've mostly interacted with the previous chair, not the current one.
<devinheitmueller>
Lynne: Nah, I’m just trolling you a bit in good fun. In reality I think trying new approaches is generally a good thing. Sometimes it yields breakthroughs, and sometimes it results in the realization that some existing way of doing things is “overly-complicated” for a good reason. You don’t know until you try, of course.
<Lynne>
it's not an entirely invalid approach, to cram a whole bunch of bad ideas in
<Lynne>
eventually, you'll know what's a bad idea or not, and you can get rid of them
<Lynne>
I did remove a lot of invalid ideas, like per-packet FEC, required NTP sync, weird attempts at packetizing multiple packets in the same UDP packet
tmm1_ has joined #ffmpeg-devel
<devinheitmueller>
Lynne: And, in fairness, sometimes those super annoying overly complicated approaches were done for reasons that don’t apply in this day and age. Stuff like fractional framerates and interlaced video were done for very good reasons given the technology of the 1950s.
<Lynne>
interlacing was a good idea
<durandal_1707>
interlacing is still used
MrZeus_ has joined #ffmpeg-devel
<JEEB>
for the display tech and raw video bandwidth back then yes
<Lynne>
fractional framerate was a timeless hack
<Lynne>
it's just that we have tools nowadays to deal with it, and I think having fractional timebases is a requirement
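A small illustration of why fractional timebases matter, in plain C rather than FFmpeg's AVRational just to stay self-contained: at 30000/1001 fps a frame is exactly 1001 ticks in a 1/30000 s timebase, while a fixed millisecond timebase has to round the 33.3666... ms frame duration and drifts:

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        const int64_t frames = 30000;       /* roughly 16 min 41 s of video */
        int64_t pts_ticks = 0;              /* timebase 1/30000 s */
        int64_t pts_ms    = 0;              /* timebase 1/1000 s, 33 ms steps */

        for (int64_t n = 0; n < frames; n++) {
            pts_ticks += 1001;              /* exact frame duration */
            pts_ms    += 33;                /* rounded frame duration */
        }

        double exact_ms = pts_ticks * 1000.0 / 30000.0;
        printf("exact:   %.3f ms\n", exact_ms);
        printf("rounded: %" PRId64 " ms (drift: %.3f ms)\n",
               pts_ms, exact_ms - pts_ms);
        return 0;
    }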
tmm1 has quit [Ping timeout: 255 seconds]
<devinheitmueller>
My only point is that when working in “green fields” you may have the luxury to choose to discard design decisions that aren’t relevant for today’s day and age. Perhaps fractional framerates and interlacing aren’t good examples because we’re still suffering from having to support them due to legacy issues. But if someone were designing a *new* protocol today and weren’t concerned about backward
<devinheitmueller>
compatibility they wouldn’t even consider doing such things.
<Lynne>
JEEB: changed that to "Low rigidity", but I still remember how long it took for opus prepad/multichannel to be standardized
MrZeus__ has quit [Ping timeout: 272 seconds]
mkver has joined #ffmpeg-devel
<Lynne>
devinheitmueller: err, not me, I firmly believe integer framerates only make sense in a perfect world, but not in the real world
<JEEB>
was that actually due to the working group or just mozilla etc not finishing it up?
<Lynne>
a little bit of both afaik
<JEEB>
muken did the initial opus draft, then mozilla took over at one point
<JEEB>
*drafts
psykose has quit [Remote host closed the connection]
psykose has joined #ffmpeg-devel
kurosu has quit [Quit: Connection closed for inactivity]
SystemError has joined #ffmpeg-devel
ubitux has quit [Ping timeout: 255 seconds]
ubitux has joined #ffmpeg-devel
<kierank>
Lynne: why do you need all this luminance stuff in your protocol
ccawley2011 has quit [Quit: Leaving]
cone-791 has joined #ffmpeg-devel
<cone-791>
ffmpeg Paul B Mahol master:9adc5d8bfec8: avcodec/mlpenc: restructure code even more
<cone-791>
ffmpeg Paul B Mahol master:727ee32da705: avcodec/mlpenc: remove TODO comment, sample rate is always fixed
<cone-791>
ffmpeg Paul B Mahol master:b206056c8285: avcodec/mlpenc: implement advanced stereo rematrix
<cone-791>
ffmpeg Paul B Mahol master:c1053e2e35dd: avcodec/mlpenc: allow smaller shift for LPC
<cone-791>
ffmpeg Paul B Mahol master:be2bbfe71d55: avcodec/mlpenc: cleanup filtering
<cone-791>
ffmpeg Paul B Mahol master:e7a6bba51a0e: avcodec/mlp*: merge flags used by encoder and decoder
MrZeus__ has joined #ffmpeg-devel
<Lynne>
kierank: for HDR
<kierank>
why does the protocol need to care about this
<Lynne>
to signal all that's needed for presentation
<Lynne>
codecs carry some of it also
<Lynne>
but that can't really be relied upon
<Lynne>
vp9 for example doesn't afaik and relies on the container
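An illustrative sketch (not the actual avtransport layout) of the kind of colorimetry and HDR metadata a container or protocol ends up carrying when the codec payload doesn't signal it; the CICP values in the example are the usual HDR10-style ones (BT.2020 primaries, PQ transfer):

    #include <stdint.h>

    /* Enough container-level information to present the picture correctly
     * without relying on in-bitstream data (vp9, FFv1, rawvideo). */
    typedef struct ColorInfo {
        uint8_t  primaries;          /* CICP, e.g. BT.709, BT.2020 */
        uint8_t  transfer;           /* CICP, e.g. sRGB, PQ, HLG */
        uint8_t  matrix;             /* CICP, e.g. BT.709, BT.2020 NCL */
        uint8_t  full_range;         /* limited vs full range */

        /* Mastering display metadata (cf. SMPTE ST 2086) */
        uint16_t display_primaries[3][2]; /* x,y chromaticity, 0.00002 units */
        uint16_t white_point[2];
        uint32_t max_luminance;      /* 0.0001 cd/m^2 units */
        uint32_t min_luminance;

        /* Content light level (cf. MaxCLL / MaxFALL) */
        uint16_t max_cll;            /* max content light level, cd/m^2 */
        uint16_t max_fall;           /* max frame-average light level, cd/m^2 */
    } ColorInfo;

    int main(void)
    {
        /* Example: an HDR10-style stream the receiver can present correctly
         * even though the codec payload carries no color information. */
        ColorInfo ci = {
            .primaries = 9, .transfer = 16, .matrix = 9, .full_range = 0,
            .max_luminance = 10000000, /* 1000 cd/m^2 */
            .min_luminance = 50,       /* 0.005 cd/m^2 */
            .max_cll = 1000, .max_fall = 400,
        };
        (void)ci;
        return 0;
    }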
MrZeus_ has quit [Ping timeout: 272 seconds]
<kierank>
screw vp9
<BBB>
hey!
<j-b>
screw vp9!
<Daemon404>
3 bits should be ought to be enough for any color
<Daemon404>
... wow my brain mangled that
<Lynne>
there's also FFv1, raw uncompressed video, etc. which also don't ship anything relating to color
<Lynne>
and dirac
<Lynne>
:)
kasper93 has quit [Ping timeout: 252 seconds]
<j-b>
Dirac is dead
<durandal_1707>
long live dirac
kasper93 has joined #ffmpeg-devel
<Lynne>
I still think it's a fun little codec you can implement on a GPU, in its VC-2 form
SystemError has quit [Ping timeout: 256 seconds]
SystemError has joined #ffmpeg-devel
SystemError has quit [Remote host closed the connection]
SystemError has joined #ffmpeg-devel
navi has quit [Quit: WeeChat 4.0.4]
dellas has quit [Remote host closed the connection]
derpydoo has quit [Read error: Connection reset by peer]