michaelni changed the topic of #ffmpeg-devel to: Welcome to the FFmpeg development channel | Questions about using FFmpeg or developing with libav* libs should be asked in #ffmpeg | This channel is publicly logged | FFmpeg 6.1 has been released! | Please read ffmpeg.org/developer.html#Code-of-conduct
gust82 has quit [Remote host closed the connection]
gust82 has joined #ffmpeg-devel
typological has quit [Quit: Connection closed]
gust82 has quit [Remote host closed the connection]
gust82 has joined #ffmpeg-devel
cone-317 has joined #ffmpeg-devel
<cone-317>
ffmpeg Michael Niedermayer master:fb520708482c: avcodec/h264dec: use BOOL for skip_gray, noref_gray
ubitux has quit [Ping timeout: 246 seconds]
ubitux has joined #ffmpeg-devel
thilo has quit [Ping timeout: 260 seconds]
lexano has quit [Ping timeout: 256 seconds]
thilo has joined #ffmpeg-devel
thilo has quit [Changing host]
thilo has joined #ffmpeg-devel
navi has quit [Quit: WeeChat 4.0.4]
wangbin has joined #ffmpeg-devel
wangbin has quit [K-Lined]
<kierank>
elenril: as george hotz says the amd drivers are garbage
mkver has joined #ffmpeg-devel
lemourin has quit [Read error: Connection reset by peer]
lemourin has joined #ffmpeg-devel
jamrial has quit []
MrZeus has quit [Ping timeout: 255 seconds]
cone-317 has quit [Quit: transmission timeout]
\\Mr_C\\ has quit [Remote host closed the connection]
<elenril>
wtf, why does my new monitor come with a 200W ac adapter
<elenril>
some usb-c PD nonsense I guess
<wbs>
a monitor with usb-c power supply, instead of the default IEC C13 connector?
<elenril>
no, it has a normal DC jack
<elenril>
rated for 19.5V/10A
<elenril>
of course actual product-specific technical information is too complicated to provide
<elenril>
LG apparently has so many models they don't know themselves which can do what
<BtbN>
Typically because you can plug in a Laptop via USB-C, and the monitor will then charge it
<BtbN>
elenril: are you sure that works properly? There's a whole lot of code in there to maintain a constant framerate, not sure how happy it'd be about hanging for potentially a long time.
<BtbN>
You mean what might cause it to EAGAIN there?
<elenril>
yes, when does that condition evaluate to true
<BtbN>
When the frame itself didn't update, but something else did
<BtbN>
And since probing needs an actual frame, it needs to EAGAIN there
<BtbN>
i.e. AcquireNextFrame can come back to you with only new mouse coordinates
<elenril>
isn't that okay for output when we do actually capture the mouse?
<haasn>
michaelni: what's the lifetime of SwsFilter? does it only exist until sws_init_context() returns?
<haasn>
It seems like I need to make a full memdup of this struct in order to re-call sws_init_single_context() after params change
jamrial has joined #ffmpeg-devel
gust82 has quit [Remote host closed the connection]
<BtbN>
elenril: yes? Hence the parameter.
gust82 has joined #ffmpeg-devel
rvalue has joined #ffmpeg-devel
<elenril>
BtbN: i mean that block seems like it'd return EAGAIN even when the mouse was updated
<BtbN>
yes, that's the whole point
<BtbN>
return EAGAIN until an actual frame was returned
<BtbN>
since the probe function needs an actual frame to probe it
<jdek>
elenril: you vs the guy she tells you not to worry about
<elenril>
lol
<elenril>
BtbN: my point is that this will prevent you from capturing mouse movements, in case that is what you actually care about
<elenril>
but whatever, that's beside the point now
<BtbN>
Have you not noticed the parameter?
<BtbN>
The whole block is turned off outside of the probe function
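(For anyone following along: a rough sketch of the kind of check being discussed, not the actual vf_ddagrab code; `need_frame` is a made-up name for the probe-only flag. The DXGI detail is that AcquireNextFrame() can succeed while only mouse or metadata changed, in which case DXGI_OUTDUPL_FRAME_INFO.LastPresentTime is zero.)
```c
/* Sketch only, not the real libavfilter/vf_ddagrab.c code. "need_frame" is a
 * hypothetical flag meaning "we are probing and need an actual image".
 * Requires: #define COBJMACROS, #include <dxgi1_2.h>, libavutil/error.h. */
static int acquire_desktop_frame(IDXGIOutputDuplication *dup, int need_frame,
                                 UINT timeout_ms, IDXGIResource **resource)
{
    DXGI_OUTDUPL_FRAME_INFO frame_info;
    HRESULT hr = IDXGIOutputDuplication_AcquireNextFrame(dup, timeout_ms,
                                                         &frame_info, resource);
    if (FAILED(hr))
        return AVERROR_EXTERNAL;

    if (need_frame && !frame_info.LastPresentTime.QuadPart) {
        /* AcquireNextFrame succeeded but only mouse/metadata changed: there is
         * no new desktop image to probe, so release it and ask to be re-called. */
        IDXGIOutputDuplication_ReleaseFrame(dup);
        return AVERROR(EAGAIN);
    }
    return 0;
}
```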
dellas has quit [Ping timeout: 252 seconds]
<elenril>
ah
<elenril>
sorry, failed at reading
<elenril>
reading is very hard
<BtbN>
The patch in general seems fine, probably simpler to just set the timeout to infinite. If that's an option. Not sure if AcquireNextFrame can't still return empty-handed in some case
dellas has joined #ffmpeg-devel
cone-341 has quit [Quit: transmission timeout]
navi has joined #ffmpeg-devel
tester11 has quit [Ping timeout: 255 seconds]
<michaelni>
haasn, SwsFilter is owned by the user calling the function; the user can free it after init. The user can even set up the struct on the stack and not alloc it. I suggest you add a sws_cloneFilter() to make a copy of it that can be kept for re-init
<haasn>
yeah that's what I ended up doing
<michaelni>
the arrays in it are likely quite small so the copy and alloc should not matter
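(A minimal sketch of what such a clone helper could look like, using the public SwsVector/SwsFilter layout from libswscale/swscale.h; sws_cloneFilter() is only michaelni's suggested name and does not exist yet.)
```c
#include <string.h>
#include "libavutil/mem.h"
#include "libswscale/swscale.h"

static SwsVector *clone_vec(const SwsVector *v)
{
    SwsVector *copy;
    if (!v)
        return NULL;
    copy = sws_allocVec(v->length);
    if (copy)
        memcpy(copy->coeff, v->coeff, v->length * sizeof(*copy->coeff));
    return copy;
}

SwsFilter *sws_cloneFilter(const SwsFilter *src)
{
    SwsFilter *dst;
    if (!src)
        return NULL;
    dst = av_mallocz(sizeof(*dst));
    if (!dst)
        return NULL;
    dst->lumH = clone_vec(src->lumH);
    dst->lumV = clone_vec(src->lumV);
    dst->chrH = clone_vec(src->chrH);
    dst->chrV = clone_vec(src->chrV);
    /* error handling for partial allocation failure omitted for brevity */
    return dst;
}
```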
<haasn>
sigh, sws_init_single_context() calls sws_setColorspaceDetails() to set up range conversion
<haasn>
so if sws_setColorspaceDetails() should also call sws_init_single_context() when range changes, we have an impossible situation
Krowl has quit [Read error: Connection reset by peer]
<haasn>
needs something more complex also
<haasn>
I think the easier solution would be to forbid sws_setColorspaceDetails() from changing range and require the user to reinit context if it does
<haasn>
after YUVJ removal vf_scale no longer needs to change range from sws_setColorspaceDetails at all
MrZeus has joined #ffmpeg-devel
<haasn>
not to mention sws_init_single_context is not safe to re-call without first freeing members anyway
<haasn>
so probably requiring the user to free + reinit on range change is the best bet
<haasn>
that does mean we can't fix the existing bugs with this function without API break though
<haasn>
(though, could just error out of this function if srcRange != c->srcRange)
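(Illustrative only: roughly what that guard could look like near the top of sws_setColorspaceDetails(), following haasn's suggestion; this is not existing code.)
```c
/* Hypothetical guard, per the "error out" idea above; c->srcRange and
 * c->dstRange are the internal SwsContext fields. */
if (srcRange != c->srcRange || dstRange != c->dstRange) {
    av_log(c, AV_LOG_ERROR,
           "Changing color range through sws_setColorspaceDetails() is not "
           "supported; free and reinitialize the context instead\n");
    return AVERROR(EINVAL);
}
```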
novaphoenix has quit [Quit: i quit]
novaphoenix has joined #ffmpeg-devel
lexano has joined #ffmpeg-devel
tester11 has joined #ffmpeg-devel
Krowl has joined #ffmpeg-devel
paulk-bis has quit [Quit: WeeChat 3.0]
paulk has joined #ffmpeg-devel
MrZeus_ has joined #ffmpeg-devel
MrZeus__ has joined #ffmpeg-devel
MrZeus__ has quit [Read error: Connection reset by peer]
MrZeus has quit [Ping timeout: 276 seconds]
<Traneptora>
then how would you change range though with swscale
MrZeus_ has quit [Ping timeout: 256 seconds]
MrZeus has joined #ffmpeg-devel
MrZeus_ has joined #ffmpeg-devel
MrZeus has quit [Ping timeout: 245 seconds]
derpydoo has joined #ffmpeg-devel
MrZeus__ has joined #ffmpeg-devel
MrZeus_ has quit [Ping timeout: 264 seconds]
<JEEB>
Traneptora: full re-creation of chain I guess
<haasn>
any proper fix here would amount to doing the same inside sws_setColorspaceDetails anyway except now we also have to memdup all options and filters
<haasn>
it can be done, just requires moving more code around than I'm comfortable atm
MrZeus_ has joined #ffmpeg-devel
MrZeus__ has quit [Ping timeout: 264 seconds]
ccawley2011 has quit [Read error: Connection reset by peer]
ccawley2011 has joined #ffmpeg-devel
mkver has quit [Ping timeout: 256 seconds]
mkver has joined #ffmpeg-devel
mkver has quit [Ping timeout: 276 seconds]
Krowl has quit [Read error: Connection reset by peer]
tester11 has quit [Ping timeout: 255 seconds]
feiw1 has quit [Ping timeout: 255 seconds]
feiw1 has joined #ffmpeg-devel
ccawley2011 has quit [Read error: Connection reset by peer]
ccawley2011 has joined #ffmpeg-devel
mkver has joined #ffmpeg-devel
Krowl has joined #ffmpeg-devel
ccawley2011 has quit [Read error: Connection reset by peer]
ccawley2011 has joined #ffmpeg-devel
ccawley2011 has quit [Read error: Connection reset by peer]
ccawley2011 has joined #ffmpeg-devel
<elenril>
haasn: what are the sources of limited-range rgb in nature?
<JEEB>
as far as I know only over the pipe
<JEEB>
like HDMI etc
<elenril>
could e.g. ddagrab produce limited range?
<JEEB>
most likely not
<JEEB>
the compositor works in full range
<JEEB>
then the output component handles the conversion if required
<elenril>
so it's the GPU encoder that converts it?
<JEEB>
I'd expect like that, yea
<haasn>
elenril: not a thing
<haasn>
Or to be more precise, broken conversion
<elenril>
you mean you don't want it to be a thing, or it actually does not ever exist
<haasn>
That can produce limited range rgb
dellas has quit [Remote host closed the connection]
<haasn>
Well do you count the thousands of videos on YouTube which have wrong conversions hard coded (but are tagged as full range)?
<haasn>
But no, I mean, I’m not aware of any limited range rgb format in existence
<JEEB>
right, so you're speaking of the context of swscale
<JEEB>
:)
<JEEB>
and yea I agree, on the software level
<JEEB>
I've heard stories of projectors wanting limited RGB or something, but as far as I can tell the desktop doesn't care
<JEEB>
the driver / output just gets set to something
<JEEB>
and the limited range signal is only over that pipe
<haasn>
I mean you can trivially take a limited range yuv plane and put it into an rgb image, but that’s not something you can signal in any format
<JEEB>
in video formats you can at least, since the range flag is not limited to specific CICP values to my knowledge.
<haasn>
It is a thing in hdmi apparently yeah
<JEEB>
but anyways, haasn is 100% correct in the sense that limited RGB in software is like "lol"
<JEEB>
you only get that if you literally capture the HDMI signal
ccawley2011 has quit [Read error: Connection reset by peer]
ccawley2011 has joined #ffmpeg-devel
<haasn>
elenril: not that swscale hard codes the assumption that rgb is full range
<haasn>
If you want to relax this and let limited range rgb be a thing, we can, but it would first require changing this assumption everywhere in the code base
<haasn>
Note that*
<JEEB>
yea that's why I picked the route of "just don't mention it" when updating AVColorRange docs
<elenril>
haasn: no, my question is mainly whether one can legitimately encounter a source that produces such a thing
<JEEB>
that way I work around people who have a gung-ho opinion, and it doesn't stop adding support if someone really cares
<elenril>
like a capture device or such
<haasn>
Possibly an HDMI capture card could, although I’m not convinced it’s a legitimate use case
<elenril>
why not?
<JEEB>
yea since if you are capturing the thing generating the output most likely could also output full range :D
<JEEB>
(although I wonder if some people hacked limited range to implement "wide gamut")
* JEEB
feels bad somewhere deep inside after writing that since it's completely possible someone came up with that idea
<haasn>
AVFrame explicitly documents it as “YUV range” and every downstream media player ignores this field for rgb
<haasn>
(Ask me how I know)
<JEEB>
at least AVColorRange is now documented
<haasn>
elenril: because you’d have to go out of your way to configure your devices that way for no conceivable benefit
<JEEB>
TIL it used to be in frame.h, but then moved elsewhere
dellas has joined #ffmpeg-devel
<haasn>
And many devices don’t even allow you to change color range for rgb
<elenril>
don't many devices default to limited range HDMI?
<haasn>
Would be news for me
<JEEB>
none of mine do
<haasn>
Limited range yuv sure
<JEEB>
yea
<haasn>
Also not to be confused with hdmi overscan/underscan
<haasn>
Which many devices assume by default for hdmi
<JEEB>
limited range RGB seems to be something that I've only heard from the depths of cinema projector people
<JEEB>
I think nevcairiel might have a touch to that community
<JEEB>
*with
<haasn>
The only conceivable advantage to limited range rgb is being able to range convert with a single left shift instead of two
<haasn>
DP and USB-C don’t seem to support limited range either
<haasn>
Though on the Internet I found reports of at least one dp-to-hdmi converter that always output limited range
<haasn>
Ugh
<JEEB>
in other words, it's stuff that exists. but the priority of such stuff is incredibly low
<haasn>
17:04 <haasn> DP and USB-C don’t seem to support limited range either <- though only on Intel?
<haasn>
Hrm, how annoying
<haasn>
Hmm, I stand corrected
<haasn>
In ITU-R docs (709, 2020) they provide RGB quantization formulae only for limited rgb
<JEEB>
yes, as that commit I linked added in the docs, BT.2100 finally formally defined it
<JEEB>
oh, RGB
<JEEB>
I just checked generally :)
<JEEB>
and they added full range definition in BT.2100
<haasn>
then in BT.2100 they expand that to both limited and full range (RGB and YUV)
<JEEB>
yup
<haasn>
hrm
<haasn>
elenril: okay, so with that in mind, we should think about a future in which we support full range RGB as well
<haasn>
imo it's out of scope for now just to preserve status quo
<haasn>
limited range*
<j-b>
all that is koda's fault
<durandal_1707>
all that is j-b's fault
<j-b>
no.
<j-b>
all is always koda's fault
<j-b>
You got it wrong
<durandal_1707>
all is always j-b's fault
<j-b>
durandal_1707: you are incorrect.
<haasn>
"I apologize for factual errors in my previous statement. As an AI language model, [...]
<j-b>
haasn: ah, good point.
<j-b>
durandal_1707: It's the fault of the AI.
<jamrial>
ask him what are the last digits of pi. if he gives you a number, he's an AI
<haasn>
the last digit of pi is clearly 1
<haasn>
(base pi)
<j-b>
is an irrational base legal?
<j-b>
(never seen)
Krowl has quit [Read error: Connection reset by peer]
<elenril>
there should be an eu directive against it
<durandal_1707>
there should be an eu directive to ban elenril from FFmpeg
<j-b>
durandal_1707: I love you.
* elenril
stabs durandal_1707
<durandal_1707>
j-b: I hate you.
* durandal_1707
stabs elenril
<j-b>
durandal_1707: I know.
* j-b
stabs elenril and durandal_1707
<durandal_1707>
jamrial: is ambisonic filter now ok?
<jamrial>
i have no more comments about the layouts you're using, at least, so yes for me
* durandal_1707
stabs j-b and elenril
* j-b
dies
<courmisch>
for a split second, I read it as "i have no more comments about the layoffs"
<courmisch>
jamrial: shouldn't size_t be used for sizes (as opposed to strides)?
ccawley2011 has quit [Read error: Connection reset by peer]
ccawley2011 has joined #ffmpeg-devel
<jamrial>
courmisch: mmh, true, len is not used with the pointers at all, but as a counter instead
<jamrial>
still, the c spec states that if the difference between two pointers is > PTRDIFF_MAX then it's UB, so len is limited by that
<elenril>
reminder that we only have 7 candidates for tc
<elenril>
volunteer today
* JEEB
already did \o/
<jamrial>
mkver: ^
<mkver>
?
<jamrial>
do you want to be in the TC?
<mkver>
No
<jamrial>
ok
<jamrial>
courmisch: there's no ac3dsp checkasm test. did he use one he wrote and didn't submit?
<courmisch>
jamrial: I can only guess
<courmisch>
jamrial: he attached it in his second mail on the thread
<jamrial>
ah
<jamrial>
puts()
<courmisch>
it pisses me off how puts() appends a new line, just because
<courmisch>
couldn't possibly be the same as fputs(stdout)
<wbs>
that would be way too regular
<elenril>
inconsistency? in c stdlib?
<elenril>
shocking
<JEEB>
it's more likely than you think
<courmisch>
at least, it's not insecure
<courmisch>
by design
<courmisch>
like it's counterpart gets()
<courmisch>
its*
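(The difference being complained about, in a trivially small example:)
```c
#include <stdio.h>

int main(void)
{
    fputs("written as-is", stdout);   /* no newline appended */
    puts("newline appended");         /* writes the string plus '\n' */
    return 0;
}
```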
cone-071 has joined #ffmpeg-devel
<cone-071>
ffmpeg James Almer master:2d9fd814d0b0: x86/: clear the high bits for order in scalarproduct_and_madd functions
<cone-071>
ffmpeg James Almer master:707e46dc544c: test/checkasm: test llauddsp
<wbs>
do we need to write a guide for how to make checkasm tests?
<wbs>
because I've seen too many that actually don't test the right thing at all
<elenril>
I can't think of a nontrivial subject we do NOT need a guide for
<courmisch>
it's just hard to write tests if you don't know the context of the function
<wbs>
although, the most important bit usually is to thoroughly understand the interface of the dsp function to test, and that's usually where it fails already
<courmisch>
the problem is that the people who wrote the using code don't typically want to write checkasm
<wbs>
yep
<wbs>
and we have dozens of untested dsp interfaces
<courmisch>
more like a dozen dozen
<courmisch>
dozens of dozens
<courmisch>
ooh, seems I killed yet another SD card
<courmisch>
that's three down in less than a month
<j-b>
are all SD cards shit?
<wbs>
anyway, yes, knowing the usage context is often the issue, but too often one also sees tests randomizing the wrong inputs, not testing the right thing, etc.
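(A rough skeleton of what such a guide would boil down to, based on the macros in tests/checkasm/checkasm.h; MyDSPContext, ff_mydsp_init() and my_dsp_func are hypothetical names, and a real test would still need to be registered in checkasm.c and the Makefile.)
```c
/* Key points from the discussion: understand the interface, randomize every
 * input the function actually reads, and compare the whole output. */
#include <stdint.h>
#include <string.h>
#include "checkasm.h"
#include "libavutil/mem_internal.h"

#define BUF_SIZE 256

void checkasm_check_mydsp(void)
{
    LOCAL_ALIGNED_32(int32_t, dst_ref, [BUF_SIZE]);
    LOCAL_ALIGNED_32(int32_t, dst_new, [BUF_SIZE]);
    LOCAL_ALIGNED_32(int32_t, src,     [BUF_SIZE]);
    MyDSPContext c;                       /* hypothetical dsp context */

    ff_mydsp_init(&c);                    /* hypothetical init function */

    declare_func(void, int32_t *dst, const int32_t *src, int len);

    if (check_func(c.my_dsp_func, "my_dsp_func")) {
        for (int i = 0; i < BUF_SIZE; i++)
            src[i] = rnd();               /* randomize the inputs that are read */

        call_ref(dst_ref, src, BUF_SIZE);
        call_new(dst_new, src, BUF_SIZE);
        if (memcmp(dst_ref, dst_new, sizeof(*dst_ref) * BUF_SIZE))
            fail();                       /* compare the full output buffer */

        bench_new(dst_new, src, BUF_SIZE);
    }
    report("my_dsp_func");
}
```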
<courmisch>
j-b: that but also flashing full disk images is veeery bad for wear leveling
rix has joined #ffmpeg-devel
<rix>
I have reversed TrueHD ATMOS 4th substream and OAMD parsing. Would like to contribute to ffmpeg. I'm not familiar with the codebase and workflow. Any pointers on how to get started?
<durandal_1707>
run fast as you can
<durandal_1707>
you can also write your own project, whatever you like/prefer
<JEEB>
rix: libavcodec has the mlp related files :) start by making a build dir and `path/to/configure` (with x86(_64) you'll require build-essentials and nasm)
<JEEB>
also after you get your first build done, you may also want to try the FATE test suite which is documented in https://www.ffmpeg.org/fate.html
<rix>
I'm able to build ffmpeg, and I have mostly read through the mlp related files. I probably need more help in av data formats and how to add more channels and attach metadata. And also how to run tests
<JEEB>
ah
<JEEB>
`make fate-rsync ../path/to/samples` will sync the current FATE test suite, about ~2 gigs
<JEEB>
asdf, forgot SAMPLES=
<JEEB>
:)
<JEEB>
*SAMPLES=../path/to/samples
<JEEB>
it's documented on the fate.html link I posted
<JEEB>
then just `make fate SAMPLES=../path/to/samples` will run the full suite
<JEEB>
if you tell about your requirements for metadata and stuff that can be helped with :)
dellas has quit [Remote host closed the connection]
<rix>
OAMD is about channel to object assignment and 3D panning
<rix>
currently I haven't reversed the object audio renderer part, so it's mostly only useful for reporting or passthrough to other workflow
<JEEB>
so you need some additional thing to say that channels X to Y are actually object channels and not normal audio, and then something like AVFrameSideData that has the location of each object channel?
<JEEB>
(of course such side data type doesn't exist yet but just noting an example)
<rix>
similar to channel layout metadata I guess?
<durandal_1707>
the cavern project has ATMOS "decoding" for EAC3
<JEEB>
yea decoding and rendering are two separate things I guess. kinda like with ambisonics
<JEEB>
decoder will expose normal channels and object channels, and then contain location updates in the AVFrames
<JEEB>
then something would have to take that in and render it
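(A purely hypothetical sketch of that idea: neither AV_FRAME_DATA_OBJECT_AUDIO_POSITIONS nor the payload struct below exists in FFmpeg; only av_frame_new_side_data() is existing API.)
```c
#include <string.h>
#include "libavutil/error.h"
#include "libavutil/frame.h"

typedef struct ObjectAudioPosition {
    unsigned ch;        /* index of the object channel within the frame */
    float    x, y, z;   /* normalized 3D position of the object */
} ObjectAudioPosition;

static int attach_object_positions(AVFrame *frame,
                                   const ObjectAudioPosition *pos, int nb_pos)
{
    AVFrameSideData *sd =
        av_frame_new_side_data(frame, AV_FRAME_DATA_OBJECT_AUDIO_POSITIONS,
                               nb_pos * sizeof(*pos));
    if (!sd)
        return AVERROR(ENOMEM);
    memcpy(sd->data, pos, nb_pos * sizeof(*pos));
    return 0;
}
```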
<rix>
I will also consider contributing to cavern
<JEEB>
IIRC there's also some standardized object audio structure, not sure how well D's stuff maps to that
<JEEB>
would be nice if we could have one type of mapping metadata for both MPEG-H 3-D Audio and D's stuff
<rix>
IAMF?
<JEEB>
could be, I don't recall details and all that jazz :)
<JEEB>
and if you're doing decoding then most likely your test would be decoding with ffprobe or so, dumping the side data results as text
<JEEB>
this did not add a new test, but instead it changed the result of one of the tests
<JEEB>
`ffprobe -select_streams a:0 -of json -show_frames -i input` is one way you can play around it manually
haihao has joined #ffmpeg-devel
<rix>
so fate will run ffprobe?
<JEEB>
yea the definitions are in the Makefiles
<JEEB>
for example that changed test, hevc-dv-rpu
<JEEB>
`git grep "hevc-dv-rpu"` gives you the file that is defined in
ccawley2011 has joined #ffmpeg-devel
<JEEB>
and if you want to run that specific test, you just run the specific identifier of that test, `make fate-hevc-dv-rpu SAMPLES=../../samples`
blb has quit [Ping timeout: 276 seconds]
<rix>
nice
<JEEB>
you can find existing truehd tests with `git grep -i "truehd" -- tests/`
<rix>
do we have any thd atmos sample files?
<wbs>
JEEB: protip; --samples= to configure, so you don't need to specify it on each run
blb has joined #ffmpeg-devel
<JEEB>
rix: I don't think we have that in FATE suite
<jamrial>
there may be some in trac
<JEEB>
yea, and videolan's sample server most likely has stuff
<rix>
what's trac?
<jamrial>
our issue tracker
<JEEB>
trac.ffmpeg.org , the issue tracker
<JEEB>
:)
<JEEB>
anyways, the idea with FATE samples is that they can be added. just having a minimal enough (preferably less than a MiB) sample, which covers a wide enough range of what's being added/changed
<rix>
yeah I'm just a bit concerned with licensing if using random samples from the bigger internet
<JEEB>
so if there is no D object based audio stuff in yet, as the code gets through review the sample should be uploaded first, and then 48 hours later the patch can be merged as test runners update their contents
<durandal_1707>
Lynne: I also get much better precision and correctness with the standard goertzel algo with the alternative realization
<durandal_1707>
Lynne: but it's slower, which is reasonable
<rix>
do you do pull request or mailing list review?
<JEEB>
currently mailing list
<JEEB>
hopefully something else at some point in the future
<rix>
haven't really used a mailing list to send patches before, but I will try
<JEEB>
so let's say you have a branch off of master
<JEEB>
`mkdir -p my_patch_set/v1 && git format-patch -o my_patch_set/v1/ master..HEAD` gives you a set of patches master to current HEAD that you're on
<JEEB>
if you need a cover letter, --cover-letter
<rix>
what happens if you need another iteration after review?
<JEEB>
make a v2 dir, add -v2 to format-patch
<JEEB>
that adds the prefix PATCH v2 to the emails instead of just PATCH
<rix>
and, use email attachment or inline plaintext?
<cone-071>
ffmpeg Paul B Mahol master:4af412be7153: avfilter: use AV_OPT_TYPE_CHLAYOUT
<JEEB>
git send-email sends them in the messages, but attaching is also valid
<JEEB>
not sure if patchwork can grok multiple attached patches in the same email
<JEEB>
(patchwork is just a convenient tool, not required for any review or whatever)
<rix>
more git commands to learn, but ok
<JEEB>
linking to a github/lab/whatever branch in the patch submission is also completely fine, in some cases it's an absolute PITA to try and figure out the easiest way to grab patches from the ML - especially if patchwork doesn't grok it
<rix>
cool
<JEEB>
(as in, you still need to post patches, but some people have less tooling set up for grabbing patches easily than others)
<courmisch>
note to self: SD card works better when you actually put it in the card reader
<JEEB>
:D
<haasn>
some SD cards probably work worse that way
<courmisch>
dd happily creates /dev/sdd as a regular file when the device does not exist
<courmisch>
I thought the write speed was suspicious
<JEEB>
TIL
<jamrial>
courmisch: would making a requirement that the buffers passed to float_to_fixed24 be 32 aligned ruin the riscv impl?
<jamrial>
actually nvm, it will probably break the neon ones
<courmisch>
jamrial: RVV requires element size alignment, so 32 bits here
<jamrial>
ok
AbleBacon has joined #ffmpeg-devel
<jamrial>
courmisch: sent a patchset changing len to size_t among other things
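(Roughly the kind of signature change being discussed; the exact declaration in ac3dsp.h may differ.)
```c
/* The surrounding AC3DSPContext members are elided; parameter names are
 * approximate. */
#include <stddef.h>
#include <stdint.h>

typedef struct AC3DSPContext {
    /* ... */
    void (*float_to_fixed24)(int32_t *dst, const float *src, size_t len);
} AC3DSPContext;
```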
goodtimesfun has joined #ffmpeg-devel
<courmisch>
jamrial: thanks
<Dmitri_Ovch>
Hi, can someone take a look at this patch? I would like to accept it, but it would be useful to hear the opinion of someone else.
<BtbN>
Only makes me wonder if all those encoders couldn't share more code
<Dmitri_Ovch>
this is not included in the shared part because the property has different names for different encoders. I also thought about it.
<BtbN>
Hm, I don't quite understand the logic of those checks. What exactly is different about -1 and 1?
<BtbN>
My initial thought was that -1 would be auto, 0 off and 1 force-on
<BtbN>
But it'll throw an error with both -1 and 1.
<Dmitri_Ovch>
the goal of -1 is not to expose the property at all if it is not set by the user
<BtbN>
as in, use what the preset has?
<BtbN>
Other than that, a bunch of minor whitespace issues in the patch
<BtbN>
"}else {", odd indentation in the options structs
<BtbN>
Also, what'll happen if you turn it off? Will assigning it to off work, even when not supported?
gust82 has quit [Remote host closed the connection]
gust82 has joined #ffmpeg-devel
<Dmitri_Ovch>
if it is not supported but the property is set by the user, it will return AVERROR(EINVAL), even if the property is set to off
<jamrial>
imo that should be ENOSYS
<Dmitri_Ovch>
"indentation in the options" - Thanks, I'll fix it
<BtbN>
That seems like an undesirable result to me then
<BtbN>
Why fail if something I don't want on is not supported?
<Dmitri_Ovch>
The logic was such that if the user wants to set a property, and meaningfully adds it to the arguments, then we return an error and do not continue encoding if this fails.
<BtbN>
hm, seems a bit odd in this case "Hey, don't do Smart Access Video. I can't error."
<Dmitri_Ovch>
It probably really makes sense to correct this, thank you
<rix>
is it recommended to configure with --enable-debug ?
<BtbN>
Pretty much, only return an error if the user wants the option _on_ and it can't be turned on.
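(A sketch of the behaviour BtbN describes; the option and capability names below are illustrative, not taken from the actual patch.)
```c
/* -1 = unset (leave the encoder preset alone), 0 = explicitly off,
 * 1 = explicitly on. Only "on but unsupported" is a hard error. */
if (ctx->smart_access_video != -1) {
    if (!caps_smart_access_supported) {
        if (ctx->smart_access_video == 1) {
            av_log(avctx, AV_LOG_ERROR,
                   "Smart Access Video is not supported on this device/driver\n");
            return AVERROR(ENOSYS);
        }
        /* user asked for "off" and the hardware can't do it anyway: nothing to do */
    } else {
        /* only expose/set the property when the user set the option explicitly */
        set_smart_access_property(ctx, ctx->smart_access_video); /* hypothetical */
    }
}
```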
goodtimesfun has quit [Ping timeout: 276 seconds]
<JEEB>
rix: I only do disable-stripping since that means I can utilize `ffmpeg` instead of `ffmpeg_g` if I want debug symbols
<BtbN>
What does enable-debug even do? Turn off optimizations I guess?
dellas has joined #ffmpeg-devel
navi has quit [Ping timeout: 276 seconds]
dellas has quit [Remote host closed the connection]
dellas has joined #ffmpeg-devel
<rix>
it seems --disable-optimizations means -O0 and --enable-debug=LEVEL means -gLEVEL
Chagalle has joined #ffmpeg-devel
dellas has quit [Remote host closed the connection]
Chagall has quit [Ping timeout: 276 seconds]
navi has joined #ffmpeg-devel
BtbN has quit [Remote host closed the connection]
BtbN has joined #ffmpeg-devel
kurosu has quit [Quit: Connection closed for inactivity]
cone-071 has quit [Quit: transmission timeout]
<jamrial>
Doubt it's -O0 as that would kill DCE, which a lot of things depend on
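(For context, the kind of dead-code-elimination idiom meant here, in generic form; ff_foo_init_x86()/FooContext are illustrative names, not a specific file.)
```c
/* ARCH_* / CONFIG_* macros from config.h are always defined to 0 or 1, so the
 * compiler is expected to eliminate the branch and drop the reference to
 * ff_foo_init_x86() on non-x86 builds. At -O0 that elimination does not
 * happen and linking fails with an undefined reference. */
av_cold void ff_foo_init(FooContext *c)
{
    c->do_thing = do_thing_c;

    if (ARCH_X86)
        ff_foo_init_x86(c);
}
```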
cone-476 has joined #ffmpeg-devel
<cone-476>
ffmpeg James Almer master:567c67c6c8cb: avcodec/ac3dsp: make len a size_t in float_to_fixed24