BtbN changed the topic of #ffmpeg to: Welcome to the FFmpeg USER support channel | Development channel: #ffmpeg-devel | Bug reports: https://ffmpeg.org/bugreports.html | Wiki: https://trac.ffmpeg.org/ | This channel is publically logged | FFmpeg 7.0 is released
Tinos has joined #ffmpeg
<intrac> ok, adding '-map_metadata -1' seems to remove this 6hr 'Timed Text' stream
<intrac> but mediainfo still lists a 'Menu' subheading, but just with no details listed under it
<intrac> So there's General, Video, Audio, Menu. with the latter having nothing under that subheading
<intrac> ffprobe says:
<intrac> [mov,mp4,m4a,3gp,3g2,mj2 @ 0x555aa3a29c00] Referenced QT chapter track not found
<intrac> so it seems the raw information has been excluded, but some container/reference to it remains
<intrac> hrm
aaabbb has joined #ffmpeg
<intrac> ok, it seems what I needed was: -map_chapters -1
finsternis has quit [Read error: Connection reset by peer]
pmarg has joined #ffmpeg
CarlFK has joined #ffmpeg
pmarg has quit [Quit: Lost terminal]
yans has quit [Remote host closed the connection]
Kei_N_ has quit [Ping timeout: 245 seconds]
Kei_N___ has quit [Ping timeout: 255 seconds]
Kei_N has joined #ffmpeg
Kei_N_ has joined #ffmpeg
Brazhh has joined #ffmpeg
Kei_N__ has joined #ffmpeg
Kei_N_ has quit [Ping timeout: 252 seconds]
Dotz0cat has quit [Ping timeout: 246 seconds]
waleee has quit [Ping timeout: 252 seconds]
Suchiman has quit [Quit: Connection closed for inactivity]
Brazhh has quit [Quit: WeeChat 4.3.5]
Dotz0cat has joined #ffmpeg
jarthur has joined #ffmpeg
Dotz0cat has quit [Ping timeout: 272 seconds]
emanuele6 has quit [Read error: Connection reset by peer]
Norkle has joined #ffmpeg
froyo has joined #ffmpeg
emanuele6 has joined #ffmpeg
Dotz0cat has joined #ffmpeg
<froyo> I want to compress my phone-recorded video. I don't know why it has such an enormous bitrate (17Mbps!). https://0x0.st/XyGC.txt --->ffprobe output. How do I get the bitrate down to something reasonable like 3-5Mbps?
emanuele6 has quit [Quit: WeeChat 4.3.4]
emanuele6 has joined #ffmpeg
Obsdark has quit [Quit: Nettalk6 - www.ntalk.de]
Brazhh has joined #ffmpeg
manwithluck has quit [Ping timeout: 246 seconds]
manwithluck has joined #ffmpeg
Tinos has quit [Remote host closed the connection]
dreamon has joined #ffmpeg
xx has joined #ffmpeg
CarlFK has quit [Ping timeout: 260 seconds]
<aaabbb> froyo: the bitrate is high because better compression needs more processing power than most phones have available when recording in real time
<aaabbb> froyo: you can use two-pass encoding to target a specific bitrate
<froyo> Currently messing around with the options. Got some ridiculously low bitrate outputs. Cool stuff! How come 10 minute 720p videos are around 30MiB though? Currently trying VP9, any other options I should throw in there to get my video of similar quality?
<aaabbb> hevc or av1
<froyo> 10minute 720p videos "ON YOUTUBE" -- forgot to mention that.
<aaabbb> they drop quality significantly to do that
<aaabbb> how much time are you willing to wait for a good encode?
<froyo> depends on how much better the veryslow option is than veryfast. I've looked into that -crf 1-63 option; it takes half a day on a 2 minute video for me, so it's most likely not an option.
<aaabbb> ffmpeg -an -i recording.mp4 -c:v libx265 -preset slower -b:v 4M -x265-params "pass=1:aq-mode=3" -f null -
<aaabbb> ffmpeg -i recording.mp4 -c:v libx265 -preset slower -b:v 4M -x265-params "pass=2:aq-mode=3" -c:a copy smaller-recording.mp4
<aaabbb> those two commands use the "slower" preset with two-pass encoding to give you exactly 4mbps
<aaabbb> the first pass analyzes the video to make bitrate allocation decisions and the second pass does the actual encoding (audio is copied without being re-encoded)
<aaabbb> actually, you'll want to add -tag:v hvc1 to the last one since you're using hevc
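A sketch of that second pass with the hvc1 tag added (same settings as the command above, nothing else changed):
ffmpeg -i recording.mp4 -c:v libx265 -preset slower -b:v 4M -x265-params "pass=2:aq-mode=3" -tag:v hvc1 -c:a copy smaller-recording.mp4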
<froyo> Just for clarification, my use case: it's a video of just a few family members talking, the original is 350MiB, and I want to archive it (and a lot of similar videos), so I am trying to get the size down drastically for these.
<aaabbb> how much quality are you willing to lose?
<aaabbb> it's always a trade-off
<aaabbb> fast encode + small size = low quality. fast encode + big size = high quality (this is what your original encoding is). slow encode + big size = ultra high quality. slow encode + small size = medium quality
<aaabbb> you want to archive it, as a master copy?
<froyo> Yes, as a master copy. On an external HDD that I'll hopefully not misplace 10 years down the line.
<aaabbb> beware that you *will* lose quality by doing that
Brazhh has quit [Quit: WeeChat 4.3.5]
<aaabbb> make sure that you're willing to lose quality before deleting the original
<froyo> So encoding is lossless, right? So by your definition of "high quality" and "ultra high quality", it shouldn't have anything to do with the video feed itself, right?
<aaabbb> no, encoding is lossy
<froyo> Ahh my bad.
<aaabbb> every time you convert a video, quality is lost (except with very specific lossless formats like ffv1)
<aaabbb> a lower crf means a higher bitrate and a higher bitrate means more preserved quality. using a slower preset lets the encoder allocate those bits more intelligently and gets you more quality at any given bitrate. but you will always lose quality
<aaabbb> i would just keep the originals honestly unless they're not very important
<froyo> just so i understand, in your example, a slower encode and a faster encode still use the same underlying encoding format/algorithm right? it's just that the slower one is given more time to do its work more intelligently, correct?
<aaabbb> yeah the same format. it's actually a collection of algorithms so the slower one may use more precise algorithms to do the same thing
<froyo> sure. akin to say, a chess engine searching 3 moves ahead instead of 10 moves ahead -- same (or at least similar variant) algo, but just more time => better results
<aaabbb> fast presets may use something called hex motion estimation, slower presets use star motion estimation. they both do the same thing, but the star algorithm is slower (and more accurate)
<aaabbb> yeah exactly
<aaabbb> the faster presets are like "well heuristically, these 3 moves lead nowhere so i won't follow that path" but slow ones are like "we'll try almost every path that isn't obviously bad just in case a path that looks bad at first is actually better"
<froyo> makes sense, that's a nice eli5.
<aaabbb> a higher bitrate also improves quality because there are more bits for the encoder to use. the reason your mobile recording is so big is that it has to use a fast preset to record in real-time, so to compensate, it uses a lot more bits
<froyo> So you're saying that the reason why so many 480p YouTube videos look so great is because they are encoded well (and most likely slowly?) so the "quality" is preserved?
<aaabbb> it's because google has whole buildings with specially built computers that encode slowly in vp9 and av1, so by the time you look at the video, it's had possibly weeks to encode
<aaabbb> but youtube also drops the bitrate a lot so the quality can suffer especially if you watch videos with fast motion or dark scenes. but if they didn't encode it slowly, it would look much much worse
CarlFK has joined #ffmpeg
<aaabbb> they design their own chips called asics, so they literally choose where each transistor goes, instead of just using software on a generic cpu. that lets them encode very well
fengdaolong has joined #ffmpeg
<froyo> Sounds like I should just upload all my videos to YouTube, let them do their magic, and then yt-dlp them back...
<aaabbb> i wouldn't do that, youtube's quality is actually pretty low
<aaabbb> plus their magic can take weeks and they only do their *real* magic for popular videos
<aaabbb> otherwise they use quick and dirty encoding settings while they wait to see if the video is gonna get popular enough for it to be worth sending to their av1 encoding cluster
<aaabbb> i'd just use ffmpeg. if you want a precise file size use 2pass, if you want an approximate one, use crf
<aaabbb> and then set the slowest preset you can tolerate, and let it do its job. it'll do better than what youtube does
<aaabbb> also, if you encode in 10-bit, it's slower but the quality will be better than the default
<aaabbb> ffmpeg -i recording.mp4 -c:v libx265 -preset $preset -crf $crf -pix_fmt yuv420p10le -c:a copy recording-smaller.mp4
<aaabbb> change $preset to whatever you want and $crf to whatever you want
<aaabbb> lower crf means better quality but higher bitrate
<froyo> I checked my ffprobe output but couldn't tell whether it was 8bit or 10bit. How do I tell?
<aaabbb> it's probably 8 bit but it doesn't matter, turning it to 10 bit even if the source is 8 is still beneficial
<aaabbb> even if the monitor is also 8 bit it allows more efficient compression
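A quick way to check the bit depth, as a sketch (the reported pix_fmt ends in "10le" for 10-bit, e.g. yuv420p10le; plain yuv420p means 8-bit):
ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=noprint_wrappers=1 recording.mp4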
<froyo> Beneficial? Doesn't this increase the file size (by a factor of 4 apparently!)
<aaabbb> it also reduces a problem called "banding" where dark gradients get split up into "bands" of gray
<aaabbb> not by 4, it increases file size slightly. that just means you can increase crf to compensate
<aaabbb> because you can't compare crf across different settings
<aaabbb> at the same bitrate, 10-bit will always have higher quality than 8-bit
<aaabbb> it just happens that turning on 10-bit will increase the bitrate *as well as* increase the compression efficiency. but it's not a big enough change in bitrate to care about. just increase crf by 1 or 2
<froyo> I've tried a few crf values. I've heard the general guideline that 18-28 is good, but is there some "visual" way to see the difference between these numbers? I'm going to try veryslow, but each encode costs a day, so I don't want to pick 25, and then realise "oh, I should have gone way higher like 30+"
<aaabbb> in that case you would want to use 2pass
<aaabbb> instead of crf
<aaabbb> btw veryslow has severe diminishing returns. it's like 5x slower than slower, but only (something like) 2% better efficiency
<aaabbb> the bump from slow->slower is a huge boost in efficiency (but also a huge increase in time). slower->veryslow is a tiny boost in efficiency, but a huge boost in time, same with veryslow->placebo
<froyo> Oh good to know. Then Slower sounds good. But I thought you thought crf was "better" than 2pass, in whatever way "better" means.
<aaabbb> crf is constant quality, which means it will adjust bitrate to make sure the quality is subjectively the same
<aaabbb> 2pass is constant bitrate (actually average, not constant), which means the quality will adjust in order to maintain the average bitrate
<aaabbb> complex video + crf 20 = quality "20", high bitrate. simple video + crf 20 = quality "20" low bitrate
<aaabbb> complex video + 2pass 1M bitrate = 1M bitrate, low quality. simple video + 2pass 1M bitrate = 1M bitrate, high quality
<aaabbb> so you use 2pass when you say "i want the final average bitrate to be X, i don't care about the quality just do your best" and crf means "i want the quality to be X, i don't care about the bitrate just do your best"
<aaabbb> crf 20 with one video might have 5x higher bitrate than crf 20 with another video, but the subjective quality between the two will be the same for example
<aaabbb> 2pass is slower though so keep that in mind (a little less than 2x slower)
<froyo> If ELI5 was a real job you'd be the CEO.
<aaabbb> thanks lol
rpthms has quit [Remote host closed the connection]
<aaabbb> what you might want to do is use a very fast preset to test things, choose a crf like 28, see what the final file size is. if the file size and quality are *roughly* acceptable, use crf 28 for everything
rv1sr has joined #ffmpeg
<froyo> By the way, the reason why I'm not posting so much is because I'm actively searching up terms you're writing about (keeping in line with the IRC hacker culture).
<aaabbb> it'll be a very rough approximation but it can be "enough"
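A sketch of that kind of quick test, assuming libx265, crf 28 as the first guess, and a throwaway output name (test-crf28.mp4); only the preset would change for the real encode:
ffmpeg -i recording.mp4 -c:v libx265 -preset veryfast -crf 28 -pix_fmt yuv420p10le -c:a copy test-crf28.mp4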
<aaabbb> no worries. you can ask the meanings of the terms and i can answer if you'd like
<froyo> Great, that'll make things much quicker then :D
rpthms has joined #ffmpeg
<froyo> So regardless of speed, will the crf give the exact same size output?
<aaabbb> other way around, crf will give the exact same *quality* output
<froyo> I thought speed would change the quality
<aaabbb> it does but crf doesn't determine speed
<aaabbb> crf determines quality
<aaabbb> oh i get what you mean
<froyo> Right sorry. Btw, what exactly does "quality" mean?
<aaabbb> you mean crf 20 with veryfast vs crf 20 with veryslow?
<froyo> yes. would those two have the exact same file size?
<aaabbb> no the quality will differ as will the size, it's impossible to directly compare crf (or preset) against a different preset (or crf)
<aaabbb> but it will be very roughly similar-kinda-ish for you to make a good guess about whether 28 is "way too high" or "way too low"
<froyo> Is quality a sort of subjective term that judges how well the detail has been captured given the bitrate?
<aaabbb> quality is measured by something called distortion
<aaabbb> the video encoder looks at the input frame, which is the original, and it tries a few algorithms to try to create an approximation of that frame. the difference between the original and the new generated frame is called distortion
<aaabbb> with crf, it will say "this distortion is too high so i'll increase the bitrate"
<aaabbb> or it'll say "this distortion is surprisingly low, so i can reduce bitrate"
<aaabbb> a low crf (high quality) will make it a lot more picky about distortion. a high crf (low quality) will make it tolerate higher distortion
<aaabbb> and modern codecs take into account the human visual system too, so the distortion isn't just an objective "it's X% different"
<froyo> okay, so if i understood you so far, that means: crf 20 veryslow would not tolerate much distortion, and is likelier to keep the bitrate high to satisfy its "minimum distortion required" than crf 25. And while crf 20 veryslow and crf 20 veryfast have the exact same distortion requirements, and produce the same mathematical quality, the veryslow one looks a lot further ahead and can optimize more and potentially
<froyo> lower the bitrate (and hence size) where the veryfast one would have disregarded it
<aaabbb> in theory they'd produce the same distortion requirements, in reality they'll actually differ a lot
<aaabbb> crf 20 veryslow vs crf 20 veryfast
<froyo> So veryslow would keep looking for a way to reduce the bitrate, whereas veryfast would be like "yup, this is cool, it satisfies my distortion criteria, let's move on"
<aaabbb> yeah, but veryslow may use more accurate distortion criteria too
<froyo> or rather, veryslow would be like "yo, i get that this satisfies my distortion, but let me try more algorithms to see if i can get the distortion even lower, and that might potentially be low enough to allow me to reduce the bitrate"
<aaabbb> exactly, yeah
<aaabbb> but it's not possible to accurately compare across presets and crfs. it can be enough to make only a very rough estimate like "woah, crf 21 gives me a file this big even on veryfast!? i guess i won't use it for veryslow either"
<froyo> Well then, that covers mathematical quality = distortion. How about actually seeing it with your eyes? I'm going to call this "quality" in scare quotes cos it's subjective. So you're saying that veryslow and veryfast have the same quality but not the same "quality", right?
<froyo> Gotcha.
<aaabbb> veryslow and veryfast just determine what algorithms to use and what parameters to use for those algorithms. you could use veryslow with a super low bitrate and it may generate garbage because it can only do so much. crf just adjusts bitrate to keep the subjective quality roughly the same
<aaabbb> the "seeing it with your own eyes" part is what people who do encoding professionally do. they look at the output, they test with various measurements, and then they fine tune the algorithms to look the best with the particular video. but that isn't necessary for 99% of people. the majority of people can trust the codec's built in quality estimation
<aaabbb> btw the whole idea about how long it would take to find out what crf to use? that's basically what 2pass does, except it automates it. the first pass stores statistics about the video and does a guess about the crf, then the second pass uses the "calculated crf to match exactly 2mbps" that the first pass came up with. it's actually a bit more detailed than that (it passes more detailed information than
<aaabbb> just a calculated crf) but that's the general idea
<froyo> I'll definitely use 2pass then. This discussion was so damn useful and educational. Thanks a ton!!!
<aaabbb> no problem! just remember 2pass takes longer, but there's a trick you can use to speed it up, setting "turbo first pass"
<aaabbb> how precisely it tracks your specified bitrate suffers slightly, but it will do the first pass much faster
<aaabbb> so for the first pass i'd recommend this: ffmpeg -an -i in.mp4 -c:v libx265 -x265-params "pass=1:slow-firstpass=0" -b:v $bitrate -preset $preset -f null -
<aaabbb> that will do the first pass where it gathers statistics, the "-an" means it ignores audio
<aaabbb> uh, sorry, i forgot to put -pix_fmt yuv420p10le in there
<aaabbb> anyway, the second would be: ffmpeg -i in.mp4 -c:v libx265 -x265-params "pass=2" -b:v $bitrate -preset $preset -pix_fmt yuv420p10le -tag:v hvc1 -c:a copy out.mp4
<aaabbb> the example video you gave was 17.7mbps, so let's say you want to decrease the size to 15% of the original, you'd just set -b:v to 15% of the original bitrate, so for 17.7mbps that would be roughly 2.6mbps (-b:v 2.6M)
<aaabbb> the -tag:v hvc1 is just to improve compatibility and is needed sometimes when you're using hevc (the format that libx265 is for)
fengdaolong has quit [Ping timeout: 252 seconds]
<aaabbb> if you want each video reduced to 15% of its original size, do the same for each one
<aaabbb> the slow-firstpass=0 will speed the first pass up but the size will vary a bit, so if you tell it to do 2.6mbps it might give you 2.7 or 2.5, but it's sure not gonna give you 50mbps when you ask for 2.6
<aaabbb> if you can stomach longer encode times, remove the "slow-firstpass=0" part from that first command
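Putting those two passes together, with the -pix_fmt that was left out of the first command added back (a sketch; $bitrate and $preset are placeholders, e.g. 2.6M and slower):
ffmpeg -an -i in.mp4 -c:v libx265 -x265-params "pass=1:slow-firstpass=0" -b:v $bitrate -preset $preset -pix_fmt yuv420p10le -f null -
ffmpeg -i in.mp4 -c:v libx265 -x265-params "pass=2" -b:v $bitrate -preset $preset -pix_fmt yuv420p10le -tag:v hvc1 -c:a copy out.mp4
Drop "slow-firstpass=0" from the first command for the slower but more accurate first pass mentioned above.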
fengdaolong has joined #ffmpeg
<froyo> wait, so i have to run two different commands for each pass? how does the second pass get the info from the first pass then?
<aaabbb> it saves a file
<froyo> oh it'll create it and i'm supposed to input that into pass=2, ye?
<aaabbb> it knows the file name so you don't have to input manually
<aaabbb> it creates x265_2pass.log and x265_2pass.log.cutree
<aaabbb> after the second pass you are free to delete them
<froyo> also slow-firstpass=0, i thought if i could stomach slow passes I'd just change the $preset to slower, no?
<aaabbb> yeah but i mean if you're already fine with the speed. 2 pass is never more than twice as slow but slower to veryslow can be like 5x slower
<aaabbb> if you can and since family footage is probably important, try to go with slower rather than slow, even if it takes a while
<froyo> slow to slower is good ROI ye?
<aaabbb> yeah it is, at least for something important like archiving family videos. the investment is a fair bit higher but so is the return
<aaabbb> in particular it uses a much better rdo (rate distortion optimization) which is that distortion metric
<froyo> btw, what does -pix_fmt yuv420p10le do? According to my ffprobe output, the video is already in yuv4:2:0
<aaabbb> that turns on 10-bit
<aaabbb> yuv420p would be 8-bit 4:2:0 subsampling, yuv420p10le is 10-bit 4:2:0 subsampling (and 4:2:0 is the most common subsampling anyway)
<aaabbb> the "10" means 10 bit, the "le" means little-endian (endian refers to the order of bits and is necessary to specify if you're above 8 bits)
<froyo> Ah so you're converting to 10bit first (which somehow helps with compression later on, even if it temporarily increases the file size. I don't know exactly how but I'll take your word for it), then applying 4:2:0 on this 10-bit video, right?
<aaabbb> well it'll always be 4:2:0
<aaabbb> it's just that there's no option for yuv-$same_as_Source-10le
<froyo> Oh. I thought little-endian would mean that it would help sync up with my little endian processor for added performance or something...
<aaabbb> well that's probably why everyone uses le, since most processors are le
<aaabbb> but with 8 bit there's no endian issue, because endian only matters for >8 bits
dreamon has quit [Ping timeout: 252 seconds]
<aaabbb> it just means that it'll be doing computation on pixels with 10 bit values internally, even if it has to be eventually truncated to be played on an 8 bit monitor
<froyo> and doing it in this roundabout way *somehow* makes the video better?
xx has quit [Ping timeout: 260 seconds]
<aaabbb> it makes it better because the math can be done at a higher precision
<aaabbb> just like if you have 8-bit audio and you want to do some special effects, you'll want to convert it to 32-bit float or even 64-bit double first, even if it'll go right back to 8-bit after
<froyo> ah I see. Nice!
xx has joined #ffmpeg
<froyo> How do you know so much about ffmpeg anyway, and all of these esoteric options? Is this how most people get their archiving done, letting their PC chill in a corner for days?
<aaabbb> well i worked for a video distribution site as a sysadmin. the guy who did encoding left, and i "temporarily" got his job, and i've had his job for... like a year. before then i only knew the basics of ffmpeg
<aaabbb> i just spent a lot of time reading wikipedia, ffmpeg documentation, the doom9 forum (for video codecs), and the hydrogenaudio forum (for audio codecs)
<aaabbb> a lot was reading https://x265.readthedocs.io/en/master/cli.html (the x265 cli, the options can be passed to libx265 in ffmpeg via -x265-params) and asking myself "what does it mean by psychovisual rate/distortion optimization, and why does it claim that a high value can cause smearing?"
<aaabbb> then i look into what rdo is, how psychovisual optimizations work, how other codecs do the same thing etc. i'm not an expert but now at least i know enough to be dangerous :)
<aaabbb> froyo: tbh most people just download some crappy "video converter pro free edition" off the web and do a crappy encode to avi and delete the originals without realizing how awful their encodes are until it's too late lol
<aaabbb> but generally i like to archive the original
<aaabbb> i'd rather pay for an extra hard drive or two than regret encoding and throwing away the originals in 10 years
<aaabbb> and although there are lossless codecs like ffv1 that i use for archiving analog media like vhs, it's not really useful for archiving digital media because the bitrate is always extremely high
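As a sketch of what an ffv1 archiving command for analog captures can look like (the specific options and filenames here are an assumption, not something stated in the chat):
ffmpeg -i capture.avi -c:v ffv1 -level 3 -slices 16 -slicecrc 1 -c:a flac vhs-archive.mkv
-slicecrc 1 adds per-slice checksums, which helps detect bit rot in long-term storage.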
fengdaolong has quit [Quit: WeeChat 4.4.1]
xx has quit [Remote host closed the connection]
xx has joined #ffmpeg
lavaball has joined #ffmpeg
coldfeet has joined #ffmpeg
jarthur has quit [Quit: jarthur]
HerbY_NL has joined #ffmpeg
rv1sr has quit []
MoC has joined #ffmpeg
MoC has quit [Quit: Konversation terminated!]
alexherbo2 has joined #ffmpeg
cmc has quit [Remote host closed the connection]
cmc has joined #ffmpeg
mikehu44 has joined #ffmpeg
Tinos has joined #ffmpeg
yawkat has quit [Ping timeout: 245 seconds]
kepstin has quit [Remote host closed the connection]
froyo has quit [Read error: Connection reset by peer]
kepstin has joined #ffmpeg
HerbY_NL has quit [Ping timeout: 252 seconds]
rex has quit [Quit: Bye]
rex has joined #ffmpeg
froyo has joined #ffmpeg
flom84 has joined #ffmpeg
flom84 has quit [Client Quit]
Kei_N__ has quit [Quit: leaving]
Kei_N has quit [Quit: leaving]
Kei_N has joined #ffmpeg
zsoltiv_ has quit [Ping timeout: 272 seconds]
HerbY_NL has joined #ffmpeg
froyo has quit [Read error: Connection reset by peer]
evilscreww has joined #ffmpeg
froyo has joined #ffmpeg
<froyo> So I am using 2pass to encode my phone-captured video. During the 2nd pass, some generic library error caused the encoding to stop midway. It says 4899 frames were encoded. So can I resume from frame #4900 and then later concat? https://0x0.st/Xydt.txt
<froyo> Potentially a repost because I disconnected from the irc a few seconds after posting this^. sorry if it got printed twice!
<froyo> How do I resume encoding from 4900 though. Presumably I'd have to crop the original, encode that, THEN concat. I found this command -vf select="between(n\,4900\,$endframe),setpts=PTS-STARTPTS" but that ended up creating an audio-only file that still started from the beginning.
<aaabbb> the only way to resume would be if you cut it at a keyframe
lavaball has quit [Remote host closed the connection]
jemius has joined #ffmpeg
rvalue has quit [Read error: Connection reset by peer]
rvalue has joined #ffmpeg
vampirefrog has joined #ffmpeg
HerbY_NL has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
beaver has joined #ffmpeg
yawkat has joined #ffmpeg
beaver has quit [Remote host closed the connection]
froyo has quit [Read error: Connection reset by peer]
froyo has joined #ffmpeg
<froyo> So I have to trim out.mp4 until the last keyframe (not yet sure whether it has some leftover frames after the last keyframe at 39.75.00000), then *somehow* resume it after that point using the filter options.
<froyo> The trouble here is that if the keyframes of both in.mp4 and out.mp4 lined up, I could continue the encoding from the in.mp4, from 39.750000, and then concat afterwards. But 39.75.0000 is not a keyframe in the in.mp4. Am I SOL?
<froyo> Again I'm doing google searches and reading through posts explaining this, so I hope I'm explaining my situation (and my conclusions) correctly.
<aaabbb> you'd have to cut the out.mp4 to the last keyframe then in in.mp4, seek to the frame that would be right past it
<aaabbb> the keyframes in in.mp4 don't matter at all
<aaabbb> as far as the encoder is concerned, in.mp4 is just a series of raw images with no distinction other than their content, no concept of frame types or anything
<aaabbb> also, this is assuming out.mp4 was actually finalized
<aaabbb> like if the encoding *stopped*, like crashed, you're probably sol
<aaabbb> froyo: can you play the (incomplete) out.mp4?
<aaabbb> if you can play it then it means the moov atom (metadata needed to play the video) was created successfully and then you'll be able to concat it with the rest of the video
<aaabbb> but if it can't play then you're sol and have to restart that second pass again. you don't have to redo the first pass though
froyo has quit [Read error: Connection reset by peer]
edman007 has quit [Remote host closed the connection]
alexherbo2 has quit [Remote host closed the connection]
alexherbo2 has joined #ffmpeg
alexherbo2 has quit [Remote host closed the connection]
alexherbo2 has joined #ffmpeg
alexherbo2 has quit [Remote host closed the connection]
zsoltiv_ has joined #ffmpeg
Sketch has quit [Ping timeout: 248 seconds]
Sketch has joined #ffmpeg
rv1sr has joined #ffmpeg
rv1sr has quit [Client Quit]
rv1sr has joined #ffmpeg
lavaball has joined #ffmpeg
evilscreww has quit [Quit: Leaving]
waleee has joined #ffmpeg
minimal has joined #ffmpeg
froyo has joined #ffmpeg
HerbY_NL has joined #ffmpeg
<froyo> aaabbb: Unstable network. Last message I saw from you was "If it doesn't play then you're sol." The incomplete out.mp4 does indeed play for the first 40 seconds before my media player (mpv) starts bugging (thinking it's 44 seconds long, the playbar flickers).
grepfor has joined #ffmpeg
<froyo> I'm currently looking for a command that will "resume" encoding of out.mp4 at 39.75.00000. It seems that a lot of people online have this issue, but the actual command has eluded my search. Other people have said I should run it in a VM to be able to pause the program. People say segmenting is the way to go. I'm going to start doing that from now on I guess...
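A rough sketch of the approach aaabbb described earlier, assuming the last clean keyframe in out.mp4 really is at 39.75 s and using crf for the remainder rather than resuming the interrupted second pass (the pass-one stats were gathered for the full-length input); quality and bitrate may not match perfectly across the splice, and the audio cut will not be sample-exact:
ffmpeg -i out.mp4 -to 39.75 -c copy part1.mp4
ffmpeg -ss 39.75 -i in.mp4 -c:v libx265 -preset $preset -crf $crf -pix_fmt yuv420p10le -tag:v hvc1 -c:a copy part2.mp4
printf "file 'part1.mp4'\nfile 'part2.mp4'\n" > segments.txt
ffmpeg -f concat -safe 0 -i segments.txt -c copy resumed.mp4
The concat demuxer needs both parts to share the same codec and encoding parameters for a clean stream copy.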
cxc has joined #ffmpeg
HerbY_NL has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
BSaboia_ is now known as BSaboia
waleee has quit [Ping timeout: 264 seconds]
Sketch has quit [Ping timeout: 255 seconds]
Sketch has joined #ffmpeg
tel has joined #ffmpeg
lucasta has joined #ffmpeg
Sketch has quit [Ping timeout: 248 seconds]
FH_thecat has joined #ffmpeg
froyo has quit [Read error: Connection reset by peer]
Sketch has joined #ffmpeg
BSaboia has quit [Quit: ZNC - https://znc.in]
BSaboia has joined #ffmpeg
rv1sr has quit [Ping timeout: 246 seconds]
rsx has joined #ffmpeg
HerbY_NL has joined #ffmpeg
HerbY_NL has quit [Client Quit]
rsx has quit [Quit: rsx]
rv1sr has joined #ffmpeg
System_Error has joined #ffmpeg
dreamon has joined #ffmpeg
HerbY_NL has joined #ffmpeg
Warcop has joined #ffmpeg
HerbY_NL has quit [Client Quit]
Suchiman has joined #ffmpeg
HerbY_NL has joined #ffmpeg
HerbY_NL has quit [Quit: My MacBook has gone to sleep. ZZZzzz…]
Icycle has quit [Quit: A lol made me boom.]
Icedream has joined #ffmpeg
waleee has joined #ffmpeg
talismanick has quit [Remote host closed the connection]
yans has joined #ffmpeg
Sketch has quit [Ping timeout: 260 seconds]
gioyik has joined #ffmpeg
Sketch has joined #ffmpeg
alexherbo2 has joined #ffmpeg
lavaball has quit [Quit: lavaball]
coldfeet has quit [Remote host closed the connection]
dreamon has quit [Ping timeout: 246 seconds]
rv1sr has quit []
Sketch has quit [Ping timeout: 272 seconds]
deus0ww has quit [Ping timeout: 248 seconds]
deus0ww has joined #ffmpeg
jemius has quit [Quit: Leaving]
Sketch has joined #ffmpeg
kus has quit [Ping timeout: 265 seconds]
kus has joined #ffmpeg
sentriz has quit [Ping timeout: 244 seconds]
lavaball has joined #ffmpeg
Haxxa has quit [Quit: Haxxa flies away.]
yans has quit [Ping timeout: 252 seconds]
yans has joined #ffmpeg
Haxxa has joined #ffmpeg
yans has quit [Ping timeout: 252 seconds]
yans has joined #ffmpeg
kus has quit [Read error: Connection reset by peer]
Tinos has quit [Remote host closed the connection]
<bpmedley> aaabbb: Thank you for your previous discussion about encoding options and parameters. Helped me a lot.
gioyik has quit [Quit: WeeChat 4.4.1]
alexherbo2 has quit [Remote host closed the connection]
kus has joined #ffmpeg
georgereynolds8 has quit [Quit: Ping timeout (120 seconds)]
georgereynolds8 has joined #ffmpeg
BSaboia has quit [Quit: ZNC - https://znc.in]
BSaboia has joined #ffmpeg
Tinos has joined #ffmpeg
flotwig_ has quit [Ping timeout: 246 seconds]
flotwig has joined #ffmpeg
MrZeus has joined #ffmpeg
BSaboia has quit [Quit: ZNC - https://znc.in]
BSaboia has joined #ffmpeg
jarthur has joined #ffmpeg
<vlt> Hello. Is there a known problem when writing to a mounted smb share? I have lots of freshly encoded libx264 mp4 files (using -movflags +faststart) that when read by the very same ffmpeg produce lots of errors: https://termbin.com/okv8
grepfor has quit [Quit: Client closed]
<BtbN> That'd be a bug in the filesystem driver then, ffmpeg can only rely on the OS working correctly
<vlt> Ok, I will run some more tests writing to local and remote filesystems. Thank you.
<DHE> faststart will do a hell of a lot of IO at the end of the encode. it might stress out some file servers, or just generally take a while...
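One way to verify each finished file, as a sketch (out.mp4 is a placeholder name; this decodes the whole file, prints only errors, and writes no output):
ffmpeg -v error -i out.mp4 -f null -
If it prints nothing, the file decoded cleanly.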
Sketch has quit [Ping timeout: 248 seconds]
user982374 has joined #ffmpeg
sentriz has joined #ffmpeg
memset has quit [Ping timeout: 260 seconds]
<user982374> I am experiencing sluggish video playback on YouTube at 1080p resolution while in fullscreen mode (literally almost frame by frame). This is not happening outside of fullscreen mode. I am using a fresh install of Fedora 39 with dnf update performed after the install, running an Xfce session. During install I selected not to enable 3rd party repositories. Here is the output of dnf repolist all and dnf repolist enabled: https://imgur.com/screenshot-9aCN7n4
<user982374> Is this issue related to not having certain codecs installed, and if yes, which repositories should I enable? Will this on its own fix the behavior, and if not, which additional packages might I need to install? The commands I've tried so far (the behavior is identical): https://rpmfusion.org/Configuration, then dnf swap ffmpeg-free ffmpeg --allowerasing, dnf install libva-intel-driver, dnf update @multimedia --setopt="install_weak_deps=False" --exclude=PackageKit-gstreamer-plugin, dnf install intel-media-driver. DRM is enabled in Firefox. Thank you in advance
Sketch has joined #ffmpeg
BSaboia has quit [Quit: ZNC - https://znc.in]
sentriz has quit [Ping timeout: 255 seconds]
BSaboia has joined #ffmpeg
<BtbN> I doubt that's related to ffmpeg.
Sketch has quit [Ping timeout: 248 seconds]
memset has joined #ffmpeg
SuicideShow has quit [Ping timeout: 260 seconds]
SuicideShow has joined #ffmpeg
<JanC> YT anti-download stuff likely
lavaball has quit [Remote host closed the connection]
<BtbN> I also doubt that
<JanC> they want you to watch on their website, with ads
<furq> well they're doing a really bad job of it then
<BtbN> JanC: My interpretation is that it's _in a browser_
<JanC> oh, then why ask in #ffmpeg...
<furq> because his browser uses libavcodec to decode video
<JanC> but there are so many layers in between...
<furq> well yeah
<furq> it definitely has nothing to do with codecs being disabled or whatever
<JanC> and DRM disables hardware decoding AFAICT, that might be relevant too :)
<furq> does youtube ever use drm
<JanC> hm, maybe not
<JanC> maybe for YT Red (or whatever it's called nowadays?)
<furq> if it works in fullscreen but not outside of fullscreen then i assume the issue is elsewhere in the graphics stack
<furq> or the other way around i mean
<JanC> browsers will do all sorts of layering of HTML/CSS/etc. on top also probably, so not purely ffmpeg/libavcodec
jarthur has quit [Quit: jarthur]
Juest has quit [Ping timeout: 252 seconds]
<JanC> I think most browsers use an internal copy of libavcodec also?
Juest has joined #ffmpeg
xx has quit [Ping timeout: 260 seconds]
<DHE> likely, but libavcodec is probably unrelated to whether playback is in full-screen mode.