<FlorianBad>
Now it means that on the server side I will need to slice the mp4 H.264 video file right where the data can be sliced, which I assume is at an I-frame?
<FlorianBad>
So my question is: 1) Can I do this manually by just opening the file with my code, and is it easy to find where a block of "sliceable" data starts/ends?
<FlorianBad>
2) If not, then how can I do this efficiently with ffmpeg, if appropriate? (Because I don't want to call some heavy ffmpeg command that reads the whole file for every chunk of buffer requested by the client.)
<FlorianBad>
(same question for VP9, because I'm considering using exclusively VP9 for HTML5 now that it seems supported by all browsers)
<FlorianBad>
Also, if anyone knows a tool that lets me visualize the raw bytes of a video file and understand them easily, that would be great. E.g. something that tells me "here are the headers with metadata, at that point the first frame starts, the audio is in these bytes, then here is a keyframe so you could cut here," etc. I'd love to be able to experiment with this and truly understand it
<damo22>
hexdump -C
<FlorianBad>
damo22, sure, I know that too... but it won't explain anything I see. Good luck trying to find the beginning and end of I-frames like that
<furq>
there's h264_analyze for h264 but idk of an equivalent for vp9
<furq>
i take it you have to do this segmenting on the fly
<furq>
if you can do it ahead of time then that would be trivial
<zoff99>
hello
<zoff99>
i am trying to do this: 'ffmpeg -re -i video.mp4 -video_size 320x200 -pix_fmt rgb565le -f fbdev /dev/fb0', but with a small C program, and I seem to be failing
<zoff99>
is there an example or documentation how i can do this?
<zoff99>
with ffmpeg version 6.x
<FlorianBad>
furq, I guess I could. I could use folders instead of files, so the webm VP9 file would already be split into many files in a folder, and my server would handle how to read those files. But either way I first need to understand where exactly in the bytes I can split, which is probably at the I-frames/keyframes, I'm assuming?
<iqualms>
Does anyone have any idea how to fix "The "mapping" option is deprecated: set input to output plane mapping" message when using mergeplanes? I can't seem to find any info on google.
<furq>
FlorianBad: i have no idea what mediasource expects but it's trivial to split at keyframes with the segment muxer
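A concrete sketch of the segment-muxer approach furq describes. The lavfi test source, filenames, and the 30-frame GOP are illustrative, and this assumes an ffmpeg build with libx264:

```shell
# Make a short H.264 test clip with a keyframe every 30 frames
# (testsrc2 and -g 30 are just for the demo)
ffmpeg -y -f lavfi -i "testsrc2=duration=4:size=320x240:rate=30" \
       -c:v libx264 -g 30 -pix_fmt yuv420p input.mp4

# Split at keyframes without re-encoding: with -c copy every
# segment boundary has to land on an existing keyframe
ffmpeg -y -i input.mp4 -c copy -f segment -segment_time 1 \
       -reset_timestamps 1 "seg%03d.mp4"
```

Each `segNNN.mp4` starts on a keyframe; if `-segment_time` is shorter than the keyframe interval, the muxer still only cuts at keyframes, which matches furq's later caveat about very short segment times.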
<furq>
iqualms: you can just ignore that warning but i guess it wants you to use map0s etc instead
<iqualms>
furq: Thank you. Do you have an example of how the mapXs options are used, or are the first 2 examples using a0/a1/a2 already doing that?
<furq>
the first example would be mergeplanes=map0s=0:map1s=1:map2s=2
<iqualms>
Hmm, sorry but could you explain that a little more, is that equivalent to mergeplanes=0x001020?
<furq>
yes
<FlorianBad>
furq, if I was using a -segment_time of 0.1, for example, would I basically be guaranteed to split at each keyframe? (That way I can then handle how I deal with sending them.)
<furq>
i believe so
<furq>
each segment will start and end on a keyframe but i forget what it does if the segment_time is very short
<FlorianBad>
ok, and does it do the same thing as if I were splitting the file myself? In other words, if I concat all the output files afterwards, do I get back the exact same bytes as the original?
<furq>
it might generate empty segments
<FlorianBad>
Well, it might put some meta data or headers in each segment, right?
<furq>
and yeah the bitstreams should be identical if you concat them
<furq>
the container won't be
<FlorianBad>
but the output files themselves will have other stuff?
<furq>
yeah you're remuxing
<FlorianBad>
Yeah, so that's probably not what I need, because I'm pretty certain the JS buffer thing expects just more bytes of the same video coming, not some metadata, headers, or other stuff, since it's just feeding the same video that's already playing
<FlorianBad>
Maybe what I need to study is what happens with livestream stuff, when things are formatted for livestreaming, because there's no beginning and no end
<furq>
you can create raw h264 segments if you want to
<furq>
but those are missing timing information
<furq>
live streaming usually uses mpegts segments over hls or mpeg-dash
<furq>
or fmp4 segments nowadays
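As a sketch of the fMP4-over-HLS route furq mentions (the generated test input and filenames are mine; assumes ffmpeg with libx264):

```shell
# Generate a small H.264 source clip
ffmpeg -y -f lavfi -i "testsrc2=duration=2:size=320x240:rate=30" \
       -c:v libx264 -g 30 -pix_fmt yuv420p src.mp4

# Remux (no re-encode) into fMP4 HLS segments: an init segment plus
# .m4s media segments, which is the shape MediaSource players consume
ffmpeg -y -i src.mp4 -c copy -f hls -hls_time 1 \
       -hls_segment_type fmp4 stream.m3u8
```

By default the hls muxer writes the initialization segment as `init.mp4` next to the playlist; each `.m4s` segment is then just "more bytes of the same video" that can be appended to a MediaSource SourceBuffer.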
<iqualms>
furq: Could you provide an equivalent for the 2nd example? Trying to get my head around this. :)
<FlorianBad>
furq, well, in the end I am building a player, so my javascript can display any time and progress bar I want. So as long as *I* know what time segments correspond to, that should be fine
<furq>
i mean the frame timing
<furq>
raw h264 is just a set of frames with no timestamps
<furq>
so you'd need to know the framerate separately, or a list of timestamps if it's vfr
<furq>
iqualms: i think i got that first example wrong
<FlorianBad>
ok. So you wouldn't go with a livestream approach then?
<furq>
it should be map0s=0:map1s=0:map2s=0
<FlorianBad>
There's probably a very specific set of bytes in the webm container where I can safely split. Maybe I need to study that, because if that's the case, all I have to do is make my program on the server side handle the splits exactly like that
<FlorianBad>
where are the webm and VP9 formats defined? Like the actual raw bytes you get in your final file?
<JEEB>
webm is a subset of matroska, which has an RFC defined by EL GOOG
<FlorianBad>
JEEB, thanks. I wish there was a simple example somewhere that shows the bytes just so I can start understanding it without being overwhelmed with docs :)
<JEEB>
I think mkvinfo with specific options gave you more or less a dump of structures
<JEEB>
on that container side
<iqualms>
furq: Yeah it's not very clear. I was thinking the first example could be map2s=1:map3s=2 and the second map1p=1:map2p=2:map3s=1?
<iqualms>
(or -1 on mapX)
<FlorianBad>
JEEB, oh! Nice, didn't know about that command, will study that tomorrow
<JEEB>
the trace_headers bit stream filter will dump what it was able to parse
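For example (the input clip is generated here only so the command is self-contained; assumes libx264):

```shell
# Make a tiny H.264 clip to inspect
ffmpeg -y -f lavfi -i "testsrc2=duration=0.5:size=320x240:rate=30" \
       -c:v libx264 -pix_fmt yuv420p tiny.mp4

# trace_headers parses every bitstream header it understands and
# logs it to stderr; "-f null -" discards the packets themselves
ffmpeg -i tiny.mp4 -c copy -bsf:v trace_headers -f null - 2> headers.txt
```

`headers.txt` then contains a field-by-field dump of the parameter sets and slice headers, which is close to the "show me what these bytes mean" tool FlorianBad asked for, at the codec (rather than container) level.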
<JEEB>
also for the record, VP9 in MP4 is also completely specified, so if you just want to utilize a single container for whatever media you're utilizing that should be possible
<JEEB>
for similar stuff as mkvinfo you can check L-SMASH's boxdumper or AtomicParsley (in QT File Format which MP4 bases on the basic structure was called 'atoms', and with MP4 that was changed to 'boxes')
<FlorianBad>
well today every significant browser seems to support webm + VP9, so I'm planning on only offering that format
<damo22>
FlorianBad: i think safari does not
<JEEB>
pretty sure mp4+vp9 works on a similar enough amount of stuff
<JEEB>
since I've played it in both chromium and firefox
<FlorianBad>
ok so only about a year ago Safari started to fully support it
<FlorianBad>
which is fine for me because I will release my stuff next year
<FlorianBad>
And anyone using IE is an idiot, so I'm also ok with that
<damo22>
boycott GMAFIA : google microsoft apple facebook intel amazon
<galad>
from what I see on that site webm is not supported on iOS
<FlorianBad>
oh wait, what is that "WebRTC" thing in caniuse.com for iOS? "Supports VP9 codec used in WebRTC (off by default.)"
<damo22>
every time i send a webm to my friends apple phone he cant see it
<FlorianBad>
hmm ok, so does it mean I need to wrap VP9 into mp4?
<FlorianBad>
(like JEEB mentioned above)
<JEEB>
tested random 10bit vp9 in mp4 in 13.6.x safari and doesn't work on an older macbook (2015 model). don't recall if APPL limited VP9 support only to things with built-in hardware decoder.
<galad>
the software vp9 decoder might be disabled on laptops, there should be an option to enable it in the developer menu
<damo22>
you have to be part of the MPEG-LA conspiracy to have working video on apple?
<galad>
usually you have to use a format that has a hardware decoder
<FlorianBad>
Ok, so does it mean VP9+webm for everyone, and H.264+mp4 for Apple devices?
<damo22>
just tell apple users they need to buy a device that respects their choice of video
<iqualms>
furq: Yeah, map1s=1:map2s=2 and map1p=1:map2p=2:map3s=1 should be correct. Thank you for the help.
<zoff99>
can someone point me to a C example of how to put an AVFrame or packet on a simple output device like fbdev or SDL?
<zoff99>
i cant find any simple example
<JEEB>
zoff99: an AVFrame is what contains a decoded raw buffer; for software/RAM buffers you can for example see how ffplay does it, where it calls `SDL_Update*`
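Not the C answer zoff99 is after, but a quick way to sanity-check a raw YUV buffer on disk is ffplay, which draws through SDL (SDL2's kmsdrm backend is presumably what lets ffplay run without X). The frame size and filenames are illustrative:

```shell
# Dump one raw yuv420p frame to a file (here from a lavfi test pattern)
ffmpeg -y -f lavfi -i "testsrc2=duration=1:size=320x200:rate=1" -frames:v 1 \
       -f rawvideo -pix_fmt yuv420p frame.yuv

# View it (interactive - raw video carries no header, so the pixel
# format and dimensions must be supplied on the command line):
#   ffplay -f rawvideo -pixel_format yuv420p -video_size 320x200 frame.yuv
```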
<zoff99>
one thing thats strange is avformat_write_header() takes like many seconds
<zoff99>
i dont know why
<zoff99>
does SDL not need that?
<JEEB>
avformat concerns itself with containers
<JEEB>
or formats/protocols
<JEEB>
not raw received decoded content
<zoff99>
ok
<JEEB>
so if you have a decoded AVFrame, you're in a good state
<zoff99>
basically what i am trying to do is open an ffmpeg output device and put something there. in C
<JEEB>
ah, avdevice
<JEEB>
yes, that is "fun"
<zoff99>
i have a yuv buffer in RAM
<JEEB>
I would not recommend using the avdevice unless you need to
<zoff99>
to start with. and now i want to display it. but well in a generic way. so that i can switch SDL or fbdev or whatever
<zoff99>
I would not recommend using the avdevice unless you need to -> ok
<JEEB>
since they don't deal with AVFrames. at most, some support AV_CODEC_ID_WRAPPED_AVFRAME
<JEEB>
but neither of those two you mentioned do that
<zoff99>
i basically have the YUV data in ram and want to get this on display fast
<zoff99>
until now i just paint it to /dev/fb0 with cpu. which is meh.
<zoff99>
this does what i want on the raspi5 without X
<zoff99>
i can get raw frames out of the file thats not a problem
<zoff99>
ffplay somehow does SDL without X as well. not sure how
<zoff99>
thanks a bunch, i will try it with SDL
<MisterMinister>
Greetings and Salutations! Looking to set up a proxy strictly with codec copy, UDP input and HLS output. Cannot transcode, too little CPU available. Is there any way to set up a failover looped video/image filler, in case the primary UDP input is having an outage? Could playlists or the concat demuxer/protocol be used for that?
<JEEB>
might want to look into tsduck or so if you want to strictly copy around MPEG-TS, since I think its tooling will let you pass it on with minimal changes
<JEEB>
for backup sources, you'd have to have a backup multicast or so
<MisterMinister>
JEEB: if using multicast as backup instead of a 5-10 sec video file, how would the proxy know to return to the primary when it's back up? With a looped playlist, that would force the proxy to attempt to connect to the primary source (UDP) and, if it's still down, move to the next item on the playlist - the backup 5 sec video. Or so I thought...
<JEEB>
that depends on how you control whatever thing you're in control of
<JEEB>
but yea, feel free to solve this in any way you feel is better :P just that if you want minimal changes to the input MPEG-TS, going through full demux-mux is probably not the best of ideas
<MisterMinister>
JEEB: understood. You think tsduck does a better job with TS manipulation and can handle live source switching?
<JEEB>
you only brought up backup source stuff later
<JEEB>
so look up its documentation and figure it out
<JEEB>
usually I utilize one of multicat, udpxy or tsduck to deal with multicast or similar MPEG-TS related shenanigans