
Media (1)
-
Bee video in portrait orientation
14 May 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (53)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If in doubt, contact the administrator of your MediaSPIP to find out.
-
Emballe médias: what is it for?
4 February 2011, by
This plugin is designed to manage sites that publish documents of all types online.
It creates "media" items; that is, a "media" item is an article in the SPIP sense, created automatically when a document (audio, video, image or text) is uploaded; only a single document can be linked to a "media" article;
-
Libraries and binaries specific to video and audio processing
31 January 2010, by
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFmpeg: the main encoder, which transcodes almost any type of video or audio file into formats readable on the Internet. See this tutorial for its installation; Oggz-tools: inspection tools for Ogg files; MediaInfo: retrieves information from most video and audio formats;
Additional, optional binaries: flvtool2: (...)
On other sites (8556)
-
Error when converting mp4 video to HLS using stream as input in node js using ffmpeg
6 January 2023, by Sankalp Kataria
I am trying to convert an MP4 file to HLS using ffmpeg.
Code:



const ffmpeg = require('fluent-ffmpeg'); // assuming the fluent-ffmpeg wrapper
const { createReadStream } = require('fs');
const { join } = require('path');

var stream = createReadStream(filePath);
ffmpeg(stream)
  .on('stderr', function(stderrLine) {
    console.log('Stderr output: ' + stderrLine);
  })
  .on('end', function() {
    console.log('done processing input stream');
  })
  .on('error', function(err) {
    console.log('an error happened: ' + err.message);
  })
  .save(join(__basedir, "public", `file.m3u8`));




Output:



Stderr output: ffmpeg version git-2020-05-22-38490cb Copyright (c) 2000-2020 the FFmpeg developers
Stderr output: built with gcc 9.3.1 (GCC) 20200513
Stderr output: configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
Stderr output: libavutil 56. 46.100 / 56. 46.100
Stderr output: libavcodec 58. 86.101 / 58. 86.101
Stderr output: libavformat 58. 43.100 / 58. 43.100
Stderr output: libavdevice 58. 9.103 / 58. 9.103
Stderr output: libavfilter 7. 82.100 / 7. 82.100
Stderr output: libswscale 5. 6.101 / 5. 6.101
Stderr output: libswresample 3. 6.100 / 3. 6.100
Stderr output: libpostproc 55. 6.100 / 55. 6.100
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] overread end of atom 'stsd' by 34 bytes
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] stream 0, offset 0x30: partial file
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none, 1920x1080, 2528 kb/s): unspecified pixel format
Stderr output: Consider increasing the value for the 'analyzeduration' and 'probesize' options
Stderr output: Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:0':
Stderr output: Metadata:
Stderr output: major_brand : isom
Stderr output: minor_version : 512
Stderr output: compatible_brands: isomiso2avc1mp41
Stderr output: encoder : Lavf56.25.101
Stderr output: Duration: 00:03:00.97, start: 0.000000, bitrate: N/A
Stderr output: Stream #0:0(und): Video: h264 (avc1 / 0x31637661), none, 1920x1080, 2528 kb/s, 23.98 fps, 23.98 tbr, 24k tbn, 48k tbc (default)
Stderr output: Metadata:
Stderr output: handler_name : VideoHandler
Stderr output: Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
Stderr output: Metadata:
Stderr output: handler_name : SoundHandler
Stderr output: Stream mapping:
Stderr output: Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Stderr output: Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] stream 0, offset 0x30: partial file
Stderr output: pipe:0: Invalid data found when processing input
Stderr output: Cannot determine format of input stream 0:0 after EOF
Stderr output: Error marking filters as finished
Stderr output: Conversion failed!
Stderr output:
an error happened: ffmpeg exited with code 1: pipe:0: Invalid data found when processing input
Cannot determine format of input stream 0:0 after EOF
Error marking filters as finished
Conversion failed!




I've also tried the movflags option:

.outputOptions("-movflags isml+frag_keyframe")

as well as with -movflags faststart.



I've read through
How do you use Node.js to stream an MP4 file with ffmpeg?



But I didn't quite understand what to do or how to do it.
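
One thing I'm considering, since the file already exists on disk at filePath, is to hand ffmpeg the path itself instead of an unseekable pipe, and to set the HLS muxer options explicitly. This is only an untested sketch; the segment length is an arbitrary guess:

const ffmpeg = require('fluent-ffmpeg');
const { join } = require('path');

// Pass the file path so ffmpeg can seek back to the moov atom,
// which it cannot do on a pipe.
ffmpeg(filePath)
  .outputOptions([
    '-f hls',             // force the HLS muxer
    '-hls_time 10',       // segment length in seconds (arbitrary)
    '-hls_list_size 0'    // keep every segment in the playlist
  ])
  .on('stderr', line => console.log('Stderr output: ' + line))
  .on('end', () => console.log('done processing input'))
  .on('error', err => console.log('an error happened: ' + err.message))
  .save(join(__basedir, 'public', 'file.m3u8'));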


-
ffmpeg sRGB jpgs to rec709 mov
9 March 2021, by maxdownunder
I'm trying to convert an sRGB JPEG sequence into a Rec. 709 DNxHD QuickTime... but I'm struggling to get the color matrix right.


ffmpeg -y -f image2 -start_number 1 -framerate 24 -i input.%04d.jpg -c:v dnxhd -vf "scale=1920:1080,fps=24,format=yuv422p10,colorspace=all=bt709:itrc=srgb" -b:v 175M -c:a pcm_s16le output.mov



My understanding is that this should set everything to Rec. 709 while overriding the input response curve (itrc=srgb), but it fails with:


Unsupported input primaries 2 (unknown)
Error while filtering: Invalid argument
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0



I haven't been able to come up with any combination of flags that doesn't throw an error and produces the right colors.
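
The closest I have to a workaround idea is to declare the input explicitly on the colorspace filter, since the JPEGs apparently carry no primaries/transfer metadata (which seems to be what "Unsupported input primaries 2 (unknown)" is complaining about). This is an untested sketch; the bt470bg value just mirrors what ffmpeg reports for the source stream:

ffmpeg -y -f image2 -start_number 1 -framerate 24 -i input.%04d.jpg -c:v dnxhd -vf "scale=1920:1080,fps=24,format=yuv422p10,colorspace=all=bt709:iall=bt470bg:itrc=srgb" -b:v 175M -c:a pcm_s16le output.mov

I don't know whether declaring the input this way is the intended route for untagged material, though.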


Some docs here: ffmpeg filters trc


Any help much appreciated!


Edit:
Full log:


$ ffmpeg -y -f image2 -start_number 1 -framerate 24 -i input.%04d.jpg -c:v dnxhd -vf "scale=1920:1080,fps=24,format=yuv422p10,colorspace=all=bt709:itrc=srgb" -b:v 175M -c:a pcm_s16le output.mov
ffmpeg version 4.3.1-2020-11-08-essentials_build-www.gyan.dev Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 10.2.0 (Rev3, Built by MSYS2 project)
 configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
 libavutil 56. 51.100 / 56. 51.100
 libavcodec 58. 91.100 / 58. 91.100
 libavformat 58. 45.100 / 58. 45.100
 libavdevice 58. 10.100 / 58. 10.100
 libavfilter 7. 85.100 / 7. 85.100
 libswscale 5. 7.100 / 5. 7.100
 libswresample 3. 7.100 / 3. 7.100
 libpostproc 55. 7.100 / 55. 7.100
Input #0, image2, from 'input.%04d.jpg':
 Duration: 00:00:00.13, start: 0.000000, bitrate: N/A
 Stream #0:0: Video: mjpeg (Baseline), yuvj444p(pc, bt470bg/unknown/unknown), 1920x1080 [SAR 1200:1200 DAR 16:9], 24 fps, 24 tbr, 24 tbn, 24 tbc
Stream mapping:
 Stream #0:0 -> #0:0 (mjpeg (native) -> dnxhd (native))
Press [q] to stop, [?] for help
[swscaler @ 0000026ac0f81600] deprecated pixel format used, make sure you did set range correctly
Output #0, mov, to 'output.mov':
 Metadata:
 encoder : Lavf58.45.100
 Stream #0:0: Video: dnxhd (DNXHD) (AVdn / 0x6E645641), yuv422p10le, 1920x1080 [SAR 1:1 DAR 16:9], q=2-1024, 175000 kb/s, 24 fps, 12288 tbn, 24 tbc
 Metadata:
 encoder : Lavc58.91.100 dnxhd
[Parsed_colorspace_3 @ 0000026ac0f7e080] Unsupported input primaries 2 (unknown)
Error while filtering: Invalid argument
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0
Conversion failed!



-
How can I send a virtual camera to Genymotion or Android Studio Emulator in Ubuntu?
4 December 2020, by ykasur
I created a virtual camera using v4l2loopback and ffmpeg. The command I use for ffmpeg is:
ffmpeg -re -loop 1 -i vin.png -vf format=yuv420p -f v4l2 /dev/video2


vin.png is the image I want to stream to the webcam and /dev/video2 is the virtual webcam I created with v4l2loopback.
The virtual webcam works and I can see it e.g. with onlinemicetest.com/webcam-test.
I'm using the Genymotion emulator with the newest Android API (I tried 7.0, 8.1 and 10.0) on Ubuntu 20.04.
Genymotion detects the virtual camera but only displays a dummy image:
[Screenshot: wrong dummy image shown by Genymotion]
I also tried (and would prefer to use) the Android Studio emulator, but I can only select Webcam0 in the device camera configuration, and that points to the real integrated camera rather than to my virtual webcam.


I don't need to use ffmpeg, but I do need to use a tool that lets me control which image to stream from the command line.


Is there a way to solve this? Many thanks in advance!


Update 17.11.2020:
Genymotion support answered that they plan to support virtual cameras in the future; they might be ready to add this in mid-2021.