
Other articles (99)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below to compare.
    To do this, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013 and it is announced here.
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    As with the previous version, all the software dependencies must be installed manually on the server.
    If you want to use this archive for an installation in farm mode, you will also need to make other modifications (...)

On other sites (11377)

  • How can I send a virtual camera to Genymotion or Android Studio Emulator in Ubuntu?

    4 December 2020, by ykasur

    I created a virtual camera using v4l2loopback and ffmpeg. The command I use for ffmpeg is:
ffmpeg -re -loop 1 -i vin.png -vf format=yuv420p -f v4l2 /dev/video2
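
For reference, a minimal sketch of how the /dev/video2 device itself is typically created before running the command above (this assumes the v4l2loopback kernel module is already installed; video_nr, card_label and exclusive_caps are standard module parameters):

# Create /dev/video2 as a loopback device; exclusive_caps=1 helps applications
# that only list devices advertising video capture detect the virtual camera
sudo modprobe v4l2loopback video_nr=2 card_label="VirtualCam" exclusive_caps=1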

    


    vin.png is the image I want to stream to the webcam and /dev/video2 is the virtual webcam I created with v4l2loopback.
The virtual webcam works and I can see it, e.g. with onlinemicetest.com/webcam-test.
I'm using the Genymotion emulator with the newest Android API (I tried 7.0, 8.1 and 10.0) on Ubuntu 20.04.
Genymotion detects the virtual camera but only displays a dummy image:
[screenshot: wrong dummy image shown by Genymotion]
I also tried (and would prefer to use) the Android Studio emulator, but I can only select Webcam0 in the device camera configuration, and that points to the real integrated camera, not to my virtual webcam.
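
One hedged avenue for the Android Studio emulator side (not verified here; "Pixel_3_API_30" is just a placeholder AVD name): the emulator can list the host webcams it is able to expose and be started with a specific one from the command line.

# List the host webcams the Android emulator can expose to the guest
emulator -webcam-list

# Start an AVD with a chosen host webcam as the back camera
# (webcam1 may or may not map to /dev/video2 on a given machine)
emulator -avd Pixel_3_API_30 -camera-back webcam1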

    


    I don't need to use ffmpeg, but I do need to use a tool that lets me control which image to stream from the command line.

    


    Is there a way to solve this? Many thanks in advance!

    


    Update 17.11.2020:
Genymotion support answered that they plan to support virtual cameras in the future; they might be ready to add this in mid-2021.

    


  • ffmpeg sRGB jpgs to rec709 mov

    9 March 2021, by maxdownunder

    I'm trying to convert an sRGB JPEG sequence into a rec709 DNxHD QuickTime, but I'm struggling to get the color matrix right.

    


    ffmpeg -y -f image2 -start_number 1 -framerate 24 -i input.%04d.jpg -c:v dnxhd -vf "scale=1920:1080,fps=24,format=yuv422p10,colorspace=all=bt709:itrc=srgb" -b:v 175M -c:a pcm_s16le output.mov


    


    My understanding is that this should set everything to rec709 while overriding the input response curve (itrc=srgb), but it fails with

    


    Unsupported input primaries 2 (unknown)
Error while filtering: Invalid argument
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0


    


    I haven't been able to come up with any combination of flags that doesn't throw an error and produces the right colors.
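
One hedged combination worth trying (not verified against this footage): since the JPEGs decode as yuvj444p(pc, bt470bg/unknown/unknown), the colorspace filter has no input primaries to work from, but its input-side options (iall, itrc, irange) allow declaring all input properties explicitly instead of only overriding the transfer curve:

# Sketch: declare the input colour properties so the filter does not have to
# guess the (unknown) primaries reported by the JPEG decoder
ffmpeg -y -f image2 -start_number 1 -framerate 24 -i input.%04d.jpg \
  -vf "scale=1920:1080,fps=24,format=yuv422p10,colorspace=all=bt709:iall=bt601-6-625:itrc=srgb:irange=pc" \
  -c:v dnxhd -b:v 175M output.mov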

    


    Some docs here: ffmpeg filters trc

    


    Any help much appreciated!

    


    Edit:
Full log:

    


    $ ffmpeg -y -f image2 -start_number 1 -framerate 24 -i input.%04d.jpg -c:v dnxhd -vf "scale=1920:1080,fps=24,format=yuv422p10,colorspace=all=bt709:itrc=srgb" -b:v 175M -c:a pcm_s16le output.mov
ffmpeg version 4.3.1-2020-11-08-essentials_build-www.gyan.dev Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 10.2.0 (Rev3, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
  libavutil      56. 51.100 / 56. 51.100
  libavcodec     58. 91.100 / 58. 91.100
  libavformat    58. 45.100 / 58. 45.100
  libavdevice    58. 10.100 / 58. 10.100
  libavfilter     7. 85.100 /  7. 85.100
  libswscale      5.  7.100 /  5.  7.100
  libswresample   3.  7.100 /  3.  7.100
  libpostproc    55.  7.100 / 55.  7.100
Input #0, image2, from 'input.%04d.jpg':
  Duration: 00:00:00.13, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: mjpeg (Baseline), yuvj444p(pc, bt470bg/unknown/unknown), 1920x1080 [SAR 1200:1200 DAR 16:9], 24 fps, 24 tbr, 24 tbn, 24 tbc
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> dnxhd (native))
Press [q] to stop, [?] for help
[swscaler @ 0000026ac0f81600] deprecated pixel format used, make sure you did set range correctly
Output #0, mov, to 'output.mov':
  Metadata:
    encoder         : Lavf58.45.100
    Stream #0:0: Video: dnxhd (DNXHD) (AVdn / 0x6E645641), yuv422p10le, 1920x1080 [SAR 1:1 DAR 16:9], q=2-1024, 175000 kb/s, 24 fps, 12288 tbn, 24 tbc
    Metadata:
      encoder         : Lavc58.91.100 dnxhd
[Parsed_colorspace_3 @ 0000026ac0f7e080] Unsupported input primaries 2 (unknown)
Error while filtering: Invalid argument
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0
Conversion failed!


    


  • Error when converting an MP4 video to HLS using a stream as input in Node.js with ffmpeg

    6 January 2023, by Sankalp Kataria

    I am trying to convert an MP4 file to HLS using ffmpeg.
Code:

    



    // Imports assumed from context: the fluent-ffmpeg wrapper plus Node's fs and path modules
const { createReadStream } = require('fs');
const { join } = require('path');
const ffmpeg = require('fluent-ffmpeg');

// Feed the MP4 to ffmpeg as a readable stream (i.e. via stdin) and save an HLS playlist
var stream = createReadStream(filePath);
ffmpeg(stream)
    .on('stderr', function(stderrLine) {
        console.log('Stderr output: ' + stderrLine);
    })
    .on('end', function() {
        console.log('done processing input stream');
    })
    .on('error', function(err) {
        console.log('an error happened: ' + err.message);
    })
    .save(join(__basedir, "public", `file.m3u8`));


    



    Output:

    



    Stderr output: ffmpeg version git-2020-05-22-38490cb Copyright (c) 2000-2020 the FFmpeg developers
Stderr output:   built with gcc 9.3.1 (GCC) 20200513
Stderr output:   configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
Stderr output:   libavutil      56. 46.100 / 56. 46.100
Stderr output:   libavcodec     58. 86.101 / 58. 86.101
Stderr output:   libavformat    58. 43.100 / 58. 43.100
Stderr output:   libavdevice    58.  9.103 / 58.  9.103
Stderr output:   libavfilter     7. 82.100 /  7. 82.100
Stderr output:   libswscale      5.  6.101 /  5.  6.101
Stderr output:   libswresample   3.  6.100 /  3.  6.100
Stderr output:   libpostproc    55.  6.100 / 55.  6.100
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] overread end of atom 'stsd' by 34 bytes
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] stream 0, offset 0x30: partial file
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none, 1920x1080, 2528 kb/s): unspecified pixel format
Stderr output: Consider increasing the value for the 'analyzeduration' and 'probesize' options
Stderr output: Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:0':
Stderr output:   Metadata:
Stderr output:     major_brand     : isom
Stderr output:     minor_version   : 512
Stderr output:     compatible_brands: isomiso2avc1mp41
Stderr output:     encoder         : Lavf56.25.101
Stderr output:   Duration: 00:03:00.97, start: 0.000000, bitrate: N/A
Stderr output:     Stream #0:0(und): Video: h264 (avc1 / 0x31637661), none, 1920x1080, 2528 kb/s, 23.98 fps, 23.98 tbr, 24k tbn, 48k tbc (default)
Stderr output:     Metadata:
Stderr output:       handler_name    : VideoHandler
Stderr output:     Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
Stderr output:     Metadata:
Stderr output:       handler_name    : SoundHandler
Stderr output: Stream mapping:
Stderr output:   Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Stderr output:   Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Stderr output: [mov,mp4,m4a,3gp,3g2,mj2 @ 000001b68c53cb00] stream 0, offset 0x30: partial file
Stderr output: pipe:0: Invalid data found when processing input
Stderr output: Cannot determine format of input stream 0:0 after EOF
Stderr output: Error marking filters as finished
Stderr output: Conversion failed!
Stderr output:
an error happened: ffmpeg exited with code 1: pipe:0: Invalid data found when processing input
Cannot determine format of input stream 0:0 after EOF
Error marking filters as finished
Conversion failed!


    



    I've also tried the movflags option .outputOptions("-movflags isml+frag_keyframe"), and also with -movflags faststart.

    



    I've read through
How do you use Node.js to stream an MP4 file with ffmpeg?

    



    But I didn't quite understand what to do or how to do it.
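
A hedged reading of the log above: the "stream 0, offset 0x30: partial file" and "unspecified pixel format" messages usually mean the MP4's moov atom sits at the end of the file, which ffmpeg cannot seek to when the input arrives on a non-seekable pipe (pipe:0). Passing the file path to the wrapper instead of createReadStream(filePath), or remuxing the source once with faststart, are common workarounds; a sketch of the equivalent command-line steps (file names are placeholders):

# Remux once so the moov atom is moved to the front, making the file pipe-friendly
ffmpeg -i input.mp4 -c copy -movflags +faststart faststart.mp4

# Convert to HLS reading from the file directly (no pipe involved)
ffmpeg -i faststart.mp4 -c:v libx264 -c:a aac -hls_time 6 -hls_list_size 0 public/file.m3u8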