Other articles (98)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it is activated, MediaSPIP init automatically sets up a preconfiguration so that the new feature works right away. It is therefore not necessary to go through a configuration step for this.

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, beyond those of the individual channels, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match the chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

On other sites (8333)

  • NodeJS - How to pipe the same video stream to multiple clients?

    30 August 2013, by SergioBR

    We are trying to serve a video stream to multiple clients.

    Since the HTML5 video tag does not support UDP multicast, we are trying to reuse an already converted ffmpeg stream and send it to more than one response. But that does not work.

    The first response gets the stream fine, but the second one does not.
    It seems that the stream cannot be piped to another response, nor can it be cloned.

    Has anyone done this before? Any ideas? (A sketch of a common workaround follows the code below.)

    Thanks in advance!

    Here's the code:

    var request = require('request');
    var http = require('http');
    var child_process = require("child_process");
    var n = 1;
    var stdouts = {};

    http.createServer(function (req, resp) {

      console.log("***** url [" + req.url + "], call " + n);

      if (req.url != "/favicon.ico" && req.url != "/") {
        var params = req.url.substring(1).split("/");

        switch (params[0]) {
          case "VIEW":
            if (params[1] == "C2FLOOR1" || params[1] == "C2FLOOR2" || params[1] == "C2PORFUN" || params[1] == "C2TESTCAM")
              var camera = "rtsp://192.168.16.19:554/Inter/Cameras/Stream?Camera=" + params[1];
            else
              var camera = "http://192.168.16.19:8609/Inter/Cameras/GetStream?Camera=" + params[1];

            // Write header
            resp.writeHead(200, {'Content-Type': 'video/ogg', 'Connection': 'keep-alive'});

            if (stdouts.hasOwnProperty(params[1])) {
              console.log("Getting stream already created for camera " + params[1]);

              var newStdout = Object.create(stdouts[params[1]]);

              newStdout.pipe(resp);
            } else {
              // Start ffmpeg
              var ffmpeg = child_process.spawn("ffmpeg", [
                "-i", camera,
                "-vcodec", "libtheora",
                "-qscale:v", "7",       // video quality
                "-f", "ogg",            // file format
                "-g", "1",              // GOP (Group Of Pictures) size
                "-"                     // output to STDOUT
              ]);

              stdouts[params[1]] = ffmpeg.stdout;

              // Pipe the video output to the client response
              ffmpeg.stdout.pipe(resp);

              console.log("Initializing camera at " + camera);
            }

            // Kill the subprocesses when client disconnects
            /*
            resp.on("close", function () {
              ffmpegs[params[1]].kill();
              console.log("FIM!");
            });
            */
            break;
        }
      } else {
        resp.writeHeader(200, {"Content-Type": "text/html"});
        resp.write("WRONG CALL");
        resp.end();
      }
      n++;

    }).listen(8088);

    console.log('Server running at port 8088');
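
    For reference, here is a minimal sketch of the workaround usually suggested for this pattern. It is not the original code: routing is simplified, the camera URL is the one from the question, and the helper name is made up. The idea is to keep one ffmpeg process per camera and give each connecting client its own PassThrough stream piped from the shared stdout, so that closing one response does not tear down the others.

    var http = require('http');
    var stream = require('stream');
    var child_process = require('child_process');

    var ffmpegs = {};   // camera name -> spawned ffmpeg process

    // Hypothetical helper: spawn ffmpeg once per camera and reuse it.
    function getFfmpeg(camera, url) {
      if (!ffmpegs[camera]) {
        ffmpegs[camera] = child_process.spawn('ffmpeg', [
          '-i', url, '-vcodec', 'libtheora', '-qscale:v', '7',
          '-f', 'ogg', '-g', '1', '-'
        ]);
      }
      return ffmpegs[camera];
    }

    http.createServer(function (req, resp) {
      var camera = req.url.substring(1);   // simplified routing, e.g. /C2FLOOR1
      var source = getFfmpeg(camera, 'rtsp://192.168.16.19:554/Inter/Cameras/Stream?Camera=' + camera);

      resp.writeHead(200, {'Content-Type': 'video/ogg'});

      // Each client gets its own PassThrough fed from the shared stdout.
      var tap = new stream.PassThrough();
      source.stdout.pipe(tap);
      tap.pipe(resp);

      resp.on('close', function () {
        source.stdout.unpipe(tap);         // detach only this client
      });
    }).listen(8088);

    Note that this only addresses the piping: a client that joins mid-stream still misses the Ogg/Theora headers, so playback may not start cleanly without some per-client remuxing or segmenting.
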
  • ffmpeg on Android: playing an MPEG2 TS UDP multicast stream

    27 August 2013, by user1513822

    I want to build an Android media player using ffmpeg
    (catch an MPEG2 TS multicast stream over a WiFi network and decode it).
    I have checked the following:

    • My iptime AP supports WiFi multicast.
      (a wired PC sends the multicast stream, and a WiFi-connected PC can receive it)
    • My Android phone can receive the multicast stream over WiFi.
      I wrote NDK socket code that joins the UDP multicast group and receives packets
      (I added the multicast permission to AndroidManifest.xml).
    • The FFMPEG library is ported to Android and it can play a local media file.

    But when I try to open the network stream with the FFMPEG library, avformat_open_input() fails.

    gFormatCtx = avformat_alloc_context();
    av_register_all();
    avcodec_register_all();
    avformat_network_init();
    if (avformat_open_input(&gFormatCtx, "udp://@239.100.100.100:4000", NULL, NULL) != 0)
        return -2;

    This code always returns -2.
    If I use the av_dict_set() API, which option should I use?

    av_dict_set(&options, "udp_multicast", "mpegtsraw", 0);

    Please let me know what I should check when avformat_open_input() fails.

    Thanks.
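
    For reference, a minimal sketch of how options are usually passed to avformat_open_input() for a multicast MPEG-TS input. This is an assumption-laden example, not the asker's code: "fifo_size" and "overrun_nonfatal" are FFmpeg UDP protocol options, and forcing the "mpegts" demuxer assumes the stream really is plain MPEG-TS.

    #include <libavformat/avformat.h>

    /* Sketch: open a UDP multicast MPEG-TS input with explicit options. */
    int open_multicast_input(AVFormatContext **ctx, const char *url)
    {
        AVDictionary *options = NULL;
        AVInputFormat *mpegts = NULL;
        int ret;

        av_register_all();
        avformat_network_init();

        /* UDP protocol options: larger receive FIFO, tolerate overruns. */
        av_dict_set(&options, "fifo_size", "1000000", 0);
        av_dict_set(&options, "overrun_nonfatal", "1", 0);

        /* Force the MPEG-TS demuxer instead of relying on probing. */
        mpegts = av_find_input_format("mpegts");

        ret = avformat_open_input(ctx, url, mpegts, &options);
        av_dict_free(&options);
        return ret;   /* 0 on success, a negative AVERROR code on failure */
    }

    Two other things worth checking: avformat_open_input() returns a negative AVERROR code rather than just "non-zero", so printing it with av_strerror() narrows the cause down; and an FFmpeg build configured with --disable-network cannot open udp:// URLs at all.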

  • Convert YUV to MP4 with ffmpeg on Android

    10 August 2013, by worldask

    I have to convert YUV to MP4 with ffmpeg on Android. When I convert WAV to MP4 it works well, but when I convert YUV, or YUV + WAV, to MP4, I get an error message saying

    Error decoding AAC frame header

    Does anybody know what happened?

    The full debug log follows:

    transferYUV2MP4() enter
    __transfer_yuv_to_mp4() enter
    __transfer_yuv_to_mp4() argv[00/17] = ffmpeg
    __transfer_yuv_to_mp4() argv[01/17] = -loglevel
    __transfer_yuv_to_mp4() argv[02/17] = debug
    __transfer_yuv_to_mp4() argv[03/17] = -y
    __transfer_yuv_to_mp4() argv[04/17] = -i
    __transfer_yuv_to_mp4() argv[05/17] = /sdcard/111.yuv
    __transfer_yuv_to_mp4() argv[06/17] = -i
    __transfer_yuv_to_mp4() argv[07/17] = /sdcard/3.wav
    __transfer_yuv_to_mp4() argv[08/17] = -c:a
    __transfer_yuv_to_mp4() argv[09/17] = aac
    __transfer_yuv_to_mp4() argv[10/17] = -strict
    __transfer_yuv_to_mp4() argv[11/17] = experimental
    __transfer_yuv_to_mp4() argv[12/17] = -b:a
    __transfer_yuv_to_mp4() argv[13/17] = 56k
    __transfer_yuv_to_mp4() argv[14/17] = -preset
    __transfer_yuv_to_mp4() argv[15/17] = ultrafast
    __transfer_yuv_to_mp4() argv[16/17] = /sdcard/111.mp4
    __run_ffmpeg_main() enter
    __run_ffmpeg_main() handle=0xb000f7f8
    __run_ffmpeg_main() dlfunc=0x4b5a2728
    ffmpeg version 1.2.2
    Copyright (c) 2000-2013 the FFmpeg developers
     built on Aug 10 2013 16:34:45 with gcc 4.6 (GCC) 20120106 (prerelease)
     configuration: --target-os=linux --prefix=./android/armv7-a --sysroot=/Users/pht/android/ndks/android-ndk-r9/platforms/android-8/arch-arm/ --enable-gpl --enable-version3 --disable-shared --enable-static --disable-ffprobe --disable-ffplay --disable-ffserver --disable-network --enable-avformat --enable-avcodec --enable-cross-compile --arch=arm --cc=/Users/pht/android-standalone-toolchain/bin/arm-linux-androideabi-gcc --nm=/Users/pht/android-standalone-toolchain/bin/arm-linux-androideabi-nm --cross-prefix=/Users/pht/android-standalone-toolchain/bin/arm-linux-androideabi- --extra-cflags=' -I../fdk-aac/include -I../x264 -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 -mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a ' --extra-ldflags=' -L../fdk-aac/lib -L../x264 -Wl,-rpath-link=/Users/pht/android/ndks/android-ndk-r9/platforms/android-8/arch-arm//usr/lib -L/Users/pht/android/ndks/android-ndk-r9/platforms/andr
     libavutil      52. 18.100 / 52. 18.100
     libavcodec     54. 92.100 / 54. 92.100
     libavformat    54. 63.104 / 54. 63.104
     libavdevice    54.  3.103 / 54.  3.103
     libavfilter     3. 42.103 /  3. 42.103
     libswscale      2.  2.100 /  2.  2.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  2.100 / 52.  2.100
    Splitting the commandline.
    Reading option '-loglevel' ...
    matched as option 'loglevel' (set libav* logging level) with argument 'debug'.
    Reading option '-y' ...
    matched as option 'y' (overwrite output files) with argument '1'.
    Reading option '-i' ...
    matched as input file with argument '/sdcard/111.yuv'.
    Reading option '-i' ...
    matched as input file with argument '/sdcard/3.wav'.
    Reading option '-c:a' ...
    matched as option 'c' (codec name) with argument 'aac'.
    Reading option '-strict' ...
    matched as AVOption 'strict' with argument 'experimental'.
    Reading option '-b:a' ...
    matched as option 'b' (video bitrate (please use -b:v)) with argument '56k'.
    Reading option '-preset' ...
    matched as AVOption 'preset' with argument 'ultrafast'.
    Reading option '/sdcard/111.mp4' ...
    matched as output file.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option loglevel (set libav* logging level) with argument debug.
    Applying option y (overwrite output files) with argument 1.
    Successfully parsed a group of options.
    Parsing a group of options: input file /sdcard/111.yuv.
    Successfully parsed a group of options.
    Opening an input file: /sdcard/111.yuv.
    Format aac detected only with low score of 1, misdetection possible!
    File position before avformat_find_stream_info() is 0
    get_buffer() failed
    Error decoding AAC frame header.
    channel element 2.12 is not allocated
    More than one AAC RDB per ADTS frame is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.
    channel element 3.4 is not allocated
    channel element 2.2 is not allocated
    Number of scalefactor bands in group (44) exceeds limit (40).
    channel element 2.10 is not allocated
    channel element 1.15 is not allocated
    channel element 3.6 is not allocated
    channel element 2.0 is not allocated
    channel element 3.3 is not allocated
    Sample rate index in program config element does not match the sample rate index configured by the container.
    channel element 2.8 is not allocated
    Sample rate index in program config element does not match the sample rate index configured by the container.
    channel element 3.2 is not allocated
    Reserved bit set.
    channel element 2.6 is not allocated
    channel element 2.1 is not allocated
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP

    and the "Dependent coupling..." line repeats thousands of times.
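
    For reference, the log line "Format aac detected only with low score of 1, misdetection possible!" points at the root cause: /sdcard/111.yuv is raw video with no header, so ffmpeg probes it and misdetects it as AAC, and the flood of AAC errors comes from decoding YUV bytes as audio. The demuxer, pixel format, frame size and frame rate have to be stated before the first -i. A hedged sketch of a corrected argument list for the JNI wrapper follows; the pixel format, frame size and frame rate are assumptions about the capture, not values from the log, and -c:v libx264 assumes x264 was enabled in this build (the configure line above is truncated, so that is not certain).

    /* Sketch of a corrected argument list for the JNI wrapper.
     * Values marked "assumed" must match how 111.yuv was captured. */
    static const char *argv_fixed[] = {
        "ffmpeg", "-loglevel", "debug", "-y",
        "-f", "rawvideo",            /* the first input is raw frames, not a container */
        "-pix_fmt", "yuv420p",       /* assumed pixel format */
        "-s", "640x480",             /* assumed frame size   */
        "-r", "25",                  /* assumed frame rate   */
        "-i", "/sdcard/111.yuv",
        "-i", "/sdcard/3.wav",
        "-c:v", "libx264",           /* assumed encoder; -preset only applies to x264 here */
        "-preset", "ultrafast",
        "-c:a", "aac", "-strict", "experimental", "-b:a", "56k",
        "/sdcard/111.mp4"
    };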