
Other articles (97)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, MediaSPIP init automatically applies a preconfiguration so that the new feature is immediately operational; it is therefore not necessary to go through a configuration step for this.

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, beyond those used by the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance at user sign-up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (6263)

  • android ffmpeg error fatal signal 11

    10 December 2014, by AKUMA

    I'm using ffmpeg on Android with an RTSP webcam, following an existing example.

    The video plays back fine, but I get an error when the video ends:

    Fatal signal 11 (SIGSEGV) at 0x8a7c0d04 (code=1), thread 2537 (Thread-13225)

    Using the log I found where the error occurs, but I don't know why. The code below shows the crash location:

    void closeMovie() {
        av_read_pause(gFormatCtx);
        if (gVideoBuffer != NULL) {
            free(gVideoBuffer);
            gVideoBuffer = NULL;
        }
        if (gFrame != NULL)
            av_free(gFrame);
        if (gFrameRGB != NULL)
            av_free(gFrameRGB);
        if (gVideoCodecCtx != NULL) {
            avcodec_close(gVideoCodecCtx);
            av_free(gVideoCodecCtx);
            gVideoCodecCtx = NULL;
        }
        if (gFormatCtx != NULL) {
            av_close_input_file(gFormatCtx); // crashes here
            gFormatCtx = NULL;
        }
    }

    and

    void av_close_input_file(AVFormatContext *s)
    {
        AVIOContext *pb = (s->iformat->flags & AVFMT_NOFILE) || (s->flags & AVFMT_FLAG_CUSTOM_IO) ?
                          NULL : s->pb;
        av_close_input_stream(s); // crashes here
        if (pb)
            avio_close(pb);
    }

    and

    void av_close_input_stream(AVFormatContext *s)
    {
        flush_packet_queue(s);
        if (s->iformat->read_close)
            s->iformat->read_close(s); // crashes here
        avformat_free_context(s);
    }

    If I comment out s->iformat->read_close(s); it does not crash, but then I cannot open the video a second time; to open it again I have to quit the app.

    Sorry, I can’t speak English very well.
    Thanks for reading.

    PS1: LogCat output

    12-10 10:08:02.991: D/TAG(21770): TAG - closeMovie1 - in
    12-10 10:08:02.991: D/TAG(21770): TAG - avcodec_close - in
    12-10 10:08:02.991: D/TAG(21770): av_opt_free - on
    12-10 10:08:02.991: D/TAG(21770): av_opt_free - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avcodec_close - out
    12-10 10:08:02.991: D/TAG(21770): TAG - av_close_input_file - in
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_close_input - in
    12-10 10:08:02.991: D/TAG(21770): TAG - flush_packet_queue -
    12-10 10:08:02.991: D/TAG(21770): flush_packet_queue - on
    12-10 10:08:02.991: D/TAG(21770): flush_packet_queue - out
    12-10 10:08:02.991: D/TAG(21770): TAG - s->iformat->read_close : 0
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_free_context
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_free_context - in
    12-10 10:08:02.991: D/TAG(21770): av_opt_free - on
    12-10 10:08:02.991: D/TAG(21770): av_opt_free - out
    12-10 10:08:02.991: D/TAG(21770): av_opt_free - on
    12-10 10:08:02.991: D/TAG(21770): av_opt_free - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_length s->nb_streams - 3
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_nb_streams - in
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_ nb_streams - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_nb_streams - in
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_ nb_streams - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_nb_streams - in
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_ nb_streams - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_for_length s->nb_programs-1 - -1
    12-10 10:08:02.991: D/TAG(21770): TAG - freep 1
    12-10 10:08:02.991: D/TAG(21770): TAG - freep 2
    12-10 10:08:02.991: D/TAG(21770): TAG - while
    12-10 10:08:02.991: D/TAG(21770): TAG - freep 3
    12-10 10:08:02.991: D/TAG(21770): TAG - dict_free 1
    12-10 10:08:02.991: D/TAG(21770): TAG - freep 4
    12-10 10:08:02.991: D/TAG(21770): TAG - freep 3
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_free_context - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - in
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - 1
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - 2
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close(pb) : 0
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - in
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - 1
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - 2
    12-10 10:08:02.991: D/TAG(21770): TAG - avio_close - out
    12-10 10:08:02.991: D/TAG(21770): TAG - avformat_close_input - out
    12-10 10:08:02.991: D/TAG(21770): TAG - av_close_input_file - out
    12-10 10:08:02.991: D/TAG(21770): TAG - closeMovie1 - out
    12-10 10:08:03.151: D/AbsListView(21770): onVisibilityChanged() is called, visibility : 0
    12-10 10:08:03.151: D/AbsListView(21770): unregisterIRListener() is called
    12-10 10:08:03.201: D/AbsListView(21770): unregisterIRListener() is called
    12-10 10:08:03.221: D/AbsListView(21770): unregisterIRListener() is called
    12-10 10:08:03.241: A/libc(21770): Fatal signal 11 (SIGSEGV) at 0x8bb85698 (code=1), thread 21938 (Thread-23561)

    PS2: The call order in the log:

    (BasicPlayer.c) closeMovie in->
    (utils.c) avcodec_close ->
    (utils.c) av_close_input_file ->
    (utils.c) avformat_close_input ->
    (utils.c) flush_packet_queue ->
    (utils.c) s->iformat->read_close(s) ->
    (utils.c) avformat_free_context ->
    (utils.c) avio_close(pb) ->
    (BasicPlayer.c) closeMovie out
  • Live video streaming with Node js ,HTML5, MPEG-DASH, FFMPEG and IP camera/Raspberry Pi

    12 May 2016, by sparks

    I am a programmer, but I am very new to live video streaming concepts, and I need help.

    What I want to accomplish

    I want to develop an online live video streaming system. The scenario: I would have one device or a number of devices (a Raspberry Pi with a camera, or an IP camera only; not sure yet) to capture video and stream it live, in real time, to my web app, where multiple clients can connect and watch the stream. The key point is that these devices should be wireless (able to connect to the internet and live-stream their content), and I want to eliminate manually configuring the IP address on the local Wi-Fi router. So, simply: I turn the device on and it starts streaming to the web app right away.

    Infrastructure, platforms, browsers, streaming methods and formats

    In the beginning I just want to stream to the Chrome web browser (that's all I care about), but in the future I will build Android and iOS mobile apps. So long term I expect to target Chrome and mobile (Android and iOS platforms).
    Based on my research, I learned that the client should be HTML5, the streaming method MPEG-DASH (adding HLS in the future), and that the web app will be in Node.js. I also came across Dash.js for HTML5.

    My understanding of streaming based on my research

    I also came across things like FFmpeg, DASH encoders, and Wowza, which I am not clear about. Now correct me if I am wrong: my understanding is that FFmpeg gets hold of the device/camera and its content (I am not sure of the content's format at this point) and formats it (I am not sure what this means in simple English); then the DASH encoder picks it up and re-packages the content into MPEG-DASH format, which produces an MPD; and then the Dash.js client uses the MPD to display the video in the browser.
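    For what it's worth, in recent ffmpeg builds that packaging step is built in: the dash muxer can encode a feed and emit the MPD manifest plus segments in one command, with no separate DASH encoder. A minimal sketch under assumptions (the RTSP source URL, output directory, and encoder settings below are placeholders, not from the question):

```shell
# Sketch only: one ffmpeg process encodes a camera feed and writes DASH
# segments plus the MPD manifest that a Dash.js client would fetch.
# rtsp://camera.local/stream and /var/www/live/ are assumed placeholder values.
ffmpeg -i rtsp://camera.local/stream \
  -c:v libx264 -b:v 1200k -preset veryfast -g 50 -an \
  -f dash -seg_duration 4 -window_size 10 -remove_at_exit 1 \
  /var/www/live/manifest.mpd
```

    The manifest directory would then be served by the Node.js app as static files, which is all Dash.js needs on the client side.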

    QUESTIONS

    1. First, correct me or clarify things for me if my understanding above is
      wrong. Also, I am not sure where the Wowza streaming engine comes into
      play. Do I even need it?

    2. I am not sure which devices to use: a Raspberry Pi with the camera
      module, or a Wi-Fi IP camera by itself. I know that with a Raspberry Pi
      connected to the internet you can set up all the necessary programs and
      stream the video to the web app directly (not sure about quality and
      performance), but I am not sure about the Wi-Fi camera. Is it possible
      to connect to the Wi-Fi camera remotely from the web app
      programmatically, without opening the Wi-Fi router portal manually, or
      should I stick with the Raspberry Pi?

    3. For the Raspberry Pi, would I be able to connect a high-quality IP
      camera/webcam to it? (The point here is to get the best possible
      picture through the Raspberry Pi.)

      My expectations

      Better performance and quality would be great, but I know live streaming is not easy, so I am willing to compromise on performance up to a point, but not on quality.

    Thank you in advance; anything will be appreciated. I know this is a lot, so take your time :)
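    On the capture side of question 2, one common Raspberry Pi pattern is to let the camera hardware produce H.264 and have ffmpeg only repackage it, so the Pi never re-encodes. A sketch under assumptions (the raspivid flags, resolution, bitrate, and paths are illustrative, not from the question):

```shell
# Sketch: raspivid emits raw H.264 from the Pi camera module; ffmpeg copies
# the stream unchanged (-c:v copy, so no CPU-heavy re-encode on the Pi) and
# packages it as MPEG-DASH. All values here are assumptions.
raspivid -t 0 -w 1280 -h 720 -fps 25 -b 2000000 -o - \
  | ffmpeg -f h264 -framerate 25 -i - \
      -c:v copy -f dash -seg_duration 4 -window_size 10 \
      /var/www/live/manifest.mpd
```

    The design point is that stream copy keeps the Pi's CPU load near zero, which matters for the quality-over-performance trade-off described above.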

  • How to encode multi audio streams with different options at the same time using ffmpeg

    20 March 2012, by Luca Borrione

    I'm trying to encode a DVD using ffmpeg.

    $ffmpeg -i VTS_01_1.VOB
    Input #0, mpeg, from 'VTS_01_1.VOB':
    Duration: 00:38:06.52, start: 0.287267, bitrate: 3756 kb/s
    Stream #0:0[0x1e0]: Video: mpeg2video (Main), yuv420p, 720x576 [SAR 64:45 DAR 16:9], 9800 kb/s, 25 fps, 25 tbr, 90k tbn, 50 tbc
    Stream #0:1[0x80]: Audio: ac3, 48000 Hz, 5.1(side), s16, 384 kb/s
    Stream #0:2[0x81]: Audio: ac3, 48000 Hz, 5.1(side), s16, 384 kb/s
    Stream #0:3[0x82]: Audio: ac3, 48000 Hz, 5.1(side), s16, 384 kb/s
    Stream #0:4[0x83]: Audio: ac3, 48000 Hz, mono, s16, 96 kb/s
    Stream #0:5[0x28]: Subtitle: dvd_subtitle
    Stream #0:6[0x29]: Subtitle: dvd_subtitle
    Stream #0:7[0x23]: Subtitle: dvd_subtitle
    Stream #0:8[0x24]: Subtitle: dvd_subtitle
    Stream #0:9[0x26]: Subtitle: dvd_subtitle
    Stream #0:10[0x27]: Subtitle: dvd_subtitle

    From the above streams I'm interested in keeping two audio streams: number 1 and number 4.
    As you can see, number 4 is already 96 kbps, so I tried to run a command that would treat the two streams differently:

    cat VTS_01_1.VOB | nice ffmpeg -i - -s 640x368 -vcodec libtheora -r 25 -b:v 1200k -an -metadata title="My Title" -pass 1 -passlogfile "/media/data/outputlog" -f ogg -y /dev/null

    cat VTS_01_1.VOB | nice ffmpeg -i - -map 0:0 -s 640x368 -vcodec libtheora -r 25 -b:v 1200k -async 1 -metadata title="My Title" -map 0:1 -acodec libvorbis -ac 6 -ar 48000 -b:a 192k -metadata title="english" -map 0:4 -acodec libvorbis -ac 2 -ar 48000 -b:a 96k -metadata title="commented" -pass 2 -passlogfile "/media/data/outputlog" "/media/data/output.ogv"

    What I would like to obtain is :

    Input #0, ogg, from 'output.ogv':
    Duration: 00:38:07.20, start: 0.000000, bitrate: 1360 kb/s
    Stream #0:0: Video: theora, yuv420p, 640x368 [SAR 46:45 DAR 16:9], 25 fps, 25 tbr, 25 tbn, 25 tbc
    Stream #0:1: Audio: vorbis, 48000 Hz, stereo, s16, 192 kb/s
    Stream #0:2: Audio: vorbis, 48000 Hz, stereo, s16, 96 kb/s

    Instead, with the above command I obtain:

    Input #0, ogg, from 'output.ogv':
    Duration: 00:38:07.20, start: 0.000000, bitrate: 1360 kb/s
    Stream #0:0: Video: theora, yuv420p, 640x368 [SAR 46:45 DAR 16:9], 25 fps, 25 tbr, 25 tbn, 25 tbc
    Stream #0:1: Audio: vorbis, 48000 Hz, stereo, s16, 96 kb/s
    Stream #0:2: Audio: vorbis, 48000 Hz, stereo, s16, 96 kb/s

    So how can I specify different parameters for multiple audio streams?

    By the way: I'm on Lubuntu Oneiric with the latest ffmpeg from git:

    ffmpeg version git-2012-03-05-1007a80 Copyright (c) 2000-2012 the FFmpeg developers
    built on Mar  5 2012 09:40:09 with gcc 4.6.1
    configuration: --enable-gpl --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libtheora --enable-libvorbis --enable-libx264 --enable-nonfree --enable-version3 --enable-x11grab --enable-libxvid --enable-libvpx
    libavutil      51. 41.100 / 51. 41.100
    libavcodec     54.  8.100 / 54.  8.100
    libavformat    54.  2.100 / 54.  2.100
    libavdevice    53.  4.100 / 53.  4.100
    libavfilter     2. 63.100 /  2. 63.100
    libswscale      2.  1.100 /  2.  1.100
    libswresample   0.  7.100 /  0.  7.100
    libpostproc    52.  0.100 / 52.  0.100

    Therefore the newaudio option is no longer recognized.
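    For the record, a hedged sketch of how a reasonably recent ffmpeg handles this (the flags below are not from the original post): per-stream output options accept stream specifiers such as `:a:0` and `:a:1`, so the second `-b:a` no longer overwrites the first.

```shell
# Sketch, assuming a recent ffmpeg: :a:0 and :a:1 address the first and
# second *output* audio streams, so each gets its own bitrate, channel
# count, and title metadata instead of the last option winning for both.
ffmpeg -i VTS_01_1.VOB \
  -map 0:0 -map 0:1 -map 0:4 \
  -c:v libtheora -s 640x368 -r 25 -b:v 1200k \
  -c:a libvorbis \
  -b:a:0 192k -ac:a:0 6 -metadata:s:a:0 title="english" \
  -b:a:1 96k  -ac:a:1 2 -metadata:s:a:1 title="commented" \
  output.ogv
```

    The same specifier syntax works in a two-pass setup: the per-stream audio options simply replace the repeated `-acodec`/`-b:a` pairs in the second-pass command.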