Other articles (68)

  • MediaSPIP Core: Configuration

    9 November 2010

    MediaSPIP Core provides three different configuration pages by default (these pages rely on the CFG configuration plugin to work): a page specific to the general configuration of the skeleton; a page specific to the configuration of the site's home page; a page specific to the configuration of the site's sections.
    It also provides an additional page, which only appears when certain plugins are enabled, for controlling their specific display options and features (...)

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011

    The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to modify certain user-related behaviours (refer to its documentation for more information).
    It is also possible to add fields to authors by installing the plugins champs extras 2 and Interface pour champs extras.

  • Authorizations overridden by plugins

    27 April 2010

    MediaSPIP Core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

On other sites (12338)

  • Getting 'av_interleaved_write_frame(): Broken pipe' error

    2 July 2016, by Poornan

    I am following this blog post. I am new to Python, NumPy, and FFmpeg, and I could not figure out what causes this issue.

    http://zulko.github.io/blog/2013/09/27/read-and-write-video-frames-in-python-using-ffmpeg/

    Here is the code:

    import subprocess as sp
    import numpy

    print("Hello World!")
    FFMPEG_BIN = "ffmpeg"  # on Linux and Mac OS
    #FFMPEG_BIN = "ffmpeg.exe" # on Windows

    command = [ FFMPEG_BIN,
               '-i', '/Users/eananthaneshan/Movies/myvideo.mp4',
               '-f', 'image2pipe',
               '-pix_fmt', 'yuv444p',
               '-s', '420x360',
               '-r', '24',
               '-vcodec', 'h264', '-']
    pipe = sp.Popen(command, stdout = sp.PIPE, bufsize=-1)

    # read 420*360*3 bytes (= 1 frame)
    raw_image = pipe.stdout.read(420*360*3)
    # transform the byte read into a numpy array
    image = numpy.fromstring(raw_image, dtype='uint8')
    image = image.reshape((360, 420, 3))
    # throw away the data in the pipe's buffer.
    pipe.stdout.flush()

    This is the output:

    ffmpeg version 2.8 Copyright (c) 2000-2015 the FFmpeg developers
     built with Apple LLVM version 7.0.0 (clang-700.0.72)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/2.8 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-opencl --enable-libx264 --enable-libmp3lame --enable-libvo-aacenc --enable-libxvid --enable-libfreetype --enable-libtheora --enable-libvorbis --enable-libvpx --enable-librtmp --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libass --enable-ffplay --enable-libspeex --enable-libschroedinger --enable-libfdk-aac --enable-libopus --enable-frei0r --enable-libopenjpeg --disable-decoder=jpeg2000 --extra-cflags='-I/usr/local/Cellar/openjpeg/1.5.2_1/include/openjpeg-1.5 ' --enable-nonfree --enable-vda
     libavutil      54. 31.100 / 54. 31.100
     libavcodec     56. 60.100 / 56. 60.100
     libavformat    56. 40.101 / 56. 40.101
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 40.101 /  5. 40.101
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  2.101 /  1.  2.101
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/Users/eananthaneshan/Movies/myvideo.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 1
       compatible_brands: isom
       creation_time   : 2014-04-07 02:02:31
     Duration: 01:00:10.03, start: 0.000000, bitrate: 1024 kb/s
       Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 111 kb/s (default)
       Metadata:
         creation_time   : 2014-04-07 02:02:31
         handler_name    : GPAC ISO Audio Handler
       Stream #0:1(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709/unknown/unknown), 720x404, 909 kb/s, 23.98 fps, 23.98 tbr, 24k tbn, 47.95 tbc (default)
       Metadata:
         creation_time   : 2014-04-07 01:14:58
         handler_name    : L-SMASH Video Media Handler
         encoder         : AVC Coding
    Output #0, image2pipe, to 'pipe:':
     Metadata:
       major_brand     : isom
       minor_version   : 1
       compatible_brands: isom
       encoder         : Lavf56.40.101
       Stream #0:0(und): Video: rawvideo (444P / 0x50343434), yuv444p, 420x360, q=2-31, 200 kb/s, 23.98 fps, 23.98 tbn, 23.98 tbc (default)
       Metadata:
         creation_time   : 2014-04-07 01:14:58
         handler_name    : L-SMASH Video Media Handler
         encoder         : Lavc56.60.100 rawvideo
    Stream mapping:
     Stream #0:1 -> #0:0 (h264 (native) -> rawvideo (native))
    Press [q] to stop, [?] for help
    av_interleaved_write_frame(): Broken pipe
    frame=    2 fps=0.0 q=-0.0 Lsize=     886kB time=00:00:00.08 bitrate=87003.8kbits/s    
    video:886kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000000%
    Conversion failed!
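
    A likely explanation, judging from the blog post being followed: the blog decodes to raw frames, while the command above requests '-vcodec h264', so stdout carries compressed H.264 data whose size per frame is not fixed; once the reader stops consuming bytes, ffmpeg gets the broken pipe. The sketch below shows the raw-frame variant under those assumptions (the file path and 420x360 size come from the question; `raw_frame_command` and `read_frame` are hypothetical helper names, and ffmpeg is assumed to be on PATH); it is an illustration, not a verified fix.

```python
import subprocess as sp
import numpy

FFMPEG_BIN = "ffmpeg"  # "ffmpeg.exe" on Windows
WIDTH, HEIGHT = 420, 360
BYTES_PER_FRAME = WIDTH * HEIGHT * 3  # rgb24: 3 bytes per pixel

def raw_frame_command(path, width=WIDTH, height=HEIGHT, fps=24):
    """Build an ffmpeg command that writes fixed-size raw RGB frames to stdout."""
    return [FFMPEG_BIN,
            '-i', path,
            '-f', 'image2pipe',
            '-pix_fmt', 'rgb24',      # fixed-size frames, unlike h264 output
            '-s', '%dx%d' % (width, height),
            '-r', str(fps),
            '-vcodec', 'rawvideo', '-']

def read_frame(pipe):
    """Read exactly one frame from the pipe and reshape it to (height, width, 3)."""
    raw = pipe.stdout.read(BYTES_PER_FRAME)
    frame = numpy.frombuffer(raw, dtype='uint8')
    return frame.reshape((HEIGHT, WIDTH, 3))
```

    Opening the pipe would then look like `pipe = sp.Popen(raw_frame_command('/Users/eananthaneshan/Movies/myvideo.mp4'), stdout=sp.PIPE, bufsize=10**8)`, followed by one `read_frame(pipe)` call per frame; because every rawvideo frame has the same byte length, the reads stay aligned with what ffmpeg writes.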
  • Could not open encoder using the FFmpeg C API for MOV format

    22 June 2016, by lupod

    Context

    I am writing a program that does some video processing on an input file. I wrote two classes for handling the "reading/writing frames" part, which essentially wrap the FFmpeg functions. These classes are instantiated by providing an input and an output file name, and in their constructors I initialize everything that is needed (or at least I hope so).

    These are the two routines called inside the constructors:

    // InputVideoHandler.cpp
    void InputVideoHandler::init(char* name) {
     streamIndex = -1;
     int numStreams;

     if (avformat_open_input(&formatCtx, name, NULL, NULL) != 0)
       throw std::exception("Invalid input file name.");

     if (avformat_find_stream_info(formatCtx, NULL)<0)
       throw std::exception("Could not find stream information.");

     numStreams = formatCtx->nb_streams;

     if (numStreams < 0)
       throw std::exception("No streams in input video file.");

     for (int i = 0; i < numStreams; i++) {
       if (formatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
         streamIndex = i;
         break;
       }
     }

     if (streamIndex < 0)
       throw std::exception("No video stream in input video file.");

     // find decoder using id
     codec = avcodec_find_decoder(formatCtx->streams[streamIndex]->codec->codec_id);
     if (codec == nullptr)
       throw std::exception("Could not find suitable decoder for input file.");

     // copy context from input stream
     codecCtx = avcodec_alloc_context3(codec);
     if (avcodec_copy_context(codecCtx, formatCtx->streams[streamIndex]->codec) != 0)
       throw std::exception("Could not copy codec context from input stream.");

     if (avcodec_open2(codecCtx, codec, NULL) < 0)
       throw std::exception("Could not open decoder.");

     codecCtx->refcounted_frames = 1;
    }

    // OutputVideoBuilder.cpp
    void OutputVideoBuilder::init(char* name, AVCodecContext* inputCtx) {
     if (avformat_alloc_output_context2(&formatCtx, NULL, NULL, name) < 0)
       throw std::exception("Could not determine file extension from provided name.");

     codec = avcodec_find_encoder(inputCtx->codec_id);
     if (codec == nullptr) {
       throw std::exception("Could not find suitable encoder.");
     }

     codecCtx = avcodec_alloc_context3(codec);
     if (avcodec_copy_context(codecCtx, inputCtx) < 0)
       throw std::exception("Could not copy output codec context from input");

     codecCtx->time_base = inputCtx->time_base;

     if (avcodec_open2(codecCtx, codec, NULL) < 0)
       throw std::exception("Could not open encoder.");

     stream = avformat_new_stream(formatCtx, codec);
     if (stream == nullptr) {
       throw std::exception("Could not allocate stream.");
     }

     stream->id = formatCtx->nb_streams - 1;
     stream->codec = codecCtx;
     stream->time_base = codecCtx->time_base;

     av_dump_format(formatCtx, 0, name, 1);
     if (!(formatCtx->oformat->flags & AVFMT_NOFILE)) {
       if (avio_open(&formatCtx->pb, name, AVIO_FLAG_WRITE) < 0) {
         throw std::exception("Could not open output file.");
       }
     }

     if (avformat_write_header(formatCtx, NULL) < 0) {
       throw std::exception("Error occurred when opening output file.");
     }

    }

    As you can see, the init function of the output handler requires that an AVCodecContext be provided. In my code, I pass to the constructor the AVCodecContext that is stored in the input handler and was created earlier.

    Question:

    The two functions work fine when I test my program with some video formats, like .mpg or .avi. When I try to process .mov/.mp4 files, however, my code throws the exception that I labeled "Could not open encoder." in OutputVideoBuilder::init(). Why is this happening? I read in the general FFmpeg documentation that this format should be supported as well. I assume I am doing something wrong in my code, which I do not completely understand because it was created by adapting the tutorials in the documentation to my specific case. For this reason too, any comments on things that are useless or missing will be greatly appreciated.

  • Encoder (codec png) not found for output stream #0:0 [duplicate]

    7 June 2016, by Anubhav Dhawan

    I’m trying to create a NodeJS app that converts a video into a GIF image.

    I’m using node-gify plugin for this purpose, which uses FFmpeg and GraphicsMagick.

    Here’s my sample code:

    var gify = require('./');
    var http = require('http');
    var fs = require('fs');

    var opts = {
     height: 300,
     rate: 10
    };

    console.time('convert');
    gify('out.mp4', 'out.gif', opts, function(err) {
     if (err) throw err;
     console.timeEnd('convert');
     var s = fs.statSync('out.gif');
     console.log('size: %smb', s.size / 1024 / 1024 | 0);
    });

    And here’s my console error:

    > gify@0.2.0 start /home/daffodil/repos/node-gify-master
    > node example.js

    /home/daffodil/repos/node-gify-master/example.js:24
     if (err) throw err;
              ^

    Error: Command failed: /bin/sh -c ffmpeg -i out.mp4 -filter:v scale=-1:300 -r 10 /tmp/IP5OXJZELd/%04d.png
    ffmpeg version 3.0.2 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
     configuration: --disable-yasm
     libavutil      55. 17.103 / 55. 17.103
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'out.mp4':
     Metadata:
       major_brand     : mp42
       minor_version   : 1
       compatible_brands: mp42mp41
       creation_time   : 2005-02-25 02:35:57
     Duration: 00:01:10.00, start: 0.000000, bitrate: 106 kb/s
       Stream #0:0(eng): Audio: aac (LC) (mp4a / 0x6134706D), 8000 Hz, stereo, fltp, 19 kb/s (default)
       Metadata:
         creation_time   : 2005-02-25 02:35:57
         handler_name    : Apple Sound Media Handler
       Stream #0:1(eng): Video: mpeg4 (Advanced Simple Profile) (mp4v / 0x7634706D), yuv420p, 192x242 [SAR 1:1 DAR 96:121], 76 kb/s, 15 fps, 15 tbr, 600 tbn, 1k tbc (default)
       Metadata:
         creation_time   : 2005-02-25 02:35:57
         handler_name    : Apple Video Media Handler
       Stream #0:2(eng): Data: none (rtp  / 0x20707472), 4 kb/s (default)
       Metadata:
         creation_time   : 2005-02-25 02:35:57
         handler_name    : hint media handler
       Stream #0:3(eng): Data: none (rtp  / 0x20707472), 3 kb/s (default)
       Metadata:
         creation_time   : 2005-02-25 02:35:57
         handler_name    : hint media handler
    Output #0, image2, to '/tmp/IP5OXJZELd/%04d.png':
     Metadata:
       major_brand     : mp42
       minor_version   : 1
       compatible_brands: mp42mp41
       Stream #0:0(eng): Video: png, none, q=2-31, 128 kb/s (default)
       Metadata:
         creation_time   : 2005-02-25 02:35:57
         handler_name    : Apple Video Media Handler
    Stream mapping:
     Stream #0:1 -> #0:0 (mpeg4 (native) -> ? (?))
    Encoder (codec png) not found for output stream #0:0

       at ChildProcess.exithandler (child_process.js:213:12)
       at emitTwo (events.js:100:13)
       at ChildProcess.emit (events.js:185:7)
       at maybeClose (internal/child_process.js:827:16)
       at Socket.<anonymous> (internal/child_process.js:319:11)
       at emitOne (events.js:90:13)
       at Socket.emit (events.js:182:7)
       at Pipe._onclose (net.js:471:12)

    PS: I had a couple of problems installing FFmpeg on my Ubuntu 14.04.

    • First, FFmpeg is removed from Ubuntu 14.04 (legal issues AFAIK), but I managed to apt-get it through this.
    • Second, when I tried to ./configure (as mentioned in its README.md), I got this error: yasm/nasm not found or too old. Use --disable-yasm for a crippled build. So I used ./configure --disable-yasm instead, and it (somehow) worked.

    Update #1

    After reading this log a couple of times, I managed to produce a sample GIF from my mp4 file by changing the command that example.js tries to run:

    From

    ffmpeg -i out.mp4 -filter:v scale=-1:300 -r 10 /tmp/Lz43nx6wv1/%04d.png

    To

    ffmpeg -i out.mp4 -filter:v scale=-1:300 -r 10 out.gif

    But this still uses the command line; I need to do it from code.

    So I dived into the code and found that this wrong URL is coming from the plugin’s index.js:

    ...

    // tmpfile(s)
     var id = uid(10);
     var dir = path.resolve('/tmp/' + id);
     var tmp  = path.join(dir, '/%04d.png');

    ...

    Is this an issue with the plugin, or am I doing something wrong here ?
    In any case, please suggest the correct fix here, because I don’t want to touch this part unless I know what I’m doing.
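
    For what it's worth, the working shell command from Update #1 can be driven directly from code, bypassing the plugin's intermediate PNG stage entirely. A minimal sketch follows (in Python rather than Node, purely for illustration; `gif_command` and `convert_to_gif` are hypothetical helper names, the arguments mirror the question's working command, and ffmpeg is assumed to be on PATH):

```python
import subprocess

def gif_command(input_path, output_path, height=300, rate=10):
    """Build the ffmpeg invocation that worked from the shell."""
    return ['ffmpeg', '-i', input_path,
            '-filter:v', 'scale=-1:%d' % height,  # keep aspect ratio, fix height
            '-r', str(rate),                      # output frame rate
            output_path]

def convert_to_gif(input_path, output_path, height=300, rate=10):
    """Run the conversion; raises CalledProcessError if ffmpeg fails."""
    subprocess.check_call(gif_command(input_path, output_path, height, rate))
```

    The Node equivalent would build the same argument list and hand it to child_process; passing arguments as a list (rather than one shell string) also avoids quoting problems with file names.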

    Update #2

    I have now installed zlib1g-dev and reinstalled both FFmpeg and GraphicsMagick, and now I see this error:

    gm convert: No decode delegate for this image format (/tmp/ZQbEAynAcf/0702.png).

    Thanks in advance :)