Advanced search

Media (0)

Word: - Tags -/organisation

No media matching your criteria is available on the site.

Other articles (53)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, MediaSPIP init automatically puts a preconfiguration in place so that the new feature works right away. It is therefore not necessary to go through a configuration step for this.

  • Authorizations overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

On other sites (10217)

  • FFMPEG JAVA ffmpeg doesn't work in .jar application

    3 February 2020, by xkenzzo

    FFmpeg works only in development mode (from the IDE). When I compile my program into a jar, it returns the error shown further below.
    This is the log of the program launched with Eclipse; in this case everything works:

     ffmpeg version git-2019-12-29-e20c6d9 Copyright (c) 2000-2019 the FFmpeg developers
     built with gcc 9.2.1 (GCC) 20191125
     configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
     libavutil      56. 38.100 / 56. 38.100
     libavcodec     58. 65.100 / 58. 65.100
     libavformat    58. 35.101 / 58. 35.101
     libavdevice    58.  9.101 / 58.  9.101
     libavfilter     7. 70.100 /  7. 70.100
     libswscale      5.  6.100 /  5.  6.100
     libswresample   3.  6.100 /  3.  6.100
     libpostproc    55.  6.100 / 55.  6.100
    Input #0, h264, from '.h264':
     Duration: N/A, bitrate: N/A
       Stream #0:0: Video: h264 (Main), yuv420p(progressive), 3840x2160 [SAR 1:1 DAR 16:9], 60 fps, 60 tbr, 1200k tbn, 120 tbc
    Output #0, mp4, to '.mp4':
     Metadata:
       encoder         : Lavf58.35.101
       Stream #0:0: Video: h264 (Main) (avc1 / 0x31637661), yuv420p(progressive), 3840x2160 [SAR 1:1 DAR 16:9], q=2-31, 60 fps, 60 tbr, 15360 tbn, 120 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
    Press [q] to stop, [?] for help
    [mp4 @ 000001cedad59440] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
    frame= 9273 fps=0.0 q=-1.0 Lsize=  121326kB time=00:01:17.26 bitrate=12863.3kbits/s speed= 207x    
    video:121287kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.032451%

    This is where it gets complicated. These are the messages that FFmpeg returns when I launch my program compiled as a jar:

    > [h264 @ 0000017c43e5b700] error while decoding MB 58 6, bytestream -14
    > [h264 @ 0000017c43e5b700] concealing 30951 DC, 30951 AC, 30951 MV
    > errors in I frame [h264 @ 0000017c43e49d80] Stream #0: not enough
    > frames to estimate rate; consider increasing probesize [h264 @
    > 0000017c43e49d80] decoding for stream 0 failed

    This is the Java code which launches FFmpeg via ProcessBuilder:

    String urlVideoMp4 = movie.getPath().replace(".h264", ".mp4");

    File videoFinal = new File(urlVideoMp4);
    //checking if new video exists : if not, creating it with FFMPEG

    LOG.info("moviePath : " + movie);
    ProcessBuilder processBuilder = new ProcessBuilder(FFMPEG, "-framerate", "60", "-r", "120", "-i", movie.getPath(),"-c:v", "copy", urlVideoMp4);
    LOG.debug(processBuilder.command());

    processBuilder.inheritIO().start().waitFor();

    return videoFinal;
  • FFmpeg os.system commands not working but work in Terminal

    28 January 2020, by Oscar Dolloway

    I’ve downloaded ffmpeg from the website and run some commands through the Terminal to confirm it is installed.

    When I run the command ’ffmpeg’ in the Terminal it returns:

    ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg
    developers
    built with Apple LLVM version 10.0.0 (clang-1000.11.45.5)

    If I type into Python:

    import os
    os.system ('ffmpeg')

    it returns

    os.system ('ffmpeg')
    sh: ffmpeg: command not found
    Out[25]: 32512

    Any ideas?

    Solution:

    ffmpeg = '/bin/ffmpeg' #path to the binary file

    os.system(ffmpeg)

    Output:

    os.system (ffmpeg)
    ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg developers
    built with Apple LLVM version 10.0.0 (clang-1000.11.45.5)
  • Android NDK: How to replace AAsset to work with files from external storage for FFmpeg decoding

    13 January 2020, by michpohl

    I am using Oboe’s RhythmGame example as a guideline to implement Oboe and MP3 decoding (using FFmpeg) in my app. Since I am fairly new to the NDK and a C++ beginner, I still struggle with some of the basic concepts I encounter.
    My problem: the example mentioned above only handles files from the assets folder, using the native implementation of Android’s AssetManager.
    Since I am looking to access files on external storage, I have to change this, but it is unclear to me how to do that.

    This is where I am stuck:
    I have an FFMpegExtractor class that calls this function from FFmpeg’s avio.h:

    /**
    * Allocate and initialize an AVIOContext for buffered I/O. It must be later
    * freed with avio_context_free().
    *
    * @param buffer Memory block for input/output operations via AVIOContext.
    *        The buffer must be allocated with av_malloc() and friends.
    *        It may be freed and replaced with a new buffer by libavformat.
    *        AVIOContext.buffer holds the buffer currently in use,
    *        which must be later freed with av_free().
    * @param buffer_size The buffer size is very important for performance.
    *        For protocols with fixed blocksize it should be set to this blocksize.
    *        For others a typical size is a cache page, e.g. 4kb.
    * @param write_flag Set to 1 if the buffer should be writable, 0 otherwise.
    * @param opaque An opaque pointer to user-specific data.
    * @param read_packet  A function for refilling the buffer, may be NULL.
    *                     For stream protocols, must never return 0 but rather
    *                     a proper AVERROR code.
    * @param write_packet A function for writing the buffer contents, may be NULL.
    *        The function may not change the input buffers content.
    * @param seek A function for seeking to specified byte position, may be NULL.
    *
    * @return Allocated AVIOContext or NULL on failure.
    */
    AVIOContext *avio_alloc_context(
                     unsigned char *buffer,
                     int buffer_size,
                     int write_flag,
                     void *opaque,
                     int (*read_packet)(void *opaque, uint8_t *buf, int buf_size),
                     int (*write_packet)(void *opaque, uint8_t *buf, int buf_size),
                     int64_t (*seek)(void *opaque, int64_t offset, int whence));

    The call is made here:

    bool FFMpegExtractor::createAVIOContext(AAsset *asset, uint8_t *buffer, uint32_t bufferSize,
                                           AVIOContext **avioContext) {

       constexpr int isBufferWriteable = 0;

       *avioContext = avio_alloc_context(
               buffer, // internal buffer for FFmpeg to use
               bufferSize, // For optimal decoding speed this should be the protocol block size
               isBufferWriteable,
               asset, // Will be passed to our callback functions as a (void *)
               read, // Read callback function
               nullptr, // Write callback function (not used)
               seek); // Seek callback function

       if (*avioContext == nullptr){
           LOGE("Failed to create AVIO context");
           return false;
       } else {
           return true;
       }
    }

    I am looking to replace the asset, read and seek arguments so that I can use files from storage instead of AAsset objects.

    This is the read callback passed above:

    int read(void *opaque, uint8_t *buf, int buf_size) {

       auto asset = (AAsset *) opaque;
       int bytesRead = AAsset_read(asset, buf, (size_t)buf_size);
       return bytesRead;
    }

    And this is the seek callback:

    int64_t seek(void *opaque, int64_t offset, int whence){

       auto asset = (AAsset*)opaque;

       // See https://www.ffmpeg.org/doxygen/3.0/avio_8h.html#a427ff2a881637b47ee7d7f9e368be63f
       if (whence == AVSEEK_SIZE) return AAsset_getLength(asset);
       if (AAsset_seek(asset, offset, whence) == -1){
           return -1;
       } else {
           return 0;
       }
    }

    I have tried just replacing AAsset with FILE, but of course that doesn’t do it. I know how to open and read files, but it is unclear to me whether that is what’s expected here, and how I can translate the AAsset methods into operations that return the desired values for files in storage.
    Can anyone point me in the right direction?
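
    A minimal sketch of what plain-FILE replacements for the two callbacks above could look like (the readFile/seekFile names are illustrative, and the opaque pointer is assumed to be a FILE* obtained with fopen() instead of an AAsset*):

     #include <cstdio>
     #include <cstdint>
     #include <cerrno>
     extern "C" {
     #include <libavformat/avio.h>
     #include <libavutil/error.h>
     }

     // Read callback: fill FFmpeg's buffer from a FILE* instead of an AAsset*.
     int readFile(void *opaque, uint8_t *buf, int buf_size) {

        auto file = (FILE *) opaque;
        size_t bytesRead = fread(buf, 1, (size_t) buf_size, file);
        if (bytesRead == 0) {
            // The callback must not return 0; report end of file or the underlying error.
            return feof(file) ? AVERROR_EOF : AVERROR(errno);
        }
        return (int) bytesRead;
     }

     // Seek callback: handle AVSEEK_SIZE the way the AAsset version used AAsset_getLength().
     int64_t seekFile(void *opaque, int64_t offset, int whence) {

        auto file = (FILE *) opaque;

        if (whence == AVSEEK_SIZE) {
            long current = ftell(file);
            if (fseek(file, 0, SEEK_END) != 0) return -1;
            long size = ftell(file);
            fseek(file, current, SEEK_SET);
            return size;
        }
        whence &= ~AVSEEK_FORCE; // strip FFmpeg's "seek by any means" flag before calling fseek
        if (fseek(file, (long) offset, whence) != 0) {
            return -1;
        }
        return ftell(file);
     }

    A FILE* opened with fopen(path, "rb") would then be passed to avio_alloc_context() as the opaque argument in place of the AAsset*, with readFile and seekFile in place of read and seek.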

    Edit: As it couldn’t fit down there, here is the code block mentioned in my reply to @BrianChen’s helpful comments:

    bool FFMpegExtractor::openAVFormatContext(AVFormatContext *avFormatContext) {

       int result = avformat_open_input(&avFormatContext,
                                        "", /* URL is left empty because we're providing our own I/O */
                                        nullptr /* AVInputFormat *fmt */,
                                        nullptr /* AVDictionary **options */
       );

    Unfortunately avformat_open_input() produces a
    Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x20 in tid 23767, pid 23513
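
    For reference, one common cause of a crash at exactly this point is an AVFormatContext that was never allocated, or that has no pb field pointing at the custom AVIOContext. A minimal sketch of the usual custom-I/O setup (assuming avioContext is the context created by createAVIOContext() above) would be:

     // Allocate the format context and attach the custom AVIOContext
     // before calling avformat_open_input() with an empty URL.
     AVFormatContext *formatContext = avformat_alloc_context();
     if (formatContext == nullptr) {
         LOGE("Failed to allocate AVFormatContext"); // LOGE as used elsewhere in this post
         return false;
     }
     formatContext->pb = avioContext;
     formatContext->flags |= AVFMT_FLAG_CUSTOM_IO;

     int result = avformat_open_input(&formatContext,
                                      "",      /* URL is empty, the custom I/O supplies the data */
                                      nullptr, /* AVInputFormat *fmt */
                                      nullptr  /* AVDictionary **options */
     );
     if (result < 0) {
         LOGE("avformat_open_input failed: %d", result);
         return false;
     }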