Other articles (105)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MediaSPIP to find out.

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in MP4, Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in MP3 and Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    As a first step, we (...)

On other sites (8674)

  • flv reencode to mp4 for iphone/ipod via ffmpeg and x264 (quality issue)

    3 October 2011, by zeroasterisk

    There are a lot of questions on this topic, and I've read most of them and most of the google search results I could come up with.

    When I use FFmpeg to convert an FLV to an iPhone 3-compatible MP4 file, it just doesn't preserve enough of the quality. Yes, I've worked the hell out of the -sameq, -b and -bt settings; the text just isn't readable.

    Next I tried to split the video out and process it directly, using these instructions:
    https://sites.google.com/site/linuxencoding/x264-encoding-guide

    The problem is mplayer (via ffmpeg) was not able to determine the duration of the FLV (even though the metadata was set).

    (I assume) Because of that unknown duration, when I create the MP4 file, the resulting x264 file plays through super-fast while the audio plays at the normal rate.

    user@server:/tmp# mplayer -nosound -benchmark -sws 9 -vf dsize=640:480:0,scale=0:0,expand=640:480 -vo yuv4mpeg:file=>(x264 --demuxer y4m --crf 0 --preset slow --threads auto --output output.264 - 2>x264.log) 'input.flv'
    MPlayer 1.0rc4-4.4.5 (C) 2000-2010 MPlayer Team
    mplayer: could not connect to socket
    mplayer: No such file or directory
    Failed to open LIRC support. You will not be able to use your remote control.

    Playing input.flv.
    libavformat file format detected.
    [flv @ 0x1202460]Estimating duration from bitrate, this may be inaccurate
    [lavf] stream 0: video (vp6f), -vid 0
    [lavf] stream 1: audio (nellymoser), -aid 0
    VIDEO:  [VP6F]  1680x992  0bpp  1000.000 fps   33.4 kbps ( 4.1 kbyte/s)
    Clip info:
    audiocodecid: 6
    audiodatarate: 86
    audiosamplerate: 44100
    audiosamplesize: 16
    audiosize: 6097005
    canSeekToEnd: true
    datasize: 8609138
    duration: 567
    framerate: 2
    hasAudio: true
    hasCuePoints: false
    hasKeyframes: true
    hasMetadata: true
    hasVideo: true
    height: 992
    lasttimestamp: 567
    metadatacreator: flvtool++ (Facebook, Motion project, dweatherford)
    stereo: false
    totalframes: 1043
    videocodecid: 4
    videodatarate: 33
    videosize: 2316256
    width: 1680
    Using (default) progressive frame mode.
    Opening video filter: [expand w=640 h=480]
    Expand: 640 x 480, -1 ; -1, osd: 0, aspect: 0.000000, round: 1
    Opening video filter: [scale w=0 h=0]
    Opening video filter: [dsize=640:480:0]
    ==========================================================================
    Opening video decoder: [ffmpeg] FFmpeg's libavcodec codec family
    Selected video codec: [ffvp6f] vfm: ffmpeg (FFmpeg VP6 Flash)
    ==========================================================================
    Audio: no sound
    Starting playback...
    Movie-Aspect is undefined - no prescaling applied.
    [swscaler @ 0x7f0c738b9620]Lanczos scaler, from yuv420p to yuv420p using MMX2
    VO: [yuv4mpeg] 640x480 => 641x480 Planar YV12

    I have also tried specifying the FPS, but it made no difference to the results.

    user@server:/tmp# mplayer -nosound -fps 25 -benchmark -sws 9 -vf dsize=640:480:0,scale=0:0,expand=640:480 -vo yuv4mpeg:file=>(x264 --demuxer y4m --fps 25 --crf 0 --preset slow --threads auto --output output.264 - 2>x264.log) 'input.flv'

    Can someone tell me how to either:

    1. fix my split A/V processing/timing/duration issues?
    2. improve the quality of the FFmpeg conversion of FLV to an iPhone 3-compatible format?

  • Decoding audio via Android using FFMpeg

    31 October 2013, by Rob Haupt

    I can play WAV files using the code below without issues. When trying to play the exact same media in MP3 format I only get garbled junk. I believe I am fundamentally misunderstanding how the avcodec_decode_audio3 function works (see the decode-loop sketch after the log output at the end of this post).

    Since a WAV file contains PCM data, its decoded output can go straight to the AudioTrack.write function. There must be some additional step to get MP3s to work like this. I don't know what I'm missing, but I've been pulling my hair out for a week now.

    Java Code

    package com.rohaupt.RRD2;

    import java.io.FileNotFoundException;
    import java.io.FileOutputStream;
    import java.io.IOException;

    import android.app.Activity;
    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;
    import android.media.MediaPlayer;
    import android.os.Bundle;
    import android.os.SystemClock;

    public class player extends Activity
    {
       private AudioTrack track;
       private FileOutputStream os;
       /** Called when the activity is first created. */
       @Override
       public void onCreate(Bundle savedInstanceState)
       {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.main);
           createEngine();

           MediaPlayer mp = new MediaPlayer();
           mp.start();

           int bufSize = AudioTrack.getMinBufferSize(32000,
                                                     AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                                                     AudioFormat.ENCODING_PCM_16BIT);


           track = new AudioTrack(AudioManager.STREAM_MUSIC,
                                  32000,
                                  AudioFormat.CHANNEL_CONFIGURATION_STEREO,
                                  AudioFormat.ENCODING_PCM_16BIT,
                                  bufSize,
                                  AudioTrack.MODE_STREAM);

           byte[] bytes = new byte[bufSize];

           try {
               os = new FileOutputStream("/sdcard/a.out",false);
           } catch (FileNotFoundException e) {
               // TODO Auto-generated catch block
               e.printStackTrace();
           }

           String result = loadFile("/sdcard/a.mp3",bytes);

           try {
               os.close();
           } catch (IOException e) {
               // TODO Auto-generated catch block
               e.printStackTrace();
           }
       }

       void playSound(byte[] buf, int size) {  
           //android.util.Log.v("ROHAUPT", "RAH Playing");
           if(track.getPlayState()!=AudioTrack.PLAYSTATE_PLAYING)
               track.play();
           track.write(buf, 0, size);

           try {
               os.write(buf,0,size);
           } catch (IOException e) {
               // TODO Auto-generated catch block
               e.printStackTrace();
           }
       }


       private native void createEngine();
       private native String loadFile(String file, byte[] array);

       /** Load jni .so on initialization*/
       static {
            System.loadLibrary("avutil");
            System.loadLibrary("avcore");
            System.loadLibrary("avcodec");
            System.loadLibrary("avformat");
            System.loadLibrary("avdevice");
            System.loadLibrary("swscale");
            System.loadLibrary("avfilter");
            System.loadLibrary("ffmpeg");
       }
    }

    C Code

    #include <jni.h>
    #include <string.h>
    #include <android/log.h>

    #include "libavcodec/avcodec.h"
    #include "libavformat/avformat.h"

    #define DEBUG_TAG "ROHAUPT"  

    void Java_com_rohaupt_RRD2_player_createEngine(JNIEnv* env, jclass clazz)
       {
           avcodec_init();

           av_register_all();


       }

       jstring Java_com_rohaupt_RRD2_player_loadFile(JNIEnv* env, jobject obj,jstring file,jbyteArray array)
       {  
           jboolean            isCopy;  
           int                 i;
           int                 audioStream=-1;
           int                 res;
           int                 decoded = 0;
           int                 out_size;
           AVFormatContext     *pFormatCtx;
           AVCodecContext      *aCodecCtx;
           AVCodecContext      *c= NULL;
           AVCodec             *aCodec;
           AVPacket            packet;
           jclass              cls = (*env)->GetObjectClass(env, obj);
           jmethodID           play = (*env)->GetMethodID(env, cls, "playSound", "([BI)V");//At the begining of your main function
           const char *        szfile = (*env)->GetStringUTFChars(env, file, &isCopy);
           int16_t *           pAudioBuffer = (int16_t *) av_malloc (AVCODEC_MAX_AUDIO_FRAME_SIZE*2+FF_INPUT_BUFFER_PADDING_SIZE);
           int16_t *           outBuffer = (int16_t *) av_malloc (AVCODEC_MAX_AUDIO_FRAME_SIZE*2+FF_INPUT_BUFFER_PADDING_SIZE);


           __android_log_print(ANDROID_LOG_INFO, DEBUG_TAG, "RAH28 Starting");
           res = av_open_input_file(&pFormatCtx, szfile, NULL, 0, NULL);
           if(res!=0)
           {
               __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH opening input failed with result: [%d]", res);
               return file;
           }

           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting stream info");
           res = av_find_stream_info(pFormatCtx);
           if(res<0)
           {
               __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting stream info failed with result: [%d]", res);
               return file;
           }

           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH getting audio stream");
           for(i=0; i < pFormatCtx->nb_streams; i++) {
             if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_AUDIO &&
                audioStream < 0) {
               audioStream=i;
             }
           }


           if(audioStream==-1)
           {
               __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH couldn&#39;t find audio stream");
               return file;
           }
           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio stream found with result: [%d]", res);


           aCodecCtx=pFormatCtx->streams[audioStream]->codec;
           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info loaded");

           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info [%d]", aCodecCtx->codec_id);

           aCodec = avcodec_find_decoder(aCodecCtx->codec_id);
           if(!aCodec) {
              __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec unsupported");
              return file;
           }
           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec info found");


           res = avcodec_open(aCodecCtx, aCodec);
           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio codec loaded [%d] [%d]",aCodecCtx->sample_fmt,res);

           //c=avcodec_alloc_context();
           av_init_packet(&packet);


           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH channels [%d] sample rate [%d] sample format [%d]",aCodecCtx->channels,aCodecCtx->sample_rate,aCodecCtx->sample_fmt);


           int x,y;
           x=0;y=0;
           while (av_read_frame(pFormatCtx, &packet)>= 0) {
               __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH frame read: [%d] [%d]",x++,y);

               if (aCodecCtx->codec_type == AVMEDIA_TYPE_AUDIO) {
                           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH audio ready");
                           int data_size = AVCODEC_MAX_AUDIO_FRAME_SIZE*2+FF_INPUT_BUFFER_PADDING_SIZE;
                           int size=packet.size;
                           y=0;
                           decoded = 0;
                           __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH packet size: [%d]", size);
                           while(size > 0) {

                                   __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH decoding: [%d] [%d]",x,y++);
                                   int len = avcodec_decode_audio3(aCodecCtx, pAudioBuffer, &data_size, &packet);



                                   __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 1 size [%d] len [%d] data_size [%d] out_size [%d]",size,len,data_size,out_size);
                                   jbyte *bytes = (*env)->GetByteArrayElements(env, array, NULL);

                                   memcpy(bytes + decoded, pAudioBuffer, len); //


                                   __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 2");

                                   (*env)->ReleaseByteArrayElements(env, array, bytes, 0);
                                   __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 3");

                                   (*env)->CallVoidMethod(env, obj, play, array, len);

                                   __android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH 4");


                                   size -= len;
                                   decoded += len;

                          }
                          av_free_packet(&packet);
               }

        }

             // Close the video file
           av_close_input_file(pFormatCtx);

           //__android_log_print(ANDROID_LOG_DEBUG, DEBUG_TAG, "RAH Finished Running result: [%d]", res);
           (*env)->ReleaseStringUTFChars(env, file, szfile);  
           return file;  
       }

    To add some detail: when calling this function with a WAV file I get the following log data

    I/ROHAUPT (  227): RAH28 Starting
    D/ROHAUPT (  227): RAH getting stream info
    D/ROHAUPT (  227): RAH getting audio stream
    D/ROHAUPT (  227): RAH audio stream found with result: [0]
    D/ROHAUPT (  227): RAH audio codec info loaded
    D/ROHAUPT (  227): RAH audio codec info [65536]
    D/ROHAUPT (  227): RAH audio codec info found
    D/ROHAUPT (  227): RAH audio codec loaded [1] [0]
    D/ROHAUPT (  227): RAH channels [2] sample rate [32000] sample format [1]
    D/ROHAUPT (  227): RAH frame read: [0] [0]
    D/ROHAUPT (  227): RAH audio ready
    D/ROHAUPT (  227): RAH packet size: [4096]
    D/ROHAUPT (  227): RAH decoding: [1] [0]
    D/ROHAUPT (  227): RAH 1 size [4096] len [4096] data_size [4096] out_size [0]
    D/ROHAUPT (  227): RAH 2
    D/ROHAUPT (  227): RAH 3
    D/ROHAUPT (  227): RAH 4
    D/ROHAUPT (  227): RAH frame read: [1] [1]
    D/ROHAUPT (  227): RAH audio ready
    ...
    D/ROHAUPT (  227): RAH frame read: [924] [1]
    D/ROHAUPT (  227): RAH audio ready
    D/ROHAUPT (  227): RAH packet size: [4096]
    D/ROHAUPT (  227): RAH decoding: [925] [0]
    D/ROHAUPT (  227): RAH 1 size [4096] len [4096] data_size [4096] out_size [0]
    D/ROHAUPT (  227): RAH 2
    D/ROHAUPT (  227): RAH 3
    D/ROHAUPT (  227): RAH 4
    D/ROHAUPT (  227): RAH frame read: [925] [1]
    D/ROHAUPT (  227): RAH audio ready
    D/ROHAUPT (  227): RAH packet size: [3584]
    D/ROHAUPT (  227): RAH decoding: [926] [0]
    D/ROHAUPT (  227): RAH 1 size [3584] len [3584] data_size [3584] out_size [0]
    D/ROHAUPT (  227): RAH 2
    D/ROHAUPT (  227): RAH 3
    D/ROHAUPT (  227): RAH 4

    When calling with an MP3 file I get the following

    I/ROHAUPT (  280): RAH28 Starting
    D/ROHAUPT (  280): RAH getting stream info
    D/ROHAUPT (  280): RAH getting audio stream
    D/ROHAUPT (  280): RAH audio stream found with result: [0]
    D/ROHAUPT (  280): RAH audio codec info loaded
    D/ROHAUPT (  280): RAH audio codec info [86017]
    D/ROHAUPT (  280): RAH audio codec info found
    D/ROHAUPT (  280): RAH audio codec loaded [1] [0]
    D/ROHAUPT (  280): RAH channels [2] sample rate [32000] sample format [1]
    D/ROHAUPT (  280): RAH frame read: [0] [0]
    D/ROHAUPT (  280): RAH audio ready
    D/ROHAUPT (  280): RAH packet size: [432]
    D/ROHAUPT (  280): RAH decoding: [1] [0]
    D/ROHAUPT (  280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
    D/ROHAUPT (  280): RAH 2
    ...
    D/ROHAUPT (  280): RAH frame read: [822] [1]
    D/ROHAUPT (  280): RAH audio ready
    D/ROHAUPT (  280): RAH packet size: [432]
    D/ROHAUPT (  280): RAH decoding: [823] [0]
    D/ROHAUPT (  280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
    D/ROHAUPT (  280): RAH 2
    D/ROHAUPT (  280): RAH 3
    D/ROHAUPT (  280): RAH 4
    D/ROHAUPT (  280): RAH frame read: [823] [1]
    D/ROHAUPT (  280): RAH audio ready
    D/ROHAUPT (  280): RAH packet size: [432]
    D/ROHAUPT (  280): RAH decoding: [824] [0]
    D/ROHAUPT (  280): RAH 1 size [432] len [432] data_size [4608] out_size [0]
    D/ROHAUPT (  280): RAH 2
    D/ROHAUPT (  280): RAH 3
    D/ROHAUPT (  280): RAH 4
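
    For reference: with the legacy FFmpeg API used in this post, avcodec_decode_audio3 returns the number of compressed bytes it consumed from the packet, while the value it stores through its frame_size pointer (data_size above) is the number of bytes of decoded PCM written to the output buffer. For a PCM WAV stream the two happen to be equal, but for MP3 they differ, as the len [432] / data_size [4608] pairs in the log above show. The following is a minimal decode-loop sketch against that legacy API, not the poster's code: play_pcm() is a hypothetical stand-in for the JNI playSound callback and pcm_buf for pAudioBuffer. (Current FFmpeg releases have removed this API in favour of avcodec_send_packet/avcodec_receive_frame.)

    #include <libavcodec/avcodec.h>

    /* Hypothetical stand-in for the JNI playSound callback. */
    static void play_pcm(const uint8_t *pcm, int bytes);

    /* Decode one compressed audio packet with the legacy avcodec_decode_audio3 API.
       pcm_buf must hold at least AVCODEC_MAX_AUDIO_FRAME_SIZE bytes. */
    static void decode_audio_packet(AVCodecContext *ctx, AVPacket *pkt, int16_t *pcm_buf)
    {
        AVPacket dec = *pkt;   /* walk a local copy so *pkt stays intact for av_free_packet() */

        while (dec.size > 0) {
            int out_size = AVCODEC_MAX_AUDIO_FRAME_SIZE;   /* in: capacity of pcm_buf, in bytes */
            int consumed = avcodec_decode_audio3(ctx, pcm_buf, &out_size, &dec);
            if (consumed < 0)
                break;                                     /* decode error: drop the rest of this packet */

            if (out_size > 0)                              /* out: bytes of decoded PCM actually produced */
                play_pcm((const uint8_t *)pcm_buf, out_size);

            dec.data += consumed;   /* consumed = compressed bytes eaten from the packet,  */
            dec.size -= consumed;   /* which is not the same as the PCM byte count for MP3 */
        }
    }
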
  • Defect #2292 (New): header of the dist template with a rather long site name

    6 September 2011, by Johan .

    As mentioned on the dev forum on 01/09: if the site name is long, it overlaps above/below the search box. The tagline is also liable to look less than great if it is long. As for the length of the site name and the footer, that is (...)