Advanced search

Media (0)

Keyword: - Tags -/publication

No media matching your criteria is available on the site.

Other articles (105)

  • Use, discuss, criticize

    13 April 2011, by

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded to Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text documents are analyzed to extract the data needed for search-engine indexing, and are then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs handled by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
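
    For reference, the codec list that "ffmpeg -codecs" prints can also be queried programmatically through libavcodec. A minimal C sketch, assuming the pre-3.x registration API (avcodec_register_all, av_codec_next) that was current when this article was written:

    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    int main(void)
    {
       // Register every codec compiled into this FFmpeg build
       // (required before iterating in the pre-3.x API).
       avcodec_register_all();

       // Walk the registered codec list, much as `ffmpeg -codecs` does,
       // printing decode/encode capability and the codec name.
       AVCodec *codec = NULL;
       while ((codec = av_codec_next(codec)) != NULL)
           printf("%s%s %s\n",
                  av_codec_is_decoder(codec) ? "D" : ".",
                  av_codec_is_encoder(codec) ? "E" : ".",
                  codec->name);
       return 0;
    }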

On other sites (8181)

  • Anomalie #3205: [Plugin-dist Mots] Incompatibility with the object editing API?

    13 April 2014, by charles razack

    PS: note that the correct line is $set = array('titre'=>'Mon super titre', 'tables_liees'=>'articles'); without the extra comma.
    Copy-paste error...

  • How to find the memory leak in iOS Xcode?

    12 August 2014, by Ajin Chacko

    This is my RTSP streaming iOS application with an FFmpeg decoder, and it streams fine, but memory keeps increasing while it runs. Please help me: is it a memory leak, and how can I track it down?

    Here is my video streaming class, RTSPPlayer.m:

    #import "RTSPPlayer.h"
    #import "Utilities.h"
    #import "AudioStreamer.h"

    @interface RTSPPlayer ()
    @property (nonatomic, retain) AudioStreamer *audioController;
    @end

    @interface RTSPPlayer (private)
    -(void)convertFrameToRGB;
    -(UIImage *)imageFromAVPicture:(AVPicture)pict width:(int)width height:(int)height;
    -(void)setupScaler;
    @end

    @implementation RTSPPlayer

    @synthesize audioController = _audioController;
    @synthesize audioPacketQueue,audioPacketQueueSize;
    @synthesize _audioStream,_audioCodecContext;
    @synthesize emptyAudioBuffer;

    @synthesize outputWidth, outputHeight;

    - (void)setOutputWidth:(int)newValue
    {
       if (outputWidth != newValue) {
           outputWidth = newValue;
           [self setupScaler];
       }
    }

    - (void)setOutputHeight:(int)newValue
    {
       if (outputHeight != newValue) {
           outputHeight = newValue;
           [self setupScaler];
       }
    }

    - (UIImage *)currentImage
    {
       if (!pFrame->data[0]) return nil;
       [self convertFrameToRGB];
       return [self imageFromAVPicture:picture width:outputWidth height:outputHeight];
    }

    - (double)duration
    {
       return (double)pFormatCtx->duration / AV_TIME_BASE;
    }

    - (double)currentTime
    {
       AVRational timeBase = pFormatCtx->streams[videoStream]->time_base;
       return packet.pts * (double)timeBase.num / timeBase.den;
    }

    - (int)sourceWidth
    {
       return pCodecCtx->width;
    }

    - (int)sourceHeight
    {
       return pCodecCtx->height;
    }

    - (id)initWithVideo:(NSString *)moviePath usesTcp:(BOOL)usesTcp
    {
       if (!(self=[super init])) return nil;

       AVCodec         *pCodec;

       // Register all formats and codecs
       avcodec_register_all();
       av_register_all();
       avformat_network_init();

       // Set the RTSP Options
       AVDictionary *opts = 0;
       if (usesTcp)
           av_dict_set(&opts, "rtsp_transport", "tcp", 0);


       if (avformat_open_input(&pFormatCtx, [moviePath UTF8String], NULL, &opts) !=0 ) {
           av_log(NULL, AV_LOG_ERROR, "Couldn't open file\n");
           goto initError;
       }

       // Retrieve stream information
       if (avformat_find_stream_info(pFormatCtx,NULL) < 0) {
           av_log(NULL, AV_LOG_ERROR, "Couldn't find stream information\n");
           goto initError;
       }

       // Find the first video stream
       videoStream=-1;
       audioStream=-1;

       for (int i=0; i < pFormatCtx->nb_streams; i++) {
           if (pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
               NSLog(@"found video stream");
               videoStream=i;
           }

           if (pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_AUDIO) {
               audioStream=i;
               NSLog(@"found audio stream");
           }
       }

       if (videoStream==-1 && audioStream==-1) {
           goto initError;
       }

       // Get a pointer to the codec context for the video stream
       pCodecCtx = pFormatCtx->streams[videoStream]->codec;

       // Find the decoder for the video stream
       pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
       if (pCodec == NULL) {
           av_log(NULL, AV_LOG_ERROR, "Unsupported codec!\n");
           goto initError;
       }

       // Open codec
       if (avcodec_open2(pCodecCtx, pCodec, NULL) < 0) {
           av_log(NULL, AV_LOG_ERROR, "Cannot open video decoder\n");
           goto initError;
       }

       if (audioStream > -1 ) {
           NSLog(@"set up audiodecoder");
           [self setupAudioDecoder];
       }

       // Allocate video frame
       pFrame = avcodec_alloc_frame();

       outputWidth = pCodecCtx->width;
       self.outputHeight = pCodecCtx->height;

       return self;

    initError:
    //  [self release];
       return nil;
    }


    - (void)setupScaler
    {
       // Release old picture and scaler
       avpicture_free(&picture);
       sws_freeContext(img_convert_ctx);  

       // Allocate RGB picture
       avpicture_alloc(&picture, PIX_FMT_RGB24, outputWidth, outputHeight);

       // Setup scaler
       static int sws_flags =  SWS_FAST_BILINEAR;
       img_convert_ctx = sws_getContext(pCodecCtx->width,
                                        pCodecCtx->height,
                                        pCodecCtx->pix_fmt,
                                        outputWidth,
                                        outputHeight,
                                        PIX_FMT_RGB24,
                                        sws_flags, NULL, NULL, NULL);

    }

    - (void)seekTime:(double)seconds
    {
       AVRational timeBase = pFormatCtx->streams[videoStream]->time_base;
       int64_t targetFrame = (int64_t)((double)timeBase.den / timeBase.num * seconds);
       avformat_seek_file(pFormatCtx, videoStream, targetFrame, targetFrame, targetFrame, AVSEEK_FLAG_FRAME);
       avcodec_flush_buffers(pCodecCtx);
    }

    - (void)dealloc
    {
       // Free scaler
       sws_freeContext(img_convert_ctx);  

       // Free RGB picture
       avpicture_free(&picture);

       // Free the packet that was allocated by av_read_frame
       av_free_packet(&packet);

       // Free the YUV frame
       av_free(pFrame);

       // Close the codec
       if (pCodecCtx) avcodec_close(pCodecCtx);

       // Close the video file
       if (pFormatCtx) avformat_close_input(&pFormatCtx);

       [_audioController _stopAudio];
      // [_audioController release];
       _audioController = nil;

     //  [audioPacketQueue release];
       audioPacketQueue = nil;

    //   [audioPacketQueueLock release];
       audioPacketQueueLock = nil;

    //  [super dealloc];
    }

    - (BOOL)stepFrame
    {
       // AVPacket packet;
       int frameFinished=0;

       while (!frameFinished && av_read_frame(pFormatCtx, &packet) >=0 ) {
           // Is this a packet from the video stream?
           if(packet.stream_index==videoStream) {
               // Decode video frame
               avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
           }

           if (packet.stream_index==audioStream) {
               // NSLog(@"audio stream");
               [audioPacketQueueLock lock];

               audioPacketQueueSize += packet.size;
               [audioPacketQueue addObject:[NSMutableData dataWithBytes:&packet length:sizeof(packet)]];

               [audioPacketQueueLock unlock];

               if (!primed) {
                   primed=YES;
                   [_audioController _startAudio];
               }

               if (emptyAudioBuffer) {
                   [_audioController enqueueBuffer:emptyAudioBuffer];
               }
           }
       }

       return frameFinished!=0;
    }

    - (void)convertFrameToRGB
    {
       sws_scale(img_convert_ctx,
                 pFrame->data,
                 pFrame->linesize,
                 0,
                 pCodecCtx->height,
                 picture.data,
                 picture.linesize);
    }

    - (UIImage *)imageFromAVPicture:(AVPicture)pict width:(int)width height:(int)height
    {
       CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
       CFDataRef data = CFDataCreateWithBytesNoCopy(kCFAllocatorDefault, pict.data[0], pict.linesize[0]*height,kCFAllocatorNull);
       CGDataProviderRef provider = CGDataProviderCreateWithCFData(data);
       CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
       CGImageRef cgImage = CGImageCreate(width,
                                          height,
                                          8,
                                          24,
                                          pict.linesize[0],
                                          colorSpace,
                                          bitmapInfo,
                                          provider,
                                          NULL,
                                          NO,
                                          kCGRenderingIntentDefault);
       CGColorSpaceRelease(colorSpace);
       UIImage *image = [UIImage imageWithCGImage:cgImage];

       CGImageRelease(cgImage);
       CGDataProviderRelease(provider);
       CFRelease(data);

       return image;
    }

    - (void)setupAudioDecoder
    {    
       if (audioStream >= 0) {
           _audioBufferSize = AVCODEC_MAX_AUDIO_FRAME_SIZE;
           _audioBuffer = av_malloc(_audioBufferSize);
           _inBuffer = NO;

           _audioCodecContext = pFormatCtx->streams[audioStream]->codec;
           _audioStream = pFormatCtx->streams[audioStream];

           AVCodec *codec = avcodec_find_decoder(_audioCodecContext->codec_id);
           if (codec == NULL) {
               NSLog(@"Not found audio codec.");
               return;
           }

           if (avcodec_open2(_audioCodecContext, codec, NULL) < 0) {
               NSLog(@"Could not open audio codec.");
               return;
           }

           if (audioPacketQueue) {
             //  [audioPacketQueue release];
               audioPacketQueue = nil;
           }        
           audioPacketQueue = [[NSMutableArray alloc] init];

           if (audioPacketQueueLock) {
           //    [audioPacketQueueLock release];
               audioPacketQueueLock = nil;
           }
           audioPacketQueueLock = [[NSLock alloc] init];

           if (_audioController) {
               [_audioController _stopAudio];
            //   [_audioController release];
               _audioController = nil;
           }
           _audioController = [[AudioStreamer alloc] initWithStreamer:self];
       } else {
           pFormatCtx->streams[audioStream]->discard = AVDISCARD_ALL;
           audioStream = -1;
       }
    }

    - (void)nextPacket
    {
       _inBuffer = NO;
    }

    - (AVPacket*)readPacket
    {
       if (_currentPacket.size > 0 || _inBuffer) return &_currentPacket;

       NSMutableData *packetData = [audioPacketQueue objectAtIndex:0];
       _packet = [packetData mutableBytes];

       if (_packet) {
           if (_packet->dts != AV_NOPTS_VALUE) {
               _packet->dts += av_rescale_q(0, AV_TIME_BASE_Q, _audioStream->time_base);
           }

           if (_packet->pts != AV_NOPTS_VALUE) {
               _packet->pts += av_rescale_q(0, AV_TIME_BASE_Q, _audioStream->time_base);
           }

           [audioPacketQueueLock lock];
           audioPacketQueueSize -= _packet->size;
           if ([audioPacketQueue count] > 0) {
               [audioPacketQueue removeObjectAtIndex:0];
           }
           [audioPacketQueueLock unlock];

           _currentPacket = *(_packet);
       }

       return &_currentPacket;  
    }

    - (void)closeAudio
    {
       [_audioController _stopAudio];
       primed=NO;
    }

    @end
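
    For anyone hitting the same growth: a likely culprit is that the packets filled in by av_read_frame inside stepFrame above are never released (dealloc frees only the last one), so every compressed frame read from the stream is leaked. As a hedged illustration rather than a drop-in fix, here is the same read loop in plain C with the release in place, using the FFmpeg 2.x-era calls the class already relies on; fmt, dec, frame and video_index stand in for the corresponding RTSPPlayer fields, and audio packets that are queued for later playback would need their av_free_packet deferred until readPacket is done with them:

    static int step_frame(AVFormatContext *fmt, AVCodecContext *dec,
                          AVFrame *frame, int video_index)
    {
       AVPacket pkt;
       int got_frame = 0;

       while (!got_frame && av_read_frame(fmt, &pkt) >= 0) {
           if (pkt.stream_index == video_index) {
               // The decoder copies what it needs out of the packet...
               avcodec_decode_video2(dec, frame, &got_frame, &pkt);
           }
           // ...so its payload can be released right away. Skipping
           // this leaks one buffer per packet read.
           av_free_packet(&pkt);
       }
       return got_frame;
    }
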
  • IOException: Error running exec() Command, when calling FFmpeg method

    21 April 2016, by kc ochibili

    I am trying to create a slideshow MP4 using this FFmpeg method, but I keep getting the "Error running exec(). Command" IOException below when I click the button.

    Here is my call:

    ffmpegController = new FfmpegController(getTempDirectory(), new File(""));
    ffmpegController.createSlideshowFromImagesAndAudio(slideFrames, getAudioPath(), getOutPath(), 500, mCallbackResponse);

    Here is the source code of the small project.
    Here is the APK.

    And here is my error message:

      Error running exec(). Command: [ffmpeg, -y, -i, /storage/emulated/0/TestFFmpeg/frame1.png, /storage/emulated/0/TestFFmpeg/temp/image-000.jpg]

    Working Directory: lib Environment: [VIBE_PIPE_PATH=/dev/pipes, ANDROID_ROOT=/system, EMULATED_STORAGE_SOURCE=/mnt/shell/emulated, LOOP_MOUNTPOINT=/mnt/obb, EMULATED_STORAGE_TARGET=/storage/emulated, ANDROID_BOOTLOGO=1, LD_LIBRARY_PATH=/vendor/lib:/system/lib, EXTERNAL_STORAGE=/storage/emulated/legacy, ANDROID_SOCKET_zygote=9, ANDROID_DATA=/data, PATH=/sbin:/vendor/bin:/system/sbin:/system/bin:/system/xbin, ANDROID_ASSETS=/system/app, ASEC_MOUNTPOINT=/mnt/asec, BOOTCLASSPATH=/system/framework/core.jar:/system/framework/core-junit.jar:/system/framework/bouncycastle.jar:/system/framework/ext.jar:/system/framework/framework.jar:/system/framework/framework2.jar:/system/framework/telephony-common.jar:/system/framework/voip-common.jar:/system/framework/mms-common.jar:/system/framework/android.policy.jar:/system/framework/services.jar:/system/framework/apache-xml.jar:/system/framework/sec_edm.jar:/system/framework/seccamera.jar:/system/framework/secocsp.jar:/system/framework/sc.jar:/system/framework/scrollpause.jar:/system/framework/stayrotation.jar:/system/framework/smartfaceservice.jar:/system/framework/sws.jar:/system/framework/WfdCommon.jar, ANDROID_PROPERTY_WORKSPACE=8,66560, SECONDARY_STORAGE=/storage/extSdCard:/storage/UsbDriveA:/storage/UsbDriveB:/storage/UsbDriveC:/storage/UsbDriveD:/storage/UsbDriveE:/storage/UsbDriveF, ANDROID_STORAGE=/storage]

    Here is my Activity code :

    public class MainActivity extends Activity {

    Button testButton;
    EditText errorLogView;

    TinyDB tinydb;// sharedPreference Wrapper
    static Context context;
    @Override
    protected void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);
       context = getApplicationContext();
       tinydb = new TinyDB(context); // sharedPreference Wrapper
       testButton = (Button) findViewById(R.id.test_Image_View);
       errorLogView = (EditText) findViewById(R.id.errorlog);
       setListeners();
    }

    public void setListeners(){
       testButton.setOnClickListener(new OnClickListener() {

           @Override
           public void onClick(View v) {
               // TODO Auto-generated method stub
               Bitmap frame1Bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.ic_launcher);
               //Saves the image to the file system and returns the path
               String firstFrame = tinydb.putImagePNG("TestFFmpeg", "frame1.png", frame1Bitmap);
               String secondFrame = tinydb.putImagePNG("TestFFmpeg", "frame2.png", frame1Bitmap);
               String thirdFrame = tinydb.putImagePNG("TestFFmpeg", "frame3.png", frame1Bitmap);


               ArrayList<Clip> slideFrames = new ArrayList<Clip>();
               slideFrames.add(new Clip(firstFrame));
               slideFrames.add(new Clip(secondFrame));
               slideFrames.add(new Clip(thirdFrame));

               copyResourceSoundToSDCard();

               FfmpegController ffmpegController = null;
               try {

                   ffmpegController = new FfmpegController(getTempDirectory(), new File(""));
                   ffmpegController.createSlideshowFromImagesAndAudio(slideFrames, getAudioPath(), getOutPath(), 500, mCallbackResponse);

               } catch (FileNotFoundException e) {
                   // TODO Auto-generated catch block
                   e.printStackTrace();
                   toast("FileNotFoundException");
                   toast(e.getLocalizedMessage());
               } catch (IOException e) {
                   // TODO Auto-generated catch block
                   toast("IOException");
                   toast(e.getLocalizedMessage());
                   errorLogView.setText(e.getLocalizedMessage());
                   e.printStackTrace();
               } catch (Exception e) {
                   // TODO Auto-generated catch block
                   e.printStackTrace();
                   toast("Exception ");
                   toast(e.getLocalizedMessage());
               }          
           }
       });


    }

    public Clip getAudioPath(){
       Clip mAudPath = null;
       try {
           mAudPath = new Clip(new File(tinydb.getString("audpath")).getCanonicalPath());
       } catch (IOException e1) {
           // TODO Auto-generated catch block
           e1.printStackTrace();
       }
       return mAudPath;
    }

    public Clip getOutPath(){
       String videoName = ("myTestVideo.mp4");
       String saveFolder = ("TestFFmpeg/videos");
       String movieFullPath = setupAudioFolder(saveFolder, videoName);

       Clip outPath = null;
       try {
           outPath = new Clip(new File(movieFullPath).getCanonicalPath());
       } catch (IOException e1) {
           // TODO Auto-generated catch block
           e1.printStackTrace();
       }
       tinydb.putString("outhPath", outPath.path);

       return outPath;
    }

    public void copyResourceSoundToSDCard(){
       try {
           copyRawFile(context, R.raw.screens_shot_sound, getResaveDirectory(), "755");
       } catch (IOException e) {
           // TODO Auto-generated catch block
           e.printStackTrace();
       } catch (InterruptedException e) {
           // TODO Auto-generated catch block
           e.printStackTrace();
       }
    }

    private File getResaveDirectory(){

       String audioName = ("ShotSound.wav");
       String saveFolder = ("TestFFmpeg");
       File appRootFile;
       String path = setupAudioFolder(saveFolder, audioName);
       tinydb.putString("audpath",  path);
       appRootFile = new File(path);
       return appRootFile;
     }

    public String setupAudioFolder(String theFolder, String theImageName){
       File sdcard_path = Environment.getExternalStorageDirectory();
       File mFolder = new File(sdcard_path, theFolder);
       if (!mFolder.exists()) {
           if (!mFolder.mkdirs()) {
               Log.e("While creatingsave path",
                       "Default Save Path Creation Error");
               // Toast("Default Save Path Creation Error");
           }
       }
       String mFullPath = mFolder.getPath() + '/' + theImageName;

       return mFullPath;
    }
    private static void copyRawFile(Context ctx, int resid, File file, String mode) throws IOException, InterruptedException
    {
       final String abspath = file.getAbsolutePath();
       // Copy the raw resource bytes out to the target file
       final FileOutputStream out = new FileOutputStream(file);
       final InputStream is = ctx.getResources().openRawResource(resid);
       byte buf[] = new byte[1024];
       int len;
       while ((len = is.read(buf)) > 0) {
           out.write(buf, 0, len);
       }
       out.close();
       is.close();
       // Change the permissions
       Runtime.getRuntime().exec("chmod "+mode+" "+abspath).waitFor();
    }      
    ShellCallback mCallbackResponse = new ShellUtils.ShellCallback() {

       @Override
       public void shellOut(String shellLine) {
           // TODO Auto-generated method stub

       }

       @Override
       public void processComplete(int exitValue) {
           // TODO Auto-generated method stub
           toast("process done");

       }
    };

    public File getTempDirectory(){
       String saveFolder = ("TestFFmpeg/temp");
       File appRootFile = setupCustomFile(saveFolder);


       return appRootFile;
    }

    public File setupCustomFile(String theFolder){
       File sdcard_path = Environment.getExternalStorageDirectory();
       File mFolder = new File(sdcard_path, theFolder);
       if (!mFolder.exists()) {
           if (!mFolder.mkdirs()) {
               Log.e("While creatingsave path",
                       "Default Save Path Creation Error");
               // Toast("Default Save Path Creation Error");
           }
       }

       return mFolder;
    }



    public static void toast(String thetext) {
       Toast.makeText(context, thetext, Toast.LENGTH_LONG).show();
    }

    Any help would be appreciated.