Advanced search

Media (0)

Keyword: - Tags -/inscription3

No media matching your criteria is available on the site.

Other articles (112)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (16518)

  • FFmpeg output seeking result to Android LruCache

    26 May 2016, by vxh.viet

    Dear fellow StackOverflower,

    In my Android application, I'm trying to quickly retrieve a frame from a video using ffmpeg-android-java to display in an ImageView. The problem is that a typical ffmpeg -ss seek command has to write its output to a file on storage, which I suppose is the big performance hit:

    ffmpeg -ss 00:23:00 -i Mononoke.Hime.mkv -frames:v 1 out1.jpg

    A typical command execution like the above takes around 700-1200 milliseconds. So instead of writing to a file, I would like to write into an LruCache, hoping to achieve better performance.

    The problem is that ffmpeg-android-java is a wrapper that executes ffmpeg commands, and as such I don't know how to correctly supply the LruCache's address to the command.

    Below is my code snippet; a possible workaround is sketched after it:

    private void seekToPosition(long currentVideoPosition){

       String position = DiskUtils.formatMillisForFFmpeg(currentVideoPosition);

       String[] cmd = {"-ss", position, "-i", mVideoPath,
                       "-y", "-an", "-frames:v", "1",
                       "/storage/emulated/0/Videos/out.jpg"}; // this is the problem: I would like to point this output at the LruCache instead

       try {
           // to execute "ffmpeg -version" command you just need to pass "-version"
           mFFmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {

               long start;
               long end;

               @Override
               public void onStart() {
                   canSeek = false;
                   start = System.currentTimeMillis();
               }

               @Override
               public void onProgress(String message) {}

               @Override
               public void onFailure(String message) {
                   Log.d(TAG, "FFmpeg cmd failure");
               }

               @Override
               public void onSuccess(String message) {
                   Log.d(TAG, "FFmpeg cmd success");

                   /*mFFmpeg.killRunningProcesses();
                   Log.d(TAG, "FFmpeg kill running process: " + mFFmpeg.killRunningProcesses());*/
               }

               @Override
               public void onFinish() {
                   canSeek = true;
                   Log.d(TAG, "FFmpeg cmd finished: is FFmpeg process running: " + mFFmpeg.isFFmpegCommandRunning());
                   end = System.currentTimeMillis();
                   Log.d(TAG, "FFmpeg excuted in: " + (end - start) + " milliseconds");
               }
           });
       } catch (FFmpegCommandAlreadyRunningException e) {
           // Handle if FFmpeg is already running
           Log.d(TAG, "FFmpeg exception: " + e);
       }
    }
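
    Since ffmpeg runs as a separate process, it has no way to write into the app's in-process LruCache directly. A common workaround is to let the command keep writing the JPEG to a file (ideally the app's cache directory rather than /storage), then decode that file into a Bitmap in onSuccess and store it in an LruCache keyed by seek position. A minimal sketch of that idea follows; mFrameCache, cacheFrame and the 4 MB budget are assumptions, not part of ffmpeg-android-java:

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import android.util.LruCache;

    import java.io.File;

    // In-memory frame cache: repeated seeks to an already-decoded
    // position can then skip launching ffmpeg entirely.
    private final LruCache<Long, Bitmap> mFrameCache =
            new LruCache<Long, Bitmap>(4 * 1024 * 1024) { // 4 MB budget (assumption)
                @Override
                protected int sizeOf(Long key, Bitmap value) {
                    return value.getByteCount(); // size entries in bytes, not entry count
                }
            };

    // Called from onSuccess(): decode the JPEG ffmpeg just wrote and
    // keep the Bitmap in memory for the next seek to the same position.
    private void cacheFrame(long positionMs, File outFile) {
        Bitmap frame = BitmapFactory.decodeFile(outFile.getAbsolutePath());
        if (frame != null) {
            mFrameCache.put(positionMs, frame);
        }
    }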
  • Extract stream from swf with ffmpeg

    8 November 2020, by user3235250

    I'm trying to convert the swf file below to mp4. Just doing ffmpeg -i l1.swf l1.mp4 does not work. I think I want to extract the audio and the second video stream from l1.swf and combine them to produce the mp4. I think the second video stream's settings match those of the first video stream (although ffprobe does not recognise them). Can anyone supply an ffmpeg command to do what I want? (A possible mapping command is sketched after the probe output.)

    $ ffprobe l1.swf
    ffprobe version 4.2.4-1ubuntu0.1 Copyright (c) 2007-2020 the FFmpeg developers
      built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
      configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
      libavutil      56. 31.100 / 56. 31.100
      libavcodec     58. 54.100 / 58. 54.100
      libavformat    58. 29.100 / 58. 29.100
      libavdevice    58.  8.100 / 58.  8.100
      libavfilter     7. 57.100 /  7. 57.100
      libavresample   4.  0.  0 /  4.  0.  0
      libswscale      5.  5.100 /  5.  5.100
      libswresample   3.  5.100 /  3.  5.100
      libpostproc    55.  5.100 / 55.  5.100
    [swf @ 0x561c270c3e80] SWF compressed file detected
    [swf @ 0x561c270c3e80] Could not find codec parameters for stream 2 (Video: none, none): unknown codec
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    Input #0, swf, from 'l1.swf':
      Duration: N/A, bitrate: N/A
        Stream #0:0: Video: rawvideo (ARGB / 0x42475241), argb, 640x480, 14 fps, 14 tbr, 14 tbn
        Stream #0:1: Audio: mp3, 44100 Hz, mono, fltp, 64 kb/s
        Stream #0:2: Video: none, none, 14 tbr, 14 tbn
    Unsupported codec with id 0 for input stream 2
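
    From the probe output, the plain conversion trips over stream #0:2, whose codec FFmpeg cannot even identify, so that stream is unlikely to be usable as-is. A sketch that should at least produce a playable mp4 from the two recognised streams is to map them explicitly so the unknown stream is ignored (untested against this particular file; the encoder choices are assumptions):

    ffmpeg -i l1.swf -map 0:0 -map 0:1 -c:v libx264 -pix_fmt yuv420p -c:a aac l1.mp4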
  • Streaming jpegs to a video on my server

    2 December 2013, by Andrew Simpson

    I found a solution to my problem IF I were using a Linux/UNIX machine: FFserver, from the FFmpeg tools.

    I had been using FFmpeg in my C# WinForms desktop client to convert JPEGs into an OGG video file for playback.

    I have now been tasked with uploading the JPEGs to my server and rendering them as a video.

    Ideally, I would start an FFmpeg process on my client PC and feed its stdin with JPEGs as byte arrays. I have achieved this (I have looked around), but is there a way to redirect the stdout to my server so that code on the server can pick it up and render it in real time to my web user?

    I have looked on the FFmpeg website but I am unsure how to fit it in with my process.

    This is my code so far; a possible FFserver-based direction is sketched at the end:

       public byte[] EncodeAndUploadImages(string _args1, string _fn)  
       {
           byte[] _data = null;
           try
           {
               clientBuild = new Process();
               clientBuild.StartInfo.WorkingDirectory = Environment.CurrentDirectory;
               clientBuild.StartInfo.Arguments = " -f mjpeg -r 30 -i - -c:v libtheora -q:v 7 -r 30 -f ogg -";
               clientBuild.StartInfo.FileName = Environment.CurrentDirectory + @"\ffmpeg.exe";
               clientBuild.StartInfo.UseShellExecute = false;
               clientBuild.StartInfo.RedirectStandardOutput = true;
               clientBuild.StartInfo.RedirectStandardError = true;
               clientBuild.StartInfo.RedirectStandardInput = true;
               clientBuild.StartInfo.CreateNoWindow = true;
               clientBuild.StartInfo.LoadUserProfile = false;
               clientBuild.EnableRaisingEvents = true;    
                clientBuild.Start();

                // Start draining stdout asynchronously BEFORE writing to stdin;
                // otherwise ffmpeg can stall on a full stdout pipe while the
                // loop below is still writing JPEGs, and both ends deadlock.
                // mStandardOutput, mReadBuffer, mStandardOutputMs and
                // StandardOutputReadCallback are class members defined elsewhere.
                mStandardOutput = clientBuild.StandardOutput.BaseStream;
                mStandardOutput.BeginRead(mReadBuffer, 0, mReadBuffer.Length, StandardOutputReadCallback, null);

               using (BinaryWriter bw = new BinaryWriter(clientBuild.StandardInput.BaseStream))
               {
                    // I am simulating a stream of jpegs coming in
                   for (int i = 1; i < 20; i++)
                   {
                       using (MemoryStream ms = new MemoryStream())
                       {
                           System.Diagnostics.Debug.Write(i.ToString("00000"));
                           Bitmap bmp = new Bitmap("h:\\streamin\\" + i.ToString("00000") + ".jpg");
                           bmp.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
                           bw.Write(ms.ToArray());                          
                           bmp.Dispose();
                           ms.Close();
                       }
                   }
                   bw.Close();
               }

                // I need something in my ffmpeg arguments to do something here
                // (redirect the encoded output to my server instead of stdout).
               clientBuild.WaitForExit();
               _data = mStandardOutputMs.ToArray();
               mStandardOutput.Close();
           }
            catch (Exception _ex)
            {
                // Don't swallow failures silently; at minimum log the exception.
                System.Diagnostics.Debug.Write(_ex.Message);
            }
           finally
           {
               clientBuild.Dispose();
           }
           return _data;
       }

    Thanks
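
    One possible direction, following the FFserver lead mentioned above: FFserver accepts input pushed to a feed URL and then serves the transcoded stream to web clients itself. Assuming a feed (say feed1.ffm) and a matching output stream are declared in ffserver.conf on the server, the client-side process could send its output to the feed instead of stdout, along the lines of the sketch below; the host, port and feed name are assumptions, and the encoding settings would then live in the server's <Stream> section rather than in the client arguments:

    ffmpeg -f mjpeg -r 30 -i - http://my.server.example:8090/feed1.ffm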