Advanced search

Media (0)

Keyword: - Tags -/latitude

No media matching your criteria is available on this site.

Other articles (52)

  • Authorizations overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier(), so that visitors can edit their own information on the authors page

  • Libraries and binaries specific to video and audio processing

    31 January 2010, by

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFMpeg: the main encoder; it transcodes almost every type of video and audio file into formats readable on the Internet. See this tutorial for its installation; Oggz-tools: inspection tools for ogg files; Mediainfo: retrieves information from most video and audio formats;
    Complementary, optional binaries: flvtool2: (...)

  • Contribute to a better visual interface

    13 avril 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (9231)

  • Working with ffmpeg in Xamarin Android

    8 January 2018, by Ahmed Mujtaba

    I’m building an Android app using Xamarin. The app needs to capture video from the camera and encode it before sending it to a server. Initially I used an encoder library on the server side to encode the recorded video, but it proved extremely unreliable and inefficient, especially for large video files; I have posted my issues on another thread here. I then decided to encode the video on the client side and send it to the server. Encoding turned out to be a bit complicated, and there isn’t much information available on how to do it, so I looked into the only approach I knew: using ffmpeg. I found some solutions. There’s a project on GitHub that demonstrates how ffmpeg can be used inside a Xamarin Android project, but running the solution doesn’t produce any output. The project ships a prebuilt ffmpeg binary, which is installed to the phone’s storage using the code below:

    _ffmpegBin = InstallBinary(XamarinAndroidFFmpeg.Resource.Raw.ffmpeg, "ffmpeg", false);

    Below is the example code for encoding the video into a set of different outputs:

    _workingDirectory = Android.OS.Environment.ExternalStorageDirectory.AbsolutePath;
    var sourceMp4 = "cat1.mp4";
    var destinationPathAndFilename = System.IO.Path.Combine (_workingDirectory, "cat1_out.mp4");
    var destinationPathAndFilename2 = System.IO.Path.Combine (_workingDirectory, "cat1_out2.mp4");
    var destinationPathAndFilename4 = System.IO.Path.Combine (_workingDirectory, "cat1_out4.wav");
    if (File.Exists (destinationPathAndFilename))
        File.Delete (destinationPathAndFilename);
    CreateSampleFile(Resource.Raw.cat1, _workingDirectory, sourceMp4);

    var ffmpeg = new FFMpeg (this, _workingDirectory);

    var sourceClip = new Clip (System.IO.Path.Combine(_workingDirectory, sourceMp4));

    var result = ffmpeg.GetInfo (sourceClip);

    var br = System.Environment.NewLine;

    // There are callbacks for standard output and standard error while the ffmpeg binary runs as a process:

    var onComplete = new MyCommand ((_) => {
        RunOnUiThread(() => _logView.Append("DONE!" + br + br));
    });

    var onMessage = new MyCommand ((message) => {
        RunOnUiThread(() => _logView.Append(message + br + br));
    });

    var callbacks = new FFMpegCallbacks (onComplete, onMessage);

    // 1. The idea of this first test is to show that video editing is possible via FFmpeg:
    // it results in a 150x150 movie that eventually zooms in on a cat ear, desaturated, with a fade-in.

    var filters = new List<VideoFilter> ();
    filters.Add (new FadeVideoFilter ("in", 0, 100));
    filters.Add (new CropVideoFilter ("150", "150", "0", "0"));
    filters.Add (new ColorVideoFilter (1.0m, 1.0m, 0.0m, 0.5m, 1.0m, 1.0m, 1.0m, 1.0m));
    var outputClip = new Clip (destinationPathAndFilename) { videoFilter = VideoFilter.Build (filters) };
    outputClip.H264_CRF = "18"; // The quality coefficient for H264 - default is 28; 18 is pretty good.
    ffmpeg.ProcessVideo(sourceClip, outputClip, true, new FFMpegCallbacks(onComplete, onMessage));

    // 2. A similar version on the command line only.
    // Output options such as -acodec must come before the output filename.
    string[] cmds = new string[] {
        "-y",
        "-i",
        sourceClip.path,
        "-strict",
        "-2",
        "-vf",
        "mp=eq2=1:1.68:0.3:1.25:1:0.96:1",
        "-acodec",
        "copy",
        destinationPathAndFilename2,
    };
    ffmpeg.Execute (cmds, callbacks);

    // 3. This lists codecs:
    string[] cmds3 = new string[] {
        "-codecs",
    };
    ffmpeg.Execute (cmds3, callbacks);

    // 4. This converts to WAV.
    // Note that the cat movie just has some silent house noise.
    ffmpeg.ConvertToWaveAudio(sourceClip, destinationPathAndFilename4, 44100, 2, callbacks, true);

    I have tried different commands, but no output file is generated. I also tried another project found here, but it has the same issue: I don’t get any errors, yet no output file is generated. I’m really hoping someone can help me find a way to use ffmpeg in my project, or some other way to compress video for transport to the server.

    I would really appreciate it if someone could point me in the right direction.
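    As a rough sketch of the client-side compression the question is after, a typical ffmpeg command line for shrinking a recording before upload might look like the following. The filenames are placeholders, and libx264/aac support depends on how the bundled ffmpeg binary was built:

```shell
# Hypothetical example: compress a recorded clip before uploading it.
# input.mp4 and compressed.mp4 are placeholder names.
ffmpeg -y -i input.mp4 \
  -c:v libx264 -crf 28 -preset veryfast \
  -c:a aac -b:a 96k \
  -movflags +faststart \
  compressed.mp4
```

    Split into an array of arguments, the same command could be handed to ffmpeg.Execute the way cmds is in the code above.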

  • ffmpeg: Merge multiple mpeg videos while switching audio tracks

    12 October 2015, by Mark Schneider

    I have 7 mpeg files that were converted from images and videos as follows:

    001.mpeg - created from a still image
    002.mpeg - created from a still image
    003.mpeg - video with an audio track
    004.mpeg - created from a still image
    005.mpeg - video with an audio track
    006.mpeg - created from a still image

    I need to concatenate them together along with an audio soundtrack into a single mp4 file. While the mpeg files created from images are being displayed, I’d like to hear the audio soundtrack, however, while the videos (003.mpeg and 005.mpeg) are being displayed, I’d like to hear their respective audio tracks instead of the soundtrack.

    I’m able to stitch the files together, with the audio soundtrack playing throughout, using these commands:

    ffmpeg -i concat:"001.mpeg|002.mpeg|003.mpeg|004.mpeg|005.mpeg|007.mpeg" -c copy video_before_adding_audio.mp4

    ffmpeg -i video_before_adding_audio.mp4 -i soundtrack.mp3 -map 0:0 -map 1 -vcodec copy -acodec aac -strict experimental -shortest vue.mp4

    Is there a way to get ffmpeg to “duck” the audio soundtrack down to 10% volume while the videos are being shown? If not, is there a way to mute the soundtrack entirely while those files are being shown?
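    One way to approach the ducking, assuming the start and end times of the video segments within the combined file are known (the timestamps below are placeholders), is the volume filter’s timeline option:

```shell
# Hypothetical timings: suppose 003.mpeg occupies t=10..25 s and
# 005.mpeg occupies t=40..55 s of the concatenated file.
# Drop the soundtrack to 10% volume during those windows, then mix it
# with the concatenated video's own audio track.
ffmpeg -i video_before_adding_audio.mp4 -i soundtrack.mp3 \
  -filter_complex "[1:a]volume=0.1:enable='between(t,10,25)+between(t,40,55)'[duck]; \
                   [0:a][duck]amix=inputs=2:duration=first[aout]" \
  -map 0:v -map "[aout]" -c:v copy -c:a aac -shortest vue.mp4
```

    The enable expression treats + as a logical OR, so the soundtrack is attenuated in both windows and plays at full volume elsewhere.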

    As a secondary question, so as to not have abrupt changes in the sound when switching between the soundtrack and the videos’ audio tracks, is there a way to instruct ffmpeg to fade the soundtrack out just as 003.mpeg is reached (maybe over 1/4 second) and to fade it back in when 004.mpeg starts (and to do the same for 005.mpeg and 006.mpeg)?
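    For the fades, chaining two afade instances on the soundtrack gives a fade-out followed by a fade-in; again, the timestamps are placeholders that would have to be computed from the actual clip durations:

```shell
# Hypothetical timings: fade the soundtrack out over the 1/4 second
# before 003.mpeg starts at t=10, and back in when 004.mpeg starts at t=25.
# After afade=t=out the track stays silent until the next afade=t=in,
# so only the video's own audio is heard in between.
ffmpeg -i video_before_adding_audio.mp4 -i soundtrack.mp3 \
  -filter_complex "[1:a]afade=t=out:st=9.75:d=0.25,afade=t=in:st=25:d=0.25[bg]; \
                   [0:a][bg]amix=inputs=2:duration=first[aout]" \
  -map 0:v -map "[aout]" -c:v copy -c:a aac -shortest vue.mp4
```

    A second pair of afade filters would be appended to the same chain to handle the 005.mpeg/006.mpeg transition.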

    Here is the ffprobe output for the 003.mpeg and 005.mpeg files:

    ffprobe version 2.5 Copyright (c) 2007-2014 the FFmpeg developers
     built on Dec 10 2014 19:03:38 with Apple LLVM version 6.0 (clang-600.0.56) (based on LLVM 3.5svn)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/2.5 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libmp3lame --enable-libxvid --enable-libvo-aacenc --enable-vda
     libavutil      54. 15.100 / 54. 15.100
     libavcodec     56. 13.100 / 56. 13.100
     libavformat    56. 15.102 / 56. 15.102
     libavdevice    56.  3.100 / 56.  3.100
     libavfilter     5.  2.103 /  5.  2.103
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, mpegts, from '006.vuepart.mpeg':
     Duration: 00:00:07.59, start: 1.555756, bitrate: 2356 kb/s
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p, 800x600, 12 fps, 12 tbr, 90k tbn, 24 tbc
       Stream #0:1[0x101](und): Audio: mp2 ([3][0][0][0] / 0x0003), 44100 Hz, mono, s16p, 384 kb/s

    Here is the ffprobe output for the mpeg file created from still images:

    ffprobe version 2.5 Copyright (c) 2007-2014 the FFmpeg developers
     built on Dec 10 2014 19:03:38 with Apple LLVM version 6.0 (clang-600.0.56) (based on LLVM 3.5svn)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/2.5 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libmp3lame --enable-libxvid --enable-libvo-aacenc --enable-vda
     libavutil      54. 15.100 / 54. 15.100
     libavcodec     56. 13.100 / 56. 13.100
     libavformat    56. 15.102 / 56. 15.102
     libavdevice    56.  3.100 / 56.  3.100
     libavfilter     5.  2.103 /  5.  2.103
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, mpegts, from '001.vuepart.mpeg':
     Duration: 00:00:02.54, start: 1.522422, bitrate: 609 kb/s
     Program 1
       Metadata:
         service_name    : Service01
         service_provider: FFmpeg
       Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuvj420p(pc), 800x600 [SAR 1:1 DAR 4:3], 15 fps, 15 tbr, 90k tbn, 30 tbc
       Stream #0:1[0x101]: Audio: mp2 ([3][0][0][0] / 0x0003), 44100 Hz, mono, s16p, 384 kb/s

    Note: I’ve added a silent audio track to the still images.
