
Media (91)

Other articles (96)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, compared with the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (7127)

  • JavaCV record video in Android

    11 January 2017, by wyx

    I want to record video silently and without a preview on Android. I first chose MediaRecorder: I could record without a preview, but what drives me crazy is that whenever MediaRecorder starts or stops it plays a "dee" sound. I tried many workarounds for that, but I think it is probably tied to the phone's OS. So I am trying JavaCV instead, because I also want a live-streaming function in my app.
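
    To show what I mean, here is roughly the MediaRecorder approach I tried first (a simplified sketch from memory, not my exact code; it needs the CAMERA, RECORD_AUDIO and storage permissions, and the mute trick is ignored on some phones):

    import android.content.Context;
    import android.media.AudioManager;
    import android.media.MediaRecorder;

    public class SilentRecorder {
        private MediaRecorder recorder;

        public void start(Context context, String outputPath) throws Exception {
            // Workaround I tried for the "dee" sound: mute the system stream around
            // start()/stop(). Deprecated on newer APIs and ignored by some ROMs.
            AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            am.setStreamMute(AudioManager.STREAM_SYSTEM, true);

            recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            recorder.setOutputFile(outputPath);
            // No setPreviewDisplay() call, so nothing is drawn on screen
            // (some devices still insist on a preview surface, though).
            recorder.prepare();
            recorder.start();
        }

        public void stop(Context context) {
            if (recorder != null) {
                recorder.stop();
                recorder.release();
                recorder = null;
            }
            // Restore the system stream after stopping.
            AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
            am.setStreamMute(AudioManager.STREAM_SYSTEM, false);
        }
    }

    Even with the system stream muted, some phones still play the start/stop sound, which is why I gave up on MediaRecorder.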

    But JavaCV has cost me far too much time on some strange problems, because it's my first time doing anything with C++ sources and video.

    Just adding compile group: 'org.bytedeco', name: 'javacv-platform', version: '1.3' as the README.md suggests, I can't even build my APK.

    Error:Execution failed for task ':app:transformResourcesWithMergeJavaResForDebug'.
    > com.android.build.api.transform.TransformException: com.android.builder.packaging.DuplicateFileException: Duplicate files copied in APK org/bytedeco/javacpp/macosx-x86_64/libusb-1.0.dylib
       File1: /Users/wyx/.gradle/caches/modules-2/files-2.1/org.bytedeco.javacpp-presets/libfreenect/0.5.3-1.3/736d65a3ef042258429d8e7742128c411806b432/libfreenect-0.5.3-1.3-macosx-x86_64.jar
       File2: /Users/wyx/.gradle/caches/modules-2/files-2.1/org.bytedeco.javacpp-presets/libdc1394/2.2.4-1.3/f1498dacc46162ab68faeb8d66cf02b96fe41c61/libdc1394-2.2.4-1.3-macosx-x86_64.jar

    I then modified it according to this issue and used this to replace it. With that, the APK builds, but it can't run.

     android {
      ..............
       packagingOptions {
           exclude 'META-INF/services/javax.annotation.processing.Processor'
           pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
           pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
           pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
           pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'
       }
    }

    dependencies {
       compile group: 'org.bytedeco', name: 'javacv', version: '1.3'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '3.2.1-1.3', classifier: 'android-x86'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '3.2.1-1.3', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.1.0-1.3', classifier: 'android-x86'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.1.0-1.3', classifier: 'android-arm'
    }

    My demo code: VideoService, which is invoked from MainActivity.

    package com.fs.fs.api;

    import com.fs.fs.App;
    import com.fs.fs.utils.DateUtils;
    import com.fs.fs.utils.FileUtils;

    import org.bytedeco.javacpp.avcodec;
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.FrameRecorder;

    import java.util.Date;

    /**
    * Created by wyx on 2017/1/11.
    */
    public class VideoService {
       private FFmpegFrameRecorder mFrameRecorder;
       private String path;

       private VideoService() {
       }

       private static class SingletonHolder {
           private static final VideoService INSTANCE = new VideoService();
       }

       public static VideoService getInstance() {
           return SingletonHolder.INSTANCE;
       }

       public void startRecordVideo() {
           String fileName = String.format("%s.%s", DateUtils.date2String(new Date(), "yyyyMMdd_HHmmss"), "mp4");
           path = FileUtils.getExternalFullPath(App.getInstance(), fileName);
           mFrameRecorder = new FFmpegFrameRecorder(path, 640, 480, 1);
           mFrameRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
           mFrameRecorder.setVideoOption("tune", "zerolatency");
           mFrameRecorder.setVideoOption("preset", "ultrafast");
           mFrameRecorder.setVideoOption("crf", "28");
           mFrameRecorder.setVideoBitrate(300 * 1000);
           mFrameRecorder.setFormat("mp4");

           mFrameRecorder.setFrameRate(30);
           mFrameRecorder.setAudioOption("crf", "0");
           mFrameRecorder.setSampleRate(48 * 1000);
           mFrameRecorder.setAudioBitrate(960 * 1000);
           mFrameRecorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
           try {
               mFrameRecorder.start();
           } catch (FrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       public void stop() {
           if (mFrameRecorder != null) {
               try {
                   mFrameRecorder.stop();
                   mFrameRecorder.release();
               } catch (FrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               mFrameRecorder = null;
           }
       }

    }

    MainActivity

    package com.fs.fs.activity;

    import android.app.Activity;
    import android.os.Bundle;

    import com.fs.fs.R;
    import com.fs.fs.api.VideoService;

    import static java.lang.Thread.sleep;


    public class MainActivity extends Activity {

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.activity_main);


           VideoService.getInstance().startRecordVideo();
           try {
               sleep(10 * 1000);
           } catch (InterruptedException e) {
               e.printStackTrace();
           }
           VideoService.getInstance().stop();
       }
    }

    The error, which makes me want to cry:

    E/AndroidRuntime: FATAL EXCEPTION: main
                     Process: com.fs.fs, PID: 30259
                     java.lang.NoClassDefFoundError: java.lang.ClassNotFoundException: org.bytedeco.javacpp.avutil
                         at org.bytedeco.javacpp.Loader.load(Loader.java:590)
                         at org.bytedeco.javacpp.Loader.load(Loader.java:530)
                         at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694)
                         at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149)
                         at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34)
                         at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75)
                         at android.app.Activity.performCreate(Activity.java:5304)
                         at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090)
                         at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245)
                         at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331)
                         at android.app.ActivityThread.access$1000(ActivityThread.java:143)
                         at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244)
                         at android.os.Handler.dispatchMessage(Handler.java:102)
                         at android.os.Looper.loop(Looper.java:136)
                         at android.app.ActivityThread.main(ActivityThread.java:5291)
                         at java.lang.reflect.Method.invokeNative(Native Method)
                         at java.lang.reflect.Method.invoke(Method.java:515)
                         at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849)
                         at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665)
                         at dalvik.system.NativeStart.main(Native Method)
                      Caused by: java.lang.ClassNotFoundException: org.bytedeco.javacpp.avutil
                         at java.lang.Class.classForName(Native Method)
                         at java.lang.Class.forName(Class.java:251)
                         at org.bytedeco.javacpp.Loader.load(Loader.java:585)
                         at org.bytedeco.javacpp.Loader.load(Loader.java:530) 
                         at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694) 
                         at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149) 
                         at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34) 
                         at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75) 
                         at android.app.Activity.performCreate(Activity.java:5304) 
                         at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090) 
                         at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245) 
                         at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331) 
                         at android.app.ActivityThread.access$1000(ActivityThread.java:143) 
                         at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244) 
                         at android.os.Handler.dispatchMessage(Handler.java:102) 
                         at android.os.Looper.loop(Looper.java:136) 
                         at android.app.ActivityThread.main(ActivityThread.java:5291) 
                         at java.lang.reflect.Method.invokeNative(Native Method) 
                         at java.lang.reflect.Method.invoke(Method.java:515) 
                         at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849) 
                         at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665) 
                         at dalvik.system.NativeStart.main(Native Method) 
                      Caused by: java.lang.NoClassDefFoundError: org/bytedeco/javacpp/avutil
                         at java.lang.Class.classForName(Native Method) 
                         at java.lang.Class.forName(Class.java:251) 
                         at org.bytedeco.javacpp.Loader.load(Loader.java:585) 
                         at org.bytedeco.javacpp.Loader.load(Loader.java:530) 
                         at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694) 
                         at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149) 
                         at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34) 
                         at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75) 
                         at android.app.Activity.performCreate(Activity.java:5304) 
                         at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090) 
                         at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245) 
                         at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331) 
                         at android.app.ActivityThread.access$1000(ActivityThread.java:143) 
                         at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244) 
                         at android.os.Handler.dispatchMessage(Handler.java:102) 
                         at android.os.Looper.loop(Looper.java:136) 
                         at android.app.ActivityThread.main(ActivityThread.java:5291) 
                         at java.lang.reflect.Method.invokeNative(Native Method) 
                         at java.lang.reflect.Method.invoke(Method.java:515) 
                         at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849) 
                         at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665) 
                         at dalvik.system.NativeStart.main(Native Method) 
                      Caused by: java.lang.ClassNotFoundException: Didn't find class "org.bytedeco.javacpp.avutil" on path: DexPathList[[zip file "/data/app/com.fs.fs-2.apk"],nativeLibraryDirectories=[/data/app-lib/com.fs.fs-2, /vendor/lib, /system/lib, /data/datalib]]
                         at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:56)
                         at java.lang.ClassLoader.loadClass(ClassLoader.java:497)
                         at java.lang.ClassLoader.loadClass(ClassLoader.java:457)
                         at java.lang.Class.classForName(Native Method) 
                         at java.lang.Class.forName(Class.java:251) 
                         at org.bytedeco.javacpp.Loader.load(Loader.java:585) 
                         at org.bytedeco.javacpp.Loader.load(Loader.java:530) 
                         at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694) 
                         at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149) 
                         at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34) 
                         at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75) 
                         at android.app.Activity.performCreate(Activity.java:5304) 
                         at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090) 
                         at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245) 
                         at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331) 
                         at android.app.ActivityThread.access$1000(ActivityThread.java:143) 
                         at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244) 
                         at android.os.Handler.dispatchMessage(Handler.java:102) 
                         at android.os.Looper.loop(Looper.java:136) 
                         at android.app.ActivityThread.main(ActivityThread.java:5291) 
                         at java.lang.reflect.Method.invokeNative(Native Method) 
                         at java.lang.reflect.Method.invoke(Method.java:515) 
                         at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849) 
                         at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665) 
                         at dalvik.system.NativeStart.main(Native Method) 

    So I want to know a comfortable way to achieve my goal: record video silently and without a preview, plus live streaming in real time.

    I found that ffmpeg4android is a perfect library for running ffmpeg commands. I currently only use it to compress videos from MediaRecorder, but I don't know how to use it to achieve my goal.
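
    As far as I understand, FFmpegFrameRecorder only encodes frames that are explicitly passed to record(), so I guess I would still have to push camera preview frames into it myself, roughly like the untested sketch below (modelled on the JavaCV RecordActivity sample; the class name and wiring are just my assumption, not code that runs in my app):

    import java.nio.ByteBuffer;

    import android.hardware.Camera;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameRecorder;

    public class PreviewFrameFeeder implements Camera.PreviewCallback {
        private final FFmpegFrameRecorder recorder;
        // Reusable Frame wrapping the camera's NV21 preview buffer
        // (2 "channels", as in the JavaCV RecordActivity sample).
        private final Frame yuvFrame;

        public PreviewFrameFeeder(FFmpegFrameRecorder recorder, int width, int height) {
            this.recorder = recorder;
            this.yuvFrame = new Frame(width, height, Frame.DEPTH_UBYTE, 2);
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            try {
                // Copy the NV21 bytes into the Frame and hand it to the recorder.
                ((ByteBuffer) yuvFrame.image[0].position(0)).put(data);
                recorder.record(yuvFrame);
            } catch (FrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
    }

    I would register this with Camera.setPreviewCallback() and give the camera a dummy SurfaceTexture instead of a visible SurfaceView, but I am not sure this is the comfortable way, which is why I am asking.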

  • MPEG DASH - do I need to have audio and video tracks as separate source files to create a DASH package using mp4box

    10 July 2016, by Tarun

    I have one source mp4 and tried to create an MPEG-DASH package from it using mp4box by GPAC.
    I am able to play the output MPD files in GPAC's OSMO4 player.

    However, I am not able to play the same MPD in the DASH JS player at http://dashif.org/reference/players/javascript/0.2.3/index.html

    When I try to play the MPD in it, I get the error "Error creating source buffer".

    I tried reading their MPD files, and I found that they use audio and video as separate source tracks.

    Ques1) Do the DASH specs state that audio and video should be separate source tracks?

    Ques2) Please find below the MPD file I created; let me know if anybody sees a problem with it.

    <MPD type="static" xmlns="urn:mpeg:DASH:schema:MPD:2011" profiles="urn:mpeg:dash:profile:full:2011" minBufferTime="PT1.5S" mediaPresentationDuration="PT0H2M31.63S">
     <ProgramInformation moreInformationURL="http://gpac.sourceforge.net">
     </ProgramInformation>
     <Period start="PT0S" duration="PT0H2M31.63S">
      <AdaptationSet>
       <ContentComponent contentType="video"/>
       <ContentComponent contentType="audio" lang="und"/>
       <SegmentTemplate initialization="flight_init.mp4"/>
       <Representation mimeType="video/mp4" codecs="avc1.64001f,mp4a.40.02" width="1280" height="720" sampleRate="44100" numChannels="2" lang="und" startWithSAP="1" bandwidth="3096320">
        <SegmentTemplate timescale="1000" duration="20164" media="flight_test_flight_3000$Number$.mp4" startNumber="1"/>
       </Representation>
       <Representation mimeType="video/mp4" codecs="avc1.64001e,mp4a.40.02" width="640" height="360" sampleRate="44100" numChannels="2" lang="und" startWithSAP="1" bandwidth="1119428">
        <SegmentTemplate timescale="1000" duration="20099" media="flight_test_flight_1000$Number$.mp4" startNumber="1"/>
       </Representation>
       <Representation mimeType="video/mp4" codecs="avc1.640014,mp4a.40.02" width="320" height="180" sampleRate="44100" numChannels="2" lang="und" startWithSAP="1" bandwidth="722208">
        <SegmentTemplate timescale="1000" duration="20164" media="flight_test_flight_600$Number$.mp4" startNumber="1"/>
       </Representation>
      </AdaptationSet>
     </Period>
    </MPD>

  • FFMPEG SCREENSHOT GENERATE ERROR: No such filter: 'tile' [duplicate]

    23 May 2013, by itseasy21

    I have been trying to make multiple screenshots from a video file using ffmpeg, and I have put the command together, but the only problem is that while executing it I am getting this error:

    No such filter: 'tile'
    Error opening filters!

    The command I execute is:

    ffmpeg -ss 00:00:10 -i './tmp/try.avi' -vcodec mjpeg -vframes 1 -vf 'select=not(mod(n\,1000)),scale=320:240,tile=2x3' './tmp/try.jpg'

    The output I get is:

    xxxxx@xxxx.com [~/public_html/xxxx]# ffmpeg -ss 00:00:10 -i './tmp/try.avi' -vcodec mjpeg -vframes 1 -vf 'select=not(mod(n\,1000)),scale=320:240,tile=2x3' './tmp/try.jpg'

    ffmpeg version 0.7.11, Copyright (c) 2000-2011 the FFmpeg developers
     built on Mar 10 2012 18:07:20 with gcc 4.4.6 20110731 (Red Hat 4.4.6-3)
     configuration: --prefix=/usr --libdir=/usr/lib64 --shlibdir=/usr/lib64 --mandir=/usr/share/man --enable-shared --enable-runtime-cpudetect --enable-gpl --enable-version3 --enable-postproc --enable-avfilter --enable-pthreads --enable-x11grab --enable-vdpau --disable-avisynth --enable-frei0r --enable-libopencv --enable-libdc1394 --enable-libdirac --enable-libgsm --enable-libmp3lame --enable-libnut --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -fPIC' --disable-stripping
     libavutil    50. 43. 0 / 50. 43. 0
     libavcodec   52.123. 0 / 52.123. 0
     libavformat  52.111. 0 / 52.111. 0
     libavdevice  52.  5. 0 / 52.  5. 0
     libavfilter   1. 80. 0 /  1. 80. 0
     libswscale    0. 14. 1 /  0. 14. 1
     libpostproc  51.  2. 0 / 51.  2. 0

    Seems stream 0 codec frame rate differs from container frame rate: 29.97 (30000/1001) -> 25.00 (25/1)
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from './tmp/try.avi':
     Metadata:
       major_brand     : 3gp4
       minor_version   : 512
       compatible_brands: isomiso23gp4
       creation_time   : 1970-01-01 00:00:00
     Duration: 00:09:24.82, start: 0.000000, bitrate: 118 kb/s
       Stream #0.0(und): Video: h263, yuv420p, 176x144 [PAR 12:11 DAR 4:3], 102 kb/s, 25 fps, 25 tbr, 25 tbn, 29.97 tbc
       Metadata:
         creation_time   : 1970-01-01 00:00:00
       Stream #0.1(und): Audio: amrnb, 8000 Hz, 1 channels, flt, 12 kb/s
       Metadata:
         creation_time   : 1970-01-01 00:00:00
    Incompatible pixel format 'yuv420p' for codec 'mjpeg', auto-selecting format 'yuvj420p'
    [buffer @ 0x1ad89a0] w:176 h:144 pixfmt:yuv420p tb:1/1000000 sar:12/11 sws_param:
    No such filter: 'tile'
    Error opening filters!

    Any solution for this?