
Other articles (93)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page

  • Farm management

    2 March 2010, by

    The farm as a whole is managed by "super admins".
    Certain settings can be adjusted to regulate the needs of the different channels.
    Initially, it uses the "Gestion de mutualisation" plugin

On other sites (7966)

  • How to extract a video snippet from a video file that is being written in by ffmpeg in realtime

    26 February 2015, by alex.b

    At the moment I am recording a live video stream from YouTube with youtube-dl (https://github.com/rg3/youtube-dl).

    The command I use for this:

    youtube-dl --id -f 92 https://www.youtube.com/watch?v=VYlQJbsVs48

    92 is a format code that I got after running a command to list the available formats, which gave me this list:

    format code  extension  resolution note
    140          m4a        audio only DASH audio  144k , m4a_dash container, aac  @128k (48000Hz)
    141          m4a        audio only DASH audio  272k , m4a_dash container, aac  @256k (48000Hz)
    160          mp4        256x144    DASH video  124k , 15fps, video only
    133          mp4        426x240    DASH video  258k , 30fps, video only
    134          mp4        640x360    DASH video  616k , 30fps, video only
    135          mp4        854x480    DASH video 1116k , 30fps, video only
    136          mp4        1280x720   DASH video 2216k , 30fps, video only
    137          mp4        1920x1080  DASH video 4141k , 30fps, video only
    151          mp4        72p        HLS
    132          mp4        240p       HLS
    92           mp4        240p       HLS
    93           mp4        360p       HLS
    94           mp4        480p       HLS
    95           mp4        720p       HLS
    96           mp4        1080p      HLS  (best)

    This creates a file called VYlQJbsVs48.mp4.part that, of course, keeps growing.

    Is there a way to extract a video snippet from that live stream, or from the .part file? Or maybe there is a better way of doing this?

    What I have noticed is that if I force-quit iTerm2 while youtube-dl is running, the .part file it creates won't contain any index information (the moov atom, which holds metadata such as the number of frames, is missing from the MP4 file), so it makes me think I cannot extract from the file.

    Maybe there is a way to make youtube-dl write the index information at all times, or another way to record the live stream and get video snippets while it is recording.
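    The general shape of such an extraction can be sketched with ffmpeg alone. This is a sketch, not a verified answer for a growing HLS download: whether stream-copying from the .part file works depends on the container still being readable while it is written, and the generated test clip here is only a stand-in for the real recording.

```shell
# Generate a 10-second stand-in clip; -g 15 forces a keyframe every
# second (at 15 fps) so cuts land predictably. In the real case this
# would be the growing VYlQJbsVs48.mp4.part file.
ffmpeg -y -f lavfi -i testsrc=duration=10:size=128x72:rate=15 \
       -g 15 -pix_fmt yuv420p recording.mp4

# Cut a 3-second snippet starting at the 2-second mark without
# re-encoding; -ss placed before -i makes ffmpeg seek instead of decode.
ffmpeg -y -ss 2 -t 3 -i recording.mp4 -c copy snippet.mp4
```

    Note that with -c copy the cut snaps to keyframe boundaries, so the snippet can start slightly before the requested point; dropping -c copy gives frame-accurate cuts at the cost of re-encoding.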

    I forgot to mention I am doing this on OS X Yosemite. I have FFmpeg installed with Homebrew, along with youtube-dl.

    I am more than happy to try things on Ubuntu if there is a solution.

    Any help would be greatly appreciated.

    Thank you.

    Alex

  • Can I stream 2 AVI or MP4 files to a user as if they were 1 file?

    12 March 2015, by Macmee

    I have several video files, such as:

    bobs_video_part_1.mp4
    bobs_video_part_2.mp4
    bobs_video_part_3.mp4

    and instead of sending the user links to download each video separately:

    http://mynodejsserver.com/videos/bobs_video_part_1.mp4
    http://mynodejsserver.com/videos/bobs_video_part_2.mp4
    http://mynodejsserver.com/videos/bobs_video_part_3.mp4

    I want to be able to give the user just 1 link:

    http://mynodejsserver.com/videos/bobs_video.mp4

    I can combine the videos using something like ffmpeg, but then I can't give the user a link to the video in real time; I have to wait for ffmpeg to finish. Conceptually, I was thinking this would work by doing something along the lines of:

    1. open file streams for each video and sum their sizes to calculate the total Content-Length
    2. open a stream/pipe back to the user and start sending data for bobs_video_part_1.mp4
    3. in that same stream, start sending back bobs_video_part_2.mp4
    4. keep sending the next video back through the same stream/pipe to the user

    Would something like this work? Since I want to send each video in order anyway, I don't see why I can't send each one as if it were part of the same video; the client performing the download request would never know.
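    One caveat with the byte-concatenation idea: each MP4 file carries its own moov atom (index), so simply streaming the raw bytes of part 2 after part 1 produces a file most players will not read past the first part. A hedged sketch of the usual workaround is ffmpeg's concat demuxer, which joins parts without re-encoding when their codec parameters match; the part filenames are taken from the question, and the testsrc step only fabricates stand-in inputs.

```shell
# Fabricate three 2-second stand-in parts (in the question these
# files already exist).
for i in 1 2 3; do
  ffmpeg -y -f lavfi -i testsrc=duration=2:size=128x72:rate=15 \
         -pix_fmt yuv420p "bobs_video_part_$i.mp4"
done

# Join them with the concat demuxer; -c copy avoids re-encoding.
printf "file 'bobs_video_part_%d.mp4'\n" 1 2 3 > list.txt
ffmpeg -y -f concat -safe 0 -i list.txt -c copy bobs_video.mp4
```

    For the real-time serving the question asks about, ffmpeg can also write to a pipe with fragmented output (e.g. -movflags frag_keyframe+empty_moov, or -f mpegts), though a fragmented stream has no known Content-Length up front.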

  • NoSuchMethodException when trying to load native methods from jar file

    20 February 2018, by Anuran Barman

    I am trying to load FFmpeg methods in Android. My requirement is that I don't want to pack the .so files within the APK; only if the user wants it will I download the jar file and load the FFmpeg native methods. After searching, I believe loading .so files at runtime is not possible, so I built the .so files and created a separate Android application with a single class, FFMPEG.java, whose only duty is to call the native methods from the FFmpeg library. I packaged it as a jar file and loaded that into my main application with a ClassLoader. The constructor is getting called, so the class is loaded, but the methods are not found even though they are declared public in the jar file. I am trying to stream RTSP video with FFmpeg. Below are my jar file and main application code.

    public class FFMPEG {

       public FFMPEG(){
           Log.d(FFMPEG.class.getSimpleName(),"constructor called");
       }

       public static native int naInit(String pFileName);
       public static native int[] naGetVideoRes();
       public static native void naSetSurface(Surface pSurface);
       public static native int naSetup(int pWidth, int pHeight);
       public static native void naPlay();
       public static native void naStop();

       public static boolean loadedLibraries;

       static {
           try {
               System.loadLibrary("avutil");
               System.loadLibrary("avcodec");
               System.loadLibrary("avformat");
               System.loadLibrary("swscale");
               System.loadLibrary("avfilter");
               System.loadLibrary("ffmpeg-jni");
               loadedLibraries = true;
           } catch (Throwable e) {
               e.printStackTrace();
           }
       }

       public int libInit(String filename){
           return  naInit(filename);
       }

       public int[] libGetVideoRes(){
           return  naGetVideoRes();
       }
       public void libSetSurface(Surface surface){
           naSetSurface(surface);
       }
       public int libSetup(int width,int height){
           return naSetup(width,height);
       }
       public void libPlay(){
           naPlay();
       }
       public void libStop(){
           naStop();
       }
    }

    My main application activity code follows. The jar file is on my SD card, named camlib.jar:

    @SuppressWarnings("JniMissingFunction")
    public class MainActivity extends AppCompatActivity implements SurfaceHolder.Callback {

       private SurfaceView surfaceView;
       private ProgressBar progressBar;
       private final String TAG=MainActivity.class.getSimpleName();

       private boolean isPlaying;
       private boolean isClassLoaded;
       private boolean isInitialized;
       private String url="";
       Method libInit,libGetVideoRes,libSetSurface,libSetup,libPlay,libStop;
       Object myInstance;
       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
           setContentView(R.layout.activity_main);
           surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
           progressBar = ((ProgressBar) findViewById(R.id.progressBar));
           surfaceView.getHolder().addCallback(this);
           int permission= ActivityCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE);
           if(permission== PackageManager.PERMISSION_GRANTED){
               loadClass();
           }else{
               ActivityCompat.requestPermissions(this,new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},200);
           }

       }

       @Override
       public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
           super.onRequestPermissionsResult(requestCode, permissions, grantResults);
           if(requestCode==200){
               if(grantResults.length>0){
                   if (grantResults[0]==PackageManager.PERMISSION_GRANTED){
                       loadClass();
                   }else{
                       ActivityCompat.requestPermissions(this,new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},200);
                   }
               }
           }
       }

       public void loadClass(){
           try {
               final String libPath = Environment.getExternalStorageDirectory() + "/camlib.jar";
               final File tmpDir = getDir("dex", 0);

               final DexClassLoader classloader = new DexClassLoader(libPath, tmpDir.getAbsolutePath(), null, this.getClass().getClassLoader());
               final Class classToLoad = (Class) classloader.loadClass("com.myeglu.obbapplication.FFMPEG");

                myInstance  = classToLoad.newInstance();
                libInit= classToLoad.getMethod("libInit");
                libGetVideoRes=classToLoad.getMethod("libGetVideoRes");
                libSetSurface=classToLoad.getMethod("libSetSurface");
                libSetup=classToLoad.getMethod("libSetup");
                libPlay=classToLoad.getMethod("libPlay");
                libStop=classToLoad.getMethod("libStop");
                isClassLoaded=true;
                new PlayVideo().execute();

           } catch (Exception e) {
               e.printStackTrace();
           }
       }

       private void postInit() {
           if (isInitialized) {
               initPlay();
               progressBar.setVisibility(View.GONE);
           } else {
               finish();
           }
       }

       private void initPlay() {
           try {
               int[] res = (int[])libGetVideoRes.invoke(myInstance);
               Log.d("ANURAN", "res width " + res[0] + ": height " + res[1]);
               if (res[0] <= 0) {
                   res[0] = 480;
               }
               if (res[1] <= 0) {
                   res[1] = 320;
               }
               int[] screenRes = getScreenRes();
               int width, height;
               float widthScaledRatio = screenRes[0] * 1.0f / res[0];
               float heightScaledRatio = screenRes[1] * 1.0f / res[1];
               if (widthScaledRatio > heightScaledRatio) {
                   //use heightScaledRatio
                   width = (int) (res[0] * heightScaledRatio);
                   height = screenRes[1];
               } else {
                   //use widthScaledRatio
                   width = screenRes[0];
                   height = (int) (res[1] * widthScaledRatio);
               }
               Log.d(TAG, "width " + width + ",height:" + height);
               updateSurfaceView(width, height);
               libSetup.invoke(myInstance,width,height);
               playMedia();
           }catch (Exception e){

           }
       }

       private void playMedia() {
           try {
               if (progressBar.getVisibility() == View.VISIBLE) {
                   progressBar.setVisibility(View.GONE);
               }
               libPlay.invoke(myInstance);
               isPlaying = true;
           }catch (Exception e){

           }
       }
       private void updateSurfaceView(int pWidth, int pHeight) {
           //update surfaceview dimension, this will cause the native window to change
           Log.d("ANURAN UPDATE SURFACE", "width " + pWidth + ",height:" + pHeight);
           FrameLayout.LayoutParams params = (FrameLayout.LayoutParams) surfaceView.getLayoutParams();
           params.width = pWidth;
           params.height = pHeight;
           surfaceView.setLayoutParams(params);
       }

       @SuppressLint("NewApi")
       private int[] getScreenRes() {
           int[] res = new int[2];
           Display display = getWindowManager().getDefaultDisplay();
           Point size = new Point();
           display.getSize(size);
           res[0] = size.x;
           res[1] = size.y;
           return res;
       }

       @Override
       protected void onStop() {
           super.onStop();
           Toast.makeText(MainActivity.this,"onStop called",Toast.LENGTH_SHORT).show();
           stopPlaying();
           finish();
       }

       @Override
       public void onBackPressed() {
           stopPlaying();
           finish();
       }

       private void stopPlaying() {
           isPlaying = false;
           try{
               libStop.invoke(myInstance);
           }catch (Exception e){

           }
       }

       @Override
       protected void onDestroy() {
           super.onDestroy();
           stopPlaying();
           finish();
       }



       @Override
       protected void onRestart() {
           super.onRestart();
           Toast.makeText(MainActivity.this,"onRestart called",Toast.LENGTH_SHORT).show();
           progressBar.setVisibility(View.VISIBLE);

       }


       @Override
       public void surfaceChanged(SurfaceHolder holder, int format, int width,
                                  int height) {

           if(isClassLoaded){
               try {
                   libSetSurface.invoke(myInstance,holder.getSurface());
               } catch (IllegalAccessException e) {
                   e.printStackTrace();
               } catch (InvocationTargetException e) {
                   e.printStackTrace();
               }
           }

       }

       @Override
       public void surfaceCreated(SurfaceHolder holder) {

       }

       @Override
       public void surfaceDestroyed(SurfaceHolder holder) {
           if(isClassLoaded) {
               try {
                   libSetSurface.invoke(myInstance, null);
               } catch (IllegalAccessException e) {
                   e.printStackTrace();
               } catch (InvocationTargetException e) {
                   e.printStackTrace();
               }
           }
       }



       private class PlayVideo extends AsyncTask<Void, Void, Void> {

           @Override
           protected Void doInBackground(Void... voids) {
               try {
                   int temp=(int)libInit.invoke(myInstance,url);
                   isInitialized=(temp==0);
               } catch (IllegalAccessException e) {
                   e.printStackTrace();
               } catch (InvocationTargetException e) {
                   e.printStackTrace();
               }
               return null;
           }

           @Override
           protected void onPostExecute(Void aVoid) {
               super.onPostExecute(aVoid);
               postInit();
               this.cancel(true);
           }
       }
    }