
Other articles (100)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    After it is activated, a preconfiguration is set up automatically by MediaSPIP init so that the new feature is immediately operational. No configuration step is therefore required.

  • Enhancing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and templates ("squelettes"). The templates define where information is placed on the page, defining a specific use of the platform, while the themes define the overall graphic design.
    Anyone can contribute a new graphic theme or template and make it available to the community.

On other sites (12395)

  • (FFMPEG Error) Real-time buffer [USB cam's name] too full or near too full

    9 June 2020, by ZzangWoo

    I am working on a project to create an RTSP stream from my USB camera.
My project environment is:
- OS: Windows Server 2019
- CPU: AMD Ryzen 7 3700X
- RAM: 64 GB
- GPU: NVIDIA GeForce RTX 2070 SUPER

    My goal is to detect objects with YOLO and show both the original camera video and the detection video to the client, so I need to turn my USB camera into an RTSP stream. But I got the error in the title.

    And this is my command line.

    ffmpeg -re -f dshow -i video="JOYTRON HD20" -pix_fmt yuv420p -vsync 1 -threads 0 -vcodec libx264 -r 30 -g 60 -sc_threshold 0 -b:v 640k -bufsize 768k -maxrate 800k -preset veryfast -profile:v baseline -tune film -acodec aac -b:a 128k -ac 2 -ar 48000 -f rtsp rtsp://localhost:8888/test

    I saw an answer saying the problem is bandwidth, so I added a parameter, but it didn't help.

    I also added the -rtbufsize and -thread_queue_size parameters, but that didn't help either (see the illustrative command below for where these options go).
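
    For reference, both of these are input options in ffmpeg and only take effect when they appear before the -i they apply to. A variant of the command above with the options in that position might look like the following (the buffer and queue sizes are illustrative guesses, not tested values):

    ffmpeg -re -f dshow -rtbufsize 512M -thread_queue_size 1024 -i video="JOYTRON HD20" -pix_fmt yuv420p -vsync 1 -threads 0 -vcodec libx264 -r 30 -g 60 -sc_threshold 0 -b:v 640k -bufsize 768k -maxrate 800k -preset veryfast -profile:v baseline -tune film -acodec aac -b:a 128k -ac 2 -ar 48000 -f rtsp rtsp://localhost:8888/test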

    What should I do?

  • NoMethodFoundException when trying to load native methods from jar file

    20 February 2018, by Anuran Barman

    I am trying to load ffmpeg methods in Android. My requirement is that I don't want to pack the .so files within the APK; only if the user wants the feature will I download the jar file and load the ffmpeg native methods. After searching, I think that loading .so files at runtime is not possible. So what I did is build the .so files and create a separate Android application with only one class, FFMPEG.java, whose only duty is to call the native methods from the ffmpeg library. I made that into a jar file and loaded it into my main application with a ClassLoader. The constructor is getting called, so the class is loaded, but the methods are not getting found, even though they are declared public in the jar file. I am trying to stream RTSP video with FFMPEG. Below are my jar file and main application code (see also the reflection sketch after the listings).

    public class FFMPEG {

       public FFMPEG(){
           Log.d(FFMPEG.class.getSimpleName(),"constructor called");
       }

       public static native int naInit(String pFileName);
       public static native int[] naGetVideoRes();
       public static native void naSetSurface(Surface pSurface);
       public static native int naSetup(int pWidth, int pHeight);
       public static native void naPlay();
       public static native void naStop();

       public static boolean loadedLibraries;

       static {
           try {
               System.loadLibrary("avutil");
               System.loadLibrary("avcodec");
               System.loadLibrary("avformat");
               System.loadLibrary("swscale");
               System.loadLibrary("avfilter");
               System.loadLibrary("ffmpeg-jni");
               loadedLibraries = true;
           } catch (Throwable e) {
               e.printStackTrace();
           }
       }

       public int libInit(String filename){
           return  naInit(filename);
       }

       public int[] libGetVideoRes(){
           return  naGetVideoRes();
       }
       public void libSetSurface(Surface surface){
           naSetSurface(surface);
       }
       public int libSetup(int width,int height){
           return naSetup(width,height);
       }
       public void libPlay(){
           naPlay();
       }
       public void libStop(){
           naStop();
       }
    }

    My main application activity code. The jar file is located on my sdcard and named camlib.jar.

    @SuppressWarnings("JniMissingFunction")
    public class MainActivity extends AppCompatActivity implements SurfaceHolder.Callback {

       private SurfaceView surfaceView;
       private ProgressBar progressBar;
       private final String TAG=MainActivity.class.getSimpleName();

       private boolean isPlaying;
       private boolean isClassLoaded;
       private boolean isInitialized;
       private String url="";
       Method libInit,libGetVideoRes,libSetSurface,libSetup,libPlay,libStop;
       Object myInstance;
       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
           setContentView(R.layout.activity_main);
           surfaceView = (SurfaceView) findViewById(R.id.surfaceView);
           progressBar = ((ProgressBar) findViewById(R.id.progressBar));
           surfaceView.getHolder().addCallback(this);
           int permission= ActivityCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE);
           if(permission== PackageManager.PERMISSION_GRANTED){
               loadClass();
           }else{
               ActivityCompat.requestPermissions(this,new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},200);
           }

       }

       @Override
       public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
           super.onRequestPermissionsResult(requestCode, permissions, grantResults);
           if(requestCode==200){
               if(grantResults.length>0){
                   if (grantResults[0]==PackageManager.PERMISSION_GRANTED){
                       loadClass();
                   }else{
                       ActivityCompat.requestPermissions(this,new String[]{Manifest.permission.READ_EXTERNAL_STORAGE},200);
                   }
               }
           }
       }

       public void loadClass(){
           try {
               final String libPath = Environment.getExternalStorageDirectory() + "/camlib.jar";
               final File tmpDir = getDir("dex", 0);

               final DexClassLoader classloader = new DexClassLoader(libPath, tmpDir.getAbsolutePath(), null, this.getClass().getClassLoader());
               final Class classToLoad = (Class) classloader.loadClass("com.myeglu.obbapplication.FFMPEG");

                myInstance  = classToLoad.newInstance();
                libInit= classToLoad.getMethod("libInit");
                libGetVideoRes=classToLoad.getMethod("libGetVideoRes");
                libSetSurface=classToLoad.getMethod("libSetSurface");
                libSetup=classToLoad.getMethod("libSetup");
                libPlay=classToLoad.getMethod("libPlay");
                libStop=classToLoad.getMethod("libStop");
                isClassLoaded=true;
                new PlayVideo().execute();

           } catch (Exception e) {
               e.printStackTrace();
           }
       }

       private void postInit() {
           if (isInitialized) {
               initPlay();
               progressBar.setVisibility(View.GONE);
           } else {
               finish();
           }
       }

       private void initPlay() {
           try {
               int[] res = (int[])libGetVideoRes.invoke(myInstance);
               Log.d("ANURAN", "res width " + res[0] + ": height " + res[1]);
               if (res[0] <= 0) {
                   res[0] = 480;
               }
               if (res[1] <= 0) {
                   res[1] = 320;
               }
               int[] screenRes = getScreenRes();
               int width, height;
               float widthScaledRatio = screenRes[0] * 1.0f / res[0];
               float heightScaledRatio = screenRes[1] * 1.0f / res[1];
               if (widthScaledRatio > heightScaledRatio) {
                   //use heightScaledRatio
                   width = (int) (res[0] * heightScaledRatio);
                   height = screenRes[1];
               } else {
                   //use widthScaledRatio
                   width = screenRes[0];
                   height = (int) (res[1] * widthScaledRatio);
               }
               Log.d(TAG, "width " + width + ",height:" + height);
               updateSurfaceView(width, height);
               libSetup.invoke(myInstance,width,height);
               playMedia();
           }catch (Exception e){

           }
       }

       private void playMedia() {
           try {
               if (progressBar.getVisibility() == View.VISIBLE) {
                   progressBar.setVisibility(View.GONE);
               }
               libPlay.invoke(myInstance);
               isPlaying = true;
           }catch (Exception e){

           }
       }
       private void updateSurfaceView(int pWidth, int pHeight) {
           //update surfaceview dimension, this will cause the native window to change
           Log.d("ANURAN UPDATE SURFACE", "width " + pWidth + ",height:" + pHeight);
           FrameLayout.LayoutParams params = (FrameLayout.LayoutParams) surfaceView.getLayoutParams();
           params.width = pWidth;
           params.height = pHeight;
           surfaceView.setLayoutParams(params);
       }

       @SuppressLint("NewApi")
       private int[] getScreenRes() {
           int[] res = new int[2];
           Display display = getWindowManager().getDefaultDisplay();
           Point size = new Point();
           display.getSize(size);
           res[0] = size.x;
           res[1] = size.y;
           return res;
       }

       @Override
       protected void onStop() {
           super.onStop();
           Toast.makeText(MainActivity.this,"onStop called",Toast.LENGTH_SHORT).show();
           stopPlaying();
           finish();
       }

       @Override
       public void onBackPressed() {
           stopPlaying();
           finish();
       }

       private void stopPlaying() {
           isPlaying = false;
           try{
               libStop.invoke(myInstance);
           }catch (Exception e){

           }
       }

       @Override
       protected void onDestroy() {
           super.onDestroy();
           stopPlaying();
           finish();
       }



       @Override
       protected void onRestart() {
           super.onRestart();
           Toast.makeText(MainActivity.this,"onRestart called",Toast.LENGTH_SHORT).show();
           progressBar.setVisibility(View.VISIBLE);

       }


       @Override
       public void surfaceChanged(SurfaceHolder holder, int format, int width,
                                  int height) {

           if(isClassLoaded){
               try {
                   libSetSurface.invoke(myInstance,holder.getSurface());
               } catch (IllegalAccessException e) {
                   e.printStackTrace();
               } catch (InvocationTargetException e) {
                   e.printStackTrace();
               }
           }

       }

       @Override
       public void surfaceCreated(SurfaceHolder holder) {

       }

       @Override
       public void surfaceDestroyed(SurfaceHolder holder) {
           if(isClassLoaded) {
               try {
                   libSetSurface.invoke(myInstance, null);
               } catch (IllegalAccessException e) {
                   e.printStackTrace();
               } catch (InvocationTargetException e) {
                   e.printStackTrace();
               }
           }
       }



       private class PlayVideo extends AsyncTask<Void, Void, Void> {

           @Override
           protected Void doInBackground(Void... voids) {
               try {
                   int temp=(int)libInit.invoke(myInstance,url);
                   isInitialized=(temp==0);
               } catch (IllegalAccessException e) {
                   e.printStackTrace();
               } catch (InvocationTargetException e) {
                   e.printStackTrace();
               }
               return null;
           }

           @Override
           protected void onPostExecute(Void aVoid) {
               super.onPostExecute(aVoid);
               postInit();
               this.cancel(true);
           }
       }
    }
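
    As an aside on the reflection calls above: Class.getMethod takes the method name followed by the parameter types, and it only matches a method with exactly that signature. A minimal sketch of looking up methods that take arguments (the stream URL here is a hypothetical placeholder, not a value from the project above):

    Class<?> classToLoad = classloader.loadClass("com.myeglu.obbapplication.FFMPEG");
    Object instance = classToLoad.newInstance();
    // getMethod("libInit") with no parameter types would only match a
    // zero-argument libInit(); the parameter types must be listed explicitly.
    Method libInit = classToLoad.getMethod("libInit", String.class);           // libInit(String)
    Method libSetup = classToLoad.getMethod("libSetup", int.class, int.class); // libSetup(int, int)
    int rc = (int) libInit.invoke(instance, "rtsp://example/stream");
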
  • FFMPEG/NVDEC Fails When Under 7 Frames

    13 August 2021, by Meme Machine

    I was looking at the examples from NVIDIA's repository, specifically their Encoding and Decoding projects. I downloaded the desktop duplication project, which allows you to capture a certain number of frames from the desktop as raw H.264. I also got AppDecode, which decodes and displays frames from an input file. I noticed that if I try to capture only a single frame, it fails to decode the input file.

    Here is the output

    C:\Users\Admin>C:\Users\Admin\source\repos\video-sdk-samples\Samples\x64.Debug\AppDecD3d -d3d 11 -i C:\Users\Admin\source\repos\video-sdk-samples\nvEncDXGIOutputDuplicationSample\x64\Debug\ddatest_0.h264
GPU in use: NVIDIA GeForce RTX 2080 Super with Max-Q Design
Display with D3D11.
[INFO ][17:59:47] Media format: raw H.264 video (h264)
Session Initialization Time: 39 ms
[INFO ][17:59:47] Video Input Information
        Codec        : AVC/H.264
        Frame rate   : 30000/1000 = 30 fps
        Sequence     : Progressive
        Coded size   : [1920, 1088]
        Display area : [0, 0, 1920, 1080]
        Chroma       : YUV 420
        Bit depth    : 8
Video Decoding Params:
        Num Surfaces : 20
        Crop         : [0, 0, 0, 0]
        Resize       : 1920x1088
        Deinterlace  : Weave

Total frame decoded: 7
Session Deinitialization Time: 10 ms

C:\Users\Admin>C:\Users\Admin\source\repos\video-sdk-samples\Samples\x64.Debug\AppDecD3d -d3d 11 -i C:\Users\Admin\source\repos\video-sdk-samples\nvEncDXGIOutputDuplicationSample\x64\Debug\ddatest_0.h264
GPU in use: NVIDIA GeForce RTX 2080 Super with Max-Q Design
Display with D3D11.
[INFO ][17:59:54] Media format: raw H.264 video (h264)
[h264 @ 0000023B8AB5C3A0] decoding for stream 0 failed
Session Initialization Time: 42 ms
[INFO ][17:59:54] Video Input Information
        Codec        : AVC/H.264
        Frame rate   : 30000/1000 = 30 fps
        Sequence     : Progressive
        Coded size   : [1920, 1088]
        Display area : [0, 0, 1920, 1080]
        Chroma       : YUV 420
        Bit depth    : 8
Video Decoding Params:
        Num Surfaces : 20
        Crop         : [0, 0, 0, 0]
        Resize       : 1920x1088
        Deinterlace  : Weave

Total frame decoded: 6
Session Deinitialization Time: 10 ms

    I started from 10 frames and counted down to 6, where it eventually failed. It is important for me to know why this happens, because I plan to integrate this decoder into my project and will be feeding it single frames from a stream.

    Oh, and I also noticed that the coded size is 1920x1088 instead of 1920x1080, according to the output log. I'm not sure why that is occurring or whether it is relevant (a likely explanation is sketched below).
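
    The 1088 is most likely macroblock alignment rather than a bug: H.264 codes frames in 16x16 macroblocks, so the coded height is the display height rounded up to the next multiple of 16, and the "Display area" field crops it back to 1080 for presentation. A quick check of that arithmetic (a standalone sketch, not code from the samples):

    // H.264 coded size is the display size rounded up to 16-pixel
    // macroblock boundaries; the display area crops the padding away.
    int displayHeight = 1080;
    int codedHeight = ((displayHeight + 15) / 16) * 16; // 68 * 16 = 1088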