Advanced search

Media (1)

Word: - Tags -/copyleft

Other articles (107)

  • Managing forums

    3 November 2011, by

    If forums are enabled on the site, administrators can manage them either from the administration interface or from the article itself, via the article-editing block found in the page navigation.
    Accessing the message-moderation interface
    Once logged in to the site, an administrator can manage the forums in two ways.
    To edit the forums of a particular article (moderating a message, or flagging it as spam), he has at his (...)

  • Customising by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • User profiles

    12 April 2011, by

    Each user has a profile page for editing their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
    The user can edit their profile from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)

On other sites (10853)

  • react-native-ffmpeg : Native module cannot be null (ios) and null is not an object (evaluating 'RNFFmpegModule.enableLogEvents') (android)

    4 March 2020, by foxtrotuniform6969

    I have a basic Expo project that I've ejected to ExpoKit, and I proceeded to run expo install react-native-ffmpeg. When I then ran expo start, I was greeted with Native module cannot be null on my iOS device and null is not an object (evaluating 'RNFFmpegModule.enableLogEvents') on Android.

    After that, I re-read the react-native-ffmpeg documentation and added pod 'react-native-ffmpeg', :podspec => '../node_modules/react-native-ffmpeg/ios/react-native-ffmpeg.podspec' to my ios/Podfile and ran pod install. Still no luck.

    After that, I deleted the node_modules folder and yarn.lock file, re-ran yarn, went into the ios directory, deleted the Pods folder and Podfile.lock file, re-ran pod install, and still had no luck!

    The error message is less than helpful. Any ideas where to look?

  • Video compressing not working with ffmpeg4android_lib library

    18 December 2019, by Nand Kishor Patidar

    When I try video compression with the ffmpeg4android_lib library, it works in debug mode (not the Google Play Store build). But when I check the live Play Store build, video compression does not work, and I found that the application crashed. After the crash, it gives this error:
    videokitLibPath not exits : /data/app/com.practice.presto-SSicBo_Qm4UXOQAQ45Vttg==/lib/arm/libvideokit.so
    My app log is below:

    1576657500.663 20809-20809/com.practice.presto D/MyPracticeAudioVideo: outside wifi file type 2
    1576657500.807 20809-22043/com.practice.presto D/MyPracticeAudioVideo: Worker started
    1576657500.807 20809-22043/com.practice.presto D/MyPracticeAudioVideo: new file video path /storage/emulated/0/DCIM/Camera/VID_20191218_135413.mp4
    1576657500.808 20809-22043/com.practice.presto D/MyPracticeAudioVideo: Acquire wake lock
    1576657500.812 20809-22043/com.practice.presto I/ffmpeg4android: checkForPermissions() called
    1576657500.814 20809-22043/com.practice.presto I/ffmpeg4android: permission already granted
    1576657500.830 20809-22043/com.practice.presto D/MyPracticeAudioVideo: ffmpeg -y -i /storage/emulated/0/DCIM/Camera/VID_20191218_135413.mp4 -strict experimental -s 320x240 -r 25 -aspect 4:3 -vcodec mpeg4 -b 150k -ab 48000 -ac 2 -ar 22050 /storage/emulated/0/pp/1576657432247.mp4
    1576657500.831 20809-22043/com.practice.presto I/ffmpeg4android: running ffmpeg4android_lib: 41.07.00_LM41_00
    1576657500.831 20809-22043/com.practice.presto D/ffmpeg4android: {"ffmpeg","-y","-i","/storage/emulated/0/DCIM/Camera/VID_20191218_135413.mp4","-strict","experimental","-s","320x240","-r","25","-aspect","4:3","-vcodec","mpeg4","-b","150k","-ab","48000","-ac","2","-ar","22050","/storage/emulated/0/pp/1576657432247.mp4"}
    1576657500.833 20809-22043/com.practice.presto D/ffmpeg4android: /storage/emulated/0/DCIM/Camera/VID_20191218_135413.mp4 length in bytes: 15793157
    1576657500.834 20809-22043/com.practice.presto E/ffmpeg4android: **videokitLibPath not exits: /data/app/com.practice.presto-SSicBo_Qm4UXOQAQ45Vttg==/lib/arm/libvideokit.so**
    1576657500.835 20809-22043/com.practice.presto I/Videokit: libvideokit.so load Fail: /data/app/com.practice.presto-SSicBo_Qm4UXOQAQ45Vttg==/lib/arm/libvideokit.so
    1576657500.835 20809-22043/com.practice.presto I/Videokit: args is not NULL
    1576657500.835 20809-22043/com.practice.presto I/Videokit: more then one arg
    1576657500.835 20809-22043/com.practice.presto I/Videokit: function symbol not found
    1576657500.835 20809-22043/com.practice.presto D/Videokit: Calling videokit run via loader
    1576657500.860 20809-22043/com.practice.presto A/libc: **Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x0 in tid 22043 (Thread-19), pid 20809 (practice.presto)**
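
    The SIGSEGV above follows directly from libvideokit.so not being found at the expected path; release installs from the Play Store (App Bundles, ABI splits) can lay native libraries out differently than a debug APK does. Below is a minimal sketch of a pre-flight check, assuming the directory string would come from getApplicationInfo().nativeLibraryDir on Android; the path and class name here are hypothetical placeholders, not part of ffmpeg4android.

```java
import java.io.File;

public class NativeLibCheck {
    // libDir is passed in as a plain string so the check is self-contained;
    // on Android it would be context.getApplicationInfo().nativeLibraryDir.
    public static boolean hasLib(String libDir, String libName) {
        return new File(libDir, libName).exists();
    }

    public static void main(String[] args) {
        // Hypothetical path, mirroring the one in the log above.
        if (!hasLib("/data/app/<package>/lib/arm", "libvideokit.so")) {
            System.out.println("libvideokit.so missing; skip transcode instead of crashing");
        }
    }
}
```

    Failing fast like this would turn the later native SIGSEGV into a recoverable Java-level condition.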

  • FFmpeg Javacv - Latency Issue

    24 December 2019, by cagney

    I am using an Android v21 device to stream data to a JavaFX application. It's working fine, but I have about 2 seconds of latency.

    As of now, the basic transport chain goes like this:

    1. android webrtc/custom implementation 16 ms
    2. android packetizer (udp) 6 ms
    3. udp transport assumed at < 5 ms
    4. windows depacketizer no buildup of data in buffers
    5. windows ffmpeg framegrabber unknown latency
    6. javafx imageview < 1 ms
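
    Summing the stages that are measured or bounded in the list above makes the gap explicit (an illustrative tally using the figures quoted, not new measurements):

```java
public class LatencyBudget {
    public static void main(String[] args) {
        double encodeMs    = 16; // android webrtc/custom implementation
        double packetizeMs = 6;  // android packetizer (udp)
        double transportMs = 5;  // udp transport, assumed upper bound
        double displayMs   = 1;  // javafx imageview
        double accounted = encodeMs + packetizeMs + transportMs + displayMs;
        // ~28 ms is accounted for, so nearly all of the ~2000 ms observed
        // must sit in the one unmeasured stage: the ffmpeg framegrabber.
        System.out.println(accounted + " ms accounted for");
    }
}
```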

    The data stream to my desktop is much faster than my frame rate, and my packetizer is often just waiting. There is no buildup of data anywhere else, so I assume there is no delay in any of my code.

    I tested my Android device by writing the YUV from the camera to a texture and timing how long the device takes to encode the frame into H.264, and then how long until it is sent: 16 + 6 = 22 ms.

    I feel the problem is with the JavaCV FFmpeg framegrabber. I'm studying this API to learn why this is occurring.

    My major concern is that the framegrabber takes forever to start: around 4 seconds.

    Once it starts, I can clearly see how many frames I insert and how many it is grabbing, and it is always lagging by some large number, such as 40 and up to 200.

    Also, FrameGrabber.grab() is blocking and runs every 100 ms to match my frame rate no matter how fast I tell it to run, so I can never catch up.
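
    That mismatch between producer and consumer rates is enough to explain the growing lag. A toy model of the backlog (the 30 fps producer figure is an assumption for illustration, not taken from the question):

```java
public class BacklogModel {
    // Frames queued up after `seconds` when frames arrive faster than
    // grab() consumes them; the backlog can never be negative.
    public static int backlogAfter(double producerFps, double consumerFps, double seconds) {
        return (int) Math.max(0, (producerFps - consumerFps) * seconds);
    }

    public static void main(String[] args) {
        // grab() blocking ~100 ms -> ~10 fps consumed; assume ~30 fps produced.
        System.out.println(backlogAfter(30, 10, 2)); // 40 frames behind after 2 s
    }
}
```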

    Do you have any suggestions?

    I'm starting to think JavaCV is not a viable solution, because it seems many people struggle with this delay issue. If you have alternative suggestions, please advise.

    My FFmpeg framegrabber:

       public RapidDecoder(final InputStream inputStream, final ImageView view)
    {
       System.out.println(TAG + " starting");

        grabber = new FFmpegFrameGrabber(inputStream, 0);
        converter = new Java2DFrameConverter();
        mView = view;


       emptyBuffer = new Runnable() {
           @Override
           public void run() {
               System.out.println(TAG + " emptybuffer thread running");
               try {

                   grabber.setFrameRate(12);
                   grabber.setVideoBitrate(10000);

                   //grabber.setOption("g", "2");
                  // grabber.setOption("bufsize", "10000");
                   //grabber.setOption("af", "delay 20");
                   //grabber.setNumBuffers(0);
                   //grabber.setOption("flush_packets", "1");
                    //grabber.setOption("probesize", "32");
                   //grabber.setOption("analyzeduration", "0");
                   grabber.setOption("preset", "ultrafast");

                   grabber.setOption("fflags", "nobuffer");
                   //grabber.setVideoOption("nobuffer", "1");
                   //grabber.setOption("fflags", "discardcorrupt");
                   //grabber.setOption("framedrop", "\\");
                  //grabber.setOption("flags","low_delay");
                   grabber.setOption("strict","experimental");
                   //grabber.setOption("avioflags", "direct");
                   //grabber.setOption("filter:v", "fps=fps=30");
                   grabber.setVideoOption("tune", "zerolatency");
                   //grabber.setFrameNumber(60);


                   grabber.start();
               }catch (Exception e)
               {
                   System.out.println(TAG + e);
               }

               while (true)
               {

                   try{
                       grabFrame();
                       Thread.sleep(1);
                   }catch (Exception e)
                   {
                       System.out.println(TAG + " emptybuffer " + e);
                   }

               }



           }
       };

       display = new Runnable() {
           @Override
           public void run() {

               System.out.println(TAG + " display thread running ");

               while(true)
               {

                   try{
                       displayImage();
                       Thread.sleep(10);
                   }catch (Exception e)
                   {
                       System.out.println(TAG + " display " + e);
                   }

               }


           }
       };




    }


    public void generateVideo()
    {
       System.out.println(TAG + " genvid ");




       new Thread(emptyBuffer).start();
       new Thread(display).start();



    }



    public synchronized void grabFrame() throws FrameGrabber.Exception
    {
              //frame = grabber.grabFrame();
           frame = grabber.grab();
       //System.out.println("grab");


    }

    public synchronized void displayImage()
    {


       bufferedImage = converter.convert(frame);
       frame = null;
       if (bufferedImage == null) return;
       mView.setImage(SwingFXUtils.toFXImage(bufferedImage, null));
       //System.out.println("display");
    }

    Here you can see I draw the texture with the image and send it to the H.264 encoder:

     @Override
     public void onTextureFrameCaptured(int width, int height, int texId, float[] tranformMatrix, int rotation, long timestamp)
     {
         //Log.d(TAG, "onTextureFrameCaptured: ->");

         VideoRenderer.I420Frame frame = new VideoRenderer.I420Frame(width, height, rotation, texId, tranformMatrix, 0, timestamp);
         avccEncoder.renderFrame(frame);
         videoView.renderFrame(frame);
         surfaceTextureHelper.returnTextureFrame();
     }

    Here you can see the WebRTC encoding happen:

    @Override
       public void renderFrame(VideoRenderer.I420Frame i420Frame) {
           start = System.nanoTime();
           bufferque++;

           mediaCodecHandler.post(new Runnable() {
               @Override
               public void run() {
                   videoEncoder.encodeTexture(false, i420Frame.textureId, i420Frame.samplingMatrix, TimeUnit.NANOSECONDS.toMicros(i420Frame.timestamp));
               }
           });


       }

       /**
        * Called to retrieve an encoded frame
        */
       @Override
       public void onEncodedFrame(MediaCodecVideoEncoder.OutputBufferInfo frame, MediaCodec.BufferInfo bufferInfo) {

           b = new byte[frame.buffer().remaining()];
           frame.buffer().get(b);
           synchronized (lock)
           {
               encodedBuffer.add(b);
               lock.notifyAll();
               if(encodedBuffer.size() > 1)
               {
                   Log.e(TAG, "drainEncoder: too big: " + encodedBuffer.size(),null );

               }
           }
           duration = System.nanoTime() - start;
           bufferque--;
           calcAverage();
           if (bufferque > 0)
           {
               Log.d(TAG, "onEncodedFrame: bufferque size: " + bufferque);
           }
       }