On other sites (6899)

  • How to recover video from H264 frames and timestamps [closed]

    10 June 2024, by kokosda

    My service receives H264 frames and some related metadata, such as timestamps, from MS Teams.

    Observations:

    • Those frames are inter-frame compressed.
    • The resolution of those frames can change.
    • Timestamps look like 39264692280552704. Fed to the .NET constructor new DateTime(39264692280552704), that represents year 125, so I need to add 1899 years to get a real date.
    • I can wrap the sequence in a playable MP4 container with ffmpeg -i input.h264 -c copy output.mp4, but that is not what I want: the resulting video plays too fast, as if on fast forward. I would like those timestamps to be taken into account so that the real timeline is recovered.
    


    I merged all the H264 frames into one file, input.h264, and saved all the timestamps in another file, metadata.json. In metadata.json, each object describes a single frame from input.h264.

    


    My question is: how do I recover the source video from the frames and timestamps that I received from Teams, in particular using FFmpeg?
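
    One way to recover the real timeline without re-encoding is to convert the .NET-style ticks (100 ns units) into a "timestamp format v2" file and let mkvmerge (from MKVToolNix, rather than ffmpeg directly) apply it while muxing the raw H264 stream; the result can then be remuxed to MP4 with ffmpeg -i output.mkv -c copy output.mp4. Below is a minimal Java sketch of the conversion; the field name "Timestamp" and the flat layout of metadata.json are assumptions, not something confirmed by the post.

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Locale;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Sketch: write a "timestamp format v2" file (one millisecond value per frame,
    // in frame order), then mux with:
    //   mkvmerge -o output.mkv --timestamps 0:timestamps.txt input.h264
    public class TimestampsV2 {
        public static void main(String[] args) throws IOException {
            String json = new String(Files.readAllBytes(Paths.get("metadata.json")));

            // Naive extraction of the assumed "Timestamp" field, one value per frame.
            Matcher m = Pattern.compile("\"Timestamp\"\\s*:\\s*(\\d+)").matcher(json);

            long firstTicks = -1;
            try (PrintWriter out = new PrintWriter("timestamps.txt")) {
                out.println("# timestamp format v2");
                while (m.find()) {
                    long ticks = Long.parseLong(m.group(1));      // .NET ticks: 100 ns units
                    if (firstTicks < 0) firstTicks = ticks;
                    double ms = (ticks - firstTicks) / 10_000.0;  // rebase to zero; 10,000 ticks per ms
                    out.printf(Locale.ROOT, "%.3f%n", ms);
                }
            }
        }
    }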

    


  • Serve a single frame of a video to a user as an image (efficiently)

    23 July 2014, by JoeRocc

    Problem:

    I have multiple 60-minute videos on my server (shared hosting). I need to serve a single frame of any one of these videos to a user instantly and efficiently. The expected maximum load is about 100 requests per minute.

    Notes:

    • I am on a shared hosting service, so FFmpeg is not a possibility (is FFmpeg efficient enough for 100 requests per minute anyway?).
    • I don’t want to store hundreds of thousands of images on my server (my inodes are limited).

    Possible Solutions:

    • Load the video on the client side and then extract a frame and paint it to a canvas, as per this article. Though I don’t want the user to have to download the whole video just to get a frame (will they have to do this?).
    • Use some efficient server-side library to efficiently extract a frame and then serve it to the user (see the sketch after this list).
    • ???
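
    Where a host that can actually run ffmpeg is available, the second option above can be sketched with a plain process call and ffmpeg's fast input seeking (-ss placed before -i only decodes from the nearest keyframe). This is only an illustration; the binary location, seek time and paths are placeholders.

    import java.io.File;
    import java.io.IOException;

    // Hedged sketch of the server-side option: ask ffmpeg for exactly one frame
    // at a given time. -frames:v 1 stops after one decoded frame; -q:v 2 is a
    // high JPEG quality setting.
    public class SingleFrame {
        public static File grab(File video, String time, File outJpeg)
                throws IOException, InterruptedException {
            Process p = new ProcessBuilder(
                    "ffmpeg", "-y",
                    "-ss", time,                 // e.g. "00:37:12.500"
                    "-i", video.getPath(),
                    "-frames:v", "1",
                    "-q:v", "2",
                    outJpeg.getPath())
                    .inheritIO()
                    .start();
            if (p.waitFor() != 0) {
                throw new IOException("ffmpeg exited with code " + p.exitValue());
            }
            return outJpeg;
        }
    }
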
  • FFMPEG Command in Android Failing to Execute

    15 January 2015, by Zoe

    I’m trying to execute ffmpeg commands through an Android app I’m developing.

    I found this post, which has been somewhat useful:
    Problems with ffmpeg command line on android

    and I downloaded a static build of ffmpeg from here: http://ffmpeg.gusari.org/static/

    The problem is, when this code runs

    public void merge_video(){

        String[] ffmpegCommand = new String[5];
        ffmpegCommand[0] = "/data/data/com.example.zovideo/ffmpeg";
        ffmpegCommand[1] = "-i";
        ffmpegCommand[2] = "concat:storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4|storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4";
        ffmpegCommand[3] = "copy";
        ffmpegCommand[4] = "storage/emulated/0/DCIM/ZoVideo/Output.mp4";

        try {
            Process ffmpegProcess = new ProcessBuilder(ffmpegCommand).redirectErrorStream(true).start();

            String line;
            BufferedReader reader = new BufferedReader(new InputStreamReader(ffmpegProcess.getInputStream()));
            Log.d(null, "*******Starting FFMPEG");

            while ((line = reader.readLine()) != null) {
                Log.d(null, "***" + line + "***");
            }
            Log.d(null, "****ending FFMPEG****");

        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    It fails when trying to start the process with

    Java.io.IOException: Error running exec(). Command: [/data/data/com.example.zovideo/ffmpeg, -i, concat:storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4|storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4, copy, storage/emulated/0/DCIM/ZoVideo/Output.mp4] Working Directory: null Environment: [ANDROID_ROOT=/system, EMULATED_STORAGE_SOURCE=/mnt/shell/emulated, LOOP_MOUNTPOINT=/mnt/obb, LD_PRELOAD=libsigchain.so, ANDROID_BOOTLOGO=1, EMULATED_STORAGE_TARGET=/storage/emulated, EXTERNAL_STORAGE=/storage/emulated/legacy, SYSTEMSERVERCLASSPATH=/system/framework/services.jar:/system/framework/ethernet-service.jar:/system/framework/wifi-service.jar, ANDROID_SOCKET_zygote=10, PATH=/sbin:/vendor/bin:/system/sbin:/system/bin:/system/xbin, ANDROID_DATA=/data, ANDROID_ASSETS=/system/app, ASEC_MOUNTPOINT=/mnt/asec, BOOTCLASSPATH=/system/framework/core-libart.jar:/system/framework/conscrypt.jar:/system/framework/okhttp.jar:/system/framework/core-junit.jar:/system/framework/bouncycastle.jar:/system/framework/ext.jar:/system/framework/framework.jar:/system/framework/telephony-common.jar:/system/framework/voip-common.jar:/system/framework/ims-common.jar:/system/framework/mms-common.jar:/system/framework/android.policy.jar:/system/framework/apache-xml.jar, ANDROID_PROPERTY_WORKSPACE=9,0, ANDROID_STORAGE=/storage]

    I understand from the Stack Overflow post I mentioned above that the ffmpeg static build needs to be on my device, otherwise my app cannot use it.

    However, I’m unsure how to get it into the /data/data/com.example.zovideo folder as needed.

    What I have done is download the latest static ffmpeg build from http://ffmpeg.gusari.org/static/ and copy it into my libs/armeabi and libs/armeabi-v7a folders, but this obviously hasn’t succeeded in getting it into the data/data folder when my app is installed onto my device.
    (I feel like I’m being an idiot by copy/pasting the files, but I don’t know what else to do. I don’t know how to compile it myself - I have compiled ffmpeg using a Roman10 tutorial, but this produces .so files, which from what I understand is not what I need.)

    So I’m a little stuck.
    Any advice is greatly appreciated. Thanks
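
    For what it is worth, the usual way to get a bundled binary into private app storage is to ship it in the APK's assets/ folder, copy it out at runtime, and mark it executable before calling it. A rough sketch, with the asset name and paths as assumptions:

    import android.content.Context;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    // Sketch: copy assets/ffmpeg into the app's private directory
    // (e.g. /data/data/com.example.zovideo/files/ffmpeg) on first use and
    // set the executable bit so exec() can run it.
    public class FfmpegInstaller {
        public static File install(Context context) throws IOException {
            File ffmpeg = new File(context.getFilesDir(), "ffmpeg");
            if (ffmpeg.exists() && ffmpeg.canExecute()) {
                return ffmpeg;
            }
            InputStream in = context.getAssets().open("ffmpeg");
            OutputStream out = new FileOutputStream(ffmpeg);
            try {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            } finally {
                out.close();
                in.close();
            }
            // Without the executable bit, ProcessBuilder.start() fails with IOException.
            ffmpeg.setExecutable(true);
            return ffmpeg;
        }
    }

    With that in place, ffmpegCommand[0] can point at install(context).getPath() instead of a hard-coded /data/data path.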