

Other articles (105)

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011

    The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to modify certain user-related behaviours (see its documentation for more information).
    It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First of all, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are carried out in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
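
    As an illustration of those two extra actions (a sketch only, not SPIPMotion's actual implementation; the file names are hypothetical), both can be driven from the ffprobe and ffmpeg command-line tools:

    import java.io.IOException;

    // Illustration only: the two actions described above, done with the
    // ffprobe/ffmpeg command-line tools.
    class SourceIntake {
        static void probeAndThumbnail(String source) throws IOException, InterruptedException {
            // 1. Retrieve the technical information of the file's audio and
            //    video streams, printed as JSON.
            new ProcessBuilder("ffprobe", "-v", "error", "-show_streams",
                    "-of", "json", source).inheritIO().start().waitFor();
            // 2. Generate a thumbnail: extract a single frame as an image.
            new ProcessBuilder("ffmpeg", "-y", "-i", source, "-ss", "00:00:01",
                    "-frames:v", "1", "thumbnail.png").inheritIO().start().waitFor();
        }
    }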

On other websites (10122)

  • FFMPEG audio not delaying properly

    1 April 2022, by Spartan 117

    I am trying to mix two audio files together, with one audio being delayed. Here are the file details:

    Audio 1 - https://pastebin.com/vkYsH88e, "startTime": "03/23/2022 21:20:27", "endTime": "03/24/2022 01:06:11"

    Audio 2 - https://pastebin.com/Gs4V96GQ, "startTime": "03/24/2022 01:05:30", "endTime": "03/24/2022 04:21:41"

    As you can see from the pastebins and the JSON properties, audio 2 starts about 3 hours and 45 minutes after audio 1. Here is how I am trying to mix them:

    ffmpeg -i RTb295d0534191e1acb22a45bb971a12e6.mka -i RT103bfe5f4b129860f69cd8e820f3a10b.mka -filter_complex "[1:a]adelay=13500s:all=1[apad]; [0:a][apad]amix=inputs=2:weights=1|1[aout]" -map [aout] combined_audio.mka

    Here is the output of that mixture - https://pastebin.com/KAsw0905

    The files have been uploaded here, if anyone wants to test - https://easyupload.io/m/xveng4

    The issue I'm having: based on the parameters provided, the second audio file should start playing after 3 hours and 45 minutes. Instead, it plays after 5 hours and 45 minutes. I'm new to FFmpeg and have been having trouble figuring this out. Any ideas?
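
    A quick sanity check on the adelay figure: computed from the two "startTime" values in the JSON metadata, the gap is 13503 seconds (3 h 45 min 3 s), so the 13500s above is the right order of magnitude. A minimal sketch of that computation (the class name is arbitrary):

    import java.time.Duration;
    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;

    // Sketch: compute the delay implied by the two "startTime" values above.
    class DelayCheck {
        public static void main(String[] args) {
            DateTimeFormatter fmt = DateTimeFormatter.ofPattern("MM/dd/yyyy HH:mm:ss");
            LocalDateTime start1 = LocalDateTime.parse("03/23/2022 21:20:27", fmt);
            LocalDateTime start2 = LocalDateTime.parse("03/24/2022 01:05:30", fmt);
            long seconds = Duration.between(start1, start2).getSeconds();
            System.out.println(seconds); // 13503, i.e. 3 h 45 min 3 s
        }
    }

    One detail worth confirming: adelay reads a bare number as milliseconds, and the trailing 's' (seconds) is a newer suffix, so it is worth checking that the installed ffmpeg build honours it.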

  • How to send an ffmpeg AVPacket through WebRTC (using libdatachannel)

    29 March 2022, by mike

    I'm encoding a video frame with the ffmpeg libraries, generating an AVPacket with compressed data.

    Thanks to some recent advice here on S/O, I am trying to send that frame over a network using the WebRTC library libdatachannel, specifically by adapting the example here:

    https://github.com/paullouisageneau/libdatachannel/tree/master/examples/streamer

    I am seeing problems inside h264rtppacketizer.cpp (part of the library, not the example) which are almost certainly to do with how I'm providing the sample data. (I don't think this is anything to do with libdatachannel specifically; it will be an issue with what I'm sending.)

    The example code reads each encoded frame from a file and populates a sample by setting the sample's content to the contents of the file:

    sample = *reinterpret_cast<std::vector<byte> *>(&fileContents);

    sample is just a std::vector<byte>;

    I have naively copied the contents of an AVPacket->data pointer into the sample vector:

    sample.resize(pkt->size);
    memcpy(sample.data(), pkt->data, pkt->size * sizeof(std::byte));

    but the packetizer is falling over when trying to get length values out of that data. Specifically, in the following code, the first iteration gets a length of 1, but the second, looking up index 5, gives 1119887324. This is way too big for my data, which is only 3526 bytes (the whole frame is a single colour, so it should be small once encoded):

    while (index < message->size()) {
        assert(index + 4 < message->size());
        auto lengthPtr = (uint32_t *)(message->data() + index);
        uint32_t length = ntohl(*lengthPtr);
        auto naluStartIndex = index + 4;
        auto naluEndIndex = naluStartIndex + length;
        assert(naluEndIndex <= message->size());

        auto begin = message->begin() + naluStartIndex;
        auto end = message->begin() + naluEndIndex;
        nalus->push_back(std::make_shared<NalUnit>(begin, end));
        index = naluEndIndex;
    }

    Here is a dump of

    uint32_t length = ntohl(*lengthPtr);

    for the first few elements of the message (*lengthPtr in parentheses):

    [2022-03-29 15:12:01.182] [info] index 0: 1  (16777216)
    [2022-03-29 15:12:01.183] [info] index 1: 359  (1728118784)
    [2022-03-29 15:12:01.184] [info] index 2: 91970  (1114046720)
    [2022-03-29 15:12:01.186] [info] index 3: 23544512  (3225577217)
    [2022-03-29 15:12:01.186] [info] index 4: 1732427807  (532693607)
    [2022-03-29 15:12:01.187] [info] index 5: 1119887324  (3693068354)
    [2022-03-29 15:12:01.188] [info] index 6: 3223313413  (98312128)
    [2022-03-29 15:12:01.188] [info] index 7: 534512896  (384031)
    [2022-03-29 15:12:01.188] [info] index 8: 3691315291  (1526728156)
    [2022-03-29 15:12:01.189] [info] index 9: 83909537  (2707095557)
    [2022-03-29 15:12:01.189] [info] index 10: 6004992  (10574592)
    [2022-03-29 15:12:01.190] [info] index 11: 1537277952  (41307)
    [2022-03-29 15:12:01.190] [info] index 12: 2701131779  (50331809)
    [2022-03-29 15:12:01.192] [info] index 13: 768  (196608)

    (I know I should post a complete sample, I am working on it)

    • I am fairly sure I am just missing something basic, e.g. am I supposed to do something with the AVPacket side_data, or does AVPacket have or miss some header info?

    • If I just fwrite the pkt->data for a single frame to disk, I can read the codec information with ffprobe:

      Input #0, h264, from 'encodedOut.h264':
        Duration: N/A, bitrate: N/A
        Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1280x720, 30 tbr, 1200k tbn

    • whereas the same for the example input files (again a single frame) gives the following:

      [h264 @ 000001c88d1135c0] Format h264 detected only with low score of 1, misdetection possible!
      [h264 @ 000001c88f337400] missing picture in access unit with size 85306
      [extract_extradata @ 000001c88d11ee40] No start code is found.
      sample-0.h264: Invalid data found when processing input
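
    One observation on the two ffprobe outputs above: the raw dump of pkt->data parses as plain Annex B H.264 (it has start codes), while the example's sample files do not ("No start code is found"), i.e. they appear to be length-prefixed. The packetizer loop shown earlier reads a 4-byte big-endian length with ntohl(), and index 0 decoding to 1 is exactly what the Annex B start code 00 00 00 01 would yield. If that reading is right, the start codes need rewriting as length prefixes before the data is handed to the packetizer. A sketch of that conversion (shown in Java rather than the question's C++, for uniformity with the other examples on this page; the class name is arbitrary):

    import java.io.ByteArrayOutputStream;
    import java.util.ArrayList;
    import java.util.List;

    // Sketch only: rewrite an Annex B buffer (00 00 01 / 00 00 00 01 start
    // codes) as 4-byte big-endian length-prefixed NAL units, the layout the
    // packetizer loop above parses with ntohl().
    class AnnexB {
        static byte[] toLengthPrefixed(byte[] in) {
            List<int[]> nals = new ArrayList<>(); // {payload start, start-code position}
            int i = 0;
            while (i + 2 < in.length) {
                if (in[i] == 0 && in[i + 1] == 0 && in[i + 2] == 1) {
                    nals.add(new int[] { i + 3, i }); // 3-byte start code
                    i += 3;
                } else if (i + 3 < in.length && in[i] == 0 && in[i + 1] == 0
                        && in[i + 2] == 0 && in[i + 3] == 1) {
                    nals.add(new int[] { i + 4, i }); // 4-byte start code
                    i += 4;
                } else {
                    i++;
                }
            }
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            for (int n = 0; n < nals.size(); n++) {
                int begin = nals.get(n)[0];
                // A NAL unit ends where the next start code begins.
                int end = (n + 1 < nals.size()) ? nals.get(n + 1)[1] : in.length;
                int len = end - begin;
                out.write((len >>> 24) & 0xFF); // big-endian length prefix,
                out.write((len >>> 16) & 0xFF); // as read back by ntohl()
                out.write((len >>> 8) & 0xFF);
                out.write(len & 0xFF);
                out.write(in, begin, len);
            }
            return out.toByteArray();
        }
    }

    Whether pkt->data is Annex B in the first place depends on the encoder; the libx264 wrapper in libavcodec typically emits Annex B, which would match the first ffprobe output above.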

  • Stream image from Android with FFMPEG

    9 February 2023, by xnok

    I'm currently receiving images from an external source as a byte array, and I would like to send them as raw video via ffmpeg to a stream URL, where I have an RTSP server that receives RTSP streams (a similar unanswered question). However, I haven't worked with FFmpeg in Java, so I can't find an example of how to do it. I have a callback that copies the image bytes to a byte array as follows:

    public class MainActivity extends Activity {
        final String rtmp_url = "rtmp://192.168.0.12:1935/live/test";
        private int PREVIEW_WIDTH = 384;
        private int PREVIEW_HEIGHT = 292;
        private String TAG = "MainActivity";
        String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
        final String command[] = {ffmpeg,
                "-y",  // Add "-re" for simulated realtime streaming.
                "-f", "rawvideo",
                "-vcodec", "rawvideo",
                "-pix_fmt", "bgr24",
                "-s", (Integer.toString(PREVIEW_WIDTH) + "x" + Integer.toString(PREVIEW_HEIGHT)),
                "-r", "10",
                "-i", "pipe:",
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",
                "-preset", "ultrafast",
                "-f", "flv",
                rtmp_url};

        private UVCCamera mUVCCamera;

        public void handleStartPreview(Object surface) throws InterruptedException, IOException {
            Log.e(TAG, "handleStartPreview:mUVCCamera" + mUVCCamera + " mIsPreviewing:");
            if ((mUVCCamera == null)) return;
            Log.e(TAG, "handleStartPreview2 ");
            try {
                mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, 0, UVCCamera.DEFAULT_BANDWIDTH, 0);
                Log.e(TAG, "handleStartPreview3 mWidth: " + mWidth + "mHeight:" + mHeight);
            } catch (IllegalArgumentException e) {
                try {
                    // fallback to YUV mode
                    mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, UVCCamera.DEFAULT_PREVIEW_MODE, UVCCamera.DEFAULT_BANDWIDTH, 0);
                    Log.e(TAG, "handleStartPreview4");
                } catch (IllegalArgumentException e1) {
                    callOnError(e1);
                    return;
                }
            }
            Log.e(TAG, "handleStartPreview: startPreview1");
            int result = mUVCCamera.startPreview();
            mUVCCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_RGBX);
            mUVCCamera.startCapture();
            Toast.makeText(MainActivity.this, "Camera Started", Toast.LENGTH_SHORT).show();
            ProcessBuilder pb = new ProcessBuilder(command);
            pb.redirectErrorStream(true);
            Process process = pb.start();
            BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
            OutputStream writer = process.getOutputStream();
            byte img[] = new byte[192 * 108 * 3];
            for (int i = 0; i < 10; i++) {
                for (int y = 0; y < 108; y++) {
                    for (int x = 0; x < 192; x++) {
                        byte r = (byte) ((x * y + i) % 255);
                        byte g = (byte) ((x * y + i * 10) % 255);
                        byte b = (byte) ((x * y + i * 20) % 255);
                        img[(y * 192 + x) * 3] = b;
                        img[(y * 192 + x) * 3 + 1] = g;
                        img[(y * 192 + x) * 3 + 2] = r;
                    }
                }

                writer.write(img);
            }

            writer.close();
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }

            process.waitFor();
        }

        public static void buildRawFrame(Mat img, int i) {
            int p = img.cols() / 60;
            img.setTo(new Scalar(60, 60, 60));
            String text = Integer.toString(i + 1);
            int font = Imgproc.FONT_HERSHEY_SIMPLEX;
            Point pos = new Point(img.cols() / 2 - p * 10 * (text.length()), img.rows() / 2 + p * 10);
            Imgproc.putText(img, text, pos, font, p, new Scalar(255, 30, 30), p * 2); // Blue number
        }
    }

    Additionally, Android Camera Capture using FFmpeg uses ffmpeg to capture frames one by one from the native Android camera, and instead of pushing them via RTMP, it generates a video file as output; however, it does not explain how the images were passed to ffmpeg.

    frameData is my byte array, and I'd like to know how I can write the necessary ffmpeg commands using ProcessBuilder to send an image via RTSP using ffmpeg for a given URL.

    An example of what I am trying to do: in Python 3 I could do it easily like this:

    import cv2
    import numpy as np
    import socket
    import sys
    import pickle
    import struct
    import subprocess

    fps = 25
    width = 224
    height = 224
    rtmp_url = 'rtmp://192.168.0.13:1935/live/test'

    command = ['ffmpeg',
               '-y',
               '-f', 'rawvideo',
               '-vcodec', 'rawvideo',
               '-pix_fmt', 'bgr24',
               '-s', "{}x{}".format(width, height),
               '-r', str(fps),
               '-i', '-',
               '-c:v', 'libx264',
               '-pix_fmt', 'yuv420p',
               '-preset', 'ultrafast',
               '-f', 'flv',
               rtmp_url]

    p = subprocess.Popen(command, stdin=subprocess.PIPE)

    while True:
        frame = np.random.randint([255], size=(224, 224, 3))
        frame = frame.astype(np.uint8)
        p.stdin.write(frame.tobytes())

    I would like to do the same thing in Android.

    Update: I can reproduce @Rotem's answer on NetBeans; on Android, however, I am getting a NullPointerException when trying to execute pb.start():

        Process: com.infiRay.XthermMini, PID: 32089
        java.lang.NullPointerException
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
            at com.infiRay.XthermMini.MainActivity.handleStartPreview(MainActivity.java:512)
            at com.infiRay.XthermMini.MainActivity.startPreview(MainActivity.java:563)
            at com.infiRay.XthermMini.MainActivity.access$1000(MainActivity.java:49)
            at com.infiRay.XthermMini.MainActivity$3.onConnect(MainActivity.java:316)
            at com.serenegiant.usb.USBMonitor$3.run(USBMonitor.java:620)
            at android.os.Handler.handleCallback(Handler.java:938)
            at android.os.Handler.dispatchMessage(Handler.java:99)
            at android.os.Looper.loopOnce(Looper.java:226)
            at android.os.Looper.loop(Looper.java:313)
            at android.os.HandlerThread.run(HandlerThread.java:67)
        2022-06-02 11:47:20.300 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
        2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
        2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
        2022-06-02 11:47:20.308 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
        2022-06-02 11:47:20.312 32089-32089/com.infiRay.XthermMini E/MainActivity: onPause:
        2022-06-02 11:47:20.314 32089-32581/com.infiRay.XthermMini I/Process: Sending signal. PID: 32089 SIG: 9

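
    A note on the stack trace above: ProcessBuilder.start() is documented to throw a NullPointerException if any element of the command list is null. Since command[0] comes from Loader.load(...), a load failure on the device (e.g. no ffmpeg executable packaged for its ABI) may leave it null and produce exactly this trace. A minimal guard to confirm (sketch only, reusing the names from the code above):

    // Sketch: verify the resolved ffmpeg path before spawning the process.
    String ffmpegPath = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    Log.d(TAG, "resolved ffmpeg path: " + ffmpegPath);
    if (ffmpegPath == null || ffmpegPath.isEmpty()) {
        Log.e(TAG, "no ffmpeg executable available for this device/ABI");
        return;
    }
    ProcessBuilder pb = new ProcessBuilder(command);
    pb.redirectErrorStream(true);
    Process process = pb.start();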