
Other articles (60)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administer" section of the site.
    From there, in the navigation menu, you can access a "Language management" section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether of type: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media", namely: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a so-called "media" article;

On other sites (7601)

  • ffmpeg problems with streaming mp4 over udp in local network

    28 November 2019, by AJ Cole

    I'm streaming mp4 video files (some of them were avi files converted to mp4 with ffmpeg earlier) over udp://232.255.23.23:1234 from Linux (embedded) with ffmpeg v3.4.2 to multiple Linux (antiX) machines that play the stream with mpv. All of this happens on a local network, so I expected it to work flawlessly, but unfortunately it doesn't.

    Here are the original commands I tried to use:

    ffmpeg

    ffmpeg -re -i PATH_TO_FILE.mp4 -c copy -f mpegts udp://232.255.23.23:1234

    mpv

    mpv --no-config --geometry=[geo settings] --no-border udp://232.255.23.23:1234

    This seemed to work well, but a problem appeared: on the displaying end, the stream is actually much longer than the streamed content itself. The mp4 files total 5 minutes 36 seconds, yet mpv plays the entire stream loop in >= 6 minutes. I think this happens because of dropped frames that mpv waits to recover, which extends the playback beyond the length of the actual content. This cannot work in my case, as I have a precise time slot for displaying the stream and it cannot be longer than the streamed content.
    All the content is made in 1680x800 resolution and is displayed on a screen with 1680x1050 resolution (positioned with mpv geometry).

    It appears that using this command for mpv:

    mpv --no-config --framedrop=no --geometry=[geo settings] --no-border udp://232.255.23.23:1234

    made the duration correct; however, it sometimes introduces huge artifacts in the videos.

    I read that using -re for streaming can cause these frame drops, so I tried setting a fixed framerate for both the file input and the output stream, for example:

    ffmpeg -re -i PATH_TO_FILE.mp4 -c copy -r 25 -f mpegts udp://232.255.23.23:1234

    This reads the file at its native framerate and outputs the stream at 25 fps. It appears to get the duration right, but it also causes occasional artifacts, and I think the overall quality is worse. Output from mpv when one of the artifacts happened:

    [ffmpeg/video] h264: cabac decode of qscale diff failed at 85 19
    [ffmpeg/video] h264: error while decoding MB 85 19, bytestream 85515

    I also tried using --untimed or --no-cache in mpv, but these cause stutters in the video.

    I'm also getting frequent "Invalid video timestamp" warnings in mpv, for example: Invalid video timestamp: 1.208333 -> -8.711667

    Playing in mpv without --no-config and with --untimed added also causes frequent artifacts:

    V: -00:00:00 / 00:00:00 Cache:  0s+266KB
    [ffmpeg/video] h264: Invalid NAL unit 8, skipping.
    V: -00:00:00 / 00:00:00 Cache:  0s+274KB
    [ffmpeg/video] h264: Reference 4 >= 4
    [ffmpeg/video] h264: error while decoding MB 6 0, bytestream 31474
    [ffmpeg/video] h264: error while decoding MB 78 49, bytestream -12
    V: 00:00:06 / 00:00:00 Cache:  5s+11KB
    Invalid video timestamp: 6.288333 -> -8.724933
    V: -00:00:05 / 00:00:00 Cache:  3s+0KB
    [ffmpeg/video] h264: Invalid NAL unit 8, skipping.
    [ffmpeg/video] h264: error while decoding MB 59 24, bytestream -27
    V: -00:00:04 / 00:00:00 Cache:  3s+0KB
    [ffmpeg/video] h264: Reference 4 >= 3
    [ffmpeg/video] h264: error while decoding MB 5 2, bytestream 13402
    V: -00:00:03 / 00:00:00 Cache:  2s+0KB
    [ffmpeg/video] h264: Reference 5 >= 4
    [ffmpeg/video] h264: error while decoding MB 51 21, bytestream 9415

    I tried playing the stream with ffplay and it also caused the videos to be "played" 20 seconds longer.
    Is there any way to keep the streaming duration intact and prevent those huge artifacts? These aren't huge video files; they are a few MB each, and everything happens on the local network, so latencies are minimal.

    Output from ffmpeg when streaming one of the files:

    libavutil      55. 78.100 / 55. 78.100
     libavcodec     57.107.100 / 57.107.100
     libavformat    57. 83.100 / 57. 83.100
     libavdevice    57. 10.100 / 57. 10.100
     libavfilter     6.107.100 /  6.107.100
     libswscale      4.  8.100 /  4.  8.100
     libswresample   2.  9.100 /  2.  9.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'SDM.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.48.100
     Duration: 00:00:20.00, start: 0.000000, bitrate: 1883 kb/s
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1680x800 [SAR 1:1 DAR 21:10], 1880 kb/s, 24 fps, 24 tbr, 12288 tbn, 48 tbc (default)
       Metadata:
         handler_name    : VideoHandler
    Output #0, mpegts, to 'udp://232.255.23.23:1234':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.83.100
       Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1680x800 [SAR 1:1 DAR 21:10], q=2-31, 1880 kb/s, 24 fps, 24 tbr, 90k tbn, 25 tbc (default)
       Metadata:
         handler_name    : VideoHandler
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
    Press [q] to stop, [?] for help
    frame=  480 fps= 24 q=-1.0 Lsize=    5009kB time=00:00:19.87 bitrate=2064.7kbits/s speed=   1x
    video:4592kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 9.082929%

    Edit: none of the files contain any audio, so there should be even less traffic on the network.
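
    One mitigation sometimes suggested for mpegts-over-UDP setups like the one above is to align each UDP datagram to whole 188-byte TS packets on the sender and to enlarge the receiver's FIFO. This is a hedged sketch, not from the post: pkt_size, fifo_size and overrun_nonfatal are options of FFmpeg's udp protocol (which mpv also uses for udp:// URLs), but the values below are guesses that would need tuning.

```shell
# Sender: 1316 = 7 * 188 bytes, i.e. whole TS packets per UDP datagram
ffmpeg -re -i PATH_TO_FILE.mp4 -c copy -f mpegts \
  "udp://232.255.23.23:1234?pkt_size=1316"

# Receiver: bigger circular buffer (counted in 188-byte packets),
# and don't abort the stream when it overruns
mpv --no-config --no-border \
  "udp://232.255.23.23:1234?fifo_size=50000&overrun_nonfatal=1"
```

    Whether this removes the artifacts depends on where the packet loss happens; it mainly protects against receiver-side buffer overruns, not network drops.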

  • FFmpeg : unspecified pixel format when opening video with custom context

    14 February 2021, by Pedro

    I am trying to decode a video with a custom context. The purpose is that I want to decode the video directly from memory. In the following code, I am reading from file in the read function passed to avio_alloc_context - but this is just for testing purposes.

    



    I think I've read every post on Stack Overflow or any other website related to this topic. At least I definitely tried my best to do so. While there is much in common, the details differ: people set different flags, some say av_probe_input_format is required, some say it isn't, etc. And for some reason nothing works for me.

    



    My problem is that the pixel format is unspecified (see output below), which is why I run into problems later when calling sws_getContext. I checked pFormatContext->streams[videoStreamIndex]->codec->pix_fmt, and it is -1.

    



    Please note my comments // things I tried and // seems not to help in the code. I think the answer might be hidden somewhere there. I have tried many combinations of the hints I've read so far, but I guess I am missing a detail.

    



    The problem is not the video file, because when I go the standard way and just call avformat_open_input(&pFormatContext, pFilePath, NULL, NULL) without a custom context, everything runs fine.

    



    The code compiles and runs as is.

    



    #include <libavformat/avformat.h>
    #include <stdio.h>
    #include <string.h>

    FILE *f;

    static int read(void *opaque, uint8_t *buf, int buf_size) {
        if (feof(f)) return -1;
        return fread(buf, 1, buf_size, f);
    }

    int openVideo(const char *pFilePath) {
        const int bufferSize = 32768;
        int ret;

        av_register_all();

        f = fopen(pFilePath, "rb");
        uint8_t *pBuffer = (uint8_t *) av_malloc(bufferSize + AVPROBE_PADDING_SIZE);
        AVIOContext *pAVIOContext = avio_alloc_context(pBuffer, bufferSize, 0, NULL,
                          &read, NULL, NULL);

        if (!f || !pBuffer || !pAVIOContext) {
            printf("error: open / alloc failed\n");
            // cleanup...
            return 1;
        }

        AVFormatContext *pFormatContext = avformat_alloc_context();
        pFormatContext->pb = pAVIOContext;

        const int readBytes = read(NULL, pBuffer, bufferSize);

        printf("readBytes = %i\n", readBytes);

        if (readBytes <= 0) {
            printf("error: read failed\n");
            // cleanup...
            return 2;
        }

        if (fseek(f, 0, SEEK_SET) != 0) {
            printf("error: fseek failed\n");
            // cleanup...
            return 3;
        }

        // required for av_probe_input_format
        memset(pBuffer + readBytes, 0, AVPROBE_PADDING_SIZE);

        AVProbeData probeData;
        probeData.buf = pBuffer;
        probeData.buf_size = readBytes;
        probeData.filename = "";
        probeData.mime_type = NULL;

        pFormatContext->iformat = av_probe_input_format(&probeData, 1);

        // things I tried:
        //pFormatContext->flags = AVFMT_FLAG_CUSTOM_IO;
        //pFormatContext->iformat->flags |= AVFMT_NOFILE;
        //pFormatContext->iformat->read_header = NULL;

        // seems not to help (therefore commented out here):
        AVDictionary *pDictionary = NULL;
        //av_dict_set(&pDictionary, "analyzeduration", "8000000", 0);
        //av_dict_set(&pDictionary, "probesize", "8000000", 0);

        if ((ret = avformat_open_input(&pFormatContext, "", NULL, &pDictionary)) < 0) {
            char buffer[4096];
            av_strerror(ret, buffer, sizeof(buffer));
            printf("error: avformat_open_input failed: %s\n", buffer);
            // cleanup...
            return 4;
        }

        printf("retrieving stream information...\n");

        if ((ret = avformat_find_stream_info(pFormatContext, NULL)) < 0) {
            char buffer[4096];
            av_strerror(ret, buffer, sizeof(buffer));
            printf("error: avformat_find_stream_info failed: %s\n", buffer);
            // cleanup...
            return 5;
        }

        printf("nb_streams = %i\n", pFormatContext->nb_streams);

        // further code...

        // cleanup...
        return 0;
    }

    int main() {
        openVideo("video.mp4");
        return 0;
    }

    This is the output that I get:

    readBytes = 32768
    retrieving stream information...
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0xdf8d20] stream 0, offset 0x30: partial file
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0xdf8d20] Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none, 640x360, 351 kb/s): unspecified pixel format
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    nb_streams = 2

    UPDATE:
    Thanks to WLGfx, here is the solution: the only thing that was missing was the seek function. Apparently, implementing it is mandatory for decoding. It is important to return the new offset on success, and not 0 (some solutions found on the web just return the return value of fseek, and that is wrong). Here is the minimal solution that made it work:

    static int64_t seek(void *opaque, int64_t offset, int whence) {
        if (whence == SEEK_SET && fseek(f, offset, SEEK_SET) == 0) {
            return offset;
        }
        // handling AVSEEK_SIZE doesn't seem mandatory
        return -1;
    }

    Of course, the call to avio_alloc_context needs to be adapted accordingly:

    AVIOContext *pAVIOContext = avio_alloc_context(pBuffer, bufferSize, 0, NULL,
                      &read, NULL, &seek);
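
    Since the stated goal is decoding directly from memory, the read/seek contract above can be illustrated with a self-contained sketch that serves the data from an in-memory buffer instead of a FILE. This is an assumption-laden illustration, not from the post: it deliberately avoids the FFmpeg headers, so MemoryBuffer, mem_read and mem_seek are made-up names, and AVSEEK_SIZE is hard-coded to the value FFmpeg uses (0x10000). The key point matches the update above: the seek callback returns the new position, not 0.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Mirrors FFmpeg's AVSEEK_SIZE (assumed value, taken from avio.h). */
#define AVSEEK_SIZE 0x10000

/* Hypothetical opaque state passed via avio_alloc_context's 4th argument. */
typedef struct {
    const uint8_t *data;
    int64_t size;
    int64_t pos;
} MemoryBuffer;

/* Same shape as the read_packet callback of avio_alloc_context:
 * copy up to buf_size bytes, advance the cursor, return bytes read,
 * or a negative value at end of data (FFmpeg expects AVERROR_EOF). */
static int mem_read(void *opaque, uint8_t *buf, int buf_size) {
    MemoryBuffer *mb = (MemoryBuffer *) opaque;
    int64_t remaining = mb->size - mb->pos;
    if (remaining <= 0)
        return -1;
    if (buf_size > remaining)
        buf_size = (int) remaining;
    memcpy(buf, mb->data + mb->pos, buf_size);
    mb->pos += buf_size;
    return buf_size;
}

/* Same shape as the seek callback: on success return the NEW offset. */
static int64_t mem_seek(void *opaque, int64_t offset, int whence) {
    MemoryBuffer *mb = (MemoryBuffer *) opaque;
    int64_t target;
    switch (whence) {
    case SEEK_SET:   target = offset;            break;
    case SEEK_CUR:   target = mb->pos + offset;  break;
    case SEEK_END:   target = mb->size + offset; break;
    case AVSEEK_SIZE: return mb->size; /* optional: report total size */
    default:         return -1;
    }
    if (target < 0 || target > mb->size)
        return -1;
    mb->pos = target;
    return target;
}
```

    Under those assumptions, the pair would be wired up as avio_alloc_context(pBuffer, bufferSize, 0, &mb, &mem_read, NULL, &mem_seek).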

  • Rust Win32 FFI : User-mode data execution prevention (DEP) violation

    28 April 2022, by TheElix

    I'm trying to pass an ID3D11Device instance from Rust to a C FFI library (FFmpeg).

    I made this sample code:

    pub fn create_d3d11_device(&mut self, device: &mut Box<ID3D11Device>, context: &mut Box<ID3D11DeviceContext>) {
        let av_device: Box<AVBufferRef> = self.alloc(HwDeviceType::D3d11va);
        unsafe {
            let device_context = Box::from_raw(av_device.data as *mut AVHWDeviceContext);
            let mut d3d11_device_context = Box::from_raw(device_context.hwctx as *mut AVD3D11VADeviceContext);
            d3d11_device_context.device = device.as_mut() as *mut _;
            d3d11_device_context.device_context = context.as_mut() as *mut _;
            let avp = Box::into_raw(av_device);
            av_hwdevice_ctx_init(avp);
            self.av_hwdevice = Some(Box::from_raw(avp));
        }
    }

    On the Rust side the device works, but on the C side, when FFmpeg calls ID3D11DeviceContext_QueryInterface, the app crashes with the following error: Exception 0xc0000005 encountered at address 0x7ff9fb99ad38: User-mode data execution prevention (DEP) violation at location 0x7ff9fb99ad38

    The address is actually the pointer for the lpVtbl of QueryInterface.

    The disassembly of the address also looks correct (this was done in another debugging session):

    (lldb) disassemble --start-address 0x00007ffffdf3ad38
    0x7ffffdf3ad38: addb   %ah, 0x7ffffd(%rdi,%riz,8)
    0x7ffffdf3ad3f: addb   %al, (%rax)
    0x7ffffdf3ad41: movabsl -0x591fffff80000219, %eax
    0x7ffffdf3ad4a: outl   %eax, $0xfd

    Do you have any pointers for debugging this further?

    EDIT: I made a minimal reproduction sample. Interestingly, this does not cause a DEP violation, but simply a segfault.

    On the C side:

    int test_ffi(ID3D11Device *device){
        ID3D11DeviceContext *context;
        device->lpVtbl->GetImmediateContext(device, &context);
        if (!context) return 1;
        return 0;
    }

    On the Rust side:

    unsafe fn main_rust(){
        let mut device = None;
        let mut device_context = None;
        let _ = match windows::Win32::Graphics::Direct3D11::D3D11CreateDevice(None, D3D_DRIVER_TYPE_HARDWARE, OtherHinstance::default(), D3D11_CREATE_DEVICE_DEBUG, &[], D3D11_SDK_VERSION, &mut device, std::ptr::null_mut(), &mut device_context) {
            Ok(e) => e,
            Err(e) => panic!("Creation Failed: {}", e)
        };
        let mut device = match device {
            Some(e) => e,
            None => panic!("Creation Failed2")
        };
        let mut f2: ID3D11Device = transmute_copy(&device); // Transmuting the WinAPI type into a bindgen ID3D11Device
        test_ffi(&mut f2);
    }

    The bindgen build.rs:

    extern crate bindgen;

    use std::env;
    use std::path::PathBuf;

    fn main() {
        // Tell cargo to tell rustc to link the system bzip2
        // shared library.
        println!("cargo:rustc-link-lib=ffi_demoLIB");
        println!("cargo:rustc-link-lib=d3d11");

        // Tell cargo to invalidate the built crate whenever the wrapper changes
        println!("cargo:rerun-if-changed=library.h");

        // The bindgen::Builder is the main entry point
        // to bindgen, and lets you build up options for
        // the resulting bindings.
        let bindings = bindgen::Builder::default()
            // The input header we would like to generate
            // bindings for.
            .header("library.h")
            // Tell cargo to invalidate the built crate whenever any of the
            // included header files changed.
            .parse_callbacks(Box::new(bindgen::CargoCallbacks))
            .blacklist_type("_IMAGE_TLS_DIRECTORY64")
            .blacklist_type("IMAGE_TLS_DIRECTORY64")
            .blacklist_type("PIMAGE_TLS_DIRECTORY64")
            .blacklist_type("IMAGE_TLS_DIRECTORY")
            .blacklist_type("PIMAGE_TLS_DIRECTORY")
            // Finish the builder and generate the bindings.
            .generate()
            // Unwrap the Result and panic on failure.
            .expect("Unable to generate bindings");

        // Write the bindings to the $OUT_DIR/bindings.rs file.
        let out_path = PathBuf::from(env::var("OUT_DIR").unwrap());
        bindings
            .write_to_file(out_path.join("bindings.rs"))
            .expect("Couldn't write bindings!");
    }

    The complete repo can be found here: https://github.com/TheElixZammuto/demo-ffi