
Media (91)

Other articles (39)

  • Customizing categories

    21 June 2013

    Category creation form
    For those who know SPIP well, a category can be thought of as a "rubrique" (section).
    For a document of the "category" type, the fields offered by default are: Text
    This form can be modified in the section:
    Administration > Configuration des masques de formulaire.
    For a document of the "media" type, the fields not displayed by default are: Descriptif rapide
    It is also in this configuration section that you can specify the (...)

  • Adding notes and captions to images

    7 February 2011

    To add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, modifying and deleting notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of the "image" type, a new button appears above the preview (...)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
    To do this, we use SPIP's translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

On other sites (7562)

  • How to compress output file using FFmpeg - Apple ProRes 422

    17 October 2018, by user1526912

    I am new to video encoding and am trying to encode a music video for the Apple iTunes video store.

    I am currently using FFmpeg for encoding.

    My source file is an mp4, with a file size of 650 MB.

    I encode the file using the Apple ProRes 422 (HQ) codec and output a mov file.

    ffmpeg -y -i busy1.mp4  -vcodec prores -profile:v 3 -r "29.97" -c:a mp2   busy2.mov

    I am trying to encode the video according to the following specs:

    ● Apple ProRes 422 (HQ)
    ● VBR expected at 220 Mbps

    Encoded        PASP   Converted to ProRes from
    1920 x 1080    1:1    HDCAM SR, D5, ATSC
    1280 x 720     1:1    ATSC progressive

    29.97 interlaced frames per second for video sourced

    Music Video Audio Source Profile

    ● MPEG-2 layer II stereo
    ● 384 kbps
    ● 48 kHz

    The file encodes perfectly fine; however, the output is 6 GB in size.

    Why would the file be so large after encoding?

    Am I doing something wrong here?
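
    (A rough back-of-the-envelope check, assuming purely for illustration a clip of about four minutes, since the actual duration is not stated above: ProRes 422 (HQ) targets roughly 220 Mbps, i.e. about 27.5 MB per second of video, so 240 s × 27.5 MB/s ≈ 6.6 GB. An output of around 6 GB is therefore in line with the codec's target bitrate rather than a sign of a broken command.)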

  • Node.js - Buffer Data to Ffmpeg

    24 September 2017, by user8568709

    I used Node.js and Ffmpeg to create animations. Because I was trying to avoid third-party avi/mp4 parsers, I decided to output the animation as a raw rgb24 data file and then use some program to convert it to an mp4 file.

    I found that Ffmpeg is free and open source and can do exactly that. So, I made a Node.js application which allocates a Buffer of size 1920 x 1080 x 3 (width times height times number of bytes per pixel), then I created a rendering context library, and finally I animated frame by frame and saved each frame consecutively in a binary file (using the fs module).

    Then I invoked Ffmpeg to convert it to an mp4 file, and it works very well. Animations are pretty easy to make and Ffmpeg does its job correctly.

    However, the only problem is that it is very slow and eats hard-disk space. I want to create very long animations (more than an hour). The final mp4 file is relatively small, but the raw video file is extremely big. About ninety percent of each frame is black pixels, so Ffmpeg compresses it very well, but the raw file cannot be compressed and sometimes takes more than 100 gigabytes. Also, the same data is needlessly processed twice: first I process it in Node.js to save the data to a file, and then Ffmpeg reads it back to convert it to mp4. There is a lot of unnecessary work.
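
    (To make the numbers concrete: each raw rgb24 frame is 1920 × 1080 × 3 ≈ 6.2 MB, so the intermediate file grows by roughly 6 MB per frame. At, say, 25 frames per second, which is only an assumption here, that is already over 150 MB per second of animation, and hundreds of gigabytes per hour.)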

    So, I'm looking for a way (and I'm pretty sure it is possible, but I haven't found it yet) to output raw video data, one frame at a time, directly to the Ffmpeg process, without saving anything to the hard disk.

    My goal is to do the following:

    1. Open Ffmpeg process
    2. Render a frame in Node.js
    3. Output raw byte stream to Ffmpeg
    4. Wait for Ffmpeg to encode it and append to mp4 file
    5. Let Ffmpeg wait for my Node.js process to render next frame

    Is there a way to achieve this? I really don't see a reason to post code, because my current code has nothing to do with the question I'm asking here. I'm not struggling with syntax errors or implementation problems; I just don't know which parameters to pass to the Ffmpeg process in order to achieve what I've already explained.

    I've searched the documentation to find out which parameters I need to pass to the Ffmpeg process so that it reads raw data from stdin instead of from a file, and also so that it waits for my Node.js process to render the next frame (i.e. with no time limit), because rendering a frame may take more than 24 hours. The Ffmpeg process should therefore wait without any time limit. However, I didn't find anything about this in the documentation.
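
    A minimal sketch of what such a pipeline might look like, assuming raw rgb24 frames at 1920 x 1080; the frame rate of 30, the output name output.mp4 and the x264 settings are illustrative assumptions rather than anything stated in the question:

    const { spawn } = require('child_process');

    const width = 1920, height = 1080;
    const fps = 30; // assumption: use whatever frame rate the animation actually has

    // Ask ffmpeg to treat stdin as headerless rgb24 video and encode it to mp4.
    const ffmpeg = spawn('ffmpeg', [
      '-f', 'rawvideo',                      // input has no container, just pixels
      '-pixel_format', 'rgb24',              // 3 bytes per pixel, as in the Buffer
      '-video_size', width + 'x' + height,   // frame dimensions
      '-framerate', String(fps),             // timing of the input frames
      '-i', 'pipe:0',                        // read the frames from stdin
      '-c:v', 'libx264',
      '-pix_fmt', 'yuv420p',                 // widely playable output pixel format
      'output.mp4'                           // illustrative output name
    ]);

    ffmpeg.stderr.pipe(process.stderr);      // surface ffmpeg's own log output

    // Write one rendered frame (a Buffer of width * height * 3 bytes) and call
    // `done` when it is safe to render the next one (this respects back-pressure).
    function writeFrame(frameBuffer, done) {
      if (!ffmpeg.stdin.write(frameBuffer)) {
        ffmpeg.stdin.once('drain', done);
      } else {
        process.nextTick(done);
      }
    }

    // When the animation is finished:
    //   ffmpeg.stdin.end();   // ffmpeg then finalizes output.mp4 and exits

    As far as the stdin input is concerned, ffmpeg simply blocks while waiting for more data on the pipe, so there is no time limit to disable: it waits as long as the Node.js process takes to produce the next frame and finishes the file once stdin is closed.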

    I know how to write to stdin from Node.js and similar technical stuff, so no need to explain it. The only question(s) here:

    1. Which parameters do I need to pass to Ffmpeg?
    2. Do I need to create the Ffmpeg process (using child_process) with some special options?

    Thank you in advance. Please take it easy, this is my first question! :)

  • How to capture UDP packets from ffmpeg with Wireshark?

    15 September 2017, by Davis8988

    I have two laptops connected via a LAN cable, with static IPv4 addresses.
    I use ffmpeg to capture the desktop of laptop1 and stream it to laptop2:

    ffmpeg -y -rtbufsize 100M -f gdigrab -framerate 30 -probesize 10M -draw_mouse 1 \
      -i desktop -c:v libx264 -r 30 -preset Medium -tune zerolatency -crf 25 -f mpegts \
      udp://150.50.1.2:1080

    And I use ffplay to receive the stream on laptop2 and play it, and it works - I can see laptop1's desktop:

    ffplay -fflags nobuffer -sync ext udp://127.0.0.1:1080  

    Now I also want to capture the UDP packets sent by ffmpeg to laptop2 with Wireshark (I start Wireshark on laptop2).
    But when I start Wireshark and press "capture", I don't see any packets being sent from laptop1's IP (as the source). I see a few lines being added in Wireshark every minute or so, but from different IPs.

    Why can't I see the stream sent by ffmpeg from laptop1 to laptop2?
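
    (One way to narrow this down, assuming the capture is running on the interface the LAN cable is actually plugged into: Wireshark accepts a capture filter such as "udp port 1080", or the display filter udp.port == 1080, which should isolate the mpegts datagrams from laptop1 if they are reaching laptop2 at all.)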