
Other articles (37)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011

    The simplest way to add information to authors is to install the Inscription3 plugin. It also allows certain user-related behaviours to be modified (refer to its documentation for more information).
    It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins.

On other sites (7425)

  • Revision 5b11e38aa7: Merge "small update to peek_si/get_si documentation"

    12 July 2013, by James Zern

    Merge "small update to peek_si/get_si documentation"

  • FFMPEG decoder only decoding correctly on small resolutions

    2 December 2019, by Matija Manđurov

    So I am building an encoding/decoding system with FFMPEG to use in Unity. The idea is to start from a Texture in Unity, convert it to an MP4 stream, send it over a network connection, and turn it back into a Texture on the other side. I have a working version that handles small resolutions correctly, and I am pretty sure the reason it fails at higher resolutions is not performance related. The commands are as follows:

    1) Converting textures to an MP4 stream:

    "-pixel_format rgba -colorspace bt709 -f rawvideo -vcodec rawvideo -framerate {_frameRate} -video_size {_width}x{_height} -i - -pix_fmt yuv420p -f mpegts - "

    Input and output both use stdin and stdout. This creates a correct MP4 stream that I have verified works in real time at least up to Full HD resolution using FFplay, so performance-wise the encoder is fine.

    2) Converting the MP4 stream to raw video:

    "-i - -r {_frameRate} -s {_width}x{_height} -pix_fmt yuv420p -f rawvideo - "

    The input is the incoming MP4 stream sent over the network, and the output is written to stdout as raw yuv420 frames. This also works in real time at small resolutions. My guess is that I am not defining the input parameters correctly, since the problems are as follows.

    1) The process works with no delays at resolutions where both height and width are up to 144. For example, 144x144 works perfectly, but 160x144 doesn't work. I assume that small a difference cannot be performance related. Is there a buffer that needs to be expanded for larger resolutions?

    2) I noticed that the bit rate and frame rate of the encoder and decoder differ greatly. For example, when using a 144x144 texture the encoder bit rate is 140 kbit/s and the frame rate is 30 (as defined by me), while the decoder for that resolution runs at a frame rate of 66 and a bit rate of 7250 kbit/s. Again, I assume FFMPEG is reading the incoming video wrong because I am failing to define the input parameters correctly.

    3) When increasing the resolution beyond 144x144 I noticed that FFMPEG outputs frames at a much higher rate. I set up a system that collects data from stdout and prepares a frame once it has collected enough bytes. At a low resolution this happens at exactly the frame rate specified (in most cases 30), while as soon as the resolution is increased it outputs at roughly 200 frames per second.

    4) Finally, FFMPEG is not producing any errors, and visually, when the resolution is too large, the resulting textures are distorted, looking like green static. The green, I assume, comes from my yuv420-to-RGB converter.

    In summary, my main question is:

    When receiving an MP4 stream over UDP packets that is then forwarded over stdin to FFMPEG, how does one correctly define the input parameters to make sure FFMPEG reads the video correctly and outputs raw yuv420 frames? (A sketch of the two pipes, for reference, follows below.)

    Thank you
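
    The following is a minimal Python sketch of the two pipes described above, purely for reference (the question itself drives ffmpeg from Unity/C#). The dimensions and frame rate are placeholder values, and the explicit "-f mpegts" input flag on the decoder is an addition of this sketch, not part of the original command:

        import subprocess

        # Placeholder values; the question fills {_width}, {_height}, {_frameRate} from Unity.
        WIDTH, HEIGHT, FPS = 1280, 720, 30

        # Step 1: raw RGBA frames in on stdin, MPEG-TS out on stdout.
        encoder = subprocess.Popen(
            ["ffmpeg",
             "-pixel_format", "rgba", "-colorspace", "bt709",
             "-f", "rawvideo", "-vcodec", "rawvideo",
             "-framerate", str(FPS), "-video_size", f"{WIDTH}x{HEIGHT}",
             "-i", "-",
             "-pix_fmt", "yuv420p", "-f", "mpegts", "-"],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE)

        # Each encoder.stdin.write() should push exactly one frame: WIDTH * HEIGHT * 4 bytes (RGBA).
        # encoder.stdout must be drained (e.g. on another thread) or the pipe fills and blocks.

        # Step 2: MPEG-TS in on stdin, raw yuv420p frames out on stdout.
        # "-f mpegts" before "-i -" tells ffmpeg explicitly what arrives on the pipe instead of letting it probe.
        decoder = subprocess.Popen(
            ["ffmpeg",
             "-f", "mpegts", "-i", "-",
             "-r", str(FPS), "-s", f"{WIDTH}x{HEIGHT}",
             "-pix_fmt", "yuv420p", "-f", "rawvideo", "-"],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE)

        # One yuv420p frame = a full-size luma plane plus two quarter-size chroma planes.
        YUV_FRAME_SIZE = WIDTH * HEIGHT * 3 // 2

        def read_yuv_frame() -> bytes:
            """Accumulate exactly one yuv420p frame from the decoder's stdout."""
            chunks, remaining = [], YUV_FRAME_SIZE
            while remaining:
                chunk = decoder.stdout.read(remaining)
                if not chunk:
                    raise EOFError("decoder closed its output")
                chunks.append(chunk)
                remaining -= len(chunk)
            return b"".join(chunks)

    Whatever language drives the pipes, reading fixed-size chunks of WIDTH * HEIGHT * 3 / 2 bytes per yuv420p frame is what keeps the stdout stream aligned to frame boundaries.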

  • Small pieces of video split by ffmpeg are not playing or are playing with artifacts

    14 November 2011, by Rnd_d

    I'm trying to cut a video file into small parts (0.5-5 secs) and encode these parts to h.264/aac with ffmpeg. I'm using this cmd:

    ffmpeg -i 1.avi -ss 00:05:00 -t 00:00:01,50 -vcodec libx264 -threads 0
    -vpre default -acodec libfaac out1.mp4

    In VLC, some pieces play without video (sound only), and other pieces have video artifacts like this screenshot.

    Maybe I didn't use some necessary arguments for ffmpeg?
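
    A minimal Python sketch of the same cut, only to illustrate the shape of the command: the file name, timestamps and codec flags are taken verbatim from the question, and the one deliberate change is the duration value, since ffmpeg expects "." rather than "," as the decimal separator in time values.

        import subprocess

        def cut_segment(source: str, start: str, duration: str, out_path: str) -> None:
            """Cut one segment and re-encode it to H.264/AAC, mirroring the question's flags."""
            subprocess.run(
                ["ffmpeg",
                 "-i", source,
                 "-ss", start,      # placed after -i: slower, frame-accurate seek
                 "-t", duration,    # e.g. "00:00:01.50" -- '.' as the decimal separator
                 "-vcodec", "libx264", "-threads", "0", "-vpre", "default",
                 "-acodec", "libfaac",
                 out_path],
                check=True)

        cut_segment("1.avi", "00:05:00", "00:00:01.50", "out1.mp4")

    On newer ffmpeg builds the "-vpre default" preset-file option is replaced by "-preset" and libfaac by the built-in aac encoder, so the flags above match the 2011-era build used in the question rather than a current one.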