
Media (91)

Other articles (88)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Uploading media and themes via FTP

    31 May 2013

    The MediaSPIP tool also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
    From the start, you will find the following directories in your FTP space: config/: the site's configuration directory; IMG/: media that have already been processed and are online on the site; local/: the website's cache directory; themes/: custom themes and stylesheets; tmp/: working directory (...)
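
    A minimal sketch of that workflow with a command-line FTP client; the host name, credentials and file name below are placeholders, and the excerpt above is truncated before it says which directory MediaSPIP watches for FTP deposits, so adjust the target directory to the one your site expects:

    # List the directories described above (config/, IMG/, local/, themes/, tmp/)
    curl --list-only "ftp://USER:PASSWORD@mediaspip.example.org/"
    # Upload a media file into the FTP space
    curl -T holiday.mp4 "ftp://USER:PASSWORD@mediaspip.example.org/"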

On other sites (8447)

  • FFmpeg burnt-in subtitles out of sync when converting to HLS

    19 May 2020, by user1503606

    I have a file with subtitles burnt into it, and they are perfectly in sync.

    Here is the file: https://983yqbz442.s3.amazonaws.com/little-mermaid-captions.mp4

    I run this command to convert to HLS, and it creates the .ts files and the .vtt files:

    ffmpeg -i little-mermaid-captions.mp4 -profile:v baseline -level 3.0 -s 640x360 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls index.m3u8

    I also then create a master.m3u8 file in the same folder with the following:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-INDEPENDENT-SEGMENTS
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,SUBTITLES="subtitles"
    index.m3u8
    #EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subtitles",NAME="English",DEFAULT=YES,AUTOSELECT=YES,FORCED=NO,LANGUAGE="ENG",URI="index_vtt.m3u8"

    Now if I play the master.m3u8 file, the subtitles are out of sync: they are about one second too quick. I understand this is probably a setting I am missing in FFmpeg, but I am really stuck on this and would appreciate any insight.

    Thanks

    More info.

    Here is a link to the direct .m3u8, which can be opened in Safari:

    https://983yqbz442.s3.amazonaws.com/hlstests/master.m3u8

    The generated .vtt file is here:

    https://983yqbz442.s3.amazonaws.com/hlstests/subs-0.vtt

    If you look at the start of the .vtt file you will see this:

    WEBVTT

    00:06.840 --> 00:10.320
    once long ago in the deep blue below

    The subtitle should start at 00:06.840, but when playing the .m3u8 file in Safari you will see it starts at around 5 seconds rather than 6, about a second too early.
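
    For context, FFmpeg's hls muxer can segment a WebVTT subtitle stream itself and write the index_vtt.m3u8 playlist that master.m3u8 refers to. A minimal sketch of that, assuming the MP4 carries a text subtitle track as stream 0:s:0 (an illustration, not a confirmed fix for the offset):

    ffmpeg -i little-mermaid-captions.mp4 \
        -map 0:v -map 0:a -map 0:s:0 \
        -profile:v baseline -level 3.0 -s 640x360 \
        -c:s webvtt \
        -start_number 0 -hls_time 10 -hls_list_size 0 \
        -f hls index.m3u8

    One place a sub-second drift between .ts segments and .vtt cues can come from is the MPEG-TS muxer's default start delay (-muxdelay, 0.7 s by default), since the WebVTT cues keep the source timing; whether that is what is happening here is worth checking with ffprobe before changing anything.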

  • ffmpeg RTSP streams to RGB24 using GPU

    31 January 2020, by jerq

    I have successfully output an RTSP stream to an RGB24 pipe, but I notice the CPU usage is still high (20-45%) despite using -hwaccel cuvid and -vcodec h264_cuvid (without hardware acceleration it can go up to 100-300%, based on the "top" command on Linux).

    The current code I am using is:

    import ffmpeg  # ffmpeg-python bindings
    # rtsp_add holds the RTSP URL of the camera stream
    process = (
        ffmpeg
        .input(rtsp_add, rtsp_transport='tcp', fflags='nobuffer', flags='low_delay', hwaccel='cuvid', vcodec='h264_cuvid', vsync=0)
        .output('pipe:', format='rawvideo', pix_fmt='rgb24')
        .run_async(quiet=False, pipe_stdout=True)
    )

    The moment I use -vcodec h264_nvenc, I get a bunch of errors, as I believe it is incompatible with RGB24. Using the command "ffmpeg -pix_fmts", I notice almost nothing falls under "Hardware accelerated format". I do not mind working in YUV420p, YUV444p, RGB24, etc. Does this mean that I am unable to output images into the pipe if I choose to use hardware encoding?

    For performance, I would prefer to write the output to a pipe rather than writing it to a physical file and reading it back to manipulate the data. The option to output to an in-memory JPG or PNG would also be fine.
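
    For reference, the ffmpeg-python call above corresponds roughly to a plain command line like the following sketch (the RTSP URL is a placeholder); each frame written to the pipe is width x height x 3 bytes of packed RGB:

    # Decode with NVDEC (h264_cuvid) and emit raw RGB24 frames on stdout.
    # Note: the NV12 -> RGB24 conversion still runs in software (swscale), which accounts for some residual CPU load.
    ffmpeg -hwaccel cuvid -c:v h264_cuvid -rtsp_transport tcp \
        -fflags nobuffer -flags low_delay \
        -i "rtsp://USER:PASSWORD@CAMERA_ADDRESS/stream" \
        -vsync 0 -f rawvideo -pix_fmt rgb24 pipe:1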

  • C/C++ examples for HW acceleration with FFMPEG on Android (JNI)

    12 February 2024, by Ramil Galin

    Are there any good C/C++ tutorials or code examples showing how one can write hardware-accelerated decoding on Android (JNI)?

    I found related questions here on Stack Overflow (here and here), but would like to know if there is something more contemporary and related to Android.