Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (111)

  • (De)Activating features (plugins)

    18 February 2011, by

    To manage the addition and removal of extra features (plugins), MediaSPIP has relied on SVP since version 0.2.
    SVP makes it easy to activate plugins from the MediaSPIP configuration area.
    To reach it, simply go to the configuration area and open the "Plugin management" page.
    By default, MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...)

  • The plugin: Podcasts.

    14 July 2010, by

    The problem of podcasting is, once again, one that reveals the state of standardization of data transport on the Internet.
    Two formats of interest exist: the one developed by Apple, heavily geared towards the use of iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
    File types supported in feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites for publishing documents of all types.
    It creates "media", namely: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a "media" article;

On other sites (7403)

  • Haskell - Turning multiple image-files into one video-file using the ffmpeg-light package

    25 April 2021, by oRole

    Background

    I wrote an image-processing application that uses the ffmpeg-light package to fetch all the frames of a given video file, so that the program can afterwards apply grayscaling as well as edge-detection algorithms to each of the frames.

    Now I'm trying to put all of the frames back into a single video file.

    Used Libs

    • ffmpeg-light-0.12.0
    • JuicyPixels-3.2.8.3
    • ...

    What have I tried?

    I have to be honest: I didn't really try anything yet, because I'm rather clueless about where and how to start. I saw that there is a package called Command which allows running processes/commands from the command line. With that I could use ffmpeg (not ffmpeg-light) to create a video out of image files, which I would first have to save to the hard drive, but that would be rather hacky.

    Within the documentation of ffmpeg-light on Hackage (ffmpeg-light docu) I found the frameWriter function, which sounds promising.

    frameWriter :: EncodingParams -> FilePath -> IO (Maybe (AVPixelFormat, V2 CInt, Vector CUChar) -> IO ())

    I guess FilePath would be the location where the video file gets stored, but I can't really imagine how to feed the frames to this function, or what to supply as EncodingParams.
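
    My best guess at how this raw interface is meant to be driven is the sketch below (completely untested). defaultParams comes from Codec.FFmpeg, avPixFmtRgb24 from Codec.FFmpeg.Enums, and the resolution, file name and RGB byte layout are placeholders of mine, not anything the docs promise.

    -- Hypothetical driver for the raw interface: each frame is handed over as
    -- (pixel format, resolution, packed pixel bytes); Nothing finalizes the file.
    import Codec.FFmpeg (defaultParams, frameWriter, initFFmpeg)
    import Codec.FFmpeg.Enums (avPixFmtRgb24)
    import qualified Data.Vector.Storable as V
    import Foreign.C.Types (CUChar)
    import Linear (V2 (V2))

    writeRawFrames :: [V.Vector CUChar] -> IO ()
    writeRawFrames frames = do
      initFFmpeg
      -- 640x360 and "out.mp4" are placeholder values.
      writeFrame <- frameWriter (defaultParams 640 360) "out.mp4"
      mapM_ (\bytes -> writeFrame (Just (avPixFmtRgb24, V2 640 360, bytes))) frames
      writeFrame Nothing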

    Others

    I can access:

    • r, g, b, a as well as y, a values
    • image width / height / format

    Question

    Is there a way to achieve this using the ffmpeg-light package?

    As the ffmpeg-light package lacks documentation when it comes to converting images to video, I would really appreciate your help. (I do not expect a fully working solution.)

    Code

    The code that reads the frames:

    import Codec.FFmpeg           -- imageReaderTime, File
    import Codec.Picture          -- Image, PixelRGB8, DynamicImage (ImageRGB8)
    import Control.Exception (SomeException, try)

    -- (printStatus is a small logging helper defined elsewhere in the app.)

    -- Gets and returns all frames that a given video contains.
    getAllFrames :: String -> IO [(Double, DynamicImage)]
    getAllFrames vidPath = do
      result <- try (imageReaderTime $ File vidPath)
                  :: IO (Either SomeException (IO (Maybe (Image PixelRGB8, Double)), IO ()))
      case result of
        Left _              -> do
          printStatus "Invalid video-path or invalid video-format detected." "Video"
          return []
        Right (getFrame, _) -> addNextFrame getFrame []

    -- Accumulates all available frames of a video.
    addNextFrame :: IO (Maybe (Image PixelRGB8, Double)) -> [(Double, DynamicImage)] -> IO [(Double, DynamicImage)]
    addNextFrame getFrame frames = do
      frame <- getFrame
      case frame of
        Nothing          -> do
          printStatus "No more frames found." "Video"
          return frames
        Just (img, time) -> do
          -- Reuse the frame just read; calling getFrame a second time here
          -- (as the original version did) silently drops every other frame.
          let newFrameData = (time, ImageRGB8 img)
          printStatus ("Frame: " ++ show (length frames) ++ " added.") "Video"
          addNextFrame getFrame (frames ++ [newFrameData])

    Where I am stuck / The code that should convert images to video:

    -- Converts several images into a video
    juicyToFFmpeg :: [Image PixelYA8] -> ?
    juicyToFFmpeg imgs = undefined
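
    One direction that looks viable (an untested sketch of mine): ffmpeg-light also ships a JuicyPixels-aware writer, Codec.FFmpeg.Juicy.imageWriter, which accepts Image values directly, so the raw Vector CUChar plumbing of frameWriter could be sidestepped. The extra FilePath parameter and the YA8-to-RGB8 promotion are my assumptions; defaultParams supplies the library's default frame rate.

    -- Sketch (untested): write JuicyPixels frames straight to a video file.
    import Codec.FFmpeg       (defaultParams, initFFmpeg)
    import Codec.FFmpeg.Juicy (imageWriter)
    import Codec.Picture
    import Codec.Picture.Types (promoteImage)

    juicyToFFmpeg :: [Image PixelYA8] -> FilePath -> IO ()
    juicyToFFmpeg []   _    = return ()
    juicyToFFmpeg imgs path = do
      initFFmpeg
      let w = fromIntegral (imageWidth  (head imgs))
          h = fromIntegral (imageHeight (head imgs))
      -- imageWriter returns a feed action; feeding Nothing finalizes the file.
      writeFrame <- imageWriter (defaultParams w h) path
      mapM_ (writeFrame . Just . toRGB8) imgs
      writeFrame Nothing
      where
        -- promoteImage handles YA8 -> RGB8 via JuicyPixels' ColorConvertible.
        toRGB8 :: Image PixelYA8 -> Image PixelRGB8
        toRGB8 = promoteImage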

  • How to take screenshots of streamers using the Twitch API

    10 December 2020, by oo92

    My goal is to create a dataset of gameplays on Twitch using the API. This is how I want to do it:

    1. Get a list of live streams using the API.
    2. Use streamlink and ffmpeg in Python to take the screenshots through the stream source.

    To get the streams, I have the following code, thanks to an SO user:

    from twitch import TwitchClient

    client = TwitchClient(client_id='<my client id>')
    streams = client.streams.get_live_streams(limit=100)

    print(streams)

    The output is this. It's a lot larger, I just made it shorter...:

    [{'id': 40889579422, 'game': 'Among Us', 'broadcast_platform': 'live', 'community_id': '', 'community_ids': [], 'viewers': 74594, 'video_height': 1080, 'average_fps': 60, 'delay': 0, 'created_at': datetime.datetime(2020, 12, 9, 23, 53, 34), 'is_playlist': False, 'stream_type': 'live', 'preview': {'small': 'https://static-cdn.jtvnw.net/previews-ttv/live_user_sykkuno-80x45.jpg', 'medium': 'https://static-cdn.jtvnw.net/previews-ttv/live_user_sykkuno-320x180.jpg', 'large': 'https://static-cdn.jtvnw.net/previews-ttv/live_user_sykkuno-640x360.jpg', 'template': 'https://static-cdn.jtvnw.net/previews-ttv/live_user_sykkuno-{width}x{height}.jpg'}, 'channel': {'mature': False, 'status': 'amongus at 4 !!', 'broadcaster_language': 'en', 'broadcaster_software': '', 'display_name': 'Sykkuno', 'game': 'Among Us', 'language': 'en', 'id': 26154978, 'name': 'sykkuno', 'created_at': datetime.datetime(2011, 11, 15, 1, 29, 29, 140794), 'updated_at': datetime.datetime(2020, 12, 10, 0, 35, 50, 916363), 'partner': True, 'logo': 'https://static-cdn.jtvnw.net/jtv_user_pictures/sykkuno-profile_image-6ab1e70e07e29e9b-300x300.jpeg', 'video_banner': 'https://static-cdn.jtvnw.net/jtv_user_pictures/4b654ce5-58dc-4fa6-b77c-7250bb2d5269-channel_offline_image-1920x1080.png', 'profile_banner': 'https://static-cdn.jtvnw.net/jtv_user_pictures/1caee146-3323-4d45-9907-96c20c224d3e-profile_banner-480.png', 'profile_banner_background_color': '', 'url': 'https://www.twitch.tv/sykkuno', 'views': 23590965, 'followers': 1928724, 'broadcaster_type': '', 'description': 'Hi ! ', 'private_video': False, 'privacy_options_enabled': False}}, ...

    First, I want to know how I can iterate through the JSON to get the channel name using the id. I tried to do it on my own, but it said that it isn't subscriptable.
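
    For the first part, a sketch of my current guess (untested): python-twitch-client seems to return objects whose fields are attributes rather than plain dicts, which would explain the "not subscriptable" error.

    # Sketch: read channel fields via attribute access.
    # If your client version does return plain dicts, use
    # stream['channel']['name'] instead.
    for stream in streams:
        channel = stream.channel
        print(channel.id, channel.name)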

    Second, I have the following code that tries to use ffmpeg to get a screenshot of the stream through its source:

    import streamlink, os

    # This code connects to the streamer's source
    # Get the Twitch API to work so you can add the streamer's name at the end of the link
    username = 'Sykkuno'
    streams = streamlink.streams('http://twitch.tv/' + username)

    # Stream source
    stream = streams["best"].url

    # Directory where the screenshots will be saved
    dir_path = os.getcwd() + '/' + username

    # number of streamers
    streamers = 1

    os.system('ffmpeg -i ' + stream + ' -r 0.5 -f image2 ${dir}/output_%09d.jpg')

    But that is throwing the following error:

    ffmpeg version 4.1.3-0ppa1~18.04 Copyright (c) 2000-2019 the FFmpeg developers
      built with gcc 7 (Ubuntu 7.3.0-27ubuntu1~18.04)
      configuration: --prefix=/usr --extra-version='0ppa1~18.04' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-nonfree --enable-libfdk-aac --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
      libavutil      56. 22.100 / 56. 22.100
      libavcodec     58. 35.100 / 58. 35.100
      libavformat    58. 20.100 / 58. 20.100
      libavdevice    58.  5.100 / 58.  5.100
      libavfilter     7. 40.101 /  7. 40.101
      libavresample   4.  0.  0 /  4.  0.  0
      libswscale      5.  3.100 /  5.  3.100
      libswresample   3.  3.100 /  3.  3.100
      libpostproc    55.  3.100 / 55.  3.100
    [hls,applehttp @ 0x557212854940] Opening 'https://video-edge-c2a360.yto01.abs.hls.ttvnw.net/v1/segment/CuAErHgNDEVK3cqjkRZLcz4El99YD2ZSy9tgtjc3ZTrPuN4584-GY3DKYO3MFCBt9M3v8_7IfHAlLvHeUn2wq4d-VC7u4Zv2-k1Bee9IPmIXbFQLZFKrYiT98OTzFMDaElsEZ9Sr4MXz6FoAC1yXhhZ0MYahZVnZa1Tuqnn217nN_rN8wr7kdScBIir_Uo1s2C_I8_54mi2uzdgxB9AYj0a0kG2UtdPkUVA6Qc7XIZ0nUhEeObEf80N0uW9ZU6WHpO3V6G9RD0VkmwUexDWk9MaLy1NJuvLSzRaMfOmKxhrzn3UDoY4CrQN11KOnHYCiCOfvhZmMKzSqtiA8YP0Q9iS03eZZhPQ3WxBHWhd6VeZ0btNeeoudGX73EBIj4ujZnKWKfPiN8K_HLj5FZWqQ5m_4Q1llQlnSAfmhzXR9PHAkz8nxRVcFR-tFokGzFEfkZGHngPNz9boLmo4KHx6404rocPUcXbTHkYsWXZwFC356AhfrNn6x0JYHGfcDpRsEFebLQljEpCyhnNOHFEf9b7Ipeiy521cCupzoEz1uMrzW77h9FJwn0GwvY3jp2KwRXJMvArYwiUHccBmfW7ZDLO9vH7eJGUnAzoy586KMSPLVeLxWMWRDfKkAcI8vYXOIuxNb_MZKm2O_dcoFrDDta5TZn-MLjFquT47P9NbXxlTTJfimlYyMG17SPoW3KJVtAJmjoj0GZ1ehftLJWz1Qgd8zmg40u4g8fz867eCHW_Tiv65yd6qJZD-_bD3G916fm3KeA9bPyhalTD99CWhQL7f0apCfX_6IiQElgPT6kOrFI-ISEG4GdhIpFi4ubzSx29pm7H4aDIXx6BAI5ziHVMPbrg.ts' for reading
    [hls,applehttp @ 0x557212854940] Opening 'https://video-edge-c2a360.yto01.abs.hls.ttvnw.net/v1/segment/CuIECWLarz2I5hNCtGLSTDXbCIgahI91KOHXVmPHddzZediyiVbcicZpGakO99CJp1cK_6OnmOKBzZ3sn3KogNyPVRqSy32IpanlyPPUTz23TojfR5DTmJa3zampOzVMfETvvPEpj58-kGhQeQQGtK1zLV2h465RElCLkmc7SeWbXEZ5j6IQ38OVFZ0vdcMHVTNPtJaY7509bE196-9-5YYFiET_-kdkS2X6_lhVQBZrq45PBxApTzeLkDx9ZtGQx78dLr_tZepww_uVnJTwxI_TA3tU8z5_w8ml6rh0GK1W6lTlvuDkPKAwIDLvG736MtbPGz9cFPoRAFQxD3QSwAM-bO1grvWlNsOUDUYLXLNjuejmz8xmRpeE0pqJYxUboRZpxrPXTi52HcX8lpGT4Lx2z6hJcoi9npQttK58HDSHQ3cYH3rlNsYV_RlZ3F-u-fZSn8Em67-vAYeXAMaBe9xxv0Zu2n1TrdPICMyGmk-VgLK788IeDhxqM441GONGLxo6AF8RE5_OawTf9n_MVzImsX4LMn0oN8e8w6mjk7YDuNdA4mEl99Erg8xMdX6Q3fDYTdStaC-zQwXgMctfGpIsUpp91BtRBPMynFCxZ8fZB7NvGFNZTYWDvCZjisiwzs0N1pDdDM_Fkj3i46_Ou105z348PLHRA28Dt1qgn_NjzwRNoaowFx7PxQ-X2zRpAhwNcqaTOHYh5NYZHToVE16WiOe9HzVs_I_3Wqu44bypEsGrhxYhgXSGfvO757iTbErZHmkidsBF2BET5j3JeIifr4fJooalAKc_5uiSPMCGSTtV9hIQX5mQWJudg9UsMRp1AWVrnBoMQPAhy1UwBDtAOrBs.ts' for reading
    Input #0, hls,applehttp, from 'https://video-weaver.yto01.hls.ttvnw.net/v1/playlist/CvADco9ZO_szpL7RVLVI3U6zl4IMo2uaUBvZTWAyEOKETN-SxI3m3hLLxpmCfz06rlyCGCPimEbk28kKPSbVYtMThiZFRtLxnrdpLr5DWBQwbpIStPQwMetU4Z04Mkm3nEWplL8hpn69Cmd_eWSdI1yHubK_sRL-n1ml5akAEWWGkS0OgVRQTTsYv4RpkbeWc8wrr3DsDBDfyrxciGSZevnStZOLwg-tNuNu3VNugLigNG1HMsNdxoJGO6wLyO-6FL5EQidOiSw2THSibAvADAWMVOxNX2z39d-nJwbVprbWFnox_lDlh-y88uhjj_MpU6OwkXP9vgrBDEmKjtRbYsIN5N94-8pDFEB3yIYk10vUiuod1yfqztVCqWo5m9r48uINP1CD-Bwgybo6K7Oko-TUjMj43GqAu0mmPtkgPFO69LpQQMifZTn_XbVQ5UUfoCwyl-ljObfRE51aCvZP_dOTsUyqi_JRE8h-C3306aW1ISXwCs3YpnjDxJv6yKyWTktBvsN5NWMVzJg-_-MAtTmoy-w2ppbbOozLzyrGF4zfCetaHLmHtbsKe1Ed1nndJg9b6T7v-87ExrI0eQwRjH3gmBTgqu_peS5oQGBaDOe4QSGMNFgf8lQfuqvFH-miXgpk8RytdorzT7wB05iX7c3bhBIQ3FvvjNIal3IgvyxKatZs4BoMW_P_CuyZOUJs2_Yr.m3u8':
      Duration: N/A, start: 60.000000, bitrate: N/A
      Program 0
        Metadata:
          variant_bitrate : 0
        Stream #0:0: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp
        Metadata:
          variant_bitrate : 0
        Stream #0:1: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(tv, unknown/bt709/unknown), 1920x1080, 29.97 tbr, 90k tbn, 2k tbc
        Metadata:
          variant_bitrate : 0
        Stream #0:2: Data: timed_id3 (ID3  / 0x20334449)
        Metadata:
          variant_bitrate : 0
    Stream mapping:
      Stream #0:1 -> #0:0 (h264 (native) -> mjpeg (native))
    Press [q] to stop, [?] for help
    [swscaler @ 0x557213e262c0] deprecated pixel format used, make sure you did set range correctly
    Output #0, image2, to '/output_%09d.jpg':
      Metadata:
        encoder         : Lavf58.20.100
        Stream #0:0: Video: mjpeg, yuvj420p(pc), 1920x1080, q=2-31, 200 kb/s, 0.50 fps, 0.50 tbn, 0.50 tbc
        Metadata:
          variant_bitrate : 0
          encoder         : Lavc58.35.100 mjpeg
        Side data:
          cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
    [image2 @ 0x557213012040] Could not open file : /output_000000001.jpg
    av_interleaved_write_frame(): Input/output error
    frame=    1 fps=0.0 q=7.5 Lsize=N/A time=00:00:02.00 bitrate=N/A speed=27.7x
    video:46kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
    Conversion failed!
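
    My reading of the error (an assumption, not something the log states directly): os.system hands the command to a shell in which ${dir} was never defined, so it expands to nothing and ffmpeg tries to write /output_%09d.jpg at the filesystem root; that matches the "Could not open file : /output_000000001.jpg" line. A sketch that builds the path in Python and creates the directory first:

    import os
    import subprocess

    # dir_path as defined above; make sure the directory exists first.
    os.makedirs(dir_path, exist_ok=True)

    # Pass an argument list instead of interpolating a shell string;
    # -r 0.5 keeps one screenshot every two seconds, as in the original command.
    subprocess.run([
        'ffmpeg', '-i', stream,
        '-r', '0.5', '-f', 'image2',
        os.path.join(dir_path, 'output_%09d.jpg'),
    ])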

  • When I use Fluent-Ffmpeg to access Ffmpeg, there are two different threads but I don't want that

    25 March 2019, by Ahmet Hakan Billur

    I'm trying to broadcast a live RTSP stream from an IP camera in a web app built with node.js (fluent-ffmpeg: https://www.npmjs.com/package/fluent-ffmpeg), WebSockets and HTML5 canvas (jsmpeg). The live streaming works, but frames go missing and streaming drives CPU usage up in the web app, so I want to reduce it by tuning ffmpeg through fluent-ffmpeg. When I monitor CPU usage, however, I can see two different ffmpeg processes:
    [screenshot of CPU usage]

    The first one, ffmpeg -rtsp_transport tcp -i rtsp://10.6.0.225 -f mpeg1video -, is run for jsmpeg and the HTML5 canvas.
    index.html

    <div><canvas id="videoCanvas" width="640" height="360"></canvas></div>

    <script type="text/javascript" src="jsLib/jsmpeg.js"></script>
    <script type="text/javascript" src="jsLib/ffmpegUtil.js"></script>

    <script type="text/javascript">
      var canvas = document.getElementById('videoCanvas');
      var ws = new WebSocket("ws://10.6.0.206:9999");
      var player = new jsmpeg(ws, {canvas: canvas, autoplay: true, audio: false, loop: true});
    </script>

    The other one, /usr/bin/ffmpeg -i rtsp://10.6.0.225 -y out.ts, is run by the following piece of code in app.js:

    var Stream = require('node-rtsp-stream');
    var stream = new Stream({
       name: 'name',
       streamUrl: 'rtsp://10.6.0.225',
       wsPort: 9999
    });

    var ffmpeg = require('fluent-ffmpeg');
    var proc = new ffmpeg();

    proc
    .addInput('rtsp://10.6.0.225')
    .on('start', function(ffmpegCommand) {
       /// log something maybe
       console.log('start-->'+ffmpegCommand)
    })
    .on('progress', function(data) {
       /// do stuff with progress data if you want
       console.log('progress-->'+data)
    })
    .on('end', function() {
       /// encoding is complete, so callback or move on at this point
       console.log('end-->')
    })
    .on('error', function(error) {
       /// error handling
       console.log('error-->'+error)

    })
    .output('out.ts')
    .run();

    I don't want to end up with two different ffmpeg processes there.
    Does anyone have an idea?
    Thanks in advance.
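
    If it helps, my understanding is that the two entries are two separate processes rather than threads: node-rtsp-stream spawns its own ffmpeg for the WebSocket feed, and fluent-ffmpeg spawns a second one to write out.ts. If the second process isn't actually needed, the single ffmpeg can be tuned through node-rtsp-stream's ffmpegOptions instead; the option values below are assumptions, not tested settings.

    // Sketch: configure the single ffmpeg that node-rtsp-stream spawns,
    // instead of starting another one through fluent-ffmpeg.
    var Stream = require('node-rtsp-stream');

    var stream = new Stream({
      name: 'name',
      streamUrl: 'rtsp://10.6.0.225',
      wsPort: 9999,
      ffmpegOptions: {
        '-rtsp_transport': 'tcp', // match the TCP transport used above
        '-r': 15                  // cap the frame rate to reduce CPU load
      }
    });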