
Other articles (45)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or later. If in doubt, contact the administrator of your MediaSPIP to find out.

  • Possible deployments

    31 January 2010

    Two types of deployment are possible, depending on two factors: the intended installation method (standalone or as a farm); and the number of daily encodings and the expected traffic.
    Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), so all of this must be taken into account. The system is therefore only feasible on one or more dedicated servers.
    Single-server version
    The single-server version consists of using only one (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (13162)

  • FFmpeg deletes my file?

    9 January 2018, by Melvin Roest

    Does anyone know why ffmpeg sometimes seemingly deletes my files? It mostly happens when I put my laptop to sleep and wait for quite a bit after opening my Finder window.

    Here is an example command.

    ffmpeg -i Rec\ 01-9-18\ 1.trec -c:a aac -c:v libx264 tutorial.mp4

    It never happens when I don’t let my laptop sleep and open it immediately after finishing.

    I’m on Mac OS X.

  • Can I stream WebM video to a video element through WebSocket?

    5 February 2015, by Ravenstine

    I’d like to be able to capture my desktop using FFmpeg and send the video through a WebSocket to the client where the video stream can be played in an HTML5 video tag. It seems that the current way to accomplish this is by sending JPG data and placing it on a canvas element. I would rather stream actual video and audio data before resorting to that.

    Below is my server code. Note that I am running Ruby 2.2.0 on Linux (Debian Stable).

    require 'bundler'
    Bundler.require

    EM.run do
     EM::WebSocket.run host: "0.0.0.0", port: 9393 do |ws|

       cmd = "ffmpeg -y -f x11grab -s 1600x900 -r 15 -i :0.0 -tune fastdecode -b:v 150k -threads 4 -f webm -"
       @handler = EM.popen3(cmd, stdout: Proc.new { |data|
         puts(data)
         ws.send(Base64.encode64(data))
       }, stderr: Proc.new{|err|})

       @handler.callback do
         puts "hey"
       end

       @handler.errback do |err_code|
         puts err_code
       end

       ws.onopen do |handshake|
         puts "WebSocket connection open"
       end

       ws.onclose do
         puts "Connection closed"
         # @handler.kill('TERM', true)
       end

     end

     class SimpleView < Sinatra::Base
       set :public_folder, './'
       configure do
         set :threaded, false
       end
       get '/' do
         send_file
       end
     end

     EM.run do
       Rack::Server.start({
         app:    Rack::Builder.app{map('/'){ run SimpleView.new }},
         server: 'thin',
         Host:   '0.0.0.0',
         Port:   '8181'
       })
     end

    end

    Here is the client code (JavaScript):

    var stream = new WebSocket('ws://localhost:9393')
    var videoElement = document.querySelector("#desktop")
    var videoSource = document.querySelector("source")
    window.MediaSource = window.MediaSource || window.WebKitMediaSource;
    var mediaSource = new MediaSource()
    videoElement.src = window.URL.createObjectURL(mediaSource)

    stream.onopen = function(){
     console.log('connection open')
    }

    stream.onclose = function(){
     console.log('connection closed')
    }

    mediaSource.addEventListener('sourceopen', function(e){
     var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"')

     stream.onmessage = function(e){
       var byteCharacters = atob(e.data)

       var byteNumbers = new Array(byteCharacters.length)
       for (var i = 0; i < byteCharacters.length; i++) {
         byteNumbers[i] = byteCharacters.charCodeAt(i)
       }

       var byteArray = new Uint8Array(byteNumbers)

       sourceBuffer.appendStream(byteArray)

     }

    }, false)

    What’s happening is that I’m capturing the stdout of FFmpeg, converting each chunk of data to base64, and sending it through the WebSocket; the client then tries to decode each base64 chunk into something a source buffer can understand and play the MediaSource in the video element.

    Though it doesn’t surprise me that this doesn’t work, I would still like it to, and I’m hoping there’s just one little thing that I’m missing. I get nothing but black inside the video element.
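One detail worth checking, independent of the transport (a sketch, assuming the rest of the MSE setup is valid): the widely implemented append method on a SourceBuffer is `appendBuffer`, and it throws if called while a previous append is still processing (while `updating` is true), so chunks arriving faster than the buffer drains have to be queued and flushed on `updateend`. The `buffer` argument below is a stand-in for the object returned by `mediaSource.addSourceBuffer`:

```javascript
// Serializes appendBuffer() calls: MSE allows only one append at a time,
// so queue incoming chunks and flush the next one when 'updateend' fires.
// (Sketch; `buffer` stands in for mediaSource.addSourceBuffer's result.)
class AppendQueue {
  constructor(buffer) {
    this.buffer = buffer;
    this.pending = [];
    buffer.addEventListener('updateend', () => this.flush());
  }
  push(chunk) {          // call this from stream.onmessage
    this.pending.push(chunk);
    this.flush();
  }
  flush() {
    if (this.buffer.updating || this.pending.length === 0) return;
    this.buffer.appendBuffer(this.pending.shift());
  }
}
```

With something like this in place, `stream.onmessage` would call `queue.push(byteArray)`; it is also simpler to set `stream.binaryType = 'arraybuffer'` on the WebSocket and send the raw bytes, avoiding the base64 round trip entirely.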

    NOTE: I am using the original FFmpeg, and NOT Avconv. I compiled it with all codecs enabled.

    The full source is available at https://github.com/Ravenstine/simpleview.

  • How to deal with whitespace in folder names

    27 March 2015, by FlashSolutions

    I am trying to use ffmpeg to process input files with spaces in them.

    To check the syntax, I try using Dir in a cmd window and get "file not found".

    Dir "C:/DigitalSignageManager/LLRA photos/SpinningTop.mp4"

    I have tried escaping with \ and ^ and used double and single quotes, but nothing seems to work.

    What is the proper syntax to make this work?