Advanced search

Media (1)

Keyword: - Tags -/artwork

Other articles (90)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    To use it, simply enable the Chosen plugin (General site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Writing a news item

    21 June 2013

    Present the changes on your MediaSPIP site, or news about your projects, using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form: for a document of type "news item", the default fields are: Publication date (customise the publication date) (...)

On other sites (9849)

  • X264: Why does "--pass 1 --stats" make the file bigger?

    4 May 2016, by Codecodeup

    I am using --pass 1 --stats <stats file location> to generate the stats file for H.264 encoding with x264. The command is like this:

    <x264 binary> <input file> -o <output file> --preset veryslow --crf 27
     --tune ssim --scenecut 0 --pass 1 --stats <stats file location>

    When I run this command with and without --pass 1 --stats <stats file location>, the sizes of the final output files are very different: the one encoded with --pass 1 --stats is much bigger. What is the reason? Thanks.
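
    For reference, the usual two-pass recipe pairs --pass 1 and --pass 2 with a bitrate target and reuses only the stats file from the first pass. A minimal sketch of that workflow, not the poster's exact setup (file names and the 1500 kbit/s target below are placeholders):

    # Pass 1: gather rate-control statistics in stats.log;
    # the pass-1 video itself is normally thrown away.
    x264 --preset veryslow --tune ssim --scenecut 0 \
         --pass 1 --stats stats.log --bitrate 1500 \
         -o pass1.mkv input.y4m

    # Pass 2: re-encode using stats.log to hit the target bitrate.
    x264 --preset veryslow --tune ssim --scenecut 0 \
         --pass 2 --stats stats.log --bitrate 1500 \
         -o output.mkv input.y4m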

  • Seeking MP4 with audio stream inaccurate in mobile Safari

    23 February 2015, by dubbelj

    I have an issue with seeking an MP4 video file on iOS (iPad). We use currentTime to skip through the video. This works very well in most browsers, and also for H.264 files without an audio stream on iOS. When I add an audio stream to the MP4 file, however, seeking becomes very inaccurate. This leads to all sorts of issues, as the application relies heavily on timing.

    We have definitely established that a file encoded (using ffmpeg) with exactly the same settings but without audio works fine.

    These are the ffmpeg parameters I’m using when encoding with audio (line breaks added for readability):

    ffmpeg
    -i source.mov
    -force_key_frames 00:00:04.840,00:00:04.880,00:00:04.920,00:00:04.960,00:00:05.000,00:00:05.040,00:00:05.080,00:00:14.760,00:00:14.800,00:00:14.840,00:00:14.880,00:00:14.920,00:00:14.960,00:00:15.000,00:00:24.760,00:00:24.800,00:00:24.840,00:00:24.880,00:00:24.920,00:00:24.960,00:00:25.000,00:00:31.720,00:00:31.760,00:00:31.800,00:00:31.840,00:00:31.880,00:00:31.920,00:00:31.960
    -c:a libvo_aacenc
    -b:a 128k
    -ar 44100
    -c:v libx264
    -preset slow
    -crf 21
    -g 25
    -r 25
    -maxrate 2000k
    -bufsize 2000k
    -pix_fmt yuv420p
    -movflags +faststart
    -tune zerolatency
    -vstats
    encoded.mp4

    Things I’ve tried:

    • all available AAC and MP3 codecs
    • different audio and video bitrates
    • different audio sample rates
    • with and without the zerolatency and faststart flags
    • the maximum number of keyframes in the file
    • setting keyframes only at (and around) the area I need to seek to

    Any ideas, anyone?
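
    One way to narrow this down (a diagnostic sketch, not a fix; encoded.mp4 refers to the output above) is to list the keyframe flags and timestamps that actually ended up in the muxed file and compare them with the -force_key_frames positions and the times being seeked to:

    # Print, for every video frame, whether it is a keyframe and its timestamp
    # (pkt_pts_time is called pts_time in newer ffprobe builds); entries with
    # key_frame=1 should line up with the forced keyframe times.
    ffprobe -v error -select_streams v:0 \
            -show_entries frame=key_frame,pkt_pts_time \
            encoded.mp4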

  • Can I stream WebM video to a video element through a WebSocket?

    5 February 2015, by Ravenstine

    I’d like to be able to capture my desktop using FFmpeg and send the video through a WebSocket to the client where the video stream can be played in an HTML5 video tag. It seems that the current way to accomplish this is by sending JPG data and placing it on a canvas element. I would rather stream actual video and audio data before resorting to that.

    Below is my server code. Note that I am running Ruby 2.2.0 on Linux (Debian stable).

    require 'bundler'
    Bundler.require

    EM.run do
     EM::WebSocket.run host: "0.0.0.0", port: 9393 do |ws|

       cmd = "ffmpeg -y -f x11grab -s 1600x900 -r 15 -i :0.0 -tune fastdecode -b:v 150k -threads 4 -f webm -"
       @handler = EM.popen3(cmd, stdout: Proc.new { |data|
         puts(data)
         ws.send(Base64.encode64(data))
       }, stderr: Proc.new{|err|})

       @handler.callback do
         puts "hey"
       end

       @handler.errback do |err_code|
         puts err_code
       end

       ws.onopen do |handshake|
         puts "WebSocket connection open"
       end

       ws.onclose do
         puts "Connection closed"
         # @handler.kill('TERM', true)
       end

     end

     class SimpleView < Sinatra::Base
       set :public_folder, './'
       configure do
         set :threaded, false
       end
       get '/' do
         send_file
       end
     end

     EM.run do
       Rack::Server.start({
         app:    Rack::Builder.app{map('/'){ run SimpleView.new }},
         server: 'thin',
         Host:   '0.0.0.0',
         Port:   '8181'
       })
     end

    end

    Here is the client code (JavaScript):

    var stream = new WebSocket('ws://localhost:9393')
    var videoElement = document.querySelector("#desktop")
    var videoSource = document.querySelector("source")
    window.MediaSource = window.MediaSource || window.WebKitMediaSource;
    var mediaSource = new MediaSource()
    videoElement.src = window.URL.createObjectURL(mediaSource)

    stream.onopen = function(){
     console.log('connection open')
    }

    stream.onclose = function(){
     console.log('connection closed')
    }

    mediaSource.addEventListener('sourceopen', function(e){
     var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,vorbis"')

     stream.onmessage = function(e){
       var byteCharacters = atob(e.data)

       var byteNumbers = new Array(byteCharacters.length)
       for (var i = 0; i < byteCharacters.length; i++) {
         byteNumbers[i] = byteCharacters.charCodeAt(i)
       }

       var byteArray = new Uint8Array(byteNumbers)

       sourceBuffer.appendStream(byteArray)

     }

    }, false)

    What’s happening is that I capture FFmpeg’s stdout, convert each chunk of data to base64, and send it through the WebSocket; the client then tries to decode each base64 chunk into something a source buffer can understand, so that the MediaSource plays in the video element.
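
    As a first sanity check it may help to take the WebSocket and MediaSource out of the loop and confirm that the WebM written to stdout is decodable on its own. A sketch assuming the same X11 display as in the server code and that ffplay is installed (the VP8 encoder is made explicit here):

    # Pipe the live x11grab capture straight into ffplay; if this does not play,
    # the problem is on the capture/encode side rather than in the browser.
    ffmpeg -f x11grab -s 1600x900 -r 15 -i :0.0 \
           -c:v libvpx -b:v 150k -f webm - | ffplay -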

    Though it doesn’t surprise me that this doesn’t work, I would still like it to, and I’m hoping there’s just one little thing I’m missing. I get nothing but black inside the video element.

    NOTE: I am using the original FFmpeg, NOT Avconv. I compiled it with all codecs enabled.

    The full source is available at https://github.com/Ravenstine/simpleview.