Other articles (106)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes when moving from MediaSPIP version 0.1 to version 0.3. What's new?
    On the software-dependency side: use of the latest versions of FFmpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe for metadata retrieval; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

On other sites (14966)

  • Can FFmpeg output as if it were a WDM/DirectShow webcam?

    22 September 2015, by Jeffcurious

    I am using a 3D scanning program that takes a DirectShow input from a camera/webcam.

    To increase the quality of the scan, I purchased a Blackmagic HDMI acquisition card that appears as a "webcam" in Windows. I am capturing the live HDMI output from a Canon DSLR (60D) into the Blackmagic card and then into the scanning program. However, the Canon's live HDMI output is interlaced (1080i), which causes problems during the scan process.

    I would like to deinterlace the signal coming from the Blackmagic with FFmpeg and then output it as if it were a Windows WDM/DirectShow device.

    Can FFmpeg output a stream that appears to other applications as a regular webcam?
    Thanks,
    Jeff
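FFmpeg by itself cannot register as a WDM/DirectShow capture source, so the usual route is to let FFmpeg do only the deinterlacing (e.g. with the yadif filter) and hand the result to a separate virtual-camera driver that does appear as a webcam. A minimal sketch of the FFmpeg half, with an assumed device name (list the real ones with ffmpeg -list_devices true -f dshow -i dummy):

```shell
#!/bin/sh
# Sketch only: the device name below is an assumption; list the real
# ones with: ffmpeg -list_devices true -f dshow -i dummy
INPUT='video=Blackmagic WDM Capture'

# Grab the 1080i feed, deinterlace it with yadif, and stream it locally.
# The command is built as a string here so the sketch runs without the
# hardware attached; drop the quoting and run it directly on the capture box.
CMD="ffmpeg -f dshow -i \"$INPUT\" -vf yadif=mode=send_frame -f mpegts udp://127.0.0.1:1234"
echo "$CMD"
```

The receiving virtual-camera component is not something FFmpeg provides; a third-party driver has to pick up the deinterlaced stream and present it to DirectShow applications.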

  • FFmpeg: How to capture audio and metadata simultaneously

    12 January 2020, by ChrisDoernen

    I am developing a multi-platform app for live audio streaming, written in JS. The goal is to get the meters/volumes per channel while capturing audio from the sound card. This has to be done in one command, since I get the error "device or resource busy" when firing multiple commands with the same input.

    Capturing audio works fine using this command:

    ffmpeg -y -f alsa -i hw:CARD=PCH,DEV=0 -ac 2 -b:a 192k -acodec libmp3lame -f mp3 -probesize 64 -rtbufsize 64 -reservoir 0 -fflags +nobuffer -hide_banner pipe:1

    Getting the volume for the right channel works with this command (the left channel works analogously by passing 0.0.1 to -map_channel):

    ffmpeg -f alsa -i hw:CARD=PCH,DEV=0 -map_channel 0.0.0 -af ebur128=metadata=1,ametadata=print:key=lavfi.r128.M -f null pipe:1

    The question is how to combine these, providing a way to pipe the outputs correctly.

    As a first step, my current approach is to use the file argument of the ametadata filter (see the filter documentation) and to write to a socket opened with the following JS code

    var net = require('net');

    var server = net.createServer(function (stream) {
      stream.on('data', function (c) { console.log('data:', c.toString()); });
    });

    server.listen('/tmp/test.sock');

    like so:

    ffmpeg -f alsa -i hw:CARD=PCH,DEV=0 -map_channel 0.0.1 -af ebur128=metadata=1,ametadata=mode=print:key=lavfi.r128.M:file=unix\:/tmp/test.sock:direct -f null -

    but the socket receives no data and there is no error in ffmpeg.

    Redirecting the output of the streaming command to the socket, however, works:

    ffmpeg -y -f alsa -i hw:CARD=PCH,DEV=0 -ac 2 -b:a 192k -acodec libmp3lame -f mp3 -probesize 64 -rtbufsize 64 -reservoir 0 -fflags +nobuffer -hide_banner unix:/tmp/test.sock

    I am wondering what I am missing, and whether I am on the right track altogether.
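On the consuming side, once data does arrive, the print mode of ametadata emits line-oriented key=value pairs, so the per-channel levels can be filtered out before they reach the JS app. A small sketch, with sample lines that assume the usual print-mode format (verify against your build's actual output):

```shell
#!/bin/sh
# Sample input mimicking ametadata print-mode output (an assumption;
# check what your ffmpeg build actually emits).
sample='frame:10  pts:5120  pts_time:0.106667
lavfi.r128.M=-120.691
frame:11  pts:5632  pts_time:0.117333
lavfi.r128.M=-87.672'

# Keep only the momentary-loudness values for the JS side to consume.
levels=$(printf '%s\n' "$sample" | awk -F= '/^lavfi\.r128\.M/ { print $2 }')
echo "$levels"
```

The same filtering could of course live in the 'data' handler of the JS server instead; the point is only that the frame header lines can be discarded and the numeric values extracted by splitting on '='.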

  • FFmpeg shell script to squeeze videos for Twitter

    12 June 2018, by André Levy

    Here’s a pretty rookie attempt at a shell script to squeeze videos into Twitter’s 140-second limit:

    for f in "$@"
    do
       /Applications/ffmpeg -i $f -filter_complex "[0:v]setpts=140 / \
     $(/Applications/ffprobe -i $f -show_entries format=duration -v quiet -of csv="p=0") \
     * PTS[v];[0:a]atempo= \
     $(/Applications/ffprobe -i $f -show_entries format=duration -v quiet -of csv="p=0") \
     / 140[a]" -map "[v]" -map "[a]"  "${f%.*}_twitter.mp4"
    done

    It works, but it’s pretty badly written. Can you help me improve it?

    1. How can I run ffprobe only once?
    2. Do I need ffprobe at all, or does ffmpeg have the duration of the input amongst its arguments?
    3. Will it encode faster with other settings (e.g. HW acceleration)?
    4. Any Twitter upload scripts out there to pipe this into?

    Needless to say, I looked for these answers all around, to no avail.

    Cheers.
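For question 1, the duration can be probed once per file and reused in both filter expressions; and on question 2, setpts and atempo do not expose the input's total duration as a built-in variable, so some external probe is still needed. A sketch along those lines (the helper name squeeze_filter is made up for this example):

```shell
#!/bin/sh
# Sketch: one ffprobe call per file, its result reused by both filters.
# squeeze_filter builds the -filter_complex argument from a duration.
squeeze_filter() {
    printf '[0:v]setpts=140/%s*PTS[v];[0:a]atempo=%s/140[a]' "$1" "$1"
}

for f in "$@"; do
    # Probe the duration once instead of twice per file.
    dur=$(ffprobe -i "$f" -show_entries format=duration -v quiet -of csv="p=0")
    ffmpeg -i "$f" -filter_complex "$(squeeze_filter "$dur")" \
        -map "[v]" -map "[a]" "${f%.*}_twitter.mp4"
done
```

One caveat: older FFmpeg builds limit each atempo instance to factors between 0.5 and 2.0, so inputs longer than 280 seconds would need chained instances such as atempo=2.0,atempo=1.5.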