
Media (1)
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
Other articles (100)
-
Automated installation script of MediaSPIP
25 April 2011. To overcome the difficulties caused mainly by the installation of server-side software dependencies, an "all-in-one" installation script written in Bash was created to simplify this step on a server running a compatible Linux distribution.
To use it you need SSH access to your server and a root account, which the script uses to install the dependencies. Contact your provider if you do not have these.
The documentation on how to use this installation script is available here.
The code of this (...) -
What exactly does this script do?
18 January 2011. This script is written in Bash, so it can easily be run on any server.
It is only compatible with a specific list of distributions (see the list of compatible distributions).
Installing MediaSPIP's dependencies
Its main role is to install all the software dependencies required on the server side, namely:
The basic tools needed to install the remaining dependencies; the development tools: build-essential (via APT from the official repositories); (...) -
XMP PHP
13 May 2011. According to Wikipedia, XMP stands for:
Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it handles a set of dynamic tags for use in the context of the Semantic Web.
XMP makes it possible to record, in the form of an XML document, information about a file: title, author, history (...)
On other sites (9777)
-
Live encoding with FFmpeg, decklink and pipe [duplicate]
5 November 2016, by SKALIS. This question already has an answer here:
What is the best way to get live video input from a Blackmagic Decklink card and have it encoded with an external encoder (NvEncoder)?
I tried:
mkfifo output.yuv
ffmpeg -f decklink -r 30000/1000 -pix_fmt uyvy422 -i "DeckLink Mini Recorder 4K@24" output.yuv &
NvEncoder -i output.yuv -pix_fmt yuv420p -bitrate 21M -fps30 output.ts
The problem seems to be that NvEncoder cannot take the uyvy422 format, so can I change that to yuv420p when capturing, before sending it through the pipe?
The decklink card can only give me uyvy422.
-
How to pipe ppm data into ffmpeg from blender frameserver with while loop in PowerShell
14 October 2016, by Radium. The Blender 2.6 manual features this little sh script for encoding a video from the Blender frameserver via ffmpeg. It works great on Windows with Cygwin, but only without the -hwaccel hardware acceleration flag.
#!/bin/sh
BLENDER=http://localhost:8080
OUTPUT=/tmp/output.ogv
eval `wget ${BLENDER}/info.txt -O - 2>/dev/null |
while read key val ; do
echo R_$key=$val
done`
i=$R_start
{
while [ $i -le $R_end ] ; do
wget ${BLENDER}/images/ppm/$i.ppm -O - 2>/dev/null
i=$(($i+1))
done
} | ffmpeg -vcodec ppm -f image2pipe -r $R_rate -i pipe:0 -b 6000k -vcodec libtheora $OUTPUT
wget ${BLENDER}/close.txt -O - 2>/dev/null >/dev/null
I'd like to encode my videos from Blender on Windows with
-hwaccel dxva2
which works with PowerShell. I've begun converting the script to PowerShell but have run into one last problem: I am having difficulty replicating this part of the script in PowerShell.
i=$R_start
{
while [ $i -le $R_end ] ; do
wget ${BLENDER}/images/ppm/$i.ppm -O - 2>/dev/null
i=$(($i+1))
done
} | ffmpeg -vcodec ppm -f image2pipe -r $R_rate -i pipe:0 -b 6000k -vcodec libtheora $OUTPUT
Below is my conversion to PowerShell.
echo "gathering data";
$blender = "http://localhost:8080";
$output = "C:\Users\joel\Desktop\output.mp4";
$webobj = wget $blender"/info.txt";
$lines = $webobj.Content -split('[\r\n]') | ? {$_};
$info = @{};
foreach ($line in $lines) {
$lineinfo = $line -split('[\s]') | ? {$_};
$info[$lineinfo[0]] = $lineinfo[1];
}
echo $info;
[int]$end = [convert]::ToInt32($info['end'],10);
[int]$i = [convert]::ToInt32($info['start'],10);
$video="";
( while ($i -le $end) {
$frame = wget $blender"/images/ppm/"$i".ppm" > $null;
echo $frame.Content > $null;
$i++;
} ) | ffmpeg -hwaccel dxva2 -vcodec ppm -f image2pipe -r $info['rate'] -i pipe:0 -b 6000k -vcodec libx264 $output;
This is the piece I'm having trouble with. I'm not quite sure what the proper syntax is to pipe the data into the
ffmpeg
command in the same way as the bash script above.
( while( $i -le $end ) {
$frame = wget $blender"/images/ppm/"$i".ppm" > $null;
echo $frame.Content > $null;
$i++;
} ) | ffmpeg -hwaccel dxva2 -vcodec ppm -f image2pipe -r $info['rate'] -i pipe:0 -b 6000k -vcodec libx264 $output;
Here is the output:
PS C:\Users\joel\Desktop> .\encode.ps1
gathering data
Name Value
-----
rate 30
height 720
ratescale 1
end 57000
width 1280
start 1
while : The term ’while’ is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path
was included, verify that the path is correct and try again.
At C:\Users\joel\Desktop\encode.ps1:15 char:3
+ ( while ($i -le $end)
+
+ CategoryInfo : ObjectNotFound : (while:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
-
Running command with pipe in Golang exec
15 September 2016, by Jon Stevens. I am trying to get the example from here working to record a webpage with phantomjs and pipe the stdout, which is a stream of images, to the ffmpeg command to create the video. The command stated that you need to run is:
phantomjs runner.js | ffmpeg -y -c:v png -f image2pipe -r 25 -t 10 -i - -c:v libx264 -pix_fmt yuv420p -movflags +faststart dragon.mp4
If I run a similar version of that command directly in the terminal, I can get it working just fine. The issue is that I need to run the above command through the Golang os/exec package. With the
cmd := exec.Command(parts[0], parts[1:]...)
method, the first parameter is the base executable of the command being executed, and it does not honor the pipe. I would like to get this working in one command so I don't have to write all the images to files and then run a second ffmpeg command to read from them. Any suggestions?