
Media (1)
-
Map of Schillerkiez
13 May 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (35)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.
-
ANNEX: The plugins used specifically for the farm
5 March 2010, by
Compared with the channels, the central/master site of the farm needs several additional plugins to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
-
Contributing to its documentation
10 April 2011
Documentation is one of the most important and most demanding tasks in building a technical tool.
Any outside contribution on this subject is essential: critiquing what already exists; helping to write articles aimed at users (MediaSPIP administrators or simply content producers) or at developers; creating explanatory screencasts; translating the documentation into a new language;
To do so, you can sign up on (...)
On other sites (6672)
-
ffmpeg video replay with time-sync needs actual recording times
16 July 2018, by navySV
I am attempting to use ffmpeg to replay multiple video files in time sync, but the zero-based video start time is preventing this.
I have ffmpeg commands that successfully capture a Microsoft Windows 7 desktop into a video file and replay it with a timestamp value (see below), but the internal timestamp always starts near zero. How can ffmpeg display the actual time when the video was recorded (and not the time since the start of the video, i.e. zero)?
For example, if the video started to be recorded at 10:47 am, the ffplay command should display a timestamp similar to "10:47:31" during playback (and not "00:00:31").
Video-capture command:
ffmpeg -f gdigrab -offset_x 0 -offset_y 0 -video_size 1920x1080 -i desktop -c:v libx264 -preset medium -f mpegts -framerate 24 -y fileA.ts
Playback command:
ffplay -vf "drawtext=fontfile=/windows/fonts/arial.ttf: text='%{pts\:gmtime\:0\:%H\\\:%M\\\:%S}':box=1:x=(w-tw)/2:y=h-(2*lh)" fileA.ts
Parameters I've tried unsuccessfully in the previous commands (including moving them around to different places in the commands):
-timestamp now
-vsync 0
-copyts (every attempt to use -copyts generates errors about "non-strictly-monotonic PTS" or "Non-monotonous DTS in output stream" no matter where I put this parameter)
-filter_complex "[0:v] setpts=PTS"
The ultimate goal is to capture four video files (recorded on four different computers and probably having different start times), and then to replay all four in time-sync (which is not possible using only the zero-based start times).
For example, I've been successful at replaying four video files in a 2x2 arrangement using the following command (I added the -ss parameter to demonstrate that I can move the start time of the replay). Unfortunately, they always sync to the zero-based first video frame (so they all play from the beginning of the video file). I need the replay to sync to the actual recorded time of each video. If the four videos were captured starting at 10:47:00, 10:47:51, 10:48:44, and 10:49:01, I want to be able to replay all of them so that they all display the same timestamp at the same time (so if one video were displaying 10:48:33, all of the videos would be displaying the same time, or a blank screen if that time was unavailable).
ffmpeg -ss 00:00:30 -i fileA.ts -i fileB.ts -i fileC.ts -i fileD.ts -filter_complex "[0:v][1:v]hstack[top];[2:v][3:v]hstack[bottom];[top][bottom]vstack[v]" -map "[v]" -timestamp now -f mpegts - | ./ffplay - -x 1920 -y 1080
Ideally, I would also like to be able to use a real time value (something like "ffplay -ss 10:48:00 ...") to start the video replay at a different position, but worst-case I can write a script to do the needed conversion of the time value.
My ffmpeg version is a Windows 7 64-bit static build "N-90810-g153e920892" on 2018Apr22 (downloaded from https://www.ffmpeg.org/download.html)
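One way to get there, as a sketch of my own rather than anything from the original post: note down each capture's wall-clock start time, then feed it back in at playback. The second argument of drawtext's pts function (the 0 in the playback command above) is an offset in seconds added to the timestamp, so passing the capture's Unix start time there makes the overlay show actual recording time. (As far as I can tell, the -timestamp option only stores a creation_time tag in the container metadata and does not shift the stream's PTS, which would explain why trying it had no visible effect.)
Hypothetical playback command, where 1531738020 is a placeholder epoch standing in for 10:47:00 UTC on 16 July 2018; substitute the real start time of fileA.ts:
ffplay -vf "drawtext=fontfile=/windows/fonts/arial.ttf: text='%{pts\:gmtime\:1531738020\:%H\\\:%M\\\:%S}':box=1:x=(w-tw)/2:y=h-(2*lh)" fileA.ts
Starting playback at a given wall-clock time then becomes simple arithmetic: for a capture that began at 10:47:00, a target of 10:48:00 means -ss 60. For the 2x2 mosaic, subtracting the earliest start time from each file's start time gives the per-input delay each stream would need before stacking.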
-
Writing an MP4 slideshow video to S3 using the Lambda FFmpeg Layer
15 April 2020, by Gracie
I am using AWS Lambda with the FFmpeg layer to try to build a 15-second MP4 file (beach.mp4) from 3 static images that show for 3, 5 and 7 seconds in sequence.
These 3 images are within my Lambda upload deployment zip on S3, along with the sequence.txt file needed for the function.



SEQUENCE.TXT



file beach1.jpg
outpoint 3
file beach2.jpg
outpoint 8
file beach3.jpg
outpoint 15



FFMPEG COMMAND



ffmpeg -f concat -i sequence.txt -c:v libx264 -tune stillimage -c:a aac -b:a 192k -pix_fmt yuv420p -shortest beach.mp4



I am writing a file to S3, but it is blank, only 15 bytes, so it doesn't contain the MP4 file created by FFmpeg. I think this has something to do with syncing or streaming the video file so that both the txt can be read and the MP4 can be written to a file, but I'm not sure.



How can I read the .txt contents and then write the ffmpeg output to a file in /tmp/?



You can download or view the files at https://lifeisabeach.netlify.app/
(For some strange reason the MP4 length when built locally is 19 seconds, when it should be 15!)
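A side note on the length discrepancy, from me rather than the original question: the concat demuxer's outpoint directive trims an input stream, whereas the usual way to time still-image slides is the duration directive, so a list along the following lines may behave more predictably (the last image is repeated because, in some ffmpeg versions, the final duration entry is otherwise ignored):

file beach1.jpg
duration 3
file beach2.jpg
duration 5
file beach3.jpg
duration 7
file beach3.jpg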



const util = require('util');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const { readFileSync, writeFileSync, unlinkSync, writeFile, readdir } = require('fs');
//const fs = require('fs');
const path = require('path');
const { spawnSync } = require('child_process');

exports.handler = async (event, context, callback) => {

  const outputBucket = 'mys3bucket';
  const sequenceTXT = "sequence.txt";

  // FFmpeg creates the file, using the contents of sequence.txt to create timed image slides
  const mp4create = await spawnSync(
    '/opt/bin/ffmpeg',
    [
      '-f', 'concat',
      '-i', sequenceTXT,
      '-c:v', 'libx264',
      '-tune', 'stillimage',
      '-c:a', 'aac',
      '-b:a', '192k',
      '-pix_fmt', 'yuv420p',
      '-shortest',
      'beach.mp4'
    ]
  );

  // Write ffmpeg output to a file in /tmp/
  const writeMP4File = util.promisify(writeFile);
  await writeMP4File('/tmp/beach.mp4', mp4create, 'binary');
  console.log('MP4 content written to /tmp/');

  // Copy MP4 data to a variable to enable write to S3 Bucket
  let result = mp4create;
  console.log('MP4 Result contents ', result);

  const vidFile = readFileSync('/tmp/beach.mp4');

  // Set S3 bucket details and put MP4 file into S3 bucket from /tmp/
  const s3 = new AWS.S3();
  const params = {
    Bucket: outputBucket,
    Key: 'beach.mp4',
    ACL: 'private',
    Body: vidFile
  };

  // Put MP4 file from AWS Lambda function /tmp/ to an S3 bucket
  const s3Response = await s3.putObject(params).promise();
  callback(null, s3Response);

};
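A likely culprit, based on reading the code above rather than anything stated in the question: spawnSync returns a result object (pid, status, stdout, stderr), not the encoded video, so writing mp4create to /tmp/beach.mp4 stores its string form "[object Object]", which is exactly 15 bytes and matches the size of the object landing in S3. On top of that, the relative output path beach.mp4 points into /var/task (Lambda's default working directory, which is read-only), so the encode cannot write its output there anyway. A minimal sketch of a fix inside the handler above, assuming sequence.txt and the images are unpacked with the deployment package under /var/task and that /tmp is the writable scratch space:

  // Sketch only: the /var/task and /tmp paths are assumptions about the Lambda layout
  const mp4create = spawnSync('/opt/bin/ffmpeg', [
    '-f', 'concat',
    '-i', '/var/task/sequence.txt',  // list file shipped in the deployment zip
    '-c:v', 'libx264',
    '-tune', 'stillimage',
    '-pix_fmt', 'yuv420p',
    '-y', '/tmp/beach.mp4'           // let ffmpeg write the MP4 itself
  ]);
  console.log(mp4create.stderr.toString());  // surface any encoder errors in CloudWatch

  const vidFile = readFileSync('/tmp/beach.mp4');  // real MP4 bytes, not a result object
  const s3Response = await s3.putObject({
    Bucket: outputBucket,
    Key: 'beach.mp4',
    ACL: 'private',
    Body: vidFile
  }).promise();

The audio options (-c:a, -b:a, -shortest) are dropped in this sketch because there is no audio input for them to act on; they can come back once a soundtrack is added.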



-
Desktop streaming to Wowza server using ffmpeg
24 April 2013, by Gergely Lukacsy
Recently I have been trying to use ffmpeg to stream a live desktop screen to a Wowza media server.
I have had partial success so far:
I've managed to record the desktop screen using the UScreenCapture DirectShow filter, and I'm also able to send that recording to the server.
However, when I try to send the screen directly to the server, it fails every time. The player buffers very slowly and shows a blank screen when it's done (the counter keeps counting). So, here are the working methods.
For recording the screen:
ffmpeg -f dshow -i video="UScreenCapture" -r 25 -vcodec libx264 output.flv
And for streaming a video file:
ffmpeg -re -i -map 0 -c copy -vbsf h264_mp4toannexb -f mpegts udp://stream.server.xyz:52000?pkt_size=1024
And this is the command I'm using to stream the desktop directly:
ffmpeg -f dshow -i video="UScreenCapture" -s width x height -r framerate -vcodec libx264 -pix_fmt yuv420p -b:v bitrate -an -vbsf h264_mp4toannexb -f mpegts udp://your.destination.url:PORTNUMBER?pkt_size=some_bytes
As far as I can remember, it worked well before I upgraded Wowza.
Some additional info:
- OS : win7 sp1 64bit
- ffmpeg N-49610-gc2dd5a1 (Zeranoe FFmpeg build 2013 Feb 5)
- UScreenCapture : x64 Edition Version 2.0.14
- Wowza 3.5.2 running on a Debian linux 2.6.32-5-amd64
- Flowplayer : 3.2.15
Looking through the Wowza log files, something caught my attention. It seems that when I'm streaming the desktop screen, the server somehow gets incomplete packets, but when I'm streaming a video file, the error doesn't occur.
RTPDePacketizerMPEGTS.handleRTPPacket
WARN server comment 2013-04-11 11:26:24 - - - - - 152629.665 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
WARN server comment 2013-04-11 11:26:27 - - - - - 152632.782 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
WARN server comment 2013-04-11 11:26:31 - - - - - 152636.383 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
WARN server comment 2013-04-11 11:26:38 - - - - - 152643.484 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
WARN server comment 2013-04-11 11:26:47 - - - - - 152653.088 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
WARN server comment 2013-04-11 11:26:52 - - - - - 152657.587 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
WARN server comment 2013-04-11 11:26:56 - - - - - 152661.624 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
WARN server comment 2013-04-11 11:27:05 - - - - - 152670.805 - - - - - - - -RTPDePacketizerMPEGTS.handleRTPPacket: Incomplete packet: 1504:1472
What causes this error? Any ideas?
Thanks in advance!
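A possible reading of those warnings, offered as my own guess rather than anything from the thread: 1504 bytes is eight MPEG-TS packets (8 x 188), while 1472 bytes is the largest UDP payload that fits in a standard 1500-byte Ethernet MTU (1500 minus 20 bytes of IP header and 8 bytes of UDP header), so each eight-packet datagram is being fragmented on its way to Wowza and the depacketizer only sees the first fragment. A common workaround is to cap each datagram at seven TS packets with pkt_size=1316. A sketch of the direct-streaming command with that change, keeping the placeholders from above:
ffmpeg -f dshow -i video="UScreenCapture" -s width x height -r framerate -vcodec libx264 -pix_fmt yuv420p -b:v bitrate -an -vbsf h264_mp4toannexb -f mpegts udp://your.destination.url:PORTNUMBER?pkt_size=1316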