
Media (1)
-
La conservation du net art au musée. Les stratégies à l’œuvre
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (57)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
Customizing by adding your logo, banner or background image
5 September 2013. Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013. Present changes to your MediaSPIP site or news about your projects using the news section.
In MediaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of type news item, the default fields are: publication date (customize the publication date) (...)
On other sites (12577)
-
Am I missing a timeout param in FFMPEG?
5 May 2020, by Dave Stein. I'm running an ffmpeg command like this:



ffmpeg -loglevel quiet -report -timelimit 15 -timeout 10 -protocol_whitelist file,http,https,tcp,tls,crypto -i ${inputFile} -vframes 1 ${outputFile} -y



This is running in an AWS Lambda function. My Lambda timeout is set to 30 seconds. For some reason I am still getting "Task timed out" messages. I should note that I log before and after the command, so I know it's timing out during this step.



Update



In terms of the entire Lambda execution, I do the following:



- Invoke a lambda to get an access token. This lambda makes one API request. It has a timeout of 5 seconds. The max time was 660 ms for one request.
- Make another API request to verify data. The max time was 1.6 seconds.
- Run FFMPEG.









timelimit is supposed to "Exit after ffmpeg has been running for duration seconds in CPU user time." Theoretically this shouldn't run more than 15 seconds then, plus maybe 2-3 more before the other requests.


timeout is probably superfluous here. There were a lot of definitions for it in the manual, but I think that was mainly about waiting on input? Either way, I'd think timelimit would cover my bases.
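
Worth noting against the quoted doc text: -timelimit caps CPU user time, not wall-clock time, so seconds spent blocked on network I/O may not count toward it. One way to be sure the process never outlives the Lambda's 30-second window is to enforce a wall-clock cap from the calling code. A minimal Node.js/TypeScript sketch, assuming ffmpeg is on the PATH; the arguments, file names and 20-second cap are illustrative:

// Spawn ffmpeg and kill it if it exceeds a wall-clock cap (backstop for -timelimit, which only counts CPU time)
import { spawn } from "child_process";

function runFfmpegWithWallClockCap(args: string[], capMs: number): Promise<number> {
  return new Promise((resolve, reject) => {
    const child = spawn("ffmpeg", args);
    const killer = setTimeout(() => child.kill("SIGKILL"), capMs);
    child.on("error", (err) => { clearTimeout(killer); reject(err); });
    child.on("exit", (code) => { clearTimeout(killer); resolve(code ?? -1); });
  });
}

// Usage: keep the cap comfortably below the Lambda's 30-second limit
runFfmpegWithWallClockCap(
  ["-loglevel", "quiet", "-timelimit", "15", "-i", "input.m3u8", "-vframes", "1", "output.jpg", "-y"],
  20000,
).then((code) => console.log("ffmpeg exited with code", code));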


Update 2



I checked my debug log and saw this:



Reading option '-timelimit' ... matched as option 'timelimit' (set max runtime in seconds) with argument '15'.
Reading option '-timeout' ... matched as AVOption 'timeout' with argument '10'.




It seems both options are supported by my build.



Update 2



I have updated my code with a lot of logs. I can definitely see that the FFMPEG command is the last thing that executes before stalling out for the 30-second timeout.



Update 3
I can reproduce the behavior by pointing at a track instead of the full manifest. I have set the command to this:



ffmpeg -loglevel debug -timelimit 5 -timeout 5 -i 'https://streamprod-eastus-streamprodeastus-usea.streaming.media.azure.net/0c495135-95fa-48ec-a258-4ba40262e1be/23ab167b-9fec-439e-b447-d355ff5705df.ism/QualityLevels(200000)/Manifest(video,format=m3u8-aapl)' -vframes 1 temp.jpg -y



A few things here:



- I typically point at the actual manifest (not the track), and things usually run much faster.
- I have lowered the timelimit and timeout to 5. Despite this, when I run a timer, the command runs for 15 seconds every time. It outputs a bunch of errors, likely due to this being a track rather than the full manifest, and then spits out the desired image (a timing sketch follows after the gist link below).







The full output is at https://gist.github.com/DaveStein/b3803f925d64dd96cd45ae9db5e5a4d0
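
Since -timelimit only bounds CPU user time, wall-clock seconds spent waiting on the network would not count toward the 5 it was given, which would be consistent with the 15-second runs above. A rough way to check that, and to bound the socket reads themselves, is the protocol-level rw_timeout option (in microseconds, assuming this build exposes it); the URL, values, and output path below are illustrative:

// Time the run wall-clock and bound each network read/write with -rw_timeout (microseconds)
import { spawn } from "child_process";

const args = [
  "-loglevel", "debug",
  "-timelimit", "5",          // CPU-user-time cap, per the docs quoted above
  "-rw_timeout", "5000000",   // 5 s cap on socket I/O (assumption: the build supports this protocol option)
  "-i", "https://example.net/stream.ism/Manifest(video,format=m3u8-aapl)", // illustrative URL
  "-vframes", "1", "temp.jpg", "-y",
];

const startedAt = Date.now();
const child = spawn("ffmpeg", args);
child.on("exit", (code) => {
  console.log(`ffmpeg exited with ${code} after ${(Date.now() - startedAt) / 1000}s wall-clock`);
});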


-
Is ffmpeg able to read ArrayBuffer input from stream
7 July 2017, by jAndy. I want to accomplish the following tasks:
- Record video + audio from any HTML5 (MediaStream) capable browser
- Send that data via WebSocket as Blob/ArrayBuffer chunks to a server
- Broadcast that input stream data to multiple clients
As it turns out, this brought me into a world of pain. The first task is fairly simple using the HTML5 MediaStream objects alongside WebSockets.

// ... for simplicity...
navigator.mediaDevices.getUserMedia({ audio: true, video: true }).then(stream => {
  let mediaRecorder = new MediaRecorder(stream);
  // ...
  mediaRecorder.ondataavailable = e => {
    webSocket.send('newVideoData', e.data); // configured for binary data
  };
});

Now, I want to receive those data fragments and stream them via the nginx vod module, because I guess I want the output stream in HLS or DASH.
I could write a little nodejs script as backend, which just receives the binary chunks and writes them to a file or stream, and just references it so the nginx vod module could possibly read it and create the m3u8 manifest on the fly?

I am wondering now:
- if ffmpeg is able to read that binary data directly (should be webm format), without a man-in-the-middle script, "somehow"?
- If not, do I have to write the data down into a file and pass that as input to ffmpeg, or can I (should I) pipe the data to a self-spawned ffmpeg instance? (if so, how? A sketch follows below.)
- Do I actually need the nginx server (probably alongside the rtmp module) to deliver the output stream as HLS, or could I just use ffmpeg to also create a dynamic manifest?
- Is the nginx vod module capable of creating a dynamic hls/dash manifest or must the input data be complete beforehand?
- Ultimately, am I on the totally wrong track here? :P
Actually I just want to create a little video-live-chat demo, without any plugins or 3rd party encoding software, pure browser.
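
On the piping question above, a minimal sketch of one possible backend, assuming Node.js with the ws package and ffmpeg on the PATH; the port, codecs, segment settings and output path are illustrative, and it assumes a single broadcaster per ffmpeg process:

// Receive the browser's WebM chunks over a WebSocket and pipe them into ffmpeg's stdin to emit HLS
import { spawn } from "child_process";
import { WebSocketServer } from "ws";

// One long-running ffmpeg per broadcaster; it reads the fragmented WebM stream from stdin
const ffmpeg = spawn("ffmpeg", [
  "-i", "pipe:0",                                   // stdin as input
  "-c:v", "libx264", "-c:a", "aac",                 // transcode for broad HLS compatibility (illustrative)
  "-f", "hls",
  "-hls_time", "2", "-hls_list_size", "5", "-hls_flags", "delete_segments",
  "/var/www/stream/live.m3u8",                      // illustrative path served by nginx as static files
]);
ffmpeg.stderr.on("data", (d) => process.stderr.write(d));

const wss = new WebSocketServer({ port: 8080 });
wss.on("connection", (socket) => {
  // Each MediaRecorder "dataavailable" blob arrives here as one binary message
  socket.on("message", (chunk) => ffmpeg.stdin!.write(chunk as Buffer));
  socket.on("close", () => ffmpeg.stdin!.end());
});

Whether ffmpeg accepts the piped WebM cleanly depends on how MediaRecorder fragments the container and on the build, so treat this as a starting point rather than a drop-in answer; in this variant the nginx vod module is not needed, since ffmpeg's hls muxer writes the m3u8 itself.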
-
How to watermark on video in android using ffmpeg library programmatically
2 May 2016, by Sakibmohammad Syed. I want to watermark a video file using the ffmpeg library. I have one PNG and one MP4 file in my Videokit folder, and I want to overlay the PNG image onto the video file. Below is my code, but I don't know why I am unable to apply the watermark. I don't have much experience with the ffmpeg library, so please help me solve this issue.
Here is my MainActivity:

public class MainActivity extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        String workFolder = "/sdcard/Videokit/";
        // Remove any previous log so each run starts clean
        GeneralUtils.deleteFileUtil(workFolder + "/vk.log");

        // Keep the CPU awake while the command runs
        PowerManager powerManager = (PowerManager) this.getSystemService(Activity.POWER_SERVICE);
        PowerManager.WakeLock wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "VK_LOCK");
        wakeLock.acquire();

        // Transpose the input video, then overlay test.png at position 10:10
        String commandStr = "ffmpeg -i /sdcard/Videokit/test.mp4 -i /sdcard/Videokit/test.png -filter_complex transpose=1,overlay=10:10 -y /sdcard/out.mp4";

        LoadJNI vk = new LoadJNI();
        try {
            vk.run(GeneralUtils.utilConvertToComplex(commandStr), workFolder, MainActivity.this);
            GeneralUtils.copyFileToFolder(commandStr, workFolder);
        } catch (CommandValidationException e) {
            Log.e("Prefs.TAG", "vk run exception.", e);
        }
    }
}

Here is my Manifest.xml:
<?xml version="1.0" encoding="utf-8"?>
<manifest package="com.example.android.compressexample">
    <application>
        <activity>
            <action></action>
            <category></category>
        </activity>
    </application>
</manifest>