
Media (5)
-
ED-ME-5 1-DVD
11 October 2011
Updated: October 2011
Language: English
Type: Audio
-
Revolution of Open-source and film making towards open film making
6 October 2011
Updated: July 2013
Language: English
Type: Text
-
Valkaama DVD Cover Outside
4 October 2011
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011
Updated: February 2013
Language: English
Type: Image
-
Valkaama DVD Cover Inside
4 October 2011
Updated: October 2011
Language: English
Type: Image
Other articles (79)
-
Customising by adding your logo, banner or background image
5 September 2013
Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013
Present the changes to your MediaSPIP, or news about your projects, on your MediaSPIP site using the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the form used to create a news item.
News item creation form: for a document of type "news item", the default fields are: Publication date (customise the publication date) (...)
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or later. If needed, contact your MediaSPIP administrator to find out.
On other sites (6506)
-
local.ERROR: ffmpeg failed to execute command
20 August 2024, by AMNA IQBAL

I am trying to draw points on a video.

The video is 12 seconds long, and for each second there are 17 data points that need to be plotted onto that one-second stretch of the video.

It works for 6 seconds, but not for 12 seconds.

Why doesn't it work for longer videos? Are there any limits on command length in ffmpeg?


public function overlayCoordinates(Request $request)
{
    Log::info('Received request to overlay coordinates on video.');

    set_time_limit(600); // 600 seconds = 10 minutes

    try {
        $request->validate([
            'video' => 'required|file|mimes:mp4,avi,mkv,webm|max:102400', // Video max size 100MB
            'coordinates' => 'required|file|mimes:txt|max:5120', // Coordinates text file max size 5MB
        ]);

        $videoFile = $request->file('video');
        $coordinatesFile = $request->file('coordinates');

        $videoFilePath = $videoFile->getRealPath();
        $videoFileName = $videoFile->getClientOriginalName();

        // Move the video file to the desired location if needed
        $storedVideoPath = $videoFile->storeAs('public/videos', $videoFileName);

        // Open the video file using Laravel FFmpeg
        $media = FFMpeg::fromDisk('public')->open('videos/' . $videoFileName);
        $duration = $media->getDurationInSeconds();

        Log::info('Duration: ' . $duration);

        $coordinatesJson = file_get_contents($coordinatesFile->getPathname());
        $coordinatesArray = json_decode($coordinatesJson, true);

        $frameRate = 30; // Assuming a frame rate of 30 fps
        $visibilityDuration = 0.5; // Each box stays visible for 0.5 seconds

        for ($currentTime = 0; $currentTime < 7; $currentTime++) {
            $filterString = ""; // Reset filter string for each frame
            $frameIndex = intval($currentTime * $frameRate); // Convert current time to an index

            if (isset($coordinatesArray['graphics'][$frameIndex])) {
                // Loop through the first 12 keypoints (or fewer if not available)
                $keypoints = $coordinatesArray['graphics'][$frameIndex]['kpts'];
                for ($i = 0; $i < min(12, count($keypoints)); $i++) {
                    $keypoint = $keypoints[$i];

                    $x = $keypoint['p'][0] * 1920; // Scale x coordinate to video width
                    $y = $keypoint['p'][1] * 1080; // Scale y coordinate to video height

                    $startTime = $frameIndex / $frameRate; // Calculate start time
                    $endTime = $startTime + $visibilityDuration; // End time, 0.5 seconds later

                    // Add a drawbox filter for the current keypoint
                    $filterString .= "drawbox=x={$x}:y={$y}:w=10:h=10:color=red@0.5:t=fill:enable='between(t,{$startTime},{$endTime})',";
                }
                Log::info("Processing frame index: {$frameIndex}, drawing up to 12 keypoints.");
            }

            $filterString = rtrim($filterString, ',');

            // Apply the filter for the current frame
            if (!empty($filterString)) {
                $media->addFilter(function ($filters) use ($filterString) {
                    $filters->custom($filterString);
                });
            }
        }

        $filename = uniqid() . '_overlayed.mp4';
        $destinationPath = 'videos/' . $filename;

        $format = new \FFMpeg\Format\Video\X264('aac');
        $format->setKiloBitrate(5000) // Increase bitrate for better quality
            ->setAdditionalParameters(['-profile:v', 'high', '-preset', 'veryslow', '-crf', '18']) // High profile, very slow preset, CRF 18 for better quality
            ->setAudioCodec('aac')
            ->setAudioKiloBitrate(192); // Higher audio bitrate

        // Export the video in one pass to a specific disk and directory
        $media->export()
            ->toDisk('public')
            ->inFormat($format)
            ->save($destinationPath);

        return response()->json([
            'message' => 'Video processed successfully with overlays.',
            'path' => Storage::url($destinationPath)
        ]);
    } catch (\Exception $e) {
        Log::error('Overlay process failed: ' . $e->getMessage());
        return response()->json(['error' => 'Overlay process failed. Please check logs for details.'], 500);
    }
}
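One plausible diagnosis for this question, offered as an assumption rather than a confirmed cause: every addFilter() call above appends more drawbox expressions to the final ffmpeg invocation, so longer videos produce a longer command line, and at some point the assembled command can exceed the operating system's argument-length limit. A common workaround is to join all the drawbox expressions into a single filtergraph string, write it to a file, and pass the file to ffmpeg via -filter_script:v so the command line stays short. A minimal Python sketch (input.mp4, output.mp4, and the simplified keypoint format are hypothetical placeholders):

```python
import tempfile

def build_drawbox_graph(keypoints_per_frame, frame_rate=30, visibility=0.5,
                        width=1920, height=1080, max_points=12):
    """Join every drawbox expression into one comma-separated filtergraph.

    keypoints_per_frame maps a frame index to a list of (x, y) pairs in
    normalised [0, 1] coordinates (a simplification of the 'kpts' JSON).
    """
    parts = []
    for frame_index, keypoints in keypoints_per_frame.items():
        start = frame_index / frame_rate
        end = start + visibility
        for kp in keypoints[:max_points]:
            x = kp[0] * width   # scale to video width
            y = kp[1] * height  # scale to video height
            parts.append(
                f"drawbox=x={x}:y={y}:w=10:h=10:color=red@0.5:t=fill"
                f":enable='between(t,{start},{end})'"
            )
    return ",".join(parts)

# Write the whole filtergraph to a file instead of inlining it, so the
# ffmpeg command line stays short no matter how many boxes there are.
graph = build_drawbox_graph({0: [(0.5, 0.5)], 30: [(0.25, 0.75)]})
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write(graph)
    script_path = f.name

cmd = ["ffmpeg", "-i", "input.mp4", "-filter_script:v", script_path,
       "-c:v", "libx264", "-crf", "18", "output.mp4"]
# subprocess.run(cmd, check=True)  # uncomment to actually invoke ffmpeg
```

This also applies every box in a single filter pass instead of one addFilter() per second of video, which is closer to how ffmpeg expects a filtergraph to be supplied.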



-
Using FFmpeg to receive an H.264 video stream over SRTP on Windows [closed]
29 August 2024, by Taylor Yi

I am using the LGPL build of FFmpeg on Windows, downloaded from https://github.com/BtbN/FFmpeg-Builds.

I have been trying to receive a Secure RTP (SRTP) video stream using FFmpeg. I want to receive the video as bgr24 rawvideo and send it to the output stream (using -).

I have a GStreamer script that works, so I want to convert it to FFmpeg. Below is the GStreamer script; it plays the received video using autovideosink:


gst-launch-1.0 -v udpsrc port=4000 caps="application/x-srtp, ssrc=(uint)1356955624, mki=(buffer)01, srtp-key=(buffer)012345678901234567890123456789012345678901234567890123456789, srtp-cipher=(string)aes-128-icm, srtp-auth=(string)hmac-sha1-80, srtcp-cipher=(string)aes-128-icm, srtcp-auth=(string)hmac-sha1-80" ! srtpdec ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink



At first I wanted to check that the output is correct, so I set the output to a video file instead. Here is the command I tried:


ffmpeg -protocol_whitelist file,udp,rtp -i settings.sdp -c:v copy out.mp4



The 'settings.sdp' file looks like this:


c=IN IP4 0.0.0.0
m=video 5000 RTP/SAVP 96
a=rtpmap:96 H264/90000
a=ssrc:1356955624 cname:stream
a=crypto:1 AES_CM_128_HMAC_SHA1_80 inline:ASNFZ4kBI0VniQEjRWeJASNFZ4kBI0VniQEjRWeJ|2^20|1:1
a=rtcp-mux
a=rtcp-fb:96 nack
a=srtp-cipher:aes-128-icm
a=srtp-auth:hmac-sha1-80
a=srtcp-cipher:aes-128-icm
a=srtcp-auth:hmac-sha1-80
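For reference, the inline: value in the a=crypto line should be the base64 encoding of the same 30-byte master key+salt (16-byte key + 14-byte salt for AES_CM_128_HMAC_SHA1_80) that GStreamer takes as 60 hex digits in srtp-key. A small Python check, using the exact values from the two scripts above, confirms that the SDP and the GStreamer pipeline agree on the key material:

```python
import base64

# The 60-hex-digit master key+salt from the GStreamer srtp-key parameter
# (30 bytes: 16-byte AES key followed by a 14-byte salt).
hex_key = "012345678901234567890123456789012345678901234567890123456789"

# The value that belongs after "inline:" in the SDP a=crypto line.
inline = base64.b64encode(bytes.fromhex(hex_key)).decode()
print(inline)  # prints ASNFZ4kBI0VniQEjRWeJASNFZ4kBI0VniQEjRWeJ
```

This only confirms that the key encoding matches; it does not by itself explain the NAL-unit errors below, which suggest the RTP payload is still not being decrypted correctly.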



But when I run this command, the following error log is printed:


Incorrect amount of SRTP params
[sdp @ 000001e40785f340] RTP H.264 NAL unit type 27 is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.
[sdp @ 000001e40785f340] nal size exceeds length: 6664 608
[sdp @ 000001e40785f340] nal size exceeds length: 65101 502
[h264 @ 000001e4078661c0] non-existing PPS 0 referenced
[extract_extradata @ 000001e4078c4c80] Invalid NAL unit 0, skipping.
 Last message repeated 2 times
[h264 @ 000001e4078661c0] Invalid NAL unit 0, skipping.
 Last message repeated 2 times
[h264 @ 000001e4078661c0] non-existing PPS 0 referenced
[h264 @ 000001e4078661c0] decode_slice_header error
[h264 @ 000001e4078661c0] no frame!
[sdp @ 000001e40785f340] Undefined type (30)



Is there an option I'm missing? What can I do to fix this?


-
How to share a video stream to WSL2 with ffmpeg? [closed]
27 October 2024, by 笑先生

Most solutions for using a camera in WSL require building your own WSL kernel. I got that working by following the steps in "Capturing webcam video with OpenCV in WSL2".

However, that approach is complicated and time-consuming, so I want to achieve it by streaming the video instead.


Sharing method

Step 1: Run the command below on Windows to list all the camera devices. I see an integrated camera, "Integrated Webcam" (video), in the output.

ffmpeg -list_devices true -f dshow -i dummy



Step 2: Check the IP of the "vEthernet (WSL)" Ethernet adapter. On my computer it's 172.24.176.1.

Step 3: Run the command below on Windows to stream the video.

ffmpeg -f dshow -i video="Integrated Webcam" -preset ultrafast -tune zerolatency -vcodec libx264 -f mpegts udp://172.24.176.1:5000



Test

Run this command to play the video stream:
ffplay udp://172.24.176.1:5000

It shows the video when the command is run from a Windows (Win10) terminal.

But it shows nothing when the command is run from a WSL (Ubuntu 22.04) terminal. Why?
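A likely explanation, stated as an assumption since network setups differ: 172.24.176.1 is the Windows side of the "vEthernet (WSL)" adapter, so UDP packets sent to that address are delivered to Windows itself, which is why ffplay works in a Windows terminal. The WSL2 VM has its own, different address on eth0, and a listener inside WSL only receives packets sent to that address. A small Python sketch of extracting the WSL-side address from `ip -4 addr show eth0` output (the sample addresses are hypothetical):

```python
import re

def wsl_eth0_ip(ip_addr_output: str) -> str:
    """Pull the IPv4 address out of `ip -4 addr show eth0` output."""
    match = re.search(r"inet (\d+\.\d+\.\d+\.\d+)/\d+", ip_addr_output)
    if match is None:
        raise ValueError("no IPv4 address found on eth0")
    return match.group(1)

# Hypothetical usage inside WSL:
#   out = subprocess.run(["ip", "-4", "addr", "show", "eth0"],
#                        capture_output=True, text=True).stdout
#   print(wsl_eth0_ip(out))
#
# Then, on Windows, stream to that address instead of 172.24.176.1:
#   ffmpeg -f dshow -i video="Integrated Webcam" ... -f mpegts udp://<eth0 ip>:5000
# and play it inside WSL with:
#   ffplay udp://@:5000
```

Note that the WSL2 eth0 address can change across reboots, so the sender side may need to re-query it each time.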