
Media (3)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
Other articles (98)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...) -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
APPENDIX: The plugins used specifically for the farm
5 March 2010, by
The central/master site of the farm needs several additional plugins, beyond those used by the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, which handles registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (7887)
-
Sending frames from memory to FFMPEG command line program .NET 6
15 September 2021, by Alessandro Martinelli
I'm trying to use a pipe to send frames generated by my program to the ffmpeg command-line utility without saving them to disk. Note that I was able to generate a video by first saving the frames to disk as images and then having FFMPEG build the video from those images, but that approach performs worse and writes more data to the SSD.


I tried to follow the approach described in this post, but FFMPEG returns an error about the frame size, and I don't know how to solve it.


My code is the following:


//object storing single frame returned from camera
mv.impact.acquire.Request pRequest; 
string outputPath = ...;
List<byte[]> frames = new List<byte[]>();

[...]

// Single frame is saved into memory
MemoryStream stream = new MemoryStream();
using (RequestBitmapData data = pRequest.bitmapData) {
 data.bitmap.Save(stream, System.Drawing.Imaging.ImageFormat.Bmp);
 // Please note that now printing data.bitmap.PixelFormat would return Format24bppRgb
}
frames.Add(stream.ToArray());

[...] 

Console.WriteLine(frames.Count + " frames collected. First one length is " + frames.First().Length);

string ffmpegArgument = "/C " + _ffmpegPath + "\\ffmpeg -y -f rawvideo -pix_fmt rgb24 -framerate 3 -video_size 728x544 -i - -c:v libx264 -preset 9 -c:a libvo_aacenc " + outputPath;

Process cmd = new Process();
cmd.StartInfo.FileName = "cmd.exe";
cmd.StartInfo.Arguments = ffmpegArgument;
cmd.StartInfo.UseShellExecute = false;
cmd.StartInfo.RedirectStandardError = false;
cmd.StartInfo.RedirectStandardInput = true;

Console.WriteLine("Executing command " + ffmpegArgument + "...");
cmd.Start();
foreach (byte[] frame in frames) {
 cmd.StandardInput.Write(frame);
}
cmd.StandardInput.Flush();
cmd.StandardInput.Close();



However, when I execute the program, I get the following output:


492 frames collected. First one length is 1188150
Executing command /C ExternalTools\FFmpeg\ffmpeg -y -f rawvideo -pix_fmt rgb24 -framerate 32 -video_size 728x544 -i - -c:v libx264 -preset ultrafast -c:a libvo_aacenc -b:a 128k "Output\TemporaryVideo\2021-09-13 18_58_58.mp4"...
ffmpeg version 2021-06-27-git-49e3a8165c-essentials_build-www.gyan.dev Copyright (c) 2000-2021 the FFmpeg developers
built with gcc 10.3.0 (Rev2, Built by MSYS2 project)

[...] (configurations)

[rawvideo @ 00000241d34feac0] Packet corrupt (stream = 0, dts = 0).
Input #0, rawvideo, from 'pipe:':
 Duration: N/A, start: 0.000000, bitrate: 304152 kb/s
 Stream #0:0: Video: rawvideo (RGB[24] / 0x18424752), rgb24, 728x544, 304152 kb/s, 32 tbr, 32 tbn
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
pipe:: corrupt input packet in stream 0
[rawvideo @ 00000241d3511640] Invalid buffer size, packet size 6396 < expected frame_size 1188096
Error while decoding stream #0:0: Invalid argument

[...] (cpu capabilities)

Output #0, mp4, to 'Output\TemporaryVideo\2021-09-13 18_58_58.mp4':
 Metadata:
 encoder : Lavf59.3.101
 Stream #0:0: Video: h264 (avc1 / 0x31637661), yuv444p, 728x544, q=2-31, 32 fps, 16384 tbn
 Metadata:
 encoder : Lavc59.2.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
frame= 0 fps=0.0 q=0.0 Lsize= 0kB time=00:00:00.00 bitrate=N/A speed= 0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Conversion failed!



The value expected by FFMPEG seems correct (1188096 = 728 x 544 x 3), but I don't understand where FFMPEG gets that "packet size 6396" from. Furthermore, that value (6396) changes at every program execution.


I'm pretty sure the frames are not corrupted since, if I save them to disk, the images are generated correctly.


Thank you for your time,
Alessandro
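

For reference, a hedged sketch of one way to make this pipe work (this is not the asker's code; the BitmapToRawBgr24 helper and the call to ffmpeg.exe directly instead of cmd.exe are assumptions). Two things stand out in the question: the frames are BMP-encoded files rather than the raw rgb24 data the command declares, and they are written through StandardInput, which is a StreamWriter. StreamWriter has no byte[] overload, so Write(frame) binds to TextWriter.Write(object) and writes the string "System.Byte[]", 13 characters per frame; 492 frames x 13 characters = 6396, which would explain the "packet size 6396" in the log and why it changes with the frame count. The sketch below extracts raw top-down pixel rows with LockBits and writes them as binary data through BaseStream; GDI+'s Format24bppRgb stores pixels in BGR order, hence -pix_fmt bgr24.

// Sketch, assuming _ffmpegPath, outputPath and frames come from the code above,
// and that frames now hold raw BGR bytes produced by the hypothetical helper below.
static byte[] BitmapToRawBgr24(System.Drawing.Bitmap bitmap)
{
    var rect = new System.Drawing.Rectangle(0, 0, bitmap.Width, bitmap.Height);
    var locked = bitmap.LockBits(rect, System.Drawing.Imaging.ImageLockMode.ReadOnly,
                                 System.Drawing.Imaging.PixelFormat.Format24bppRgb);
    try
    {
        int rowBytes = bitmap.Width * 3;                 // 728 * 3 = 2184 bytes per row
        var raw = new byte[rowBytes * bitmap.Height];    // 2184 * 544 = 1188096, the size ffmpeg expects
        for (int y = 0; y < bitmap.Height; y++)
        {
            // Copy row by row, because Stride may be padded beyond width * 3.
            System.Runtime.InteropServices.Marshal.Copy(
                locked.Scan0 + y * locked.Stride, raw, y * rowBytes, rowBytes);
        }
        return raw;
    }
    finally
    {
        bitmap.UnlockBits(locked);
    }
}

// Start ffmpeg directly and declare the input as raw BGR frames.
var psi = new System.Diagnostics.ProcessStartInfo
{
    FileName = System.IO.Path.Combine(_ffmpegPath, "ffmpeg.exe"),
    Arguments = "-y -f rawvideo -pix_fmt bgr24 -video_size 728x544 -framerate 32 -i - " +
                "-c:v libx264 -preset ultrafast \"" + outputPath + "\"",
    UseShellExecute = false,
    RedirectStandardInput = true
};

using (var ffmpeg = System.Diagnostics.Process.Start(psi))
{
    foreach (byte[] frame in frames)
    {
        // BaseStream writes the bytes verbatim; the StreamWriter wrapper would not.
        ffmpeg.StandardInput.BaseStream.Write(frame, 0, frame.Length);
    }
    ffmpeg.StandardInput.BaseStream.Flush();
    ffmpeg.StandardInput.Close();
    ffmpeg.WaitForExit();
}

If the BMP approach is kept instead, the 54-byte BMP header would have to be stripped from each frame (1188150 - 1188096 = 54) and the rows are stored bottom-up, so extracting the pixels with LockBits as above is usually the simpler route.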


-
How to stream cv2.VideoWriter frames to an RTSP server
10 June 2022, by chasez0r
Environment: Docker, Ubuntu 20.04, OpenCV 3.5.4, FFmpeg 4.2.4


I'm currently reading the output of a cv2.VideoCapture session using the CV_FFMPEG backend and successfully writing that back out in real time to a file using cv2.VideoWriter. The reason I am doing this is to draw bounding boxes on the input and save it to a new output.

The problem is that I am doing this in a headless environment (a Docker container), and I'd like to view what is being written to cv2.VideoWriter in real time.

I know there are ways to pass my display through, using XQuartz for example, so I could use cv2.imshow. But what I really want to do is write those frames to an RTSP server, so that not only my host but also other hosts can "watch".

After the video is released, I can easily stream it to my RTSP server using this command.


ffmpeg -re -stream_loop -1 -i output.mp4 -c copy -f rtsp rtsp://rtsp_server_host:8554/stream


Is there any way to pipe the frames to the above command as they come in? Can cv2.VideoWriter itself write frames to an RTSP server?

Any ideas would be much appreciated! Thank you.
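

One commonly suggested approach, as a sketch rather than a tested recipe: instead of (or alongside) cv2.VideoWriter, spawn ffmpeg as a subprocess reading from its standard input and write each annotated BGR frame's raw bytes to that stdin; ffmpeg then publishes to the RTSP server already listening at rtsp://rtsp_server_host:8554/stream, just as in the command above. The width, height and frame rate below are placeholders and must match the frames being written:

ffmpeg -f rawvideo -pix_fmt bgr24 -video_size 1280x720 -framerate 30 -i - -c:v libx264 -preset ultrafast -tune zerolatency -f rtsp rtsp://rtsp_server_host:8554/stream

cv2.VideoWriter does not speak RTSP directly with the FFmpeg backend; it can target a GStreamer pipeline instead, but only when OpenCV is built with GStreamer support, so the subprocess pipe is usually the simpler route here.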


-
FFMpeg on docker
31 May 2022, by user1765862
I'm trying to run the FFMpegCore library in Docker.
Here is my Dockerfile:


FROM public.ecr.aws/lambda/dotnet:6 AS base

FROM mcr.microsoft.com/dotnet/sdk:6.0-bullseye-slim as build
WORKDIR /src
COPY ["AWSServerless.csproj", "AWSServerless/"]
RUN dotnet restore "AWSServerless/AWSServerless.csproj"

WORKDIR "/src/AWSServerless"
COPY . .
RUN dotnet build "AWSServerless.csproj" --configuration Release --output /app/build

FROM build AS publish

#fix for using System.Drawing.Common on docker
RUN apt-get update && apt-get install -y apt-utils libgdiplus libc6-dev

RUN apt-get install -y ffmpeg

RUN dotnet publish "AWSServerless.csproj" \
 --configuration Release \ 
 --runtime linux-x64 \
 --self-contained false \ 
 --output /app/publish \
 -p:PublishReadyToRun=true 

FROM base AS final
WORKDIR /var/task

CMD ["AWSServerless::AWSServerless.LambdaEntryPoint::FunctionHandlerAsync"]
COPY --from=publish /app/publish .



When I try to use any of the FFMpegCore commands, I get the following error in the log:




System.ComponentModel.Win32Exception (2) : An error occurred trying to
start process 'ffmpeg' with working directory '/var/task'. No such
file or directory
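

A hedged note on the likely cause: ffmpeg is installed with apt-get in the publish stage, but the final image is built FROM base (the public.ecr.aws/lambda/dotnet:6 image) and only receives COPY --from=publish /app/publish, so no ffmpeg binary ever reaches the runtime image and starting the 'ffmpeg' process from /var/task fails with "No such file or directory". One way to address it, as a sketch (the static linux-x64 ffmpeg build placed next to the Dockerfile is an assumption):

FROM base AS final
WORKDIR /var/task
# ffmpeg must exist in this stage; the apt-get install above only affects the discarded publish stage.
# A static linux-x64 ffmpeg build is assumed to sit next to the Dockerfile.
COPY ffmpeg /usr/local/bin/ffmpeg
RUN chmod +x /usr/local/bin/ffmpeg
COPY --from=publish /app/publish .
CMD ["AWSServerless::AWSServerless.LambdaEntryPoint::FunctionHandlerAsync"]

If the binary is placed somewhere that is not on the PATH, FFMpegCore can be pointed at it through GlobalFFOptions.Configure and its BinaryFolder option.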