
Other articles (95)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once activated, a preconfiguration is automatically put in place by MediaSPIP init, making the new feature operational immediately. It is therefore not mandatory to go through a configuration step for this. -
APPENDIX: Plugins used specifically for the farm
5 March 2010
The central/master site of the farm needs several plugins beyond those of the channels to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (4777)
-
Fragmented mp4 file is not played by MSE
19 May 2020, by Daniel
I created a fragmented mp4 file with ffmpeg (from h264) and removed the first 6 moof and mdat pairs.



So now it still has the correct order of boxes: ftyp, moov, moof, mdat, moof, mdat, ..., but the first moof has a sequenceNumber of 7.
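
For reference, a fragmented mp4 with this ftyp/moov/moof/mdat layout can be produced with something along these lines (a sketch, not necessarily the asker's exact command; input.h264 and the flag set are assumptions):

# Remux an H.264 elementary stream into a fragmented mp4,
# cutting a new fragment at each keyframe.
ffmpeg -i input.h264 -c copy \
 -movflags frag_keyframe+empty_moov+default_base_moof \
 fragmented.mp4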



VLC can play it fine; 'Movies & TV' can also play it, but the first few seconds are black.



If I drag the file into the browser, it also plays fine.



It is, however, not displayed at all in the browser (Chrome) if I feed it via MSE.



No error messages are printed, and the media-internals log shows that the video player starts playing within the first second and suspends only at the 18th second:



Timestamp Property Value
00:00:00.000 origin_url "https://localhost:8443/"
00:00:00.000 kFrameUrl "https://localhost:8443/websocket/videodemo.html"
00:00:00.000 kFrameTitle "WebSocket and MSE demo"
00:00:00.000 url "blob:https://localhost:8443/3b4d4b1a-7c08-4136-95fe-dabc14fba95f"
00:00:00.000 info "ChunkDemuxer"
00:00:00.000 pipeline_state "kStarting"
00:00:01.067 kVideoTracks [{"alpha mode":"is_opaque","codec":"h264","coded size":"1600x900","color space":"{primaries:BT709, transfer:BT709, matrix:BT709, range:LIMITED}","encryption scheme":"Unencrypted","flipped":false,"has_extra_data":false,"natural size":"1600x900","profile":"h264 main","rotation":"0°","visible rect":"0,0 1600x900"}]
00:00:01.067 debug "Video rendering in low delay mode."
00:00:01.070 info "Using D3D11 device for DXVA"
00:00:01.075 kIsVideoDecryptingDemuxerStream false
00:00:01.075 kVideoDecoderName "MojoVideoDecoder"
00:00:01.075 kIsPlatformVideoDecoder true
00:00:01.075 info "Selected MojoVideoDecoder for video decoding, config: codec: h264, profile: h264 main, alpha_mode: is_opaque, coded size: [1600,900], visible rect: [0,0,1600,900], natural size: [1600,900], has extra data: false, encryption scheme: Unencrypted, rotation: 0°, flipped: 0, color space: {primaries:BT709, transfer:BT709, matrix:BT709, range:LIMITED}"
00:00:01.075 pipeline_state "kPlaying"
00:00:01.067 duration "unknown"
00:00:18.926 pipeline_state "kSuspending"
00:00:18.926 pipeline_state "kSuspended"
00:00:18.927 event "SUSPENDED"




Here is the video file for reference.



What is the problem with this file, and why is it not displayed in the browser with MSE?
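
One way to see what the trimmed file actually advertises to the demuxer is to dump its box structure, for instance with Bento4's mp4dump (an assumption; any MP4 box inspector works, and video.mp4 is a placeholder name):

# mfhd carries each fragment's sequence number; tfdt carries the base
# media decode time, i.e. where in the timeline MSE will buffer the data.
mp4dump video.mp4 | grep -E -A1 'mfhd|tfdt'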


-
How should I add a transparent watermark.png over my RTMP h264 stream with ffmpeg?
16 June 2013, by RoelandP
I have a Raspberry Pi with the new camera module hooked up to (in this case) Bambuser. You can see the stream here; it's from a windmill in The Netherlands (the camera position will be better within a few weeks).
I successfully have the stream running, but now I want to add an image (an alpha-transparent png) on top of the input stream, which is piped to ffmpeg to be streamed to Bambuser.
I currently use the following command (user-specific details wiped out) to successfully stream the input from the Raspberry camera module (it's great, HD & all, hardware rendering) to Bambuser, following the great tutorial by Slickstreamer:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
I followed the ffmpeg docs, and it seems to me I should use the '-vf' option to apply the 'movie' filter, like so:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vf "movie='/home/USER/watermark.png' [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]" -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
and various other -vf arguments, like '-vf vflip' or '-vf mandelbrot'. But it doesn't seem to work: the stream just shows the direct input from the Raspberry camera.
This is the output when started with the following -vf command:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -vf 'movie=0:png:/home/USER/watermark.png [watermark];[in] [watermark]overlay=0:0:1[out]' -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
ffmpeg version N-54036-g6c4516d Copyright (c) 2000-2013 the FFmpeg developers
  built on Jun 15 2013 XX:XX with gcc 4.6 (Debian 4.6.3-14+rpi1)
  configuration:
  libavutil      52. 35.101 / 52. 35.101
  libavcodec     55. 16.100 / 55. 16.100
  libavformat    55.  8.102 / 55.  8.102
  libavdevice    55.  2.100 / 55.  2.100
  libavfilter     3. 77.101 /  3. 77.101
  libswscale      2.  3.100 /  2.  3.100
  libswresample   0. 17.102 /  0. 17.102
[h264 @ 0x1917cc0] max_analyze_duration 5000000 reached at 5000000 microseconds
Input #0, h264, from 'pipe:':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p, 960x540, 25 fps, 25 tbr, 1200k tbn, 50 tbc
Output #0, flv, to 'rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X':
  Metadata:
    title           : STREAM NAME
    encoder         : Lavf55.8.102
    Stream #0:0: Video: h264 ([7][0][0][0] / 0x0007), yuv420p, 960x540, q=2-31, 25 fps, 1k tbn, 1200k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
frame= 2344 fps= 27 q=-1.0 size=    4827kB time=00:01:33.72 bitrate= 421.9kbits/s
As mentioned above, other -vf filters also don't seem to be applied to the output stream on Bambuser; I think I'm fundamentally doing something wrong here.
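
For what it's worth, -vcodec copy passes the compressed H.264 bitstream through untouched, so no -vf filter can take effect; applying an overlay requires re-encoding the video. A sketch of the same pipeline with software re-encoding (the libx264 settings are illustrative assumptions, and the Pi's CPU may struggle at this resolution):

# Re-encode instead of copying so the overlay filter is actually applied.
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - \
 -vf "movie=/home/USER/watermark.png [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]" \
 -vcodec libx264 -preset ultrafast -b:v 500k \
 -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
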
- Should I map the raspivid stream and map the image 'watermark.png' on top of that? Would that be the solution? Does anyone have experience with this?
Thank you very much in advance for your thoughts.
-
Package is installed inside Docker but the actual command throws an exception
1 June 2022, by user1765862
I'm trying to use the FFMpegCore package inside a .NET 6 dockerized project. I install ffmpeg inside the Dockerfile and reference the FFMpegCore package in the solution, but when I try to run any of the FFMpegCore commands I get this error:

An error occurred trying to start process './ffmpeg' with working
directory '/var/task'. No such file or directory
The Docker build completes with no errors.

Dockerfile:

FROM public.ecr.aws/lambda/dotnet:6 AS base
....
RUN apt-get install -y ffmpeg
....
FROM base AS final
WORKDIR /var/task
COPY --from=publish /app/publish .



As per the FFMpegCore docs, in order to use ffmpeg I need to set its binary folder, so I add ffmpeg.config.json:


{
 "BinaryFolder": "/var/task",
 "TemporaryFilesFolder": "/tmp"
}
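
For what it's worth, a quick way to check what actually ends up inside the final image is to list /var/task in it (a sketch, assuming the image is built locally; myimage:latest is a placeholder tag):

# The app DLLs, ffmpeg.config.json and the ffmpeg binary should all be
# present in /var/task for './ffmpeg' to resolve.
docker run --rm --entrypoint ls myimage:latest -la /var/task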



The error quoted above is thrown when I try to execute the following command. This is the place where it gets triggered:

using FFMpegCore;
using FFMpegCore.Pipes;
...
public class MyController : ControllerBase
{
    public async Task<string> Get()
    {
        await FFMpegArguments
            .FromPipeInput(new StreamPipeSource(myfile))
            .OutputToPipe(new StreamPipeSink(outputStream), options => options
                .WithVideoCodec("vp9")
                .ForceFormat("webm"))
            .ProcessAsynchronously();
        ...
    }
}


Update:
After changing the BinaryFolder location to /usr/bin I'm getting the following error:

An error occurred trying to start process '/usr/bin/ffmpeg' with working directory '/var/task'. No such file or directory



Update #2:
This is my complete Dockerfile:


FROM public.ecr.aws/lambda/dotnet:6 AS base

FROM mcr.microsoft.com/dotnet/sdk:6.0-bullseye-slim as build
WORKDIR /src
COPY ["AWSServerless.csproj", "AWSServerless/"]
RUN dotnet restore "AWSServerless/AWSServerless.csproj"

WORKDIR "/src/AWSServerless"
COPY . .
RUN dotnet build "AWSServerless.csproj" --configuration Release --output /app/build

FROM build AS publish 

RUN apt-get update \
 && apt-get install -y apt-utils libgdiplus libc6-dev \
 && apt-get install -y ffmpeg

RUN dotnet publish "AWSServerless.csproj" \
 --configuration Release \ 
 --runtime linux-x64 \
 --self-contained false \ 
 --output /app/publish \
 -p:PublishReadyToRun=true 

FROM base AS final
WORKDIR /var/task

CMD ["AWSServerless::AWSServerless.LambdaEntryPoint::FunctionHandlerAsync"]
COPY --from=publish /app/publish .
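
For what it's worth, nothing in this Dockerfile ever puts ffmpeg into the final image: apt-get runs in the publish stage, while the final stage starts again from base, so neither /var/task nor /usr/bin contains an ffmpeg binary at runtime. A sketch of one possible fix, using a static build so no shared libraries have to be carried over (the download URL and paths are assumptions, not from the question):

# In the publish stage: fetch a static linux-x64 ffmpeg build and unpack
# just the ffmpeg binary (the URL is an assumption; any static build works).
RUN apt-get install -y curl xz-utils \
 && curl -L -o /tmp/ffmpeg.tar.xz https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz \
 && tar -xJf /tmp/ffmpeg.tar.xz -C /usr/local/bin --strip-components=1 --wildcards '*/ffmpeg'

# In the final stage: place the binary next to the app so that
# "BinaryFolder": "/var/task" in ffmpeg.config.json resolves.
COPY --from=publish /usr/local/bin/ffmpeg /var/task/ffmpeg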