
Media (91)
-
999,999
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
-
Demon seed (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
The four of us are dying (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Corona radiata (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Lights in the sky (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
Other articles (11)
-
Accepted formats
28 January 2010
The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
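Since the set of supported codecs depends on the local ffmpeg build, scripts often parse this output rather than hard-code a codec list. A minimal sketch in Python, assuming the two-column flag layout of ffmpeg's `-codecs` table (the sample lines below are illustrative, not captured output):

```python
# Sketch: parse lines from `ffmpeg -codecs` output to see which codecs the
# local build can decode (flag 'D') and encode (flag 'E'). The 6-character
# flag layout is assumed from the table ffmpeg prints.
def parse_codecs(output: str) -> dict:
    codecs = {}
    for line in output.splitlines():
        line = line.rstrip()
        if len(line) < 8 or line[0] != " ":
            continue  # skip banner/header lines
        flags, _, rest = line[1:].partition(" ")
        if len(flags) != 6 or not rest.split():
            continue
        name = rest.split()[0]
        codecs[name] = {"decode": flags[0] == "D", "encode": flags[1] == "E"}
    return codecs

# Illustrative sample, shaped like real ffmpeg output:
sample_output = """Codecs:
 DEV.L. h264                 H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
 D.V.L. flv1                 FLV / Sorenson Spark / Sorenson H.263
"""
caps = parse_codecs(sample_output)
```

In real use you would feed the function the stdout of `ffmpeg -codecs` instead of the sample string.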
Accepted input video formats
This list is not exhaustive; it highlights the main formats used: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
At first we (...) -
Adding notes and captions to images
7 February 2011
To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, modifying, and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...) -
General document management
13 May 2011
MédiaSPIP never modifies the original uploaded document.
For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while leaving the original downloadable in case it cannot be read in a web browser; and retrieving the original document's metadata to describe the file textually.
The tables below explain what MédiaSPIP can do (...)
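The two operations can be pictured as a pipeline that never touches the original file. A minimal illustration in Python; the web-format mapping and sidecar naming are assumptions for the sketch, not MédiaSPIP's actual rules:

```python
from pathlib import Path

# Hypothetical mapping from original formats to a browser-friendly derivative;
# the real mapping depends on the MediaSPIP configuration.
WEB_FORMATS = {".wav": ".ogg", ".avi": ".mp4", ".doc": ".pdf"}

def plan_processing(original: str) -> dict:
    """Return what would be produced for an upload, without touching it."""
    src = Path(original)
    web_suffix = WEB_FORMATS.get(src.suffix.lower(), src.suffix.lower())
    return {
        "original": str(src),                             # kept as-is, downloadable
        "web_version": str(src.with_suffix(web_suffix)),  # derived viewable copy
        "metadata_sidecar": str(src) + ".meta",           # textual description
    }

plan = plan_processing("uploads/demon_seed.wav")
```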
On other sites (3241)
-
Live AAC and H264 data into live stream
10 May 2024, by tzuleger
I have a remote camera that captures H264-encoded video data and AAC-encoded audio data, places the data into a custom ring buffer, which is then sent to a Node.js socket server, where each packet is detected as audio or video and handled accordingly. That data should become a live stream; the protocol doesn't matter, but the delay has to be around 4 seconds, and it must be playable on iOS and Android devices.


After reading hundreds of pages of documentation, questions, or solutions on the internet, I can't seem to find anything about handling two separate streams of AAC and H264 data to create a live stream.


Despite attempting many different ways of achieving this goal, even having a working implementation of HLS, I want to revisit ALL options of live streaming, and I am hoping someone out there can give me advice or guidance to specific documentation on how to achieve this goal.


To be specific, this is our goal:


- Stream AAC and H264 data from a remote cellular camera to a server, which will do some work on that data to live stream to one user (possibly more users in the future) on a mobile iOS or Android device.
- The delay of the live stream should be at most 4 seconds; if the user has bad signal, a longer delay is okay, as we obviously cannot do anything about that.
- We should not have to re-encode our data. We've explored WebRTC, but it requires Opus audio packets and would therefore force us to re-encode, which would be expensive for our server to run.
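Since the goal is to avoid re-encoding, the server-side step usually amounts to building an ffmpeg invocation that stream-copies both elementary streams into a segmented output. A sketch of constructing such a command line in Python; the input paths, the raw-stream input formats, and the HLS options are illustrative assumptions, not our actual setup:

```python
def build_remux_cmd(video_src: str, audio_src: str, playlist: str) -> list:
    """Construct an ffmpeg argv that muxes H264 + AAC without re-encoding."""
    return [
        "ffmpeg",
        "-f", "h264", "-i", video_src,   # raw Annex-B H264 elementary stream
        "-f", "aac", "-i", audio_src,    # raw ADTS AAC elementary stream
        "-c:v", "copy", "-c:a", "copy",  # no re-encode: just remux
        "-f", "hls",
        "-hls_time", "2",                # short segments to keep latency down
        playlist,
    ]

cmd = build_remux_cmd("video.pipe", "audio.pipe", "stream.m3u8")
```

The point of building the argv as a list is that the recording mode (file vs. stdout, plain HLS vs. something else) only changes the tail of the command.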

Any and all help, ranging from re-visiting an old approach we took to exploring new ones, is appreciated.


I can provide code snippets as well for our current implementation of LLHLS if it helps, but I figured this post is already long enough.


I've tried FFmpeg with named pipes; I expected it to just work, but FFmpeg kept blocking on the first named pipe input. I thought of just writing the data out to two files and then using FFmpeg, but it's continuous data, and I don't have enough FFmpeg knowledge to know how that type of implementation could produce one live stream.
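For what it's worth, that blocking is a classic FIFO property: opening a named pipe for reading blocks until a writer opens it (and vice versa), so a reader stalls on the first input until both pipes have writers. One workaround is to open and feed both pipes from concurrent threads. A minimal sketch of just the concurrency pattern (plain Python, POSIX only, no ffmpeg involved):

```python
import os
import tempfile
import threading

def feed(path: str, data: bytes) -> None:
    # open() on a FIFO blocks until the other end opens it, so each
    # writer must run in its own thread to avoid a mutual deadlock.
    with open(path, "wb") as f:
        f.write(data)

tmp = tempfile.mkdtemp()
video_pipe = os.path.join(tmp, "video.pipe")
audio_pipe = os.path.join(tmp, "audio.pipe")
os.mkfifo(video_pipe)
os.mkfifo(audio_pipe)

# Start both writers first; the reader can then open either pipe
# in any order without deadlocking.
writers = [
    threading.Thread(target=feed, args=(video_pipe, b"h264-bytes")),
    threading.Thread(target=feed, args=(audio_pipe, b"aac-bytes")),
]
for w in writers:
    w.start()

with open(video_pipe, "rb") as vf, open(audio_pipe, "rb") as af:
    video_data, audio_data = vf.read(), af.read()
for w in writers:
    w.join()
```

In the real setup the reader would be the spawned ffmpeg process rather than this script, but the rule is the same: both pipes need active writers before ffmpeg can get past its inputs.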


I've tried implementing our own RTSP server on the camera using GStreamer (our camera had its RTSP server stripped out; wasn't my call), but the camera's flash storage cannot handle having GStreamer on it, so that wasn't an option.


My latest attempt was using a derivation of hls-parser to create an HLS manifest and mux.js to create MP4 containers for .m4s fragmented mp4 files and do an HLS live stream. This was my most successful attempt: we had a live stream going, but the delay was up to 16 seconds, as one would expect with HLS live streaming. We could drop the target duration down to 2 seconds and get about a 6-8 second delay, but this could be unreliable, as these cameras can have poor signal, making it relatively expensive to send so many IDR frames with such low bandwidth.
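Those numbers match the usual rule of thumb that plain HLS players buffer roughly three target durations before starting playback, so latency scales with segment length. A rough back-of-the-envelope helper (the three-segment buffer is a heuristic, not a spec guarantee):

```python
def approx_hls_latency(target_duration_s: float, buffered_segments: int = 3) -> float:
    """Rough glass-to-glass latency estimate for plain (non-LL) HLS:
    players typically buffer a few full segments before playing."""
    return target_duration_s * buffered_segments

# 6 s segments give an estimate of ~18 s; 2 s segments give ~6 s,
# in line with the 6-8 s delay observed above.
```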

With the delay being the only factor left, I attempted to upgrade the implementation to support Apple's Low-Latency HLS. It seems to work, as the right partial segments are getting requested and everything that makes up LL-HLS is working as intended, but the delay isn't going down when played in iOS's native AVPlayer; as a matter of fact, it looks like it has worsened.


I should also add the disclaimer that my knowledge of media streaming is fairly limited. I've learned most of what I write about in this post over the past 3 months by reading RFCs, documentation, and stackoverflow/reddit questions and answers. If anything appears confusing, it might just be my lack of understanding.


-
How to improve Desktop capture performance and quality with ffmpeg [closed]
6 November 2024, by Francesco Bramato
I'm developing a game capture feature in my Electron app. I've been working on this for a while and have tried a lot of different parameter combinations; now I'm running out of ideas :)


I've read tons of ffmpeg documentation, SO posts, and other sites, but I'm not really an ffmpeg expert or a video editing pro.


This is how it works now:


The app spawns an ffmpeg command based on the user's settings:


- Output format (mp4, mkv, avi)
- Framerate (12, 24, 30, 60)
- Codec (x264, NVIDIA NVENC, AMD AMF)
- Bitrate (from 1000 to 10000 kbps)
- Presets (for x264)
- Audio output (a dshow device like Stereo Mix or VB-Cable) and audio input (a dshow device like the microphone)
- Final resolution (720p, 1080p, 2K, original size)

The command currently executed is:


ffmpeg.exe -nostats -hide_banner -hwaccel cuda -hwaccel_output_format cuda -f gdigrab -draw_mouse 0 -framerate 60 -offset_x 0 -offset_y 0 -video_size 2560x1440 -i desktop -f dshow -i audio=@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{D61FA53D-FA37-4BE7-BE2F-4005F94790BB} -ar 44100 -colorspace bt709 -color_trc bt709 -color_primaries bt709 -c:v h264_nvenc -b:v 6000k -preset slow -rc cbr -profile:v high -g 60 -acodec aac -maxrate 6000k -bufsize 12000k -pix_fmt yuv420p -f mpegts -



One of the settings is the recording mode: full game session or replay buffer.
For a full game session the output is a file; for the replay buffer it is stdout.


The output format is mpegts because, as far as I have read in a lot of places, that video stream can be cut at any moment.


Replays are cut with different past and future durations based on game events.


In full game session mode, the replays are cut directly from the mpegts file.


In replay buffer mode, ffmpeg's stdout is redirected to the app, which records the buffer (1 or 2 minutes). When a replay must be created, the app saves the relevant buffer section to disk according to the past and future durations and, with another ffmpeg command, copies it to a final mp4 or mkv file.
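The buffering step described above is essentially a bounded byte ring over the mpegts stdout stream. A minimal sketch of such a rolling buffer in Python; the chunked deque and the fixed byte budget are illustrative assumptions, not the app's actual code:

```python
from collections import deque

class ReplayBuffer:
    """Keep only the most recent max_bytes of an incoming byte stream,
    so a replay can be sliced out when a game event fires."""
    def __init__(self, max_bytes: int):
        self.max_bytes = max_bytes
        self.chunks = deque()
        self.size = 0

    def write(self, chunk: bytes) -> None:
        self.chunks.append(chunk)
        self.size += len(chunk)
        while self.size > self.max_bytes:  # evict the oldest data
            old = self.chunks.popleft()
            self.size -= len(old)

    def snapshot(self) -> bytes:
        # This is what would be handed to the second ffmpeg pass
        # (stream copy) to produce the final mp4/mkv replay file.
        return b"".join(self.chunks)

buf = ReplayBuffer(max_bytes=8)
for part in (b"aaaa", b"bbbb", b"cccc"):
    buf.write(part)
```

In practice the cut points should align with mpegts packet (188-byte) and keyframe boundaries so the second ffmpeg pass gets a decodable slice; the sketch ignores that detail.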


Generally speaking, this works reliably.


There are a few issues:


- Even though I ask ffmpeg to capture at 60 fps, the final result is at 30 fps (using -r 60 will speed up the final result).
- Some users have reported FPS drops in-game, especially when using NVIDIA NVENC (on an NVIDIA GPU); using x264 seems to save some FPS.
- Colors are strange compared to the original: what I see on screen seems washed out. I may have solved this using -colorspace bt709 -color_trc bt709 -color_primaries bt709, but I don't know if that is the right choice.
- NVIDIA NVENC with any preset other than slow creates terribly laggy videos.



Here are two examples: 60 FPS, NVIDIA NVENC (slow preset, 6000 kbps, MP4).


Recorded by my app : https://www.youtube.com/watch?v=Msm62IwHdlk


Recorded by OBS with nearly the same settings: https://youtu.be/WuHoLh26W7E


Hope someone can help me.


Thanks!


-
Frames and Size Increase as Speed Reduces When Ffmpeg Converts HLS to MPEGTS
28 July 2021, by Mikeyy10
I've been trying my hand at programmatically using ffmpeg to convert HTTP live streams to mpegts. All of the HLS feeds I've tried have been on remote servers, and everything worked fine until this particular feed. As you can see in the logs, the number of frames keeps increasing by 150 and the size increases as a result. At the same time, the speed decreases. This is the command I used:
ffmpeg -i http://xxxxxxxx-xxxxxxx-xx.m3u8 -y -r 100 -vcodec libx264 -f mpegts -preset ultrafast -tune zerolatency -t 00:10:00.500 /home/xxxx-xxxxxx/app/backend/queues/../files/output.mpegts

I have tried different fps values, from 30 up to 100. I noticed that the number of frames and the size continue to increase in direct proportion to the fps. Is there a better way to re-encode this sort of stream? I'm using fluent-ffmpeg and I still haven't figured out how to apply the nobuffer
option, but I'm not sure that would solve the problem anyway.
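One likely factor behind the growing frame count: the input log shows a 25 fps source, and forcing a higher constant output rate with -r makes ffmpeg duplicate frames to hit the requested rate, so frame count and size scale with the target fps while encoding speed drops. A simplified model of that duplication (a sketch of standard CFR behaviour, not an exact account of this particular log):

```python
def duplicated_frames(source_fps: float, target_fps: float, duration_s: float) -> int:
    """Extra frames ffmpeg must fabricate when forcing a constant output
    rate higher than the source provides (simplified CFR model)."""
    return int((target_fps - source_fps) * duration_s)

# A 25 fps source forced to 100 fps over a 6-second segment
# requires 450 fabricated duplicate frames on top of the 150 real ones.
extra = duplicated_frames(25, 100, 6)
```

Dropping -r entirely (or matching it to the source rate) avoids the duplication, at the cost of keeping the source's 25 fps.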

[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxx-xxx99.ts' for reading
Input #0, hls,applehttp, from 'http://xxxxxxxxxxxxxxxx-xxxxxxxx.m3u8':
 Duration: N/A, start: 28505.278111, bitrate: N/A
 Program 0 
 Metadata:
 variant_bitrate : 0
 Stream #0:0: Video: h264 (Constrained Baseline) ([27][0][0][0] / 0x001B), yuv420p, 320x240 [SAR 1:1 DAR 4:3], 25 fps, 25 tbr, 90k tbn, 50 tbc
 Metadata:
 variant_bitrate : 0
Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x1dc0ea0] using SAR=1/1
[libx264 @ 0x1dc0ea0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2
[libx264 @ 0x1dc0ea0] profile Constrained Baseline, level 3.0
Output #0, mpegts, to '/home/xxxxx-xxxxxx-xxx/app/backend/queues/../files/output.mpegts':
 Metadata:
 encoder : Lavf57.83.100
 Stream #0:0: Video: h264 (libx264), yuv420p, 320x240 [SAR 1:1 DAR 4:3], q=-1--1, 100 fps, 90k tbn, 100 tbc
 Metadata:
 variant_bitrate : 0
 encoder : Lavc57.107.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx100.ts' for reading
frame= 149 fps=0.0 q=24.0 size= 239kB time=00:00:05.92 bitrate= 331.0kbits/s speed=9.52x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx101.ts' for reading
frame= 299 fps= 59 q=25.0 size= 256kB time=00:00:11.92 bitrate= 175.9kbits/s speed=2.35x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx102.ts' for reading
frame= 449 fps= 43 q=24.0 size= 512kB time=00:00:17.92 bitrate= 234.1kbits/s speed=1.72x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx103.ts' for reading
frame= 599 fps= 38 q=22.0 size= 768kB time=00:00:23.92 bitrate= 263.0kbits/s speed=1.52x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx104.ts' for reading
frame= 749 fps= 35 q=22.0 size= 1024kB time=00:00:29.92 bitrate= 280.4kbits/s speed=1.42x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx105.ts' for reading
frame= 899 fps= 28 q=23.0 size= 1280kB time=00:00:35.92 bitrate= 291.9kbits/s speed=1.12x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx106.ts' for reading
frame= 1049 fps= 25 q=23.0 size= 1536kB time=00:00:41.92 bitrate= 300.2kbits/s speed=0.998x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx107.ts' for reading
frame= 1199 fps= 28 q=24.0 size= 1792kB time=00:00:47.92 bitrate= 306.3kbits/s speed=1.11x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx108.ts' for reading
frame= 1349 fps= 28 q=25.0 size= 2048kB time=00:00:53.92 bitrate= 311.2kbits/s speed=1.11x 
Past duration 0.999992 too large
 Last message repeated 6 times
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx109.ts' for reading
frame= 1499 fps= 28 q=23.0 size= 2048kB time=00:00:59.64 bitrate= 281.3kbits/s speed=1.11x 
[hls,applehttp @ 0x1d97f60] Opening 'http://xxxxxxxxxxxxxxxx-xxxxx110.ts' for reading
frame= 1649 fps= 28 q=24.0 size= 2304kB time=00:01:05.64 bitrate= 287.5kbits/s speed=1.11x 
[http @ 0x23b86e0] HTTP error 404 Not Found
[hls,applehttp @ 0x1d97f60] Failed to reload playlist 0
[http @ 0x23b86e0] HTTP error 404 Not Found
[hls,applehttp @ 0x1d97f60] Failed to reload playlist 0
frame= 1799 fps= 13 q=24.0 size= 2560kB time=00:01:11.64 bitrate= 292.7kbits/s speed=0.523x 
[http @ 0x23b86e0] HTTP error 404 Not Found
[hls,applehttp @ 0x1d97f60] Failed to reload playlist 0
frame= 1800 fps= 13 q=24.0 size= 2560kB time=00:01:11.68 bitrate= 292.6kbits/s speed=0.514x 
[http @ 0x1e34ea0] HTTP error 404 Not Found
[hls,applehttp @ 0x1d97f60] Failed to reload playlist 0
frame= 1800 fps= 13 q=24.0 size= 2560kB time=00:01:11.68 bitrate= 292.6kbits/s speed=0.512x 
frame= 1800 fps= 13 q=24.0 Lsize= 2756kB time=00:01:11.68 bitrate= 315.0kbits/s speed=0.512x 
video:2359kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 16.855715%
[libx264 @ 0x1dc0ea0] frame I:8 Avg QP:20.62 size: 21713
[libx264 @ 0x1dc0ea0] frame P:1792 Avg QP:23.84 size: 1251
[libx264 @ 0x1dc0ea0] mb I I16..4: 100.0% 0.0% 0.0%
[libx264 @ 0x1dc0ea0] mb P I16..4: 0.2% 0.0% 0.0% P16..4: 32.8% 0.0% 0.0% 0.0% 0.0% skip:67.0%
[libx264 @ 0x1dc0ea0] coded y,uvDC,uvAC intra: 83.7% 62.6% 40.9% inter: 12.7% 15.3% 7.9%
[libx264 @ 0x1dc0ea0] i16 v,h,dc,p: 26% 29% 31% 14%
[libx264 @ 0x1dc0ea0] i8c dc,h,v,p: 58% 20% 14% 8%
[libx264 @ 0x1dc0ea0] kb/s:1073.41