
Other articles (86)
-
User profiles
12 April 2011
Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised; it is visible only when the visitor is logged in to the site.
The user can reach the profile editor from their author page; a link in the navigation, "Modifier votre profil" (edit your profile), is (...)
-
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section where support for new languages can be enabled.
Each newly added language can still be deactivated as long as no object has been created in that language; once one has, the language is greyed out in the configuration and (...)
-
XMP PHP
13 May 2011
According to Wikipedia, XMP stands for:
Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it handles a set of dynamic tags for use in the context of the Semantic Web.
XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)
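Since the excerpt only describes the format in prose, here is a minimal sketch of what that means in practice. The packet below is a simplified, hypothetical XMP fragment (real packets wrap values such as dc:title in RDF containers); because XMP is plain XML, any XML library can read it, shown here with C#'s System.Xml.Linq:

using System;
using System.Linq;
using System.Xml.Linq;

class XmpSketch
{
    static void Main()
    {
        // A simplified, hypothetical XMP packet; real ones wrap dc:title in an rdf:Alt container.
        string xmp = @"<x:xmpmeta xmlns:x='adobe:ns:meta/'>
  <rdf:RDF xmlns:rdf='http://www.w3.org/1999/02/22-rdf-syntax-ns#'>
    <rdf:Description rdf:about='' xmlns:dc='http://purl.org/dc/elements/1.1/'>
      <dc:title>Sample photo</dc:title>
      <dc:creator>Jane Doe</dc:creator>
    </rdf:Description>
  </rdf:RDF>
</x:xmpmeta>";

        XNamespace dc = "http://purl.org/dc/elements/1.1/";
        XDocument doc = XDocument.Parse(xmp);

        // XMP being ordinary XML, the metadata is just elements in namespaces.
        Console.WriteLine("title:   " + doc.Descendants(dc + "title").FirstOrDefault()?.Value);
        Console.WriteLine("creator: " + doc.Descendants(dc + "creator").FirstOrDefault()?.Value);
    }
}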
On other websites (8389)
-
How to send written frames in real time/synchronized with FFmpeg and UDP?
20 June 2018, by potu1304
I wanted to nearly live-stream my Unity game with FFmpeg to a simple client. I have a Unity game in which each frame is saved as a JPG image. These images are wrapped by FFmpeg and sent over UDP to a simple C# client, where I use ffplay to play the stream. The problem is that FFmpeg wraps the images much faster than the Unity app can write them, so FFmpeg quits while Unity is still writing frames. Is there a way to make FFmpeg loop and wait for the next image, or can I somehow avoid having to call FFmpeg again for every frame?
Here is my function from my capturing script in Unity:
Process process;
//BinaryWriter _stdin;
public void encodeFrame()
{
    ProcessStartInfo info = new ProcessStartInfo();
    var basePath = Application.streamingAssetsPath + "/FFmpegOut/Windows/ffmpeg.exe";
    info.FileName = basePath; // run the ffmpeg binary shipped in StreamingAssets
    info.Arguments = "-re -i screen_%d.jpg -vcodec libx264 -r 24 -f mpegts udp://127.0.0.1:1100";
    info.RedirectStandardOutput = true;
    info.RedirectStandardInput = true;
    info.RedirectStandardError = true;
    info.CreateNoWindow = true;
    info.UseShellExecute = false;
    UnityEngine.Debug.Log(string.Format(
        "Executing \"{0}\" with arguments \"{1}\".\r\n",
        info.FileName,
        info.Arguments));
    process = Process.Start(info);
    //_stdin = new BinaryWriter(process.StandardInput.BaseStream);
    // Drain stderr before waiting, otherwise a full pipe buffer can block ffmpeg.
    string error = process.StandardError.ReadToEnd();
    process.WaitForExit();
    UnityEngine.Debug.Log(error);
}

And here is the function from the .cs file of my simple WinForms client application:
private void xxxFFplay()
{
    // Tell the sender to start streaming, over the pre-configured UDP socket.
    text = "start";
    byte[] send_buffer = Encoding.ASCII.GetBytes(text);
    sock.SendTo(send_buffer, endPoint);

    ffplay.StartInfo.FileName = "ffplay.exe";
    ffplay.StartInfo.Arguments = "udp://127.0.0.1:1100";
    ffplay.StartInfo.CreateNoWindow = true;
    ffplay.StartInfo.RedirectStandardOutput = true;
    ffplay.StartInfo.RedirectStandardError = true; // required for ErrorDataReceived to fire
    ffplay.StartInfo.UseShellExecute = false;
    ffplay.EnableRaisingEvents = true;
    ffplay.OutputDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
    ffplay.ErrorDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
    ffplay.Exited += (o, e) => Debug.WriteLine("Exited", "ffplay");
    ffplay.Start();
    ffplay.BeginOutputReadLine(); // start the async readers so the handlers above are called
    ffplay.BeginErrorReadLine();
    Thread.Sleep(500); // you need to wait/check the process started, then...

    // child, new parent
    // make 'this' the parent of ffplay (presuming you are in scope of a Form or Control)
    //SetParent(ffplay.MainWindowHandle, this.panel1.Handle);

    // window, x, y, width, height, repaint
    // move the ffplay window to the top-left corner and set the size to 320x280
    //MoveWindow(ffplay.MainWindowHandle, -5, -300, 320, 280, true);
}

Does anybody have some ideas? I am really stuck trying to create a somewhat "live" stream.
Best regards
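
One possible direction, sketched here under assumptions (the class, field and method names and the extra encoder flags are illustrative, not from the original project): instead of letting ffmpeg scan screen_%d.jpg on disk, Unity can start ffmpeg once with the image2pipe demuxer and write each encoded JPEG straight to its standard input. ffmpeg then blocks on the pipe until the next frame arrives, so it can never run ahead of the game and never quits early.

using System.Diagnostics;
using System.IO;
using UnityEngine;

public class PipedEncoder
{
    Process ffmpeg;       // hypothetical fields, mirroring the original script's "process"
    BinaryWriter stdin;

    public void StartPipedEncoder()
    {
        var info = new ProcessStartInfo();
        info.FileName = Application.streamingAssetsPath + "/FFmpegOut/Windows/ffmpeg.exe";
        // image2pipe + mjpeg: read JPEG after JPEG from stdin, pace them at 24 fps.
        info.Arguments = "-f image2pipe -vcodec mjpeg -framerate 24 -i - "
                       + "-vcodec libx264 -preset ultrafast -tune zerolatency -f mpegts udp://127.0.0.1:1100";
        info.UseShellExecute = false;
        info.RedirectStandardInput = true;
        info.CreateNoWindow = true;
        ffmpeg = Process.Start(info);
        stdin = new BinaryWriter(ffmpeg.StandardInput.BaseStream);
    }

    // Call once per captured frame with the JPEG bytes, instead of writing screen_%d.jpg to disk.
    public void PushFrame(byte[] jpegBytes)
    {
        stdin.Write(jpegBytes);
        stdin.Flush();
    }

    // Closing stdin makes ffmpeg flush its buffers and exit cleanly when capturing stops.
    public void StopPipedEncoder()
    {
        stdin.Close();
        ffmpeg.WaitForExit();
    }
}

In Unity the JPEG bytes can come from, for example, Texture2D.EncodeToJPG on the captured frame, so no temporary image files are needed at all.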
-
How to detect the real orientation of a video recorded by mobile with "auto rotate" disabled
28 October 2022, by FlamingMoe
When you record a video, the rotation metadata has 4 possible values (0, 90, 180 and 270), storing how the device was held when the recording started.

But mobile phones also have a feature to enable or disable screen "auto rotation".

Nowadays it is very common to have that feature disabled, and many people record videos holding the phone horizontally; yet because "auto rotation" is off, the metadata claims the video was recorded vertically.

How can this be handled?
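
As a starting point (only a sketch, assuming ffprobe is installed and on the PATH), the declared rotation and the coded dimensions can at least be read out and compared. Note that this reports only what the container claims, so when auto-rotate was off it reflects the locked orientation; recovering the "real" orientation would need content analysis that ffprobe cannot do, and newer FFmpeg builds may expose the rotation as display-matrix side data rather than a rotate tag.

using System;
using System.Diagnostics;

class RotationProbe
{
    // Usage (hypothetical helper): RotationProbe <video file>
    static void Main(string[] args)
    {
        var info = new ProcessStartInfo
        {
            FileName = "ffprobe",
            Arguments = "-v error -select_streams v:0 " +
                        "-show_entries stream=width,height:stream_tags=rotate " +
                        "-of default=noprint_wrappers=1 \"" + args[0] + "\"",
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        using (var probe = Process.Start(info))
        {
            // Prints e.g. width=1920, height=1080 and, if present, TAG:rotate=90.
            Console.Write(probe.StandardOutput.ReadToEnd());
            probe.WaitForExit();
        }
    }
}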


-
How to make ffmpeg transcode from MJPEG to H.264 in real time?
3 August 2020, by SolskGaer
I have an MJPEG stream that outputs pictures at 10 FPS (cellphone screenshots), and I use the following command to transcode it to an H.264 stream and play it on my laptop:


ffmpeg -f mjpeg -r 20 -y -i http://127.0.0.1:53293/ -vcodec libx264 -preset veryfast -profile:v baseline -b:v 1024k -r 10 -f h264 pipe:1 | ffplay -i pipe:0



but the output stream is a few seconds behind the cellphone screen. Here is the output of ffmpeg:


ffplay version 4.3.1 Copyright (c) 2003-2020 the FFmpeg developers
 built with Apple clang version 11.0.3 (clang-1103.0.32.62)
 configuration: --prefix=/usr/local/Cellar/ffmpeg/4.3.1 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
ffmpeg version 4.3.1 Copyright (c) 2000-2020 the FFmpeg developers
 built with Apple clang version 11.0.3 (clang-1103.0.32.62)
 configuration: --prefix=/usr/local/Cellar/ffmpeg/4.3.1 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
 libavutil 56. 51.100 / 56. 51.100
 libavcodec 58. 91.100 / 58. 91.100
 libavformat 58. 45.100 / 58. 45.100
 libavdevice 58. 10.100 / 58. 10.100
 libavfilter 7. 85.100 / 7. 85.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 7.100 / 5. 7.100
 libswresample 3. 7.100 / 3. 7.100
 libpostproc 55. 7.100 / 55. 7.100
 libavutil 56. 51.100 / 56. 51.100
 libavcodec 58. 91.100 / 58. 91.100
 libavformat 58. 45.100 / 58. 45.100
 libavdevice 58. 10.100 / 58. 10.100
 libavfilter 7. 85.100 / 7. 85.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 7.100 / 5. 7.100
 libswresample 3. 7.100 / 3. 7.100
 libpostproc 55. 7.100 / 55. 7.100
Input #0, mjpeg, from 'http://127.0.0.1:53293/':
 Duration: N/A, bitrate: N/A
 Stream #0:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 750x1334 [SAR 144:144 DAR 375:667], 20 tbr, 1200k tbn, 20 tbc
Stream mapping:
 Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x7f8f2900da00] using SAR=1/1
[libx264 @ 0x7f8f2900da00] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x7f8f2900da00] profile Constrained Baseline, level 3.2, 4:2:0, 8-bit
Output #0, h264, to 'pipe:1':
 Metadata:
 encoder : Lavf58.45.100
 Stream #0:0: Video: h264 (libx264), yuvj420p(pc), 750x1334 [SAR 144:144 DAR 375:667], q=-1--1, 1024 kb/s, 10 fps, 10 tbn, 10 tbc
 Metadata:
 encoder : Lavc58.91.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/1024000 buffer size: 0 vbv_delay: N/A
frame= 30 fps=5.2 q=29.0 size= 97kB time=00:00:00.10 bitrate=7911.0kbits/s dup=0 drop=26 speed=0.0172
frame= 33 fps=5.2 q=41.0 size= 190kB time=00:00:00.40 bitrate=3888.2kbits/s dup=0 drop=29 speed=0.0633
frame= 35 fps=5.1 q=40.0 size= 253kB time=00:00:00.60 bitrate=3457.8kbits/s dup=0 drop=31 speed=0.0874
frame= 38 fps=5.1 q=40.0 size= 330kB time=00:00:00.90 bitrate=3006.1kbits/s dup=0 drop=34 speed=0.122x
frame= 40 fps=5.1 q=40.0 size= 381kB time=00:00:01.10 bitrate=2838.3kbits/s dup=0 drop=36 speed=0.139x
frame= 43 fps=5.1 q=39.0 size= 460kB time=00:00:01.40 bitrate=2691.3kbits/s dup=0 drop=39 speed=0.166x
frame= 45 fps=5.0 q=39.0 size= 505kB time=00:00:01.60 bitrate=2585.9kbits/s dup=0 drop=41 speed=0.178x
frame= 48 fps=5.0 q=38.0 size= 552kB time=00:00:01.90 bitrate=2381.0kbits/s dup=0 drop=44 speed= 0.2x 
frame= 50 fps=5.0 q=33.0 size= 562kB time=00:00:02.10 bitrate=2190.7kbits/s dup=0 drop=46 speed=0.209x
frame= 53 fps=5.0 q=37.0 size= 572kB time=00:00:02.40 bitrate=1951.2kbits/s dup=0 drop=49 speed=0.227x
frame= 55 fps=5.0 q=37.0 size= 581kB time=00:00:02.60 bitrate=1830.0kbits/s dup=0 drop=51 speed=0.234x
frame= 58 fps=5.0 q=36.0 size= 591kB time=00:00:02.90 bitrate=1670.8kbits/s dup=0 drop=54 speed=0.25x 
frame= 60 fps=4.9 q=35.0 size= 598kB time=00:00:03.10 bitrate=1580.2kbits/s dup=0 drop=56 speed=0.255x
frame= 63 fps=5.0 q=35.0 size= 608kB time=00:00:03.40 bitrate=1465.2kbits/s dup=0 drop=59 speed=0.268x
frame= 65 fps=4.9 q=34.0 size= 613kB time=00:00:03.60 bitrate=1395.7kbits/s dup=0 drop=61 speed=0.273x
frame= 68 fps=5.0 q=34.0 size= 659kB time=00:00:03.90 bitrate=1384.5kbits/s dup=0 drop=64 speed=0.284x
frame= 70 fps=4.9 q=34.0 size= 682kB time=00:00:04.10 bitrate=1363.2kbits/s dup=0 drop=66 speed=0.288x
frame= 73 fps=4.9 q=35.0 size= 720kB time=00:00:04.40 bitrate=1340.1kbits/s dup=0 drop=69 speed=0.298x
frame= 75 fps=4.9 q=35.0 size= 746kB time=00:00:04.60 bitrate=1329.1kbits/s dup=0 drop=71 speed=0.301x
frame= 78 fps=4.9 q=36.0 size= 786kB time=00:00:04.90 bitrate=1314.6kbits/s dup=0 drop=74 speed=0.31x 
frame= 80 fps=4.9 q=36.0 size= 811kB time=00:00:05.10 bitrate=1303.3kbits/s dup=0 drop=76 speed=0.312x



I don't think my laptop's CPU is the bottleneck here; these are its specs:



 Model Name: MacBook Pro
 Model Identifier: MacBookPro16,1
 Processor Name: 6-Core Intel Core i7
 Processor Speed: 2.6 GHz
 Number of Processors: 1
 Total Number of Cores: 6
 L2 Cache (per Core): 256 KB
 L3 Cache: 12 MB
 Hyper-Threading Technology: Enabled
 Memory: 16 GB



but I can't figure out why this happens. What can I do to make the output stream real time? Any suggestion would be appreciated.
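
A sketch of one direction to try, under assumptions (the extra flag values are illustrative, not a verified fix): the original command reads the 10 fps source at a forced 20 fps and then drops frames again on output (visible in the drop= counter), libx264 is allowed to buffer for quality, and raw -f h264 output carries no timestamps for ffplay to pace against. Reading the input at its native rate, tuning libx264 for zero latency, muxing to MPEG-TS and shrinking probing and buffering on both ends should reduce the delay. Sketched below as a small C# wrapper that runs the adjusted ffmpeg | ffplay pipeline through the shell:

using System.Diagnostics;

class LowLatencyPipeline
{
    static void Main()
    {
        // Adjusted pipeline: read the MJPEG source at its native rate, encode with
        // zero-latency settings, mux to MPEG-TS, and keep buffering minimal on both sides.
        string cmd =
            "ffmpeg -fflags nobuffer -f mjpeg -i http://127.0.0.1:53293/ " +
            "-vcodec libx264 -preset ultrafast -tune zerolatency -profile:v baseline " +
            "-b:v 1024k -f mpegts pipe:1 " +
            "| ffplay -fflags nobuffer -flags low_delay -probesize 32 -i pipe:0";

        var shell = Process.Start(new ProcessStartInfo
        {
            FileName = "/bin/sh",              // the question's log comes from macOS
            Arguments = "-c \"" + cmd + "\"",
            UseShellExecute = false
        });
        shell.WaitForExit();
    }
}

If the delay is still large, playing the MJPEG HTTP stream directly in ffplay is a useful baseline to see how much of it the transcode itself adds.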