
Media (1)
-
Sintel MP4 Surround 5.1 Full
13 May 2011
Updated: February 2012
Language: English
Type: Video
Other articles (97)
-
Les sons
15 May 2013 -
Automated installation script of MediaSPIP
25 April 2011 - To overcome the difficulties, mainly due to installing server-side software dependencies, an "all-in-one" installation script written in Bash was created to ease this step on a server running a compatible Linux distribution.
You must have SSH access to your server and a root account to use it; the script will install the dependencies. Contact your provider if you do not have these.
Documentation on using this installation script is available here.
The code of this (...) -
MediaSPIP in private mode (Intranet)
17 September 2013 - As of version 0.3, a MediaSPIP channel can be made private, blocked to anyone not logged in, thanks to the "Intranet/extranet" plugin.
When enabled, the Intranet/extranet plugin blocks access to the channel for any unidentified visitor, preventing access to the content by systematically redirecting them to the login form.
This system can be particularly useful in certain cases, such as: a workshop with children whose content must not (...)
On other sites (9385)
-
Making a movie out of pictures in correct order
6 November 2022, by astrogab

Short version


How can one combine the files img1000.png, img5000.png, img10000.png, and img11000.png in the right order into a movie?

Longer version


I am using ffmpeg to make a movie out of snapshots of a simulation, at, say, 5 images per second. The file names are:


image0200.png
image0300.png
image0400.png
image0500.png
image1000.png
image1500.png
image2000.png
...
image8500.png
image9000.png
image9500.png
image10000.png
image15000.png



i.e., they are sequential, but there are irregular gaps in the numbers. The numbers are formatted according to '%04d' but go above 9999. I have tried

ffmpeg -y -loglevel debug -nostats \
-r:v 5 -thread_queue_size 1024 -f image2 \
-pattern_type glob -i "*[0-9][0-9][0-9][0-9].png" \
-r:v 30 -preset veryslow -pix_fmt yuv420p -crf 28 \
-an AMDG.mp4



and many, many other variations, but either only two frames end up being visible in the movie (even though all the images are found when using -loglevel debug), or only the files up to image9500.png are used (and glob does not seem to allow [0-9]{4,} as in a regex), or, with

ffmpeg -y -loglevel debug -nostats \
 -r:v 5 \
 -thread_queue_size 1024 -f image2 -pattern_type glob \
 -i "image[0-9][0-9][0-9][0-9].png" \
 -r:v 5 \
 -thread_queue_size 1024 -f image2 -pattern_type glob \
 -i "image[1-9][0-9][0-9][0-9][0-9].png" \
 -r:v 30 -preset veryslow -pix_fmt yuv420p -crf 28 \
 -map 0 -map 1 \
 -an AMDG.mp4



there are apparently two streams in the output movie and only one of them is played. (I realised in the process that -map 0 -map 1 was needed for both input streams to be used.)

In one of the variations of options I tried (I have now lost what it was exactly!) all images were included, but the order was not the desired one: image1000.png was shown before image10000.png. Apparently a newer version of ffmpeg (I have ffmpeg version 3.4.8-0ubuntu0.2) can sort like sort -V, so that image10000 comes after image1000, but reinstalling ffmpeg is in general not a practical option. Renaming the files is not practical either, and creating e.g. soft links with sequential names in the format '%05d', starting at 0 in steps of 1 (so that -i '%05d' could be used), is of course not elegant.
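That said, the soft-link workaround can be scripted in a few lines; a minimal sketch, using dummy image names standing in for the real snapshots and sort -V for the natural ordering:

```shell
# Sketch of the symlink workaround: link each image, in natural (version) order,
# to a gap-free sequential name that a plain -i pattern can consume.
# The image names below are dummies standing in for the real snapshots.
mkdir -p frames
touch frames/image0200.png frames/image1000.png frames/image9500.png frames/image10000.png
i=0
for f in $(ls frames/image*.png | sort -V); do
  ln -sf "$(basename "$f")" "$(printf 'frames/frame%05d.png' "$i")"
  i=$((i + 1))
done
# The links are now frame00000.png ... frame00003.png in the intended order, so:
# ffmpeg -framerate 5 -i frames/frame%05d.png -r 30 -pix_fmt yuv420p -crf 28 -an AMDG.mp4
```

Inelegant, yes, but the links cost next to nothing and leave the original files untouched.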

With the concat demuxer (-f concat) as in https://unix.stackexchange.com/questions/77016/ffmpeg-pattern-type-glob-not-loading-files-in-correct-order, i.e.,

ffmpeg -y -loglevel debug -nostats -r:v 5 \
 -thread_queue_size 1024 -f image2 -f concat \
 -safe 0 -i <(find . -maxdepth 1 -regex 'image*.png' \
 -exec echo "file $(pwd)/"{} \; | sort -V) \
 -r:v 30 -codec:v libx264 -preset veryslow -pix_fmt yuv420p -crf 28 \
 -an \
 AMDG.mp4



the processing took a long time and made the whole system sluggish, while producing a 60 kB movie showing only two different images.
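For what it's worth, writing the sorted list to an actual file (rather than process substitution) makes the concat input easy to inspect before encoding; a sketch of that variant, again with dummy file names:

```shell
# Sketch: build the concat-demuxer list explicitly, ordered with sort -V,
# so the frame order no longer depends on glob expansion. Dummy names below.
printf '%s\n' image10000.png image1000.png image0200.png image9500.png \
  | sort -V \
  | sed "s|.*|file '$PWD/&'|" > framelist.txt
cat framelist.txt
# then:
# ffmpeg -y -r:v 5 -f concat -safe 0 -i framelist.txt \
#   -r:v 30 -c:v libx264 -preset veryslow -pix_fmt yuv420p -crf 28 -an AMDG.mp4
```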


I have the impression that there are several issues at once... Thanks if you can help !


-
CJEU rules US cloud servers don’t comply with GDPR and what this means for web analytics
17 July 2020, by Jake Thornton -
Sending Pictures over Network with Unity C#
30 September 2020, by TobiGG B. - I am currently working with the HoloLens, and thus Unity. The idea is to livestream the webcam with as little delay as possible to an instance of FFmpeg on a different device for encoding. However, I have run into some issues.


My first problem is the conversion from a Color32 array to a byte array. It creates a huge amount of bandwidth, especially when I am trying to record and stream at 30 fps in 1280x720 resolution.
Here is some code:


IEnumerator Init()
 {
 webCam = new WebCamTexture(1280, 720, 30);
 webCam.Play();
 image.texture = webCam;
 Debug.Log("WebCam Width: " + webCam.width);
 Debug.Log("WebCam Height: " + webCam.height);
 currentTexture = new Texture2D(webCam.width, webCam.height);
 
 data = new Color32[webCam.width * webCam.height];
 listener = new TcpListener(IPAddress.Any, 10305);
 listener.Start();



 while (webCam.width < 100)
 {
 
 yield return null;
 }

 StartCoroutine("Recording");
 }

 WaitForEndOfFrame endOfFrame = new WaitForEndOfFrame();

 IEnumerator Recording()
 {

 TcpClient tcpClient = null;
 NetworkStream tcpStream = null;
 bool isConnected = false;

 Loom.RunAsync(() =>
 {
 while(!stopCapture)
 {
 tcpClient = listener.AcceptTcpClient();
 Debug.Log("Client Connected!");
 isConnected = true;
 tcpStream = tcpClient.GetStream();
 }
 });
 while(!isConnected)
 {
 yield return endOfFrame;
 }
 readyToGetFrame = true;
 byte[] messageLength = new byte[SEND_RECEIVE_COUNT];
 while(!stopCapture)
 {
 yield return endOfFrame;

 webCam.GetPixels32(data);
 byte[] webCamBytes = Utilities.Color32ArrayToByteArray(data);
 
 readyToGetFrame = false;
 Loom.RunAsync(() => {
 tcpStream.Write(webCamBytes, 0, webCamBytes.Length);
 readyToGetFrame = true;
 });

 while (!readyToGetFrame)
 {
 yield return endOfFrame;
 }
 }
 }



public static class Utilities
{
 public static byte[] Color32ArrayToByteArray(Color32[] colors)
 {
 if (colors == null || colors.Length == 0)
 {
 return null;
 }

 int lengthOfColor32 = Marshal.SizeOf(typeof(Color32));
 int byteArrayLength = lengthOfColor32 * colors.Length;
 byte[] bytes = new byte[byteArrayLength];

 GCHandle handle = default(GCHandle);

 try
 {
 handle = GCHandle.Alloc(colors, GCHandleType.Pinned);
 IntPtr ptr = handle.AddrOfPinnedObject();
 Marshal.Copy(ptr, bytes, 0, byteArrayLength);
 }
 finally
 {
 if(handle != default(GCHandle))
 {
 handle.Free();
 }
 }

 return bytes;
 }
}



To fix this problem, I tried to use EncodeToPNG and EncodeToJPG instead. But there lies my second problem: with these methods I get a big loss of performance.


So instead of this


webCam.GetPixels32(data);
 byte[] webCamBytes = Utilities.Color32ArrayToByteArray(data);



I use this


currentTexture.SetPixels(webCam.GetPixels());
 byte[] webCamBytes = currentTexture.EncodeToJPG();



Thus my question:
Is there any way to reduce the bandwidth, maybe by cutting off the alpha values, as they are not needed? Or is there another good way, such as converting to a different colour or picture format?


Because right now I have run out of usable ideas and feel kind of stuck. Might just be me, though ;-)


PS: Using external libraries is a tad difficult, as the code needs to run on the HoloLens, and due to other regulations.
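On the alpha question specifically: since Color32 stores the bytes r, g, b, a, the copy helper can pack only the first three, cutting the payload by 25% before any compression. An untested sketch of such a variant (the receiving side then has to treat the stream as rgb24, e.g. ffmpeg's -f rawvideo -pixel_format rgb24 -video_size 1280x720):

```csharp
using UnityEngine;

public static class UtilitiesRgb24
{
    // Sketch: copy only R, G, B from each Color32, dropping the alpha byte.
    // A 1280x720 frame shrinks from ~3.7 MB (RGBA) to ~2.8 MB (RGB).
    public static byte[] Color32ArrayToRgb24(Color32[] colors)
    {
        if (colors == null || colors.Length == 0)
            return null;

        byte[] bytes = new byte[colors.Length * 3];
        for (int i = 0, j = 0; i < colors.Length; i++)
        {
            bytes[j++] = colors[i].r;
            bytes[j++] = colors[i].g;
            bytes[j++] = colors[i].b;
        }
        return bytes;
    }
}
```

This stays in plain C# (no external library), though a managed per-pixel loop is slower than the pinned Marshal.Copy, so whether the bandwidth saving outweighs the CPU cost would have to be measured on the HoloLens itself.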


Thank you in advance !