
Media (91)
-
Spitfire Parade - Crisis
15 May 2011
Updated: September 2011
Language: English
Type: Audio
-
Wired NextMusic
14 May 2011
Updated: February 2012
Language: English
Type: Video
-
Portrait video of a bee
14 May 2011
Updated: February 2012
Language: French
Type: Video
-
Sintel MP4 Surround 5.1 Full
13 May 2011
Updated: February 2012
Language: English
Type: Video
-
Map of Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (63)
-
Sites built with MediaSPIP
2 May 2011
This page presents a few of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.
-
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administer" area of the site.
From there, in the navigation menu, you can reach a "Language management" section that lets you enable support for additional languages.
Each newly added language can still be disabled as long as no object has been created in that language; once one has, it becomes greyed out in the configuration and (...)
On other sites (6224)
-
lavc/rawdec: Use AV_PIX_FMT_PAL8 for raw 1 bpp video in AVI
28 January 2016, by Mats Peterson
lavc/rawdec: Use AV_PIX_FMT_PAL8 for raw 1 bpp video in AVI
From https://msdn.microsoft.com/en-us/library/windows/desktop/dd318229%28v=vs.85%29.aspx:
"If biCompression equals BI_RGB and the bitmap uses 8 bpp or less, the bitmap has a color table immediately following the BITMAPINFOHEADER structure. The color table consists of an array of RGBQUAD values. The size of the array is given by the biClrUsed member. If biClrUsed is zero, the array contains the maximum number of colors for the given bit depth; that is, 2^biBitCount colors."
Nothing about "monochrome" here. Unfortunately, pal8 to monow conversion seems a bit flaky, but that's another story.
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
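For illustration only (this helper is hypothetical, not part of the commit), the quoted rule reduces to a small calculation once biClrUsed and biBitCount have been read from the BITMAPINFOHEADER; a minimal C# sketch:

// Hypothetical helper illustrating the MSDN rule quoted above: the RGBQUAD
// color table following a BI_RGB BITMAPINFOHEADER has biClrUsed entries,
// or 2^biBitCount entries when biClrUsed is zero (8 bpp or less only).
static int PaletteEntryCount(uint biClrUsed, ushort biBitCount)
{
    if (biBitCount > 8)
        return 0;                  // no color table for BI_RGB above 8 bpp
    if (biClrUsed != 0)
        return (int)biClrUsed;     // size given explicitly by the header
    return 1 << biBitCount;        // 1 bpp -> 2, 4 bpp -> 16, 8 bpp -> 256
}

For the raw 1 bpp AVI video this patch targets, that gives a two-entry palette, which is why treating it as PAL8 rather than "monochrome" matches the specification.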
-
.NET Process - Redirect stdin and stdout without causing deadlock
21 July 2017, by user1150856
I'm trying to encode a video file with FFmpeg from frames generated by my program, and then redirect the output of FFmpeg back to my program to avoid having an intermediate video file.
However, I've run into what seems to be a rather common problem when redirecting output with System.Diagnostics.Process, mentioned in the remarks of the documentation here: it causes a deadlock if run synchronously.
After tearing my hair out over this for a day, and trying several proposed solutions found online, I still cannot find a way to make it work. I get some data out, but the process always freezes before it finishes.
Here is a code snippet that reproduces the problem:
static void Main(string[] args)
{
    Process proc = new Process();
    proc.StartInfo.FileName = @"ffmpeg.exe";
    proc.StartInfo.Arguments = String.Format(
        "-f rawvideo -vcodec rawvideo -s {0}x{1} -pix_fmt rgb24 -r {2} -i - -an -codec:v libx264 -preset veryfast -f mp4 -movflags frag_keyframe+empty_moov -",
        16, 9, 30);
    proc.StartInfo.UseShellExecute = false;
    proc.StartInfo.RedirectStandardInput = true;
    proc.StartInfo.RedirectStandardOutput = true;

    FileStream fs = new FileStream(@"out.mp4", FileMode.Create, FileAccess.Write);

    // read output asynchronously
    using (AutoResetEvent outputWaitHandle = new AutoResetEvent(false))
    {
        proc.OutputDataReceived += (sender, e) =>
        {
            if (e.Data == null)
            {
                outputWaitHandle.Set();
            }
            else
            {
                string str = e.Data;
                byte[] bytes = new byte[str.Length * sizeof(char)];
                System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
                fs.Write(bytes, 0, bytes.Length);
            }
        };
    }

    proc.Start();
    proc.BeginOutputReadLine();

    // Generate frames and write to stdin
    for (int i = 0; i < 30 * 60 * 60; i++)
    {
        byte[] myArray = Enumerable.Repeat((byte)Math.Min(i, 255), 9 * 16 * 3).ToArray();
        proc.StandardInput.BaseStream.Write(myArray, 0, myArray.Length);
    }

    proc.WaitForExit();
    fs.Close();
    Console.WriteLine("Done!");
    Console.ReadKey();
}
Currently I'm trying to write the output to a file anyway for debugging purposes, but this is not how the data will eventually be used.
If anyone knows a solution it would be very much appreciated.
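For reference, a common way around this kind of deadlock (a minimal, untested sketch; the ffmpeg arguments and frame loop are taken from the snippet above, everything else is an assumption) is to drop OutputDataReceived entirely, copy StandardOutput.BaseStream as raw bytes on a background task, and close standard input once all frames have been written:

using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

class Program
{
    static void Main(string[] args)
    {
        Process proc = new Process();
        proc.StartInfo.FileName = @"ffmpeg.exe";
        proc.StartInfo.Arguments =
            "-f rawvideo -vcodec rawvideo -s 16x9 -pix_fmt rgb24 -r 30 -i - " +
            "-an -codec:v libx264 -preset veryfast -f mp4 -movflags frag_keyframe+empty_moov -";
        proc.StartInfo.UseShellExecute = false;
        proc.StartInfo.RedirectStandardInput = true;
        proc.StartInfo.RedirectStandardOutput = true;
        proc.Start();

        // Drain stdout concurrently as binary data so the pipe never fills up
        // while this thread is blocked writing frames to stdin. MP4 output is
        // not text, so it should not go through OutputDataReceived/strings.
        FileStream fs = new FileStream(@"out.mp4", FileMode.Create, FileAccess.Write);
        Task pump = proc.StandardOutput.BaseStream.CopyToAsync(fs);

        // Feed raw RGB24 frames to stdin.
        for (int i = 0; i < 30 * 60 * 60; i++)
        {
            byte[] frame = Enumerable.Repeat((byte)Math.Min(i, 255), 9 * 16 * 3).ToArray();
            proc.StandardInput.BaseStream.Write(frame, 0, frame.Length);
        }
        proc.StandardInput.Close();   // EOF lets ffmpeg finish writing the MP4

        proc.WaitForExit();
        pump.Wait();
        fs.Close();
        Console.WriteLine("Done!");
    }
}

The key point is that the two pipes are serviced concurrently: one thread writes frames in while a background copy pulls encoded bytes out, so neither side can block the other indefinitely.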
-
Interesting behavior in Media Source Extensions
28 May 2020, by newtonian_fig
I'm trying to build a fairly standard video player using Media Source Extensions; however, I want the user to be able to control when the player moves on to a new video segment. For example, we might see the following behavior:
- Video player plays 1st segment
- Source Buffer runs out of data causing the video to appear paused
- When the user is ready, they click a button that adds the 2nd segment to the Source Buffer
- The video continues by playing the 2nd segment
This works well, except that when the video appears paused during step 2 it doesn't stop at the last frame of the 1st segment. Instead, it stops two frames before the end of the 1st segment. Those last two frames aren't being dropped; they just get played after the user clicks the button to advance the video. This is an issue for my application, and I'm trying to figure out a way to make sure all of the frames from the 1st segment get played before the end of step 2.
I suspect that these last two frames are getting held up in the video decoder buffer, especially since calling endOfStream() on my Media Source after adding the 1st segment to the Source Buffer causes the 1st segment to play all the way through with no frames left behind.
Additional Info
- I created each video segment file from a series of PNGs using the following ffmpeg command:
  ffmpeg -i %04d.png -movflags frag_keyframe+empty_moov+default_base_moof video_segment.mp4
- Maybe this is a clue? End of stream situations not handled correctly (last frames are dropped)
- Another interesting thing to note is that if the video has only two frames or fewer, MSE doesn't play it at all.
- The browser I'm using is Chrome. The code for my MSE player is just taken from the Google Developers example, but I'll post it here for completeness. This code only covers up to step 2, since that's where the issue is.
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', sourceOpen, { once: true });

function sourceOpen() {
  URL.revokeObjectURL(video.src);
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
  sourceBuffer.mode = 'sequence';

  // Fetch the video and add it to the Source Buffer
  fetch('https://s3.amazonaws.com/bucket_name/video_file.mp4')
    .then(response => response.arrayBuffer())
    .then(data => sourceBuffer.appendBuffer(data));
}