
Media (2)
-
Aber Ildut granite
9 September 2011
Updated: September 2011
Language: French
Type: Text
-
Geodiversity
9 September 2011
Updated: August 2018
Language: French
Type: Text
Other articles (75)
-
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administer" part of the site.
From there, in the navigation menu, you can reach a "Language management" section that lets you enable support for new languages.
Each newly added language can still be disabled as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)
-
Contributing to its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so it can reach new linguistic communities.
To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. Just sign up to the translators' mailing list to ask for more information.
Currently MediaSPIP is only available in French and (...)
-
Support for all media types
10 April 2011
Unlike many modern software packages and document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and more (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
On other sites (7312)
-
Convert H.264 Annex B to MPEG-TS
24 October 2019, by Lane
So... I have RAW H.264 video data captured via RTSP in a local file, and I am attempting to play the video back in a JavaFX application. In order to do this, I need to use HTTP Live Streaming.
I have successfully prototyped a JavaFX architecture that can play a video via HLS with a local server, using a local folder containing a .m3u8 (HLS index) file and a collection of .ts (MPEG-TS) files. The last piece for me is to replace the .ts files with .264 / .h264 files and, in the local server, perform the conversion / wrapping of the H.264 Annex B data into MPEG-TS.
I am having trouble figuring out what is required to get H.264 Annex B into MPEG-TS. I have found the following information...
"Annex B is commonly used in live and streaming formats such as
transport streams..."szatmary.org/blog/25
"Annex B of of the document specifies one such format, which wraps NAL
units in a format resembling a traditional MPEG video elementary
stream, thus making it suitable for use with containers like MPEG
PS/TS unable to provide the required framing..."wiki.multimedia.cx/ ?title=H.264
"Java FX supports a number of different media types. A media type is
considered to be the combination of a container format and one or more
encodings. In some cases the container format might simply be an
elementary stream containing the encoded data."docs.oracle.com/javafx/2/api/javafx/scene/media/package-summary.html
"Use the CODECS attribute of the EXT-X-STREAM-INF tag. When this
attribute is present, it must include all codecs and profiles required
to play back the stream..."developer.apple.com/library/ios/documentation/networkinginternet/conceptual/streamingmediaguide/FrequentlyAskedQuestions/FrequentlyAskedQuestions.html
It seems like I am missing something simple about elementary streams versus transport streams. I have used ffmpeg to convert my H.264 file into a TS file and tried to understand the differences. I have an idea of the approximate format differences, but I am still lacking the details to do it myself. Does anyone have a link showcasing this, or know something about how to serve H.264 Annex B data over MPEG-TS?
I am not looking to use a tool; I need a custom file format locally where I parse out the H.264 Annex B data and perform the format change in memory, on the fly. I know of a way to use ffmpeg with pipes to accomplish this, but I do not want to have any dependencies, and performance is important.
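For illustration, here is a hedged C# sketch of the two layers such a converter has to produce, under the assumption that one complete Annex B access unit is wrapped at a time: the data goes into a PES packet, which is then sliced into fixed 188-byte TS packets. The class, method names and PID are illustrative only, and the PAT/PMT tables plus PCR timestamps that a real player also needs are omitted.

// Hedged sketch: wrapping one H.264 Annex B access unit into MPEG-TS packets.
// PAT/PMT generation and PCR insertion (also required by real players) are
// omitted; the class name and PID are illustrative, not a known-good API.
using System;
using System.Collections.Generic;
using System.IO;

static class TsWrapSketch
{
    const int VideoPid = 0x100;   // arbitrary PID; must match the PID announced in the PMT
    static byte counter;          // 4-bit continuity counter for this PID

    // PES layer: 00 00 01 start code, stream_id 0xE0 (video), PTS, then the Annex B payload.
    public static byte[] BuildPes(byte[] annexB, long pts)
    {
        var ms = new MemoryStream();
        ms.Write(new byte[] { 0x00, 0x00, 0x01, 0xE0, 0x00, 0x00 }, 0, 6); // length 0 = unbounded (video only)
        ms.WriteByte(0x80);   // marker bits '10'
        ms.WriteByte(0x80);   // PTS present, no DTS
        ms.WriteByte(5);      // PES header data length
        // The 33-bit PTS is spread over 5 bytes with interleaved marker bits.
        ms.WriteByte((byte)(0x21 | ((pts >> 29) & 0x0E)));
        ms.WriteByte((byte)((pts >> 22) & 0xFF));
        ms.WriteByte((byte)(0x01 | ((pts >> 14) & 0xFE)));
        ms.WriteByte((byte)((pts >> 7) & 0xFF));
        ms.WriteByte((byte)(0x01 | ((pts << 1) & 0xFE)));
        ms.Write(annexB, 0, annexB.Length);
        return ms.ToArray();
    }

    // TS layer: slice the PES packet into 188-byte packets on VideoPid.
    public static IEnumerable<byte[]> Packetize(byte[] pes)
    {
        for (int off = 0; off < pes.Length; )
        {
            var pkt = new byte[188];
            pkt[0] = 0x47;                                               // sync byte
            pkt[1] = (byte)((off == 0 ? 0x40 : 0x00) | (VideoPid >> 8)); // PUSI set on the first packet
            pkt[2] = (byte)(VideoPid & 0xFF);
            int payload = Math.Min(184, pes.Length - off);
            if (payload < 184)
            {
                // Short last packet: pad with an adaptation field full of stuffing bytes.
                pkt[3] = (byte)(0x30 | (counter & 0x0F));                // adaptation field + payload
                int stuff = 184 - payload - 1;                           // 1 byte for the AF length itself
                pkt[4] = (byte)stuff;
                if (stuff > 0) { pkt[5] = 0x00; for (int i = 6; i < 188 - payload; i++) pkt[i] = 0xFF; }
                Array.Copy(pes, off, pkt, 188 - payload, payload);
            }
            else
            {
                pkt[3] = (byte)(0x10 | (counter & 0x0F));                // payload only
                Array.Copy(pes, off, pkt, 4, payload);
            }
            counter = (byte)((counter + 1) & 0x0F);
            off += payload;
            yield return pkt;
        }
    }
}

Most of the remaining work is timing: putting PTS values on the 90 kHz clock and re-emitting PAT/PMT at regular intervals so a player can join the stream midway.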
-
Evolution #4720: [css vars] Use our CSS variables in the private-area theme
27 April 2021
Ah yes, nice tricks based on --spip-is-ltr.
To add to the subject of positioning properties: in the future, to get full support whatever the direction (horizontal or vertical), the writing-mode will have to be defined: https://developer.mozilla.org/en-US/docs/Web/CSS/writing-mode
That value would therefore need to be derivable from the language code.
From what I understand, in its absence it falls back on the document's direction (dir="rtl"), so horizontal languages are fine.
In short, we got a bit of a head start and began using CSS variables right away.
The choice pretty much made itself: it simplifies the task enormously, especially in the absence of a preprocessor, and it makes all the components easier to unify and maintain.
However, before going further, it might be worth pausing to take stock, or even writing guidelines, so that these variables do not balloon out of control and are not used haphazardly all over the place.
The rule I have followed so far:
- Global variables --spip-xxx, general in scope and usable everywhere: colours, text properties, block border radii, gutters, etc.
- For each component, a few variables of its own: --composant-xxx. With a few exceptions, these should not be used outside the component. In general I try to limit their number, a dozen at most (except for buttons, a special case).
Finally, there are two quite important global variables that still need to be worked out: the ones defining the horizontal and vertical gutters. Important because, afterwards, each component will build on them for its own needs, and in the end all the spacing will be nicely harmonised and easy to control.
The value can be arbitrary, but in general the horizontal gutter is taken as the equivalent of one line height, i.e. font-size * line-height at the root of the document.
I added it in anticipation (--spip-spacing-y), except that for now it is not really usable: the font-size we receive in the env is not the one actually used on the body, so all the measurements are thrown off. The body's font-size gets redefined several times in a row; it's a mess. In short, more things still to sort out.
-
FFMPEG not enough data (x y), trying to decode anyway
7 June 2016, by Forest J. Handford
I'm trying to make videos of Direct3D games using a C# app. For non-Direct3D games I stream images from Graphics.CopyFromScreen, which works. When I copy the screen from Direct3D and stream it to FFMPEG I get:
[bmp @ 00000276b0b9c280] not enough data (5070 < 129654), trying to decode anyway
An MP4 file is created, but it is always 0 bytes.
To get screenshots from Direct3D, I am using Justin Stenning's Direct3DHook. This produces images MUCH bigger than the ones I get from Graphics.CopyFromScreen (8 MB vs 136 KB). I've tried increasing the buffer (-bufsize), but the number on the left of the error is not impacted.
I've tried resizing the image to 1/6th of the original. That reduces the number on the right but does not eliminate it. Even when the number on the right is close to what I get with Graphics.CopyFromScreen, the error remains. Here is a sample of the current code:
using System;
using System.Diagnostics;
using System.Threading;
using System.Drawing;
using Capture.Hook;
using Capture.Interface;
using Capture;
using System.IO;
namespace GameRecord
{
public class Video
{
private const int VID_FRAME_FPS = 8;
private const int SIZE_MODIFIER = 6;
private const double FRAMES_PER_MS = VID_FRAME_FPS * 0.001;
private const int SLEEP_INTERVAL = 2;
private const int CONSTANT_RATE_FACTOR = 18; // Lower crf = Higher Quality https://trac.ffmpeg.org/wiki/Encode/H.264
private Image image;
private Capture captureScreen;
private int processId = 0;
private Process process;
private CaptureProcess captureProcess;
private Process launchingFFMPEG;
private string arg;
private int frame = 0;
private Size? resize = null;
/// <summary>
/// Generates the Videos by gathering frames and processing via FFMPEG.
/// </summary>
public void RecordScreenTillGameEnd(string exe, OutputDirectory outputDirectory, CustomMessageBox alertBox, Thread workerThread)
{
AttachProcess(exe);
RequestD3DScreenShot();
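// Busy-wait until the first D3D screenshot callback has assigned 'image'.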
while (image == null) ;
Logger.log.Info("Launching FFMPEG ....");
resize = new Size(image.Width / SIZE_MODIFIER, image.Height / SIZE_MODIFIER);
// H.264 can let us do 8 FPS in high res . . . but must be licensed for commercial use.
arg = "-f image2pipe -framerate " + VID_FRAME_FPS + " -i pipe:.bmp -pix_fmt yuv420p -crf " +
CONSTANT_RATE_FACTOR + " -preset ultrafast -s " + resize.Value.Width + "x" +
resize.Value.Height + " -vcodec libx264 -bufsize 30000k -y \"" +
outputDirectory.pathToVideo + "\"";
launchingFFMPEG = new Process
{
StartInfo = new ProcessStartInfo
{
FileName = "ffmpeg",
Arguments = arg,
UseShellExecute = false,
CreateNoWindow = true,
RedirectStandardInput = true,
RedirectStandardError = true
}
};
launchingFFMPEG.Start();
Stopwatch stopWatch = Stopwatch.StartNew(); // creates and starts the Stopwatch instance
do
{
Thread.Sleep(SLEEP_INTERVAL);
} while (workerThread.IsAlive);
Logger.log.Info("Total frames: " + frame + " Expected frames: " + (ExpectedFrames(stopWatch.ElapsedMilliseconds) - 1));
launchingFFMPEG.StandardInput.Close();
#if DEBUG
string line;
while ((line = launchingFFMPEG.StandardError.ReadLine()) != null)
{
Logger.log.Debug(line);
}
#endif
launchingFFMPEG.Close();
alertBox.Show();
}
void RequestD3DScreenShot()
{
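// 'resize' is still null on the first call here; it is only assigned after the first frame arrives.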
captureProcess.CaptureInterface.BeginGetScreenshot(new Rectangle(0, 0, 0, 0), new TimeSpan(0, 0, 2), Callback, resize, (ImageFormat)Enum.Parse(typeof(ImageFormat), "Bitmap"));
}
private void AttachProcess(string exe)
{
Thread.Sleep(300);
Process[] processes = Process.GetProcessesByName(Path.GetFileNameWithoutExtension(exe));
foreach (Process currProcess in processes)
{
// Simply attach to the first one found.
// If the process doesn't have a mainwindowhandle yet, skip it (we need to be able to get the hwnd to set foreground etc)
if (currProcess.MainWindowHandle == IntPtr.Zero)
{
continue;
}
// Skip if the process is already hooked (and we want to hook multiple applications)
if (HookManager.IsHooked(currProcess.Id))
{
continue;
}
Direct3DVersion direct3DVersion = Direct3DVersion.AutoDetect;
CaptureConfig cc = new CaptureConfig()
{
Direct3DVersion = direct3DVersion,
ShowOverlay = false
};
processId = currProcess.Id;
process = currProcess;
var captureInterface = new CaptureInterface();
captureInterface.RemoteMessage += new MessageReceivedEvent(CaptureInterface_RemoteMessage);
captureProcess = new CaptureProcess(process, cc, captureInterface);
break;
}
Thread.Sleep(10);
if (captureProcess == null)
{
ShowUser.Exception("No executable found matching: '" + exe + "'");
}
}
/// <summary>
/// The callback for when the screenshot has been taken
/// </summary>
void Callback(IAsyncResult result)
{
using (Screenshot screenshot = captureProcess.CaptureInterface.EndGetScreenshot(result))
if (screenshot != null && screenshot.Data != null && arg != null)
{
if (image != null)
{
image.Dispose();
}
image = screenshot.ToBitmap();
// image.Save("D3DImageTest.bmp");
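// Stream the BMP (header + pixel data) into ffmpeg's stdin; -f image2pipe expects back-to-back BMP files.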
image.Save(launchingFFMPEG.StandardInput.BaseStream, System.Drawing.Imaging.ImageFormat.Bmp);
launchingFFMPEG.StandardInput.Flush();
frame++;
}
if (frame < 5)
{
Thread t = new Thread(new ThreadStart(RequestD3DScreenShot));
t.Start();
}
else
{
Logger.log.Info("Done getting shots from D3D.");
}
}
/// <summary>
/// Display messages from the target process
/// </summary>
private void CaptureInterface_RemoteMessage(MessageReceivedEventArgs message)
{
Logger.log.Info(message);
}
}
}
When I search the internet for the error, all I get is the FFMPEG source code, which has not proven to be illuminating. I have been able to save the image directly to disk, which makes me feel that it is not an issue with disposing the data. I have also tried grabbing only one frame, but that produces the same error, which suggests to me that it is not a threading issue.
Here is the full sample of stderr :
2016-06-02 18:29:38,046 === ffmpeg version N-79143-g8ff0f6a Copyright (c) 2000-2016 the FFmpeg developers
2016-06-02 18:29:38,047 === built with gcc 5.3.0 (GCC)
2016-06-02 18:29:38,048 === configuration: --enable-gpl
--enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
2016-06-02 18:29:38,062 === libavutil 55. 19.100 / 55. 19.100
2016-06-02 18:29:38,063 === libavcodec 57. 30.100 / 57. 30.100
2016-06-02 18:29:38,064 === libavformat 57. 29.101 / 57. 29.101
2016-06-02 18:29:38,064 === libavdevice 57. 0.101 / 57. 0.101
2016-06-02 18:29:38,065 === libavfilter 6. 40.102 / 6. 40.102
2016-06-02 18:29:38,066 === libswscale 4. 0.100 / 4. 0.100
2016-06-02 18:29:38,067 === libswresample 2. 0.101 / 2. 0.101
2016-06-02 18:29:38,068 === libpostproc 54. 0.100 / 54. 0.100
2016-06-02 18:29:38,068 === [bmp @ 000002cd7e5cc280] not enough data (13070 < 8294454), trying to decode anyway
2016-06-02 18:29:38,069 === [bmp @ 000002cd7e5cc280] not enough data (13016 < 8294400)
2016-06-02 18:29:38,069 === Input #0, image2pipe, from 'pipe:.bmp':
2016-06-02 18:29:38,262 === Duration: N/A, bitrate: N/A
2016-06-02 18:29:38,262 === Stream #0:0: Video: bmp, bgra, 1920x1080, 8 tbr, 8 tbn, 8 tbc
2016-06-02 18:29:38,263 === [libx264 @ 000002cd7e5d59a0] VBV bufsize set but maxrate unspecified, ignored
2016-06-02 18:29:38,264 === [libx264 @ 000002cd7e5d59a0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 AVX2 LZCNT BMI2
2016-06-02 18:29:38,265 === [libx264 @ 000002cd7e5d59a0] profile Constrained Baseline, level 1.1
2016-06-02 18:29:38,266 === [libx264 @ 000002cd7e5d59a0] 264 - core 148 r2665 a01e339 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=8 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=18.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
2016-06-02 18:29:38,463 === Output #0, mp4, to 'C:\Users\fores\AppData\Roaming\Affectiva\n_Artifacts_20160602_182857\GameplayVidOut.mp4':
2016-06-02 18:29:38,464 === Metadata:
2016-06-02 18:29:38,465 === encoder : Lavf57.29.101
2016-06-02 18:29:38,469 === Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 320x180, q=-1--1, 8 fps, 16384 tbn, 8 tbc
2016-06-02 18:29:38,470 === Metadata:
2016-06-02 18:29:38,472 === encoder : Lavc57.30.100 libx264
2016-06-02 18:29:38,474 === Side data:
2016-06-02 18:29:38,475 === cpb: bitrate max/min/avg: 0/0/0 buffer size: 30000000 vbv_delay: -1
2016-06-02 18:29:38,476 === Stream mapping:
2016-06-02 18:29:38,477 === Stream #0:0 -> #0:0 (bmp (native) -> h264 (libx264))
2016-06-02 18:29:38,480 === [bmp @ 000002cd7e5cc9a0] not enough data (13070 < 8294454), trying to decode anyway
2016-06-02 18:29:38,662 === [bmp @ 000002cd7e5cc9a0] not enough data (13016 < 8294400)
2016-06-02 18:29:38,662 === Error while decoding stream #0:0: Invalid data found when processing input
2016-06-02 18:29:38,663 === frame= 0 fps=0.0 q=0.0 Lsize= 0kB time=00:00:00.00 bitrate=N/A speed= 0x
2016-06-02 18:29:38,663 === video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
2016-06-02 18:29:38,664 === Conversion failed!
In memory, the current image is 320 pixels wide and 180 pixels tall. The pixel format is Format32bppRgb. The horizontal and vertical resolutions seem odd: they are both 96.01199. When saved to disk, here is the ffprobe output for the file:
ffprobe version N-79143-g8ff0f6a Copyright (c) 2007-2016 the FFmpeg developers
built with gcc 5.3.0 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib
libavutil 55. 19.100 / 55. 19.100
libavcodec 57. 30.100 / 57. 30.100
libavformat 57. 29.101 / 57. 29.101
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 40.102 / 6. 40.102
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Input #0, png_pipe, from 'C:\Users\fores\git\game-playtest-tool\GamePlayTest\bin\x64\Debug\D3DFromCapture.bmp':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: png, rgba(pc), 1920x1080 [SAR 3779:3779 DAR 16:9], 25 tbr, 25 tbn, 25 tbc
Here is a PNG version of an example screenshot from the current code (playing Portal 2):
Any ideas would be greatly appreciated. My current workaround is to save the files to the HDD and compile the video after gameplay, but it is a far less performant option. Thank you!
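A possible variation, assuming the frame geometry is fixed and known up front (an assumption, not something stated above), is to declare it to ffmpeg's rawvideo demuxer and pipe bare BGRA bytes; then there is no per-frame BMP header whose declared size can disagree with the bytes that actually arrive. A minimal sketch, with illustrative names and sizes and no error handling:

// Hedged sketch: pipe raw BGRA frames instead of BMP files, so ffmpeg never
// has to parse a BMP header. Width/Height/Fps are assumed values.
using System;
using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;

class RawPipeSketch
{
    const int Width = 1920, Height = 1080, Fps = 8;

    static Process StartFfmpeg(string outPath)
    {
        // rawvideo input: ffmpeg is told the exact frame geometry up front, so
        // each frame is just Width * Height * 4 bytes of BGRA, no container.
        string args =
            $"-f rawvideo -pixel_format bgra -video_size {Width}x{Height} " +
            $"-framerate {Fps} -i pipe:0 -pix_fmt yuv420p -vcodec libx264 " +
            $"-preset ultrafast -crf 18 -y \"{outPath}\"";
        var p = new Process
        {
            StartInfo = new ProcessStartInfo
            {
                FileName = "ffmpeg",
                Arguments = args,
                UseShellExecute = false,
                CreateNoWindow = true,
                RedirectStandardInput = true
            }
        };
        p.Start();
        return p;
    }

    static void WriteFrame(Bitmap frame, Process ffmpeg)
    {
        var rect = new Rectangle(0, 0, frame.Width, frame.Height);
        BitmapData data = frame.LockBits(rect, ImageLockMode.ReadOnly, PixelFormat.Format32bppArgb);
        try
        {
            var row = new byte[frame.Width * 4];
            for (int y = 0; y < frame.Height; y++)
            {
                // Copy row by row: Stride can be padded or negative, so never
                // assume the bitmap is one contiguous Width*Height*4 block.
                System.Runtime.InteropServices.Marshal.Copy(
                    data.Scan0 + y * data.Stride, row, 0, row.Length);
                ffmpeg.StandardInput.BaseStream.Write(row, 0, row.Length);
            }
            ffmpeg.StandardInput.BaseStream.Flush();
        }
        finally { frame.UnlockBits(data); }
    }
}

GDI+'s Format32bppArgb stores bytes in BGRA order on little-endian machines, which is why the demuxer is told -pixel_format bgra.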