
Media (1)
-
Carte de Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
Other articles (93)
-
Videos
21 April 2011. Like documents of the "audio" type, MediaSPIP displays videos wherever possible using the HTML5 <video> tag.
One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, to name one) and that each browser natively supports only certain video formats.
Its main advantage is that video playback is handled natively by the browser, which removes the need for Flash and (...) -
HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
Emballe médias: what is it for?
4 February 2011. This plugin is designed to manage sites for publishing documents of all types.
It creates "media" items, namely: a "media" is an article in the SPIP sense, created automatically when a document (audio, video, image or text) is uploaded; only one document can be linked to a given "media" article;
On other sites (9354)
-
Suppress black margins on the sides of an animation
26 April 2018, by Clinton Winant. I need to make an animation out of a collection of jpeg images. The image size, as given by display, is 1200x900. I can control the size of the jpg images with convert, but I am not sure what a good size would be. I use the following ffmpeg call:
ffmpeg -f image2 -i img%04d.jpg -r 24 sound.avi
In spite of a long string of warnings like
Past duration 0.879997 too large
sound.avi is produced; however, the animation includes black right and left margins (see the attached screenshot of the first frame)
that I need to suppress. I am under the impression that the 4:3 format of the jpg images is standard? I view the animation with
mplayer sound.avi
The OS is Debian buster
Further experiments suggest that the black margins disappear if the jpg files have an aspect ratio of 16:9. Is that the only AR possible?
The output of
ffmpeg -f image2 -i img%04d.jpg -vf cropdetect -vframes 5 -f null
requested by @Gyan is
ffmpeg version 3.4.2-2 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 7 (Debian 7.3.0-15)
configuration: --prefix=/usr --extra-version=2 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libavresample 3. 7. 0 / 3. 7. 0
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
Trailing options were found on the commandline.
Input #0, image2, from 'img%04d.jpg':
Duration: 00:00:00.40, start: 0.000000, bitrate: N/A
Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1200x900 [SAR 150:150 DAR 4:3], 25 fps, 25 tbr, 25 tbn, 25 tbc
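For reference, the usual way to act on a cropdetect result is to feed the suggested parameters back into the crop filter. Below is a minimal Python sketch of that round trip (not part of the question), assuming ffmpeg is on the PATH and reusing the img%04d.jpg pattern and 24 fps from the question; it only helps if the black bars are actually baked into the JPEG frames.

import re
import subprocess

# Probe a few frames with cropdetect; ffmpeg writes its log, including the
# suggested crop parameters, to stderr.
probe = subprocess.run(
    ["ffmpeg", "-f", "image2", "-i", "img%04d.jpg",
     "-vf", "cropdetect", "-vframes", "5", "-f", "null", "-"],
    stderr=subprocess.PIPE, text=True)

# cropdetect prints suggestions such as crop=1184:896:8:2; keep the last one.
matches = re.findall(r"crop=\d+:\d+:\d+:\d+", probe.stderr)
if matches:
    subprocess.run(
        ["ffmpeg", "-y", "-framerate", "24", "-f", "image2", "-i", "img%04d.jpg",
         "-vf", matches[-1], "sound.avi"],
        check=True)

If cropdetect reports the full 1200x900 frame, the bars are not in the source images at all, and the pillarboxing is more likely added at playback time when a 4:3 video is scaled onto a 16:9 display.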
-
Black video when using ffplay of received data over udp [closed]
4 February 2023, by Alexandru-Marian Buza. I'm encoding frames from an ID3D11Texture2D, and the received packet size is around 70-80.
When I send it over UDP and use ffplay to show the video, I get a black screen.
The resolution and fps are correctly detected by ffplay.


AcquiredBuffer.Attach(Buffer.MetaData.pSurface);

// Create a CPU-readable staging copy of the captured texture.
ID3D11Texture2D* texture;
ID3D11Texture2D* stagingTexture;
AcquiredBuffer->QueryInterface(IID_PPV_ARGS(&texture));

texture->GetDesc(&desc);
desc.Usage = D3D11_USAGE_STAGING;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
desc.BindFlags = 0;
desc.MiscFlags = 0;
hr = m_Device->Device->CreateTexture2D(&desc, NULL, &stagingTexture);
if (stagingTexture != 0) {
    m_Device->DeviceContext->CopyResource(stagingTexture, texture);
    D3D11_MAPPED_SUBRESOURCE mappedData;
    m_Device->DeviceContext->Map(stagingTexture, 0, D3D11_MAP_READ, 0, &mappedData);

    // Copy the mapped pixels into the first plane of the AVFrame and encode it.
    memcpy(frame->data[0], mappedData.pData, mappedData.DepthPitch);
    auto ret = avcodec_send_frame(codec, frame);
    if (ret < 0) {
        writeText("Send frame failed");
    }
    else {
        while (ret >= 0) {
            ret = avcodec_receive_packet(codec, packet);
            for (int i = 0; i < packet->size; i++) {
                if (packet->data[i] != 0) {
                }
            }
            if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
                break;
            }
            else if (ret < 0) {
                writeText("Error during encoding");
            }
            // Send the encoded packet over UDP.
            if (sendto(s, (char*)packet->data, packet->size * sizeof(uint8_t), 0, (sockaddr*)&dest, sizeof(dest)) == SOCKET_ERROR) {
                writeText("Failed to send frame");
            }
            av_packet_unref(packet);
        }
    }

    m_Device->DeviceContext->Unmap(stagingTexture, 0);
    stagingTexture->Release();
}
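One way to narrow down where the black screen comes from (a sketch, not part of the question): capture the UDP payload on the receiving side with a small Python script and write it to a file, then try ffplay -f h264 dump.h264 on the dump. If the dump plays, the encoded bitstream is fine and the problem lies in how ffplay is fed; if it does not, the frames going into the encoder are suspect. The port number and a raw Annex-B H.264 payload are assumptions.

import socket

# Collect the raw encoded packets that the C++ code pushes out with sendto().
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9999))        # assumed port; use whatever dest points at

with open("dump.h264", "wb") as f:
    for _ in range(2000):            # a couple of thousand datagrams is plenty
        data, _addr = sock.recvfrom(65535)
        f.write(data)

# Afterwards: ffplay -f h264 dump.h264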



-
opencv3.1 python VideoCapture black screen
3 May 2017, by alex. I'm trying to use opencv 3.1 with python 3.5,
following this official tutorial:
http://docs.opencv.org/3.0-beta/doc/py_tutorials/py_gui/py_video_display/py_video_display.html#display-video
The camera LED is "on" but the window doesn't show any image (black).
My current environment:
windows 10
python 3.5.2 (32bit)
numpy 1.12.0b1 (32bit), binaries downloaded from: http://www.lfd.uci.edu/~gohlke/pythonlibs/
opencv 3.1.0 (32bit), binaries downloaded from: http://www.lfd.uci.edu/~gohlke/pythonlibs/
They seem installed correctly!
As the tutorial says:
Note: Make sure proper versions of ffmpeg or gstreamer is installed. Sometimes it is a headache to work with Video Capture, mostly due to wrong installation of ffmpeg/gstreamer.
I have no ffmpeg installed!!! So I suppose this is the cause of my problem.
So the questions are:
1) Which version of ffmpeg is needed?
2) How do I install ffmpeg on Windows? It seems an installer doesn't exist (there is a binary section with a zip file? How do I use it after unzipping?)
3) Is it possible to see (in some way) whether the opencv binaries (that I've downloaded) were compiled with ffmpeg support (a flag?)
PS: I've tried to add the ffmpeg folder path to the PATH environment variable but nothing changed.
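For what it is worth, a minimal Python sketch (not part of the question) that helps separate the two failure modes: cv2.getBuildInformation() shows, in its Video I/O section, whether the downloaded binaries were built with FFMPEG, and checking isOpened()/read() tells you whether frames are actually being delivered. The device index 0 and the window name are assumptions.

import cv2

# The "Video I/O" section of the build information lists FFMPEG / GStreamer support.
print(cv2.getBuildInformation())

cap = cv2.VideoCapture(0)           # assumed: the webcam is device 0
if not cap.isOpened():
    raise RuntimeError("VideoCapture could not open device 0")

while True:
    ok, frame = cap.read()
    if not ok:
        # No frame delivered: a backend/capture problem rather than a display problem.
        print("read() returned no frame")
        break
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()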