
Media (1)
-
La conservation du net art au musée. Les stratégies à l'œuvre
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (97)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...) -
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
Once it is activated, a preconfiguration is set up automatically by MediaSPIP init so that the new feature is immediately operational. No configuration step is therefore required for this. -
APPENDIX: The plugins used specifically for the farm
5 March 2010
The central/master site of the farm needs several additional plugins, beyond those used by the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared instance at user sign-up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (11406)
-
Encoding of D3D11Texture2D to an rtsp stream using libav*
1 December 2020, by uzer
Firstly, I'll declare that I am just beginning with the whole libav* ecosystem and have no experience with DirectX, so please go easy on me.


I have managed to create an RTSP stream with libav*, using a video file as the source. Now I am trying to create an RTSP stream from an ID3D11Texture2D, which I obtain from the GDI API using the BitBlt method. Here is my approach for creating the live RTSP stream (an illustrative sketch of such a remuxing loop follows the list):


- Set input context
  - AVFormatContext* ifmt_ctx = avformat_alloc_context();
  - avformat_open_input(&ifmt_ctx, _videoFileName, 0, 0);
- Set output context
  - avformat_alloc_output_context2(&ofmt_ctx, NULL, "rtsp", _rtspServerAdress); //RTSP
  - copy all the codec contexts and streams from input to output
- Start streaming
  - while av_read_frame(ifmt_ctx, &pkt) is valid, call av_interleaved_write_frame(ofmt_ctx, &pkt);
  - with some timestamp checks and conditions for live streaming
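For reference, a minimal sketch of what such a remuxing loop typically looks like; this is illustrative rather than the poster's code, and it assumes ifmt_ctx and ofmt_ctx are opened as in the steps above and that avformat_write_header(ofmt_ctx, ...) has already been called:

extern "C" {
#include <libavformat/avformat.h>
}

// Illustrative remuxing loop: forward packets from the input to the RTSP output.
AVPacket* pkt = av_packet_alloc();
while (av_read_frame(ifmt_ctx, pkt) >= 0)
{
    AVStream* in_stream  = ifmt_ctx->streams[pkt->stream_index];
    AVStream* out_stream = ofmt_ctx->streams[pkt->stream_index];

    // Rescale packet timestamps from the input to the output time base.
    av_packet_rescale_ts(pkt, in_stream->time_base, out_stream->time_base);
    pkt->pos = -1;

    // av_interleaved_write_frame() takes ownership of the packet reference.
    if (av_interleaved_write_frame(ofmt_ctx, pkt) < 0)
        break;
}
av_packet_free(&pkt);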
Now I am finding it difficult to follow the current libav* documentation (much of which refers to deprecated APIs), and there is little tutorial content available online.


The most relevant article I found on working between DirectX and libav* is this article.
However, it actually does the opposite of what I need. I am not sure how to go about creating an input stream and context from a DirectX texture! How can I convert the texture into an AVFrame that can be encoded to an AVStream?


Here's a rough outline of what I am expecting:


ID3D11Texture2D* win_textureptr = WindowManager::Get().GetWindow(RtspStreaming::WindowId())->GetWindowTexture();

D3D11_TEXTURE2D_DESC desc;
win_textureptr->GetDesc(&desc);
int width = desc.Width;
int height = desc.Height;
//double audio_time = 0.0;
auto start_time = std::chrono::system_clock::now();
std::chrono::duration<double> video_time;

//DirectX BGRA to h264 YUV420p
SwsContext* conversion_ctx = sws_getContext(
    width, height, AV_PIX_FMT_BGRA,
    width, height, AV_PIX_FMT_YUV420P,
    SWS_BICUBLIN | SWS_BITEXACT, nullptr, nullptr, nullptr);

uint8_t* sw_data[AV_NUM_DATA_POINTERS];
int sw_linesize[AV_NUM_DATA_POINTERS];

while (RtspStreaming::IsStreaming())
{
    //copy the texture
    //win_textureptr->GetPrivateData();

    // convert BGRA to yuv420 pixel format
    /*
    frame = av_frame_alloc();
    //this obviously is incorrect... I would like to use the d3d11 texture here instead of frame
    sws_scale(conversion_ctx, frame->data, frame->linesize, 0, frame->height,
              sw_data, sw_linesize);

    frame->format = AV_PIX_FMT_YUV420P;
    frame->width = width;
    frame->height = height;
    */

    //encode to the video stream

    /* Compute current audio and video time. */
    video_time = std::chrono::system_clock::now() - start_time;

    //write frame and send
    av_interleaved_write_frame(ofmt_ctx, &pkt);

    av_frame_unref(frame);
}

av_write_trailer(ofmt_ctx);
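What the commented-out section is missing is a way to get CPU-readable pixels out of the GPU texture. One possible approach, sketched below under stated assumptions (d3d_device and d3d_context stand for the ID3D11Device / ID3D11DeviceContext that own win_textureptr, and enc_ctx for an already-opened H.264 encoder context; none of these names appear in the original post), is to copy the texture into a staging texture, map it, and let sws_scale fill a YUV420P AVFrame:

#include <d3d11.h>
extern "C" {
#include <libavutil/frame.h>
#include <libswscale/swscale.h>
}

// Allocate the destination frame (this can be reused across iterations).
AVFrame* frame = av_frame_alloc();
frame->format = AV_PIX_FMT_YUV420P;
frame->width  = width;
frame->height = height;
av_frame_get_buffer(frame, 0);               // fills frame->data / frame->linesize

// Create a CPU-readable staging copy of the window texture.
D3D11_TEXTURE2D_DESC staging_desc = desc;    // desc filled by GetDesc() above
staging_desc.Usage          = D3D11_USAGE_STAGING;
staging_desc.BindFlags      = 0;
staging_desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
staging_desc.MiscFlags      = 0;

ID3D11Texture2D* staging = nullptr;
d3d_device->CreateTexture2D(&staging_desc, nullptr, &staging);
d3d_context->CopyResource(staging, win_textureptr);       // GPU -> staging copy

D3D11_MAPPED_SUBRESOURCE mapped;
if (SUCCEEDED(d3d_context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped)))
{
    // BGRA is a single plane whose stride is mapped.RowPitch, not width * 4.
    const uint8_t* src_data[1] = { static_cast<const uint8_t*>(mapped.pData) };
    const int src_linesize[1]  = { static_cast<int>(mapped.RowPitch) };

    sws_scale(conversion_ctx, src_data, src_linesize, 0, height,
              frame->data, frame->linesize);

    d3d_context->Unmap(staging, 0);
}
staging->Release();

// The frame can now be sent to the encoder, e.g. avcodec_send_frame(enc_ctx, frame),
// and the resulting packets written with av_interleaved_write_frame(ofmt_ctx, &pkt).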



-
Strange artifacts in image - where does a frame start?
26 May 2017, by user3387542
We are live broadcasting a webcam stream. No audio, video only. The current command works great:
# Direct replay works well:
ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v rawvideo -f rawvideo -pix_fmt yuv420p - | \
ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -
But as soon as we try to send this data over the network (UDP broadcast / gigabit LAN), we get strange artefacts in the image.
# server command:
ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v rawvideo -f rawvideo -pix_fmt yuv420p - | \
socat - UDP-DATAGRAM:10.0.0.255:12345,broadcast
# client command:
socat -u udp-recv:12345,reuseaddr - | \
ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -
Where do these artifacts come from and how do we get rid of them? Does this have something to do with the client not knowing where a certain video frame starts?
We have chosen to stream raw video to reduce latency. The final goal is to apply OpenCV tools to the video and react live depending on the situation. This works great as long as the camera is plugged directly into this computer, but we need to separate the two and need multiple clients.
The camera used is a Microsoft® LifeCam Studio(TM).
$ v4l2-ctl -d 0 --list-formats
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'YUYV'
Name : YUYV 4:2:2
Index : 1
Type : Video Capture
Pixel Format: 'MJPG' (compressed)
Name : Motion-JPEG
Index : 2
Type : Video Capture
Pixel Format: 'M420'
Name : YUV 4:2:0 (M420)
Update
To narrow down the issue, I tried to split it up into different tasks:
1.0. Writing the stream to a file:
ffmpeg -f video4linux2 -r 10 -s 1280x720 -i /dev/video0 -c:v rawvideo -f rawvideo -pix_fmt yuv420p - > ~/deltemp/rawout
1.1. Reading the file: the result looks great, no artefacts:
cat ~/deltemp/rawout | ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -
2.0 Starting the stream and broadcasting the stream as mentioned in the server command above
2.1 Writing the UDP stream to a file. And watching the file (artifacts are back again)
socat -u udp-recv:12345,reuseaddr - > ~/deltemp/rawout
cat ~/deltemp/rawout | ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 10 -
As test 1 showed no artifacts and test 2 did, it must be something related to UDP packet loss.
Test 3: Reducing the quality to 640x480 did not help either.
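As an aside (this is not part of the original question): with raw yuv420p at 1280x720, each frame is exactly width * height * 3 / 2 = 1,382,400 bytes, and the byte stream carries no sync markers, so a single lost or truncated UDP datagram shifts every following frame boundary. A minimal C++ sketch of a reader that relies on that fixed frame size:

#include <cstdio>
#include <cstdint>
#include <vector>

int main() {
    // Raw yuv420p has no per-frame headers: the only framing is the fixed size.
    const int width = 1280, height = 720;
    const size_t frame_size = width * height * 3 / 2;   // 1,382,400 bytes

    std::vector<uint8_t> frame(frame_size);
    while (std::fread(frame.data(), 1, frame_size, stdin) == frame_size) {
        // Process one complete frame here (e.g. hand it to OpenCV).
    }
    // If any bytes are lost upstream (e.g. a dropped UDP datagram), every
    // subsequent read is misaligned, which shows up as image artifacts.
    return 0;
}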
-
How do I configure ffmpeg & openh264 so that the video file can be opened in Windows Media Player 12
10 March 2017, by Sacha Guyer
I have successfully created h264/mp4 movie files with ffmpeg and the x264 library.
Now I would like to change the h264 library from x264 to openH264. I could replace the x264 library with openH264, recompile ffmpeg and produce movie files, without changing my sources that produce the movie. The resulting movie opens fine in Quicktime on Mac, but on Windows, Windows Media Player 12 cannot play it.
The documentation about Windows Media Player support for h264 is unclear. File types supported by Windows Media Player states in its table that Windows Media Player 12 supports mp4, but the text below says:
Windows Media Player does not support the playback of the .mp4 file format.
From what I have observed, Windows Media Player 12 IS capable of playing h264/mp4 files, but only when created with x264.
Does anyone know how I need to adjust the configuration of the codec/context so that the movie plays in Windows Media Player? Does Windows Media Player only support certain h264 profiles?
I noticed the warning:
[libopenh264 @ 0x...] [OpenH264] this = 0x..., Warning:bEnableFrameSkip = 0,bitrate can’t be controlled for RC_QUALITY_MODE,RC_BITRATE_MODE and RC_TIMESTAMP_MODE without enabling skip frame
With the configuration:
av_dict_set(&options, "allow_skip_frames", "1", 0);
I could get rid of this warning, but the movie still does not play. Are there other options that need to be set so that the movie plays in Windows Media Player?
Thank you for your help
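For illustration only (this is not from the original post): such options are typically passed in an AVDictionary when opening the encoder. The allow_skip_frames entry is the option quoted above; requesting a different profile is an assumption on my part and may be ignored, since many openh264 builds only produce Constrained Baseline, while the file that does play is High profile. codec_ctx and codec are placeholders for an already-configured encoder context and the libopenh264 encoder:

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/dict.h>
#include <libavutil/error.h>
}
#include <cstdio>

AVDictionary* options = nullptr;
av_dict_set(&options, "allow_skip_frames", "1", 0);   // silences the rate-control warning quoted above

// Assumption: ask for the High profile; a build limited to Constrained
// Baseline will ignore or reject this request.
codec_ctx->profile = FF_PROFILE_H264_HIGH;

int ret = avcodec_open2(codec_ctx, codec, &options);
if (ret < 0) {
    char errbuf[AV_ERROR_MAX_STRING_SIZE];
    av_strerror(ret, errbuf, sizeof(errbuf));
    std::fprintf(stderr, "avcodec_open2 failed: %s\n", errbuf);
}
av_dict_free(&options);   // entries left here were not consumed by the encoder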
ffprobe output of the file that does play fine in Windows Media Player:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test_x264.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : retina
encoder : Lavf57.56.100
comment : Creation Date: 2017-03-10 07:47:39.601
Duration: 00:00:04.17, start: 0.000000, bitrate: 17497 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661),
yuv420p, 852x754, 17495 kb/s, 24 fps, 24 tbr, 24k tbn, 48 tbc (default)
Metadata:
handler_name : VideoHandler
ffprobe output of the file that does not play in Windows Media Player:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test_openh264.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : retina
encoder : Lavf57.56.100
comment : Creation Date: 2017-03-10 07:49:27.024
Duration: 00:00:04.17, start: 0.000000, bitrate: 17781 kb/s
Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661),
yuv420p, 852x754, 17779 kb/s, 24 fps, 24 tbr, 24k tbn, 48k tbc (default)
Metadata:
handler_name : VideoHandler