
Media (1)
-
SWFUpload Process
6 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (105)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is version 0.2 or higher. If in doubt, contact the administrator of your MediaSPIP site to find out. -
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users. -
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded to MP4, OGV and WebM (supported by HTML5), with MP4 also used for Flash playback.
Audio files are encoded to MP3 and Ogg (supported by HTML5), with MP3 also used for Flash playback.
Where possible, text is analyzed in order to retrieve the data needed for search engine detection, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
On other sites (8611)
-
Video encoding green screen
22 August 2016, by Mohammad Abu Musa
I am building a screen recorder. The input stream is formatted as PP_VIDEOFRAME_FORMAT_I420 and the output as AV_PIX_FMT_YUV420P. Below is the code I use to do the conversion:
const uint8_t* data = static_cast<const uint8_t*>(frame.GetDataBuffer());
pp::Size size;
frame.GetSize(&size);
uint32_t buffersize = frame.GetDataBufferSize();
if (is_recording_) {
vpx_codec_iter_t iter = NULL;
const vpx_codec_cx_pkt_t *pkt;
// copy the pixels into our "raw input" container.
int bytes_filled = avpicture_fill(&pic_raw, data, AV_PIX_FMT_YUV420P, out_width, out_height);
if(!bytes_filled) {
Logger::Log("Cannot fill the raw input buffer");
return;
}
if(vpx_codec_encode(&codec, &raw, frame_cnt, 1, flags, VPX_DL_REALTIME))
die_codec(&codec, "Failed to encode frame");
while( (pkt = vpx_codec_get_cx_data(&codec, &iter)) ) {
switch(pkt->kind) {
case VPX_CODEC_CX_FRAME_PKT:
glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::write_ivf_frame_header, pkt));
glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::WriteFile, pkt));
break;
default:break;
}
}
frame_cnt++;
I have three questions:
1. Is the conversion done correctly? Do I have to investigate the image formats further? Are the data channels mapped correctly?
2. What causes the green screen to show? What does it mean?
3. Is this a threading issue? That is, is the data passed and converted correctly but the threads are racing?
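For what it is worth, below is a minimal sketch of what an explicit plane mapping can look like before the frame is handed to vpx_codec_encode. This is not the original code: the helper name is hypothetical, and it assumes the PPAPI buffer is tightly packed I420 with no row padding and that raw was allocated with vpx_img_alloc(&raw, VPX_IMG_FMT_I420, out_width, out_height, 1).
#include <cstdint>
#include <cstring>
#include <vpx/vpx_image.h>
// Hypothetical helper: copy a tightly packed I420 buffer into a vpx_image_t,
// honouring the strides libvpx chose for each plane when the image was allocated.
static void CopyI420ToVpxImage(const uint8_t* src, int width, int height, vpx_image_t* img) {
  const uint8_t* src_y = src;                                 // Y plane: width x height bytes
  const uint8_t* src_u = src_y + width * height;              // U plane: (width/2) x (height/2) bytes
  const uint8_t* src_v = src_u + (width / 2) * (height / 2);  // V plane follows U in I420
  for (int row = 0; row < height; ++row)
    std::memcpy(img->planes[VPX_PLANE_Y] + row * img->stride[VPX_PLANE_Y],
                src_y + row * width, width);
  for (int row = 0; row < height / 2; ++row) {
    std::memcpy(img->planes[VPX_PLANE_U] + row * img->stride[VPX_PLANE_U],
                src_u + row * (width / 2), width / 2);
    std::memcpy(img->planes[VPX_PLANE_V] + row * img->stride[VPX_PLANE_V],
                src_v + row * (width / 2), width / 2);
  }
}
Zeroed or wrongly offset U/V planes tend to render as a green image, so a stride or plane-offset mismatch between the I420 source and the image given to the encoder is a common cause. It is also worth confirming that the data filled into pic_raw via avpicture_fill actually reaches raw, the image passed to vpx_codec_encode.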
-
Unable to play mp4 file with JavaFX MediaPlayer
20 July 2016, by Lennart
I want to capture the computer’s screen and play the resulting video with the JavaFX MediaPlayer (jdk1.8.0_91).
I looked at the supported encoding and container types and think that MP4 with H.264/AVC and AAC is the best choice.
I am using FFmpeg with the DirectShow device Screen Capture Recorder to capture the screen and save the resulting video as an MP4 file. I used the following command:
ffmpeg -rtbufsize 1500M -framerate 25 -f dshow -i video="screen-capture-recorder":audio="virtual-audio-capturer" -r 25 -c:v libx264 -pix_fmt yuv420p output.mp4
But the MediaPlayer isn’t able to play the video and throws a MediaException instead:
MediaException: UNKNOWN : [com.sun.media.jfxmediaimpl.platform.gstreamer.GSTMediaPlayer@133b675a] ERROR_MEDIA_INVALID: ERROR_MEDIA_INVALID
at javafx.scene.media.MediaException.getMediaException(MediaException.java:160)
at javafx.scene.media.MediaPlayer$_MediaErrorListener.onError(MediaPlayer.java:2615)
at com.sun.media.jfxmediaimpl.NativeMediaPlayer$EventQueueThread.HandleErrorEvents(NativeMediaPlayer.java:691)
at com.sun.media.jfxmediaimpl.NativeMediaPlayer$EventQueueThread.run(NativeMediaPlayer.java:425)
I previously used a video from YouTube (downloaded with some random YouTube downloader) for testing and it worked. I displayed the video codecs with the VLC player and it shows the same data (except for the frame rate and resolution):
On the left is the YouTube video and on the right the FFmpeg output.
Solution:
The problem was the resolution of the video. I found an OpenJDK bug report saying that the maximum supported resolution is 1920x1088 due to limitations of the decoder used. The video played fine after I scaled it down.
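For anyone hitting the same limit, one way to apply the fix directly in the capture command (rather than scaling afterwards) is to add a scale filter so the output stays within the decoder’s 1920x1088 ceiling; the target width below is an assumption and should be adapted to the screen’s aspect ratio:
ffmpeg -rtbufsize 1500M -framerate 25 -f dshow -i video="screen-capture-recorder":audio="virtual-audio-capturer" -r 25 -c:v libx264 -vf scale=1920:-2 -pix_fmt yuv420p output.mp4
scale=1920:-2 keeps the aspect ratio and rounds the height to an even value, which yuv420p requires. -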
Stream OpenGL framebuffer over HTTP (via FFmpeg)
17 June 2016, by mOfl
I have an OpenGL application whose rendered images need to be streamed over the internet to mobile clients. Previously, it sufficed to simply record the rendering to a video file, which is already working; now this should be extended to streaming.
What is working right now:
- Render a scene to an OpenGL framebuffer object
- Capture the FBO content using NvIFR
- Encode it to H.264 using NvENC (no CPU round trip required)
- Download the encoded frame to host memory as a byte array
- Append this frame to a video file
None of these steps involves FFmpeg or any other library so far. I now want to replace the last step with "stream the current frame’s byte array over the internet", and I assume that using FFmpeg and FFserver would be a reasonable choice for this. Am I correct? If not, what would be the proper way?
If so, how do I approach this within my C++ code? As pointed out, the frame is already encoded. Also, there is no sound or anything else, simply an H.264-encoded frame as a byte array that is updated irregularly and should be turned into a steady video stream. I assume that this would be FFmpeg’s job and that the subsequent streaming via FFserver would be simple from there. What I don’t know is how to feed my data to FFmpeg in the first place, as all the FFmpeg tutorials I found (in a non-exhaustive search) use a file or a webcam/capture device as the data source, not volatile data in main memory.
The file mentioned above, which I am already able to create, is a C++ file stream to which I append each single frame, meaning that differing frame rates of the video and the rendering are not handled correctly. This also needs to be taken care of at some point.
Can somebody point me in the right direction? Can I forward data from my application to FFmpeg to build a proper video feed without writing to the hard disk? Tutorials are greatly appreciated. By the way, FFmpeg/FFserver is not mandatory; if you have a better idea for streaming OpenGL framebuffer contents, I am eager to hear it.
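One way to avoid the disk entirely is to skip the ffmpeg executable and use libavformat from the C++ code itself: wrap each already-encoded H.264 frame in an AVPacket and let a muxer push it out, for example as MPEG-TS over UDP, which ffplay or VLC can consume directly. The sketch below is only an illustration of that idea, not a tested implementation: the MemStreamer helper, the UDP URL, the resolution arguments and the 90 kHz timestamps are assumptions, and error handling is minimal.
extern "C" {
#include <libavformat/avformat.h>
}
#include <cstdint>
// Hypothetical helper: mux pre-encoded H.264 frames from host memory into an
// MPEG-TS stream over UDP using libavformat.
struct MemStreamer {
  AVFormatContext* fmt = nullptr;
  AVStream* video = nullptr;
  bool Open(const char* url, int width, int height) {
    avformat_network_init();                              // needed for udp:// outputs
    avformat_alloc_output_context2(&fmt, nullptr, "mpegts", url);
    if (!fmt) return false;
    video = avformat_new_stream(fmt, nullptr);
    video->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
    video->codecpar->codec_id   = AV_CODEC_ID_H264;
    video->codecpar->width      = width;                  // assumed render size
    video->codecpar->height     = height;
    video->time_base            = AVRational{1, 90000};   // 90 kHz MPEG-TS clock
    if (avio_open(&fmt->pb, url, AVIO_FLAG_WRITE) < 0) return false;
    return avformat_write_header(fmt, nullptr) >= 0;
  }
  // Call with each encoded frame NvENC has copied to host memory.
  bool Send(const uint8_t* data, int size, int64_t pts_90khz) {
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.data = const_cast<uint8_t*>(data);                // packet borrows the caller's buffer
    pkt.size = size;
    pkt.pts  = pkt.dts = pts_90khz;
    pkt.stream_index = video->index;
    return av_interleaved_write_frame(fmt, &pkt) >= 0;    // muxes and sends over the network
  }
  void Close() {
    if (!fmt) return;
    av_write_trailer(fmt);
    avio_closep(&fmt->pb);
    avformat_free_context(fmt);
    fmt = nullptr;
  }
};
MPEG-TS expects the H.264 bitstream in Annex B form (with start codes), which NvENC typically produces, and the pts values are what turn the irregularly updated frames into a steady stream, so deriving them from a real clock rather than a frame counter also addresses the frame-rate mismatch mentioned above.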