
Media (1)
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
Other articles (54)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table contains the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document is to be attached automatically; objet, the type of object to which (...)
-
Support for all media types
10 April 2011
Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (6363)
-
Black screen when playing a video with ffmpeg and SDL on iOS
1 April 2012, by patrick
I'm attempting to create a video player on iOS using ffmpeg and SDL. I'm decoding the video stream and attempting to convert the pixel data into an SDL_Surface, then convert that over to an SDL_Texture and render it on screen. However, all I'm getting is a black screen. I know the video file is good and can be viewed fine from VLC. Any idea what I'm missing here?
Initialization code:
// initialize SDL (Simple DirectMedia Layer) to playback the content
if( SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER) )
{
DDLogError(@"Unable to initialize SDL");
return NO;
}
// create window and renderer
window = SDL_CreateWindow(NULL, 0, 0, SCREEN_WIDTH, SCREEN_HEIGHT,
SDL_WINDOW_OPENGL | SDL_WINDOW_BORDERLESS |
SDL_WINDOW_SHOWN);
if ( window == 0 )
{
DDLogError(@"Unable to initialize SDL Window");
}
renderer = SDL_CreateRenderer(window, -1, 0);
if ( !renderer )
{
DDLogError(@"Unable to initialize SDL Renderer");
}
// Initialize FFmpeg and register codecs and their respective file formats
av_register_all();
Playback code:
AVFormatContext *formatContext = NULL;
DDLogInfo(@"Opening media file at location:%@", filePath);
const char *filename = [filePath cStringUsingEncoding:NSUTF8StringEncoding];
// Open media file
if( avformat_open_input(&formatContext, filename, NULL, NULL) != 0 )
{
DDLogWarn(@"Unable to open media file. [File:%@]", filePath);
NSString *failureReason = NSLocalizedString(@"Unable to open file.", @"Media playback failed, unable to open file.");
if ( error != NULL )
{
*error = [NSError errorWithDomain:MediaPlayerErrorDomain
code:UNABLE_TO_OPEN
userInfo:[NSDictionary dictionaryWithObject:failureReason
forKey:NSLocalizedFailureReasonErrorKey]];
}
return NO; // Couldn't open file
}
// Retrieve stream information
if( avformat_find_stream_info(formatContext, NULL) <= 0 )
{
DDLogWarn(@"Unable to locate stream information for file. [File:%@]", filePath);
NSString *failureReason = NSLocalizedString(@"Unable to find audio/video stream information.", @"Media playback failed, unable to find stream information.");
if ( error != NULL )
{
*error = [NSError errorWithDomain:MediaPlayerErrorDomain
code:UNABLE_TO_FIND_STREAM
userInfo:[NSDictionary dictionaryWithObject:failureReason
forKey:NSLocalizedFailureReasonErrorKey]];
}
return NO; // Missing stream information
}
// Find the first video or audio stream
int videoStream = -1;
int audioStream = -1;
DDLogInfo(@"Locating stream information for media file");
for( int index=0; index<(formatContext->nb_streams); index++)
{
if( formatContext->streams[index]->codec->codec_type==AVMEDIA_TYPE_VIDEO )
{
DDLogInfo(@"Found video stream");
videoStream = index;
break;
}
else if( mediaType == AUDIO_FILE &&
(formatContext->streams[index]->codec->codec_type==AVMEDIA_TYPE_AUDIO) )
{
DDLogInfo(@"Found audio stream");
audioStream = index;
break;
}
}
if( videoStream == -1 && (audioStream == -1) )
{
DDLogWarn(@"Unable to find video or audio stream for file");
NSString *failureReason = NSLocalizedString(@"Unable to locate audio/video stream.", @"Media playback failed, unable to locate media stream.");
if ( error != NULL )
{
*error = [NSError errorWithDomain:MediaPlayerErrorDomain
code:UNABLE_TO_FIND_STREAM
userInfo:[NSDictionary dictionaryWithObject:failureReason
forKey:NSLocalizedFailureReasonErrorKey]];
}
return NO; // Didn't find a video or audio stream
}
// Get a pointer to the codec context for the video/audio stream
AVCodecContext *codecContext;
DDLogInfo(@"Attempting to locate the codec for the media file");
if ( videoStream > -1 )
{
codecContext = formatContext->streams[videoStream]->codec;
}
else
{
codecContext = formatContext->streams[audioStream]->codec;
}
// Now that we have information about the codec that the file is using,
// we need to actually open the codec to decode the content
DDLogInfo(@"Attempting to open the codec to playback the media file");
AVCodec *codec;
// Find the decoder for the video stream
codec = avcodec_find_decoder(codecContext->codec_id);
if( codec == NULL )
{
DDLogWarn(@"Unsupported codec! Cannot playback media file [File:%@]", filePath);
NSString *failureReason = NSLocalizedString(@"Unsupported file format. Cannot playback media.", @"Media playback failed, unsupported codec.");
if ( error != NULL )
{
*error = [NSError errorWithDomain:MediaPlayerErrorDomain
code:UNSUPPORTED_CODEC
userInfo:[NSDictionary dictionaryWithObject:failureReason
forKey:NSLocalizedFailureReasonErrorKey]];
}
return NO; // Codec not found
}
// Open codec
if( avcodec_open2(codecContext, codec, NULL) < 0 )
{
DDLogWarn(@"Unable to open codec! Cannot playback media file [File:%@]", filePath);
NSString *failureReason = NSLocalizedString(@"Unable to open media codec. Cannot playback media.", @"Media playback failed, cannot open codec.");
if ( error != NULL )
{
*error = [NSError errorWithDomain:MediaPlayerErrorDomain
code:UNABLE_TO_LOAD_CODEC
userInfo:[NSDictionary dictionaryWithObject:failureReason
forKey:NSLocalizedFailureReasonErrorKey]];
}
return NO; // Could not open codec
}
// Allocate player frame
AVFrame *playerFrame=avcodec_alloc_frame();
// Allocate an AVFrame structure
AVFrame *RGBframe=avcodec_alloc_frame();
if( RGBframe==NULL )
{
// could not create a frame to convert our video frame
// to a 16-bit RGB565 frame.
DDLogWarn(@"Unable to convert video frame. Cannot playback media file [File:%@]", filePath);
NSString *failureReason = NSLocalizedString(@"Problems interpreting video frame information.", @"Media playback failed, cannot convert frame.");
if ( error != NULL )
{
*error = [NSError errorWithDomain:MediaPlayerErrorDomain
code:UNABLE_TO_LOAD_FRAME
userInfo:[NSDictionary dictionaryWithObject:failureReason
forKey:NSLocalizedFailureReasonErrorKey]];
}
return NO; // Could not open codec
}
int frameFinished = 0;
AVPacket packet;
// Figure out the destination width/height based on the screen size
int destHeight = codecContext->height;
int destWidth = codecContext->width;
if ( destHeight > SCREEN_HEIGHT || (destWidth > SCREEN_WIDTH) )
{
if ( destWidth > SCREEN_WIDTH )
{
float percentDiff = ( destWidth - SCREEN_WIDTH ) / (float)destWidth;
destWidth = destWidth - (int)(destWidth * percentDiff );
destHeight = destHeight - (int)(destHeight * percentDiff );
}
if ( destHeight > SCREEN_HEIGHT )
{
float percentDiff = (destHeight - SCREEN_HEIGHT ) / (float)destHeight;
destWidth = destWidth - (int)(destWidth * percentDiff );
destHeight = destHeight - (int)(destHeight * percentDiff );
}
}
SwsContext *swsContext = sws_getContext(codecContext->width, codecContext->height, codecContext->pix_fmt, destWidth, destHeight, PIX_FMT_RGB565, SWS_BICUBIC, NULL, NULL, NULL);
while( av_read_frame(formatContext, &packet) >= 0 )
{
// Is this a packet from the video stream?
if( packet.stream_index == videoStream )
{
// Decode video frame
avcodec_decode_video2(codecContext, playerFrame, &frameFinished, &packet);
// Did we get a video frame?
if( frameFinished != 0 )
{
// Convert the content over to RGB565 (16-bit RGB) to playback with SDL
uint8_t *dst[3];
int dstStride[3];
// Set the destination stride
for (int plane = 0; plane < 3; plane++)
{
dstStride[plane] = codecContext->width*2;
dst[plane]= (uint8_t*) malloc(dstStride[plane]*destHeight);
}
sws_scale(swsContext, playerFrame->data,
playerFrame->linesize, 0,
destHeight,
dst, dstStride);
// Create the SDL surface frame that we are going to use to draw our video
// 16-bit RGB so 2 bytes per pixel (pitch = width*(bytes per pixel))
int pitch = destWidth*2;
SDL_Surface *frameSurface = SDL_CreateRGBSurfaceFrom(dst[0], destWidth, destHeight, 16, pitch, 0, 0, 0, 0);
// Clear the old frame first
SDL_RenderClear(renderer);
// Move the frame over to a texture and render it on screen
SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, frameSurface);
SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);
// Draw the new frame on the screen
SDL_RenderPresent(renderer);
SDL_DestroyTexture(texture);
SDL_FreeSurface(frameSurface);
}
-
lavfi/fade: fix black level for non studio-level pixel formats
29 March 2012, by Stefano Sabatini
-
ffmpeg return black and white image
16 December 2011, by njai
The video capture device is a KWorld DVD Maker 2 (em2861).
It returns the image in "black and white" with a green line at the bottom of the screen; when I tested with a webcam, it worked properly. Here is a 30 s video: http://www.youtube.com/watch?v=7q2wFGVwGGI
How can I convert it to color?
[root@localhost ~]# ffmpeg -f video4linux2 -i /dev/video0 -vcodec mpeg4 -y output.mp4
ffmpeg version git-2011-11-05-5fd1a69, Copyright (c) 2000-2011 the FFmpeg developers
built on Nov 5 2011 21:10:52 with gcc 4.4.4 20100726 (Red Hat 4.4.4-13)
configuration: --enable-libx264 --enable-gpl --enable-libvpx
libavutil 51. 23. 0 / 51. 23. 0
libavcodec 53. 27. 0 / 53. 27. 0
libavformat 53. 18. 0 / 53. 18. 0
libavdevice 53. 4. 0 / 53. 4. 0
libavfilter 2. 47. 0 / 2. 47. 0
libswscale 2. 1. 0 / 2. 1. 0
libpostproc 51. 2. 0 / 51. 2. 0
[video4linux2,v4l2 @ 0x2a64780] Estimating duration from bitrate, this may be inaccurate
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 1320637788.113946, bitrate: 165888 kb/s
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 720x576, 165888 kb/s, 25 tbr, 1000k tbn, 25 tbc
Incompatible pixel format 'yuyv422' for codec 'mpeg4', auto-selecting format 'yuv420p'
[buffer @ 0x2a63920] w:720 h:576 pixfmt:yuyv422 tb:1/1000000 sar:0/1 sws_param:
[buffersink @ 0x2a63cc0] auto-inserting filter 'auto-inserted scale 0' between the filter 'src' and the filter 'out'
[scale @ 0x2a64560] w:720 h:576 fmt:yuyv422 -> w:720 h:576 fmt:yuv420p flags:0x4
Output #0, mp4, to 'output.mp4':
Metadata:
encoder : Lavf53.18.0
Stream #0:0: Video: mpeg4 ( [0][0][0] / 0x0020), yuv420p, 720x576, q=2-31, 200 kb/s, 25 tbn, 25 tbc
Stream mapping:
Stream #0.0 -> #0.0 (rawvideo -> mpeg4)
Press [q] to stop, [?] for help
frame= 969 fps= 25 q=31.0 Lsize= 1547kB time=00:00:38.76 bitrate= 327.0kbits/s
video:1538kB audio:0kB global headers:0kB muxing overhead 0.561411%