Media (1)
-
Bee video in portrait format
14 May 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (70)
-
Regular Cron tasks on the farm
1 December 2010, by
Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Coupled with a system Cron on the central site of the farm, this is a simple way to generate regular visits to the various sites and to keep the tasks of rarely visited sites from being too (...)
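As an illustration only, the "system Cron on the central site" mentioned above can be a plain crontab entry that fetches the central site at a fixed interval, since each fetch counts as a visit that lets SPIP run its pending scheduled tasks. The URL and interval here are placeholders, not a documented MediaSPIP setting:
*/10 * * * * wget -q -O /dev/null http://farm-central.example.org/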
-
Mediabox: opening images in the maximum space available to the user
8 February 2011, by
Image viewing is constrained by the width allowed by the site's design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of all the space available on the user's screen, you can add a feature that displays the image in a multimedia box overlaid on the rest of the content.
To do this, you need to install the "Mediabox" plugin.
Configuring the multimedia box
As soon as (...)
-
Sites built with MediaSPIP
2 May 2011, by
This page showcases some of the sites running MediaSPIP.
You can of course add your own via the form at the bottom of the page.
On other sites (9967)
-
swscale/arm: re-enable neon rgbx to nv12 routines
22 February 2016, by Xiaolei Yu
Commit '842b8f4ba2e79b9c004a67f6fdb3d5c5d05805d3' fixed the clang/iphone
build but failed on some versions of cygwin. It has now been verified
to work on both platforms.
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
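For context, "rgbx to nv12" refers to libswscale's packed-RGB to NV12 conversion path; on ARM builds, the NEON routines above sit behind the ordinary public API. Below is a minimal sketch of exercising that path in C, assuming even dimensions and a caller-allocated NV12 buffer of w*h*3/2 bytes (illustrative code, not taken from the commit):
#include <stdint.h>
#include <libswscale/swscale.h>

/* Convert a packed RGBA image to NV12 (a full-size Y plane followed by an
 * interleaved half-height UV plane). Returns 0 on success, -1 on failure. */
static int rgba_to_nv12(const uint8_t *rgba, int w, int h, uint8_t *nv12)
{
    struct SwsContext *ctx = sws_getContext(w, h, AV_PIX_FMT_RGBA,
                                            w, h, AV_PIX_FMT_NV12,
                                            SWS_BILINEAR, NULL, NULL, NULL);
    if (!ctx)
        return -1;
    const uint8_t *const src[4] = { rgba, NULL, NULL, NULL };
    const int src_stride[4]     = { 4 * w, 0, 0, 0 };
    uint8_t *const dst[4]       = { nv12, nv12 + w * h, NULL, NULL };
    const int dst_stride[4]     = { w, w, 0, 0 };
    sws_scale(ctx, src, src_stride, 0, h, dst, dst_stride);
    sws_freeContext(ctx);
    return 0;
}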
-
FFmpeg does not decode h264 stream
5 July 2012, by HAPPY_TIGER
I am trying to decode an h264 stream from an rtsp server and render it on iPhone.
I found some libraries and read some articles about it.
The libraries, called RTSPClient and DecoderWrapper, come from dropCam for iPhone.
But I cannot decode frame data with DecoderWrapper, which uses ffmpeg.
Here is my code.
VideoViewer.m
- (void)didReceiveFrame:(NSData*)frameData presentationTime:(NSDate*)presentationTime
{
    [VideoDecoder staticInitialize];
    mConverter = [[VideoDecoder alloc] initWithCodec:kVCT_H264 colorSpace:kVCS_RGBA32 width:320 height:240 privateData:nil];
    [mConverter decodeFrame:frameData];
    if ([mConverter isFrameReady]) {
        UIImage *imageData = [mConverter getDecodedFrame];
        if (imageData) {
            [mVideoView setImage:imageData];
            NSLog(@"decoded!");
        }
    }
}
VideoDecoder.m
- (id)initWithCodec:(enum VideoCodecType)codecType
         colorSpace:(enum VideoColorSpace)colorSpace
              width:(int)width
             height:(int)height
        privateData:(NSData*)privateData {
    if (self = [super init]) {
        codec = avcodec_find_decoder(CODEC_ID_H264);
        codecCtx = avcodec_alloc_context();
        // Note: for H.264 RTSP streams, the width and height are usually not specified (width and height are 0).
        // These fields will be filled in once the first frame is decoded and the SPS is processed.
        codecCtx->width = width;
        codecCtx->height = height;
        codecCtx->extradata = av_malloc([privateData length]);
        codecCtx->extradata_size = [privateData length];
        [privateData getBytes:codecCtx->extradata length:codecCtx->extradata_size];
        codecCtx->pix_fmt = PIX_FMT_RGBA;
#ifdef SHOW_DEBUG_MV
        codecCtx->debug_mv = 0xFF;
#endif
        srcFrame = avcodec_alloc_frame();
        dstFrame = avcodec_alloc_frame();
        int res = avcodec_open(codecCtx, codec);
        if (res < 0) {
            NSLog(@"Failed to initialize decoder");
        }
    }
    return self;
}
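An aside, not part of the original question: avcodec_alloc_context, avcodec_open and avcodec_alloc_frame belong to the FFmpeg API of that era and were removed later. On current FFmpeg, the rough equivalents of the calls above are:
codec = avcodec_find_decoder(AV_CODEC_ID_H264);
codecCtx = avcodec_alloc_context3(codec);
// ... fill in extradata and the other fields as above ...
srcFrame = av_frame_alloc();
dstFrame = av_frame_alloc();
int res = avcodec_open2(codecCtx, codec, NULL);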
- (void)decodeFrame:(NSData*)frameData {
    AVPacket packet = {0};
    packet.data = (uint8_t*)[frameData bytes];
    packet.size = [frameData length];
    int frameFinished = 0;
    NSLog(@"Packet size===>%d", packet.size);
    // Is this a packet from the video stream?
    if (packet.stream_index == 0) {
        int res = avcodec_decode_video2(codecCtx, srcFrame, &frameFinished, &packet);
        NSLog(@"Res value===>%d", res);
        NSLog(@"frame data===>%d", (int)srcFrame->data);
        if (res < 0) {
            NSLog(@"Failed to decode frame");
        }
    } else {
        NSLog(@"No video stream found");
    }
    // Need to delay initializing the output buffers because we don't know the dimensions until we decode the first frame.
    if (!outputInit) {
        if (codecCtx->width > 0 && codecCtx->height > 0) {
#ifdef _DEBUG
            NSLog(@"Initializing decoder with frame size of: %dx%d", codecCtx->width, codecCtx->height);
#endif
            outputBufLen = avpicture_get_size(PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
            outputBuf = av_malloc(outputBufLen);
            avpicture_fill((AVPicture*)dstFrame, outputBuf, PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
            convertCtx = sws_getContext(codecCtx->width, codecCtx->height, codecCtx->pix_fmt,
                                        codecCtx->width, codecCtx->height, PIX_FMT_RGBA,
                                        SWS_FAST_BILINEAR, NULL, NULL, NULL);
            outputInit = YES;
            frameFinished = 1;
        } else {
            NSLog(@"Could not get video output dimensions");
        }
    }
    if (frameFinished)
        frameReady = YES;
}
The console shows the following output.
2011-05-16 20:16:04.223 RTSPTest1[41226:207] Packet size===>359
[h264 @ 0x5815c00] no frame!
2011-05-16 20:16:04.223 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.224 RTSPTest1[41226:207] frame data===>101791200
2011-05-16 20:16:04.224 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.225 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x5017c00] no frame!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.227 RTSPTest1[41226:207] frame data===>81002704
2011-05-16 20:16:04.227 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.228 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x581d000] no frame!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.230 RTSPTest1[41226:207] frame data===>101791616
2011-05-16 20:16:04.230 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.231 RTSPTest1[41226:207] decoded!
. . . . .
But the simulator shows nothing.
What's wrong with my code?
Please help me solve this problem.
Thanks for your answers.
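One thing worth checking, offered as a sketch rather than a confirmed fix: "no frame!" from the h264 decoder usually means it is not being fed complete Annex B NAL units. RTSP/RTP servers strip the 00 00 00 01 start codes and often deliver the SPS/PPS out of band in the SDP, so a common remedy is to prepend a start code to each incoming NAL unit (and to feed the SPS and PPS, e.g. from privateData, before the first frame). The names below are illustrative:
static const uint8_t kStartCode[4] = { 0x00, 0x00, 0x00, 0x01 };
NSMutableData *annexB = [NSMutableData dataWithBytes:kStartCode length:4];
[annexB appendData:frameData]; // frameData assumed to hold one raw NAL unit
[mConverter decodeFrame:annexB];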
-
Building FFMPEG library for iOS6.0 ARMv7 Processor
15 August 2013, by Jimmy
WARNING:
I was just informed by another user that there are some legal issues around using FFMPEG in iOS apps; I'm leaving the link here: http://multinc.com/2009/08/24/compatibility-between-the-iphone-app-store-and-the-lgpl/
I cleaned up my question a little; when I wrote it the first time I was flustered. Now I can be clearer after taking a short break.
Edit: I learned that you have to build for ARMv7, ARMv7s and iOS 6.0.
I'm trying to use the FFMPEG library in an Xcode 4.5.1 project, and I'm trying to build it for ARMv7. What I'm looking for is the exact process, plus some explanation. I understand that this is not a well-documented problem, but I know that other people have had the same problem as me.
What I have been able to do:
I have been able to build the library for testing.
1) I have been able to clone ffmpeg. For beginners, this will get you started by creating a directory with the ffmpeg source. (Kudos to the guys who wrote it.)
git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg
2) I have been able to write a configure invocation that doesn't produce any errors. We will come back to this part later. This is the command I pass to ./configure:
./configure \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffserver \
--enable-cross-compile \
--arch=arm \
--target-os=darwin \
--cc=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/llvm-gcc-4.2/bin/arm-apple-darwin10-llvm-gcc-4.2 \
--as='gas-preprocessor/gas-preprocessor.pl /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/llvm-gcc-4.2/bin/arm-apple-darwin10-llvm-gcc-4.2' \
--sysroot=/applications/xcode.app/contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS6.0.sdk \
--cpu=cortex-a8 \
--extra-ldflags='-arch=armv7 -isysroot /applications/xcode.app/contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS6.0.sdk' \
--enable-pic --disable-bzlib --disable-gpl --disable-shared --enable-static \
--disable-mmx --disable-debug --disable-neon \
--extra-cflags='-pipe -Os -gdwarf-2 -isysroot /applications/xcode.app/contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk -m${thumb_opt:-no-thumb} -mthumb-interwork'
These are some things to note.
- I had to download gas-preprocessor ( https://github.com/yuvi/gas-preprocessor ) and copy the file gas-preprocessor.pl to /usr/local/bin, then set its permissions to 777.
- Make sure I'm using the right GCC compiler: /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/llvm-gcc-4.2/bin/arm-apple-darwin10-llvm-gcc-4.2
- Make sure I'm using the right SDK: /applications/xcode.app/contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS6.0.sdk
- --extra-cflags="-arch armv7" causes: error: unrecognized command line option "-arch"
Herein lies the problem.
I can include the library like so:
#include "libavcodec/avcodec.h"
But when I started to write the encoder, I received this warning and countless errors:
ignoring file /Users/Jimmy/Development/source.ffmpeg/Library/libavutil.a, file was built for archive which is not the architecture being linked (armv7s): /Users/Jimmy/Development/source.ffmpeg/Library/libavutil.a
That means that I didn't build the right binary.
What I'm looking for is someone who's done it before, to walk all of us through the process of building FFMPEG for iOS 6.0 and ARMv7, and the main things to look out for. Thanks a ton.
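A hedged pointer for anyone hitting the same warning: it says the static library contains no armv7s slice, and Xcode 4.5 builds for armv7s by default because of the iPhone 5. You can inspect which slices a library contains with Apple's lipo tool, and merge per-architecture builds into one fat library; the armv7/ and armv7s/ paths below are placeholders for separate build outputs:
lipo -info libavutil.a
lipo -create armv7/libavutil.a armv7s/libavutil.a -output universal/libavutil.a
Alternatively, removing armv7s from the Xcode target's Architectures build setting sidesteps the mismatch until a proper armv7s build is in place.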