
Media (1)
-
Sintel MP4 Surround 5.1 Full
13 May 2011, by
Updated: February 2012
Language: English
Type: Video
Other articles (51)
-
XMP PHP
13 May 2011, by
As Wikipedia puts it, XMP stands for:
Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)
-
Use, discuss, criticize
13 April 2011, by
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
Installation in farm mode
4 February 2011, by
Farm mode makes it possible to host several MediaSPIP sites while installing its functional core only once.
This is the method we use on this very platform.
Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since SPIP’s usual private area is no longer used.
To begin with, you must have installed the same files as the installation (...)
On other sites (8073)
-
FFMPEG in Android issue
9 January 2012, by Antu
I have successfully compiled the FFMPEG library for Android using the NDK (using the Rock Player FFMPEG implementation: http://www.rockplayer.com/download/rockplayer_ffmpeg_git_20100418.zip).
I know that FFMPEG supports .avi, DivX, .mov etc., but when I created a media player and tried to play such files, I was not able to. Is this the right way to use the FFMPEG library? Can anyone help? I am able to play the default formats: MP4, 3GP, etc. Here is the code for the media player:
public native String stringFromJNI();

static {
    // Load the FFmpeg libraries built with the NDK
    System.loadLibrary("ffmpeg");
    System.loadLibrary("ffmpeg-test-jni");
}

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);

    TextView tv = (TextView) findViewById(R.id.textView1);
    tv.setText(stringFromJNI());
    System.gc();
    Log.d("Video FFmpeg ", "**");

    getWindow().setFormat(PixelFormat.TRANSLUCENT);
    String filepath = Environment.getExternalStorageDirectory() + "/simple.avi";
    Log.d("File path", filepath);

    // Play the file through the platform VideoView/MediaController
    MediaController mc = new MediaController(this);
    VideoView video = (VideoView) findViewById(R.id.video);
    mc.setMediaPlayer(video);
    video.setVideoPath(filepath);
    video.setMediaController(mc);
    mc.show();
    //video.setVideoPath("/mnt/sdcard/Movies/Ishq-Hothon-Se.3gp");
    video.start();

    View nextButton = findViewById(R.id.button1);
    nextButton.setOnClickListener(this);
}

@Override
public void onClick(View v) {
    Intent i = new Intent(this, NextVideo.class);
    startActivity(i);
}
}
-
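
Note that System.loadLibrary() only brings the native code into the process; VideoView still plays through Android’s built-in MediaPlayer and its platform codecs, which is consistent with only the default formats (MP4, 3GP) playing here. For illustration only, a minimal sketch of what the native side of stringFromJNI() inside libffmpeg-test-jni.so could look like; the Java package and class names below are assumptions, not taken from the post:

/* ffmpeg-test-jni.c -- hypothetical sketch of the native stringFromJNI().
 * The Java package/class (com.example.ffmpegtest.MainActivity) is assumed. */
#include <jni.h>
#include <stdio.h>
#include <libavcodec/avcodec.h>

JNIEXPORT jstring JNICALL
Java_com_example_ffmpegtest_MainActivity_stringFromJNI(JNIEnv *env, jobject thiz)
{
    char buf[64];
    unsigned v = avcodec_version();  /* packed MAJOR.MINOR.MICRO of the loaded libavcodec */
    snprintf(buf, sizeof(buf), "libavcodec %u.%u.%u",
             v >> 16, (v >> 8) & 0xff, v & 0xff);
    return (*env)->NewStringUTF(env, buf);
}

Returning the library version string is just a convenient way to confirm that the compiled FFmpeg libraries actually loaded.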
Workaround for loading overlay issue with stock Android browser.
23 August 2012, by Jack Moore
m colorbox/jquery.colorbox.js: Workaround for loading overlay issue with stock Android browser.
-
iPhone camera shooting video using AVCaptureSession, and using ffmpeg to convert the CMSampleBufferRef to h.264 format, is the issue. Please advise.
4 January 2012, by isaiah
My goal is h.264/AAC, MPEG2-TS streaming to a server from an iPhone device.
Currently my FFmpeg + libx264 source compiles successfully (I know about the GNU license; I want a demo program).
I want to know the following:
1. Is the conversion from CMSampleBufferRef to AVPicture data succeeding?
avpicture_fill((AVPicture*)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height);
pFrame’s linesize and data are not null, but its pts is -9233123123, and the same goes for outpic.
Because of this, I guess, I get the 'non-strictly-monotonic PTS' message.
2. This log repeats:
encoding frame (size= 0)
encoding frame = ""
'avcodec_encode_video' returning 0 means success, but it is always 0. I don't know what to do...
2011-06-01 15:15:14.199 AVCam[1993:7303] pFrame = avcodec_alloc_frame();
2011-06-01 15:15:14.207 AVCam[1993:7303] avpicture_fill = 1228800
Video encoding
2011-06-01 15:15:14.215 AVCam[1993:7303] codec = 5841844
[libx264 @ 0x1441e00] using cpu capabilities: ARMv6 NEON
[libx264 @ 0x1441e00] profile Constrained Baseline, level 2.0
[libx264 @ 0x1441e00] non-strictly-monotonic PTS
encoding frame (size= 0)
encoding frame
[libx264 @ 0x1441e00] final ratefactor: 26.74
3. I guess that the 'non-strictly-monotonic PTS' message is the cause of all the problems. What is this 'non-strictly-monotonic PTS'?
Here is the source:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (!CMSampleBufferDataIsReady(sampleBuffer))
    {
        NSLog(@"sample buffer is not ready. Skipping sample");
        return;
    }

    if ([isRecordingNow isEqualToString:@"YES"])
    {
        lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        if (videoWriter.status != AVAssetWriterStatusWriting)
        {
            [videoWriter startWriting];
            [videoWriter startSessionAtSourceTime:lastSampleTime];
        }

        if (captureOutput == videooutput)
        {
            [self newVideoSample:sampleBuffer];

            CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);

            // access the data
            int width = CVPixelBufferGetWidth(pixelBuffer);
            int height = CVPixelBufferGetHeight(pixelBuffer);
            unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

            AVFrame *pFrame;
            pFrame = avcodec_alloc_frame();
            pFrame->quality = 0;
            NSLog(@"pFrame = avcodec_alloc_frame(); ");

            // int bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
            // int bytesSize = height * bytesPerRow;
            // unsigned char *pixel = (unsigned char *)malloc(bytesSize);
            // unsigned char *rowBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
            // memcpy(pixel, rowBase, bytesSize);

            int avpicture_fillNum = avpicture_fill((AVPicture *)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height); //PIX_FMT_RGB32//PIX_FMT_RGB8
            //NSLog(@"rawPixelBase = %i , rawPixelBase -s = %s", rawPixelBase, rawPixelBase);
            NSLog(@"avpicture_fill = %i", avpicture_fillNum);
            //NSLog(@"width = %i,height = %i", width, height);

            // Do something with the raw pixels here
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

            //avcodec_init();
            //avdevice_register_all();
            av_register_all();

            AVCodec *codec;
            AVCodecContext *c = NULL;
            int out_size, size, outbuf_size;
            //FILE *f;
            uint8_t *outbuf;

            printf("Video encoding\n");

            /* find the mpeg video encoder */
            codec = avcodec_find_encoder(CODEC_ID_H264); //avcodec_find_encoder_by_name("libx264"); //avcodec_find_encoder(CODEC_ID_H264);//CODEC_ID_H264);
            NSLog(@"codec = %i", codec);
            if (!codec) {
                fprintf(stderr, "codec not found\n");
                exit(1);
            }

            c = avcodec_alloc_context();

            /* put sample parameters */
            c->bit_rate = 400000;
            c->bit_rate_tolerance = 10;
            c->me_method = 2;
            /* resolution must be a multiple of two */
            c->width = 352;  //width;//352;
            c->height = 288; //height;//288;
            /* frames per second */
            c->time_base = (AVRational){1,25};
            c->gop_size = 10; //25; /* emit one intra frame every ten frames */
            //c->max_b_frames = 1;
            c->pix_fmt = PIX_FMT_YUV420P;
            c->me_range = 16;
            c->max_qdiff = 4;
            c->qmin = 10;
            c->qmax = 51;
            c->qcompress = 0.6f;

            /* open it */
            if (avcodec_open(c, codec) < 0) {
                fprintf(stderr, "could not open codec\n");
                exit(1);
            }

            /* alloc image and output buffer */
            outbuf_size = 100000;
            outbuf = malloc(outbuf_size);
            size = c->width * c->height;

            AVFrame *outpic = avcodec_alloc_frame();
            int nbytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);

            //create buffer for the output image
            uint8_t *outbuffer = (uint8_t *)av_malloc(nbytes);

#pragma mark -
            fflush(stdout);

            // int numBytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);
            // uint8_t *buffer = (uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
            //
            // //UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"10%d", i]];
            // CGImageRef newCgImage = [self imageFromSampleBuffer:sampleBuffer];//[image CGImage];
            //
            // CGDataProviderRef dataProvider = CGImageGetDataProvider(newCgImage);
            // CFDataRef bitmapData = CGDataProviderCopyData(dataProvider);
            // buffer = (uint8_t *)CFDataGetBytePtr(bitmapData);
            //
            // avpicture_fill((AVPicture*)pFrame, buffer, PIX_FMT_RGB8, c->width, c->height);

            avpicture_fill((AVPicture *)outpic, outbuffer, PIX_FMT_YUV420P, c->width, c->height);

            struct SwsContext *fooContext = sws_getContext(c->width, c->height,
                                                           PIX_FMT_RGB8,
                                                           c->width, c->height,
                                                           PIX_FMT_YUV420P,
                                                           SWS_FAST_BILINEAR, NULL, NULL, NULL);

            //perform the conversion
            sws_scale(fooContext, pFrame->data, pFrame->linesize, 0, c->height, outpic->data, outpic->linesize);
            // Here is where I try to convert to YUV

            /* encode the image */
            out_size = avcodec_encode_video(c, outbuf, outbuf_size, outpic);
            printf("encoding frame (size=%5d)\n", out_size);
            printf("encoding frame %s\n", outbuf);

            //fwrite(outbuf, 1, out_size, f);
            // free(buffer);
            // buffer = NULL;

            /* add sequence end code to have a real mpeg file */
            // outbuf[0] = 0x00;
            // outbuf[1] = 0x00;
            // outbuf[2] = 0x01;
            // outbuf[3] = 0xb7;
            //fwrite(outbuf, 1, 4, f);
            //fclose(f);

            free(outbuf);
            avcodec_close(c);
            av_free(c);
            av_free(pFrame);
            printf("\n");
        }
    }
}
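
The repeated 'non-strictly-monotonic PTS' warning comes from libx264 when the frames it receives do not carry strictly increasing presentation timestamps. With the old (2011-era) avcodec_encode_video() API used above, the timestamp has to be set on the AVFrame itself before every encode call, counted in units of the codec's time_base. A minimal sketch of that step, assuming the codec context and buffers are created once and reused across callbacks rather than rebuilt for every sample buffer; the helper name encode_one_frame is hypothetical:

/* Sketch only: `c`, `outpic`, `outbuf` and `outbuf_size` are assumed to be
 * set up once (as in the posted code) and kept alive between frames. */
#include <stdint.h>
#include <libavcodec/avcodec.h>

static int encode_one_frame(AVCodecContext *c, AVFrame *outpic,
                            uint8_t *outbuf, int outbuf_size)
{
    static int64_t frame_count = 0;

    /* 0, 1, 2, ... in units of c->time_base (1/25 s in the posted code) */
    outpic->pts = frame_count++;
    int out_size = avcodec_encode_video(c, outbuf, outbuf_size, outpic);

    /* out_size == 0 is not a failure: x264 buffers frames (look-ahead and
     * B-frame delay). Keep feeding frames, then flush at the end by calling
     * avcodec_encode_video(c, outbuf, outbuf_size, NULL) until it returns 0. */
    return out_size;
}

Separately, the sws_getContext() call in the posted code declares its source format as PIX_FMT_RGB8 while the frame was filled with PIX_FMT_RGB32; the two formats would need to match for the RGB-to-YUV conversion to yield a usable picture.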