
Other articles (81)
-
Customizing by adding your logo, banner or background image
5 September 2013 — Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013 — Present the changes in your MediaSPIP, or news about your projects, on your MediaSPIP through the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of the news type, the fields offered by default are: Publication date (customize the publication date) (...)
-
User profiles
12 April 2011 — Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is automatically created when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
The user can access profile editing from their author page; a "Modifier votre profil" (Edit your profile) link in the navigation is (...)
On other sites (9896)
-
FFMPEG on Fedora but PHP compilation conflict
11 October 2013, by UMI — I am running XAMPP with PHP 5.5 on Fedora, from Apache Friends, with the default settings that the installer package applies on Linux.
When I install FFMPEG successfully and try to load it from php.ini, it always says:
[11-Oct-2013 14:05:51 Europe/Berlin] PHP Warning: PHP Startup: ffmpeg: Unable to initialize module
Module compiled with module API=20060613
PHP compiled with module API=20121212
These options need to match
in Unknown on line 0
The only thing confusing me is that, even though I had already installed the XAMPP server (which means I have PHP running), phpize for FFMPEG did not work and I had to install php-devel. Does that mean FFMPEG is configured against a PHP other than the one installed with XAMPP? I am not sure what is happening. Whatever I do, I always receive this error message in the php_error_log file.
Amusingly :) I just ran the command below to see what version of PHP I have, and the result is shocking, because I was under the impression that I have PHP 5.5 installed and running from XAMPP.
[root@localhost ~]# php -v
PHP 5.2.6 (cli) (built: May 8 2008 08:53:44)
Copyright (c) 1997-2008 The PHP Group
Zend Engine v2.2.0, Copyright (c) 1998-2008 Zend Technologies
How can I get this sorted out if I simply install the XAMPP server on Linux and want to install and configure FFMPEG along with it?
-
"error C2400 : inline assembler syntax error in ‘opcode’" pxor compiling ffmpeg with mmx flag enabled
4 October 2013, by Kristofer — I'm trying to compile ffmpeg (Visual Studio 2005) with the MMX flag enabled (HAVE_MMX) but get the following error:
"error C2400: inline assembler syntax error in 'opcode'"
It's complaining about pxor_r2r. Ideas?
[Update]
Jester pointed out that it's probably a problem with the macro:
#define mmx_r2r(op,regs,regd) \
    __asm__ volatile (#op " %" #regs ", %" #regd)
Directly using:
__asm__ pxor mm7 mm7
works.
Adding volatile (as in the macro above) gives the same syntax error as before: in 'opcode' found 'data_type'. Simply removing volatile from the macro does not work either; it gives: in 'opcode' found '('.
Removing the parentheses instead gives: in 'opcode' found 'bad_token'.
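For context, the Visual C++ 2005 inline assembler does not accept GCC-style __asm__ volatile ("...") string blocks at all, which is why every variant of that macro trips error C2400. Below is a hedged sketch, not the ffmpeg source, of what an MSVC-flavoured equivalent of the macro could look like; the names mirror the GCC original, and the destination register is written first because MSVC inline assembly uses Intel operand order:

/* Illustrative MSVC counterpart of the GCC mmx_r2r macro.
   MSVC inline assembly is Intel syntax, so the destination (regd) comes first. */
#define mmx_r2r(op, regs, regd)  __asm { op regd, regs }
#define pxor_r2r(regs, regd)     mmx_r2r(pxor, regs, regd)

void clear_mm7(void)
{
    /* Clears mm7: the MSVC counterpart of the failing GCC-style pxor_r2r(mm7, mm7). */
    pxor_r2r(mm7, mm7);
    /* Leave MMX state before returning to FPU/C code. */
    __asm emms
}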
-
FFmpeg avcodec_decode_video2 decode RTSP H264 HD-video packet to video picture with error
29 May 2018, by Nguyen Ba Thi — I used the FFmpeg library, version 4.0, in a simple C++ program in which a thread receives RTSP H264 video data from an IP camera and displays it in the program window. The code of this thread follows:
DWORD WINAPI GrabbProcess(LPVOID lpParam)
// Grabbing thread
{
DWORD i;
int ret = 0, nPacket=0;
FILE *pktFile;
// Open video file
pFormatCtx = avformat_alloc_context();
if(avformat_open_input(&pFormatCtx, nameVideoStream, NULL, NULL)!=0)
fGrabb=-1; // Couldn't open file
else
// Retrieve stream information
if(avformat_find_stream_info(pFormatCtx, NULL)<0)
fGrabb=-2; // Couldn't find stream information
else
{
// Find the first video stream
videoStream=-1;
for(i=0; i<pFormatCtx->nb_streams; i++)
if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO)
{
videoStream=i;
break;
}
if(videoStream==-1)
fGrabb=-3; // Didn't find a video stream
else
{
// Get a pointer to the codec context for the video stream
pCodecCtxOrig=pFormatCtx->streams[videoStream]->codec;
// Find the decoder for the video stream
pCodec=avcodec_find_decoder(pCodecCtxOrig->codec_id);
if(pCodec==NULL)
fGrabb=-4; // Codec not found
else
{
// Copy context
pCodecCtx = avcodec_alloc_context3(pCodec);
if(avcodec_copy_context(pCodecCtx, pCodecCtxOrig) != 0)
fGrabb=-5; // Error copying codec context
else
{
// Open codec
if(avcodec_open2(pCodecCtx, pCodec, NULL)<0)
fGrabb=-6; // Could not open codec
else
// Allocate video frame for input
pFrame=av_frame_alloc();
// Determine required buffer size and allocate buffer
numBytes=avpicture_get_size(pCodecCtx->pix_fmt, pCodecCtx->width,
pCodecCtx->height);
buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
// Assign appropriate parts of buffer to image planes in pFrame
// Note that pFrame is an AVFrame, but AVFrame is a superset
// of AVPicture
avpicture_fill((AVPicture *)pFrame, buffer, pCodecCtx->pix_fmt,
pCodecCtx->width, pCodecCtx->height);
// Allocate video frame for display
pFrameRGB=av_frame_alloc();
// Determine required buffer size and allocate buffer
numBytes=avpicture_get_size(AV_PIX_FMT_RGB24, pCodecCtx->width,
pCodecCtx->height);
bufferRGB=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
// Assign appropriate parts of buffer to image planes in pFrameRGB
// Note that pFrameRGB is an AVFrame, but AVFrame is a superset
// of AVPicture
avpicture_fill((AVPicture *)pFrameRGB, bufferRGB, AV_PIX_FMT_RGB24,
pCodecCtx->width, pCodecCtx->height);
// initialize SWS context for software scaling to FMT_RGB24
sws_ctx_to_RGB = sws_getContext(pCodecCtx->width,
pCodecCtx->height,
pCodecCtx->pix_fmt,
pCodecCtx->width,
pCodecCtx->height,
AV_PIX_FMT_RGB24,
SWS_BILINEAR,
NULL,
NULL,
NULL);
// Allocate video frame (grayscale YUV420P) for processing
pFrameYUV=av_frame_alloc();
// Determine required buffer size and allocate buffer
numBytes=avpicture_get_size(AV_PIX_FMT_YUV420P, pCodecCtx->width,
pCodecCtx->height);
bufferYUV=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
// Assign appropriate parts of buffer to image planes in pFrameYUV
// Note that pFrameYUV is an AVFrame, but AVFrame is a superset
// of AVPicture
avpicture_fill((AVPicture *)pFrameYUV, bufferYUV, AV_PIX_FMT_YUV420P,
pCodecCtx->width, pCodecCtx->height);
// initialize SWS context for software scaling to FMT_YUV420P
sws_ctx_to_YUV = sws_getContext(pCodecCtx->width,
pCodecCtx->height,
pCodecCtx->pix_fmt,
pCodecCtx->width,
pCodecCtx->height,
AV_PIX_FMT_YUV420P,
SWS_BILINEAR,
NULL,
NULL,
NULL);
RealBsqHdr.biWidth = pCodecCtx->width;
RealBsqHdr.biHeight = -pCodecCtx->height;
}
}
}
}
while ((fGrabb==1)||(fGrabb==100))
{
// Grabb a frame
if (av_read_frame(pFormatCtx, &packet) >= 0)
{
// Is this a packet from the video stream?
if(packet.stream_index==videoStream)
{
// Decode video frame
int len = avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
nPacket++;
// Did we get a video frame?
if(frameFinished)
{
// Convert the image from its native format to YUV
sws_scale(sws_ctx_to_YUV, (uint8_t const * const *)pFrame->data,
pFrame->linesize, 0, pCodecCtx->height,
pFrameYUV->data, pFrameYUV->linesize);
// Convert the image from its native format to RGB
sws_scale(sws_ctx_to_RGB, (uint8_t const * const *)pFrame->data,
pFrame->linesize, 0, pCodecCtx->height,
pFrameRGB->data, pFrameRGB->linesize);
HDC hdc=GetDC(hWndM);
SetDIBitsToDevice(hdc, 0, 0, pCodecCtx->width, pCodecCtx->height,
0, 0, 0, pCodecCtx->height,pFrameRGB->data[0], (LPBITMAPINFO)&RealBsqHdr, DIB_RGB_COLORS);
ReleaseDC(hWndM,hdc);
av_frame_unref(pFrame);
}
}
// Free the packet that was allocated by av_read_frame
av_free_packet(&packet);
}
}
// Free the org frame
av_frame_free(&pFrame);
// Free the RGB frame
av_frame_free(&pFrameRGB);
// Free the YUV frame
av_frame_free(&pFrameYUV);
// Close the codec
avcodec_close(pCodecCtx);
avcodec_close(pCodecCtxOrig);
// Close the video file
avformat_close_input(&pFormatCtx);
avformat_free_context(pFormatCtx);
if (fGrabb==1)
sprintf(tmpstr,"Grabbing Completed %d frames", nCntTotal);
else if (fGrabb==2)
sprintf(tmpstr,"User break on %d frames", nCntTotal);
else if (fGrabb==3)
sprintf(tmpstr,"Can't Grabb at frame %d", nCntTotal);
else if (fGrabb==-1)
sprintf(tmpstr,"Couldn't open file");
else if (fGrabb==-2)
sprintf(tmpstr,"Couldn't find stream information");
else if (fGrabb==-3)
sprintf(tmpstr,"Didn't find a video stream");
else if (fGrabb==-4)
sprintf(tmpstr,"Codec not found");
else if (fGrabb==-5)
sprintf(tmpstr,"Error copying codec context");
else if (fGrabb==-6)
sprintf(tmpstr,"Could not open codec");
i=(UINT) fGrabb;
fGrabb=0;
SetWindowText(hWndM,tmpstr);
ExitThread(i);
return 0;
}
// End Grabbing thread
When the program receives RTSP H264 video data at resolution 704x576, the decoded video pictures are OK. When it receives RTSP H264 HD video data at resolution 1280x720, it looks like the first video picture is decoded OK, but the following pictures are always decoded with some error. Please help me fix this problem!
Here is a brief summary of the problem:
I have an IP camera, model HI3518E_50H10L_S39 (a Chinese product).
The camera can provide an H264 video stream at either resolution 704x576 (with RTSP URI "rtsp://192.168.1.18:554/user=admin_password=tlJwpbo6_channel=1_stream=1.sdp?real_stream") or 1280x720 (with RTSP URI "rtsp://192.168.1.18:554/user=admin_password=tlJwpbo6_channel=1_stream=0.sdp?real_stream").
Using the FFplay utility I can access and display both with good picture quality.
To test grabbing from this camera I have the simple program mentioned above, built in VC-2005. In the "Grabbing thread" the program uses the FFmpeg library version 4.0 to open the camera RTSP stream, retrieve stream information, find the first video stream... and prepare some variables.
The center of this thread is a loop: grab a frame (function av_read_frame), decode it if it is video (function avcodec_decode_video2), convert it to RGB format (function sws_scale), and display it in the program window (GDI function SetDIBitsToDevice).
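As an aside, avcodec_decode_video2 is deprecated in FFmpeg 4.0 in favour of the send/receive API. A minimal sketch of that pattern, reusing the pCodecCtx, pFrame and packet variables from the code above (error handling abbreviated):

// Feed one demuxed packet to the decoder ...
if (avcodec_send_packet(pCodecCtx, &packet) == 0)
{
    // ... then drain every frame it produces (there may be zero, one or several).
    while (avcodec_receive_frame(pCodecCtx, pFrame) == 0)
    {
        // pFrame now holds a decoded picture: convert and display it here,
        // exactly as in the frameFinished branch of the loop above.
    }
}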
When the program runs with the camera RTSP stream at resolution 704x576, I get a good video picture. Here is a sample:
[image: 704x576 sample]
When the program runs with the camera RTSP stream at resolution 1280x720, the first video picture is good:
[image: first picture good at resolution 1280x720]
but then not good:
[image: not good at resolution 1280x720]
It seems that my FFmpeg call to avcodec_decode_video2 cannot fully decode certain packets for some reason.
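A common cause of this kind of corruption with HD RTSP streams is packet loss on the default UDP transport: the larger 1280x720 frames are split across more RTP packets, and a single dropped packet corrupts the whole frame plus every frame that references it. Below is a hedged sketch, assuming the camera also accepts TCP-interleaved RTSP, of passing FFmpeg's "rtsp_transport" option to avformat_open_input; it reuses the pFormatCtx, nameVideoStream and fGrabb variables from the code above and is an illustration, not a guaranteed fix:

AVDictionary *opts = NULL;
// Ask the RTSP demuxer for TCP interleaving instead of UDP,
// so packets of the HD stream are not silently dropped.
av_dict_set(&opts, "rtsp_transport", "tcp", 0);
pFormatCtx = avformat_alloc_context();
if (avformat_open_input(&pFormatCtx, nameVideoStream, NULL, &opts) != 0)
    fGrabb = -1; // Couldn't open stream
av_dict_free(&opts); // Entries not consumed by the demuxer remain here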