
Other articles (77)
-
Configuring language support
15 November 2010, by
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administer" section of the site.
From there, in the navigation menu, you can reach a "Language management" section that lets you enable support for new languages.
Each newly added language can still be deactivated as long as no object has been created in that language. Once one has, the language is greyed out in the configuration and (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.
-
Writing a news item
21 June 2013, by
Present the changes in your MédiaSPIP, or news about your projects, on your MédiaSPIP using the news section.
In spipeo, MédiaSPIP's default theme, news items are displayed at the bottom of the main page, below the editorials.
You can customise the form used to create a news item.
News item creation form: for a document of the news type, the fields offered by default are: publication date (customise the publication date) (...)
On other sites (7200)
-
Is there any control property to fix a video playback speed problem when using ffmpeg to decode on the Qt platform?
16 February 2019, by SoloWang
I want to play a local video file in Qt, using ffmpeg to decode it. Everything is OK except that the playback speed is twice as fast as normal.
The first thing I thought of is that a sampling frequency must be involved, but being new to ffmpeg I don't know how to fix this problem.
Below is my code that reads the frames; can anyone tell me what's wrong with it?

void VideoThread::run()
{
    m_pInFmtCtx = avformat_alloc_context(); // allocate the format context
    char path[] = "d:/test.mp4";
    // open the specific file
    if(avformat_open_input(&m_pInFmtCtx, path, NULL, NULL) != 0)
    {
        qDebug()<<"get rtsp failed";
        return;
    }
    else
    {
        qDebug()<<"get rtsp success";
    }
    if(avformat_find_stream_info(m_pInFmtCtx, NULL) < 0)
    {
        qDebug()<<"could not find stream information";
        return;
    }
    int nVideoIndex = -1;
    for(int i = 0; i < m_pInFmtCtx->nb_streams; i++)
    {
        if(m_pInFmtCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
        {
            nVideoIndex = i;
            break;
        }
    }
    if(nVideoIndex == -1)
    {
        qDebug()<<"could not find video stream";
        return;
    }
    qDebug("---------------- File Information ---------------");
    m_pCodecCtx = m_pInFmtCtx->streams[nVideoIndex]->codec;
    m_pCodec = avcodec_find_decoder(m_pCodecCtx->codec_id);
    if(!m_pCodec)
    {
        qDebug()<<"could not find codec";
        return;
    }
    // start the decoder
    if(avcodec_open2(m_pCodecCtx, m_pCodec, NULL) < 0)
    {
        qDebug("Could not open codec.\n");
        return;
    }
    // allocate space for storing the decoded and converted frames
    m_pFrame = av_frame_alloc();
    m_pFrameRGB = av_frame_alloc();
    m_pOutBuf = (uint8_t*)av_malloc(avpicture_get_size(AV_PIX_FMT_RGB32, m_pCodecCtx->width, m_pCodecCtx->height));
    avpicture_fill((AVPicture*)m_pFrameRGB, m_pOutBuf, AV_PIX_FMT_RGB32, m_pCodecCtx->width, m_pCodecCtx->height);
    // for color conversion from YUV to RGB
    struct SwsContext *pImgCtx = sws_getContext(m_pCodecCtx->width, m_pCodecCtx->height, m_pCodecCtx->pix_fmt,
        m_pCodecCtx->width, m_pCodecCtx->height, AV_PIX_FMT_RGB32, SWS_BICUBIC, NULL, NULL, NULL);
    int nSize = m_pCodecCtx->width * m_pCodecCtx->height;
    m_pPacket = (AVPacket *)av_malloc(sizeof(AVPacket));
    if(av_new_packet(m_pPacket, nSize) != 0)
    {
        qDebug()<<"new packet failed";
    }
    // isInterruptionRequested() is a flag that determines whether the thread should stop
    // read each frame from the video file
    while (!isInterruptionRequested())
    {
        int nGotPic = 0;
        if(av_read_frame(m_pInFmtCtx, m_pPacket) >= 0)
        {
            if(m_pPacket->stream_index == nVideoIndex)
            {
                // avcodec_decode_video2() turns a packet into a frame
                if(avcodec_decode_video2(m_pCodecCtx, m_pFrame, &nGotPic, m_pPacket) < 0)
                {
                    qDebug()<<"decode failed";
                    return;
                }
                if(nGotPic)
                {
                    // convert to RGB
                    sws_scale(pImgCtx, (const uint8_t* const*)m_pFrame->data,
                        m_pFrame->linesize, 0, m_pCodecCtx->height, m_pFrameRGB->data,
                        m_pFrameRGB->linesize);
                    // save to a QImage for later use
                    QImage *pImage = new QImage((uchar*)m_pOutBuf, m_pCodecCtx->width, m_pCodecCtx->height, QImage::Format_RGB32);
                }
            }
        }
        av_free_packet(m_pPacket);
        msleep(5);
    }
    exec();
}
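One likely reason for the doubled speed, by the way, is that the loop above paces itself with a fixed msleep(5) instead of with the stream's actual frame rate. Below is a minimal sketch of per-frame pacing, assuming the same m_pInFmtCtx and nVideoIndex as in the code above; the delay could equally be derived from each decoded frame's pts and the stream's time_base.

// Hedged sketch: derive the per-frame delay from the stream's average frame rate.
AVRational fr = m_pInFmtCtx->streams[nVideoIndex]->avg_frame_rate;
int nDelayMs = 40;                       // fall back to ~25 fps if the rate is unknown
if(fr.num > 0 && fr.den > 0)
    nDelayMs = (1000 * fr.den) / fr.num; // e.g. 40 ms per frame at 25 fps

while(!isInterruptionRequested())
{
    // ... read, decode and convert one frame exactly as above ...
    msleep(nDelayMs);                    // sleep one frame interval instead of a fixed 5 ms
}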
-
FFMPEG libav: Control when video frame is captured
24 May 2019, by Michael Murray
I am using the libav library to record video from a CSI camera on a Raspberry Pi. I can successfully record the camera data to a file; however, I now want to synchronise the frames between two cameras. I've set up a mechanism to synchronise when frames are captured, but I assumed that when I called
av_read_frame(input_format_context, &packet)
the next frame would be captured. However, it turns out that the video is streamed into some buffer, and when the buffer is full, it waits until the av_read_frame method is called, which removes data from the buffer, thus allowing another frame to be captured. This is not the behaviour I want, as I need to control exactly when the frame is captured. Is there a mechanism in libav that I could use to produce this kind of behaviour? Or am I going to have to use a different library for capturing video from the CSI device?
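As far as I know, libav has no call that makes av_read_frame itself trigger a capture; the closest workaround is to open the device with explicit demuxer options so the delivery rate, and therefore the buffering, is at least predictable. Here is a rough sketch under the assumption that the CSI camera is exposed as a V4L2 device (e.g. via the bcm2835-v4l2 driver); the "framerate" and "video_size" option names are those commonly accepted by FFmpeg's v4l2 input device, and the device path is only an example.

#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>

// Hedged sketch: open a V4L2 device with explicit options so frames arrive at a known rate.
static int open_csi_camera(AVFormatContext **fmt_ctx)
{
    AVDictionary *opts = NULL;
    avdevice_register_all();                              // register the v4l2 input device
    AVInputFormat *ifmt = av_find_input_format("v4l2");
    av_dict_set(&opts, "framerate", "10", 0);             // ask the device for 10 fps
    av_dict_set(&opts, "video_size", "1280x720", 0);
    int ret = avformat_open_input(fmt_ctx, "/dev/video0", ifmt, &opts);
    av_dict_free(&opts);
    return ret;  // < 0 on failure; true per-frame triggering needs the V4L2/MMAL API directly
}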
-
How to fix: a GUI application that controls a simple ffmpeg command with Python
9 October 2019, by maoca
I want to make a graphical application with only two buttons (start/stop) that lets me launch a subprocess (start) and stop it (stop).
(I'm using Python3, PyQt5 and ffmpeg)
The process captures the screen to a video and saves it to an mp4, using an ffmpeg command launched with Popen.
For the command to finish its output cleanly, ffmpeg expects a 'q', which I write to its stdin. In a simple script this works for me, but I can't get it to work from the buttons.
My knowledge is very basic and, however much information I look for, I don't understand what I am doing wrong; I'd appreciate any comments that help me move on.
This is my code:
import sys
import subprocess
from PyQt5.QtWidgets import QApplication, QWidget, QPushButton

class Ventana(QWidget):
    def __init__(self):
        super().__init__()
        # Button 1
        pybutton = QPushButton('REC', self)
        pybutton.clicked.connect(self.clickMethodB1)
        pybutton.resize(50, 32)
        pybutton.move(50, 50)
        # Button 2
        pybutton = QPushButton('STOP', self)
        pybutton.clicked.connect(self.clickMethodB2)
        pybutton.resize(100, 32)
        pybutton.move(150, 50)
        self.initUI()

    def initUI(self):
        self.setGeometry(300, 300, 300, 220)
        self.setWindowTitle('FFMPEG')
        self.move(800, 400)
        self.show()

    def clickMethodB1(self):
        global ffmpeg
        filename_mp4 = 'c://tmp//output.mp4'
        print('REC')
        command = 'ffmpeg -f dshow -i video="screen-capture-recorder" ' + filename_mp4
        ffmpeg = subprocess.Popen(command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, encoding='utf-8', shell=True)

    def clickMethodB2(self):
        print('STOP')
        ffmpeg.stdin.write(str('q'))

if __name__ == '__main__':
    app = QApplication(sys.argv)
    ex = Ventana()
    sys.exit(app.exec_())
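For what it's worth, one likely reason the simple script works but the button does not is that the 'q' written in clickMethodB2 stays in the pipe's buffer and never reaches ffmpeg. A minimal sketch of a stop handler that flushes stdin, assuming the same global ffmpeg Popen object created in clickMethodB1 (this method would replace clickMethodB2 inside Ventana):

def clickMethodB2(self):
    print('STOP')
    # write the quit key, then flush so it actually reaches ffmpeg's stdin
    ffmpeg.stdin.write('q')
    ffmpeg.stdin.flush()
    # optionally wait for ffmpeg to finalise the mp4 file
    ffmpeg.wait(timeout=10)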