
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (39)
-
The farm's regular Cron tasks
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance of the mutualised farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...) -
Authorisations overridden by plugins
27 April 2010, by
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page -
Encoding and processing into web-friendly formats
13 April 2011, by
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in MP4, Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded in MP3 and Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed in order to retrieve the data needed for search engine detection, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
On other sites (7971)
-
using Qt's QProcess as popen (with ffmpeg rawvideo)
9 October 2019, by Alex
I inserted some code into a video application to export video through ffmpeg via stdin (rawvideo rgba format). To quickly test that it worked I used popen(); the tests went well, and since the application is written with Qt I thought of modifying the patch to use QProcess and ->write().
The application shows no errors and works properly, but the generated video files are not playable with either vlc or mplayer, while those generated with popen() play perfectly with both. I have the feeling that ->close() or ->terminate() does not properly close ffmpeg, and consequently the file, but I don't know how to verify this, nor have I found alternative ways to end the executed command; besides, ->waitForBytesWritten() should wait for the data to be written. Suggestions? Am I doing something wrong?
(Obviously I can't prepare a testable example; it would take me more time than the patch took.) Below is the code I entered; the #else case is the Qt code.
Initialization
#if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
    pipe_frame.file = popen("/tmp/ffmpeg-rawpipe.sh", "w");
    if (pipe_frame.file == NULL) {
        return false;
    }
#else
    pipe_frame.qproc = new QProcess;
    pipe_frame.qproc->start("/tmp/ffmpeg-rawpipe.sh", QIODevice::WriteOnly);
    if (!pipe_frame.qproc->waitForStarted()) {
        return false;
    }
#endif

Writing a frame
#if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
    fwrite(pipe_frame.data, pipe_frame.width*4*pipe_frame.height, 1, pipe_frame.file);
#else
    qint64 towrite = pipe_frame.width*4*pipe_frame.height,
           written = 0, partial;
    while (written < towrite) {
        partial = pipe_frame.qproc->write(&pipe_frame.data[written], towrite-written);
        pipe_frame.qproc->waitForBytesWritten(-1);
        written += partial;
    }
#endif

Termination
#if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
    pclose(pipe_frame.file);
#else
    pipe_frame.qproc->terminate();
    //pipe_frame.qproc->close();
#endif

edit
ffmpeg-rawpipe.sh
#!/bin/sh
exec ffmpeg-cuda -y -f rawvideo -s 1920x1080 -pix_fmt rgba -r 25 -i - -an -c:v h264_nvenc \
-cq:v 19 \
-profile:v high /tmp/test.mp4

I made some changes. I added the unbuffered flag when opening the process:
pipe_frame.qproc->start("/tmp/ffmpeg-rawpipe.sh", QIODevice::WriteOnly|QIODevice::Unbuffered);
and accordingly simplified the write:
qint64 towrite = pipe_frame.width*4*pipe_frame.height;
pipe_frame.qproc->write(pipe_frame.data, towrite);
pipe_frame.qproc->waitForBytesWritten(-1);

I added a closeWriteChannel() before closing the application, hoping that closing ffmpeg's stdin pipe lets it end properly (just in case; I'm not sure it doesn't):
pipe_frame.qproc->waitForBytesWritten(-1);
pipe_frame.qproc->closeWriteChannel();
//pipe_frame.qproc->terminate();
pipe_frame.qproc->close();

But nothing changes: the mp4 file is created and contains data, but from the mplayer log I see that it is misinterpreted; the video format is not recognized and it looks for an audio stream that is not there.
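Not part of the original post: a minimal sketch of one possible termination sequence, under the assumption that the unplayable files come from ffmpeg being stopped before it can write the MP4 trailer. Closing the write channel signals end-of-input, and QProcess::waitForFinished() lets ffmpeg exit on its own; terminate() alone can kill it mid-write.

// Hypothetical termination sequence (not from the post); pipe_frame.qproc is
// assumed to be the QProcess created in the initialization block above.
pipe_frame.qproc->waitForBytesWritten(-1);  // flush anything still queued
pipe_frame.qproc->closeWriteChannel();      // EOF on ffmpeg's stdin: it stops reading
// Give ffmpeg time to flush the encoder and write the MP4 trailer before the
// QProcess object goes away; -1 waits without a timeout.
if (!pipe_frame.qproc->waitForFinished(-1)) {
    pipe_frame.qproc->terminate();          // last resort only
}
delete pipe_frame.qproc;
pipe_frame.qproc = nullptr;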
-
avformat/matroskaenc: Remove unnecessary avio_tell(), avio_seek()
22 January 2020, by Andreas Rheinhardt
avformat/matroskaenc: Remove unnecessary avio_tell(), avio_seek()
avio_close_dyn_buf() has a bug: when the write pointer does not point to
the end of the written data when calling it (i.e. when one has performed
a seek back to update already written data), it would not add padding to
the end of the buffer, but to the current position, overwriting other
data; furthermore the reported size would be wrong (off by the amount of
data it has overwritten with padding).
In order not to run into this when updating already written elements or
elements for which size has only been reserved, the Matroska muxer would
first record the current position of the dynamic buffer, then seek to
the desired position, perform the update and seek back to the earlier
position.
But now that end_ebml_master_crc32() does not make use of
avio_close_dyn_buf() any more, this is no longer necessary.
Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>
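An illustrative fragment (not the actual matroskaenc code) of the save/seek/update/seek-back pattern the commit removes; pb and size_offset are placeholders for the muxer's dynamic-buffer AVIOContext and a previously reserved size field.

// Illustrative only: update an already written size placeholder while keeping
// the write pointer at the end of the buffer, so that a later
// avio_close_dyn_buf() pads (and measures) the right place.
extern "C" {
#include <libavformat/avio.h>
}
#include <cstdio>

static void update_reserved_size(AVIOContext *pb, int64_t size_offset, unsigned int new_size) {
    int64_t saved_pos = avio_tell(pb);      // remember the current end of the buffer
    avio_seek(pb, size_offset, SEEK_SET);   // go back to the placeholder
    avio_wb32(pb, new_size);                // overwrite it with the real size
    avio_seek(pb, saved_pos, SEEK_SET);     // return to the earlier position
}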
-
What are the steps to apply a filter to a video using an ffmpeg C program with the Android NDK?
13 July 2016, by Rajkumar
I am a newbie to ffmpeg with Android. I have an input video birds.mp4 with the following settings:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'birds.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf54.59.106
  Duration: 00:00:21.60, start: 0.000000, bitrate: 574 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 320x176 [SAR 44:45 DAR 16:9], 440 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
    Metadata:
      handler_name    : SoundHandler

I have compiled ffmpeg as a *.so file for Android and written a JNI wrapper. Now I want to apply a filter to that input video and save it to output.mp4 with the same settings as shown. I know the basic terminology: encoding, decoding, muxing, transcoding.
I want to know how to do this task and what techniques are involved. I am using the ffmpeg 3.1 C source code grabbed from GitHub; I referred to a sample, but it did not guide me.
If anybody has experienced this problem, please guide me in the right direction.
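Not part of the original question: a rough sketch of the libavfilter steps involved (build a filter graph with a buffer source and a buffer sink, push decoded frames in, pull filtered frames out, then re-encode them). Names such as frame and filt_frame are placeholders for the decode/encode loop you still need; the stream parameters are taken from the birds.mp4 probe output above, and "hue=s=0" (grayscale) stands in for whatever filter you want.

// Hypothetical sketch: wire a filter between decode and encode.
extern "C" {
#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>
#include <libavutil/mem.h>
}
#include <cstdio>

static AVFilterGraph   *graph;
static AVFilterContext *src_ctx, *sink_ctx;

static int init_filter_graph(void) {
    graph = avfilter_graph_alloc();

    // The buffer source must describe the decoded frames: size, pixel format,
    // time base and sample aspect ratio (values from the probe output above).
    char args[256];
    snprintf(args, sizeof(args),
             "video_size=320x176:pix_fmt=%d:time_base=1/12800:pixel_aspect=44/45",
             AV_PIX_FMT_YUV420P);
    int ret = avfilter_graph_create_filter(&src_ctx, avfilter_get_by_name("buffer"),
                                           "in", args, NULL, graph);
    if (ret < 0) return ret;
    ret = avfilter_graph_create_filter(&sink_ctx, avfilter_get_by_name("buffersink"),
                                       "out", NULL, NULL, graph);
    if (ret < 0) return ret;

    // Connect the textual filter description between the source and the sink.
    AVFilterInOut *outputs = avfilter_inout_alloc();  // the source's output pad
    AVFilterInOut *inputs  = avfilter_inout_alloc();  // the sink's input pad
    outputs->name = av_strdup("in");  outputs->filter_ctx = src_ctx;  outputs->pad_idx = 0; outputs->next = NULL;
    inputs->name  = av_strdup("out"); inputs->filter_ctx  = sink_ctx; inputs->pad_idx  = 0; inputs->next  = NULL;
    ret = avfilter_graph_parse_ptr(graph, "hue=s=0", &inputs, &outputs, NULL);
    avfilter_inout_free(&inputs);
    avfilter_inout_free(&outputs);
    if (ret < 0) return ret;
    return avfilter_graph_config(graph, NULL);
}

// Inside the decode loop: push each decoded frame, pull the filtered frames,
// then hand them to the encoder/muxer that writes output.mp4.
static int filter_frame(AVFrame *frame, AVFrame *filt_frame) {
    int ret = av_buffersrc_add_frame(src_ctx, frame);
    if (ret < 0) return ret;
    while ((ret = av_buffersink_get_frame(sink_ctx, filt_frame)) >= 0) {
        // ... encode filt_frame and write the packet here ...
        av_frame_unref(filt_frame);
    }
    return (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) ? 0 : ret;
}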