
Other articles (63)
- Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.
- Supported formats
28 January 2010
The following commands give information about the formats and codecs handled by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To begin with, we (...)
- The plugin: Podcasts
14 July 2010
The podcasting problem is once again a problem that reveals the state of standardisation of data transport on the Internet.
Two interesting formats exist: the one developed by Apple, strongly geared towards iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
File types supported in the feeds
Apple's format only allows the following types in its feeds: .mp3 audio/mpeg; .m4a audio/x-m4a; .mp4 (...)
On other sites (8551)
- using Qt's QProcess as popen (with ffmpeg rawvideo)
9 October 2019, by Alex

I inserted some code in a video application to export via ffmpeg over stdin (rawvideo, rgba format). To quickly test that it worked I used popen(); the tests went well, and since the application is written with Qt I thought of modifying the patch to use QProcess and ->write().

The application shows no errors and works properly, but the generated video files are not playable with either vlc or mplayer, while those generated with popen() work perfectly with both. I have the feeling that ->close() or ->terminate() does not properly close ffmpeg, and consequently the file, but I don't know how to verify this, nor have I found alternative ways to end the executed command; besides, ->waitForBytesWritten() should wait for the data to be written. Suggestions? Am I doing something wrong?

(Obviously I can't prepare a testable example; it would take me more time than the patch took.) Below is the code I entered; the #else case is the Qt code.

Initialization
#if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
pipe_frame.file = popen("/tmp/ffmpeg-rawpipe.sh", "w");
if (pipe_frame.file == NULL) {
    return false;
}
#else
pipe_frame.qproc = new QProcess;
pipe_frame.qproc->start("/tmp/ffmpeg-rawpipe.sh", QIODevice::WriteOnly);
if (!pipe_frame.qproc->waitForStarted()) {
    return false;
}
#endif

Writing a frame
#if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
fwrite(pipe_frame.data, pipe_frame.width*4*pipe_frame.height, 1, pipe_frame.file);
#else
qint64 towrite = pipe_frame.width*4*pipe_frame.height,
       written = 0, partial;
while (written < towrite) {
    partial = pipe_frame.qproc->write(&pipe_frame.data[written], towrite-written);
    pipe_frame.qproc->waitForBytesWritten(-1);
    written += partial;
}
#endif

Termination
#if defined(EXPORT_POPEN) && EXPORT_POPEN == 1
pclose(pipe_frame.file);
#else
pipe_frame.qproc->terminate();
//pipe_frame.qproc->close();
#endif

Edit:
ffmpeg-rawpipe.sh
#!/bin/sh
exec ffmpeg-cuda -y -f rawvideo -s 1920x1080 -pix_fmt rgba -r 25 -i - -an -c:v h264_nvenc \
-cq:v 19 \
-profile:v high /tmp/test.mp4

I made some changes: I added the unbuffered flag to the open
pipe_frame.qproc->start("/tmp/ffmpeg-rawpipe.sh", QIODevice::WriteOnly|QIODevice::Unbuffered);
And therefore simplified the write
qint64 towrite = pipe_frame.width*4*pipe_frame.height;
pipe_frame.qproc->write(pipe_frame.data, towrite);
pipe_frame.qproc->waitForBytesWritten(-1);

I added a closeWriteChannel() before closing the application (hoping that shutting ffmpeg's stdin pipe makes it end properly, just in case; I'm not sure it doesn't):
pipe_frame.qproc->waitForBytesWritten(-1);
pipe_frame.qproc->closeWriteChannel();
//pipe_frame.qproc->terminate();
pipe_frame.qproc->close();

But nothing changes: the mp4 file is created and contains data, but from the mplayer log I see that it is misinterpreted; the video format is not recognized and it looks for audio that is not there.
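For reference, a minimal sketch of the termination sequence being discussed (not taken from the question, and assuming the files are unplayable because ffmpeg is killed before it can write the MP4 trailer): flush, close the write channel to send EOF on ffmpeg's stdin, then wait for the process to exit instead of calling terminate() or close(). The helper name finishExport is made up for illustration.

#include <QProcess>

// Hypothetical helper: let ffmpeg drain its input, finish encoding and
// finalize the container before the QProcess object is discarded.
static bool finishExport(QProcess *proc)
{
    proc->waitForBytesWritten(-1);   // flush any frames still queued in QProcess
    proc->closeWriteChannel();       // EOF on ffmpeg's stdin
    if (!proc->waitForFinished(-1))  // block until ffmpeg exits on its own
        return false;
    return proc->exitStatus() == QProcess::NormalExit && proc->exitCode() == 0;
}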
- avformat/matroskaenc: Remove unnecessary avio_tell(), avio_seek()
22 January 2020, by Andreas Rheinhardt

avformat/matroskaenc: Remove unnecessary avio_tell(), avio_seek()
avio_close_dyn_buf() has a bug: When the write pointer does not point to
the end of the written data when calling it (i.e. when one has performed
a seek back to update already written data), it would not add padding to
the end of the buffer, but to the current position, overwriting other
data; furthermore the reported size would be wrong (off by the amount of data it has overwritten with padding).

In order not to run into this when updating already written elements or
elements for which size has only been reserved, the Matroska muxer would
first record the current position of the dynamic buffer, then seek to
the desired position, perform the update and seek back to the earlier
position.But now that end_ebml_master_crc32() does not make use of
avio_close_dyn_buf() any more, this is no longer necessary.

Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>
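As an illustrative sketch (not FFmpeg source) of the workaround pattern described above: on a dynamic buffer, remember the current write position with avio_tell(), seek back to patch an already written field, then restore the position so that any padding added when the buffer is closed lands after the real data. A fixed 32-bit size field and the helper name patch_reserved_size are assumed here for simplicity.

extern "C" {
#include <libavformat/avio.h>
}

// Hypothetical helper: update a previously written 32-bit size field inside a
// dynamic buffer without leaving the write pointer in the middle of the data.
static void patch_reserved_size(AVIOContext *dyn_buf, int64_t size_pos, uint32_t size)
{
    int64_t end_pos = avio_tell(dyn_buf);    // remember where writing stopped
    avio_seek(dyn_buf, size_pos, SEEK_SET);  // jump back to the reserved field
    avio_wb32(dyn_buf, size);                // overwrite it with the real value
    avio_seek(dyn_buf, end_pos, SEEK_SET);   // restore the end so a later
                                             // avio_close_dyn_buf() pads after the data
}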
- What are the steps to apply a filter to video using an ffmpeg C program with the Android NDK?
13 July 2016, by Rajkumar

I am a newbie to ffmpeg on Android. I have an input video birds.mp4 with the following settings:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'birds.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf54.59.106
Duration: 00:00:21.60, start: 0.000000, bitrate: 574 kb/s
Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 320x176 [SAR 44:45 DAR 16:9], 440 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
Metadata:
handler_name : VideoHandler
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
Metadata:
handler_name : SoundHandler

I have compiled ffmpeg as a *.so file for Android and wrote a JNI wrapper. Now I want to apply a filter to that input video and save it to output.mp4 with the same settings as shown. I know the basic terminology: encoding, decoding, muxing, transcoding.
I want to know how to do this task and what techniques are involved. I use the ffmpeg 3.1 C source code grabbed from GitHub. I referred to this sample, but it doesn't guide me.
If anybody has experienced this problem, please point me in the right direction.
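As a rough sketch of the usual approach (modelled on FFmpeg's doc/examples/filtering_video.c, not a complete transcoder): decode each frame, push it through a libavfilter graph built from a filter string, then encode and mux the filtered frames. The graph setup below assumes an already opened decoder context dec_ctx and the input stream's time base; the helper name init_filter_graph and the example filter strings are made up for illustration, and on FFmpeg 3.x avfilter_register_all() must be called once at startup before this runs.

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>
#include <libavutil/mem.h>
}
#include <cstdio>

// Hypothetical helper: build the graph "buffer -> filter_descr -> buffersink".
// filter_descr could be e.g. "hue=s=0" or "scale=320:176,transpose=cclock".
static int init_filter_graph(AVCodecContext *dec_ctx, AVRational time_base,
                             const char *filter_descr,
                             AVFilterGraph **graph_out,
                             AVFilterContext **src_out, AVFilterContext **sink_out)
{
    char args[512];
    int ret;
    AVFilterGraph *graph = avfilter_graph_alloc();
    AVFilterInOut *outputs = avfilter_inout_alloc();
    AVFilterInOut *inputs  = avfilter_inout_alloc();
    const AVFilter *buffersrc  = avfilter_get_by_name("buffer");
    const AVFilter *buffersink = avfilter_get_by_name("buffersink");

    // Source: decoded frames enter the graph here, described by the decoder parameters.
    snprintf(args, sizeof(args),
             "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
             dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
             time_base.num, time_base.den,
             dec_ctx->sample_aspect_ratio.num, dec_ctx->sample_aspect_ratio.den);
    avfilter_graph_create_filter(src_out, buffersrc, "in", args, NULL, graph);

    // Sink: filtered frames are pulled from here and handed to the encoder.
    avfilter_graph_create_filter(sink_out, buffersink, "out", NULL, NULL, graph);

    // Wire the user-supplied filter chain between "in" and "out".
    outputs->name = av_strdup("in");
    outputs->filter_ctx = *src_out;
    outputs->pad_idx = 0;
    outputs->next = NULL;
    inputs->name = av_strdup("out");
    inputs->filter_ctx = *sink_out;
    inputs->pad_idx = 0;
    inputs->next = NULL;

    ret = avfilter_graph_parse_ptr(graph, filter_descr, &inputs, &outputs, NULL);
    if (ret >= 0)
        ret = avfilter_graph_config(graph, NULL);

    avfilter_inout_free(&inputs);
    avfilter_inout_free(&outputs);
    *graph_out = graph;
    // Afterwards: av_buffersrc_add_frame() feeds decoded frames into the graph,
    // av_buffersink_get_frame() retrieves filtered frames for encoding.
    return ret;
}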