
Media (1)
-
The conservation of net art in museums: the strategies at work
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (93)
-
Media upload and themes via FTP
31 May 2013 — MediaSPIP also processes media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
From the start you will find the following folders in your FTP space: config/ (the site's configuration folder); IMG/ (media already processed and online on the site); local/ (the site's cache directory); themes/ (custom themes and style sheets); tmp/ (working folder) (...)
-
Keeping control of your media in your hands
13 April 2011 — The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
General document management
13 May 2011 — MediaSPIP never modifies the original document that was uploaded.
For each uploaded document it performs two successive operations: it creates an additional version that can easily be viewed online, while leaving the original available for download in case it cannot be read in a web browser; and it extracts the original document's metadata to describe the file textually.
The tables below explain what MediaSPIP can do (...)
On other sites (9845)
-
Pause/resume ffmpeg transcoding process
27 January 2014, by XXX — In my Android application I use an ffmpeg library written in C++. I use the ffmpeg command
ffmpeg -y -i in.mp4 -s 320x240 -c:a copy out.mp4
to reduce the size of media files on Android. Everything works well, but if the file is too large the smartphone's battery heats up. So my question is: is it possible to stop an ffmpeg process and continue it later? When converting a large file, I would like to be able to resume from where I left off. Or is there some other way to avoid heating up the battery? Perhaps this: cut the file into pieces, reduce each piece separately, and join them back together?
UPDATE
I found this Pause any process you want in Linux. So I decided to try to apply it on Android.
public static void pause()
{
    try
    {
        Process sh = Runtime.getRuntime().exec("su", null, null);
        OutputStream os = sh.getOutputStream();
        os.write(("sudo -kill -STOP (pidof Thread-10)").getBytes("ASCII"));
        Log.e("!!", "process stopped!");
        os.flush();
        os.close();
        sh.waitFor();
    }
    catch (Throwable t) { t.printStackTrace(); }
}

But the process doesn't stop.
Thread-10 is the name of the ffmpeg transcoding process. What's wrong?
-
What's wrong with my use of timestamps/timebases for frame seeking/reading using libav (ffmpeg) ?
17 September 2013, by mtree — So I want to grab a frame from a video at a specific time, using libav, for use as a thumbnail.
What I'm using is the following code. It compiles and works fine (in regards to retrieving a picture at all), yet I'm having a hard time getting it to retrieve the right picture.
I simply can't get my head around libav's less-than-clear use of multiple time bases per video, specifically figuring out which functions expect or return which type of time base.
The docs were of basically no help whatsoever, unfortunately. SO to the rescue ?
#define ABORT(x) do {fprintf(stderr, x); exit(1);} while(0)
av_register_all();
AVFormatContext *format_context = ...;
AVCodec *codec = ...;
AVStream *stream = ...;
AVCodecContext *codec_context = ...;
int stream_index = ...;
// open codec_context, etc.
AVRational stream_time_base = stream->time_base;
AVRational codec_time_base = codec_context->time_base;
printf("stream_time_base: %d / %d = %.5f\n", stream_time_base.num, stream_time_base.den, av_q2d(stream_time_base));
printf("codec_time_base: %d / %d = %.5f\n\n", codec_time_base.num, codec_time_base.den, av_q2d(codec_time_base));
AVFrame *frame = avcodec_alloc_frame();
printf("duration: %lld @ %d/sec (%.2f sec)\n", format_context->duration, AV_TIME_BASE, (double)format_context->duration / AV_TIME_BASE);
printf("duration: %lld @ %d/sec (stream time base)\n\n", format_context->duration / AV_TIME_BASE * stream_time_base.den, stream_time_base.den);
printf("duration: %lld @ %d/sec (codec time base)\n", format_context->duration / AV_TIME_BASE * codec_time_base.den, codec_time_base.den);
double request_time = 10.0; // 10 seconds. Video's total duration is ~20sec
int64_t request_timestamp = request_time / av_q2d(stream_time_base);
printf("requested: %.2f (sec)\t-> %2lld (pts)\n", request_time, request_timestamp);
av_seek_frame(format_context, stream_index, request_timestamp, 0);
AVPacket packet;
int frame_finished;
do {
if (av_read_frame(format_context, &packet) < 0) {
break;
} else if (packet.stream_index != stream_index) {
av_free_packet(&packet);
continue;
}
avcodec_decode_video2(codec_context, frame, &frame_finished, &packet);
} while (!frame_finished);
// do something with frame
int64_t received_timestamp = frame->pkt_pts;
double received_time = received_timestamp * av_q2d(stream_time_base);
printf("received: %.2f (sec)\t-> %2lld (pts)\n\n", received_time, received_timestamp);

Running this with a test movie file I get this output:
stream_time_base: 1 / 30000 = 0.00003
codec_time_base: 50 / 2997 = 0.01668
duration: 20062041 @ 1000000/sec (20.06 sec)
duration: 600000 @ 30000/sec (stream time base)
duration: 59940 @ 2997/sec (codec time base)
requested: 10.00 (sec) -> 300000 (pts)
received: 0.07 (sec) -> 2002 (pts)

The times don't match. What's going on here? What am I doing wrong?
While searching for clues I stumbled upon this this statement from the libav-users mailing list…
[...] packet PTS/DTS are in units of the format context's time_base,
where the AVFrame->pts value is in units of the codec context's time_base.

In other words, the container can have (and usually does) a different
time_base than the codec. Most libav players don't bother using the
codec's time_base or pts since not all codecs have one, but most
containers do. (This is why the dranger tutorial says to ignore AVFrame->pts.)

…which confused me even more, given that I couldn't find any such mention in the official docs.
Anyway, I replaced…
double received_time = received_timestamp * av_q2d(stream_time_base);
…with…
double received_time = received_timestamp * av_q2d(codec_time_base);
…and the output changed to this…
...
requested: 10.00 (sec) -> 300000 (pts)
received: 33.40 (sec) -> 2002 (pts)

Still no match. What's wrong?
-
Revision 22344: Backport of r22343: Fix #3494: when the earlier drafting date is used...
26 July 2015, by cedric — traiter() is made more robust by checking the edit permission already computed in charger() before taking the request values into account, which removes any risk of the form being manipulated and HTML being injected through the inputs
+ Backport of r22338 along the way