
Media (2)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
-
Map of the Schillerkiez
13 May 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (67)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents a few of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page. -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
From upload to the final video [standalone version]
31 January 2010, by
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are performed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
On other sites (12445)
-
If I pass this code in the Windows console it works, but when I emulate the Windows console in node.js the code doesn't work and returns an unclear error
22 December 2016, by Maxim Cherevatov
I have this code:
cmd.get(
    'trimp3 ant.mp3 ant2.mp3 00:00 00:20',
    function (data) {
        console.log('the node-cmd cloned dir contains these files:\n\n', data);
    }
);

If I run this command in the Windows console it works well!
But when I emulate the Windows console in node.js this code does not work and returns an unclear error. The garbled bytes are the Russian-localised cmd.exe message, mangled by the CP866 console encoding; it means:
"ffmpeg" is not recognized as an internal or external command, operable program or batch file.
To emulate the console I use node-cmd.
-
android ffmpeg halfninja av_open_input_file returns -2 (no such file or directory)
20 January 2012, by cdavidyoung
I have built ffmpeg for Android using the code and method described at
https://github.com/halfninja/android-ffmpeg-x264
using Ubuntu running in VirtualBox on Windows. I then copied libvideokit.so into the Project\libs\armeabi folder of a Windows copy of the provided projects. From there I was able to run ProjectTest from Eclipse on my Android device. I can see the ffmpeg code being executed, but when it gets to the point of opening the input file it gives me the indicated error. I have noticed some discussion of this problem at
but the solutions have not helped, since the file protocol is enabled in this build; I also tried putting "file:" in front of the filepath, to no avail. For completeness I tried setting minimal_featureset=0 to enable all the defaults, but this gives me the same error. Below is a snapshot of the logcat from Eclipse showing the output from Videokit, with an extra call to LOGE to display the result of av_open_input_file. Any suggestions of things to try would be greatly appreciated.
10-23 11:57:33.888: DEBUG/Videokit(4830): run() called
10-23 11:57:33.888: DEBUG/Videokit(4830): run passing off to main()
10-23 11:57:33.904: DEBUG/Videokit(4830): main(): registering all modules
10-23 11:57:33.927: DEBUG/Videokit(4830): main(): registered everything
10-23 11:57:33.927: DEBUG/Videokit(4830): main(): initting opts
10-23 11:57:33.943: DEBUG/Videokit(4830): main(): initted opts.
10-23 11:57:33.943: ERROR/Videokit(4830): ffmpeg version N-30996-gf925b24, Copyright (c) 2000-2011 the FFmpeg developers
10-23 11:57:33.943: ERROR/Videokit(4830): built on Oct 21 2011 13:54:03 with gcc 4.4.3
10-23 11:57:33.943: ERROR/Videokit(4830): configuration: --enable-cross-compile --arch=arm5te --enable-armv5te --target-os=linux --disable-stripping --prefix=../output --disable-neon --enable-version3 --disable-shared --enable-static --enable-gpl --enable-memalign-hack --cc=arm-linux-androideabi-gcc --ld=arm-linux-androideabi-ld --extra-cflags='-fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated' --disable-everything --enable-decoder=mjpeg --enable-demuxer=mjpeg --enable-parser=mjpeg --enable-demuxer=image2 --enable-muxer=mp4 --enable-encoder=libx264 --enable-libx264 --enable-decoder=rawvideo --enable-protocol=file --enable-hwaccels --disable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-network --enable-filter=buffer --enable-filter=buffersink --disable-demuxer=v4l --disable-demuxer=v4l2 --disable-indev=v4l --disable-indev=v4l2 --extra-cflags='-I../x264 -Ivideokit' --extra-ldflags=-L../x264
10-23 11:57:33.943: DEBUG/Videokit(4830): main(): parsing options
10-23 11:57:33.943: DEBUG/Videokit(4830): parse_options has 4 options to parse
10-23 11:57:33.951: ERROR/Videokit(4830): opt_input_file av_open_input_file /mnt/sdcard/fun/snap0000.jpg -2
10-23 11:57:33.951: ERROR/Videokit(4830): /mnt/sdcard/fun/snap0000.jpg: No such file or directory
10-23 11:57:33.951: ERROR/Videokit(4830): ffmpeg_exit(1) called! -
Overlay filter in LibAV/FFMpeg returns strange (tripled) frame in C
28 July 2014, by gkuczera
I tried to make a program which merges two frames. I use LibAV (libav-win32-20140428) under Windows 7 64-bit and Visual Studio 2013.
But the result is quite odd. The filter used is overlay. When I change the graph to one that uses only one stream and add a fade effect, everything works like a charm. But overlay and e.g. drawbox give me a strange distortion (three frames in one, in black and white). Here is the code:
static int init_filter_graph(AVFilterGraph **pGraph, AVFilterContext **pSrc1, AVFilterContext **pSink)
{
    AVFilterGraph* tFilterGraph;
    AVFilterContext* tBufferContext1;
    AVFilter* tBuffer1;
    AVFilterContext* tColorContext;
    AVFilter* tColor;
    AVFilterContext* tOverlayContext;
    AVFilter* tOverlay;
    AVFilterContext* tBufferSinkContext;
    AVFilter* tBufferSink;
    AVDictionary* tOptionsDict = NULL;
    int tError;

    /* Create a new filtergraph, which will contain all the filters. */
    tFilterGraph = avfilter_graph_alloc();
    if (!tFilterGraph) {
        return -1;
    }

    { // BUFFER FILTER 1
        tBuffer1 = avfilter_get_by_name("buffer");
        if (!tBuffer1) {
            return -1;
        }
        tBufferContext1 = avfilter_graph_alloc_filter(tFilterGraph, tBuffer1, "src1");
        if (!tBufferContext1) {
            return -1;
        }
        av_dict_set(&tOptionsDict, "width", "320", 0);
        av_dict_set(&tOptionsDict, "height", "240", 0);
        av_dict_set(&tOptionsDict, "pix_fmt", "bgr24", 0);
        av_dict_set(&tOptionsDict, "time_base", "1/25", 0);
        av_dict_set(&tOptionsDict, "sar", "1", 0);
        tError = avfilter_init_dict(tBufferContext1, &tOptionsDict);
        av_dict_free(&tOptionsDict);
        if (tError < 0) {
            return tError;
        }
    }

    { // COLOR FILTER
        tColor = avfilter_get_by_name("color");
        if (!tColor) {
            return -1;
        }
        tColorContext = avfilter_graph_alloc_filter(tFilterGraph, tColor, "color");
        if (!tColorContext) {
            return -1;
        }
        av_dict_set(&tOptionsDict, "color", "white", 0);
        av_dict_set(&tOptionsDict, "size", "20x120", 0);
        av_dict_set(&tOptionsDict, "framerate", "1/25", 0);
        tError = avfilter_init_dict(tColorContext, &tOptionsDict);
        av_dict_free(&tOptionsDict);
        if (tError < 0) {
            return tError;
        }
    }

    { // OVERLAY FILTER
        tOverlay = avfilter_get_by_name("overlay");
        if (!tOverlay) {
            return -1;
        }
        tOverlayContext = avfilter_graph_alloc_filter(tFilterGraph, tOverlay, "overlay");
        if (!tOverlayContext) {
            return -1;
        }
        av_dict_set(&tOptionsDict, "x", "0", 0);
        av_dict_set(&tOptionsDict, "y", "0", 0);
        av_dict_set(&tOptionsDict, "main_w", "120", 0);
        av_dict_set(&tOptionsDict, "main_h", "140", 0);
        av_dict_set(&tOptionsDict, "overlay_w", "320", 0);
        av_dict_set(&tOptionsDict, "overlay_h", "240", 0);
        tError = avfilter_init_dict(tOverlayContext, &tOptionsDict);
        av_dict_free(&tOptionsDict);
        if (tError < 0) {
            return tError;
        }
    }

    { // BUFFERSINK FILTER
        tBufferSink = avfilter_get_by_name("buffersink");
        if (!tBufferSink) {
            return -1;
        }
        tBufferSinkContext = avfilter_graph_alloc_filter(tFilterGraph, tBufferSink, "sink");
        if (!tBufferSinkContext) {
            return -1;
        }
        tError = avfilter_init_str(tBufferSinkContext, NULL);
        if (tError < 0) {
            return tError;
        }
    }

    // Linking graph
    tError = avfilter_link(tBufferContext1, 0, tOverlayContext, 0);
    if (tError >= 0) {
        tError = avfilter_link(tColorContext, 0, tOverlayContext, 1);
    }
    if (tError >= 0) {
        tError = avfilter_link(tOverlayContext, 0, tBufferSinkContext, 0);
    }
    if (tError < 0) {
        return tError;
    }
    tError = avfilter_graph_config(tFilterGraph, NULL);
    if (tError < 0) {
        return tError;
    }
    *pGraph = tFilterGraph;
    *pSrc1 = tBufferContext1;
    *pSink = tBufferSinkContext;
    return 0;
}

What do you think is the reason?