
Media (3)
-
MediaSPIP Simple: future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
-
GetID3 - Additional buttons
9 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (62)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP. You can of course add your own using the form at the bottom of the page.
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed Médiaspip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.
-
Changing your graphic theme
22 February 2011, by
The graphic theme does not affect the actual layout of the elements on the page; it only changes the appearance of the elements. The placement can indeed be modified, but this modification is purely visual and does not affect the semantic representation of the page.
Changing the graphic theme in use
To change the graphic theme in use, the zen-garden plugin must be enabled on the site.
Then simply go to the configuration area of the (...)
On other sites (11707)
-
libavcodec on media entirely in memory
11 January 2019, by Nick
I'm dealing with small micro videos that exist entirely in memory (as a string). So far, I haven't been able to get avcodec to properly decode h264 this way.
I tried a custom AVIOContext that operates on the containerized media:
struct Stream { char* str; size_t pos; size_t size; };
static int ReadStream(void* opaque, uint8_t* buf, int buf_size) {
Stream* strm = reinterpret_cast<Stream*>(opaque);
int read = strm->size-strm->pos;
read = read < buf_size ? read : buf_size;
memcpy(buf, strm->str+strm->pos, read);
memset(buf+read, 0, buf_size-read);
strm->pos += read;
return read;
}
static int64_t SeekStream(void *opaque, int64_t offset, int whence) {
Stream* strm = reinterpret_cast<Stream*>(opaque);
if (whence == AVSEEK_SIZE) {
return strm->size;
} else if (whence == SEEK_END) {
strm->pos = strm->size;
} else if (whence == SEEK_SET) {
strm->pos = 0;
}
strm->pos += offset;
return strm->pos;
}
int main(int argc, char *argv[]) {
string content;
GetContents("test.mp4", &content);
avcodec_register_all();
uint8_t* buff = (uint8_t*)malloc(4096 + AV_INPUT_BUFFER_PADDING_SIZE);
Stream strm = { const_cast<char*>(content.data()), 0, content.size() };
void* opaque = reinterpret_cast<void*>(&strm);
AVFormatContext* fmtctx = avformat_alloc_context();
AVIOContext* avctx = avio_alloc_context(buff, 4096, 0, opaque, &ReadStream, nullptr, &SeekStream);
AVInputFormat* ifmt = av_find_input_format("mp4");
AVDictionary* opts = nullptr;
fmtctx->pb = avctx;
avformat_open_input(&fmtctx, "", ifmt, &opts);
avformat_find_stream_info(fmtctx, &opts);
}
But that always segfaults at find_stream_info.
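For reference, here is a stripped-down sketch of the in-memory AVIO pattern I am aiming for, written from the FFmpeg header documentation rather than from a working program; the names MemStream, ReadMem, SeekMem and OpenFromMemory are made up. The two details that differ from my code above are that the I/O buffer comes from av_malloc and that the read callback reports end of data with AVERROR_EOF instead of zero-padding; I have not verified whether either matters here.
// Sketch only: minimal custom-AVIO setup for demuxing a buffer held in memory.
// Error handling and cleanup are omitted for brevity.
extern "C" {
#include <libavformat/avformat.h>
}
#include <cstddef>
#include <cstring>
struct MemStream { const uint8_t* data; size_t pos; size_t size; };
static int ReadMem(void* opaque, uint8_t* buf, int buf_size) {
    MemStream* s = static_cast<MemStream*>(opaque);
    size_t left = s->size - s->pos;
    if (left == 0) return AVERROR_EOF;          // signal EOF instead of returning 0
    int n = static_cast<int>(left < static_cast<size_t>(buf_size) ? left : static_cast<size_t>(buf_size));
    memcpy(buf, s->data + s->pos, n);
    s->pos += n;
    return n;
}
static int64_t SeekMem(void* opaque, int64_t offset, int whence) {
    MemStream* s = static_cast<MemStream*>(opaque);
    if (whence == AVSEEK_SIZE) return s->size;  // lets lavf query the total stream size
    if (whence == SEEK_SET) s->pos = offset;
    else if (whence == SEEK_CUR) s->pos += offset;
    else if (whence == SEEK_END) s->pos = s->size + offset;
    return s->pos;
}
static AVFormatContext* OpenFromMemory(MemStream* s) {
    unsigned char* iobuf = static_cast<unsigned char*>(av_malloc(4096));  // lavf may free/realloc this buffer internally
    AVIOContext* pb = avio_alloc_context(iobuf, 4096, /*write_flag=*/0, s, &ReadMem, nullptr, &SeekMem);
    AVFormatContext* fmt = avformat_alloc_context();
    fmt->pb = pb;
    if (avformat_open_input(&fmt, "", nullptr, nullptr) < 0) return nullptr;
    if (avformat_find_stream_info(fmt, nullptr) < 0) return nullptr;
    return fmt;
}
// usage: MemStream s{ reinterpret_cast<const uint8_t*>(content.data()), 0, content.size() };
//        AVFormatContext* fmt = OpenFromMemory(&s);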
I also tried pre-demuxing the video stream into raw h264 and just sending the stream packets (e.g.):
int main(int argc, char *argv[]) {
string content;
GetContents("test.h264", &content);
uint8_t* data = reinterpret_cast<uint8_t*>(const_cast<char*>(content.c_str()));
avcodec_register_all();
AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
AVCodecContext* ctx = avcodec_alloc_context3(codec);
ctx->width = 1080;
ctx->height = 1920;
avcodec_open2(ctx, codec, nullptr);
AVPacket* pkt = av_packet_alloc();
AVFrame* frame = av_frame_alloc();
pkt->data = data;
pkt->size = 4096;
avcodec_send_packet(ctx, pkt);
data += 4096;
}
But that just gives a nondescript "error while decoding MB # #, bytestream #". Note that I've removed the error checking from the allocs, etc. to simplify the code, but I am checking to make sure everything is allocated and instantiated correctly.
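For completeness, here is a rough, untested sketch of the parser-based packetization that I understand raw Annex B input normally needs, so that each avcodec_send_packet call receives one complete packet rather than an arbitrary 4096-byte slice; the name DecodeRawH264 is made up and end-of-stream flushing is omitted.
// Sketch only: split an in-memory raw H.264 elementary stream into packets
// with the libavcodec parser before handing them to the decoder.
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <string>
static void DecodeRawH264(const std::string& content) {
    const AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    AVCodecParserContext* parser = av_parser_init(AV_CODEC_ID_H264);
    avcodec_open2(ctx, codec, nullptr);
    AVPacket* pkt = av_packet_alloc();
    AVFrame* frame = av_frame_alloc();
    const uint8_t* data = reinterpret_cast<const uint8_t*>(content.data());
    int remaining = static_cast<int>(content.size());
    while (remaining > 0) {
        // The parser finds packet boundaries; pkt->size stays 0 until a full packet is ready.
        int used = av_parser_parse2(parser, ctx, &pkt->data, &pkt->size,
                                    data, remaining, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
        data += used;
        remaining -= used;
        if (pkt->size == 0) continue;
        if (avcodec_send_packet(ctx, pkt) < 0) break;
        while (avcodec_receive_frame(ctx, frame) == 0) {
            // a decoded frame is now available in frame->data / frame->width / frame->height
        }
    }
    av_frame_free(&frame);
    av_packet_free(&pkt);
    av_parser_close(parser);
    avcodec_free_context(&ctx);
}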
Any suggestions as to where my misunderstanding or misuse of avcodec lies?
-
Is there an efficient way to retrieve frames from a video in Android?
28 March 2015, by Naveed
I have an app which requires me to retrieve frames from a video and do some processing on them. However, the frame retrieval is so slow that it is unacceptable; sometimes it takes up to 2.5 seconds to retrieve a single frame. I am using MediaMetadataRetriever, as most Stack Overflow questions suggested, but the performance is very poor. Here is what I have:
private List<Bitmap> retrieveFrames() {
MediaMetadataRetriever fmmr = new MediaMetadataRetriever();
fmmr.setDataSource("/path/to/some/video.mp4");
String strLength = fmmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
long milliSecs = Long.parseLong(strLength);
long microSecLength = milliSecs * 1000;
Log.d("TAG", "length: " + microSecLength);
long one_sec = 1000000; // one sec in micro seconds
ArrayList<Bitmap> frames = new ArrayList<>();
int j = 0;
for (int i = 0; i < microSecLength; i += (one_sec / 5)) {
long time = System.currentTimeMillis();
Bitmap frame = fmmr.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST);
j++;
Log.d("TAG", "Frame number: " + j + " Time taken: " + (System.currentTimeMillis() - time));
// commented out because each frame would be written to disk instead of holding them in memory
// frames.add(frame);
}
fmmr.release();
return frames;
}
The above logs:
03-26 21:49:29.781 13213-13239/com.example.naveed.myapplication D/TAG﹕ length: 4949000
03-26 21:49:30.187 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 1 Time taken: 406
03-26 21:49:30.779 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 2 Time taken: 592
03-26 21:49:31.578 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 3 Time taken: 799
03-26 21:49:32.632 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 4 Time taken: 1054
03-26 21:49:33.895 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 5 Time taken: 1262
03-26 21:49:35.382 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 6 Time taken: 1486
03-26 21:49:37.128 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 7 Time taken: 1746
03-26 21:49:39.077 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 8 Time taken: 1948
03-26 21:49:41.287 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 9 Time taken: 2210
03-26 21:49:43.717 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 10 Time taken: 2429
03-26 21:49:44.093 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 11 Time taken: 376
03-26 21:49:44.707 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 12 Time taken: 614
03-26 21:49:45.539 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 13 Time taken: 831
03-26 21:49:46.597 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 14 Time taken: 1057
03-26 21:49:47.875 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 15 Time taken: 1278
03-26 21:49:49.384 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 16 Time taken: 1508
03-26 21:49:51.112 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 17 Time taken: 1728
03-26 21:49:53.096 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 18 Time taken: 1983
03-26 21:49:55.315 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 19 Time taken: 2218
03-26 21:49:57.711 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 20 Time taken: 2396
03-26 21:49:58.065 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 21 Time taken: 354
03-26 21:49:58.640 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 22 Time taken: 574
03-26 21:49:59.369 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 23 Time taken: 728
03-26 21:50:00.112 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 24 Time taken: 742
03-26 21:50:00.834 13213-13239/com.example.naveed.myapplication D/TAG﹕ Frame number: 25 Time taken: 721
As you can see from the above, it takes about 18-25 seconds to retrieve 25 frames from a 4-second video.
I have also tried this, which uses FFmpeg underneath to do the same thing. I am not sure how well that library is implemented, but it only improves the overall performance by a couple of seconds, meaning it takes about 15-20 seconds.
So my question is: is there a way to do this more quickly? A friend has an iOS app where he does something similar; it only takes a couple of seconds, even though he grabs more frames, but he does not know how to do it on Android.
Is there anything on Android that would speed up the process? Am I approaching this wrong?
The end goal is to stitch those frames together into a GIF.
-
Compiling ffmpeg for iOS and gas-preprocessor.pl
16 May 2017, by user500
I want to compile ffmpeg for iOS. I have done it a few times before, but now I am on a clean new Mavericks install and configure always gives me:
Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
GNU assembler not found, install gas-preprocessor
If you think configure made a mistake, make sure you are using the latest
version from Git. If the latest version fails, report the problem to the
ffmpeg-user@ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net.
Include the log file "config.log" produced by configure as this will help
solving the problem.
I have the current Xcode installed, and also brews. The current gas-preprocessor.pl (https://github.com/yuvi/gas-preprocessor) is in /usr/bin and also in /usr/local/bin.
Running
perl /usr/bin/gas-preprocessor.pl gcc
I get
Unrecognized input filetype at /usr/bin/gas-preprocessor.pl line 33.
This config works:
./configure \
--extra-cflags='-arch arm64 -mios-version-min=7.0 -mthumb' \
--extra-ldflags='-arch arm64 -mios-version-min=7.0' \
--enable-cross-compile \
--arch=arm64 \
--target-os=darwin \
--cc=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang \
--sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS7.0.sdk \
--prefix=arm64 \
--disable-doc \
--disable-shared \
--disable-everything \
--enable-static \
--enable-pic \
--disable-muxers \
--enable-muxer=flv \
--disable-demuxers \
--enable-demuxer=h264 \
--enable-demuxer=pcm_s16le \
--disable-devices \
--disable-parsers \
--enable-parser=h264 \
--disable-encoders \
--enable-encoder=aac \
--disable-decoders \
--enable-decoder=h264 \
--enable-decoder=pcm_s16le \
--disable-protocols \
--enable-protocol=rtmp \
--disable-filters \
--disable-bsfs
This config throws the error above (GNU assembler not found, install gas-preprocessor):
./configure \
--cpu=cortex-a8 \
--extra-cflags='-arch armv7 -mios-version-min=7.0 -mthumb' \
--extra-ldflags='-arch armv7 -mios-version-min=7.0' \
--enable-cross-compile \
--arch=armv7 \
--target-os=darwin \
--cc=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang \
--sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS7.0.sdk \
--prefix=armv7 \
--disable-doc \
--disable-shared \
--disable-everything \
--enable-static \
--enable-pic \
--disable-muxers \
--enable-muxer=flv \
--disable-demuxers \
--enable-demuxer=h264 \
--enable-demuxer=pcm_s16le \
--disable-devices \
--disable-parsers \
--enable-parser=h264 \
--disable-encoders \
--enable-encoder=aac \
--disable-decoders \
--enable-decoder=h264 \
--enable-decoder=pcm_s16le \
--disable-protocols \
--enable-protocol=rtmp \
--disable-filters \
--disable-bsfs