
Media (91)
-
DJ Z-trip - Victory Lap : The Obama Mix Pt. 2
15 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Matmos - Action at a Distance
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
DJ Dolores - Oslodum 2004 (includes (cc) sample of “Oslodum” by Gilberto Gil)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Danger Mouse & Jemini - What U Sittin’ On? (starring Cee Lo and Tha Alkaholiks)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Cornelius - Wataridori 2
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Rapture - Sister Saviour (Blackstrobe Remix)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (58)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows media playback on major mobile platforms with the above (...)
-
Other interesting software
12 April 2011
We do not claim to be the only ones doing what we do... and we certainly do not claim to be the best either... We simply try to do it well, and better and better...
The following list covers software that more or less tries to do what MediaSPIP does, or that MediaSPIP more or less tries to do the same as; either way...
We do not know them and have not tried them, but you may want to take a look at them.
Videopress
Website: (...)
-
Changing the graphic theme
22 February 2011
The graphic theme does not affect the actual layout of the elements on the page; it only changes how they look.
The placement can indeed be modified, but that change is purely visual and does not alter the semantic representation of the page.
Changing the graphic theme in use
To change the graphic theme in use, the zen-garden plugin must be enabled on the site.
You then simply go to the configuration area of the (...)
On other sites (7303)
-
Transcoding videos with LibAvFormat for playback on iOS devices
18 May 2016, by user361526
I’m trying to transcode a video in my iOS app using FFmpeg/LibAv.
What I’m trying to accomplish is to transcode a video in order to resize each frame and possibly lower the bitrate, to save valuable MB on the device. The resulting video must be playable on all iPhone 5+ devices.
After reading the documentation I found out that:
- I do not need to encode/decode the audio stream -> I’ll copy it as-is to the output file
- I need to encode the video using the H.264 codec (libx264) with a profile supported by iOS (baseline profile with level 3.0 - https://trac.ffmpeg.org/wiki/Encode/H.264#Compatibility)
- I’m also setting the picture format to YUV planar, since it’s the only one supported by iOS
- For the sake of testing I’m not using any filter at all (just a dummy/passthrough one) and not trying to lower the bitrate yet; I’m just decoding the video stream and encoding it again
- Most of the code is based on transcoding.c and filtering.c from the FFmpeg examples directory
FFmpeg-wise, what I’m trying to achieve with LibAv is:
ffmpeg -i INPUT.MOV -c:v libx264 -preset ultrafast -profile:v baseline -level 3.0 -c:a copy output.MOV
(the resulting file - which can be found below - is playable in QuickTime if it is generated by FFmpeg through the command line)
The original video was recorded with a regular iPhone running iOS 8.2, but the problem is not device- or iOS-specific; it occurs with all videos generated with LibAv.
Although both resulting files are playable with VLC, the one I generated through LibAv is not playable in QuickTime, even though I can’t find anything wrong with it.
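To compare what libavformat itself reports for the two files, I dump their stream info with a minimal probe along these lines (the path is a placeholder and error handling is omitted; this is only a comparison aid, not part of the transcoder):
#include <libavformat/avformat.h>

// Open a file and print the streams, codecs, profiles and time bases that
// libavformat detects, so the ffmpeg-generated and LibAv-generated outputs
// can be compared side by side. "output.MOV" is a placeholder path.
AVFormatContext *probe_ctx = NULL;
av_register_all();
if (avformat_open_input(&probe_ctx, "output.MOV", NULL, NULL) == 0) {
    if (avformat_find_stream_info(probe_ctx, NULL) >= 0) {
        av_dump_format(probe_ctx, 0, "output.MOV", 0); // 0 = treat as input file
    }
    avformat_close_input(&probe_ctx);
}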
As you can see below, I create the video stream with the proper video codec in the call to avformat_new_stream:
AVStream *out_stream; // output stream
AVStream *in_stream;  // input stream
AVCodecContext *dec_ctx, *enc_ctx; // codec contexts for the streams
AVCodec *encoder; // codec used
int ret;
unsigned int i;

ofmt_ctx = NULL;
// Allocate an AVFormatContext for the output format. This plays the role that
// avformat_open_input plays on the input side, but starts from zeroed memory.
avformat_alloc_output_context2(&ofmt_ctx, NULL, NULL, filename);
if (!ofmt_ctx) {
    av_log(NULL, AV_LOG_ERROR, "Could not create output context\n");
    [self errorWith:kErrorCreatingOutputContext and:@"Could not create output context"];
    return AVERROR_UNKNOWN;
}

// We must not use the AVCodecContext from the input stream directly, so we use
// avcodec_copy_context() to copy the context to a new location (after allocating memory for it).
// Iterate over all input streams.
for (i = 0; i < ifmt_ctx->nb_streams; i++) {
    in_stream = ifmt_ctx->streams[i]; // input stream
    dec_ctx = in_stream->codec;       // codec context of the decoder
    if (dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO) {
        // let's use h264
        encoder = avcodec_find_encoder(AV_CODEC_ID_H264);
        if (!encoder) {
            [self errorWith:kErrorCodecNotFound and:@"H264 Codec Not Found"];
            return AVERROR_UNKNOWN;
        }
        out_stream = avformat_new_stream(ofmt_ctx, encoder); // create a new stream with the h264 codec
        if (!out_stream) {
            av_log(NULL, AV_LOG_ERROR, "Failed allocating output stream\n");
            [self errorWith:kErrorAllocateOutputStream and:@"Failed allocating output stream"];
            return AVERROR_UNKNOWN;
        }
        enc_ctx = out_stream->codec; // pointer to the stream codec context
        /* we transcode to the same properties (picture size, sample rate etc.).
         * These properties can be changed for output streams easily using filters */
        if (dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO) {
            enc_ctx->width = dec_ctx->width;
            enc_ctx->height = dec_ctx->height;
            enc_ctx->sample_aspect_ratio = dec_ctx->sample_aspect_ratio;
            enc_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
            enc_ctx->time_base = dec_ctx->time_base;
            av_opt_set(enc_ctx->priv_data, "preset", "ultrafast", 0);
            av_opt_set(enc_ctx->priv_data, "profile", "baseline", 0);
            av_opt_set(enc_ctx->priv_data, "level", "3.0", 0);
        }
        out_stream->time_base = in_stream->time_base;

        AVDictionaryEntry *tag = NULL;
        while ((tag = av_dict_get(in_stream->metadata, "", tag, AV_DICT_IGNORE_SUFFIX))) {
            printf("%s=%s\n", tag->key, tag->value);
            char *k = av_strdup(tag->key);   // if your strings are already allocated,
            char *v = av_strdup(tag->value); // you can avoid copying them like this
            av_dict_set(&out_stream->metadata, k, v, 0);
        }

        ret = avcodec_open2(enc_ctx, encoder, NULL);
        if (ret < 0) {
            av_log(NULL, AV_LOG_ERROR, "Cannot open video encoder for stream #%u\n", i);
            [self errorWith:kErrorCantOpenOutputFile and:[NSString stringWithFormat:@"Cannot open video encoder for stream #%u", i]];
            return ret;
        }
    }
    else if (dec_ctx->codec_type == AVMEDIA_TYPE_UNKNOWN) {
        // if we can't figure out the stream type, fail
        av_log(NULL, AV_LOG_FATAL, "Elementary stream #%d is of unknown type, cannot proceed\n", i);
        [self errorWith:kErrorUnknownStream and:[NSString stringWithFormat:@"Elementary stream #%d is of unknown type, cannot proceed", i]];
        return AVERROR_INVALIDDATA;
    }
    else {
        out_stream = avformat_new_stream(ofmt_ctx, NULL);
        if (!out_stream) {
            av_log(NULL, AV_LOG_ERROR, "Failed allocating output stream\n");
            [self errorWith:kErrorAllocateOutputStream and:@"Failed allocating output stream"];
            return AVERROR_UNKNOWN;
        }
        enc_ctx = out_stream->codec;
        /* this stream must be remuxed */
        // copy the settings of the source AVCodecContext into the destination AVCodecContext
        // (ifmt_ctx->streams[i]->codec into ofmt_ctx->streams[i]->codec)
        ret = avcodec_copy_context(ofmt_ctx->streams[i]->codec,
                                   ifmt_ctx->streams[i]->codec);
        if (ret < 0) {
            av_log(NULL, AV_LOG_ERROR, "Copying stream context failed\n");
            [self errorWith:kErrorCopyStreamFailed and:@"Copying stream context failed"];
            return ret;
        }
    }
    // request container-level global headers (extradata) when the output format
    // needs them (as MOV/MP4 do), instead of repeating them in the bitstream
    if (ofmt_ctx->oformat->flags & AVFMT_GLOBALHEADER)
        enc_ctx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
}

if (!(ofmt_ctx->oformat->flags & AVFMT_NOFILE)) {
    // Create and initialize an AVIOContext for accessing the resource indicated by url.
    ret = avio_open(&ofmt_ctx->pb, filename, AVIO_FLAG_WRITE);
    if (ret < 0) {
        av_log(NULL, AV_LOG_ERROR, "Could not open output file '%s'", filename);
        [self errorWith:kErrorCantOpenOutputFile and:[NSString stringWithFormat:@"Could not open output file '%s'", filename]];
        return ret;
    }
}

/* init muxer, write output file header */
// Allocate the stream private data and write the stream header to the output media file.
ret = avformat_write_header(ofmt_ctx, NULL);
if (ret < 0) {
    av_log(NULL, AV_LOG_ERROR, "Error occurred when opening output file\n");
    [self errorWith:kErrorOutFileCantWriteHeader and:@"Error occurred when opening output file"];
    return ret;
}

return 0;
You can find the files here:
- Original file: https://www.dropbox.com/s/2jjs1uy2pu2veyy/IMG_5705.MOV?dl=0
- File generated with FFmpeg: https://www.dropbox.com/s/9hfmq3fcifgpfqc/local-ffmpeg.MOV?dl=0
- File generated by code: https://www.dropbox.com/s/rttvny39rj7ejpf/generated-by-Ze.MOV?dl=0
Thank you so much,
Ze
-
How do I use -analyzeduration from ffmpeg with FFmpegDecoder.framework from mooncatventures?
29 December 2013, by user3143010
I have been playing with RtspPlay1 from mooncatventures to try to stream a live stream from an FFmpeg streaming source with as little delay as possible. The problem is that even when I modify the code to pass the -analyzeduration 0 flag in RtspPlay1, it does not seem to do anything. I came to this conclusion because the delay on the iOS device is the same as on my computer without the -analyzeduration 0 flag. Any thoughts would be helpful.
Here is the command I am trying to emulate on the iPhone:
ffplay rtp:///224.1.1.1:11326 -analyzeduration 0
Here is the modified code I tried with RtspPlay1:
forward_argc = 1;
forward_argv[1] = "-analyzeduration";
forward_argv[2] = "0";
//forward_argv[3] = "30";
//forward_argv[4] = "-fast";
//forward_argv[5] = "-sync";
//forward_argv[6] = "video";
//forward_argv[7] = "-drp";
//forward_argv[8] = "-skipidct";
//forward_argv[9] = "10";
//forward_argv[10] = "-skiploop";
//forward_argv[11] = "50";
//forward_argv[12] = "-threads";
//forward_argv[13] = "5";
//argv[14] = "-an";
forward_argv[3] = cString;
NSLog(@"glflag %@\n ", [parms objectForKey:@"glflag"]);
if ([parms objectForKey:@"glflag"] != @"1") {
    forward_argv[4] = "0";
} else {
    forward_argv[4] = "1";
}
forward_argc += 4;
Mobile Analytics SDK: beta release of Piwik iOS SDK
30 October 2013, by Piwik team
Mattias Levin, a mobile development enthusiast from Sweden, has released the first public beta version of the official Piwik SDK for iOS!
If you are building apps for iOS or OS X, you will be able to track your app usage with Piwik. Learn more in this blog post.
Apps & Mobile apps Analytics
Using Piwik to track your app usage gives you interesting usage statistics such as:
- number of active users (per day, week, month, …) of your mobile or desktop app,
- how long users spend in the app,
- which icons and buttons are clicked (or any other custom event),
- device info and operating system,
- reports on any Custom Variables that are relevant to your app (see examples below),
- how often the app is opened, and when and for how long,
- number of new users, active users, total users,
- errors or exceptions thrown.
Piwik SDK for iOS
The PiwikTracker is an Objective-C framework (for iOS and OS X) designed to send app usage data to a Piwik analytics server. It is released under the MIT license. The Piwik server is a downloadable, Free/Libre (GPLv3 licensed) real-time analytics platform.
Getting started
- Create a new website in the Piwik web interface called “My App”. Copy the Website ID and the token_auth.
- Download the PiwikTracker SDK.
- Add the PiwikTracker files to your project.
- Create and configure the PiwikTracker.
- Add code in your app to track screen views, events, exceptions, goals and more.
- Let the dispatch timer dispatch pending events to the Piwik server, or dispatch events manually.
For more info, check out the Readme.
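To give a rough idea of what these steps look like in code, here is a minimal Objective-C sketch. The class and method names below are illustrative assumptions based on this post, and the URL, site ID and token are placeholders; refer to the Readme for the actual API:
#import <UIKit/UIKit.h>
#import "PiwikTracker.h" // assumed header name from the PiwikTracker SDK

// In the app delegate: configure the shared tracker once.
// The Website ID and token_auth come from the "My App" website created in the Piwik web interface.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    [PiwikTracker sharedInstanceWithBaseURL:[NSURL URLWithString:@"http://example.org/piwik/"] // placeholder server URL
                                     siteID:@"1"                                               // placeholder Website ID
                        authenticationToken:@"YOUR_TOKEN_AUTH"];                               // placeholder token_auth
    return YES;
}

// In a view controller: record that this screen was shown.
// Pending events are sent by the dispatch timer, or can be dispatched manually.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [[PiwikTracker sharedInstance] sendView:@"Main screen"];
}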
Requirements
The latest PiwikTracker version uses ARC and supports iOS 6+ and OS X 10.7+.
- The iOS tracker depends on: Core Data, Core Location, Core Graphics, UIKit and AFNetworking.
- The OS X tracker depends on: Core Data, Core Graphics, Cocoa and AFNetworking.
Demo project
The workspace contains an iPhone demo app that uses and demonstrates the features available in the SDK.
Feedback needed
If you use the iOS SDK to track your app, we would like to hear your suggestions, bug reports or general feedback.
We hope to work with you to improve the SDK and move it out of beta !
Please report suggestions, bugs and feature requests in the GitHub issues of the Piwik iOS SDK.
Happy App Analytics!