
Media (2)
- Example of action buttons for a collaborative collection
27 February 2013
Updated: March 2013
Language: French
Type: Image
- Example of action buttons for a personal collection
27 February 2013
Updated: February 2013
Language: English
Type: Image
Other articles (29)
- MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources, in standalone form.
To get a working installation, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, further modifications are also required (...)
- Organising by category
17 May 2013
In MediaSPIP, a section goes by two names: category and rubrique.
The various documents stored in MediaSPIP can be filed under different categories. A category is created by clicking on "publish a category" in the publish menu at the top right (after logging in). A category can itself be placed inside another category, so a tree of categories can be built.
When the next document is published, the newly created category will be offered (...)
- Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (5611)
- Why do videos generated with FFmpeg on Android not play in all video players?
9 July 2014, by Divya Motiwala
I am trimming a video on Android using the FFmpeg command below. The result only plays in MX Player, not in the default player or any other player.
String trimVideoUrl = mFfmpegInstallPath + " -y -i " +
    videoPath + " -ss " +
    hms + " -t " + duration +
    " -async 1 -preset ultrafast -strict -2 " + OUTPUT_FILENAME;
Please let me know what other parameters will help make it run on other players. I did try -c:v libx264 but it didn't help. A hedged command sketch follows below.
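One common reason a re-encoded clip plays only in MX Player is that the output ends up with a High profile, an unexpected pixel format, or the moov atom at the end of the file, all of which the stock Android decoder handles poorly. A hedged command-line sketch that constrains the output to widely supported settings; the <hms>/<duration> placeholders mirror the variables in the question, the file names are illustrative, and this is not a verified fix:
ffmpeg -y -ss <hms> -i <input.mp4> -t <duration> \
    -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p \
    -c:a aac -strict -2 \
    -movflags +faststart <output.mp4>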
- ffmpeg API h264 encoded video does not play on all platforms
19 July 2012, by TheSHEEEP
Edit: in the previous version I used a very old ffmpeg API. I now use the newest libraries. The problem has only changed slightly, from "Main" to "High".
I am using the ffmpeg C API to create an mp4 video in C++.
I want the resulting video to use the "Constrained Baseline" profile, so that it can be played on as many platforms as possible, especially mobile. However, I get the "High" profile every time, even though I hard-coded the codec profile to FF_PROFILE_H264_CONSTRAINED_BASELINE. As a result, the video does not play on all of our test platforms.
This is what "ffprobe video.mp4 -show_streams" reports about my video streams:
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
creation_time : 1970-01-01 00:00:00
encoder : Lavf53.5.0
Duration: 00:00:13.20, start: 0.000000, bitrate: 553 kb/s
Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 320x180,
424 kb/s, 15 fps, 15 tbr, 15 tbn, 30 tbc
Metadata:
creation_time : 1970-01-01 00:00:00
handler_name : VideoHandler
Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, s16, 12
kb/s
Metadata:
creation_time : 1970-01-01 00:00:00
handler_name : SoundHandler
-------VIDEO STREAM--------
[STREAM]
index=0
codec_name=h264
codec_long_name=H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
profile=High <-- This should be "Constrained Baseline"
codec_type=video
codec_time_base=1/30
codec_tag_string=avc1
codec_tag=0x31637661
width=320
height=180
has_b_frames=0
sample_aspect_ratio=N/A
display_aspect_ratio=N/A
pix_fmt=yuv420p
level=30
timecode=N/A
is_avc=1
nal_length_size=4
id=N/A
r_frame_rate=15/1
avg_frame_rate=15/1
time_base=1/15
start_time=0.000000
duration=13.200000
bit_rate=424252
nb_frames=198
nb_read_frames=N/A
nb_read_packets=N/A
TAG:creation_time=1970-01-01 00:00:00
TAG:language=und
TAG:handler_name=VideoHandler
[/STREAM]
-------AUDIO STREAM--------
[STREAM]
index=1
codec_name=aac
codec_long_name=Advanced Audio Coding
profile=unknown
codec_type=audio
codec_time_base=1/44100
codec_tag_string=mp4a
codec_tag=0x6134706d
sample_fmt=s16
sample_rate=44100
channels=2
bits_per_sample=0
id=N/A
r_frame_rate=0/0
avg_frame_rate=0/0
time_base=1/44100
start_time=0.000000
duration=13.165714
bit_rate=125301
nb_frames=567
nb_read_frames=N/A
nb_read_packets=N/A
TAG:creation_time=1970-01-01 00:00:00
TAG:language=und
TAG:handler_name=SoundHandler
[/STREAM]
This is the function I use to add a video stream. All the values read from ptr-> are defined externally; do those values have to be specific to get the correct profile?
static AVStream *add_video_stream( Cffmpeg_dll * ptr, AVFormatContext *oc, enum CodecID codec_id )
{
    AVCodecContext *c;
    AVStream *st;
    AVCodec* codec;
    // Get correct codec
    codec = avcodec_find_encoder(codec_id);
    if (!codec) {
        av_log(NULL, AV_LOG_ERROR, "%s","Video codec not found\n");
        exit(1);
    }
    // Create stream
    st = avformat_new_stream(oc, codec);
    if (!st) {
        av_log(NULL, AV_LOG_ERROR, "%s","Could not alloc stream\n");
        exit(1);
    }
    c = st->codec;
    /* Get default values */
    codec = avcodec_find_encoder(codec_id);
    if (!codec) {
        av_log(NULL, AV_LOG_ERROR, "%s","Video codec not found (default values)\n");
        exit(1);
    }
    avcodec_get_context_defaults3(c, codec);
    c->codec_id = codec_id;
    c->codec_type = AVMEDIA_TYPE_VIDEO;
    c->bit_rate = ptr->video_bit_rate;
    av_log(NULL, AV_LOG_ERROR, " Bit rate: %i", c->bit_rate);
    c->qmin = ptr->qmin;
    c->qmax = ptr->qmax;
    c->me_method = ptr->me_method;
    c->me_subpel_quality = ptr->me_subpel_quality;
    c->i_quant_factor = ptr->i_quant_factor;
    c->qcompress = ptr->qcompress;
    c->max_qdiff = ptr->max_qdiff;
    // We need to set the level and profile to get videos that play (hopefully) on all platforms
    c->level = 30;
    c->profile = FF_PROFILE_H264_CONSTRAINED_BASELINE;
    c->width = ptr->dstWidth;
    c->height = ptr->dstHeight;
    c->time_base.den = ptr->fps;
    c->time_base.num = 1;
    c->gop_size = ptr->fps;
    c->pix_fmt = STREAM_PIX_FMT;
    c->max_b_frames = 0;
    // some formats want stream headers to be separate
    if (oc->oformat->flags & AVFMT_GLOBALHEADER)
        c->flags |= CODEC_FLAG_GLOBAL_HEADER;
    return st;
}
Additional info:
As a reference video, I use the gizmo.mp4 that Mozilla serves as an example that plays on every platform/browser. It definitely has the "Constrained Baseline" profile, and definitely works on all our testing smartphones. You can download it here. Our self-created video doesn't work on all platforms and I'm convinced this is because of the profile.
I am also using qt-faststart.exe to move the headers to the start of the file after creating the mp4, as this cannot be done well directly from C++. Could that be the problem?
Obviously, I am doing something wrong, but I don't know what it could be. I'd be thankful for every hint ;)
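With libx264, setting c->profile on the codec context is often not enough on its own; the profile is typically handed to the encoder as a private option when the codec is opened (B-frames disabled and yuv420p, which the code above already sets, are also required for baseline). A minimal sketch under that assumption, where the stream is opened with avcodec_open2; the function name open_video and the chosen option values are illustrative, not code taken from the question:
#include <libavcodec/avcodec.h>
#include <libavutil/dict.h>
#include <libavutil/opt.h>

/* Hedged sketch: open the video codec and pass the profile to libx264 as an
 * encoder option instead of relying on c->profile alone. */
static int open_video(AVCodecContext *c, AVCodec *codec)
{
    AVDictionary *opts = NULL;
    int ret;

    /* libx264 reads "profile"/"preset" as private options; in x264, "baseline"
     * is the constrained baseline profile. */
    av_dict_set(&opts, "profile", "baseline", 0);
    av_dict_set(&opts, "preset", "ultrafast", 0);

    /* Equivalent alternative, set before opening the context:
     *   av_opt_set(c->priv_data, "profile", "baseline", 0);
     */

    ret = avcodec_open2(c, codec, &opts);
    av_dict_free(&opts);
    return ret; /* negative on failure */
}
Any entries still left in the dictionary after avcodec_open2 returns were not consumed by the encoder, which is a quick way to check whether it actually saw the option.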
- YouTube DASH stream through to iOS format
29 July 2014, by viperfx
I have a stream link to a DASH-format audio file, obtained with youtube-dl. It plays fine in the browser, but on iOS the DASH codec is not supported. I am wondering how I can use ffmpeg to transcode the HTTPS stream live and serve it back out as an AAC-encoded m4a file, or something else the iOS SDK supports for streaming (mp3 also works). I think placing something such as nginx as middleware in between would be a good idea: a Python app would receive the request /videoID, grab the stream link, transcode it live and feed it to nginx, which would serve the correctly transcoded format back to the iOS app.
Does anyone have any suggestions or ideas on how to achieve this? I am stuck on the details of the ffmpeg part.
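As a hedged sketch of the ffmpeg part only (the URL is a placeholder, and on older builds the native AAC encoder needs -strict -2, as in the questions above): read the HTTPS stream, drop any video, re-encode the audio to AAC and write an ADTS stream to stdout so the middleware (nginx or the Python app) can relay it to the client:
ffmpeg -i "https://example.com/dash-audio-stream-url" \
    -vn -c:a aac -strict -2 -b:a 128k \
    -f adts pipe:1
# for mp3 instead: -c:a libmp3lame -f mp3 pipe:1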