Other articles (69)
-
Sites built with MediaSPIP
2 May 2011 — This page presents some of the sites running MediaSPIP.
You can of course add your own via the form at the bottom of the page. -
HTML5 audio and video support
10 April 2011 — MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and audio both on conventional computers (...) -
HTML5 audio and video support
13 April 2011 — MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (9916)
-
Cannot use FFmpeg in Xcode iOS Project (file .h not found)
19 March 2016, by BlackBox — I followed this guide almost step by step.
"Almost" because, as a user suggested, I downloaded the FFmpeg iOS library already built from here.
I followed along from "Linking static libraries in Xcode", but I still cannot import the FFmpeg header files. (So I have the .a files.)
For example:

    #include "avformat.h"
    // or
    #import "libavformat/avformat.h"
    // or
    #import <libavformat/avformat.h>

Nothing I try works.
Those .a files are currently in my project directory; indeed, if I import a .a file directly, Xcode does not complain that it is missing, but compilation fails with UTF-8 errors, because .a files are binary object archives and cannot be imported that way.
I also set the project's Header Search Paths as suggested, and looked at the config.log file, but no luck.
I also notice that libraries are missing from every FFmpeg example project I could find on GitHub.
Any ideas?
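The "UTF-8 stuff" error is the key clue: a static library is not source code at all but a binary `ar` archive that only the *linker* consumes; the compiler only ever sees headers, which is why `#import`-ing a .a file can never work. A small sketch (not FFmpeg-specific; the archive built here is a hypothetical stand-in) illustrates what a .a file actually is:

```python
# Sketch: a .a static library is an "ar" archive, a binary container of
# object files. We build a minimal archive by hand and inspect its magic
# bytes -- the binary content is what makes Clang reject it as UTF-8 text.
import os
import tempfile

AR_MAGIC = b"!<arch>\n"  # every .a file starts with these 8 bytes

def make_minimal_archive(path, member_name=b"dummy.o", payload=b"\x00\x01"):
    # ar member header: name(16) mtime(12) uid(6) gid(6) mode(8) size(10) end(2)
    header = (member_name.ljust(16) + b"0".ljust(12) + b"0".ljust(6) +
              b"0".ljust(6) + b"644".ljust(8) +
              str(len(payload)).encode().ljust(10) + b"`\n")
    with open(path, "wb") as f:
        f.write(AR_MAGIC + header + payload)

path = os.path.join(tempfile.mkdtemp(), "libdummy.a")
make_minimal_archive(path)

with open(path, "rb") as f:
    magic = f.read(8)
print(magic == AR_MAGIC)  # True: binary archive, not importable source
```

So the headers have to come from somewhere else: the FFmpeg `include/` directory that matches the prebuilt libraries, pointed at via Header Search Paths, while the .a files themselves are handed to the linker in the "Link Binary With Libraries" phase.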
-
ffmpeg produces mp4 I cannot load on latest Safari on iOS
28 March 2016, by Michael Heuberger — When I encode a video from images with these ffmpeg parameters, I cannot play the resulting mp4 (inside a tag) in the latest Safari on my iPhone 6s, which also runs the latest iOS:
ffmpeg \
  -r 15.279071668502123 \
  -f image2 -thread_queue_size 64 \
  -i /home/michael-heuberger/abcd/frames/%d.webp \
  -y \
  -an \
  -vcodec libx264 \
  -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" \
  -crf 16 \
  -preset fast \
  -profile:v baseline \
  -pix_fmt yuv420p \
  -loglevel warning \
  -movflags faststart \
  /home/michael-heuberger/abcd/videomail_good.mp4

I think the above parameters should be solid. I am adding baseline and yuv420p, yet no luck on Safari 9. Why?
This makes it difficult for me to play videomails recorded on www.videomail.io on iOS devices.
Am I missing something here? I have already done lots of research and tried various combinations, with no luck.
If you want to reproduce this, it is easy: just record a video on www.videomail.io, then open the link to the recorded video page on your iPhone, or download the file for local investigation.
Any clues are very welcome!
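Two things worth double-checking: the unquoted `scale=trunc(iw/2)*2:...` filter contains parentheses that a POSIX shell rejects unless quoted, and iOS Safari is known to be picky about the H.264 *level*, not just the profile. A sketch of the same invocation built as a subprocess argument list (no shell quoting issues); the `-level 3.0` flag is an assumption, a commonly suggested tweak for older iOS Safari, not a confirmed fix:

```python
# Build the ffmpeg command as an argument list so the parentheses in the
# scale filter need no shell quoting at all.
import subprocess

cmd = [
    "ffmpeg",
    "-r", "15.279071668502123",
    "-f", "image2", "-thread_queue_size", "64",
    "-i", "/home/michael-heuberger/abcd/frames/%d.webp",
    "-y",
    "-an",
    "-vcodec", "libx264",
    "-vf", "scale=trunc(iw/2)*2:trunc(ih/2)*2",  # safe: no shell involved
    "-crf", "16",
    "-preset", "fast",
    "-profile:v", "baseline",
    "-level", "3.0",  # assumed tweak: constrain the H.264 level for iOS
    "-pix_fmt", "yuv420p",
    "-loglevel", "warning",
    "-movflags", "faststart",
    "/home/michael-heuberger/abcd/videomail_good.mp4",
]
# subprocess.run(cmd, check=True)  # uncomment to actually run the encode
print(len(cmd))
```

Running the encoder this way also rules out any shell-quoting mishap as the cause, leaving the codec parameters as the only variable.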
-
How should I set the video and audio timestamps when using ffmpeg to publish an RTMP stream to an nginx-rtmp server?
8 July 2016, by user1190248 — I use ffmpeg to publish my AV stream to an nginx-rtmp server.
My H.264 stream is 30 fps; the AAC audio is 8 kHz stereo.
In the packet-sending function, I use the following code to read frames and set the audio and video timestamps:

void RtmpLiveEncoder::Run()
{
    AVBitStreamFilterContext* aacbsfc = av_bitstream_filter_init("aac_adtstoasc");
    start_time = av_gettime();
    while (1)
    {
        do
        {
            int ret = 0;
            AVPacket pkt;
            av_init_packet(&pkt);
            ret = av_read_frame(ifmt_ctx, &pkt);
            if (ret < 0)
            {
                printf("read video frame failed\n");
                break;
            }
            if (pkt.pts == AV_NOPTS_VALUE)
            {
                if (_frameduration == 0)
                {
                    pkt.dts = pkt.pts = (av_gettime() - start_time) / 1000;
                }
                else
                {
                    pkt.dts = pkt.pts = _lastvideopts;
                    pkt.duration = _frameduration;
                    pkt.pos = -1;
                    _lastvideopts += _frameduration;
                }
            }
            if (av_write_frame(ofmt_ctx, &pkt) < 0)
            {
                printf("write video frame failed\n");
            }
            av_packet_unref(&pkt);
        } while (0);

        do
        {
            if (!_hasaudio)
            {
                break;
            }
            if ((_lastaudiopts - _lastvideopts) > 0)
            {
                printf("the audio is faster than video, the audio pts is %d, the video pts is %d\n",
                       _lastaudiopts, _lastvideopts);
                break;
            }
            int ret = 0;
            AVPacket audiopacket;
            av_init_packet(&audiopacket);
            ret = av_read_frame(aifmt_ctx, &audiopacket);
            if (ret < 0)
            {
                break;
            }
            AVStream* out_stream = ofmt_ctx->streams[1];
            if (av_bitstream_filter_filter(aacbsfc, out_stream->codec, NULL,
                                           &audiopacket.data, &audiopacket.size,
                                           audiopacket.data, audiopacket.size, 0) < 0)
            {
                printf("remove adts header failed\n");
            }
            if (av_bitstream_filter_filter(aacbsfc, out_stream->codec, NULL,
                                           &audiopacket.buf->data, &audiopacket.buf->size,
                                           audiopacket.buf->data, audiopacket.buf->size, 0) < 0)
            {
                printf("remove adts header failed\n");
            }
            audiopacket.stream_index = 1;
            audiopacket.dts = audiopacket.pts = _lastaudiopts;
            audiopacket.duration = (double)1024 / out_stream->codecpar->sample_rate * 1000;
            _lastaudiopts += audiopacket.duration;
            audiopacket.pos = -1;
            if (av_write_frame(ofmt_ctx, &audiopacket) < 0)
            {
                printf("write audio failed.\n");
            }
            av_packet_unref(&audiopacket);
        } while (0);
    }
    av_write_trailer(ofmt_ctx);
}

As you can see, I set the audio and video PTS from the frame duration.
Then I use VLC or Chrome on an iPhone to access the HLS stream. At first, audio and video are synchronized, but as time goes on they drift apart: the audio runs faster than the video.
So what is wrong with it?
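One plausible source of the drift can be shown with plain arithmetic (this is a sketch, not FFmpeg API code, and it assumes `_frameduration` is the integer 33, i.e. 1000/30 truncated to whole milliseconds; the code above does not show how it is computed):

```python
# Back-of-the-envelope PTS drift for 30 fps video vs 8 kHz AAC audio,
# assuming all timestamps are kept in integer milliseconds.
SAMPLE_RATE = 8000
AAC_FRAME = 1024
FPS = 30

audio_ms_per_packet = AAC_FRAME / SAMPLE_RATE * 1000  # exactly 128.0 ms
video_ms_per_frame_int = 1000 // FPS                  # 33 ms (truncated)
video_ms_per_frame_true = 1000 / FPS                  # 33.333... ms

# After one minute of real time:
frames = FPS * 60
video_pts_after_1min = frames * video_ms_per_frame_int  # 59400 ms
drift_per_minute = 60000 - video_pts_after_1min         # 600 ms per minute

print(audio_ms_per_packet)  # 128.0 -> audio PTS tracks real time exactly
print(drift_per_minute)     # 600   -> video PTS falls ~0.6 s behind per minute
```

At 8 kHz, each 1024-sample AAC frame is exactly 128 ms, so the audio timestamps advance at the true rate, while a truncated per-frame video duration loses a third of a millisecond every frame; the audio then looks ever faster than the video, matching the symptom. Accumulating PTS in the stream's rational time_base (or in floating point) instead of truncated integer milliseconds removes this class of drift.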