
Other articles (62)
-
MediaSPIP v0.2
21 June 2013. MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013, announced here.
The zip file provided here contains only the MediaSPIP sources, in standalone form.
As with the previous version, all of the software dependencies have to be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further changes (...)
Managing creation and editing rights for objects
8 February 2011. By default, many features are restricted to administrators, but each of them can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable in the form-template management; adding notes to articles; adding captions and annotations to images;
-
Making files available
14 April 2011. By default, when first set up, MediaSPIP does not allow visitors to download files, whether they are originals or the result of transformation or encoding; it only allows them to be viewed.
However, it is possible, and easy, to give visitors access to these documents in various forms.
All of this happens on the skeleton configuration page: you need to go to the channel's administration area and choose in the navigation (...)
On other sites (6321)
-
Texture rendering on iOS using OpenGL ES in Unity project
30 March 2016, by Time1ess
I'm working on a project, part of which is to stream video to my iPhone. I use my laptop to create the video stream to my iPhone with ffmpeg.
The streaming command in the shell is below:
ffmpeg \
-f avfoundation -i "1" -s 1280*720 -r 29.97 \
-c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
-f mpegts udp://192.168.1.102:6666

With this, I successfully create my video stream.
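As a side note (not part of the original setup), a quick way to check that the stream itself arrives correctly, independently of the Unity plugin, is to play it on the receiving machine with ffplay while listening on the same port:

ffplay -f mpegts udp://0.0.0.0:6666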
In Unity, I want to decode the video stream to create a texture. After going through some ffmpeg and Unity tutorials, I created my native plugin library. Some of the code is below (ask me if more is needed):
In my library:
buffer alloc:
uint8_t *buffer;
int buffer_size;
buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);
buffer = (uint8_t *) av_malloc(buffer_size*sizeof(uint8_t));
avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
               VIEW_WIDTH, VIEW_HEIGHT);

getContext:
is->sws_ctx = sws_getContext(
    is->video_st->codec->width,
    is->video_st->codec->height,
    is->video_st->codec->pix_fmt,
    VIEW_WIDTH,
    VIEW_HEIGHT,
    AV_PIX_FMT_RGBA,
    SWS_BILINEAR,
    NULL,
    NULL,
    NULL
);

sws_scale:
sws_scale(
    is->sws_ctx,
    (uint8_t const * const *)pFrame->data,
    pFrame->linesize,
    0,
    is->video_st->codec->height,
    pFrameRGB->data,
    pFrameRGB->linesize
);

texture render:
static void UNITY_INTERFACE_API OnRenderEvent(int texID)
{
    GLuint gltex = (GLuint)(size_t)(texID);
    glBindTexture(GL_TEXTURE_2D, gltex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                    GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
    glGetError();
    return;
}

extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
{
    return OnRenderEvent;
}

In Unity:
texture created:
private Texture2D texture;
private int texID;
texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
texture.filterMode = FilterMode.Point;
texture.Apply ();
GetComponent<Renderer>().material.mainTexture = texture;
texID = texture.GetNativeTexturePtr ().ToInt32();
update func:
void Update()
{
    GL.IssuePluginEvent(GetRenderEventFunc(), texID);
}

Video stream info:
Input #0, mpegts, from 'udp://0.0.0.0:6666':
Duration: N/A, start: 2.534467, bitrate: N/A
Program 1
Metadata:
service_name : Service01
service_provider: FFmpeg
Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

Leaving out other details: my library works fine in the Unity simulator, but when I compiled all my libraries for arm64 and used the Xcode project generated by Unity to build and run the app, I couldn't get any texture rendered on my iPhone. I checked my network and I'm sure the data reaches the iPhone; the debug log shows that frames are decoded successfully and that the OnRenderEvent function is called.

FYI:
Unity 5.3.2f1 Personal
Xcode 7.2.1
iOS 9.2.1
ffmpeg 3.0
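The decoding step itself is not shown above, only the log message saying that frames decode successfully. With the FFmpeg 3.0 API (avcodec_decode_video2; the send/receive packet API only arrived in 3.1), the loop feeding pFrame into the sws_scale call presumably looks roughly like the sketch below. The names is, pFrame, pFrameRGB, sws_ctx and video_st follow the question; fmt_ctx and video_stream_index are illustrative placeholders:

AVPacket pkt;
int got_frame = 0;

// Read one packet from the MPEG-TS input, decode it into pFrame,
// then convert it to RGBA so OnRenderEvent can upload it as a texture.
if (av_read_frame(is->fmt_ctx, &pkt) >= 0) {
    if (pkt.stream_index == is->video_stream_index) {
        avcodec_decode_video2(is->video_st->codec, pFrame, &got_frame, &pkt);
        if (got_frame) {
            sws_scale(is->sws_ctx,
                      (uint8_t const * const *)pFrame->data, pFrame->linesize,
                      0, is->video_st->codec->height,
                      pFrameRGB->data, pFrameRGB->linesize);
        }
    }
    av_packet_unref(&pkt);
}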
-
ffmpeg produces video I cannot load on latest Safari on iOS
3 April 2016, by Michael Heuberger
When I encode two videos (mp4 and webm) with these ffmpeg parameters (based on images), I cannot play the mp4 it produces in the latest Safari (inside a <video></video> tag) on my iPhone 6s running the latest iOS either:

ffmpeg
-framerate 15.2
-f image2
-thread_queue_size 64
-i /home/michael-heuberger/abcd/frames/%d.webp
-y
-an
-vcodec libx264
-vf scale=trunc(iw/2)*2:trunc(ih/2)*2
-crf 16
-preset fast
-profile:v baseline
-pix_fmt yuv420p
-movflags +faststart
/home/michael-heuberger/abcd/videomail_good.mp4
-vcodec libvpx
-vf scale=trunc(iw/2)*2:trunc(ih/2)*2
-crf 8
-deadline good
-cpu-used 1
-pix_fmt yuv420p
-movflags +faststart /home/michael-heuberger/abcd/videomail_good.webm

I think the above parameters should be solid. I am adding baseline and yuv420p, yet no luck on Safari 9. Why?
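One small aside: if this is typed into a POSIX shell as a single command line, the parentheses in the scale filter have to be quoted, for example:

-vf "scale=trunc(iw/2)*2:trunc(ih/2)*2"

When ffmpeg is spawned programmatically with an argument array, as a web service presumably does, this does not apply.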
And here is the whole ffmpeg output (just ignore the paths to the images/videos):
ffmpeg stdout: ffmpeg version 2.7.6-0ubuntu0.15.10.1 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.2.1 (Ubuntu 5.2.1-22ubuntu2) 20151010
configuration: --prefix=/usr --extra-version=0ubuntu0.15.10.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --enable-shared --disable-stripping --enable-avresample --enable-avisynth --enable-frei0r --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-openal --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libxvid --enable-libzvbi --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-libssh --enable-libsoxr --enable-libx264 --enable-libopencv --enable-libx265
WARNING: library configuration mismatch
avcodec configuration: --prefix=/usr --extra-version=0ubuntu0.15.10.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --enable-shared --disable-stripping --enable-avresample --enable-avisynth --enable-frei0r --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-openal --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libxvid --enable-libzvbi --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-libssh --enable-libsoxr --enable-libx264 --enable-libopencv --enable-libx265 --enable-version3 --disable-doc --disable-programs --disable-avdevice --disable-avfilter --disable-avformat --disable-avresample --disable-postproc --disable-swscale --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libvo_aacenc --enable-libvo_amrwbenc
libavutil 54. 27.100 / 54. 27.100
libavcodec 56. 41.100 / 56. 41.100
libavformat 56. 36.100 / 56. 36.100
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 16.101 / 5. 16.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.100 / 1. 2.100
libpostproc 53. 3.100 / 53. 3.100
Input #0, image2, from '/home/michael-heuberger/binarykitchen/code/videomail.io/var/local/tmp/clients/videomail.io/11e5-f576-676eaa40-b0ac-89a86db14ea0/frames/%d.webp':
Duration: 00:00:00.59, start: 0.000000, bitrate: N/A
Stream #0:0: Video: webp, yuv420p(tv, bt470bg/unknown/unknown), 400x300, 15.17 fps, 15.20 tbr, 15.20 tbn, 15.20 tbc
[libx264 @ 0xfc2760] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2
[libx264 @ 0xfc2760] profile Constrained Baseline, level 2.1
[libx264 @ 0xfc2760] 264 - core 146 r2555 0c21480 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=0 ref=2 deblock=1:0:0 analyse=0x1:0x111 me=hex subme=6 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=15 scenecut=40 intra_refresh=0 rc_lookahead=30 rc=crf mbtree=1 crf=16.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
[libvpx @ 0xfc5da0] v1.4.0
Output #0, mp4, to '/home/michael-heuberger/binarykitchen/code/videomail.io/var/local/uploads/videomail.io/videomail/11e5/f5/76/11e5-f576-676eaa40-b0ac-89a86db14ea0/videomail_good.mp4':
Metadata:
encoder : Lavf56.36.100
Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 400x300, q=-1--1, 15.20 fps, 19456 tbn, 15.20 tbc
Metadata:
encoder : Lavc56.41.100 libx264
Output #1, webm, to '/home/michael-heuberger/binarykitchen/code/videomail.io/var/local/uploads/videomail.io/videomail/11e5/f5/76/11e5-f576-676eaa40-b0ac-89a86db14ea0/videomail_good.webm':
Metadata:
encoder : Lavf56.36.100
Stream #1:0: Video: vp8 (libvpx), yuv420p, 400x300, q=-1--1, 200 kb/s, 15.20 fps, 1k tbn, 15.20 tbc
Metadata:
encoder : Lavc56.41.100 libvpx
Stream mapping:
Stream #0:0 -> #0:0 (webp (native) -> h264 (libx264))
Stream #0:0 -> #1:0 (webp (native) -> vp8 (libvpx))
Press [q] to stop, [?] for help
[mp4 @ 0xfc16c0] Starting second pass: moving the moov atom to the beginning of the file
frame= 9 fps=0.0 q=-1.0 Lq=0.0 size= 62kB time=00:00:00.59 bitrate= 856.1kbits/s
video:82kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
[libx264 @ 0xfc2760] frame I:1 Avg QP:15.45 size: 10341
[libx264 @ 0xfc2760] frame P:8 Avg QP:15.95 size: 6442
[libx264 @ 0xfc2760] mb I I16..4: 17.1% 0.0% 82.9%
[libx264 @ 0xfc2760] mb P I16..4: 6.8% 0.0% 11.3% P16..4: 36.7% 31.6% 10.8% 0.0% 0.0% skip: 2.8%
[libx264 @ 0xfc2760] coded y,uvDC,uvAC intra: 67.5% 60.4% 43.0% inter: 60.5% 68.4% 5.4%
[libx264 @ 0xfc2760] i16 v,h,dc,p: 32% 29% 21% 18%
[libx264 @ 0xfc2760] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 32% 23% 20% 3% 4% 6% 5% 4% 3%
[libx264 @ 0xfc2760] i8c dc,h,v,p: 60% 21% 14% 5%
[libx264 @ 0xfc2760] ref P L0: 59.3% 40.7%
[libx264 @ 0xfc2760] kb/s:836.04

This makes it difficult for me to play videomails recorded on www.videomail.io on iOS devices.
Am I missing something here? I have already done lots of research and tried various combinations, with no luck.
If you want to reproduce this, it's easy: just record a video on www.videomail.io, then open the link of the recorded video page on your iPhone, or download the file for local investigation.
Any clues are very welcome!
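Not from the original post, but one way to narrow this down is to check what the generated mp4 actually declares, since iOS Safari is strict about the H.264 profile, level and pixel format; an illustrative ffprobe call using the output path from the command above:

ffprobe -v error -select_streams v:0 -show_streams /home/michael-heuberger/abcd/videomail_good.mp4

The profile, level, pix_fmt and dimensions reported there are the first things to compare against what the target iPhone supports.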
-
How should I set the video and audio timestamps when using ffmpeg to publish an RTMP stream to an nginx-rtmp server
8 July 2016, by user1190248
Now I use ffmpeg to publish my AV stream to an nginx-rtmp server.
My H.264 stream is 30 fps; the AAC audio is 8 kHz, stereo.
In the packet-sending function, I use the following code to read frames and set the audio and video timestamps:

void RtmpLiveEncoder::Run()
{
    AVBitStreamFilterContext* aacbsfc = av_bitstream_filter_init("aac_adtstoasc");
    start_time = av_gettime();
    while(1)
    {
        do
        {
            int ret = 0;
            AVPacket pkt;
            av_init_packet(&pkt);
            ret = av_read_frame(ifmt_ctx, &pkt);
            if(ret<0)
            {
                printf("read video frame failed\n");
                break;
            }
            if(pkt.pts==AV_NOPTS_VALUE)
            {
                if(_frameduration==0)
                {
                    pkt.dts = pkt.pts = (av_gettime()-start_time)/1000;
                }
                else
                {
                    pkt.dts = pkt.pts = _lastvideopts;
                    pkt.duration = _frameduration;
                    pkt.pos = -1;
                    _lastvideopts += _frameduration;
                }
            }
            if(av_write_frame(ofmt_ctx, &pkt)<0)
            {
                printf("write video frame failed\n");
            }
            av_packet_unref(&pkt);
        }while(0);
        do
        {
            if(!_hasaudio)
            {
                break;
            }
            if((_lastaudiopts-_lastvideopts)>0)
            {
                printf("the audio is faster than video, the audio pts is %d, the video pts is %d\n", _lastaudiopts, _lastvideopts);
                break;
            }
            int ret = 0;
            AVPacket audiopacket;
            av_init_packet(&audiopacket);
            ret = av_read_frame(aifmt_ctx, &audiopacket);
            if(ret<0)
            {
                break;
            }
            AVStream* out_stream = ofmt_ctx->streams[1];
            if(av_bitstream_filter_filter(aacbsfc, out_stream->codec, NULL, &audiopacket.data, &audiopacket.size, audiopacket.data, audiopacket.size, 0)<0)
            {
                printf("remove adts header failed\n");
            }
            if(av_bitstream_filter_filter(aacbsfc, out_stream->codec, NULL, &audiopacket.buf->data, &audiopacket.buf->size, audiopacket.buf->data, audiopacket.buf->size, 0)<0)
            {
                printf("remove adts header failed\n");
            }
            audiopacket.stream_index = 1;
            audiopacket.dts = audiopacket.pts = _lastaudiopts;
            audiopacket.duration = (double)1024/out_stream->codecpar->sample_rate*1000;
            _lastaudiopts += audiopacket.duration;
            audiopacket.pos = -1;
            if(av_write_frame(ofmt_ctx, &audiopacket)<0)
            {
                printf("write audio failed.\n");
            }
            av_packet_unref(&audiopacket);
        }while(0);
    }
    av_write_trailer(ofmt_ctx);
}

As you can see, I set the audio and video pts using the frame duration.
Then I use VLC, or Chrome on an iPhone, to access the HLS stream. At first the audio and video are synchronized, but as time goes on they drift apart and the audio runs ahead of the video.
So what is wrong with it?
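For what it's worth, both streams here accumulate their pts by repeatedly adding a rounded per-packet duration in the 1/1000 (millisecond) RTMP/FLV time base. For the 8 kHz AAC, 1024/8000 s is exactly 128 ms, so the audio side loses nothing to rounding, but if _frameduration is a rounded integer such as 33 ms for a 30 fps stream, the video timestamps drift relative to the audio by roughly 0.33 ms per frame. A common way to avoid accumulating that rounding error is to derive every timestamp from a running frame/sample counter instead of from the previous timestamp. A rough sketch, with illustrative counter names that are not from the original code:

#include <libavutil/mathematics.h>

// Counters kept across loop iterations (e.g. as class members), instead of
// accumulating rounded durations into _lastvideopts / _lastaudiopts.
int64_t video_frame_index = 0;   // video frames written so far
int64_t audio_sample_count = 0;  // AAC samples written so far

// Video pts in ms, assuming a constant 30 fps as described in the question.
pkt.pts = pkt.dts = av_rescale(video_frame_index, 1000, 30);
video_frame_index++;

// Audio pts in ms derived from the total sample count (1024 samples per AAC frame).
audiopacket.pts = audiopacket.dts =
    av_rescale(audio_sample_count, 1000, out_stream->codecpar->sample_rate);
audio_sample_count += 1024;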