
Media (1)
-
Map of Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
Other articles (98)
-
MediaSPIP 0.1 Beta version
25 April 2011

MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Videos
21 April 2011

Like "audio" documents, MediaSPIP displays videos, whenever possible, using the HTML5 <video> tag.
One drawback of this tag is that it is not handled correctly by some browsers (Internet Explorer, to name one), and that each browser natively supports only certain video formats.
Its main advantage is that video playback is handled natively by the browser, doing away with Flash and (...)
-
Contributing to its translation
10 April 2011

You can help us improve the wording used in the software, or translate it into any new language, allowing it to reach new linguistic communities.
To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. Simply subscribe to the translators' mailing list and ask for more information.
Currently, MediaSPIP is only available in French and (...)
On other sites (10065)
-
swscale/arm: re-enable neon rgbx to nv12 routines
22 February 2016, by Xiaolei Yu

Commit '842b8f4ba2e79b9c004a67f6fdb3d5c5d05805d3' fixed the clang/iphone
build but failed on some versions of cygwin. It has now been verified
to work on both platforms.

Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
-
texture rendering issue on iOS using OpenGL ES in Unity project
28 March 2016, by Time1ess

I'm working on a project; part of it is streaming video to my iPhone. Currently I use my laptop to create the video stream to my iPhone with ffmpeg. The stream command in shell is below:
ffmpeg \
  -f avfoundation -i "1" -s 1280x720 -r 29.97 \
  -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
  -f mpegts udp://192.168.1.102:6666

With this, I successfully create my video stream.
In Unity, I want to decode the video stream and create a texture from it. Since I'm new to both ffmpeg and Unity, I went through some ffmpeg and Unity tutorials, followed them, and created my linked library. Some of the code is below (ask me if more is needed):
In my library:
buffer alloc:
uint8_t *buffer;
int buffer_size;
buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);
buffer = (uint8_t *) av_malloc(buffer_size*sizeof(uint8_t));
avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
VIEW_WIDTH, VIEW_HEIGHT);

getContext:
is->sws_ctx = sws_getContext
(
is->video_st->codec->width,
is->video_st->codec->height,
is->video_st->codec->pix_fmt,
VIEW_WIDTH,
VIEW_HEIGHT,
AV_PIX_FMT_RGBA,
SWS_BILINEAR,
NULL,
NULL,
NULL
);

sws_scale:
sws_scale(
is->sws_ctx,
(uint8_t const * const *)pFrame->data,
pFrame->linesize,
0,
is->video_st->codec->height,
pFrameRGB->data,
pFrameRGB->linesize
);

texture render:
static void UNITY_INTERFACE_API OnRenderEvent(int texID)
{
GLuint gltex = (GLuint)(size_t)(texID);
glBindTexture(GL_TEXTURE_2D, gltex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
glGetError();
return;
}
extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
{
return OnRenderEvent;
}

In Unity:
texture created:
private Texture2D texture;
private int texID;
texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
texture.filterMode = FilterMode.Point;
texture.Apply ();
GetComponent<Renderer> ().material.mainTexture = texture;
texID = texture.GetNativeTexturePtr ().ToInt32();
update func:
void Update ()
{
GL.IssuePluginEvent(GetRenderEventFunc(), texID);
}

Video stream info:
Input #0, mpegts, from 'udp://0.0.0.0:6666':
Duration: N/A, start: 2.534467, bitrate: N/A
Program 1
Metadata:
service_name : Service01
service_provider: FFmpeg
Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

Leaving out other details: my library works fine in the Unity simulator, but when I compiled all my libraries for arm64, used the Xcode project that Unity generated to build my app, and ran it, I couldn't get any texture rendered on my iPhone. I checked my network and I'm sure the data had been sent to my iPhone; the debug log showed that frames were decoded successfully and that the OnRenderEvent function was called.

I'm confused and tried to find an answer on Stack Overflow; maybe because I'm a beginner I couldn't find one, so I'm asking for your help.
FYI:
Unity 5.3.2f1 Personal
Xcode 7.2.1
iOS 9.2.1
ffmpeg 3.0
-
Using ffmpeg to display a static image if an RTMP input source is missing
19 March 2016, by iameli

Here is what I would like ffmpeg to output:
- If I am streaming from my iPhone to my RTMP server, ffmpeg should output the live video from my iPhone.
- If not, ffmpeg should output a blank red screen.
Here’s what I have so far. It sort of works.
ffmpeg \
-f lavfi \
-re \
-i 'color=s=320x240:r=30:c=red' \
-thread_queue_size 512 \
-i 'rtmp://localhost/stream/iphone' \
-c:v libx264 \
-f flv \
-filter_complex "\
[1:v]scale=320:240[stream]; \
[0:v][stream]overlay=0:0:eof_action=pass[output] \
"\
-map '[output]' \
-tune zerolatency \
'rtmp://localhost/stream/output'

What happens: it boots up and starts streaming my iPhone's output, no problem. When I disconnect, it hangs for a long time, perhaps 20 seconds. Then it starts outputting red, okay. But if I then reconnect my phone, it doesn't resume; it's still red. Two questions:
- Is there a way to configure the buffering so that it starts outputting red as soon as it stops getting data from the RTMP stream?
- Is there a way to have it auto-retry, so that after the RTMP stream returns, it switches back?
Full verbose output, if that’s helpful. I’m using the latest git version of ffmpeg as of 2016-03-18 on Ubuntu Wily. The RTMP server is nginx-rtmp.