
Other articles (108)
-
Contribute to translation
13 April 2011
You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into other languages, which allows it to spread to new linguistic communities.
To do this, we use SPIP's translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...)
-
Adding user-specific information and other changes to author-related behaviour
12 April 2011
The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the plugins champs extras 2 and Interface pour champs extras.
-
Customising categories
21 June 2013
Category creation form
For those who know SPIP well, a category can be thought of as a section (rubrique).
For a document of type category, the fields offered by default are: Text
This form can be modified under:
Administration > Configuration des masques de formulaire.
For a document of type media, the fields not displayed by default are: Descriptif rapide (short description)
Moreover, it is in this configuration section that you can specify the (...)
On other sites (10593)
-
Cannot use FFmpeg in Xcode iOS Project (file .h not found)
19 March 2016, by BlackBox
I followed this guide almost step by step.
Almost, because, as a user suggested, I downloaded the already-built FFmpeg iOS library from here.
I followed the guide from the "Linking static libraries in Xcode" section onwards, but I still cannot import the FFmpeg header files. (So I do have the .a files.)
For example:
#include "avformat.h"
// or
#import "libavformat/avformat.h"
// or
#import <libavformat/avformat.h>
None of these work.
Note that those .a files are currently in my project directory; indeed, if I import a .a file directly, the compiler doesn't complain that it isn't found, but when compiling it complains about UTF-8 issues, because .a files are object libraries and cannot be imported that way.
I also set the project's Header Search Paths as suggested, and went over the config.log file, but nothing works.
Also, I see that libraries are missing from every FFmpeg example project I was able to find on GitHub.
Any ideas?
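For reference, once the Header Search Paths entry points at the FFmpeg include directory (the folder containing libavformat/, libavcodec/ and so on) and the .a files are added under the "Link Binary With Libraries" build phase, a minimal sanity check could look like the sketch below; the layout is an assumption, and from C++ or Objective-C++ the includes generally need an extern "C" wrapper.

/* Sketch only: assumes Header Search Paths contains the directory that
 * holds libavformat/, libavcodec/, ... and that the .a files are linked. */
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

void print_ffmpeg_versions(void)
{
    /* If this compiles and links, both headers and static libraries are found. */
    printf("libavformat %u\n", avformat_version());
    printf("libavcodec  %u\n", avcodec_version());
}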
-
Using ffmpeg to display a static image if an RTMP input source is missing
19 March 2016, by iameli
Here is what I would like ffmpeg to output:
- If I am streaming from my iPhone to my RTMP server, ffmpeg should output the live video from my iPhone.
- If not, ffmpeg should output a blank red screen.
Here’s what I have so far. It sort of works.
ffmpeg \
-f lavfi \
-re \
-i 'color=s=320x240:r=30:c=red' \
-thread_queue_size 512 \
-i 'rtmp://localhost/stream/iphone' \
-c:v libx264 \
-f flv \
-filter_complex "\
[1:v]scale=320:240[stream]; \
[0:v][stream]overlay=0:0:eof_action=pass[output] \
"\
-map '[output]' \
-tune zerolatency \
'rtmp://localhost/stream/output'

What happens: it boots up and starts streaming my iPhone's output, no problem. When I disconnect, it hangs for a long time, perhaps 20 seconds. Then it starts outputting red, okay. But then if I reconnect my phone, it doesn't resume; it's still red. Two questions:
- Is there a way to configure the buffering so that it starts outputting red as soon as it stops getting data from the RTMP stream? (See the sketch below.)
- Is there a way to have it auto-retry, so that after the RTMP stream returns, it switches back?
Full verbose output, if that’s helpful. I’m using the latest git version of ffmpeg as of 2016-03-18 on Ubuntu Wily. The RTMP server is nginx-rtmp.
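On question 1, one way to cut off a stalled input quickly, if the failover logic were written against the libavformat API instead of the ffmpeg command line, is an interrupt callback that aborts a blocking read after a chosen timeout. The two-second value and the helper names below are illustrative assumptions, a sketch rather than a drop-in fix.

/* Sketch: abort a blocking RTMP read once no packet has arrived for ~2 s.
 * Helper names and the timeout value are assumptions, not from the post. */
#include <libavformat/avformat.h>
#include <libavutil/time.h>

static int64_t last_packet_time;   /* refresh after every successful av_read_frame() */

static int stall_check(void *opaque)
{
    (void)opaque;
    /* A non-zero return makes libavformat abort the pending I/O call. */
    return av_gettime_relative() - last_packet_time > 2 * 1000000;
}

int open_live_input(AVFormatContext **fmt, const char *url)
{
    *fmt = avformat_alloc_context();
    if (!*fmt)
        return -1;
    (*fmt)->interrupt_callback.callback = stall_check;

    last_packet_time = av_gettime_relative();
    if (avformat_open_input(fmt, url, NULL, NULL) < 0)
        return -1;   /* source missing or stalled: switch to the red frame */
    return 0;
}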
-
texture rendering issue on iOS using OpenGL ES in Unity project
28 March 2016, by Time1ess
I'm working on a project, part of which is streaming video to my iPhone. Currently I use my laptop to create the video stream for my iPhone with ffmpeg. The stream command in shell is below:
ffmpeg \
-f avfoundation -i "1" -s 1280*720 -r 29.97 \
-c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
-f mpegts udp://192.168.1.102:6666

With this, I successfully create my video stream.
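For orientation before the library fragments below, here is a minimal sketch of how such a library would typically open this MPEG-TS/UDP stream with libavformat; the listen URL matches the stream info further down, while the helper name and the minimal error handling are assumptions.

/* Sketch: open the UDP MPEG-TS stream and pick its video stream.
 * Expects *fmt_ctx == NULL on entry; the helper name is an assumption. */
#include <libavformat/avformat.h>

int open_udp_stream(AVFormatContext **fmt_ctx)
{
    const char *url = "udp://0.0.0.0:6666";

    avformat_network_init();
    if (avformat_open_input(fmt_ctx, url, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(*fmt_ctx, NULL) < 0)
        return -1;

    /* Returns the index of the best video stream (the mpeg2video stream here). */
    return av_find_best_stream(*fmt_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
}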
In Unity, I want to decode the video stream to create a texture. After going through some ffmpeg and Unity tutorials (since I'm new to both), I followed them and created my linked library. Some of the code is below (ask me if more is needed):
In my library:
buffer alloc:
uint8_t *buffer;
int buffer_size;
buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);
buffer = (uint8_t *) av_malloc(buffer_size*sizeof(uint8_t));
avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
               VIEW_WIDTH, VIEW_HEIGHT);

getContext:
is->sws_ctx = sws_getContext
(
is->video_st->codec->width,
is->video_st->codec->height,
is->video_st->codec->pix_fmt,
VIEW_WIDTH,
VIEW_HEIGHT,
AV_PIX_FMT_RGBA,
SWS_BILINEAR,
NULL,
NULL,
NULL
);

sws_scale:
sws_scale(
is->sws_ctx,
(uint8_t const * const *)pFrame->data,
pFrame->linesize,
0,
is->video_st->codec->height,
pFrameRGB->data,
pFrameRGB->linesize
);

texture render:
static void UNITY_INTERFACE_API OnRenderEvent(int texID)
{
GLuint gltex = (GLuint)(size_t)(texID);
glBindTexture(GL_TEXTURE_2D, gltex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
glGetError();
return;
}
extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
{
return OnRenderEvent;
}In Unity :
texture created:
private Texture2D texture;
private int texID;
texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
texture.filterMode = FilterMode.Point;
texture.Apply ();
GetComponent<Renderer> ().material.mainTexture = texture;
texID = texture.GetNativeTexturePtr ().ToInt32();

update func:
void Update ()
{
GL.IssuePluginEvent(GetRenderEventFunc(), texID);
}Video stream info :
Input #0, mpegts, from 'udp://0.0.0.0:6666':
Duration: N/A, start: 2.534467, bitrate: N/A
Program 1
Metadata:
service_name : Service01
service_provider: FFmpeg
Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

Leaving out other details: my library works fine in the Unity simulator, but when I compiled all my libraries for arm64 and used the Xcode project that Unity created to build my app and ran it, I couldn't get any texture rendered on my iPhone. I checked my network and I'm sure that data had been sent to my iPhone, and the debug log showed me that frames had been successfully decoded; also the
OnRenderEvent
func had been called. I'm confused and tried to find an answer on Stack Overflow, but maybe because I'm a beginner I couldn't find one, so I'm asking for your help.
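For context, the post does not show the read/decode loop that sits between the library fragments above and the texture upload; a sketch with the same ffmpeg 3.0-era API might look like the following, where the VideoState struct, pFormatCtx and video_stream_index are assumed names (the other identifiers come from the fragments above).

/* Sketch of the glue the post omits: read packets, decode the video stream,
 * and convert each frame to RGBA for the glTexSubImage2D upload above.
 * VideoState, pFormatCtx and video_stream_index are assumed names. */
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

typedef struct VideoState {          /* assumed shape, mirroring the "is->" fields above */
    AVFormatContext   *pFormatCtx;
    AVStream          *video_st;
    struct SwsContext *sws_ctx;
    int                video_stream_index;
} VideoState;

static void decode_loop(VideoState *is, AVFrame *pFrame, AVFrame *pFrameRGB)
{
    AVPacket packet;
    int frame_finished = 0;

    while (av_read_frame(is->pFormatCtx, &packet) >= 0) {
        if (packet.stream_index == is->video_stream_index) {
            avcodec_decode_video2(is->video_st->codec, pFrame,
                                  &frame_finished, &packet);
            if (frame_finished) {
                sws_scale(is->sws_ctx,
                          (uint8_t const * const *)pFrame->data, pFrame->linesize,
                          0, is->video_st->codec->height,
                          pFrameRGB->data, pFrameRGB->linesize);
                /* pFrameRGB->data[0] now holds the RGBA pixels that
                 * OnRenderEvent uploads with glTexSubImage2D. */
            }
        }
        av_packet_unref(&packet);
    }
}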
FYI:
Unity 5.3.2f1 Personal
Xcode 7.2.1
iOS 9.2.1
ffmpeg 3.0