
Media (2)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
Other articles (109)
-
The user profile
12 April 2011, by
Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
The user can reach the profile editor from their author page; a "Modify your profile" link in the navigation is (...)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Adding user-specific information and other author-related behavior changes
12 April 2011, by
The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you change certain user-related behaviors (refer to its documentation for more information).
It is also possible to add fields to authors by installing the plugins champs extras 2 and Interface pour champs extras.
On other sites (15575)
-
ffmpeg can't read png data on iphone
10 October 2014, by user2789801
I'm using ffmpeg to decode a PNG picture and use the resulting AVFrame as an OpenGL texture.
The strangest thing is that the PNG converts to an OpenGL texture nicely on an iPhone simulator, but I get a blank texture on a real iPhone. On both the simulator and the iPhone, I get a null pointer for the AVFrame's data after calling:
avcodec_decode_video2(codecContext /* AVCodecContext* */, frame /* AVFrame* */, &finished, &tempPacket);
Then I convert the color space to AV_PIX_FMT_RGBA:
void convertToRGBColor()
{
    // Allocate a destination buffer for an RGBA picture of the decoded size
    int numBytes = avpicture_get_size(AV_PIX_FMT_RGBA,
                                      codecContext->width,
                                      codecContext->height);
    uint8_t *buffer = (uint8_t *)av_malloc(numBytes);
    avpicture_fill((AVPicture *)rgbFrame /* rgbFrame is an AVFrame* */,
                   buffer, AV_PIX_FMT_RGBA,
                   codecContext->width, codecContext->height);

    struct SwsContext *img_convert_ctx = sws_getCachedContext(
        NULL,
        codecContext->width, codecContext->height, codecContext->pix_fmt,
        codecContext->width, codecContext->height, COLOR_SPACE,
        SWS_BILINEAR, NULL, NULL, NULL);
    if (!img_convert_ctx)
    {
        fprintf(stderr, "Cannot initialize sws conversion context\n");
        return; // don't call sws_scale with a NULL context
    }
    sws_scale(img_convert_ctx,
              frame->data, frame->linesize,
              0, codecContext->height,
              rgbFrame->data, rgbFrame->linesize);
    sws_freeContext(img_convert_ctx);
}
On a simulator, rgbFrame's data[0] will be a valid pointer, but on an iPhone it's null.
So, has anyone had the same problem before?
-
I use ffmpeg dynamically to edit a locally hosted video to be streamed on iPhone over 3G [on hold]
3 October 2014, by Luchian Chivoiu
I use ffmpeg dynamically to edit a locally hosted video to be streamed to an iPhone over 3G.
I use this line:
"ffmpeg -i C:\file1.mp4 -s 640x480 -r 30000/1001 -b:a 56k -bt 40k -b:v 50k -maxrate 10M -bufsize 10M -coder 0 -bf 0 -refs 1 -ar 48000 -ab 52k C:\file2.mp4"
and the transformed file, "file2.mp4", doesn't open on the iPhone.
I use Web API and C# to call ffmpeg locally. The resulting file opens in the browser, but it still won't stream to an iPhone over 3G, only over WiFi. Here Apple explains the process, but I still couldn't get it to work.
One example that can't be played on an iPhone is this one: http://www.starlety.com/Videos/8ada5277-424a-e411-b2e5-00ff91d7f3dd.mp4
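For comparison, a hedged sketch of an encode aimed at iPhone playback (the file names are placeholders and the bitrates are illustrative, not taken from the question): older iPhones require baseline-profile H.264 with AAC audio, and playback over the network before the full download finishes needs the moov atom at the front of the file, which -movflags +faststart provides.

```shell
# Sketch: re-encode a video for iPhone playback over a slow link.
# Assumptions: baseline H.264 + AAC-LC, moov atom moved to the front
# (+faststart) so playback can start before the whole file downloads.
ffmpeg -i input.mp4 \
  -c:v libx264 -profile:v baseline -level 3.0 -s 640x480 -b:v 200k \
  -c:a aac -b:a 64k -ac 2 -ar 44100 \
  -movflags +faststart \
  output.mp4
```

For cellular delivery specifically, the Apple guidance referenced above points toward HTTP Live Streaming with a low-bitrate variant rather than a single progressive-download .mp4.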
-
Live Stream using .m3u8 and .ts files with iPhone as server
26 February 2015, by Bhumit
I am trying to live stream from the iPhone camera. I have done some research and found that I can use .m3u8 files for streaming live video, and that the playlist should reference .ts (MPEG-2 transport stream) files. The file I have on my iPhone is an .mp4 file, which does not work with .m3u8, so I figured I would have to convert the .mp4 to .ts, but I have not succeeded in doing so. I found that it is possible to convert video with the ffmpeg library, as mentioned in the article here. I have successfully imported the ffmpeg library but cannot figure out how to use it to convert a video, as I am using it for the first time.
Another thing: the Apple documentation says
There are a number of hardware and software encoders that can create MPEG-2 transport streams carrying MPEG-4 video and AAC audio in real time.
What is being said here? Is there any other way I can use .mp4 files for live streaming without converting them on iOS?
Let me know if I am not being clear; I can provide more information. Any suggestion is appreciated. I would like to know whether I am on the right path here.
EDIT
I am adding more info to my question. Basically, what I am asking is: we can convert an .mp4 video to .ts using the following command
ffmpeg -i file.mp4 -acodec libfaac -vcodec libx264 -an -map 0 -f segment -segment_time 10 -segment_list test.m3u8 -segment_format mpegts -vbsf h264_mp4toannexb -flags -global_header stream%05d.ts
How can I use the ffmpeg library to do what this command does on iOS?
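For reference, here is a sketch of the same segmenting step using current ffmpeg option spellings (not tested against this exact file; the file names are placeholders). Note that the command above combines -acodec libfaac with -an, and -an disables audio entirely, so the resulting segments would be silent unless -an is dropped.

```shell
# Sketch: split an .mp4 into 10-second MPEG-TS segments plus an .m3u8 playlist.
# Assumption: the input is already H.264/AAC, so streams are copied rather
# than re-encoded; -bsf:v is the current spelling of the older -vbsf.
ffmpeg -i file.mp4 \
  -map 0 -c copy \
  -f segment -segment_time 10 \
  -segment_list test.m3u8 -segment_format mpegts \
  -bsf:v h264_mp4toannexb \
  stream%05d.ts
```

The dedicated hls muxer does the same thing in one step (ffmpeg -i file.mp4 -c copy -f hls -hls_time 10 test.m3u8). Inside an iOS app, the library equivalent of either command is a libavformat program that opens the "segment" (or "hls") output format with avformat_alloc_output_context2 and sets these options on the muxer with av_opt_set.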