
Media (17)
-
Matmos - Action at a Distance
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
DJ Dolores - Oslodum 2004 (includes (cc) sample of “Oslodum” by Gilberto Gil)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Danger Mouse & Jemini - What U Sittin’ On? (starring Cee Lo and Tha Alkaholiks)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Cornelius - Wataridori 2
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Rapture - Sister Saviour (Blackstrobe Remix)
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Chuck D with Fine Arts Militia - No Meaning No
15 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (81)
-
MediaSPIP Player: potential problems
22 February 2011
The player does not work in Internet Explorer
In Internet Explorer (at least versions 7 and 8), the plugin uses the Flash player flowplayer to play video and audio. If the player does not seem to work, the cause may be the configuration of Apache’s mod_deflate module.
If the configuration of that Apache module contains a line that looks like the following, try removing it or commenting it out to see whether the player then works correctly: (...)
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded as MP4, Ogv and WebM (supported by HTML5), with MP4 also playable through Flash.
Audio files are encoded as MP3 and Ogg (supported by HTML5), with MP3 also playable through Flash.
Where possible, text is analyzed in order to retrieve the data needed by search engines, and is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
-
Support for all media types
10 April 2011
Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other documents (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
On other sites (10674)
-
Using ffmpeg to display a static image if an RTMP input source is missing
19 March 2016, by iameli
Here is what I would like ffmpeg to output:
- If I am streaming from my iPhone to my RTMP server, ffmpeg should output the live video from my iPhone.
- If not, ffmpeg should output a blank red screen.
Here’s what I have so far. It sort of works.
ffmpeg \
-f lavfi \
-re \
-i 'color=s=320x240:r=30:c=red' \
-thread_queue_size 512 \
-i 'rtmp://localhost/stream/iphone' \
-c:v libx264 \
-f flv \
-filter_complex "\
[1:v]scale=320:240[stream]; \
[0:v][stream]overlay=0:0:eof_action=pass[output] \
"\
-map '[output]' \
-tune zerolatency \
'rtmp://localhost/stream/output'
What happens: it boots up and starts streaming my iPhone’s output, no problem. When I disconnect, it hangs for a long time, perhaps 20 seconds. Then it starts outputting red, okay. But then if I reconnect my phone, it doesn’t resume. It’s still red. Two questions:
- Is there a way to configure the buffering so that it starts outputting red as soon as it stops getting data from the RTMP stream?
- Is there a way to have it auto-retry, so that after the RTMP stream returns, it will switch back?
Full verbose output, if that’s helpful. I’m using the latest git version of ffmpeg as of 2016-03-18 on Ubuntu Wily. The RTMP server is nginx-rtmp.
-
OpenGLES glReadPixels exc_bad_access
29 November 2011, by Yanny
I’m trying to create a video from images using OpenGLES and ffmpeg, but on iPad (4.3) I get a crash in glReadPixels:
-(NSData *) glToUIImage {
    int numberOfComponents = NUMBER_OF_COMPONENTS; // 4
    int width = PICTURE_WIDTH;
    int height = PICTURE_HEIGHT;

    NSInteger myDataLength = width * height * numberOfComponents;
    NSMutableData *buffer = [NSMutableData dataWithLength:myDataLength];

    [self checkForGLError];

    GLenum type = NUMBER_OF_COMPONENTS == 3 ? GL_RGB : GL_RGBA; // RGBA
    glReadPixels(0, 0, width, height, type, GL_UNSIGNED_BYTE, [buffer mutableBytes]); // EXC_BAD_ACCESS here

    return buffer;
}
It is working on iPhone 4 (4.3) and iPod Touch, but I have problems on iPhone 3G (3.0) and iPad (4.3). Can you help me with this issue?
Also, on iPhone 3G (3.0) and iPad (4.3) I have problems with the video: the first 5-20 video frames contain garbage. Could this be an issue with optimization? Or with the architecture?
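A frequent cause of an EXC_BAD_ACCESS in glReadPixels (not confirmed by the post, just a common culprit) is the destination buffer being smaller than what GL actually writes: with GL_RGB and the default GL_PACK_ALIGNMENT of 4, each row is padded to a 4-byte boundary, so a buffer sized width * height * 3 can be overrun on some devices and sizes. Below is a minimal C sketch of a defensive read, assuming an OpenGL ES 2.0 context that is current on the calling thread and a complete framebuffer bound; read_pixels_tightly is a hypothetical helper, not code from the post.

#include <OpenGLES/ES2/gl.h>
#include <stdlib.h>

/* Illustrative helper (not from the post): reads the currently bound
 * framebuffer into a tightly packed buffer. The caller must free() the
 * returned pointer; NULL is returned on failure. */
static GLubyte *read_pixels_tightly(GLint width, GLint height, GLenum format)
{
    const int components = (format == GL_RGB) ? 3 : 4;
    GLubyte *pixels = malloc((size_t)width * height * components);
    if (!pixels)
        return NULL;

    /* The default pack alignment of 4 pads each row for GL_RGB, which makes
     * glReadPixels write past a buffer sized width * height * 3. */
    glPixelStorei(GL_PACK_ALIGNMENT, 1);

    glReadPixels(0, 0, width, height, format, GL_UNSIGNED_BYTE, pixels);
    if (glGetError() != GL_NO_ERROR) {
        free(pixels);
        return NULL;
    }
    return pixels;
}

If the buffer size is not the problem, the other usual suspects are reading from an incomplete or multisampled framebuffer (which must be resolved first) and calling glReadPixels on a thread where the EAGL context is not current; the garbage in the first few frames reported above may simply mean the framebuffer is being read before anything has been drawn into it.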
-
Treating a video stream as playback with pausing
21 January 2020, by kealist
I am working on an application that streams multiple h264 video streams to a video wall. I am using the libav/ffmpeg libraries to stream multiple video files at once from inside the application. The application will control playback speed, seeking, pausing, resuming and stopping, and the video wall will only be receiving UDP streams.
I want to implement streaming such that if the videos are paused, the same frame is sent continually so that it looks as if it is a video window in a paused state.
How can I insert copies of the same h264 frame into the stream so that it does not mess up the sending of later frames?
My code is almost an exact port of transcoding.c from the ffmpeg examples. I am planning on retaining a copy of the last frame/packet and, when paused, sending that. Is this likely to function properly, or should I approach this a different way?

while (true)
{
    if (paused)
    {
        // USE LAST PACKET
    }
    else
    {
        if ((ret = ffmpeg.av_read_frame(ifmt_ctx, &packet)) < 0)
            break;
    }

    stream_index = packet.stream_index;
    type = ifmt_ctx->streams[packet.stream_index]->codec->codec_type;
    Console.WriteLine("Demuxer gave frame of stream_index " + stream_index);

    if (filter_ctx[stream_index].filter_graph != null)
    {
        Console.WriteLine("Going to reencode&filter the frame\n");
        frame = ffmpeg.av_frame_alloc();
        if (frame == null)
        {
            ret = ffmpeg.AVERROR(ffmpeg.ENOMEM);
            break;
        }
        packet.dts = ffmpeg.av_rescale_q_rnd(packet.dts,
            ifmt_ctx->streams[stream_index]->time_base,
            ifmt_ctx->streams[stream_index]->codec->time_base,
            AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
        packet.pts = ffmpeg.av_rescale_q_rnd(packet.pts,
            ifmt_ctx->streams[stream_index]->time_base,
            ifmt_ctx->streams[stream_index]->codec->time_base,
            AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
        if (type == AVMediaType.AVMEDIA_TYPE_VIDEO)
        {
            ret = ffmpeg.avcodec_decode_video2(stream_ctx[packet.stream_index].dec_ctx, frame,
                &got_frame, &packet);
        }
        else
        {
            ret = ffmpeg.avcodec_decode_audio4(stream_ctx[packet.stream_index].dec_ctx, frame,
                &got_frame, &packet);
        }
        if (ret < 0)
        {
            ffmpeg.av_frame_free(&frame);
            Console.WriteLine("Decoding failed\n");
            break;
        }
        if (got_frame != 0)
        {
            frame->pts = ffmpeg.av_frame_get_best_effort_timestamp(frame);
            ret = filter_encode_write_frame(frame, (uint)stream_index);
            // SAVE LAST FRAME/PACKET HERE
            ffmpeg.av_frame_free(&frame);
            if (ret < 0)
                goto end;
        }
        else
        {
            ffmpeg.av_frame_free(&frame);
        }
    }
    else
    {
        /* remux this frame without reencoding */
        packet.dts = ffmpeg.av_rescale_q_rnd(packet.dts,
            ifmt_ctx->streams[stream_index]->time_base,
            ofmt_ctx->streams[stream_index]->time_base,
            AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
        packet.pts = ffmpeg.av_rescale_q_rnd(packet.pts,
            ifmt_ctx->streams[stream_index]->time_base,
            ofmt_ctx->streams[stream_index]->time_base,
            AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
        ret = ffmpeg.av_interleaved_write_frame(ofmt_ctx, &packet);
        if (ret < 0)
            goto end;
    }
    ffmpeg.av_free_packet(&packet);
}
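One way to fill in the two commented placeholders is sketched below in plain libav C (the C# bindings mirror these calls); the names remember_packet, send_repeated_packet, last_pkt, frame_duration and next_pts are illustrative only. The idea is to keep a reference to the last packet that was muxed and, while paused, keep writing copies of it with freshly advanced timestamps; muxers require monotonically increasing dts, so the stored pts/dts cannot simply be resent.

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Hypothetical per-stream cache for the "paused" branch: the last packet that
 * was muxed (already rescaled to the output time_base) and the timestamp to
 * stamp on the next repeated copy. All names are illustrative only. */
static AVPacket last_pkt;        /* zero-initialised; holds its own reference */
static int64_t  next_pts;        /* pts/dts for the next repeated packet */
static int64_t  frame_duration;  /* one frame in output time_base units */

/* Call this for every normally remuxed video packet, just before writing it. */
static void remember_packet(const AVPacket *pkt)
{
    av_packet_unref(&last_pkt);
    av_packet_ref(&last_pkt, pkt);        /* shares the payload, no deep copy */
    if (pkt->duration > 0)
        frame_duration = pkt->duration;
    next_pts = pkt->pts + frame_duration;
}

/* Call this at the output frame rate while playback is paused. */
static int send_repeated_packet(AVFormatContext *ofmt_ctx)
{
    AVPacket repeat = { 0 };
    int ret = av_packet_ref(&repeat, &last_pkt);
    if (ret < 0)
        return ret;

    /* Keep the timestamps advancing, otherwise the muxer rejects the packet. */
    repeat.pts = next_pts;
    repeat.dts = next_pts;
    next_pts += frame_duration;

    /* av_interleaved_write_frame() takes ownership of this reference. */
    return av_interleaved_write_frame(ofmt_ctx, &repeat);
}

A caveat: repeatedly decoding the same compressed P-frame on the receiving side can drift, because each copy is predicted from the previous copy’s output, so repeating only keyframes is safer on the remux path. On the re-encode path it is cleaner to cache the last decoded AVFrame and keep feeding it to filter_encode_write_frame while paused, so the encoder produces fresh, consistent packets.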