
Media (1)
-
Video d’abeille en portrait
14 May 2011
Updated: February 2012
Language: French
Type: Video
Other articles (88)
-
Sites built with MediaSPIP
2 May 2011
This page presents some of the sites running MediaSPIP. You can of course add your own via the form at the bottom of the page.
-
MediaSPIP Player: potential problems
22 February 2011
The player does not work in Internet Explorer
On Internet Explorer (at least versions 7 and 8), the plugin uses the Flash player flowplayer to play video and sound. If the player does not seem to work, it may be caused by the configuration of Apache’s mod_deflate module.
If the configuration of that Apache module contains a line resembling the following, try removing or commenting it out to see whether the player then works correctly: (...)
-
List of compatible distributions
26 April 2011
The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
Distribution name    Version name            Version number
Debian               Squeeze                 6.x.x
Debian               Wheezy                  7.x.x
Debian               Jessie                  8.x.x
Ubuntu               The Precise Pangolin    12.04 LTS
Ubuntu               The Trusty Tahr         14.04
If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)
On other sites (11577)
-
Live video streaming with Node.js, HTML5, MPEG-DASH, FFMPEG and an IP camera/Raspberry Pi
12 May 2016, by sparks
I am a programmer, but I am very new to live video streaming concepts. I need help.
What I want to accomplish
I want to develop an online live video streaming system. The scenario: I would have one or more devices (a Raspberry Pi with a camera, or just an IP camera; not sure yet) that capture video and stream it live, in real time, to my web app. Multiple clients can connect to the web app and watch the video live. The key point is that these devices should be wireless (able to connect to the internet and live-stream their content), and I also want to eliminate manually configuring the IP address on the local WiFi router. So I simply turn the device on and it starts streaming to the web app right away.
Infrastructure, Platforms, Browsers, Streaming methods and formats
In the beginning I just want to stream through the Chrome web browser (that is all I care about), but in the future I would build Android and iOS mobile apps. So long term I would expect to target Chrome and mobile (Android and iOS platforms).
So based on my research I learned that the client should be HTML5, the streaming method MPEG-DASH (with HLS added in the future), and the web app will be in Node.js. I also came across Dash.js for HTML5.
My understanding of streaming based on my research
I also came across things like FFMPEG, a DASH encoder and Wowza, which I am not clear about. Now correct me if I am wrong: my understanding is that FFMPEG gets hold of the device/camera and its content (I am not sure of the content's format at this point) and encodes it (I am not sure what this means in plain English), then a DASH encoder picks that up and repackages the content into the MPEG-DASH format, producing an MPD, and then the Dash.js client uses the MPD to display the video in the browser.
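To make that chain concrete, here is a minimal, hedged sketch using FFmpeg's built-in dash muxer; the input file, bitrates and output name are placeholders, and a live camera feed (for example from a Raspberry Pi) would take the place of input.mp4:

# input.mp4 is a placeholder source; a live camera capture would replace it.
# Encode to H.264/AAC and package as MPEG-DASH (an MPD manifest plus media segments).
# A Dash.js player in the browser is then pointed at manifest.mpd served over HTTP.
ffmpeg -re -i input.mp4 \
       -c:v libx264 -b:v 1500k -preset veryfast \
       -c:a aac -b:a 128k \
       -f dash manifest.mpd

In other words, FFmpeg can act both as the encoder and as the DASH packager, so a separate DASH encoder or packager is not strictly required for a basic setup.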
QUESTIONS
-
First, correct me if I am wrong based on my understanding above, or clarify it for me. Also, I am not sure where the Wowza streaming engine comes into play. Do I even need it?
-
I am not sure which device to use: a Raspberry Pi with the camera module, or a WiFi IP camera on its own. I know that with a Raspberry Pi connected to the internet you can set up all the necessary programs and stream the video to the web app directly (not sure about quality and performance), but I am not sure about the WiFi camera. Is it possible to connect to the WiFi camera remotely from the web app programmatically, without opening the WiFi router portal manually, or should I stick with the Raspberry Pi?
-
For the Raspberry Pi, would I be able to connect it to a high-quality IP camera or webcam? (The point here is to get the best picture through the Raspberry Pi.)
My expectations
Better performance and quality would be great. I know live streaming is not easy, so I am willing to compromise on performance to a point, but not on quality.
Thank you in advance; anything will be appreciated. I know this is a lot, so take your time :)
-
How to embed subtitles into an mp4 file using gstreamer
27 August 2021, by Stephen
My Goal


I'm trying to embed subtitles into an mp4 file using the mp4mux gstreamer element.

What I've tried


The pipeline I would like to use is:


GST_DEBUG=3 gst-launch-1.0 filesrc location=sample-nosub-avc.mp4 ! qtdemux ! queue ! video/x-h264 ! mp4mux name=mux reserved-moov-update-period=1000 ! filesink location=output.mp4 filesrc location=english.srt ! subparse ! queue ! text/x-raw,format=utf8 ! mux.subtitle_0



It just demuxes a sample mp4 file for the H.264 stream and then muxes it together with an SRT subtitle file.


The error I get is:


Setting pipeline to PAUSED ...
0:00:00.009958915 1324869 0x5624a8c7a0a0 WARN basesrc gstbasesrc.c:3600:gst_base_src_start_complete:<filesrc0> pad not activated yet
Pipeline is PREROLLING ...
0:00:00.010128080 1324869 0x5624a8c53de0 WARN basesrc gstbasesrc.c:3072:gst_base_src_loop:<filesrc1> error: Internal data stream error.
0:00:00.010129102 1324869 0x5624a8c53e40 WARN qtdemux qtdemux_types.c:239:qtdemux_type_get: unknown QuickTime node type pasp
0:00:00.010140810 1324869 0x5624a8c53de0 WARN basesrc gstbasesrc.c:3072:gst_base_src_loop:<filesrc1> error: streaming stopped, reason not-negotiated (-4)
0:00:00.010172990 1324869 0x5624a8c53e40 WARN qtdemux qtdemux.c:3237:qtdemux_parse_trex:<qtdemux0> failed to find fragment defaults for stream 1
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc1: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc1:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...


My Thoughts


I believe the issue is not related to the above warning, but rather to mp4mux's incompatibility with SRT subtitles.

The reason I believe this is that other debug logs hint at it, and also that stealing the subtitles from another mp4 file and muxing them back in does work:


gst-launch-1.0 filesrc location=sample-nosub-avc.mp4 ! qtdemux ! mp4mux name=mux ! filesink location=output.mp4 filesrc location=sample-with-subs.mp4 ! qtdemux name=demux demux.subtitle_1 ! text/x-raw,format=utf8 ! queue ! mux.subtitle_0



A major catch-22 I am having is that mp4 files don't typically support SRT subtitles, but gstreamer's subparse element doesn't support parsing mp4 subtitle formats (tx3g, ttxt, etc.), so I'm not sure how I'm meant to put it all together.
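For what it's worth, outside GStreamer this particular conversion is commonly done with FFmpeg, whose mov_text subtitle encoder writes MP4-native timed text (tx3g). A minimal sketch, reusing the file names from the pipelines above, in case a non-GStreamer fallback is acceptable:

# Copy the existing audio/video streams and convert the SRT track to MP4 timed text (mov_text/tx3g).
ffmpeg -i sample-nosub-avc.mp4 -i english.srt \
       -c copy -c:s mov_text \
       -metadata:s:s:0 language=eng \
       output.mp4

This does not answer how to do it with mp4mux itself, but the result can serve as a reference for what a correctly muxed tx3g track looks like.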

I'm very sorry for the lengthy question but I've tried many things so it was difficult to condense it. Any hints or help is appreciated. Thank you.


-
iOS video plays with audio/video out of sync on non-iOS devices after trimming
31 August 2015, by gavinHe
After trimming a video, I send the trimmed video to an Android device and play it there; I find the audio/video out of sync, with the audio several seconds behind the video. But the video plays normally on an iOS device.
1. I trim the video with code like this:
- (IBAction)showTrimmedVideo:(UIButton *)sender
{
    [self deleteTmpFile];
    NSURL *videoFileUrl = [NSURL fileURLWithPath:self.originalVideoPath];
    AVAsset *anAsset = [[AVURLAsset alloc] initWithURL:videoFileUrl options:nil];
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
    if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
        self.exportSession = [[AVAssetExportSession alloc] initWithAsset:anAsset
                                                               presetName:AVAssetExportPresetHighestQuality];
        // Implementation continues.
        NSURL *furl = [NSURL fileURLWithPath:self.tmpVideoPath];
        self.exportSession.outputURL = furl;
        self.exportSession.outputFileType = AVFileTypeMPEG4;
        CMTime start = CMTimeMakeWithSeconds(self.startTime, anAsset.duration.timescale);
        CMTime duration = CMTimeMakeWithSeconds(self.stopTime - self.startTime, anAsset.duration.timescale);
        CMTimeRange range = CMTimeRangeMake(start, duration);
        self.exportSession.timeRange = range;
        self.trimBtn.hidden = YES;
        self.myActivityIndicator.hidden = NO;
        [self.myActivityIndicator startAnimating];
        [self.exportSession exportAsynchronouslyWithCompletionHandler:^{
            switch ([self.exportSession status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export failed: %@", [[self.exportSession error] localizedDescription]);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export canceled");
                    break;
                default:
                    NSLog(@"NONE");
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self.myActivityIndicator stopAnimating];
                        self.myActivityIndicator.hidden = YES;
                        self.trimBtn.hidden = NO;
                        [self playMovie:self.tmpVideoPath];
                    });
                    break;
            }
        }];
    }
}
2. I send the trimmed video to the server; the Android device then gets the video from the server, but they find the audio and video out of sync. At first I thought the server was doing something wrong, so I just sent the video to the Android device over USB, and the error was still there.
3. So I analyze the trimmed video with the ffmpeg tool:
ffmpeg -i trimVideo.mp4
Then I find that the start time of trimVideo.mp4 is a negative number.
Here is what ffmpeg prints:
Metadata:
  major_brand     : qt
  minor_version   : 0
  compatible_brands: qt
  creation_time   : 2015-08-29 12:22:13
  encoder         : Lavf56.15.102
Duration: 00:02:21.77, start: -4.692568, bitrate: 359 kb/s
  Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 24000 Hz, stereo, fltp, 69 kb/s (default)
  Metadata:
    creation_time   : 2015-08-29 12:22:13
    handler_name    : Core Media Data Handler
  Stream #0:1(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 512x288 [SAR 1:1 DAR 16:9], 277 kb/s, 15.16 fps, 15.17 tbr, 12136 tbn, 30.34 tbc (default)
  Metadata:
    creation_time   : 2015-08-29 12:22:13
    handler_name    : Core Media Data Handler
    encoder         : 'avc1'
I have been puzzled by this bug for several days. I am sorry for my bad English, and I really need your help. Thanks.
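If the negative start time is indeed what confuses the Android player, one hedged workaround worth trying (outside the iOS export itself, using the same ffmpeg tool as above; fixedVideo.mp4 is a placeholder name) is to remux the trimmed file so its timestamps start at zero:

# Remux without re-encoding and shift timestamps so the file no longer starts at a negative time.
# The make_zero value for -avoid_negative_ts is available in reasonably recent FFmpeg builds.
ffmpeg -i trimVideo.mp4 -c copy -avoid_negative_ts make_zero fixedVideo.mp4

If the remuxed file plays in sync on Android, that would point to the negative start offset as the cause rather than the AVAssetExportSession trim itself.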