
Media (1)
-
Bee video in portrait
14 May 2011
Updated: February 2012
Language: French
Type: Video
Other articles (63)
-
HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
HTML5 audio and video support
10 April 2011. MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash fallback is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
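For context, delivering to HTML5-capable browsers generally means providing an H.264/AAC MP4 rendition (with WebM as an optional fallback). A generic ffmpeg sketch of such an encode; the file names are placeholders, and these are not MediaSPIP's documented settings:

    # H.264/AAC MP4: the combination most widely supported by the HTML5 video tag.
    # input.mov and output.mp4 are placeholder names.
    ffmpeg -i input.mov \
      -c:v libx264 -preset medium -crf 23 -pix_fmt yuv420p \
      -c:a aac -b:a 128k \
      -movflags +faststart \
      output.mp4

    # Optional WebM (VP8/Vorbis) rendition for browsers without H.264 support.
    ffmpeg -i input.mov -c:v libvpx -b:v 1M -c:a libvorbis output.webm

The -movflags +faststart option moves the MP4 index to the front of the file so playback can begin before the download completes, which matters for progressive HTTP delivery.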
-
List of compatible distributions
26 April 2011. The table below lists the Linux distributions compatible with MediaSPIP's automated installation script.

Distribution | Version name         | Version number
Debian       | Squeeze              | 6.x.x
Debian       | Wheezy               | 7.x.x
Debian       | Jessie               | 8.x.x
Ubuntu       | The Precise Pangolin | 12.04 LTS
Ubuntu       | The Trusty Tahr      | 14.04

If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)
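Before running the installation script, it may help to confirm which distribution and release a machine reports. A small sketch; it assumes /etc/os-release exists, which holds on recent releases but may not on the oldest ones listed, where lsb_release is the alternative:

    #!/bin/sh
    # Print the distribution name and version so they can be checked
    # against the compatibility table above.
    . /etc/os-release
    echo "Distribution: $NAME"
    echo "Version:      $VERSION_ID"

    # On systems without /etc/os-release, lsb_release reports the same details:
    lsb_release -a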
On other sites (9144)
-
Using ffmpeg with Flash Media Server and HDS
20 April 2012, by Jonathan. I want to use ffmpeg to encode and publish a live stream to Flash Media Server. In order to support iOS devices, I need to implement HTTP Live Streaming as well. The video needs to be in H.264 format and the audio should be AAC. I don't have much experience working with ffmpeg, and I'm having a hard time getting this to work. This is the command that I've tried (along with some variations):
    ffmpeg.exe -threads 15 -f dshow -i video="USB2.0 UVC WebCam":audio="Microphone (Realtek High Defini" \
      -map_channel 0.1.1 -r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 \
      -s vga -vb 100k -f flv "rtmp:///livepkgr/livestream1?adbe-live-event=liveevent" \
      -r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 -s qvga -vb 200k \
      -f flv "rtmp:///livepkgr/livestream2?adbe-live-event=liveevent" \
      -r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 -s vga -vb 350k \
      -f flv "rtmp:///livepkgr/livestream3?adbe-live-event=liveevent"

When I run this, it appears to connect to FMS, but then I get a lot of error messages about dropped frames - I'm not sure if ANY frames get encoded successfully. My CPU usage is very high as well. I get a 404 error from FMS when I enter the URL of the *.m3u8 file for one of the individual streams (the main livestream.m3u8 file is accessible, though). I have also tried outputting to a file instead of FMS, with no success. All I get is some very garbled sound and no video.
Any suggestions for what options/commands I should use to get this working? Is anyone using ffmpeg with FMS to do HTTP Dynamic Streaming / HLS with MP4 video? I've been struggling to get HDS/HLS working for some time now, and any help would be much appreciated! It shouldn't make a difference, but I'm using FMS on Amazon EC2 with their AMI image.
Thanks!
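For reference, a hedged sketch of how one rendition of this publish might look with current ffmpeg option names; the FMS hostname is a placeholder (the original command leaves it blank), and in current ffmpeg builds the native aac encoder is the usual replacement for libvo_aacenc:

    # Sketch only: fms.example.com is a placeholder host. Device names must match
    # what `ffmpeg -list_devices true -f dshow -i dummy` reports on the machine.
    ffmpeg -f dshow -i video="USB2.0 UVC WebCam":audio="Microphone (Realtek High Defini" \
      -c:v libx264 -preset veryfast -r 24 -g 48 -s 640x480 -b:v 350k \
      -c:a aac -ar 44100 -b:a 128k \
      -f flv "rtmp://fms.example.com/livepkgr/livestream1?adbe-live-event=liveevent"

Each additional bitrate rendition repeats the encoder options before its own -f flv output, as in the original command. The -g 48 value places a keyframe every two seconds at 24 fps, which gives the packager clean segment boundaries for HDS/HLS.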
-
Using ffmpeg rtmp stream a static image and audio input [on hold]
30 October 2017, by Chad. Using a Raspberry Pi, I want to stream audio in and use a static image as the video input, through ffmpeg over RTMP, to a cloud video provider (DaCast in this instance).
So far, I've gone through many blog posts, Stack Overflow questions, and package documentation. I've found that most of the posts are no longer valid with newer versions of ffmpeg, or don't quite line up with what I am trying to achieve.
However, I have figured out the right settings to stream the Raspberry Pi Camera v2 together with the audio input:
    ffmpeg \
      -f alsa -ac 1 -i plughw:1,0 \
      -f v4l2 -s 1920x1080 -r 30 -input_format h264 -i /dev/video0 \
      -vcodec copy -preset veryfast -r 15 -g 30 -b:v 64k -ar 44100 -threads 6 -b:a 96k -bufsize 3000k \
      -f flv rtmp://streaming_server_url
But I can't seem to get it right when replacing the video input with a static image.
I have tried removing the 3rd line and adding
-loop 1 -i '/path/to/image.jpg'
The logs look like:
    [alsa @ 0x55e4b980] Thread message queue blocking; consider raising the thread_queue_size option (current value: 1024)
    [alsa @ 0x55e4b980] ALSA buffer xrun.    0kB time=00:00:00.00 bitrate=N/A speed= 0x
    [alsa @ 0x55e4b980] ALSA buffer xrun.  130kB time=00:00:00.27 bitrate=3822.5kbits/s speed=0.0403x
    ...
I have also tried looping a 4 second video, with similar outcomes.
My setup, for context:
- Raspberry Pi 3 Model B
- USB Audio Device (Sabrent USB External Stereo Sound Adapter)
- Ubuntu MATE 16.04.2 (Xenial)
- ffmpeg version 3.2-2+rpi1 xenial1.7 (I can post what is configured with the build, if needed)
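A minimal sketch of one way to combine a looped still image with the ALSA input; the output URL is a placeholder and the bitrates are guesses. Note that -vcodec copy from the camera command cannot be reused here, since the JPEG has to be encoded to H.264, and raising -thread_queue_size addresses the warning in the log above:

    # Sketch only: streaming.example.com/app/stream_key is a placeholder ingest URL.
    ffmpeg \
      -f alsa -ac 1 -thread_queue_size 4096 -i plughw:1,0 \
      -loop 1 -framerate 2 -i /path/to/image.jpg \
      -c:v libx264 -tune stillimage -pix_fmt yuv420p -r 15 -g 30 -b:v 200k \
      -c:a aac -ar 44100 -b:a 96k -bufsize 3000k \
      -f flv "rtmp://streaming.example.com/app/stream_key"

The -loop 1 and -framerate 2 options are input options for the image, so they must appear before its -i; -tune stillimage and a low input frame rate keep the x264 workload small, which matters on a Pi 3, and -pix_fmt yuv420p avoids the JPEG's native pixel format, which many Flash/RTMP players cannot display.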
-
Trying to grab a video stream from an 802W device
1 June 2015, by brentil. A group of us on the RC hobby forums have started trying to use a device called the 802W; it takes RCA in and then broadcasts it back out over WiFi, which you connect to with an Android or iOS device. These devices are typically used as add-on backup-camera systems for vehicles. We want to use one for FPV (First Person Video/View), using smartphones instead of buying more expensive FPV goggles.
802W device example (plenty of clones online)
http://www.amazon.com/Wireless-Backup-Camera-Transmitter-Android/dp/B00LJPTJSY
The problem is that you can only connect to it with their WIFI_AVIN or WIFI_AVIN2 application from the app stores, because they don't publish information about how to grab the stream data. We want to write our own apps that can use the stream to better show the information. We've tried using VLC to grab the stream from an Android phone or a Windows PC, but we've had no success so far. I was hoping someone could look at the Wireshark outputs and might understand what they're looking at better than I do. I "think" it's a UDP multicast broadcast, but I just don't know enough to be sure. We've tried using VLC to connect to network streams directly on the device, or via udp://@-style addresses, but part of the issue might also be that we're missing the file path of the stream.
Attempting to reverse engineer their code for learning purposes showed that ffmpeg is embedded in a compiled .so library, which also seems to be where the actual connection code lives; we were unable to dig into it.
In the images 192.168.72.33 is my phone and 192.168.72.173 is the 802W device.
Image of what I believe is a UDP broadcast of the video information.
This is what the stream turns into when the device connects using the WIFI_AVIN application.
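If the Wireshark capture really does show a UDP multicast, one way to test is to point ffprobe or ffplay at the group address and port seen in the capture. A sketch; the group and port below are made up and must be replaced with the destination address and port from Wireshark:

    # 239.255.42.42:5000 is a placeholder; substitute the multicast destination
    # address and port observed in the Wireshark capture.
    ffprobe -v verbose "udp://@239.255.42.42:5000"

    # If ffprobe identifies a stream, try playing it directly:
    ffplay "udp://@239.255.42.42:5000"

If ffprobe can identify codecs on that address, the same udp://@host:port URL should also open in VLC, which would confirm the stream is plain multicast rather than something negotiated by the vendor's library.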