
Other articles (32)
-
Submit bugs and patches
13 April 2011
Unfortunately, software is never perfect.
If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information:
- the browser you are using, including the exact version;
- as precise an explanation of the problem as possible;
- if possible, the steps taken that led to the problem;
- a link to the site / page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...) -
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out -
MediaSPIP Player: the controls
26 May 2010
Mouse controls for the player
In addition to the click actions on the visible buttons of the player interface, other actions can be performed with the mouse:
- Click: clicking on the video or on the sound logo starts or pauses playback, depending on its current state;
- Wheel (scrolling): when the mouse hovers over the area used by the media, the mouse wheel no longer has its usual page-scrolling effect, but instead decreases or (...)
On other sites (4768)
-
Trying to grab a video stream from an 802W device
1 June 2015, by brentil
A group of us on the RC hobby forums started trying to use a device called the 802W; it takes RCA in and broadcasts it back out over a Wi-Fi network that you connect to with an Android or iOS device. These devices are typically used as backup-camera add-ons for vehicles. We want to use one to do FPV (First Person Video/View) with smartphones instead of buying more expensive FPV goggles.
802W device example (plenty of clones online)
http://www.amazon.com/Wireless-Backup-Camera-Transmitter-Android/dp/B00LJPTJSY
The problem is that you can only connect to it with their WIFI_AVIN or WIFI_AVIN2 applications from the app stores, because they don't publish any information about how to grab the stream data. We want to write our own apps that can use the stream to better present the information. We've tried using VLC to grab the stream from an Android phone or a Windows PC, but we've had no success so far. I was hoping someone could look at the Wireshark outputs and understand what they're looking at better than I do. I "think" it's a UDP multicast being broadcast, but I just don't know enough to be sure. We've tried using VLC to connect to network streams directly on the device or to udp://@ style addresses, but I think part of the issue might be that we're missing the file path of the stream.
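If it really is a UDP multicast, ffmpeg's own tools can be pointed straight at the group address seen in Wireshark. The lines below are only a sketch: the multicast address 239.255.0.1 and port 5000 are placeholders for whatever the capture actually shows, and they assume the payload is a plain MPEG-TS stream with no extra path component.
# probe the suspected multicast group/port (placeholder address and port)
ffprobe -v verbose "udp://239.255.0.1:5000"
# or try playing it directly; the @ form tells ffmpeg to bind and listen on that address
ffplay -fflags nobuffer "udp://@239.255.0.1:5000"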
Attempting to reverse engineer their code for learning purposes showed that ffmpeg sits inside a compiled .so library, which also seems to be where the actual connection code lives; we were unable to dig into it.
In the images 192.168.72.33 is my phone and 192.168.72.173 is the 802W device.
Image of what I believe is a UDP broadcast of the video information.
This is what the stream turns into when the device connects using the WIFI_AVIN application.
-
Using ffmpeg to stream a static image and audio input over RTMP [on hold]
30 October 2017, by Chad
Using a Raspberry Pi, I want to stream audio in and use a static image as the video input, through ffmpeg over RTMP, to a cloud video provider (DaCast in this instance).
So far, I've gone through many blog posts, Stack Overflow questions, and package documentation. I've found that most of the posts are no longer valid with newer versions of ffmpeg, or don't quite line up with what I am trying to achieve.
However, I have figured out the right settings to stream the Raspberry Pi Camera v2 with the audio in.
ffmpeg \
  -f alsa -ac 1 -i plughw:1,0 \
  -f v4l2 -s 1920x1080 -r 30 -input_format h264 -i /dev/video0 \
  -vcodec copy -preset veryfast -r 15 -g 30 -b:v 64k -ar 44100 -threads 6 -b:a 96k -bufsize 3000k \
  -f flv rtmp://streaming_server_url
But I can't seem to get it right when replacing the video input with a static image.
I have tried removing the 3rd line and adding
-loop 1 -i '/path/to/image.jpg'
The logs look like:
[alsa @ 0x55e4b980] Thread message queue blocking; consider raising the thread_queue_size option (current value: 1024)
[alsa @ 0x55e4b980] ALSA buffer xrun.    0kB time=00:00:00.00 bitrate=N/A speed=   0x
[alsa @ 0x55e4b980] ALSA buffer xrun.  130kB time=00:00:00.27 bitrate=3822.5kbits/s speed=0.0403x
...
I have also tried looping a 4 second video, with similar outcomes.
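For reference, here is a minimal sketch of what the image-plus-audio variant might look like, built from the working command above. The image path and streaming URL are the placeholders from the question, the 2 fps image rate is an assumption, the video is re-encoded with libx264 because a still image cannot be stream-copied, and -thread_queue_size is added only because the queue/xrun warnings above suggest the audio input is being starved; none of this is a verified fix.
ffmpeg \
  -thread_queue_size 1024 -f alsa -ac 1 -i plughw:1,0 \
  -loop 1 -framerate 2 -i /path/to/image.jpg \
  -c:v libx264 -preset veryfast -tune stillimage -pix_fmt yuv420p -g 30 -b:v 64k \
  -c:a aac -ar 44100 -b:a 96k \
  -bufsize 3000k -f flv rtmp://streaming_server_url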
My setup, for context:
- Raspberry Pi 3 Model B
- USB Audio Device (Sabrent USB External Stereo Sound Adapter)
- Ubuntu MATE 16.04.2 (Xenial)
- ffmpeg version 3.2-2+rpi1 xenial1.7 (I can post what is configured with the build, if needed)
-
Using ffmpeg with Flash Media Server and HDS
20 April 2012, by Jonathan
I want to use ffmpeg to encode and publish a live stream to Flash Media Server. In order to support iOS devices, I need to implement HTTP Live Streaming as well. The video needs to be in H.264 format and the audio should be AAC. I don't have much experience working with ffmpeg, and I'm having a hard time getting this to work. This is the command that I've tried (with some variations as well):
ffmpeg.exe -threads 15 -f dshow -i video="USB2.0 UVC WebCam":audio="Microphone (Realtek High Defini" \
-map_channel 0.1.1 -r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 \
-s vga -vb 100k -f flv "rtmp:///livepkgr/livestream1?adbe-live-event=liveevent" \
-r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 -s qvga -vb 200k \
-f flv "rtmp:///livepkgr/livestream2?adbe-live-event=liveevent" \
-r 24 -acodec libvo_aacenc -ar 22050 -ab 128k -vcodec libx264 -s vga -vb 350k \
-f flv "rtmp:///livepkgr/livestream3?adbe-live-event=liveevent"

When I run this, it appears to connect to FMS, but then I get a lot of error messages about dropped frames; I'm not sure if ANY frames get encoded successfully. My CPU usage is very high as well. I get a 404 error from FMS when I enter the URL of the *.m3u8 file for one of the individual streams (the main livestream.m3u8 file is accessible, though). I have also tried outputting to a file instead of FMS, with no success. All I get is some very garbled sound and no video.
Any suggestions for what options/commands I should use to get this working? Is anyone using ffmpeg with FMS to do HTTP Dynamic Streaming / HLS with MP4 video? I've been struggling to get HDS/HLS working for some time now, and any help would be much appreciated! It shouldn't make a difference, but I'm using FMS on Amazon EC2 with their AMI image.
Thanks!
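As a debugging aid only, here is a reduced single-rendition sketch of the command above: it keeps the original input device names and the elided rtmp:/// host, drops the extra renditions and the -map_channel option, and adds a fixed keyframe interval (-g 48, i.e. one keyframe every two seconds at 24 fps), since segment-based HDS/HLS packaging generally wants regular keyframes. It is a starting point for isolating the problem, not a verified fix.
ffmpeg -f dshow -i video="USB2.0 UVC WebCam":audio="Microphone (Realtek High Defini" \
  -vcodec libx264 -preset veryfast -r 24 -g 48 -s vga -vb 350k \
  -acodec libvo_aacenc -ar 22050 -ab 128k \
  -f flv "rtmp:///livepkgr/livestream1?adbe-live-event=liveevent"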