
Media (91)
- Chuck D with Fine Arts Militia - No Meaning No (15 September 2011; updated September 2011; language: English; type: Audio)
- Paul Westerberg - Looking Up in Heaven (15 September 2011; updated September 2011; language: English; type: Audio)
- Le Tigre - Fake French (15 September 2011; updated September 2011; language: English; type: Audio)
- Thievery Corporation - DC 3000 (15 September 2011; updated September 2011; language: English; type: Audio)
- Dan the Automator - Relaxation Spa Treatment (15 September 2011; updated September 2011; language: English; type: Audio)
- Gilberto Gil - Oslodum (15 September 2011; updated September 2011; language: English; type: Audio)
Other articles (75)
-
User profiles
12 April 2011
Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised; it is visible only when the visitor is logged in to the site.
The user can reach profile editing from their author page; an "Edit your profile" link in the navigation is (...)
-
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, go to the "Administrer" (administration) area of the site.
From there, the navigation menu gives access to a "Gestion des langues" (language management) section where support for new languages can be enabled.
Each newly added language can still be disabled as long as no object has been created in that language; otherwise it is greyed out in the configuration and (...)
-
XMP PHP
13 May 2011
As Wikipedia puts it, XMP stands for:
Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
XMP makes it possible to store information about a file as an XML document: title, author, history (...)
On other sites (6966)
-
Trying to grab video stream from an 802W device
1 June 2015, by brentil
A group of us in the RC hobby forums have started trying to use a device called the 802W; it takes RCA in and then broadcasts it back out over a Wi-Fi network you connect to via an Android or iOS device. These devices are typically used for backup-camera add-on systems for vehicles. We want to use one to do FPV (First Person Video/View) using smartphones instead of buying more expensive FPV goggles.
802W device example (plenty of clones online)
http://www.amazon.com/Wireless-Backup-Camera-Transmitter-Android/dp/B00LJPTJSY
The problem is that you can only connect to it with their WIFI_AVIN or WIFI_AVIN2 applications from the app stores, because they don't publish any information about how to grab the stream data. We want to write our own apps that can use the stream to present the information better. We've tried using VLC to grab the stream from an Android phone or a Windows PC, but with no success so far. I was hoping someone could look at the Wireshark outputs and understand what they show better than I do. I "think" it's a UDP multicast being broadcast, but I just don't know enough to be sure. We've tried using VLC to connect to network streams directly on the device and to udp://@-style addresses, but I think part of the issue might also be that we're missing the path of the stream file.
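For reference, a rough sketch of the kind of probing this involves, assuming the device really is pushing its video over UDP multicast (the group address and port below are placeholders, not values confirmed from the capture):

# Inspect whatever arrives on a suspected multicast group/port (hypothetical address)
ffprobe udp://@239.255.42.42:5004
# Or try to play it directly with minimal buffering
ffplay -fflags nobuffer udp://@239.255.42.42:5004
# VLC accepts the same style of URL
vlc udp://@239.255.42.42:5004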
Attempting to reverse engineer their code for learning purposes showed that ffmpeg sits inside a compiled .so library, which also seems to be where the actual connection code lives; we were unable to dig into it.
In the images, 192.168.72.33 is my phone and 192.168.72.173 is the 802W device.
[Screenshot] What I believe is a UDP broadcast of the video information.
[Screenshot] What the stream turns into when the device connects using the WIFI_AVIN application.
-
PipedInputStream / PipedOutputStream, ImageIO and ffmpeg
19 April 2015, by jdevelop
I have the following code in Scala:
val pos = new PipedOutputStream()
val pis = new PipedInputStream(pos)
Future {
LOG.trace("Start rendering")
generateFrames(videoRenderParams.length) {
img ⇒ ImageIO.write(img, "PNG", pos)
}
pos.flush()
IOUtils.closeQuietly(pos)
LOG.trace("Finished rendering")
} onComplete {
case Success(_) ⇒
LOG.trace("Complete successfully")
case Failure(err) ⇒
LOG.error("Can't render stuff", err)
IOUtils.closeQuietly(pis)
IOUtils.closeQuietly(pos)
}
val prc = (ffmpegCli #< pis).!(logger)
The Future simply writes the generated images one by one to the OutputStream. The ffmpeg process then reads the input images from stdin and converts them to an MP4 file.
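For context, ffmpegCli itself is not shown above; a plausible shape for it, assuming the PNG frames are fed over stdin with image2pipe (the options below are an illustrative guess, not the actual invocation used here), would be something like:

# Hypothetical ffmpeg command behind ffmpegCli: read PNG frames from stdin, encode to MP4
ffmpeg -f image2pipe -vcodec png -r 25 -i - -c:v libx264 -pix_fmt yuv420p out.mp4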
That works pretty well, but for some reason I sometimes get the following stack traces:
I/O error Pipe closed for process: <input stream>
java.io.IOException: Pipe closed
at java.io.PipedInputStream.checkStateForReceive(PipedInputStream.java:260)
at java.io.PipedInputStream.receive(PipedInputStream.java:226)
at java.io.PipedOutputStream.write(PipedOutputStream.java:149)
at scala.sys.process.BasicIO$.loop$1(BasicIO.scala:236)
at scala.sys.process.BasicIO$.transferFullyImpl(BasicIO.scala:242)
at scala.sys.process.BasicIO$.transferFully(BasicIO.scala:223)
at scala.sys.process.ProcessImpl$PipeThread.runloop(ProcessImpl.scala:159)
at scala.sys.process.ProcessImpl$PipeSource.run(ProcessImpl.scala:179)
At the same time I'm getting the following error from another stream:
javax.imageio.IIOException: I/O error writing PNG file!
at com.sun.imageio.plugins.png.PNGImageWriter.write(PNGImageWriter.java:1168)
at javax.imageio.ImageWriter.write(ImageWriter.java:615)
at javax.imageio.ImageIO.doWrite(ImageIO.java:1612)
at javax.imageio.ImageIO.write(ImageIO.java:1578)
at ...
So it seems that the streams were broken somewhere in between, so ffmpeg cannot read the data and ImageIO cannot write it.
What is even more interesting: the problem is reproducible only on a certain Linux server (Amazon); it works flawlessly on other Linux boxes. So I wonder if somebody could point me to the possible causes of this error.
What I've tried so far:
- use Oracle JDK 8 and OpenJDK
- use different versions of FFmpeg
Nothing has worked so far.
-
Muxing in audio to a gstreamer RTMP stream kills both video and audio
1 April 2015, by Adam
I need some genius help here: I'm trying to set up a live stream for my upcoming wedding... and I have it ALMOST working; audio seems to be the problem.
This is my setup:
- Raspberry Pi Model B+
- Logitech C920 (with onboard h264 encoding that I am utilising)
- on-camera (C920) microphone
- USB wifi to iPhone 4G connection
- gstreamer1.0
- Amazon EC2 Wowza RTMP server
I have it all set up, but as soon as I mux in the audio, the streams won't play in any player.
What works:
- my gstreamer pipeline WITHOUT the audio muxed in
- Wowza receives a consistent stream, no failures
- the various Flash players / iOS / Android and VLC all play back the video
What doesn't:
- enabling audio in the mux (using the pipeline below)
- BUT gstreamer doesn't complain
- BUT Wowza receives a consistent stream, no failures
- the various Flash players fail to play both audio and video; some just display the first video frame
- VLC plays one video frame and about 100 ms of audio, then stops
Ideally I'd like the muxed audio/video FLV stored on the SD card too in case the network goes down, but if the 'tee' needs to be sacrificed to make it work, so be it.
This is my current FAILING pipeline. I assume there's something really stupid in it, because I know practically nothing about gstreamer. The first frame loads in all the players (except iOS, which never shows anything).
# set camera resolution to 720p, and the data format to H264 (alternatives are YUV and JPG)
v4l2-ctl --device=/dev/video0 --set-fmt-video=width=1280,height=720,pixelformat=1
# set the frame rate
v4l2-ctl --device=/dev/video0 --set-parm=10
gst-launch-1.0 -v -e uvch264src initial-bitrate=300000 average-bitrate=300000 device=/dev/video0 name=src auto-start=true src.vidsrc \
! queue \
! video/x-h264,width=1280,height=720,framerate=10/1 \
! h264parse \
! flvmux streamable=true name=mux \
! queue \
! tee name=t \
! queue \
! filesink location=/home/pi/wedding.flv t. \
! queue \
! rtmpsink location='rtmp://wowzaserver/live/wedding live=1' >>/home/pi/wedding.log 2>&1
Some of the things I can't really afford to change at this late stage are the encapsulation (FLV) and Wowza RTMP, because I've built everything around that...
Please help!! Thanks!
UPDATE
Given that I am also saving the FLV file, I have found that if I use ffmpeg to send that FLV file (using audio copy, video copy) to the RTMP server, everything works (but obviously it's not live)! So I am now starting to believe this is a problem with the way gstreamer encapsulates RTMP, and that putting ffmpeg in the middle fixes it... but of course it's not live.
Is it possible to pipe my output to ffmpeg and use ffmpeg's RTMP output?
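A rough, untested sketch of that idea, assuming the GStreamer side can write the muxed FLV to stdout and ffmpeg then does the RTMP publishing with a plain stream copy (the element wiring and options are illustrative, not a verified pipeline):

# Mux in gstreamer, write FLV to stdout (-q keeps status messages off stdout),
# then let ffmpeg copy the streams and push them to the RTMP server.
gst-launch-1.0 -q -e uvch264src device=/dev/video0 name=src auto-start=true src.vidsrc \
 ! queue ! video/x-h264,width=1280,height=720,framerate=10/1 \
 ! h264parse ! flvmux streamable=true \
 ! fdsink fd=1 \
 | ffmpeg -f flv -i - -c copy -f flv 'rtmp://wowzaserver/live/wedding'

An audio source would be linked into flvmux as a second input, and the tee/filesink branch could stay on the GStreamer side if local recording is still wanted.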