
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (86)
-
Customise it by adding your logo, banner or background image
5 September 2013, by
Some themes take three customisation elements into account: adding a logo; adding a banner; adding a background image.
-
MediaSPIP v0.2
21 June 2013, by
MediaSPIP 0.2 is the first stable version of MediaSPIP.
Its official release date is 21 June 2013, and it is announced here.
The zip file provided here contains only the MediaSPIP sources, in the standalone version.
As with the previous version, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make other modifications (...) -
Making the files available
14 April 2011, by
By default, when it is first set up, MediaSPIP does not let visitors download the files, whether they are originals or the result of their transformation or encoding; it only lets them be viewed.
However, it is possible, and easy, to give visitors access to these documents in various forms.
All of this happens on the skeleton configuration page: go to the channel's administration area and choose in the navigation (...)
On other sites (11909)
-
(iOS) Streaming raw H.264 video from an iPhone to an RTSP server
10 October 2015, by Eric Jovenitti
OK, so I'm new to video encoding, decoding and RTSP, but I need to take a third-party component's video stream (a quadcopter drone) and push that stream out over RTSP. The stream is raw H.264 and comes from the drone's SDK, i.e.:
// DJI SDK delegate callback: each call delivers a chunk of the raw H.264 stream.
- (void)camera:(DJICamera *)camera didReceivedVideoData:(uint8_t *)videoBuffer length:(int)length {
    // Copy the SDK-owned buffer and hand it to the previewer's queue.
    uint8_t *pBuffer = (uint8_t *)malloc(length);
    memcpy(pBuffer, videoBuffer, length);
    [[[VideoPreviewer instance] dataQueue] push:pBuffer length:length];
}
As you can see, the raw video arrives from the camera as a uint8_t buffer.
To my understanding this is a raw H.264 stream made up of NAL units, including IDR frames. I want to stream this video over RTSP so it can be watched in VLC.
Does anyone know of tutorials or demo code for streaming this video via RTSP?
Additionally, I found this demo: http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html
It sends the iPhone camera's video out over RTSP and works well.
But I can't figure out how to inject the drone's streaming video into this demo RTSP app. The demo uses AVCaptureDevice to capture the camera's streaming data, so my questions about it are:
Is there any way to expose my drone's stream as an AVCaptureDevice?
If not, can I inject the drone's raw data where the iPhone camera data would normally go, and reuse the properties the iPhone camera has already configured? If neither of those options works, can I just use the demo's RTSP part?
If that's possible: I can see that the iPhone camera data is being encoded. Does this mean I would need to encode the drone's stream, and if so, how would I go about it? The drone's SDK uses FFmpeg to output the video so it can be seen in real time on the iPhone's screen. Thanks so much for the help!
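One way to sanity-check the SDK output before touching RTSP at all is to dump the pushed buffers to a file and inspect them on a desktop machine. The following is only a sketch: the dump file name and the URL are placeholders, and it assumes the buffers really are Annex-B H.264 NAL units and that an RTSP server is already listening at the given address.
ffplay -f h264 drone_dump.h264
ffmpeg -re -f h264 -i drone_dump.h264 -c copy -f rtsp rtsp://127.0.0.1:8554/drone
If ffplay cannot decode the dump, the buffers are not plain Annex-B H.264 and would need repackaging before any of the RTSP approaches above can work.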
-
Using the FFmpeg library with the iPhone SDK for video encoding
21 June 2015, by user203349
I need to encode several pictures shot by the iPhone camera into an mp4 video file, and I know FFmpeg can do this (the applications TimeLapser and ReelMoments already do it). I plan to use this in my app iMotion (available in the App Store).
I successfully installed and compiled FFmpeg for the iPhone SDK using this link:
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-October/076618.html
But now I'm stuck in my Xcode project. What should I do next to use the FFmpeg library for video encoding? The Apple documentation on using external libraries is very light, and I can't find any tutorial on the web which explains how to do this.
If anybody can help me, I would be very grateful.
Thanks
Aurelien Potier
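For reference, the operation described here, turning a numbered sequence of pictures into an mp4, looks roughly like the following with a recent ffmpeg command-line build; the file-name pattern, frame rate and output name are placeholders, and the libavformat/libavcodec calls inside an app would perform the same steps programmatically.
ffmpeg -framerate 25 -i picture_%04d.jpg -c:v libx264 -pix_fmt yuv420p out.mp4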
-
FFmpeg-pushed RTMP stream not working on Android & iPhone
1 December 2015, by BlackDivine
I have to make a semi-live stream. I used the nginx-rtmp module and then pushed content to it via ffmpeg using:
ffmpeg -re -i content.mp4 -r 25 -f flv "rtmp://rtmp.server.here"
The stream runs fine when I open it in VLC from "rtmp://rtmp.server.here".
But I also have to make iPhone and Android apps that play these streams, and that's the problem: the stream doesn't work on Android or iPhone.
If I use Wowza Streaming Cloud instead of my own nginx-rtmp server, the same apps written for Android & iPhone can play back the stream just fine.
So either nginx-rtmp is not working right, or it's something else. I've also tried crtmpserver and the same thing happens.
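For what it is worth, one variable to rule out is the encoding itself: mobile players are usually stricter than VLC about the H.264 profile/level and the audio codec. A push along these lines, shown only as a sketch with the question's placeholder URL and illustrative bitrates, re-encodes to baseline H.264 with AAC audio (depending on the build, the AAC encoder may need -strict experimental or libfdk_aac):
ffmpeg -re -i content.mp4 -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p -c:a aac -ar 44100 -b:a 128k -f flv "rtmp://rtmp.server.here"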
What I want to achieve:
I have to develop a system where we can upstream a TV channel (we have the rights for it) to a server and then build a website, an Android app and an iPhone app so consumers can watch the live channel. I have some idea of the uploading part, probably a TV tuner card and Open Broadcaster Software streaming to the server, but the live playback side is new to me.
UPDATE: I also used ffprobe and here is the output (see the last line):
munir@munir-HP-ProBook-450-G2:~$ ffprobe rtmp://rtmp.server.here
ffprobe version 2.6.2 Copyright (c) 2007-2015 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvo-aacenc --enable-libvidstab
libavutil 54. 20.100 / 54. 20.100
libavcodec 56. 26.100 / 56. 26.100
libavformat 56. 25.101 / 56. 25.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 11.102 / 5. 11.102
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 1.100 / 1. 1.100
libpostproc 53. 3.100 / 53. 3.100
[flv @ 0x267cc60] Stream discovered after head already parsed
    Last message repeated 1 times
Input #0, flv, from 'rtmp://stage.funworldpk.com/live':
  Metadata:
    Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
    displayWidth    : 320
    displayHeight   : 240
    fps             : 20
    profile         :
    level           :
  Duration: 00:00:00.00, start: 288.763000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p, 320x240 [SAR 1:1 DAR 4:3], 20 fps, 20 tbr, 1k tbn, 40 tbc
    Stream #0:1: Data: none
    Stream #0:2: Audio: aac (LC), 22050 Hz, stereo, fltp
Unsupported codec with id 0 for input stream 1
Update 2:
I got my stream working by using a licensed copy of the Wowza streaming server. Everything works now. But obviously this will not be an option for everyone, which is why I am not posting it as an answer.