
Other articles (56)
-
Managing creation and editing rights for objects
8 February 2011, by
By default, many features are limited to administrators, but they remain independently configurable so that the minimum status required to use them can be changed, in particular: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;
-
Supporting all media types
13 April 2011, by
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
Uploading media and themes via FTP
31 May 2013, by
MediaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
From the start, you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the website's cache directory; themes/: themes and custom stylesheets; tmp/: working folder (...)
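Since the deposit is plain FTP, it can also be scripted. As a minimal sketch (assuming libcurl; the host name, credentials and file names are placeholders to replace with your own site's access details), the following program uploads a stylesheet into the themes/ folder described above:

/* Hypothetical example: upload a custom stylesheet to the themes/ folder of a
 * MediaSPIP site over FTP using libcurl. Host, credentials and file names are
 * placeholders. Build with: gcc ftp_upload.c -lcurl */
#include <stdio.h>
#include <curl/curl.h>

int main(void) {
    FILE *src = fopen("my_theme.css", "rb");        /* local file to deposit */
    if (!src) { perror("my_theme.css"); return 1; }

    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (curl) {
        /* Destination inside the FTP space described above (placeholder URL). */
        curl_easy_setopt(curl, CURLOPT_URL,
                         "ftp://ftp.example.org/themes/my_theme.css");
        curl_easy_setopt(curl, CURLOPT_USERPWD, "user:password");
        curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);  /* upload instead of download */
        curl_easy_setopt(curl, CURLOPT_READDATA, src);

        CURLcode res = curl_easy_perform(curl);
        if (res != CURLE_OK)
            fprintf(stderr, "upload failed: %s\n", curl_easy_strerror(res));
        curl_easy_cleanup(curl);
    }
    fclose(src);
    curl_global_cleanup();
    return 0;
}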
On other sites (9430)
-
IP camera on my LAN streaming video using RTSP with ffmpeg in Android
16 July 2012, by LMDS
I have an IP camera on my LAN streaming video using RTSP. I have been able to capture and display it successfully using MediaPlayer or VideoView, but now I need to do it for more than one stream.
After doing some research I found out that I have to use ffmpeg to decode/encode the video stream. I can compile the ffmpeg libraries and I know how to use the NDK (this link was very helpful: http://marakana.com/s/introduction_to_ndk,1153/index.html).
So what I would like to achieve is to do this in C/C++ using the ffmpeg library.
Let me phrase two simple questions:
How do I receive the stream in a C/C++ program using the FFmpeg library? (Just point me to some URL/tutorial, as Google was not helpful.)
How do I display the received video in Android? (Same here, a good URL to direct me would do.)
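For the first question, a minimal sketch of the usual libavformat pattern is shown below: open the RTSP URL, locate the video stream, then pull packets (decoding would follow with libavcodec). The camera URL and the packet count are placeholders, and the build line assumes FFmpeg libraries that have already been compiled for the target.

/* Sketch: receive an RTSP stream with libavformat and read packets from it.
 * The camera URL is a placeholder. Desktop test build, for example:
 *   gcc rtsp_read.c -lavformat -lavcodec -lavutil
 * The same code can be cross-compiled for Android with the NDK once the
 * FFmpeg libraries have been built for that platform. */
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void) {
    const char *url = "rtsp://192.168.1.10:554/stream1";   /* placeholder */
    AVFormatContext *fmt = NULL;
    AVDictionary *opts = NULL;

    avformat_network_init();                         /* required for network input */
    av_dict_set(&opts, "rtsp_transport", "tcp", 0);  /* TCP is often more robust */

    if (avformat_open_input(&fmt, url, NULL, &opts) < 0) {
        fprintf(stderr, "could not open %s\n", url);
        return 1;
    }
    av_dict_free(&opts);
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        fprintf(stderr, "could not read stream info\n");
        return 1;
    }

    /* Locate the video stream among the streams the demuxer found. */
    int video_stream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (video_stream < 0) {
        fprintf(stderr, "no video stream found\n");
        return 1;
    }
    printf("video codec: %s\n",
           avcodec_get_name(fmt->streams[video_stream]->codecpar->codec_id));

    /* Read a handful of packets; decoding and display would start from here. */
    AVPacket pkt;
    for (int i = 0; i < 100 && av_read_frame(fmt, &pkt) >= 0; i++) {
        if (pkt.stream_index == video_stream)
            printf("video packet, size=%d\n", pkt.size);
        av_packet_unref(&pkt);
    }

    avformat_close_input(&fmt);
    avformat_network_deinit();
    return 0;
}

For the second question, one common approach (an assumption, not the only option) is to decode the packets with libavcodec, convert the frames with libswscale, and push the pixels to an ANativeWindow or a GLSurfaceView through JNI.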
-
FFmpeg streaming: can't see image from H.263 stream in Flash
5 April 2013, by JustLogin
I'm using FFmpeg for live RTMP streaming of my desktop to Wowza. If I publish the stream with the h264 or flashsv2 codec, everything works fine, but I need H.263 support too. The problem is that I see no image from the stream in Flash when my stream's codec is H.263. I catch the NetStream.Play.Start event and even get the video's metadata, but no image is displayed in my Video object. There was the same problem with h264 when I tried to stream it with this command:
ffmpeg -f dshow -i video=UScreenCapture -vcodec h264 -f flv rtmp://[adr]/desc
Choosing the yuv420p pixel format fixed that, so the stream produced by the command below is displayed properly in Flash:
ffmpeg -f dshow -i video=UScreenCapture -vcodec h264 -pix_fmt yuv420p -f flv rtmp://[adr]/desc
But when I try to solve the problem for H.263 the same way, nothing changes: still no image in Flash.
What do I have to do to fix this? Here is my H.263 command:
ffmpeg -f dshow -i video=UScreenCapture -vcodec h263 -pix_fmt yuv420p -s 352x288 -f flv rtmp://[adr]/desc
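Without claiming this is the fix, one thing worth ruling out is which H.263 flavour actually ends up in the FLV/RTMP stream: the "H.263" that Flash Player decodes is the Sorenson Spark variant, which FFmpeg exposes as the flv encoder (FLV1), not the plain h263 encoder. The sketch below uses libavformat's avformat_query_codec() to report what the local build's FLV muxer will accept for a few candidate codec IDs (the codec list is only illustrative):

/* Sketch: ask the FLV muxer (used for RTMP output) which video codecs it
 * accepts. This only reports what the muxer will write, not what a given
 * Flash Player can decode. */
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void) {
    const AVOutputFormat *flv = av_guess_format("flv", NULL, NULL);
    if (!flv) {
        fprintf(stderr, "FLV muxer not available in this FFmpeg build\n");
        return 1;
    }
    enum AVCodecID ids[] = { AV_CODEC_ID_H263, AV_CODEC_ID_FLV1, AV_CODEC_ID_H264 };
    for (size_t i = 0; i < sizeof(ids) / sizeof(ids[0]); i++) {
        int ok = avformat_query_codec(flv, ids[i], FF_COMPLIANCE_NORMAL);
        printf("%-5s -> %s\n", avcodec_get_name(ids[i]),
               ok == 1 ? "accepted by the FLV muxer" : "not accepted");
    }
    return 0;
}

If FLV1 is the variant the muxer and the player agree on, comparing -vcodec flv against -vcodec h263 in the command above would be a quick test.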
-
How to implement an HTTP Live Streaming server on Unix?
5 September 2012, by alex
I just realized that Apple requires HTTP Live Streaming in order to view videos in iPhone apps. I was not aware of this before... I am now trying to understand what this involves so I can decide whether to do the work and make the videos available over 3G, or limit video playback to users who are connected to Wi-Fi.
I read the overview provided by Apple, and now understand that my server needs to segment and index my media files. I also understand that I don't have to host the content to be able to stream it (I can point to a video hosted somewhere else, right?).
What's not clear to me at this point is what to implement on my server (Ubuntu Hardy) to do the actual segmenting and indexing on the fly (once again, I do not host the videos I want to serve).
I found a link explaining how to install FFmpeg and x264, but I don't know if this is the best solution (since I have an Ubuntu server, I can't use the Apple Live Streaming tools, is that correct?). Also, I do not understand at which point my server knows that a video needs to be converted and starts the job...
Any feedback that could help me understand exactly what to do on the server side to be able to stream videos to my iPhone app over 3G would be greatly appreciated! (Oh, and in case it makes any difference, my app back-end is in Rails.)
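As a rough sketch of the segmenting and indexing step, assuming the source file is already encoded as H.264 video with AAC audio (so only remuxing is needed), libavformat's hls muxer can produce the .ts segments plus the .m3u8 index; the file names below are placeholders, and the same result is normally obtained with the ffmpeg command-line tool and its -f hls output format.

/* Sketch: split an existing H.264/AAC file into HTTP Live Streaming segments
 * and an .m3u8 playlist using libavformat's "hls" muxer (stream copy, no
 * re-encoding). File names are placeholders.
 * Build with: gcc hls_segment.c -lavformat -lavcodec -lavutil */
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>

int main(void) {
    const char *in_name  = "input.mp4";    /* placeholder source file */
    const char *out_name = "index.m3u8";   /* playlist; segments are written next to it */
    AVFormatContext *in = NULL, *out = NULL;

    if (avformat_open_input(&in, in_name, NULL, NULL) < 0 ||
        avformat_find_stream_info(in, NULL) < 0) {
        fprintf(stderr, "could not open %s\n", in_name);
        return 1;
    }

    avformat_alloc_output_context2(&out, NULL, "hls", out_name);
    if (!out) return 1;

    /* Copy every input stream's parameters to the output (no transcoding). */
    for (unsigned i = 0; i < in->nb_streams; i++) {
        AVStream *os = avformat_new_stream(out, NULL);
        if (!os || avcodec_parameters_copy(os->codecpar, in->streams[i]->codecpar) < 0)
            return 1;
        os->codecpar->codec_tag = 0;
    }

    /* HLS muxer options: roughly 10-second segments, keep every entry in the playlist. */
    av_opt_set(out->priv_data, "hls_time", "10", 0);
    av_opt_set(out->priv_data, "hls_list_size", "0", 0);

    /* The hls muxer opens its own segment files, so no avio_open() is needed here. */
    if (avformat_write_header(out, NULL) < 0) return 1;

    AVPacket pkt;
    while (av_read_frame(in, &pkt) >= 0) {
        /* Rescale timestamps from the input time base to the output one. */
        av_packet_rescale_ts(&pkt,
                             in->streams[pkt.stream_index]->time_base,
                             out->streams[pkt.stream_index]->time_base);
        if (av_interleaved_write_frame(out, &pkt) < 0)
            break;
        av_packet_unref(&pkt);
    }

    av_write_trailer(out);
    avformat_close_input(&in);
    avformat_free_context(out);
    return 0;
}

Serving the resulting playlist and segments is then ordinary static file hosting on any web server, and Apple's own segmenter tools are indeed macOS-only, which is why FFmpeg (or a similar segmenter) is the usual route on Ubuntu.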