
Media (1)
-
The conservation of net art in the museum: the strategies at work
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (58)
-
Management of object creation and editing rights
8 February 2011
By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable through the form template management; adding notes to articles; adding captions and annotations to images;
-
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media-sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
Depositing media and themes via FTP
31 May 2013
The MediaSPIP tool also handles media transferred via FTP. If you prefer to deposit files this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
From the outset you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the website's cache directory; themes/: custom themes and stylesheets; tmp/: working folder (...)
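Not part of the article, just a minimal sketch of the workflow it describes, using the command-line client lftp; the host name, login and file names are invented for illustration, and the folders are the ones listed in the excerpt above:
lftp -u monsite ftp.example.org   # hypothetical credentials obtained from your MediaSPIP hosting
ls                                # should list config/ IMG/ local/ themes/ tmp/
cd themes/                        # custom themes and stylesheets live here
mirror -R mon-theme               # upload a local theme directory (hypothetical name)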
On other sites (9826)
-
Truly live streaming to Android/iPhone
4 July 2012, by Tsaukpaetra
I have spent quite a while (the past week) trying this, to little avail. However, what I want seems completely unheard of. So far, I have reviewed the recommendations available through Google, which include encoding a static file into multiple static files in different formats, and creating a playlist that hosts static files in an m3u8 file (files which get added to the playlist as streaming continues).
I have also seen ideas involving RTMP, RTSP, etc., which are completely out of the question because of their incompatibility.
Ideally, I would have one web page that would link to the stream (http://server/video.mp4) and/or show it in a web page (via the video tag). With that in mind, the most likely format would be H.264+AAC in an MP4 container. Unfortunately (and probably because the file has no duration metadata), it does not work. I can use a desktop player (such as VLC) to open the stream and play it, but my iPhone and Android both give their respective "Can't be played" messages.
I don't think the problem is caused by the devices' ability to stream, for I have made a streaming Shoutcast server work just fine (MP3 only).
Currently, the closest I have come is the following setup on my win32 machine:
FFMPEG command: ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -b:v 180k -bt 240k -vcodec libx264 -tune zerolatency -profile:v baseline -preset ultrafast -r 10 -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f flv "udp://127.0.0.1:1234"
VLC: stream from udp://127.0.0.1:1234 to http://:8080/video.mp4 (no transcoding), basically just to convert the UDP stream into an HTTP-accessible stream.
Any hints or suggestions would be warmly welcomed!
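Not from the original post: one widely used answer to exactly this scenario is HTTP Live Streaming, i.e. the growing m3u8 playlist mentioned above, which iOS plays natively and recent Android versions support at least partially. A minimal sketch reusing the same webcam input, assuming an ffmpeg build that includes the hls muxer; the segment length, playlist size and output path are only illustrative:
ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -vcodec libx264 -profile:v baseline -preset ultrafast -tune zerolatency -b:v 180k -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f hls -hls_time 4 -hls_list_size 6 C:\www\live\stream.m3u8
The playlist and its .ts segments land in the web server's document root (here an assumed C:\www\live\), and the page then points the video tag, or a plain link, at http://server/live/stream.m3u8 instead of a raw .mp4 stream.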
-
ffmpeg - Encoding Percentage Maths
12 July 2012, by Jimbo
I've written a whole system in PHP and bash on the server to convert and stream videos in HTML5 on my VPS. The conversion is done by ffmpeg in the background and its output is written to block.txt.
Having looked at the following posts :
Can ffmpeg show a progress bar?
and
ffmpeg video encoding progress bar
amongst others, I can't find a working example.
I have always struggled with the regexes and maths, and I need to grab the currently encoded progress as a percentage.
The first post I linked above gives:
$log = @file_get_contents('block.txt');
preg_match("/Duration:([^,]+)/", $log, $matches);
list($hours,$minutes,$seconds,$mili) = split(":",$matches[1]);
$seconds = (($hours * 3600) + ($minutes * 60) + $seconds);
$seconds = round($seconds);
$page = join("",file("$txt"));
$kw = explode("time=", $page);
$last = array_pop($kw);
$values = explode(' ', $last);
$curTime = round($values[0]);
$percent_extracted = round((($curTime * 100)/($seconds)));
echo $percent_extracted;
The $percent_extracted variable echoes zero, and as maths is not my strong point, I really don't know how to progress here.
Here's one line from the ffmpeg output from block.txt (if it's helpful)
time=00:19:25.16 bitrate= 823.0kbits/s frame=27963 fps= 7 q=0.0 size=
117085kB time=00:19:25.33 bitrate= 823.1kbits/s frame=27967 fps= 7
q=0.0 size= 117085kB time=00:19:25.49 bitrate= 823.0kbits/s
frame=27971 fps= 7 q=0.0 size= 117126kB
Please help me output this percentage; once done I can create my own progress bar. Thanks.
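Not part of the question, just a minimal sketch of the maths, assuming block.txt holds raw ffmpeg output like the lines quoted above; variable names are illustrative. The trap is that both the Duration value and the time= values are HH:MM:SS.ms strings, so each has to be converted to seconds before dividing:
<?php
// Read the log that ffmpeg writes in the background.
$log = @file_get_contents('block.txt');

// Total duration, e.g. "Duration: 00:40:02.08," -> seconds.
preg_match('/Duration: (\d+):(\d+):(\d+\.\d+)/', $log, $d);
$total = $d[1] * 3600 + $d[2] * 60 + $d[3];

// Most recent progress report, e.g. "time=00:19:25.16" -> seconds.
preg_match_all('/time=(\d+):(\d+):(\d+\.\d+)/', $log, $t);
$i = count($t[1]) - 1;
$done = $t[1][$i] * 3600 + $t[2][$i] * 60 + $t[3][$i];

// Percentage, clamped in case the last report overshoots slightly.
$percent = $total > 0 ? min(100, round($done * 100 / $total)) : 0;
echo $percent;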
-
How to build and link FFMPEG for iOS?
30 June 2015, by Alexander Tkachenko
Hi all!
I know there are a lot of questions here about FFMPEG on iOS, but no answer is appropriate for my case :(
Something strange happens every time I try to link FFMPEG in my project, so please help me!
My task is to write a video-chat application for iOS that uses the RTMP protocol for publishing and reading a video stream to/from a custom Flash Media Server.
I decided to use rtmplib, a free open-source library for streaming FLV video over RTMP, as it is the only appropriate library.
Many problems appeared when I began researching it, but later I understood how it should work.
Now I can read a live stream of FLV video (from a URL) and send it back to the channel, with the help of my application.
My trouble now is in sending video FROM the camera.
The basic sequence of operations, as I understand it, should be the following:
1. Using AVFoundation, with the help of the chain Device -> AVCaptureSession -> AVVideoDataOutput -> AVAssetWriter, I write the camera output to a file (if you need, I can describe this flow in more detail, but in the context of the question it is not important). This flow is necessary for hardware-accelerated conversion of the live video from the camera into the H.264 codec, but the result is in the MOV container format. (This step is completed.)
2. I read this temporary file as each sample is written, and obtain the stream of bytes of video data (H.264-encoded, in a QuickTime container). (This step is already completed.)
3. I need to convert the video data from the QuickTime container format to FLV, all in real time (packet by packet).
4. Once I have the packets of video data in the FLV container format, I will be able to send the packets over RTMP using rtmplib.
Now, the most complicated part for me is step 3.
I think I need to use the ffmpeg libraries for this conversion (libavformat). I even found source code showing how to decode H.264 data packets from a MOV file (looking in libavformat, I found that it is possible to extract these packets even from a byte stream, which is more appropriate for me). Once that is done, I will need to encode the packets into FLV (using ffmpeg or manually, by adding FLV headers to the H.264 packets; that is not a problem and is easy, if I am correct).
FFMPEG has great documentation and is a very powerful library, and I think there won't be a problem using it. BUT the problem here is that I cannot get it working in an iOS project.
I have spent 3 days reading documentation, Stack Overflow and googling the answer to the question "How to build FFMPEG for iOS", and I think my PM is going to fire me if I spend one more week trying to compile this library :))
I tried to use many different build scripts and configure files, but when I build FFMPEG, I get libavformat, libavcodec, etc. for the x86 architecture (even when I specify the armv6 arch in the build script). (I use "lipo -info libavcodec.a" to show the architectures.)
So I could not build these sources, and decided to find a prebuilt FFMPEG built for the armv7, armv6 and i386 architectures.
I downloaded the iOS Comm Lib from MidnightCoders on GitHub; it contains an example of FFMPEG usage, with prebuilt .a files of avcodec, avformat and other FFMPEG libraries.
I checked their architecture:
iMac-2:MediaLibiOS root# lipo -info libavformat.a
Architectures in the fat file: libavformat.a are: armv6 armv7 i386
And I found that it is appropriate for me!
When I tried to add these libraries and headers to the Xcode project, it compiled fine (and I even had no warnings like "Library is compiled for another architecture"), and I could use structures from the headers, but when I tried to call a C function from libavformat (av_register_all()), the compiler showed me the error message "Symbol(s) not found for architecture armv7: av_register_all".
I thought that maybe there were no symbols in the lib, and tried to show them:
root# nm -arch armv6 libavformat.a | grep av_register_all
00000000 T _av_register_all
Now I am stuck here; I don't understand why Xcode cannot see these symbols, and I cannot move forward.
Please correct me if I am wrong in my understanding of the flow for publishing an RTMP stream from iOS, and help me with building and linking FFMPEG for iOS.
I have the iPhone 5.1 SDK and Xcode 4.2.
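Not part of the original question: a minimal sketch of the kind of cross-compiling configure invocation that produces real ARM static libraries with that era's toolchain. The compiler and SDK paths below are assumptions for a pre-4.3 Xcode layout and must be adapted to the local installation; the essential points are --enable-cross-compile with an ARM --arch, and passing -arch armv7 plus the iOS sysroot to both the compiler and the linker, so that "lipo -info" afterwards reports armv7 rather than x86:
cd ffmpeg
./configure --enable-cross-compile --target-os=darwin --arch=arm \
  --cc=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc \
  --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk \
  --extra-cflags="-arch armv7" \
  --extra-ldflags="-arch armv7 -isysroot /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.1.sdk"
make
lipo -info libavformat/libavformat.a    # should now report armv7, not x86
Separate armv6/armv7/i386 builds can then be merged with "lipo -create" into one fat library, which is what the MidnightCoders binaries appear to be.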
-