
Media (1)
-
Publish an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (71)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier(), so that visitors can edit their own information on the authors page
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (11738)
-
FFMpeg/libav set SEI unregistered user message per video frame
17 December 2018, by maxhap
I'm trying to set the SEI unregistered user data message on a per-video-frame basis. I noticed that libav has an H264MetadataContext structure which contains a const char *sei_user_data field. This seems to be exactly what I'm looking for. After poking around in the FFmpeg code, I noticed that this structure is wrapped in the private data of an AVBSFContext. My knowledge is extremely limited, so at this point I'm looking for an example or explanation of how to correctly access the H264MetadataContext structure. I presume I need a bitstream filter (based on the naming), but I can't find any examples of setting the H264MetadataContext.
Can anyone help me with an explanation or a code snippet, or point me to an example I might have missed or overlooked?
Any general advice would also be appreciated. I have checked the few similar questions here on Stack Overflow, but they don't seem to have any solid answers or explanations (maybe due to my lack of understanding).
Thanks in advance!
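For context: H264MetadataContext is the private data of the h264_metadata bitstream filter, and its sei_user_data option is stream-level configuration rather than something you can vary frame by frame. Getting a different message onto each frame generally means constructing the SEI NAL units yourself and prepending them to each frame's packet data. The following is a minimal, hedged sketch of the user_data_unregistered payload layout (Annex-B start code, NAL type 6, SEI payload type 5, a 16-byte UUID, then the message); the helper name is hypothetical:

```python
def build_sei_unregistered(uuid: bytes, message: bytes) -> bytes:
    """Build an Annex-B SEI NAL unit carrying a user_data_unregistered payload."""
    assert len(uuid) == 16, "user_data_unregistered starts with a 16-byte UUID"
    payload = uuid + message
    rbsp = bytearray([0x06, 0x05])      # NAL type 6 (SEI), payload type 5
    size = len(payload)
    while size >= 255:                  # ff-coded payload size
        rbsp.append(0xFF)
        size -= 255
    rbsp.append(size)
    rbsp += payload
    rbsp.append(0x80)                   # rbsp_trailing_bits
    # Emulation prevention: insert 0x03 after 00 00 when the next byte is 00..03.
    ebsp = bytearray()
    zeros = 0
    for b in rbsp:
        if zeros >= 2 and b <= 3:
            ebsp.append(0x03)
            zeros = 0
        ebsp.append(b)
        zeros = zeros + 1 if b == 0 else 0
    return b"\x00\x00\x00\x01" + bytes(ebsp)
```

If a single stream-wide message is enough, the bitstream-filter route is reported to work from the command line with something like `-c:v copy -bsf:v 'h264_metadata=sei_user_data=UUID+text'`, where the option value is a hex UUID, a `+`, and the message text.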
-
How to implement HTTP Live Streaming server on Unix ?
5 September 2012, by alex
I just realized that Apple requires HTTP Live Streaming in order to view videos in iPhone apps. I was not aware of this before... I am now trying to understand what this involves, so I can decide whether I want to do the work and make the videos available over 3G, or limit video playing to users who are connected to wi-fi.
I read the overview provided by Apple, and now understand that my server needs to segment and index my media files. I also understand that I don't have to host the content to be able to stream it (I can point to a video hosted somewhere else, right?).
What's not clear to me at this point is what to implement on my server (Ubuntu Hardy) to do the actual segmenting and indexing on the fly (once again, I do not host the videos I want to serve).
I found a link explaining how to install FFmpeg and x264, but I don't know if this is the best solution (since I have an Ubuntu server, I can't use the Apple Live Streaming tools, is that correct?). Also, I do not understand at which point my server knows that a video needs to be converted and starts the job...
Any feedback that could help me understand exactly what to do on the server side to be able to stream videos to my iPhone app over 3G would be greatly appreciated! (Oh, and in case it makes any difference, my app back-end is in Rails.)
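On the segmenting and indexing side, what the server ultimately has to produce is just a set of short media segments plus an .m3u8 index file that lists them. A minimal sketch of a VOD playlist generator, assuming pre-cut segments (the helper name and segment file names are hypothetical; in practice a tool such as FFmpeg can produce both the segments and the index):

```python
def make_hls_playlist(segments, target_duration):
    """Render a minimal VOD .m3u8 index.

    segments: list of (uri, duration_in_seconds) pairs for pre-cut segments.
    target_duration: integer upper bound on any segment's duration.
    """
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for uri, duration in segments:
        lines.append(f"#EXTINF:{duration:.3f},")   # per-segment duration
        lines.append(uri)
    lines.append("#EXT-X-ENDLIST")                 # marks the playlist as complete (VOD)
    return "\n".join(lines) + "\n"
```

For live streaming the index would omit `#EXT-X-ENDLIST` and be rewritten as new segments appear, which is essentially the "on the fly" part of the question.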
-
Anomalie #2466: rights of the restricted admin
24 December 2011, by denisb
Disregard point 2 (obviously...); point 1 remains.