
Other articles (97)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
APPENDIX: Plugins used specifically for the farm
5 March 2010
The central/master site of the farm needs several additional plugins, beyond those of the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a farm instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (12938)
-
Player for Android smartphone and Android TV: live streaming from an m3u/m3u8 playlist with MPEG-2 TS channels
30 May 2016, by ot0
I want to have a player (like an IPTV app) that can play the channels in an .m3u playlist.
I have found that ExoPlayer should support live streaming. I want to be able to see the code, because I want to add the player inside my own app and embed it in my own GUI. I downloaded the open-source GitHub project of ExoPlayer (github.com/google/ExoPlayer) and appended the URL of our .m3u playlist (containing MPEG-2 TS channels) in the demo, but unfortunately it couldn't play. Inside Samples.java I added a test channel:

public static final Sample[] HLS = new Sample[] {
new Sample("Apple AAC media playlist",
"devimages.apple.com.edgekey.net/streaming/examples/bipbop_4x3/gear0/"
+ "prog_index.m3u8", Util.TYPE_HLS),
new Sample("Apple ID3 metadata", "devimages.apple.com/samplecode/adDemo/ad.m3u8",
Util.TYPE_HLS),
....
new Sample("Test", "ipaddress.com/channels.m3u", Util.TYPE_OTHER)
};

I also tried to just paste the URL of one of the MPEG-2 TS channels from the playlist, but it didn't work either.
The .m3u looks like:

#EXTM3U
#EXTINF:-1 group-title="Random" tvg-logo="xx.com/pic/channel1.jpg",Channel1
ipaddress.com:7777/channel1
..
#EXT-X-ENDLIST

Does anyone have any recommendations? Is it not possible to play IPTV encapsulated in MPEG-2 TS using ExoPlayer? If not, do we have alternatives? I am mostly looking for an open-source project where the player is already implemented, so that I only need to add visual elements inside it. Otherwise, an API to other apps would also be useful.
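A note on the formats involved: an .m3u8 URL is an HLS playlist that ExoPlayer parses itself, while a plain .m3u IPTV playlist is just a text file listing channel URLs, so the app has to parse it and hand the individual MPEG-2 TS URLs to the player. As a minimal sketch (not the demo's Samples.java path), assuming a recent ExoPlayer 2.x release (2.12 or later) and a hypothetical channel URL copied from the playlist above, playing a single raw TS channel could look like this:

import android.content.Context;

import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;
import com.google.android.exoplayer2.ui.PlayerView;

public class TsChannelSketch {
    // Sketch only: plays one raw MPEG-2 TS channel URL taken from an .m3u
    // playlist. The URL below is a hypothetical placeholder.
    public static SimpleExoPlayer play(Context context, PlayerView playerView) {
        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
        playerView.setPlayer(player);
        player.setMediaItem(MediaItem.fromUri("http://ipaddress.com:7777/channel1"));
        player.prepare();
        player.play();
        return player;
    }
}

In the 1.x demo quoted above, the closest equivalent appears to be the Util.TYPE_OTHER sample type for a raw TS URL, while Util.TYPE_HLS expects an HLS (.m3u8) playlist.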
-
Android: concatenating images into a video using FFmpeg, with vanevery/Android-MJPEG-Video-Capture-FFMPEG
4 May 2014, by MarcinZiolek
I'm using vanevery's code from GitHub to join images into a movie (JPEG to MP4).
I've changed the sample code and it is working. But now I want to move this feature to my app, and I'm getting this error when starting the conversion:

V/MJPEG_FFMPEG(27166): soinfo_link_image(linker.cpp:1607): missing essential tables
CANNOT LINK EXECUTABLE

I've copied all the files from assets:
-assets
--ffmpeg
--libavcodec.so
--libavcodec.so.52
--libavcodec.so.52.99.1
--libavcore.so
--libavcore.so.0
--libavcore.so.0.16.0
--and so on...

Do I have to compile these files myself, specifically for my project? If so, I know that there are lots of tutorials on getting FFmpeg working on Android, but it is really complicated.
Is there a simple method to get it working?
I’m working in Eclipse.
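For context, the linker message "CANNOT LINK EXECUTABLE ... missing essential tables" usually indicates that the ffmpeg binary or the .so files copied out of assets do not match the device's ABI, or that they were mangled during the copy, so they generally do need to be built (or obtained) for the target architecture. The copy-and-mark-executable step itself is simple; here is a minimal sketch, assuming a hypothetical single "ffmpeg" asset name and ignoring the shared libraries:

import android.content.Context;

import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

public class FfmpegInstaller {
    // Sketch only: copies a hypothetical "ffmpeg" asset into the app's private
    // files directory and marks it executable. This does not fix an ABI
    // mismatch; the binary itself must match the device architecture.
    public static File installFromAssets(Context context) throws Exception {
        File target = new File(context.getFilesDir(), "ffmpeg");
        try (InputStream in = context.getAssets().open("ffmpeg");
             OutputStream out = new FileOutputStream(target)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
        // Required before Runtime.getRuntime().exec() can launch the binary.
        target.setExecutable(true);
        return target;
    }
}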
-
Android: How to pass frames from FFmpeg back to Android
23 October 2013, by yarin
This is an architecture question, and I am really interested in the answer.
I am building an app with the following goals:
1. Record video with an effect applied in real time (using FFmpeg).
2. Display the customized video to the user in real time while recording.
So, after one month of work... I decided to remember that goal number 2 is worth thinking about :)
I have a ready skeleton app that records video with an effect in real time,
but I have to preview the customized frames back to the user. My options (and this is my question):
1. Each frame that passes from onPreviewFrame(byte[] video_frame_data, Camera camera) to FFmpeg through JNI for encoding is sent back to Android through the same JNI boundary after I apply the effects (I mean: onPreviewFrame -> JNI to FFmpeg -> immediately apply the effect -> send the customized frame back to the Android side for display -> encode the customized frame); a rough sketch of this round trip is given after the question. Advantages: it looks like the easiest approach to use.
Disadvantages: it uses JNI twice, and passing the frame back could consume time (I really don't know if it is a big price to pay, because it is only a byte array or int array per frame sent to the Android side).
2. I heard about OpenGL in the NDK, but I think the surface itself is created on the Android side, so is it really going to be better? I would prefer to use a surface other than the one I am using now in Java.
3. Create a video player on top of FFmpeg to preview each customized frame in real time.
Thanks for your help. I hope that the first solution is workable and does not consume too much time in terms of real-time processing.
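As a rough illustration of the single round trip described in option 1, here is a minimal sketch of the Java side of that boundary; the native library name and the applyEffect() entry point are hypothetical placeholders, not part of any existing project:

import android.hardware.Camera;

public class FramePipeline implements Camera.PreviewCallback {
    static {
        // Hypothetical native library that wraps the FFmpeg-based effect code.
        System.loadLibrary("frameeffects");
    }

    // Hypothetical JNI entry point: applies the effect natively and returns
    // the processed frame so the Java side can display and encode it.
    private native byte[] applyEffect(byte[] yuvFrame, int width, int height);

    private final int width;
    private final int height;

    public FramePipeline(int width, int height) {
        this.width = width;
        this.height = height;
    }

    @Override
    public void onPreviewFrame(byte[] videoFrameData, Camera camera) {
        // One JNI crossing per frame: raw preview frame in, processed frame out.
        byte[] processed = applyEffect(videoFrameData, width, height);
        // ...draw `processed` on a SurfaceView and/or hand it to the encoder...
    }
}

Whether that per-frame copy across JNI stays within a real-time budget depends on the frame size and the device, so measuring it on target hardware is the only reliable way to decide between the options.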