
Other articles (94)
-
Mediabox: displaying images in the maximum space available to the user
8 February 2011, by
Image display is constrained by the width allowed by the site's design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of all of the space available on the user's screen, you can add a feature that displays the image in a multimedia box overlaid on top of the rest of the content.
To do this, you need to install the "Mediabox" plugin.
Configuring the multimedia box
As soon as (...) -
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their own information on the authors page -
Automated installation script for MediaSPIP
25 April 2011, by
To overcome the difficulties, mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in Bash was created to facilitate this step on a server with a compatible Linux distribution.
To use it, you must have SSH access to your server and a root account, which the script will use to install the dependencies. Contact your provider if you do not have these.
Documentation on how to use this installation script is available here.
The code of this (...)
On other sites (4377)
-
How to access video frames from an external process?
14 August 2019, by gaamaa
A Windows 32-bit application's memory limit is approximately 2 GB.
I need to play 4 to 8 HD video files or live camera graphs, which is not possible in a single 32-bit application.
So I would like to play each graph in a separate 32-bit application and access those video frames from my main application; that way I could manage my needs. Could some experts give me a clue on how to do this in C++?
Is there any ready-made solution to achieve this?
gaamaa
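A common approach to the question above is shared memory: each player process writes its decoded frames into a named shared-memory mapping (CreateFileMapping/MapViewOfFile on Windows, shm_open/mmap on POSIX), and the main process reads the latest frame without copying it through a socket. The sketch below shows one possible slot layout and a seqlock-style publish/read pair; the struct name, buffer size, and retry limit are illustrative assumptions, and for simplicity the shared region is simulated by an ordinary in-process object.

```cpp
#include <atomic>
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Layout of one frame slot. In the real setup this struct would live
// inside a named shared-memory mapping (CreateFileMapping()/MapViewOfFile()
// on Windows), so the player process and the main process see the same bytes.
struct FrameSlot {
    std::atomic<uint32_t> sequence; // even = stable, odd = writer busy (seqlock)
    uint32_t width;
    uint32_t height;
    uint8_t  pixels[64];            // real code: width * height * bytes-per-pixel
};

// Writer side (the 32-bit player process): publish a new frame.
void publish_frame(FrameSlot& slot, const uint8_t* data, size_t len,
                   uint32_t w, uint32_t h) {
    uint32_t s = slot.sequence.load(std::memory_order_relaxed);
    slot.sequence.store(s + 1, std::memory_order_relaxed); // odd: writer busy
    slot.width = w;
    slot.height = h;
    std::memcpy(slot.pixels, data, len);
    slot.sequence.store(s + 2, std::memory_order_release); // even: stable again
}

// Reader side (the main process): copy out the latest stable frame.
// Retries if the writer was mid-update (classic seqlock read); a
// production version would also need acquire fences around the copy.
bool read_frame(FrameSlot& slot, std::vector<uint8_t>& out) {
    for (int tries = 0; tries < 1000; ++tries) {
        uint32_t before = slot.sequence.load(std::memory_order_acquire);
        if (before & 1) continue;          // writer busy, retry
        out.assign(slot.pixels, slot.pixels + sizeof(slot.pixels));
        uint32_t after = slot.sequence.load(std::memory_order_acquire);
        if (before == after) return true;  // no torn read
    }
    return false;
}
```

Named pipes or a memory-mapped file per player are alternatives, but a shared ring of slots like this avoids copying each HD frame through the kernel on every read.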
-
How to reduce MP4 size using the FFmpeg library on Android
19 September 2016, by Swap-IOS-Android
I am new to the NDK, so I read a tutorial and successfully built the FFmpeg library. I copied it into my jni folder, created the Android.mk and Application.mk files, and ran the ndk-build command, so now I have libavcodec.so in my lib folder. (I did not copy the FFmpeg header files into my jni folder. Is it necessary to add the header files, or should I add the complete FFmpeg library to jni? Stack Overflow comments say that you just have to add the header files.)
I know that if I want to convert my camera video to a smaller size, I have to compress it using libavcodec.so, so I compiled it, but the important thing is: how can I use it?
There is some confusion in my mind about how to use that .so file:
1) Do I need to use System.load("libavcodec.so") to load the .so file? If yes, after loading the .so files, how can I access the native C/C++ methods?
2) Or do I need to create my own Java class and C class that communicate with each other, with the C class calling into the avcodec part of FFmpeg?
Or do I need to implement both? And one more important thing: if I have to create my own C class, do I have to add it to the source file line in Android.mk?
And can anybody please tell me what methods and steps are available to compress a video file with FFmpeg?
Any help is appreciated, and this question will also be helpful to other beginners.
Thank you -
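For reference, the usual ndk-build layout for this situation is option 2: declare the prebuilt libavcodec.so as a PREBUILT_SHARED_LIBRARY module, and list your own JNI wrapper source in the LOCAL_SRC_FILES line of a second module. A minimal Android.mk sketch, assuming a hypothetical wrapper file native-wrapper.c and the FFmpeg headers copied under jni/include:

```makefile
LOCAL_PATH := $(call my-dir)

# The FFmpeg library you already built (prebuilt, not recompiled here).
include $(CLEAR_VARS)
LOCAL_MODULE := avcodec-prebuilt
LOCAL_SRC_FILES := libavcodec.so
include $(PREBUILT_SHARED_LIBRARY)

# Your own JNI wrapper: this is the file you add to LOCAL_SRC_FILES.
include $(CLEAR_VARS)
LOCAL_MODULE := native-wrapper
LOCAL_SRC_FILES := native-wrapper.c        # hypothetical wrapper source
LOCAL_C_INCLUDES := $(LOCAL_PATH)/include  # FFmpeg headers go here
LOCAL_SHARED_LIBRARIES := avcodec-prebuilt
include $(BUILD_SHARED_LIBRARY)
```

In Java you then load only the wrapper with System.loadLibrary("native-wrapper") (the prebuilt library is pulled in as its dependency) and declare native methods that the wrapper implements by calling into libavcodec; Java code cannot call the FFmpeg functions directly.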
Syncing dumped RTP streams
7 March 2017, by Pawel K
I need a bit of guidance. I am writing a small application to parse two streams of RTP data to get the actual video (H.264) and audio (PCM A-law); it is not an RTSP client but rather a dumb parser (interleaved RTP/RTCP).
The question now is how to synchronise these two streams. I know I need to do some calculations between the timestamps from the packets and the NTP time from RTCP (I am researching it currently), but what then? How do I prepare the two tracks to mux them into a container like Matroska (using, for instance, ffmpeg)? Do I have to create some empty frames for the audio to lengthen it, or stretch it somehow? I don't have a clue how to approach this.
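For the synchronisation step described above, the standard technique (per RFC 3550) is to map each stream's RTP timestamps onto the shared NTP wall clock using the most recent RTCP Sender Report for that stream, then apply the resulting difference in start times as an initial PTS shift on one track when muxing, rather than fabricating empty audio frames. A minimal sketch of that mapping, with illustrative struct and function names:

```cpp
#include <cassert>
#include <cstdint>

// One RTCP Sender Report gives the mapping between a stream's RTP
// timestamp clock and absolute (NTP) wall-clock time.
struct SenderReport {
    double   ntp_seconds;   // NTP time of the report, as seconds
    uint32_t rtp_timestamp; // the same instant, in RTP timestamp units
};

// Convert an RTP packet timestamp to wall-clock seconds using the most
// recent Sender Report for that stream. The signed 32-bit difference
// handles RTP timestamp wraparound correctly.
double rtp_to_wallclock(uint32_t rtp_ts, const SenderReport& sr,
                        uint32_t clock_rate /* 90000 video, 8000 A-law */) {
    int32_t diff = static_cast<int32_t>(rtp_ts - sr.rtp_timestamp);
    return sr.ntp_seconds + static_cast<double>(diff) / clock_rate;
}
```

For example, if the video SR says NTP time 1000.0 s corresponds to RTP timestamp 90000, a packet stamped 180000 was captured at 1001.0 s on the 90 kHz clock; computing the same wall-clock time for the first audio packet (8 kHz clock for A-law) gives the offset between the two tracks' start times, which the muxer can absorb as an initial timestamp shift.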