
Other articles (55)
-
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources in the standalone version.
To get a working installation, you need to install all the software dependencies manually on the server.
If you want to use this archive for an installation in "farm" mode, you will also have to make other modifications (...)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm" mode, you will also need to carry out other manual (...)
-
Improvements to the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)
On other sites (3654)
-
Unsynchronized graph when the h264 stream pauses
24 September 2016, by jorp
Steps to reproduce:
1. Connect the Android device to a Linux PC via USB.
2. Run this command on the PC:
adb shell screenrecord --output-format=h264 - | ffplay -analyzeduration 1 -
3. screenrecord (an Android system tool) starts capturing the Android screen, encodes it with MediaCodec, generates an H.264 stream, and sends the stream data to ffplay through the pipe.
4. The PC pops up an ffplay window and starts displaying the Android screen.
If the adb command is run while the Android screen stays static (no pixel colors change), the ffplay window does not pop up until I touch something to make the Android graphics change.
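For reference, the same pipeline can be driven from a short Python script; a minimal sketch, assuming adb and ffplay are on the PATH:

import subprocess

# Capture the Android screen as a raw H.264 elementary stream over adb
# and feed it straight into ffplay, mirroring the shell pipeline above.
record = subprocess.Popen(
    ["adb", "shell", "screenrecord", "--output-format=h264", "-"],
    stdout=subprocess.PIPE,
)
play = subprocess.Popen(
    ["ffplay", "-analyzeduration", "1", "-"],
    stdin=record.stdout,
)
record.stdout.close()  # ffplay now owns the read end of the pipe
play.wait()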
Question 1: how can ffplay be made to display the first frame of the video immediately when the Android screen stays static and the adb command is run?
I am not talking about "latency": when the Android graphics keep changing, the latency is within 2 seconds, which is fast enough.
When the Android graphics are static, running the command without ffplay:
adb shell screenrecord --output-format=h264 -
screenrecord outputs H.264 stream data immediately and then pauses (see the picture below).
(picture: stream data comes first)
Does this data contain all the information of the first frame?
If so, how can ffplay be made to display it immediately?
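One way to check is to pipe the stream into ffprobe instead of ffplay and look for the SPS/PPS parameter sets and a complete first (IDR) frame; a minimal sketch, assuming ffprobe is available and forcing the raw H.264 demuxer:

import subprocess

# Dump the frames screenrecord emits before it pauses, to see whether a
# complete first picture is already in the initial burst of data.
record = subprocess.Popen(
    ["adb", "shell", "screenrecord", "--output-format=h264", "-"],
    stdout=subprocess.PIPE,
)
probe = subprocess.Popen(
    ["ffprobe", "-f", "h264", "-show_frames", "pipe:0"],
    stdin=record.stdout,
)
record.stdout.close()
probe.wait()

If ffprobe prints a decoded frame from that initial burst, the first picture is complete and the delay is on the playback side.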
Question 2: how can ffplay be made to display the latest frame of the video immediately when the Android screen suddenly becomes static?
When Android screen casting is running and the Android graphics suddenly become static, the H.264 stream transfer pauses, and the PC graphics end up different from the Android graphics.
For example, when I dismiss a popup menu on Android there is a fade-out animation; the PC graphics pause and keep showing a translucent popup menu (see the picture below).
(picture: latest graph on the PC)
The first and latest frames of the stream are not displayed in ffplay immediately; they are "missing".
When the Android screen starts changing again, the "missing" frames are displayed.
So it seems the latest frames are always stuck in a buffer. I have tried adding these parameters to ffplay:
-noinfbuf
-fflags nobuffer
-max_delay 0
-sync ext
-preset ultrafast
-tune zerolatency
They do not solve the problem. Modifying the Android ROM or the screenrecord source so that a pixel color changes in every frame might solve it (I guess), but it would enlarge the data stream, so that is a last resort.
Is there any solution using ffplay parameters, or by modifying the screenrecord or ffplay source code? PS: mplayer has the same problem:
mplayer -demuxer h264es -
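For what it is worth, -preset and -tune in the list above are libx264 encoder options and, as far as I can tell, have no effect in ffplay. A hedged, untested sketch that combines the usual low-latency playback options (forcing the h264 demuxer, shrinking the probing window, disabling buffering) with the pipeline from the question:

import subprocess

# Hedged attempt: -f h264 skips format probing, -probesize/-analyzeduration
# shrink the startup analysis, -fflags nobuffer, -flags low_delay and
# -noinfbuf reduce buffering, -framedrop and -sync ext keep playback
# pinned to the newest data. Untested against this particular stream.
cmd = (
    "adb shell screenrecord --output-format=h264 - | "
    "ffplay -f h264 -probesize 32 -analyzeduration 0 "
    "-fflags nobuffer -flags low_delay -noinfbuf -framedrop -sync ext -"
)
subprocess.call(cmd, shell=True)

Whether this is enough to surface the very first and very last frame of a paused stream is an open question; the decoder still needs a complete picture from the stream before ffplay can show anything.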
-
Exporting ogg videos from a video slideshow (mp4) and audio (wav) using ffmpeg in Python
17 September 2018, by jtz
I have tried to find a solution already, but just cannot make it work. I want to export an .ogg video by combining a video slideshow (mp4) with an audio file (wav) using ffmpeg in Python. The audio is shorter than the video and should have an offset of 2000 milliseconds, so that it starts approximately in the middle of the video. The code I tried is essentially this:
subprocess.call('ffmpeg.exe -itsoffset 0 -i slideshows/slideshow.mp4 -itsoffset 2000 -i sounds/audio.wav -codec:v libtheora -codec:a libvorbis output.ogg', shell=True)
FYI: I need to create a couple of different video formats from the video and audio input, so substituting the ogg with a different format is not an option. I have already successfully created .mp4 and .webm video files from the material, but need the .ogg in addition. Also, PowerPoint does not like any of the exported videos; maybe this is relevant for finding the problem here (I also unsuccessfully tried to fix that issue).
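For reference, -itsoffset takes a duration in seconds (or an FFmpeg time string such as "2000ms"), so -itsoffset 2000 delays the audio by 2000 seconds rather than 2000 milliseconds; "2" gives the intended 2-second offset. A hedged rework of the call, passing the arguments as a list to sidestep shell quoting and keeping the file names from the question (assumed to exist):

import subprocess

# Hedged sketch: mux the slideshow video with the audio delayed by 2 s
# (-itsoffset applies to the input that follows it), encoding to
# Theora/Vorbis in an Ogg container. -y overwrites output.ogg if present.
cmd = [
    "ffmpeg", "-y",
    "-i", "slideshows/slideshow.mp4",
    "-itsoffset", "2",
    "-i", "sounds/audio.wav",
    "-map", "0:v:0", "-map", "1:a:0",           # first video + first audio stream
    "-codec:v", "libtheora", "-qscale:v", "7",  # libtheora quality 0-10, higher is better
    "-codec:a", "libvorbis",
    "output.ogg",
]
subprocess.call(cmd)

The explicit -map options are an assumption that only the first video and first audio stream are wanted; drop them to fall back to ffmpeg's default stream selection.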
-
Immediately display the first and latest frame of an h264 stream which may sometimes pause
22 September 2016, by jorp