
Other articles (70)
-
Customising by adding your logo, banner or background image
5 September 2013
Some themes support three customisation elements: adding a logo, adding a banner, and adding a background image.
-
Writing a news item
21 June 2013
Present the changes to your MédiaSPIP, or news about your projects, through the news section.
In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the news-item creation form.
News-item creation form: for a document of type "news item", the default fields are: publication date (customise the publication date) (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSPIP is at version 0.2 or higher. If necessary, contact your MédiaSPIP administrator to find out.
On other websites (11427)
-
How can I use the OpenCV FFMPEG video I/O rather than the DirectShow one on Windows?
2 May 2012, by octi
So I'm trying to write a video using the OpenCV VideoWriter like this:
writer = cv.CreateVideoWriter(path + "test_output.avi", -1, fps, (W, H), 1)
Instead of supplying a FOURCC I passed -1 in order to see which codecs are available.
The result was Microsoft RLE, Microsoft Video 1, Intel YUV, and Uncompressed. The reason is that when configuring OpenCV with CMake for Visual Studio 10 x64, this is what I get under video I/O:
Video I/O: DirectShow
Is there a way to switch this to FFMPEG? I know the ffmpeg DLL is present in \3rdparty\ffmpeg.
I looked for CMake FFMPEG flags but found none whatsoever. The weird thing is in the CMakeLists.txt in the OpenCV root, under the video section:
if(UNIX AND NOT APPLE)
<ffmpeg stuff>
elseif(WIN32)
status(" Video I/O:" HAVE_VIDEOINPUT THEN DirectShow ELSE NO)
endif()
So it seems to me that OpenCV automatically switches to DirectShow and gives no option to use FFmpeg.
Or rather, can DirectShow be upgraded to support other formats such as DivX or H.264?
Any ideas?
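A minimal sketch, assuming an OpenCV build where the FFmpeg backend is enabled (OpenCV's CMake configuration has a WITH_FFMPEG option, and on Windows the backend is typically provided by the opencv_ffmpeg DLL built from 3rdparty): with such a build, a codec can be requested explicitly by FOURCC instead of passing -1. This uses the modern cv2 API; the file name, frame size and frame contents below are placeholders, not the asker's code.

import numpy as np
import cv2

W, H, fps = 640, 480, 25
fourcc = cv2.VideoWriter_fourcc(*"XVID")        # codec handled by the FFmpeg backend
writer = cv2.VideoWriter("test_output.avi", fourcc, fps, (W, H), True)

frame = np.zeros((H, W, 3), dtype=np.uint8)     # placeholder black BGR frame
for _ in range(fps * 5):                        # write five seconds of video
    writer.write(frame)
writer.release()
-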
Is it possible to serve both HLS and MPEG-DASH at the same time?
12 February 2020, by Rahul M
Actually I'm kind of a newbie to the Nginx RTMP server. I have set up my nginx.conf file to accept both HLS and MPEG-DASH, but the problem is that at any one time I can only serve either HLS or MPEG-DASH. So my question is: is it possible to serve both HLS and MPEG-DASH at the same time, for two different videos? I'm streaming video from OBS Studio.
Here is my HLS and MPEG-DASH configuration in my nginx.conf file:
rtmp {
    server {
        listen 1935;          # Listen on standard RTMP port
        chunk_size 4000;

        application show {
            live on;
            # Turn on HLS
            hls on;
            hls_path /nginx/hls/;
            hls_fragment 3;
            hls_playlist_length 60;
            deny play all;
        }

        application dash {
            live on;
            dash on;
            dash_path /nginx/dash;
        }
    }
}
Thank you in advance.
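One possible approach, sketched here under assumptions rather than as a verified answer: nginx-rtmp-module lets both packagers be enabled in the same application block, so a single incoming stream from OBS is published once and exposed as HLS and MPEG-DASH simultaneously; a second stream key under the same application would give a second video the same treatment. The paths below simply reuse the ones from the question.

rtmp {
    server {
        listen 1935;
        chunk_size 4000;

        application show {
            live on;

            # Package the incoming stream as HLS ...
            hls on;
            hls_path /nginx/hls/;
            hls_fragment 3;
            hls_playlist_length 60;

            # ... and as MPEG-DASH at the same time
            dash on;
            dash_path /nginx/dash;

            deny play all;   # viewers fetch the HTTP segments, not the raw RTMP stream
        }
    }
}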
-
Android java.lang.UnsatisfiedLinkError - couldn't find "libffmpeg.so"
9 August 2017, by Achin
I have built this project with Eclipse (https://github.com/youtube/yt-watchme) and it runs fine, but when I try to build the same project in Android Studio I get an error in my Ffmpeg class. I have copied all the files from my working Eclipse demo into my Android Studio project directory. I will post my directory structure and build.gradle; can anyone guide me? Please see below:
Process: com.google.android.apps.watchme, PID: 6330
java.lang.UnsatisfiedLinkError: dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.google.android.apps.watchme-2/base.apk"],nativeLibraryDirectories=[/vendor/lib, /system/lib]]] couldn't find "libffmpeg.so"
at java.lang.Runtime.loadLibrary(Runtime.java:366)
at java.lang.System.loadLibrary(System.java:988)
at com.google.android.apps.watchme.Ffmpeg.<clinit>(Ffmpeg.java:22)
at com.google.android.apps.watchme.VideoStreamingConnection.open(VideoStreamingConnection.java:71)
at com.google.android.apps.watchme.StreamerService.startStreaming(StreamerService.java:73)
at com.google.android.apps.watchme.StreamerActivity.startStreaming(StreamerActivity.java:161)
at com.google.android.apps.watchme.StreamerActivity.access$200(StreamerActivity.java:39)
at com.google.android.apps.watchme.StreamerActivity$1.onServiceConnected(StreamerActivity.java:55)
at android.app.LoadedApk$ServiceDispatcher.doConnected(LoadedApk.java:1208)
at android.app.LoadedApk$ServiceDispatcher$RunConnection.run(LoadedApk.java:1225)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5343)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:905)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:700)
and in the JNI function
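The trace shows that the APK's native library directories do not contain libffmpeg.so for the device's ABI, which usually means the prebuilt .so files were not packaged by Gradle. A minimal sketch of the usual pattern follows; the class layout and native signature are assumptions for illustration, not the project's exact code. The idea is to copy the .so files into app/src/main/jniLibs/<abi>/ (or point sourceSets.main.jniLibs.srcDirs in build.gradle at the old Eclipse libs folder) and load the library in a static block before any native method is called.

// Hypothetical class for illustration; libffmpeg.so must exist under
// app/src/main/jniLibs/<abi>/ (e.g. armeabi-v7a) for the target device.
public class Ffmpeg {
    static {
        // "ffmpeg" maps to libffmpeg.so; an UnsatisfiedLinkError here means the
        // .so is missing from the APK for this ABI, not that the Java code is wrong.
        System.loadLibrary("ffmpeg");
    }

    // Assumed native entry point implemented in the JNI layer.
    public static native boolean init(int width, int height, int audioSampleRate, String rtmpUrl);
}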