
Media (1)
-
1 000 000 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
Other articles (58)
-
Other interesting software
13 April 2011, by
We don't claim to be the only ones doing what we do, and we certainly don't claim to be the best. We simply try to do it well and to keep getting better.
The following list presents software that does more or less what MediaSPIP does, or that MediaSPIP more or less tries to emulate.
We don't know these projects and haven't tried them, but you can take a look.
Videopress
Website : http://videopress.com/
License : GNU/GPL v2
Source code : (...) -
Customizing by adding your logo, banner or background image
5 September 2013, by
Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is running version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.
On other sites (9718)
-
How to do a precise frame cut with ffmpeg?
25 July 2020, by Menglun Li
I tried to cut multiple big videos into many small pieces. I know I can cut and export the small pieces one by one with "Premiere Elements", but exporting them manually takes a lot of time. I couldn't find a way to do batch exports in "Premiere Elements".


I wrote down the starting and ending points of the small videos from Premiere Elements' timeline viewer, then created a batch file that uses ffmpeg for batch cutting. Below is one of the cutting commands in the batch file.


ffmpeg -i input.mp4 -ss 00:20:45.09 -to 00:23:01.00 -c:v libx264 -c:a aac output5.mp4 



I compared the videos cut by ffmpeg with the videos exported by Premiere Elements. I found that they sometimes do not start and end at the same frame, and the offset is not consistent either. Both the starting and ending frames of the ffmpeg cut can drift forward or backward by 6-9 frames from the frame numbers shown in Premiere Elements.


How do I make sure ffmpeg cuts at the precise frame? Thanks in advance.
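
One possible explanation for the drift (an assumption on my part, since the question does not state the clip's frame rate or confirm the timecode format): Premiere Elements displays timeline positions as hours:minutes:seconds:frames, while ffmpeg reads 00:20:45.09 as 45.09 seconds, i.e. 9 hundredths of a second rather than frame 9. At 30 fps, frame 9 corresponds to 9/30 = 0.300 s, a difference of about 6 frames, which is in the range of the drift described. A sketch of the same cut with the frame part converted to a decimal fraction of a second:

ffmpeg -i input.mp4 -ss 00:20:45.300 -to 00:23:01.000 -c:v libx264 -c:a aac output5.mp4

Since -ss and -to are placed after -i, ffmpeg decodes from the beginning of the file and discards frames up to the cut point, which is slow but frame-accurate when re-encoding as this command does.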


-
How to Forward MJPG Webcam to Virtual Video Device using FFmpeg?
24 August 2020, by physiii
I have a webcam that looks like this:


$ ffmpeg -f v4l2 -list_formats all -i /dev/video0
[video4linux2,v4l2 @ 0x55fe6d48a240] Compressed: mjpeg : Motion-JPEG : 1920x1080 1280x720 640x480 320x240
[video4linux2,v4l2 @ 0x55fe6d48a240] Raw : yuyv422 : YUYV 4:2:2 : 640x480 320x240

$ v4l2-ctl --device /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
 Type: Video Capture

 [0]: 'MJPG' (Motion-JPEG, compressed)
 Size: Discrete 1920x1080
 Interval: Discrete 0.033s (30.000 fps)
 Size: Discrete 1280x720
 Interval: Discrete 0.033s (30.000 fps)
 Size: Discrete 640x480
 Interval: Discrete 0.033s (30.000 fps)
 Size: Discrete 320x240
 Interval: Discrete 0.033s (30.000 fps)
 [1]: 'YUYV' (YUYV 4:2:2)
 Size: Discrete 640x480
 Interval: Discrete 0.033s (30.000 fps)
 Size: Discrete 320x240
 Interval: Discrete 0.033s (30.000 fps)



I created a virtual video device like this:


modprobe v4l2loopback video_nr=20



I tried to forward the stream with ffmpeg using:


$ ffmpeg -pix_fmt yuv420p -f v4l2 -input_format mjpeg -s 1280x720 -i /dev/video0 -f v4l2 /dev/video20
ffmpeg version 4.2.4-1ubuntu0.1 Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
 configuration: --prefix=/usr --extra-version=1ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
 libavutil 56. 31.100 / 56. 31.100
 libavcodec 58. 54.100 / 58. 54.100
 libavformat 58. 29.100 / 58. 29.100
 libavdevice 58. 8.100 / 58. 8.100
 libavfilter 7. 57.100 / 7. 57.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 5.100 / 5. 5.100
 libswresample 3. 5.100 / 3. 5.100
 libpostproc 55. 5.100 / 55. 5.100
[video4linux2,v4l2 @ 0x560fef7a8340] The V4L2 driver changed the video from 1280x720 to 640x480
Input #0, video4linux2,v4l2, from '/dev/video0':
 Duration: N/A, start: 169134.649958, bitrate: 147456 kb/s
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> rawvideo (native))
Press [q] to stop, [?] for help
Output #0, video4linux2,v4l2, to '/dev/video20':
 Metadata:
 encoder : Lavf58.29.100
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, q=2-31, 147456 kb/s, 30 fps, 30 tbn, 30 tbc
 Metadata:
 encoder : Lavc58.54.100 rawvideo
frame= 109 fps= 32 q=-0.0 Lsize=N/A time=00:00:03.63 bitrate=N/A dup=4 drop=0 speed=1.06x



However, I keep ending up with the virtual device looking like this:


$ v4l2-ctl --device /dev/video20 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
 Type: Video Capture

 [0]: 'YUYV' (YUYV 4:2:2)
 Size: Discrete 640x480
 Interval: Discrete 0.033s (30.000 fps)



It appears to forward only the YUYV format, no matter what I specify in the ffmpeg command. How do I forward my webcam using MJPG so I can use the virtual device at higher resolutions?
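
One detail that may matter (a sketch based on the log above, not a confirmed fix): in the command shown, -pix_fmt yuv420p appears before -i, so it acts as an input option and asks the camera for a raw yuv420p stream it does not advertise, which could be why the capture falls back to yuyv422 at 640x480. Keeping the input options limited to the MJPEG request and moving the pixel format to the output side would look like this:

ffmpeg -f v4l2 -input_format mjpeg -s 1280x720 -i /dev/video0 -pix_fmt yuv420p -f v4l2 /dev/video20

With this command ffmpeg still writes decoded raw frames to /dev/video20, so v4l2-ctl will keep listing a raw pixel format on the loopback device; the MJPEG negotiation only affects the capture side, which is what should allow the 1280x720 mode.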


-
FFMPEG Command in Android Failing to Execute
12 September 2020, by Zoe
I'm trying to execute ffmpeg commands through an Android app I'm developing.



I found this post, which has been somewhat useful:
Problems with ffmpeg command line on android



and I downloaded a static build of ffmpeg from here: http://ffmpeg.gusari.org/static/



The problem is that when this code runs



public void merge_video() {

    String[] ffmpegCommand = new String[5];
    ffmpegCommand[0] = "/data/data/com.example.zovideo/ffmpeg";
    ffmpegCommand[1] = "-i";
    ffmpegCommand[2] = "concat:storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4|storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4";
    ffmpegCommand[3] = "copy";
    ffmpegCommand[4] = "storage/emulated/0/DCIM/ZoVideo/Output.mp4";

    try {
        Process ffmpegProcess = new ProcessBuilder(ffmpegCommand).redirectErrorStream(true).start();

        String line;
        BufferedReader reader = new BufferedReader(new InputStreamReader(ffmpegProcess.getInputStream()));
        Log.d(null, "*******Starting FFMPEG");

        while ((line = reader.readLine()) != null) {
            Log.d(null, "***" + line + "***");
        }
        Log.d(null, "****ending FFMPEG****");

    } catch (IOException e) {
        e.printStackTrace();
    }
}




It fails when trying to start the process with



Java.io.IOException: Error running exec(). Command: [/data/data/com.example.zovideo/ffmpeg, -i, concat:storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4|storage/emulated/0/DCIM/Camera/VID30141106_211509.mp4, copy, storage/emulated/0/DCIM/ZoVideo/Output.mp4] Working Directory: null Environment: [ANDROID_ROOT=/system, EMULATED_STORAGE_SOURCE=/mnt/shell/emulated, LOOP_MOUNTPOINT=/mnt/obb, LD_PRELOAD=libsigchain.so, ANDROID_BOOTLOGO=1, EMULATED_STORAGE_TARGET=/storage/emulated, EXTERNAL_STORAGE=/storage/emulated/legacy, SYSTEMSERVERCLASSPATH=/system/framework/services.jar:/system/framework/ethernet-service.jar:/system/framework/wifi-service.jar, ANDROID_SOCKET_zygote=10, PATH=/sbin:/vendor/bin:/system/sbin:/system/bin:/system/xbin, ANDROID_DATA=/data, ANDROID_ASSETS=/system/app, ASEC_MOUNTPOINT=/mnt/asec, BOOTCLASSPATH=/system/framework/core-libart.jar:/system/framework/conscrypt.jar:/system/framework/okhttp.jar:/system/framework/core-junit.jar:/system/framework/bouncycastle.jar:/system/framework/ext.jar:/system/framework/framework.jar:/system/framework/telephony-common.jar:/system/framework/voip-common.jar:/system/framework/ims-common.jar:/system/framework/mms-common.jar:/system/framework/android.policy.jar:/system/framework/apache-xml.jar, ANDROID_PROPERTY_WORKSPACE=9,0, ANDROID_STORAGE=/storage]




I understand from the stackoverflow post I mentioned above that the ffmpeg static build needs to be on my device; otherwise my app cannot use it.



However I'm unsure how to get it in the /data/data/com.example.zovideo folder as needed.



What I have done is download the latest static ffmpeg build from http://ffmpeg.gusari.org/static/ and copy it into my libs/armeabi and libs/armeabi-v7a folders, but this obviously hasn't succeeded in getting it into the data/data folder when my app is installed on my device.
(I feel like I'm being an idiot by copy/pasting the files, but I don't know what else to do. I don't know how to compile it myself. I have compiled ffmpeg using a Roman10 tutorial, but that produces .so files, which from what I understand is not what I need.)



So I'm a little stuck.
Any advice is greatly appreciated. Thanks
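
A minimal sketch of one common way to get the binary into place (assuming the static ffmpeg build is added to the app's assets/ folder; the asset name "ffmpeg" and the helper method below are hypothetical, not taken from the question): copy it from assets into the app's private files directory on first run, mark it executable, and point ProcessBuilder at that copy.

import android.content.Context;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Copies a bundled assets/ffmpeg binary into the app's files directory and marks it executable.
// The returned path can then be used as ffmpegCommand[0] instead of the hard-coded string.
public static File installFfmpeg(Context context) throws IOException {
    File ffmpeg = new File(context.getFilesDir(), "ffmpeg");
    if (!ffmpeg.exists()) {
        InputStream in = context.getAssets().open("ffmpeg");
        OutputStream out = new FileOutputStream(ffmpeg);
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        in.close();
        out.close();
        // Mark the copy as executable so ProcessBuilder can launch it.
        ffmpeg.setExecutable(true, true);
    }
    return ffmpeg;
}

getFilesDir() resolves to /data/data/com.example.zovideo/files for this package, so ffmpegCommand[0] would become the returned path. As a side note, the argument array passes copy on its own; ffmpeg's stream-copy option is spelled -c copy, so that argument will likely need adjusting once the binary launches.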