
Other articles (65)
-
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including:
- critiques of existing features and functions
- articles contributed by developers, administrators, content producers and editors
- screenshots to illustrate the above
- translations of existing documentation into other languages
To contribute, register to the project users' mailing (...)
-
Selection of projects using MediaSPIP
2 May 2011
The examples below are representative of specific uses of MediaSPIP for specific projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...)
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP's development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
On other sites (4035)
-
ffmpeg lib/API to cut videos in java
23 January 2017, by João Paulo Radd
I'm trying to use an FFmpeg library/API for Java to cut a video into several parts by time. I know that I can use Runtime with a command for this, but in my case I need to do it in code through some API/library.
Example command in the terminal / runtime:
ffmpeg -i /tmp/test.mp4 -ss 30 -c copy -to 40 /tmp/outTest.mp4
String url_str = "ffmpeg -i /tmp/" + fileName + ".mp4 -ss " + secStart + " -c copy -to " + secEnd + " /tmp/" + outfile + ".mp4 -y";
System.out.println("url_str: " + url_str);
try {
    Runtime rt = Runtime.getRuntime();
    Process p = rt.exec(url_str);
    // ffmpeg writes its progress and log messages to stderr
    BufferedReader r = new BufferedReader(new InputStreamReader(p.getErrorStream()));
    String s;
    while ((s = r.readLine()) != null) {
        //System.out.println(s); // terminal response
    }
    r.close();
} catch (IOException e) {
    e.printStackTrace();
}

Thus, I have an original video, an initial time in seconds and a final time (optional in some cases), and I create a new video from that part.
I'm trying to use the FFmpeg Java wrapper by Andrew Brampton (net.bramp.ffmpeg) to do the same thing as the example. But if someone knows another API/library, such as FFMPEG-Java (net.sf.ffmpeg_java), that would also be good.
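A minimal sketch of how this cut might look with the net.bramp.ffmpeg wrapper (ffmpeg-cli-wrapper). The binary paths and the builder method names used here (setStartOffset, setDuration, setVideoCodec, setAudioCodec) are assumptions and should be checked against the library's documentation; also note that the command line's -to is an absolute end time, so the sketch converts it to a duration:

import java.util.concurrent.TimeUnit;

import net.bramp.ffmpeg.FFmpeg;
import net.bramp.ffmpeg.FFmpegExecutor;
import net.bramp.ffmpeg.FFprobe;
import net.bramp.ffmpeg.builder.FFmpegBuilder;

public class CutVideo {
    public static void main(String[] args) throws Exception {
        // Paths to the ffmpeg/ffprobe binaries are placeholders; adjust for your system
        FFmpeg ffmpeg = new FFmpeg("/usr/local/bin/ffmpeg");
        FFprobe ffprobe = new FFprobe("/usr/local/bin/ffprobe");

        long secStart = 30;
        long secEnd = 40;

        // Roughly equivalent to: ffmpeg -i /tmp/test.mp4 -ss 30 -c copy -to 40 /tmp/outTest.mp4 -y
        FFmpegBuilder builder = new FFmpegBuilder()
            .setInput("/tmp/test.mp4")
            .overrideOutputFiles(true)                            // like -y
            .addOutput("/tmp/outTest.mp4")
                .setStartOffset(secStart, TimeUnit.SECONDS)       // like -ss
                .setDuration(secEnd - secStart, TimeUnit.SECONDS) // -to expressed as a duration
                .setVideoCodec("copy")                            // like -c copy (no re-encoding)
                .setAudioCodec("copy")
                .done();

        new FFmpegExecutor(ffmpeg, ffprobe).createJob(builder).run();
    }
}

The executor still runs the ffmpeg binary under the hood, so the wrapper mainly replaces the hand-built command string and the manual stream-reading loop.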
-
Bash, Relative Paths, and Mac Automator Fails
30 September 2015, by grahama
This script works great in the Terminal but fails when I run it as a shell script in Automator (Mac). For some reason, Automator doesn't remember the GIF's name and writes the file as *.gif. When run directly in the Terminal, the script correctly converts the movie file to a GIF and moves it to the correct Dropbox location.
I'm trying to run this Automator Mov2Gif script when Apple Motion 5 finishes rendering a movie. Any help is appreciated. Automator is a little touchy.
#!/bin/bash
fps=8
scale=400
palette="/tmp/palette.png"
filters="fps=$fps,scale=$scale:-1:flags=lanczos"
destDIR="/PATH/TO/DROPBOX/DIR"
# Resolve the script's own directory; Automator does not run the script
# from the same working directory that the Terminal does
curDIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
incomingDIR="$curDIR/_incoming"
ffmpeg="$curDIR/ffmpeg/ffmpeg"
# Glob in the incoming directory rather than the current working directory,
# otherwise an unmatched pattern is passed through literally as "*.mov"
for f in "$incomingDIR"/*.mov
do
  fbname=$(basename "$f" .mov)
  fbAbsPath="$incomingDIR/$fbname"
  "$ffmpeg" -v warning -i "$fbAbsPath.mov" -vf "$filters,palettegen" -y "$palette"
  "$ffmpeg" -v warning -i "$fbAbsPath.mov" -i "$palette" -lavfi "$filters [x]; [x][1:v] paletteuse" -y "$fbAbsPath.gif"
  rm "$fbAbsPath.mov"
  mv -f "$fbAbsPath.gif" "$destDIR/$fbname.gif"
done
-
Overlaying video frames using FFmpeg in C++
14 May 2020, by Bruce
I am trying to overlay two videos using FFmpeg in C++. I followed the FFmpeg documentation and used this command in the terminal:

$ ffmpeg -i Right.mov -i Left.mov -filter_complex "[0:v][1:v] overlay=0:0" -c:a copy output.mov

I can implement this functionality through the terminal, but I am trying to achieve the same thing in code.

typedef struct {
    AVFormatContext *fmt_ctx;
    int stream_idx;
    AVRational time_base;
    AVStream *video_stream;
    AVCodecContext *codec_ctx;
    AVCodecContext *pCodecCtxOrig;
    AVCodec *decoder;
    AVPacket *packet;
    AVFrame *av_frame;
    AVFrame *gl_frame;
    AVFrame *out_frame;
    AVStream *pStream;
    struct SwsContext *conv_ctx;
} VideoContext; /* the struct name was cut off in the original snippet; "VideoContext" is a placeholder */

Also, I found some example code, but I am not sure about it:
https://ffmpeg.org/doxygen/2.1/doc_2examples_2filtering_video_8c-example.html
and
https://code5.cn/so/c%2B%2B/2601062

AVFilterContext *buffersink_ctx;
AVFilterContext *buffersrc_ctx;
AVFilterGraph *filter_graph;

How can I implement this functionality in my code?