Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
creating thumbnail from a video using ffmpeg in php exec() returns 1 but returns 0 if command is 'ffmpeg -h'
23 September 2013, by user2807517
I am using Ubuntu. A portion of my code is shown below. When I tried $cmd it returned 1, and when I tried $cmd1 it worked.
$ffmpeg = 'ffmpeg';
$getFromSecond = 5;
$videoFile = $_FILES['file']['tmp_name'];
$size = '105x73';
$imageFile = 'Newimage.jpg';
$cmd = "$ffmpeg -an -ss $getFromSecond -i $videoFile -vframes 1 -s $size $imageFile";
$cmd1 = "$ffmpeg -h";
It works for $cmd1 but not for $cmd; I am thinking it is a permissions issue.
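A common reason exec() returns 1 here while 'ffmpeg -h' succeeds is that the uploaded temp path or output path contains characters the shell mangles, and ffmpeg's actual error message goes to stderr, which exec() does not show unless the command redirects it with 2>&1. The idea is language-neutral; below is a minimal sketch in Python (the helper name and the sample paths are made up for illustration) of building the same command with every path shell-quoted:

```python
import shlex

def build_thumbnail_cmd(video_file, image_file, second=5, size="105x73"):
    """Build the ffmpeg thumbnail command with each path shell-quoted.
    An unquoted temp path (spaces, parentheses, other shell
    metacharacters) can make the command exit 1 even though a bare
    'ffmpeg -h' succeeds."""
    return "ffmpeg -an -ss {} -i {} -vframes 1 -s {} {}".format(
        second, shlex.quote(video_file), size, shlex.quote(image_file))

# A path with a space and parentheses survives as one shell word:
print(build_thumbnail_cmd("/tmp/php Upload(1).mp4", "Newimage.jpg"))
```

In the PHP code itself, the equivalent fix is wrapping $videoFile and $imageFile in escapeshellarg(), and appending 2>&1 to $cmd so exec()'s $output array contains ffmpeg's diagnostic text.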
-
Passing FFmpeg encoder output to live 555 media server
23 September 2013, by Ashutosh
I need help transmitting the encoded output of the video encoder to a live555 media server.
ret = avcodec_encode_video2(c, &pkt, (AVFrame*)&sc_dst_data, &gotim);
source = ByteStreamMemoryBufferSource::createNew(*env, pkt.data, ret);
if (source == NULL) {}
videoES = source;
video_source = H264VideoStreamFramer::createNew(*env, videoES);
video_sink->startPlaying(*video_source, afterPlaying, video_sink);
The above code doesn't transmit the stream properly; it breaks. Any insights on how to do proper streaming with FFmpeg? Please help.
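One thing worth checking: live555's H264VideoStreamFramer expects an Annex B byte stream, i.e. NAL units prefixed by 00 00 01 or 00 00 00 01 start codes, while the packets produced by avcodec_encode_video2 may or may not be framed that way depending on encoder settings. As an illustrative aid (Python, not live555 code), a sketch of splitting a buffer into NAL units to verify how the encoder output is framed:

```python
def split_annexb_nals(buf: bytes):
    """Split an H.264 Annex B byte stream into NAL units.
    Start codes are 00 00 01 or 00 00 00 01; live555's
    H264VideoStreamFramer expects its input framed this way.
    If this finds no start codes, the encoder output is not Annex B."""
    nals, i, start = [], 0, None
    while i < len(buf) - 2:
        if buf[i] == 0 and buf[i + 1] == 0 and (
                buf[i + 2] == 1 or
                (i + 3 < len(buf) and buf[i + 2] == 0 and buf[i + 3] == 1)):
            code_len = 3 if buf[i + 2] == 1 else 4
            if start is not None:
                nals.append(buf[start:i])   # close the previous NAL
            i += code_len
            start = i                       # payload begins after the code
        else:
            i += 1
    if start is not None:
        nals.append(buf[start:])            # last NAL runs to end of buffer
    return nals
```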
-
ffmpeg - set metatag to .ts file
23 September 2013, by Febin
I have a .mp4 video that was recorded on an iPhone 4S. This video file contains 'Rotate - 180' metadata.
When I convert the .mp4 file to .ts using ffmpeg, I lose the 'Rotate' meta tag.
The ffmpeg command that I have used is given below.
ffmpeg -i input_file.mp4 -vcodec copy -acodec copy -vbsf h264_mp4toannexb output_file.ts
Does anyone know how to set the 'Rotate' metadata on a .ts file?
or
is there any other way to copy all the metadata from the input .mp4 file to the output .ts file?
Thank you
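MPEG-TS has no standard container-level rotation tag comparable to MP4's rotation metadata, so a common workaround is to bake the rotation into the pixels with a filter while transcoding. A hedged sketch (file names are placeholders) that only builds the ffmpeg argument list:

```python
def rotate180_ts_cmd(src="input_file.mp4", dst="output_file.ts"):
    """Build an ffmpeg command that physically rotates the video 180
    degrees while writing a .ts file. Because the frames go through a
    filter, the video must be re-encoded (-vcodec copy cannot be used),
    but then no 'Rotate' tag is needed in the output container."""
    return [
        "ffmpeg", "-i", src,
        "-vf", "hflip,vflip",   # horizontal + vertical flip = 180 degrees
        "-acodec", "copy",      # audio can still be stream-copied
        dst,
    ]
```

The trade-off is a re-encode of the video stream; if a lossless copy is required, the rotation hint has to be handled by the player instead.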
-
Android FFMPEG : Could not execute the ffmpeg from Java code
23 September 2013, by Raj
I am working on an Android app where I want to create a video from a list of static images. After doing some searching on the internet, I realized that using "FFMPEG" is the way to get this done. So I got hold of this site: https://github.com/guardianproject/android-ffmpeg-java from where I downloaded the C library and the Java wrapper. I was able to compile the C library - of course not the way the instructions were laid out - but I was still able to get the "ffmpeg" executable under the /external/android-ffmpeg/ffmpeg directory. I copied that executable into my current directory and then copied it to a directory under Android where my app can access it. Then I called the provided Java wrapper, but I am seeing some errors in the log file, as follows:
08-13 11:55:37.848: D/FFMPEG(29598): /data/data/com.sample/app_bin/ffmpeg -y -loop 1 -i /storage/emulated/0/usersnapshot/ffmpeg/image%03d.jpg -r 25 -t 2 -qscale 5 /storage/emulated/0/video/snapshot-video.mp4
08-13 11:55:37.898: I/ShellCallback : shellOut()(29598): /data/data/com.sample/app_bin/ffmpeg[1]: syntax error: '(' unexpected
08-13 11:55:37.938: I/ShellCallback : processComplete()(29598): 1
And following is the code snippet (where targetDirectoryForFFMPEG = directory where the images are stored):
FfmpegController ffmpegController = new FfmpegController(this, targetDirectoryForFFMPEG);
String out = videoOutPutFile.getPath();
MediaDesc mediaIn = new MediaDesc();
mediaIn.path = targetDirectoryForFFMPEG + "/image%03d.jpg";
mediaIn.videoFps = "25";
ffmpegController.convertImageToMP4(mediaIn, 2, out, new ShellCallback() {
    @Override
    public void shellOut(String shellLine) {
        Log.i("ShellCallback : shellOut()", shellLine);
    }
    @Override
    public void processComplete(int exitValue) {
        Log.i("ShellCallback : processComplete()", exitValue + "");
    }
});
Has anybody implemented this before? If yes, can you point me to what I am doing incorrectly? I will provide more information if needed.
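A shell message like "syntax error: '(' unexpected" when launching a binary often means the executable was built for the wrong CPU architecture, so the shell falls back to interpreting the ELF bytes as a script (e.g. an x86-built ffmpeg pushed to an ARM device). One way to check the pulled binary, sketched in Python (a hypothetical diagnostic helper, not part of the wrapper project):

```python
import struct

# ELF e_machine values of interest, per the ELF specification
EM_386, EM_ARM, EM_X86_64, EM_AARCH64 = 3, 40, 62, 183

def elf_machine(path):
    """Return the ELF e_machine value of a binary, or None if the file
    is not ELF at all. Comparing this against the device's CPU reveals
    a wrong-architecture ffmpeg build."""
    with open(path, "rb") as f:
        header = f.read(20)
    if header[:4] != b"\x7fELF":
        return None
    # e_machine is a 16-bit field at offset 18 (little-endian ELF)
    return struct.unpack_from("<H", header, 18)[0]
```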
-
How can I mux H.264 RTP output into a container using FFMPEG ?
23 September 2013, by Grad
I am working on the effects of network losses in video transmission. To simulate the network losses, I use a simple program that drops random RTP packets from the output of the H.264 RTP encoder.
I use the Joint Model (JM) 14.2 reference software to encode the video. However, I don't use the Annex B format as my output; instead, I choose RTP packets as the output. The JM output is generated as a sequence of RTP packets, each with an RTP header and payload. After that, some of the RTP packets are dropped by the simple program. Then I decode the output, also using JM and its error-concealment methods. That gives me a YUV file as output. The format of the RTP bitstream is as follows:
----------------------------------------------------------------------
| RTP Header #1 | RTP Payload #1 | RTP Header #2 | RTP Payload #2 |...
----------------------------------------------------------------------
I want to run a subjective test with these bitstreams, and it's very inconvenient to crowdsource this subjective test with GBs of video data. So, I want to mux these bitstreams into a container (e.g. AVI) using FFMPEG. I have tried to decode these bitstreams with FFMPEG and FFPLAY; however, neither of them worked. I also tried the following command, and it didn't work either.
ffmpeg -f h264 -i -vcodec copy -r 25 out.avi
Which format or muxer should I use? Do I need to convert these files to any other format?
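FFmpeg's raw h264 demuxer expects an Annex B byte stream, not a file of concatenated RTP packets, so the RTP framing has to be stripped first. A heavily hedged preprocessing sketch follows; the framing assumptions are spelled out in the docstring and must be verified against the actual JM output before relying on it:

```python
import struct

def rtp_dump_to_annexb(data: bytes) -> bytes:
    """Convert a file of concatenated RTP packets into an Annex B H.264
    stream that 'ffmpeg -f h264 -i <file>' can read.

    Assumptions (verify against your JM output): each packet is stored
    as a 4-byte little-endian length followed by that many bytes, the
    RTP header is the fixed 12 bytes (no CSRCs or extensions), and each
    payload is one complete NAL unit (no FU-A fragmentation).
    """
    out, pos = bytearray(), 0
    while pos + 4 <= len(data):
        (plen,) = struct.unpack_from("<I", data, pos)
        pos += 4
        packet = data[pos:pos + plen]
        pos += plen
        if len(packet) <= 12:          # too short to hold any payload
            continue
        out += b"\x00\x00\x00\x01"     # Annex B start code
        out += packet[12:]             # drop the 12-byte RTP header
    return bytes(out)
```

After this conversion, remuxing the result with ffmpeg into a container becomes feasible; note that AVI is a poor fit for H.264 streams with losses, and a format such as MP4 or MKV may tolerate the concealed stream better.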