Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
FFMpeg giving me execution error
28 March 2017, by Richard Mcfriend Oluwamuyiwa
I am trying to process a video file using php-ffmpeg, but I keep getting this error:
'/usr/local/bin/ffmpeg' '-y' '-i' '/home/kwilistc/public_html/contents/videos/1490719990_MP4_360p_Short_video_clip_nature_mp4.mp4' '-async' '1' '-metadata:s:v:0' 'start_time=0' '-vcodec' 'libx264' '-acodec' 'libmp3lame' '-b:v' '128k' '-refs' '6' '-coder' '1' '-sc_threshold' '40' '-flags' '+loop' '-me_range' '16' '-subq' '7' '-i_qfactor' '0.71' '-qcomp' '0.6' '-qdiff' '4' '-trellis' '1' '-b:a' '8k' '-ac' '1' '-pass' '1' '-passlogfile' '/tmp/ffmpeg-passes58daa6f9c79c7owdke/pass-58daa6f9c810a' '/home/kwilistc/public_html/contents/videos/1490719990_MP4_360p_Short_video_clip_nature_mp4_22995.mp4''
This is my code:
$ffmpeg = FFMpeg\FFMpeg::create([
    'timeout' => 3600,
    'ffmpeg.threads' => 12,
    'ffmpeg.binaries' => '/usr/local/bin/ffmpeg',
    'ffprobe.binaries' => '/usr/local/bin/ffprobe'
]);
$ffprobe_prep = FFMpeg\FFProbe::create([
    'ffmpeg.binaries' => '/usr/local/bin/ffmpeg',
    'ffprobe.binaries' => '/usr/local/bin/ffprobe'
]);
$ffprobe = $ffprobe_prep->format($video_file);
$video = $ffmpeg->open($video_file);

// Get video duration to ensure our videos are never longer than our video limit.
$duration = $ffprobe->get('duration');

// Use mp4 format and set the audio bitrate to 8Kbit and Mono channel.
// TODO: Try stereo later...
$format = new FFMpeg\Format\Video\X264('libmp3lame', 'libx264');
$format->setKiloBitrate(128)
    ->setAudioChannels(1)
    ->setAudioKiloBitrate(8);

$first = $ffprobe_prep->streams($video_file)->videos()->first();
$width = $first->get('width');

if ($width > VIDEO_WIDTH) {
    // Resize to VIDEO_WIDTH, keeping a 16:9 height.
    $video->filters()->resize(
        new FFMpeg\Coordinate\Dimension(VIDEO_WIDTH, ceil(VIDEO_WIDTH / 16 * 9))
    );
}

// Trim videos longer than the maximum playtime.
if ($duration > MAX_VIDEO_PLAYTIME) {
    $video->filters()->clip(
        FFMpeg\Coordinate\TimeCode::fromSeconds(0),
        FFMpeg\Coordinate\TimeCode::fromSeconds(MAX_VIDEO_PLAYTIME)
    );
}

// Change the framerate to 16 fps with a GOP of 9.
$video->filters()->framerate(new FFMpeg\Coordinate\FrameRate(16), 9);

// Synchronize audio and video.
$video->filters()->synchronize();

$video->save($format, $video_file_new_2);
Why am I getting this error, and how do I solve it?
Note: I have already contacted my host, but they haven't been much help.
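php-ffmpeg's execution error quotes only the command it ran; ffmpeg's own stderr, which says why the pass actually failed (a build without libmp3lame, or no write permission on the /tmp pass-log directory, are common culprits), is not shown. A first diagnostic step is to re-run the logged command by hand. A small Python sketch, using shlex to turn the quoted command string from the error into an argv list (the command below is a shortened stand-in, not the full one from the error):

```python
import shlex

# Shortened stand-in for the quoted command string from the error message.
logged = ("'/usr/local/bin/ffmpeg' '-y' '-i' 'input.mp4' "
          "'-pass' '1' '-passlogfile' '/tmp/passes/pass-1' 'output.mp4'")

# shlex.split() strips the single quotes around each argument, yielding an
# argv list that subprocess.run(argv, capture_output=True) could execute
# to surface ffmpeg's real error output.
argv = shlex.split(logged)
```

Running the resulting argv with subprocess on the server should print the underlying ffmpeg error that php-ffmpeg swallows.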
-
Grab an exact image at an exact time in an FFMpeg video?
28 March 2017, by hermt2
I am trying to build a project in Java that uses this library: https://github.com/bytedeco/javacv
I want to use the FFmpegFrameGrabber to get an exact frame of a video, but the only method I see is
FFmpegFrameGrabber.grab()
. How does this know which frame to extract when it does not accept an input? I feel like there should be a way to specify a time in milliseconds or a frame number, but I'm not sure. If you have any ideas I'd love your input.
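Recent javacv versions also expose seeking on FFmpegFrameGrabber, e.g. setTimestamp(long) in microseconds and setFrameNumber(int), called before grab() (worth verifying against the javacv version in use). The unit conversions those calls need are easy to get wrong; a sketch of the arithmetic in Python:

```python
def ms_to_timestamp_us(milliseconds):
    """Milliseconds to the microsecond timestamps used by seek calls
    such as FFmpegFrameGrabber.setTimestamp()."""
    return int(milliseconds) * 1000

def frame_to_timestamp_us(frame_number, fps):
    """Frame index to a microsecond timestamp, assuming a constant
    frame rate (variable frame rate needs timestamp-based seeking)."""
    return round(frame_number / fps * 1_000_000)
```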
-
Error while running ffmpeg
28 March 2017, by user4543816
I'm trying to convert a video file to images:
ffmpeg -i XXXX.mp4 -r 1/1 $XX%03d.bmp
and I get the following error
video:121501kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Any help will be appreciated.
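The quoted line is not itself an error: it is the summary ffmpeg prints at the end of every run, and the real failure message, if any, appears earlier in the output. One thing worth checking is the output pattern: ffmpeg only expands the %03d counter, while the leading $XX would be expanded (or emptied) by the shell first. A small Python model of how ffmpeg's image2 muxer numbers its output files:

```python
def output_names(prefix, count):
    """Expand a pattern like 'XX%03d.bmp' the way ffmpeg's image2 muxer
    does: a 1-based, zero-padded counter per written frame."""
    return [f"{prefix}{n:03d}.bmp" for n in range(1, count + 1)]
```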
-
iOS multiple video display
28 March 2017, by Cuevas
I'm currently doing an iOS project that uses IJKPlayer, which is based on FFmpeg and SDL, to display an RTSP feed from a certain source. I have no problem displaying a single video feed, but my project requires me to display multiple streams on the screen simultaneously. The problem I'm facing right now is separating each of the streams and displaying them on n instances of the player.
RTSP -> stream 0, stream 1, stream 2, stream 4 -> display
Here is a sample output I want to achieve. Each color represents a single stream. Thanks!
Edit: If this is not possible on IJKPlayer, can someone recommend a different approach on how to implement this?
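If IJKPlayer cannot split one RTSP session across several player instances, a common workaround (an assumption here, not a documented IJKPlayer feature) is to put an ffmpeg relay in front of it and use -map to route each stream of the single input to its own output, which each player then opens separately. A sketch of building such a command line (the URL and output names are hypothetical):

```python
def split_streams_cmd(rtsp_url, stream_indices, out_prefix):
    """Build an ffmpeg argv that copies each selected video stream of
    one RTSP input into its own output (e.g. '-map 0:v:1' selects the
    second video stream of input 0)."""
    cmd = ["ffmpeg", "-i", rtsp_url]
    for i in stream_indices:
        cmd += ["-map", f"0:v:{i}", "-c", "copy", f"{out_prefix}{i}.ts"]
    return cmd

cmd = split_streams_cmd("rtsp://example.local/feed", [0, 1, 2], "stream")
```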
-
NV12 textures not working in DirectX 11.1
28 March 2017, by André Vitor
I’m trying to render NV12 textures from frames decoded with ffmpeg 2.8.11 using DirectX 11.1, but when I render them the texture is broken and the color is always off.
Result is: http://imgur.com/a/YIVQk
The code below shows how I take the frame decoded by ffmpeg, which is in YUV420P format, and convert it (I'm not sure this is correct) to NV12 format by interleaving the U and V planes.
static uint8_t *pixelsPtr_ = nullptr;

UINT rowPitch = ((width + 1) >> 1) * 2;
UINT imageSize = (rowPitch * height) + ((rowPitch * height + 1) >> 1);

if (!pixelsPtr_)
{
    pixelsPtr_ = new uint8_t[imageSize];
}

int j, position = 0;

uint32_t pitchY = avFrame.linesize[0];
uint32_t pitchU = avFrame.linesize[1];
uint32_t pitchV = avFrame.linesize[2];

uint8_t *avY = avFrame.data[0];
uint8_t *avU = avFrame.data[1];
uint8_t *avV = avFrame.data[2];

::SecureZeroMemory(pixelsPtr_, imageSize);

// Copy the Y plane row by row, skipping any linesize padding.
for (j = 0; j < height; j++)
{
    ::CopyMemory(pixelsPtr_ + position, avY, width);
    position += width;
    avY += pitchY;
}

// Copy one U row, then one V row, for each half-height row.
for (j = 0; j < height >> 1; j++)
{
    ::CopyMemory(pixelsPtr_ + position, avU, width >> 1);
    position += width >> 1;
    avU += pitchU;

    ::CopyMemory(pixelsPtr_ + position, avV, width >> 1);
    position += width >> 1;
    avV += pitchV;
}
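One thing to note about the copy above: it appends whole U rows and whole V rows alternately, whereas NV12 interleaves U and V per byte (UVUVUV...) within each chroma row. A small Python model of a YUV420P-to-NV12 pack, testable on toy planes (it assumes linesize equals width, i.e. no row padding):

```python
def yuv420p_to_nv12(y, u, v, width, height):
    """Pack planar YUV420P into an NV12 buffer: the full Y plane first,
    then one half-height plane whose rows alternate U and V bytes."""
    out = list(y)  # Y plane is copied unchanged
    half_w, half_h = width // 2, height // 2
    for row in range(half_h):
        for col in range(half_w):
            out.append(u[row * half_w + col])
            out.append(v[row * half_w + col])
    return out
```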
This is how I’m creating the Texture2D with the data I just got.
// Create texture
D3D11_TEXTURE2D_DESC desc;
desc.Width = width;
desc.Height = height;
desc.MipLevels = 1;
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_NV12;
desc.SampleDesc.Count = 1;
desc.SampleDesc.Quality = 0;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
desc.CPUAccessFlags = 0;
desc.MiscFlags = 0;

D3D11_SUBRESOURCE_DATA initData;
initData.pSysMem = pixelsPtr_;
initData.SysMemPitch = rowPitch;

ID3D11Texture2D* tex = nullptr;
hr = d3dDevice->CreateTexture2D(&desc, &initData, &tex);

if (SUCCEEDED(hr) && tex != 0)
{
    D3D11_SHADER_RESOURCE_VIEW_DESC SRVDesc;
    memset(&SRVDesc, 0, sizeof(SRVDesc));
    SRVDesc.Format = DXGI_FORMAT_R8_UNORM;
    SRVDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    SRVDesc.Texture2D.MipLevels = 1;

    hr = d3dDevice->CreateShaderResourceView(tex, &SRVDesc, &textureViewYUV[0]);
    if (FAILED(hr))
    {
        tex->Release();
        return hr;
    }

    SRVDesc.Format = DXGI_FORMAT_R8G8_UNORM;
    hr = d3dDevice->CreateShaderResourceView(tex, &SRVDesc, &textureViewYUV[1]);
    if (FAILED(hr))
    {
        tex->Release();
        return hr;
    }

    tex->Release();
}
Then I pass both Shader Resource View to Pixel Shader
graphics->Context()->PSSetShaderResources(0, 2, textureViewYUV);
This is the pixel shader:
struct PixelShaderInput
{
    float4 pos : SV_POSITION;
    float4 Color : COLOR;
    float2 texCoord : TEXCOORD;
};

static const float3x3 YUVtoRGBCoeffMatrix =
{
    1.164383f,  1.164383f,  1.164383f,
    0.000000f, -0.391762f,  2.017232f,
    1.596027f, -0.812968f,  0.000000f
};

Texture2D<float> luminanceChannel;
Texture2D<float2> chrominanceChannel;

SamplerState linearfilter
{
    Filter = MIN_MAG_MIP_LINEAR;
};

float3 ConvertYUVtoRGB(float3 yuv)
{
    // Derived from https://msdn.microsoft.com/en-us/library/windows/desktop/dd206750(v=vs.85).aspx
    // Section: Converting 8-bit YUV to RGB888

    // These values are calculated from (16 / 255) and (128 / 255)
    yuv -= float3(0.062745f, 0.501960f, 0.501960f);
    yuv = mul(yuv, YUVtoRGBCoeffMatrix);

    return saturate(yuv);
}

float4 main(PixelShaderInput input) : SV_TARGET
{
    float y = luminanceChannel.Sample(linearfilter, input.texCoord);
    float2 uv = chrominanceChannel.Sample(linearfilter, input.texCoord);

    float3 RGB = ConvertYUVtoRGB(float3(y, uv.x, uv.y));

    return float4(RGB.x, RGB.y, RGB.z, 1);
}

Can someone help me? What am I doing wrong?
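The constants in the shader are the standard BT.601 video-range conversion (1.164383 is 255/219). A Python model of the same offset-then-matrix math can be used to sanity-check the shader against pixels with known values:

```python
def yuv_to_rgb(y, u, v):
    """BT.601 video-range YUV (all components normalized to 0..1) to
    RGB, mirroring the shader: subtract the 16/255 and 128/255 offsets,
    apply the coefficient matrix, clamp like saturate()."""
    y -= 16.0 / 255.0
    u -= 128.0 / 255.0
    v -= 128.0 / 255.0
    clamp = lambda x: min(max(x, 0.0), 1.0)
    r = 1.164383 * y + 1.596027 * v
    g = 1.164383 * y - 0.391762 * u - 0.812968 * v
    b = 1.164383 * y + 2.017232 * u
    return clamp(r), clamp(g), clamp(b)
```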
EDIT #1
int skipLineArea = 0;
int uvCount = (height >> 1) * (width >> 1);

for (j = 0, k = 0; j < uvCount; j++, k++)
{
    if (skipLineArea == (width >> 1))
    {
        k += pitchU - (width >> 1);
        skipLineArea = 0;
    }

    pixelsPtr_[position++] = avU[k];
    pixelsPtr_[position++] = avV[k];
    skipLineArea++;
}
EDIT #2
Updating the texture instead of creating new ones
D3D11_MAPPED_SUBRESOURCE mappedResource;
d3dContext->Map(tex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);

uint8_t* mappedData = reinterpret_cast<uint8_t*>(mappedResource.pData);

for (UINT i = 0; i < height * 1.5; ++i)
{
    memcpy(mappedData, frameData, rowPitch);
    mappedData += mappedResource.RowPitch;
    frameData += rowPitch;
}

d3dContext->Unmap(tex, 0);
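The loop copies height * 1.5 rows because NV12 is the full-height Y plane plus a half-height interleaved UV plane; each source row is rowPitch bytes, while the destination must advance by the driver-chosen mappedResource.RowPitch, which may be larger. A small Python model of that pitch-aware copy:

```python
def copy_with_pitch(src, src_pitch, dst_pitch, rows):
    """Copy `rows` rows of `src_pitch` bytes each into a buffer whose
    rows are `dst_pitch` bytes apart (dst_pitch >= src_pitch), padding
    the per-row slack with zeros (real mappings simply ignore it)."""
    dst = []
    for r in range(rows):
        dst.extend(src[r * src_pitch:(r + 1) * src_pitch])
        dst.extend([0] * (dst_pitch - src_pitch))
    return dst
```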