
Media (2)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
-
Carte de Schillerkiez
13 May 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (97)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Multilang: improving the interface for multilingual blocks
18 February 2011, by
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once it is activated, MediaSPIP init automatically sets up a preconfiguration so that the new feature works right away; no separate configuration step is required. -
APPENDIX: Plugins used specifically for the farm
5 March 2010, by
To work properly, the farm's central/master site needs several plugins in addition to those used by the channels: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (11406)
-
Encoding Video Frame Error with VAAPI encoding [on hold]
26 October 2018, by Tcor5
I'm attempting to convert one of the FFmpeg muxing examples (https://ffmpeg.org/doxygen/2.8/muxing_8c.html) to use VAAPI hardware encoding, using https://www.ffmpeg.org/doxygen/trunk/vaapi_transcode_8c-example.html as a reference.
The problem is that when I call avcodec_open2, it returns:
Error encoding video frame: Input/output error
I believe I have set up all the HW contexts and frames appropriately, but it seems the encoding is failing due to an invalid VASurfaceID.
Does anyone have any tips on how to resolve/troubleshoot this? Thanks. (See the sketch after the code below for one possible cause.)
This is the entire shell printout:
[AVHWDeviceContext @ 0x55acd607dd40] Opened VA display via X11 display :0.
[AVHWDeviceContext @ 0x55acd607dd40] libva: VA-API version 1.1.0
[AVHWDeviceContext @ 0x55acd607dd40] libva: va_getDriverName() returns 0
[AVHWDeviceContext @ 0x55acd607dd40] libva: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
[AVHWDeviceContext @ 0x55acd607dd40] libva: Found init function __vaDriverInit_1_1
[AVHWDeviceContext @ 0x55acd607dd40] libva: va_openDriver() returns 0
[AVHWDeviceContext @ 0x55acd607dd40] Initialised VAAPI connection: version 1.1
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x32315659 -> yuv420p.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x30323449 -> yuv420p.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x3231564e -> nv12.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x32595559 -> yuyv422.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x59565955 -> uyvy422.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x48323234 -> yuv422p.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x58424752 -> rgb0.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x58524742 -> bgr0.
[AVHWDeviceContext @ 0x55acd607dd40] Format 0x30313050 -> p010le.
[AVHWDeviceContext @ 0x55acd607dd40] Matched "Intel i965 driver for Intel(R) Skylake - 2.1.0" as known driver "Intel i965 (Quick Sync)".
[AVHWDeviceContext @ 0x55acd609dc40] Opened VA display via X11 display :0.
[AVHWDeviceContext @ 0x55acd609dc40] libva: VA-API version 1.1.0
[AVHWDeviceContext @ 0x55acd609dc40] libva: va_getDriverName() returns 0
[AVHWDeviceContext @ 0x55acd609dc40] libva: Trying to open /usr/lib/x86_64-linux-gnu/dri/i965_drv_video.so
[AVHWDeviceContext @ 0x55acd609dc40] libva: Found init function __vaDriverInit_1_1
[AVHWDeviceContext @ 0x55acd609dc40] libva: va_openDriver() returns 0
[AVHWDeviceContext @ 0x55acd609dc40] Initialised VAAPI connection: version 1.1
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x32315659 -> yuv420p.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x30323449 -> yuv420p.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x3231564e -> nv12.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x32595559 -> yuyv422.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x59565955 -> uyvy422.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x48323234 -> yuv422p.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x58424752 -> rgb0.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x58524742 -> bgr0.
[AVHWDeviceContext @ 0x55acd609dc40] Format 0x30313050 -> p010le.
[AVHWDeviceContext @ 0x55acd609dc40] Matched "Intel i965 driver for Intel(R) Skylake - 2.1.0" as known driver "Intel i965 (Quick Sync)".
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000000.
[AVHWFramesContext @ 0x55acd60bbb00] Direct mapping possible.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000001.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000002.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000003.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000004.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000005.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000006.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000007.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000008.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000009.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x400000a.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x400000b.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x400000c.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x400000d.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x400000e.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x400000f.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000010.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000011.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000012.
[AVHWFramesContext @ 0x55acd60bbb00] Created surface 0x4000013.
[h264_vaapi @ 0x55acd60ba3c0] Using nv12 as format of reconstructed frames.
[AVHWFramesContext @ 0x55acd60bcec0] Created surface 0x4000014.
[AVHWFramesContext @ 0x55acd60bcec0] Direct mapping possible.
[AVHWFramesContext @ 0x55acd60bcec0] Created surface 0x4000015.
[AVHWFramesContext @ 0x55acd60bcec0] Created surface 0x4000016.
[AVHWFramesContext @ 0x55acd60bcec0] Created surface 0x4000017.
[AVHWFramesContext @ 0x55acd60bcec0] Created surface 0x4000018.
[AVHWFramesContext @ 0x55acd60bcec0] Created surface 0x4000019.
[AVHWFramesContext @ 0x55acd60bcec0] Created surface 0x400001a.
[h264_vaapi @ 0x55acd60ba3c0] Using variable-bitrate = 400000 bps.
[h264_vaapi @ 0x55acd60ba3c0] Encode frame: 352x288 (0).
[h264_vaapi @ 0x55acd60ba3c0] Pictures: IDR (0/0)
[h264_vaapi @ 0x55acd60ba3c0] Issuing encode for pic 0/0 as type IDR.
[h264_vaapi @ 0x55acd60ba3c0] No reference pictures.
[h264_vaapi @ 0x55acd60ba3c0] Input surface is 0.
[h264_vaapi @ 0x55acd60ba3c0] Recon surface is 0x400001a.
[h264_vaapi @ 0x55acd60ba3c0] Allocated output buffer 0x8000000
[h264_vaapi @ 0x55acd60ba3c0] Output buffer is 0x8000000.
[h264_vaapi @ 0x55acd60ba3c0] Param buffer (27) is 0x8000001.
[h264_vaapi @ 0x55acd60ba3c0] Param buffer (27) is 0x8000002.
[h264_vaapi @ 0x55acd60ba3c0] Param buffer (27) is 0x8000003.
[h264_vaapi @ 0x55acd60ba3c0] Param buffer (27) is 0x8000004.
[h264_vaapi @ 0x55acd60ba3c0] Param buffer (22) is 0x8000005.
[h264_vaapi @ 0x55acd60ba3c0] Param buffer (23) is 0x8000006.
[h264_vaapi @ 0x55acd60ba3c0] Packed header buffer (1) is 0x8000007/0x8000008 (376 bits).
[h264_vaapi @ 0x55acd60ba3c0] Packed header buffer (4) is 0x8000009/0x800000a (944 bits).
[h264_vaapi @ 0x55acd60ba3c0] Packed header buffer (3) is 0x800000b/0x800000c (72 bits).
[h264_vaapi @ 0x55acd60ba3c0] Param buffer (24) is 0x800000d.
[h264_vaapi @ 0x55acd60ba3c0] Failed to begin picture encode issue: 6 (invalid VASurfaceID).
[h264_vaapi @ 0x55acd60ba3c0] Encode failed: -5.
Error encoding video frame: Input/output error
This is the code I'm using:
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <math.h>
#include <libavutil/avassert.h>
#include <libavutil/channel_layout.h>
#include <libavutil/opt.h>
#include <libavutil/mathematics.h>
#include <libavutil/timestamp.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <libswresample/swresample.h>
//HW Encoding
#include <libavcodec/avcodec.h>
#include <libavutil/pixdesc.h>
#include <libavutil/hwcontext.h>
#define STREAM_DURATION 10.0
#define STREAM_FRAME_RATE 25 /* 25 images/s */
/* default pix_fmt */
#define STREAM_PIX_FMT AV_PIX_FMT_VAAPI
#define SCALE_FLAGS SWS_BICUBIC
#define LINUX_LIVE_STREAM // enables linux live stream
#define MAX_NAME_LEN 256
static AVBufferRef *hw_device_ctx = NULL; //HW Encoding
// a wrapper around a single output AVStream
typedef struct OutputStream {
AVStream *st;
AVCodecContext *enc;
/* pts of the next frame that will be generated */
int64_t next_pts;
int samples_count;
AVFrame *frame;
AVFrame *tmp_frame;
float t, tincr, tincr2;
struct SwsContext *sws_ctx;
struct SwrContext *swr_ctx;
} OutputStream;
static int set_hwframe_ctx(AVCodecContext *c, AVBufferRef *hw_device_ctx)
{
AVBufferRef *hw_frames_ref;
AVHWFramesContext *frames_ctx = NULL;
int err = 0;
if (!(hw_frames_ref = av_hwframe_ctx_alloc(hw_device_ctx))) {
fprintf(stderr, "Failed to create VAAPI frame context.\n");
return -1;
}
frames_ctx = (AVHWFramesContext *)(hw_frames_ref->data);
frames_ctx->format = AV_PIX_FMT_VAAPI;
frames_ctx->sw_format = AV_PIX_FMT_NV12;
frames_ctx->width = 352;
frames_ctx->height = 288;
frames_ctx->initial_pool_size = 20;
if ((err = av_hwframe_ctx_init(hw_frames_ref)) < 0) {
fprintf(stderr, "Failed to initialize VAAPI frame context."
"Error code: %s\n",av_err2str(err));
av_buffer_unref(&hw_frames_ref);
return err;
}
c->hw_frames_ctx = av_buffer_ref(hw_frames_ref);
if (!c->hw_frames_ctx)
err = AVERROR(ENOMEM);
return err;
}
/* Add an output stream. */
static void add_stream(OutputStream *ost, AVFormatContext *oc,
AVCodec **codec,
enum AVCodecID codec_id)
{
AVCodecContext *c;
int i;
int ret; //HW Encoding
const char *enc_name = "h264_vaapi";//HW encoding
//HW Encoding
ret = av_hwdevice_ctx_create(&hw_device_ctx, AV_HWDEVICE_TYPE_VAAPI, NULL, NULL, 0);
if (ret < 0) {
fprintf(stderr, "Failed to create a VAAPI device. Error code: %s\n", av_err2str(ret));
exit(1);
}
*codec = avcodec_find_encoder_by_name(enc_name);//HW encoding codec
if (!(*codec)) {
fprintf(stderr, "Could not find encoder.\n");
exit(1);
}
ost->st = avformat_new_stream(oc, NULL);
if (!ost->st) {
fprintf(stderr, "Could not allocate stream\n");
exit(1);
}
ost->st->id = oc->nb_streams-1;
c = avcodec_alloc_context3(*codec);
if (!c) {
fprintf(stderr, "Could not alloc an encoding context\n");
exit(1);
}
ost->enc = c;
switch ((*codec)->type) {
case AVMEDIA_TYPE_VIDEO:
c->codec_id = codec_id;
c->bit_rate = 400000;
/* Resolution must be a multiple of two. */
c->width = 352;
c->height = 288;
ost->st->time_base = (AVRational){ 1, STREAM_FRAME_RATE };
c->time_base = ost->st->time_base;
c->gop_size = 12;
c->pix_fmt = STREAM_PIX_FMT;
if (c->codec_id == AV_CODEC_ID_MPEG2VIDEO) {
c->max_b_frames = 2;
}
if (c->codec_id == AV_CODEC_ID_MPEG1VIDEO) {
c->mb_decision = 2;
}
break;
default:
break;
}
if ((ret = set_hwframe_ctx(c, hw_device_ctx)) < 0) {
fprintf(stderr, "Failed to set hwframe context.\n");
exit(1);
}
if (oc->oformat->flags & AVFMT_GLOBALHEADER)
c->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
}
/* video output */
static AVFrame *alloc_picture(enum AVPixelFormat pix_fmt, int width, int height)
{
AVFrame *picture;
int ret;
picture = av_frame_alloc();
if (!picture)
return NULL;
picture->format = AV_PIX_FMT_YUV420P;
picture->width = width;
picture->height = height;
printf ("pix_fmt = %d \n", pix_fmt);
printf ("picture ->format (pix_fmt) = %d \n", picture->format);
/* allocate the buffers for the frame data */
ret = av_frame_get_buffer(picture, 32);
if (ret < 0) {
fprintf(stderr, "Could not allocate frame data.\n");
printf ("error : %d \n", ret);
exit(1);
}
return picture;
}
static void open_video(AVFormatContext *oc, AVCodec *codec, OutputStream *ost, AVDictionary *opt_arg)
{
int ret;
AVCodecContext *c = ost->enc;
AVDictionary *opt = NULL;
av_dict_copy(&opt, opt_arg, 0);
/* open the codec */
ret = avcodec_open2(c, codec, &opt);
av_dict_free(&opt);
if (ret < 0) {
fprintf(stderr, "Could not open video codec: %s\n", av_err2str(ret));
exit(1);
}
/* allocate and init a re-usable frame */
printf ("line #%d\n", __LINE__);
ost->frame = alloc_picture(c->pix_fmt, c->width, c->height);
if (!ost->frame) {
fprintf(stderr, "Could not allocate video frame\n");
exit(1);
}
printf ("line #%d\n", __LINE__);
/* If the output format is not YUV420P, then a temporary YUV420P
* picture is needed too. It is then converted to the required
* output format. */
ost->tmp_frame = NULL;
if (c->pix_fmt != AV_PIX_FMT_YUV420P) {
ost->tmp_frame = alloc_picture(AV_PIX_FMT_YUV420P, c->width, c->height);
if (!ost->tmp_frame) {
fprintf(stderr, "Could not allocate temporary picture\n");
exit(1);
}
}
/* copy the stream parameters to the muxer */
ret = avcodec_parameters_from_context(ost->st->codecpar, c);
if (ret < 0) {
fprintf(stderr, "Could not copy the stream parameters\n");
exit(1);
}
}
/* Prepare a dummy image. */
static void fill_yuv_image(AVFrame *pict, int frame_index,
int width, int height)
{
int x, y, i;
i = frame_index;
/* Y */
for (y = 0; y < height; y++)
for (x = 0; x < width; x++)
pict->data[0][y * pict->linesize[0] + x] = x + y + i * 3;
/* Cb and Cr */
for (y = 0; y < height / 2; y++) {
for (x = 0; x < width / 2; x++) {
pict->data[1][y * pict->linesize[1] + x] = 128 + y + i * 2;
pict->data[2][y * pict->linesize[2] + x] = 64 + x + i * 5;
}
}
}
static AVFrame *get_video_frame(OutputStream *ost)
{
AVCodecContext *c = ost->enc;
if (av_compare_ts(ost->next_pts, c->time_base,
STREAM_DURATION, (AVRational){ 1, 1 }) >= 0)
return NULL;
if (av_frame_make_writable(ost->frame) < 0)
exit(1);
if (c->pix_fmt != AV_PIX_FMT_YUV420P) {
/* as we only generate a YUV420P picture, we must convert it
* to the codec pixel format if needed */
if (!ost->sws_ctx) {
ost->sws_ctx = sws_getContext(c->width, c->height,
AV_PIX_FMT_YUV420P,
c->width, c->height,
AV_PIX_FMT_NV12,
SCALE_FLAGS, NULL, NULL, NULL);
if (!ost->sws_ctx) {
fprintf(stderr,
"Could not initialize the conversion context\n");
exit(1);
}
}
printf ("line #%d\n", __LINE__);
fill_yuv_image(ost->tmp_frame, ost->next_pts, c->width, c->height);
printf ("line #%d\n", __LINE__);
sws_scale(ost->sws_ctx, (const uint8_t * const *) ost->tmp_frame->data,
ost->tmp_frame->linesize, 0, c->height, ost->frame->data,
ost->frame->linesize);
printf ("line #%d\n", __LINE__);
} else {
fill_yuv_image(ost->frame, ost->next_pts, c->width, c->height);
}
printf ("line #%d\n", __LINE__);
ost->frame->pts = ost->next_pts++;
return ost->frame;
}
/*
* encode one video frame and send it to the muxer
* return 1 when encoding is finished, 0 otherwise
*/
static int write_video_frame(AVFormatContext *oc, OutputStream *ost)
{
int ret;
AVCodecContext *c;
AVFrame *frame;
int got_packet = 0;
AVPacket pkt = { 0 };
AVStream *st;
printf ("line #%d\n", __LINE__);
c = ost->enc;
printf ("line #%d\n", __LINE__);
frame = get_video_frame(ost);
printf ("line #%d\n", __LINE__);
av_init_packet(&pkt);
/* encode the image */
ret = avcodec_encode_video2(c, &pkt, frame, &got_packet);
if (ret < 0) {
fprintf(stderr, "Error encoding video frame: %s\n", av_err2str(ret));
exit(1);
}
if (got_packet) {
av_packet_rescale_ts(&pkt, c->time_base, ost->st->time_base);
pkt.stream_index = ost->st->index;
ret = av_interleaved_write_frame(oc, &pkt);
} else {
ret = 0;
}
if (ret < 0) {
fprintf(stderr, "Error while writing video frame: %s\n", av_err2str(ret));
exit(1);
}
return (frame || got_packet) ? 0 : 1;
}
static void close_stream(AVFormatContext *oc, OutputStream *ost)
{
avcodec_free_context(&ost->enc);
av_frame_free(&ost->frame);
av_frame_free(&ost->tmp_frame);
sws_freeContext(ost->sws_ctx);
swr_free(&ost->swr_ctx);
}
/**************************************************************/
/* media file output */
int main(int argc, char **argv)
{
OutputStream video_st = { 0 }, audio_st = { 0 };
const char *filename;
AVOutputFormat *fmt;
AVFormatContext *oc;
AVCodec *video_codec, *enc_codec;
int ret;
int have_video = 0;
int encode_video = 0;
AVDictionary *opt = NULL;
int i;
av_log_set_level(AV_LOG_DEBUG); //Debug flag
if (argc < 2) {
printf("usage: %s output_file\n"
"API example program to output a media file with libavformat.\n"
"This program generates a synthetic audio and video stream, encodes and\n"
"muxes them into a file named output_file.\n"
"The output format is automatically guessed according to the file extension.\n"
"Raw images can also be output by using '%%d' in the filename.\n"
"\n", argv[0]);
return 1;
}
filename = argv[1];
for (i = 2; i+1 < argc; i+=2) {
if (!strcmp(argv[i], "-flags") || !strcmp(argv[i], "-fflags"))
av_dict_set(&opt, argv[i]+1, argv[i+1], 0);
}
//HW Encoding
ret = av_hwdevice_ctx_create(&hw_device_ctx, AV_HWDEVICE_TYPE_VAAPI, NULL, NULL, 0);
if (ret < 0) {
fprintf(stderr, "Failed to create a VAAPI device. Error code: %s\n", av_err2str(ret));
return -1;
}
if (!(video_codec = avcodec_find_encoder_by_name("h264_vaapi"))) {
fprintf(stderr, "Could not find encoder h264_vaapi\n");
ret = -1;
}
/* allocate the output media context */
avformat_alloc_output_context2(&oc, NULL, NULL, filename);
if (!oc) {
printf("Could not deduce output format from file extension: using MPEG.\n");
avformat_alloc_output_context2(&oc, NULL, "mpeg", filename);
}
if (!oc)
return 1;
fmt = oc->oformat;
/* Add the audio and video streams using the default format codecs
* and initialize the codecs. */
if (fmt->video_codec != AV_CODEC_ID_NONE) {
add_stream(&video_st, oc, &video_codec, fmt->video_codec);
have_video = 1;
encode_video = 1;
}
/* Now that all the parameters are set, we can open the audio and
* video codecs and allocate the necessary encode buffers. */
if (have_video)
open_video(oc, video_codec, &video_st, opt);
av_dump_format(oc, 0, filename, 1);
/* open the output file, if needed */
if (!(fmt->flags & AVFMT_NOFILE)) {
ret = avio_open(&oc->pb, filename, AVIO_FLAG_WRITE);
if (ret < 0) {
fprintf(stderr, "Could not open '%s': %s\n", filename,
av_err2str(ret));
return 1;
}
}
/* Write the stream header, if any. */
ret = avformat_write_header(oc, &opt);
if (ret < 0) {
fprintf(stderr, "Error occurred when opening output file: %s\n",
av_err2str(ret));
return 1;
}
printf ("line #%d\n", __LINE__);
while (encode_video) {
/* select the stream to encode */
if (encode_video &&
(av_compare_ts(video_st.next_pts, video_st.enc->time_base,
video_st.next_pts, video_st.enc->time_base) <= 0)) {
printf ("line #%d\n", __LINE__);
encode_video = !write_video_frame(oc, &video_st);
printf ("line #%d\n", __LINE__);
}
}
av_write_trailer(oc);
/* Close each codec. */
if (have_video)
close_stream(oc, &video_st);
if (!(fmt->flags & AVFMT_NOFILE))
/* Close the output file. */
avio_closep(&oc->pb);
/* free the stream */
avformat_free_context(oc);
av_buffer_unref(&hw_device_ctx);
return 0;
}
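The "Input surface is 0" line in the log above suggests the frame handed to the encoder is still a plain software frame rather than a VAAPI surface: the muxing example generates its pictures in system memory, while h264_vaapi expects frames whose data[3] carries a VASurfaceID taken from the encoder's hw_frames_ctx. A minimal sketch of the upload step that appears to be missing (upload_to_hw is an illustrative name, not part of the original code; a suggestion, not a verified fix):
    /* Hypothetical helper: copy a filled software frame into a VAAPI surface
     * taken from the frame pool created in set_hwframe_ctx(). */
    static AVFrame *upload_to_hw(AVCodecContext *enc, AVFrame *sw_frame)
    {
        AVFrame *hw_frame = av_frame_alloc();
        if (!hw_frame)
            return NULL;
        /* Take an empty VAAPI surface from the pool attached to the encoder. */
        if (av_hwframe_get_buffer(enc->hw_frames_ctx, hw_frame, 0) < 0) {
            av_frame_free(&hw_frame);
            return NULL;
        }
        /* Copy the software frame's pixel data into that surface. */
        if (av_hwframe_transfer_data(hw_frame, sw_frame, 0) < 0) {
            av_frame_free(&hw_frame);
            return NULL;
        }
        hw_frame->pts = sw_frame->pts;
        return hw_frame; /* pass this to the encoder instead of the software frame */
    }
In write_video_frame(), the frame returned by get_video_frame() would then be uploaded this way before the avcodec_encode_video2() call. Note that av_hwframe_transfer_data() generally wants the software frame in the frames context's sw_format (NV12 here), whereas alloc_picture() currently forces AV_PIX_FMT_YUV420P.
-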
stdout to textarea from ffmpeg command
21 September 2014, by Brett
Hi, I'm trying to get the output of an ffmpeg command into a text area. Here is my code; I'm posting the lot as I'm brand new to Java and not sure where I've gone wrong. I want to run the command, have the progress bar update, and have the output show in a text area. Any help would be appreciated. (See the sketch after the code below for a background-worker approach.)
package MyPackage;
import java.util.*;
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.logging.Level;
import java.util.logging.Logger;
/**
*
* @author brett
*/
public class NumberAdditionUI extends javax.swing.JFrame {
private static final long serialVersionUID = 1L;
/**
* Creates new form NumberAdditionUI
*/
public NumberAdditionUI() {
initComponents();
}
/**
* This method is called from within the constructor to initialize the form.
* WARNING: Do NOT modify this code. The content of this method is always
* regenerated by the Form Editor.
*/
@SuppressWarnings("unchecked")
//
private void initComponents() {
db = new javax.swing.JFileChooser();
jButton1 = new javax.swing.JButton();
jButton2 = new javax.swing.JButton();
jTextField1 = new javax.swing.JTextField();
jButton3 = new javax.swing.JButton();
jLabel1 = new javax.swing.JLabel();
jProgressBar1 = new javax.swing.JProgressBar();
jScrollPane1 = new javax.swing.JScrollPane();
jTextArea1 = new javax.swing.JTextArea();
db.setBackground(java.awt.Color.white);
db.setCurrentDirectory(new java.io.File("C:\\Users\\brett\\Documents\\convert"));
db.setDialogTitle("grabAFile");
setDefaultCloseOperation(javax.swing.WindowConstants.EXIT_ON_CLOSE);
setTitle("Smoke Goobies");
setMaximumSize(getPreferredSize());
jButton1.setText("Exit");
jButton1.setHorizontalAlignment(javax.swing.SwingConstants.RIGHT);
jButton1.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton1ActionPerformed(evt);
}
});
jButton2.setText("Browse");
jButton2.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton2ActionPerformed(evt);
}
});
jTextField1.setText("Select A File To Covert");
jButton3.setText("Run This Puppy");
jButton3.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
jButton3ActionPerformed(evt);
}
});
jLabel1.setFont(new java.awt.Font("Goudy Old Style", 1, 56)); // NOI18N
jLabel1.setIcon(jLabel1.getIcon());
jLabel1.setText(" MASHiTuP");
jProgressBar1.setValue(50);
jProgressBar1.setBorder(new javax.swing.border.MatteBorder(null));
jTextArea1.setColumns(20);
jTextArea1.setRows(5);
jTextArea1.setAutoscrolls(false);
jScrollPane1.setViewportView(jTextArea1);
javax.swing.GroupLayout layout = new javax.swing.GroupLayout(getContentPane());
getContentPane().setLayout(layout);
layout.setHorizontalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGroup(layout.createSequentialGroup()
.addContainerGap()
.addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addComponent(jProgressBar1, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE)
.addGroup(layout.createSequentialGroup()
.addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING, false)
.addComponent(jLabel1, javax.swing.GroupLayout.PREFERRED_SIZE, 355, javax.swing.GroupLayout.PREFERRED_SIZE)
.addGroup(layout.createSequentialGroup()
.addComponent(jTextField1, javax.swing.GroupLayout.PREFERRED_SIZE, 282, javax.swing.GroupLayout.PREFERRED_SIZE)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED)
.addComponent(jButton2, javax.swing.GroupLayout.PREFERRED_SIZE, 77, javax.swing.GroupLayout.PREFERRED_SIZE))
.addGroup(layout.createSequentialGroup()
.addComponent(jButton3)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE)
.addComponent(jButton1)))
.addGap(0, 0, Short.MAX_VALUE))
.addComponent(jScrollPane1))
.addContainerGap())
);
layout.setVerticalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGroup(javax.swing.GroupLayout.Alignment.TRAILING, layout.createSequentialGroup()
.addContainerGap()
.addComponent(jLabel1, javax.swing.GroupLayout.PREFERRED_SIZE, 73, javax.swing.GroupLayout.PREFERRED_SIZE)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED)
.addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.BASELINE)
.addComponent(jButton2)
.addComponent(jTextField1, javax.swing.GroupLayout.PREFERRED_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.PREFERRED_SIZE))
.addGap(34, 34, 34)
.addComponent(jProgressBar1, javax.swing.GroupLayout.PREFERRED_SIZE, 24, javax.swing.GroupLayout.PREFERRED_SIZE)
.addGap(18, 18, 18)
.addComponent(jScrollPane1, javax.swing.GroupLayout.PREFERRED_SIZE, 84, javax.swing.GroupLayout.PREFERRED_SIZE)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED, 48, Short.MAX_VALUE)
.addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.BASELINE)
.addComponent(jButton1)
.addComponent(jButton3))
.addContainerGap())
);
jProgressBar1.getAccessibleContext().setAccessibleName("MYsTATUS");
pack();
}//
private void jButton1ActionPerformed(java.awt.event.ActionEvent evt) {
System.exit(0);
}
private void jButton2ActionPerformed(java.awt.event.ActionEvent evt) {
int returnVal = db.showOpenDialog( this );
File f = db.getSelectedFile();
String filename = f.getAbsolutePath();
jTextField1.setText(filename);
}
private void jButton3ActionPerformed(java.awt.event.ActionEvent evt) {
java.lang.Runtime rt = java.lang.Runtime.getRuntime();
java.lang.Process proc = null;
try {
proc = rt.exec("ipconfig");
//proc = rt.exec("ffmpeg -i C:\\Users\\brett\\Documents\\MASH_02.avi C:\\Users\\brett\\Documents\\mash09.avi");
} catch (IOException ex) {
Logger.getLogger(NumberAdditionUI.class.getName()).log(Level.SEVERE, null, ex);
}
BufferedReader stdInput = new BufferedReader(new InputStreamReader(proc.getInputStream()));
BufferedReader stdError = new BufferedReader(new InputStreamReader(proc.getErrorStream()));
// read the output from the command
System.out.println("Here is the standard output of the command:\n");
String s;
try {
while ((s = stdInput.readLine()) != null) {
System.out.println(s);
jTextArea1.append(s+"\n\n");
}
} catch (IOException ex) {
Logger.getLogger(NumberAdditionUI.class.getName()).log(Level.SEVERE, null, ex);
}
// read any errors from the attempted command
System.out.println("Here is the standard error of the command (if any):\n");
try {
while ((s = stdError.readLine()) != null) {
System.out.println(s);
}
} catch (IOException ex) {
Logger.getLogger(NumberAdditionUI.class.getName()).log(Level.SEVERE, null, ex);
}
}
/**
*
* @param args the command line arguments
*/
public static void main(String args[])
{
/*
* Set the Nimbus look and feel
*/
//
/*
* If Nimbus (introduced in Java SE 6) is not available, stay with the
* default look and feel. For details see
* http://download.oracle.com/javase/tutorial/uiswing/lookandfeel/plaf.html
*/
try {
for (javax.swing.UIManager.LookAndFeelInfo info : javax.swing.UIManager.getInstalledLookAndFeels()) {
if ("Nimbus".equals(info.getName())) {
javax.swing.UIManager.setLookAndFeel(info.getClassName());
break;
}
}
} catch (ClassNotFoundException | InstantiationException | IllegalAccessException | javax.swing.UnsupportedLookAndFeelException ex) {
java.util.logging.Logger.getLogger(NumberAdditionUI.class.getName()).log(java.util.logging.Level.SEVERE, null, ex);
}
//
/*
* Create and display the form
*/
java.awt.EventQueue.invokeLater(new Runnable() {
@Override
public void run() {
new NumberAdditionUI().setVisible(true);
}
});
}
// Variables declaration - do not modify
private javax.swing.JFileChooser db;
private javax.swing.JButton jButton1;
private javax.swing.JButton jButton2;
private javax.swing.JButton jButton3;
private javax.swing.JLabel jLabel1;
private javax.swing.JProgressBar jProgressBar1;
private javax.swing.JScrollPane jScrollPane1;
private javax.swing.JTextArea jTextArea1;
private javax.swing.JTextField jTextField1;
// End of variables declaration
}
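Two things commonly bite here: ffmpeg writes its progress to stderr rather than stdout, and reading the process streams inside jButton3ActionPerformed blocks the Swing event thread, so the text area and progress bar cannot repaint until the process has finished. A rough sketch of a background worker (the class name FfmpegWorker and its constructor are illustrative, not part of the original form):
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.List;
    import javax.swing.JTextArea;
    import javax.swing.SwingWorker;

    // Hypothetical worker: runs ffmpeg off the event thread and publishes each
    // output line back to the text area via process(), so the UI stays responsive.
    class FfmpegWorker extends SwingWorker<Void, String> {
        private final JTextArea output;
        private final String[] command;

        FfmpegWorker(JTextArea output, String... command) {
            this.output = output;
            this.command = command;
        }

        @Override
        protected Void doInBackground() throws Exception {
            ProcessBuilder pb = new ProcessBuilder(command);
            pb.redirectErrorStream(true); // ffmpeg reports progress on stderr; merge it with stdout
            Process proc = pb.start();
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    publish(line); // delivered to process() on the event thread
                }
            }
            proc.waitFor();
            return null;
        }

        @Override
        protected void process(List<String> lines) {
            for (String line : lines) {
                output.append(line + "\n"); // safe: process() runs on the event thread
            }
        }
    }
The button handler would then only build the command and call new FfmpegWorker(jTextArea1, "ffmpeg", "-i", inputPath, outputPath).execute(); parsing the time= field out of each published line is one way to drive jProgressBar1.
-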
ffmpeg rotate and resize
23 August 2014, by user1321964
I'm trying to resize a movie and rotate it counter-clockwise using ffmpeg.
I can achieve one or the other fine individually:
ffmpeg -i MVI_8189.MOV -vf "transpose=2" -ss 00:00:12 -t 00:00:05 rotate.MOV
OR
ffmpeg -i MVI_8189.MOV -s 720x406 -ss 00:00:12 -t 00:00:05 resize.MOV
but if I try to do both, either-
ffmpeg -i MVI_8189.MOV -s 720x406 -ss 00:00:12 -t 00:00:05 resize.MOV
ffmpeg -i resize.MOV -vf "transpose=2" toutp.MOV
OR
ffmpeg -i MVI_8189.MOV -vf "transpose=2" -ss 00:00:12 -t 00:00:05 rotate.MOV
ffmpeg -i rotate.MOV -s 406x720 toutp.MOV
the final movie does not play: only audio, no video, neither when previewing in Finder nor in QuickTime. More seriously, once I have attempted to play it, I can't then play any other movie at all (only audio); QuickTime won't quit and constantly needs to be force-quit. The only way to recover seems to be to reboot OS X.
This, though, does work:
ffmpeg -i MVI_8189.MOV -vf "transpose=2" -ss 00:00:12 -t 00:00:05 rotate.MOV
ffmpeg -i rotate.MOV -s 720x406 toutp.MOV
but it doesn't do what I want: it stretches the movie in the wrong direction.
Any ideas? (See the sketch at the end of this post for a single-command alternative.)
ffmpeg version 1.0.git Copyright (c) 2000-2012 the FFmpeg developers
built on Dec 24 2012 18:43:25 with llvm-gcc 4.2.1 (LLVM build 2336.1.00)
Mac OS X 10.7.5
==========
Output from:
ffmpeg -i MVI_8189.MOV -vf "transpose=2" -ss 00:00:12 -t 00:00:05 rotate.MOV
ffmpeg -i rotate.MOV -s 406x720 toutp.MOV
ffmpeg version 1.0.git Copyright (c) 2000-2012 the FFmpeg developers
built on Dec 24 2012 18:43:25 with llvm-gcc 4.2.1 (LLVM build 2336.1.00)
configuration: --prefix=/usr/local --enable-shared --disable-mmx --enable-libmp3lame --enable-gpl --enable-zlib --enable-libvorbis --enable-libfaac --enable-nonfree --enable-libx264 --enable-pthreads
libavutil 52. 12.100 / 52. 12.100
libavcodec 54. 81.100 / 54. 81.100
libavformat 54. 49.102 / 54. 49.102
libavdevice 54. 3.102 / 54. 3.102
libavfilter 3. 29.101 / 3. 29.101
libswscale 2. 1.103 / 2. 1.103
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
Guessed Channel Layout for Input Stream #0.1 : stereo
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'MVI_8189.MOV':
Metadata:
major_brand : qt
minor_version : 537331968
compatible_brands: qt CAEP
creation_time : 2014-08-17 23:13:43
Duration: 00:01:32.40, start: 0.000000, bitrate: 44677 kb/s
Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuvj420p, 1920x1080, 43132 kb/s, 25 fps, 25 tbr, 25k tbn, 50k tbc
Metadata:
creation_time : 2014-08-17 23:13:43
Stream #0:1(eng): Audio: pcm_s16le (sowt / 0x74776F73), 48000 Hz, stereo, s16, 1536 kb/s
Metadata:
creation_time : 2014-08-17 23:13:43
[libx264 @ 0x7fc67c043600] using cpu capabilities: MMX2 SSE2Fast SSSE3 FastShuffle SSE4.2 AVX
[libx264 @ 0x7fc67c043600] profile High, level 4.0
[libx264 @ 0x7fc67c043600] 264 - core 128 - H.264/MPEG-4 AVC codec - Copyleft 2003-2012 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mov, to 'rotate.MOV':
Metadata:
major_brand : qt
minor_version : 537331968
compatible_brands: qt CAEP
encoder : Lavf54.49.102
Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuvj420p, 1080x1920, q=-1--1, 12800 tbn, 25 tbc
Metadata:
creation_time : 2014-08-17 23:13:43
Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, s16, 128 kb/s
Metadata:
creation_time : 2014-08-17 23:13:43
Stream mapping:
Stream #0:0 -> #0:0 (h264 -> libx264)
Stream #0:1 -> #0:1 (pcm_s16le -> libfaac)
Press [q] to stop, [?] for help
frame= 125 fps=5.7 q=-2.0 Lsize= 6394kB time=00:00:05.00 bitrate=10470.5kbits/s /s
video:6322kB audio:67kB subtitle:0 global headers:0kB muxing overhead 0.079207%
[libx264 @ 0x7fc67c043600] frame I:1 Avg QP:25.87 size:113032
[libx264 @ 0x7fc67c043600] frame P:65 Avg QP:25.78 size: 64251
[libx264 @ 0x7fc67c043600] frame B:59 Avg QP:26.73 size: 37017
[libx264 @ 0x7fc67c043600] consecutive B-frames: 5.6% 94.4% 0.0% 0.0%
[libx264 @ 0x7fc67c043600] mb I I16..4: 10.7% 79.9% 9.3%
[libx264 @ 0x7fc67c043600] mb P I16..4: 3.6% 12.0% 0.3% P16..4: 52.0% 19.1% 11.3% 0.0% 0.0% skip: 1.7%
[libx264 @ 0x7fc67c043600] mb B I16..4: 0.9% 1.7% 0.0% B16..8: 45.4% 5.1% 0.8% direct:24.2% skip:21.9% L0:50.1% L1:44.5% BI: 5.3%
[libx264 @ 0x7fc67c043600] 8x8 transform intra:74.5% inter:79.5%
[libx264 @ 0x7fc67c043600] coded y,uvDC,uvAC intra: 59.4% 94.3% 79.5% inter: 38.2% 80.5% 30.1%
[libx264 @ 0x7fc67c043600] i16 v,h,dc,p: 30% 21% 22% 28%
[libx264 @ 0x7fc67c043600] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 18% 14% 24% 6% 8% 9% 7% 8% 7%
[libx264 @ 0x7fc67c043600] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28% 18% 17% 6% 7% 8% 6% 6% 4%
[libx264 @ 0x7fc67c043600] i8c dc,h,v,p: 60% 14% 18% 8%
[libx264 @ 0x7fc67c043600] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 0x7fc67c043600] ref P L0: 43.9% 12.6% 26.8% 16.7%
[libx264 @ 0x7fc67c043600] ref B L0: 68.6% 31.4%
[libx264 @ 0x7fc67c043600] kb/s:10357.28
ffmpeg version 1.0.git Copyright (c) 2000-2012 the FFmpeg developers
built on Dec 24 2012 18:43:25 with llvm-gcc 4.2.1 (LLVM build 2336.1.00)
configuration: --prefix=/usr/local --enable-shared --disable-mmx --enable-libmp3lame --enable-gpl --enable-zlib --enable-libvorbis --enable-libfaac --enable-nonfree --enable-libx264 --enable-pthreads
libavutil 52. 12.100 / 52. 12.100
libavcodec 54. 81.100 / 54. 81.100
libavformat 54. 49.102 / 54. 49.102
libavdevice 54. 3.102 / 54. 3.102
libavfilter 3. 29.101 / 3. 29.101
libswscale 2. 1.103 / 2. 1.103
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'rotate.MOV':
Metadata:
major_brand : qt
minor_version : 512
compatible_brands: qt
encoder : Lavf54.49.102
Duration: 00:00:05.01, start: 0.010667, bitrate: 10446 kb/s
Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p, 1080x1920, 10358 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc
Metadata:
handler_name : DataHandler
Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 109 kb/s
Metadata:
handler_name : DataHandler
[libx264 @ 0x7fcf1c05ba00] using cpu capabilities: MMX2 SSE2Fast SSSE3 FastShuffle SSE4.2 AVX
[libx264 @ 0x7fcf1c05ba00] profile High, level 3.0
[libx264 @ 0x7fcf1c05ba00] 264 - core 128 - H.264/MPEG-4 AVC codec - Copyleft 2003-2012 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mov, to 'toutp.MOV':
Metadata:
major_brand : qt
minor_version : 512
compatible_brands: qt
encoder : Lavf54.49.102
Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuvj420p, 406x720, q=-1--1, 12800 tbn, 25 tbc
Metadata:
handler_name : DataHandler
Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, s16, 128 kb/s
Metadata:
handler_name : DataHandler
Stream mapping:
Stream #0:0 -> #0:0 (h264 -> libx264)
Stream #0:1 -> #0:1 (aac -> libfaac)
Press [q] to stop, [?] for help
frame= 125 fps= 32 q=-2.0 Lsize= 469kB time=00:00:05.01 bitrate= 766.2kbits/s
video:396kB audio:68kB subtitle:0 global headers:0kB muxing overhead 1.088481%
[libx264 @ 0x7fcf1c05ba00] frame I:1 Avg QP:24.20 size: 31395
[libx264 @ 0x7fcf1c05ba00] frame P:42 Avg QP:24.51 size: 7137
[libx264 @ 0x7fcf1c05ba00] frame B:82 Avg QP:27.71 size: 893
[libx264 @ 0x7fcf1c05ba00] consecutive B-frames: 0.8% 16.0% 57.6% 25.6%
[libx264 @ 0x7fcf1c05ba00] mb I I16..4: 7.0% 62.2% 30.8%
[libx264 @ 0x7fcf1c05ba00] mb P I16..4: 0.2% 0.8% 0.1% P16..4: 52.8% 18.7% 12.5% 0.0% 0.0% skip:14.9%
[libx264 @ 0x7fcf1c05ba00] mb B I16..4: 0.0% 0.0% 0.0% B16..8: 30.0% 0.8% 0.1% direct: 0.9% skip:68.2% L0:36.0% L1:62.1% BI: 2.0%
[libx264 @ 0x7fcf1c05ba00] 8x8 transform intra:66.4% inter:78.3%
[libx264 @ 0x7fcf1c05ba00] coded y,uvDC,uvAC intra: 79.3% 93.6% 84.9% inter: 13.6% 32.2% 10.4%
[libx264 @ 0x7fcf1c05ba00] i16 v,h,dc,p: 50% 19% 7% 24%
[libx264 @ 0x7fcf1c05ba00] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 20% 11% 5% 8% 10% 9% 7% 8%
[libx264 @ 0x7fcf1c05ba00] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 27% 22% 8% 5% 8% 8% 10% 5% 5%
[libx264 @ 0x7fcf1c05ba00] i8c dc,h,v,p: 42% 22% 24% 13%
[libx264 @ 0x7fcf1c05ba00] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 0x7fcf1c05ba00] ref P L0: 48.7% 16.3% 25.6% 9.4%
[libx264 @ 0x7fcf1c05ba00] ref B L0: 73.7% 21.3% 5.0%
[libx264 @ 0x7fcf1c05ba00] ref B L1: 90.7% 9.3%
[libx264 @ 0x7fcf1c05ba00] kb/s:647.04
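Coming back to the question above: one way to avoid the intermediate file and the second encode is to do the rotation and the resize in a single -vf filter chain. This is only a suggestion and has not been verified on this particular 2012-era build, although both the transpose and scale filters are present in the libavfilter 3.29 shown in the logs:
    ffmpeg -i MVI_8189.MOV -vf "transpose=2,scale=406:720" -ss 00:00:12 -t 00:00:05 toutp.MOV
Because transpose=2 turns the 1920x1080 input into 1080x1920, scaling to 406x720 afterwards keeps the portrait orientation instead of stretching it. If the result still refuses to play, forcing -pix_fmt yuv420p may be worth a try, since the source is reported as yuvj420p.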