Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to encode JPEG images to H.264 very fast (transform images to video)

    26 February, by Paul

    I have 30 JPEG images (.jpg) at a resolution of 480 x 640. Each image is about 20 KB (together they are about 600 KB).

    I am using an FFmpeg command to encode these images into a video in H.264 format.

    I need this to be done very fast - about 1 second.

    Using the classic command:

    ffmpeg -y  -f  image2   -r 1/5   -i image_%d.jpg   -c:v libx264   -r 30   video.mp4
    

    takes about 90 seconds.

    After adding -preset ultrafast:

    ffmpeg -y  -f  image2   -r 1/5   -i image_%d.jpg   -c:v libx264   -preset ultrafast    -r 30   video.mp4
    

    the encoding takes about 15 seconds, which is much better but still not fast enough.

    I've also tried other parameters, such as:

    -profile:v baseline
    
    -qscale:v
    
    -b:v 1000k
    
    -crf 24
    

    but the encoding time does not fall below 10 seconds.

    I'm not familiar with FFmpeg commands or with the parameters I need to use, which is why I'm posting this question.

    The video quality needs to be OK; it doesn't need to be perfect.

    As a note: I am running these commands from an Android application that ships the ffmpeg executable, using a ProcessBuilder.
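
    If each JPEG is meant to become exactly one frame of a 30 fps clip, a command along the lines below avoids the frame duplication that combining the input rate -r 1/5 with the output rate -r 30 causes; this is only a sketch, and it assumes the bundled Android ffmpeg build supports libx264 and the image2 -framerate input option:

    ffmpeg -y -framerate 30 -i image_%d.jpg -c:v libx264 -preset ultrafast -tune zerolatency -pix_fmt yuv420p video.mp4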

    Reply1 (to Robert Rowntree):

    ArrayList l2 = new ArrayList();
    
            //l2.add("ffmpeg");
            l2.add("/data/data/" + packageName + "/ffmpeg");
            l2.add("-y");
            l2.add("-loop");
            l2.add("1");
    
            l2.add("-i");
            l2.add("frame_%d.jpg");
    
    //            l2.add("-t");
    //            l2.add(strngs[3]);
    
            l2.add("-r");
            l2.add("1/2");
            l2.add("-preset");
            l2.add("superfast");
            l2.add("-tune");
            l2.add("zerolatency");
    
    //            l2.add("-pass");
    //            l2.add(Integer.valueOf(pass).toString());
    
            l2.add("-vcodec");
            l2.add("libx264");
            l2.add("-b:v");
            l2.add("200k");
            l2.add("-bt");
            l2.add("50k");
            l2.add("-threads");
            l2.add("0");
            l2.add("-b_strategy");
            l2.add("1");
    
    //            if(pass ==1){
    //                l2.add("-an");
    //            } else {
    //                l2.add("-acodec");
    //                l2.add("copy");
    //            }
    
            l2.add("-f");
            l2.add("mp4");
            l2.add("-strict");
            l2.add("-2");
    //            l2.add("-passlogfile");
    //            l2.add(strngs[4]);
    
    //            if(pass ==1){
    //                l2.add("/dev/null");
    //            } else {
    //                l2.add(strngs[5]);
    //            }
    
            l2.add("video.mp4");
            //return l2;
    
  • FFmpeg coding in C: Encoder returns EOF at first interaction. Encoder not opened correctly? [closed]

    26 February, by Davidhohey

    As I'm fairly new to FFmpeg programming and to C in general, the code looks like a mess.

    I have smashed my head against a wall trying to get this code to work for about a week.

    int decode_encode_pipeline(AVFormatContext *Input_Format_Context, AVFormatContext *Output_Format_Context, int *streams_list){
    
        const AVCodec *DECodec, *ENCodec;
        AVCodecContext *DECodecContext = NULL, *ENCodecContext = NULL;
        AVCodecParameters *CodecParameters = NULL;
        AVDictionary *opts = NULL;
        AVPacket *Packet;
        AVFrame *Frame;
        int check;
    
        Packet = av_packet_alloc();
        if(!Packet){
        
            printf("\nFehler bei Allocating Packet");
        
            return 0;
        
        }
    
        Frame = av_frame_alloc();
        if(!Frame){
        
            printf("\nFehler bei Allocating Frame");
        
            return 0;
        
        }
    
        CodecParameters = Input_Format_Context->streams[Packet->stream_index]->codecpar;
        if(!CodecParameters){
    
            printf("\nCodecParameters konnte nicht erstellt oder zugewiesen werden.");
    
        }
    
        DECodec = avcodec_find_decoder(CodecParameters->codec_id);
        if(!DECodec){
        
            printf("\nCodec nicht gefunden");
        
            return 0;
        
        }
    
        DECodecContext = avcodec_alloc_context3(DECodec);
        if (!DECodecContext){
        
            printf("\nFehler bei Allocating CodecContext");
        
            return 0;
        
        }
    
        ENCodec = avcodec_find_encoder(CodecParameters->codec_id);
        if(!DECodec){
        
            printf("\nCodec nicht gefunden");
        
            return 0;
        
        }
    
        ENCodecContext = avcodec_alloc_context3(ENCodec);
        if (!ENCodecContext){
        
            printf("\nFehler bei Allocating CodecContext");
        
            return 0;
        
        }
    
        check = avformat_write_header(Output_Format_Context, &opts);
        if(check < 0){
    
            printf("\nFehler beim Öffnen des Output Files.");
            
            return 1;
    
        }
    
        avcodec_parameters_to_context(DECodecContext, CodecParameters);
        avcodec_parameters_to_context(ENCodecContext, CodecParameters);
    
        ENCodecContext->width = DECodecContext->width;
        ENCodecContext->height = DECodecContext->height;
        ENCodecContext->bit_rate = DECodecContext->bit_rate;
        ENCodecContext->time_base = (AVRational){1, 30};
        ENCodecContext->framerate = DECodecContext->framerate;
        ENCodecContext->gop_size = DECodecContext->gop_size;
        ENCodecContext->max_b_frames = DECodecContext->max_b_frames;
        ENCodecContext->pix_fmt = DECodecContext->pix_fmt;
        if(ENCodec->id == AV_CODEC_ID_H264){
    
            av_opt_set(ENCodecContext->priv_data, "preset", "slow", 0);
    
        }
    
        check = avcodec_open2(DECodecContext, DECodec, NULL);
        if(check < 0){
        
            printf("\nFehler bei Öffnen von DECodec");
        
            return 1;
        
        }
    
        check = avcodec_open2(ENCodecContext, ENCodec, NULL);
        if(check < 0){
        
            printf("\nFehler bei Öffnen von ENCodec");
        
            return 1;
        
        }
    
        while(1){
        
            check = av_read_frame(Input_Format_Context, Packet);
            if(check < 0){
            
                break;
            
            }
    
            AVStream *in_stream, *out_stream;
    
            in_stream  = Input_Format_Context->streams[Packet->stream_index];
            out_stream = Output_Format_Context->streams[Packet->stream_index];
    
            if(in_stream->codecpar->codec_type == AVMEDIA_TYPE_VIDEO && Packet->stream_index == streams_list[Packet->stream_index]){
    
                check = avcodec_send_packet(DECodecContext, Packet);
                if(check < 0){
    
                    printf("\nFehler bei Encoding");
    
                    return 1;
    
                }
    
                AVPacket *EncodedPacket;
                EncodedPacket = av_packet_alloc();
                if(!EncodedPacket){
            
                    printf("\nFehler bei Allocating Packet");
            
                    return 1;
            
                }
    
                /*While Loop Decoding*/
                while(check >= 0){
        
                    check = avcodec_receive_frame(DECodecContext, Frame);
                    if(check == AVERROR(EAGAIN)){
            
                        continue;
            
                    }else if(check == AVERROR_EOF){
                        
                        break;
                        
                    }else if(check < 0){
            
                        printf("\nFehler bei Decoding");
            
                        return 1;
            
                    }
    
                    /*Convert Colorspace*/
                    struct SwsContext *SwsContexttoRGB = sws_getContext(Frame->width, Frame->height, Frame->format, Frame->width, Frame->height, AV_PIX_FMT_RGB24, SWS_BILINEAR, NULL, NULL, NULL);
                    struct SwsContext *SwsContexttoOriginal = sws_getContext(Frame->width, Frame->height, AV_PIX_FMT_RGB24, Frame->width, Frame->height, Frame->format, SWS_BILINEAR, NULL, NULL, NULL);
                    if(!SwsContexttoRGB || !SwsContexttoOriginal){
    
                        printf("\nSwsContext konnte nicht befüllt werden.");
    
                        return 1;
    
                    }   
    
                    if(Frame->linesize < 0){
    
                        printf("\nFehler: linesize ist negativ und nicht kompatibel\n");
    
                        return 1;
    
                    }
    
                    AVFrame *RGBFrame;
                    RGBFrame = av_frame_alloc();
                    if(!RGBFrame){
    
                        printf("\nFehler bei der Reservierung für den RGBFrame");
    
                        return 1;
    
                    }
                    /*
                    int number_bytes = av_image_get_buffer_size(AV_PIX_FMT_RGB24, Frame->width, Frame->height, 1);
                    if(number_bytes < 0){
    
                        printf("\nFehler bei der Berechnung der benoetigten Bytes fuer Konvertierung");
    
                        return 1;
    
                    }
                    
                    uint8_t *rgb_buffer = (uint8_t *)av_malloc(number_bytes*sizeof(uint8_t));
                    if(rgb_buffer == NULL){
    
                        printf("\nFehler bei der Reservierung für den RGBBuffer");
    
                        return 1;
    
                    }
    
                    check = av_image_fill_arrays(RGBFrame->data, RGBFrame->linesize, rgb_buffer, AV_PIX_FMT_RGB24, Frame->width, Frame->height, 1);
                    if(check < 0){
    
                        printf("\nFehler bei der Zuweisung der RGB Daten");
    
                        return 1;
    
                    }*/
    
                    //sws_scale(SwsContexttoRGB, (const uint8_t * const *)Frame->data, Frame->linesize, 0, Frame->height, RGBFrame->data, RGBFrame->linesize);
                    sws_scale_frame(SwsContexttoRGB, Frame, RGBFrame);
                    printf("\nIch habe die Daten zu RGB konvertiert.");
    
                    //sws_scale(SwsContexttoOriginal, (const uint8_t * const *)RGBFrame->data, RGBFrame->linesize, 0, Frame->height, Frame->data, Frame->linesize);
                    sws_scale_frame(SwsContexttoOriginal, RGBFrame, Frame);
                    printf("\nIch habe die Daten zurück ins Original konvertiert.");
    
                    Frame->format = ENCodecContext->pix_fmt;
                    Frame->width  = ENCodecContext->width;
                    Frame->height = ENCodecContext->height;
                    
                    check = av_frame_get_buffer(Frame, 0);
                    if(check < 0){
            
                        printf("\nFehler bei Allocating Frame Buffer");
            
                        return 1;
            
                    }
    
                    /* Encoding */
                    check = av_frame_make_writable(Frame);
                    if(check < 0){
    
                        printf("\nFehler bei Make Frame Writable");
    
                        return 1;
    
                    }
    
                    encode(ENCodecContext, Frame, EncodedPacket, Output_Format_Context);
    
                    sws_freeContext(SwsContexttoRGB);
                    sws_freeContext(SwsContexttoOriginal);
                    av_frame_free(&RGBFrame);
                    //av_free(rgb_buffer);
    
                }
    
                /* Flushing Encoder */
                encode(ENCodecContext, NULL, EncodedPacket, Output_Format_Context);
    
                //avcodec_flush_buffers(DECodecContext);
                //avcodec_flush_buffers(ENCodecContext);
    
                av_packet_free(&EncodedPacket);
    
            }else{
    
                av_interleaved_write_frame(Output_Format_Context, Packet);
    
            }
    
        }
    
        av_write_trailer(Output_Format_Context); 
    
        /* Memory Free */
        avcodec_free_context(&DECodecContext);
        avcodec_free_context(&ENCodecContext);
        avcodec_parameters_free(&CodecParameters);
        av_frame_free(&Frame);
        av_packet_free(&Packet);
    
        return 0;
    
    }
    
    

    The function encode looks as follows:

    static void encode(AVCodecContext *ENCodecContext, AVFrame *Frame, AVPacket *EncodedPacket, AVFormatContext *Output_Format_Context){
    
        int check;
    
    
    
        check = avcodec_send_frame(ENCodecContext, Frame);
        if(check == AVERROR(EAGAIN)){
            printf("\nEAGAIN");
        } 
        if(check == AVERROR_EOF){
            printf("\nEOF");
        }
        if(check == AVERROR(EINVAL)){
            printf("\nEINVAL");
        }
        if(check == AVERROR(ENOMEM)){
            printf("\nENOMEM");
        }
        if(check < 0){
    
            printf("\nFehler bei Encoding Send Frame. Check = %d", check);
    
            return;
    
        }
    
        while(check >= 0){
    
            check = avcodec_receive_packet(ENCodecContext, EncodedPacket);
            if(check == AVERROR(EAGAIN) || check == AVERROR_EOF){
    
                return;
    
            }else if(check < 0){
    
                printf("\nFehler bei Encoding");
    
                return;
    
            }
    
            if (av_interleaved_write_frame(Output_Format_Context, EncodedPacket) < 0) {
    
                printf("\nFehler beim Muxen des Paketes.");
                break;
    
            }
    
            av_packet_unref(EncodedPacket);
    
        }
    
        return;
    
    }
    

    The program should decode a video into individual frames, convert them to RGB24 so that I can work with the raw frame data, then convert them back to the original pixel format and re-encode them.

    The encoder doesn't play nice: I get an EOF error from avcodec_send_frame(), but I couldn't figure out why the encoder behaves like this. And yes, I have read the docs and the example files, but either I'm massively missing a crucial detail or I'm just ****.

    Any and all help is massively appreciated.

    PS: The libraries used are libavutil, libavformat, libavcodec and libswscale, all installed as "-dev" packages from the Linux command line. They should all be the version 7.0 libraries.
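
    (For reference, on a Debian/Ubuntu-style system that installation step would look roughly like the line below; this is an assumption about the setup, and the library versions actually shipped depend on the distribution's repositories.)

    sudo apt install libavutil-dev libavformat-dev libavcodec-dev libswscale-dev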

    Thanks in advance. With best regards.

    • Read the docs
    • Shifting the encoding step out of the decoding while loop
  • FFmpeg WASM writeFile Stalls and Doesn't Complete in React App with Ant Design

    26 February, by raiyan khan

    I'm using FFmpeg WebAssembly (WASM) in a React app to process and convert a video file before uploading it. The goal is to resize the video to 720p using FFmpeg before sending it to the backend.
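
    (For context, the conversion the component performs corresponds roughly to the command-line sketch below; it is an illustrative equivalent, not code from the question. Note that scale=-1:720 can yield an odd output width, which libx264 rejects for yuv420p, so scale=-2:720 is often used to keep the width even.)

    ffmpeg -i input.mp4 -vf scale=-2:720 -c:v libx264 -preset fast -crf 23 -c:a copy output.mp4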

    Problem:

    Everything works up to fetching the file and confirming it's loaded into memory, but FFmpeg hangs at ffmpeg.writeFile() and does not proceed further. No errors are thrown.

    Code Snippet:

    • Loading FFmpeg

       const loadFFmpeg = async () => {
       if (loaded) return; // Avoid reloading if already loaded
      
       const baseURL = 'https://unpkg.com/@ffmpeg/core@0.12.6/dist/umd';
       const ffmpeg = ffmpegRef.current;
       ffmpeg.on('log', ({ message }) => {
           messageRef.current.innerHTML = message;
           console.log(message);
       });
       await ffmpeg.load({
           coreURL: await toBlobURL(`${baseURL}/ffmpeg-core.js`, 'text/javascript'),
           wasmURL: await toBlobURL(`${baseURL}/ffmpeg-core.wasm`, 'application/wasm'),
       });
       setLoaded(true);
       };
      
       useEffect(() => {
       loadFFmpeg()
       }, [])
      
    • Fetching and Writing File

        const convertVideoTo720p = async (videoFile) => {
             console.log("Starting video conversion...");
      
      
      
       const { height } = await getVideoMetadata(videoFile);
       console.log(`Video height: ${height}`);
      
       if (height <= 720) {
           console.log("No conversion needed.");
           return videoFile;
       }
      
       const ffmpeg = ffmpegRef.current;
       console.log("FFmpeg instance loaded. Writing file to memory...");
      
       const fetchedFile = await fetchFile(videoFile);
       console.log("File fetched successfully:", fetchedFile);
      
       console.log("Checking FFmpeg memory before writing...");
       console.log(`File size: ${fetchedFile.length} bytes (~${(fetchedFile.length / 1024 / 1024).toFixed(2)} MB)`);
      
       if (!ffmpeg.isLoaded()) {
           console.error("FFmpeg is not fully loaded yet!");
           return;
       }
      
       console.log("Memory seems okay. Writing file to FFmpeg...");
       await ffmpeg.writeFile('input.mp4', fetchedFile);  // ❌ This line hangs, nothing after runs
       console.log("File successfully written to FFmpeg memory.");
            };
      

    Debugging Steps I've Tried:

    • Ensured FFmpeg is fully loaded before calling writeFile(): ✅ ffmpeg.isLoaded() returns true.
    • Checked file fetch process: ✅ fetchFile(videoFile) successfully returns a Uint8Array.
    • Tried renaming the file to prevent caching issues: ✅ used a unique file name like video_${Date.now()}.mp4, but no change.
    • Checked browser console for errors: ❌ No errors are displayed.
    • Tried skipping FFmpeg and uploading the raw file instead: ✅ Upload works fine without FFmpeg, so the issue is specific to FFmpeg.

    Expected Behavior

    • ffmpeg.writeFile('input.mp4', fetchedFile); should complete and allow FFmpeg to process the video.

    Actual Behavior

    • Execution stops at writeFile, and no errors are thrown.

    Environment:

    • React: 18.x
    • FFmpeg WASM Version: @ffmpeg/ffmpeg@0.12.15
    • Browser: Chrome 121, Edge 120
    • Operating System: Windows 11

    Question: Why is FFmpeg's writeFile() stalling and never completing? How can I fix or further debug this issue?

    Here is my full code:

    import { useNavigate } from "react-router-dom";
    import { useEffect, useRef, useState } from 'react';
    import { Form, Input, Button, Select, Space } from 'antd';
    const { Option } = Select;
    import { FaAngleLeft } from "react-icons/fa6";
    import { message, Upload } from 'antd';
    import { CiCamera } from "react-icons/ci";
    import { IoVideocamOutline } from "react-icons/io5";
    import { useCreateWorkoutVideoMutation } from "../../../redux/features/workoutVideo/workoutVideoApi";
    import { convertVideoTo720p } from "../../../utils/ffmpegHelper";
    import { FFmpeg } from '@ffmpeg/ffmpeg';
    import { fetchFile, toBlobURL } from '@ffmpeg/util';
    
    
    const AddWorkoutVideo = () => {
        const [videoFile, setVideoFile] = useState(null);
        const [imageFile, setImageFile] = useState(null);
        const [loaded, setLoaded] = useState(false);
        const ffmpegRef = useRef(new FFmpeg());
        const videoRef = useRef(null);
        const messageRef = useRef(null);
        const [form] = Form.useForm();
        const [createWorkoutVideo, { isLoading }] = useCreateWorkoutVideoMutation()
        const navigate = useNavigate();
    
        const videoFileRef = useRef(null); // Use a ref instead of state
    
    
        // Handle Video Upload
        const handleVideoChange = ({ file }) => {
            setVideoFile(file.originFileObj);
        };
    
        // Handle Image Upload
        const handleImageChange = ({ file }) => {
            setImageFile(file.originFileObj);
        };
    
        // Load FFmpeg core if needed (optional if you want to preload)
        const loadFFmpeg = async () => {
            if (loaded) return; // Avoid reloading if already loaded
    
            const baseURL = 'https://unpkg.com/@ffmpeg/core@0.12.6/dist/umd';
            const ffmpeg = ffmpegRef.current;
            ffmpeg.on('log', ({ message }) => {
                messageRef.current.innerHTML = message;
                console.log(message);
            });
            await ffmpeg.load({
                coreURL: await toBlobURL(`${baseURL}/ffmpeg-core.js`, 'text/javascript'),
                wasmURL: await toBlobURL(`${baseURL}/ffmpeg-core.wasm`, 'application/wasm'),
            });
            setLoaded(true);
        };
    
        useEffect(() => {
            loadFFmpeg()
        }, [])
    
        // Helper: Get video metadata (width and height)
        const getVideoMetadata = (file) => {
            return new Promise((resolve, reject) => {
                const video = document.createElement('video');
                video.preload = 'metadata';
                video.onloadedmetadata = () => {
                    resolve({ width: video.videoWidth, height: video.videoHeight });
                };
                video.onerror = () => reject(new Error('Could not load video metadata'));
                video.src = URL.createObjectURL(file);
            });
        };
    
        // Inline conversion helper function
        // const convertVideoTo720p = async (videoFile) => {
        //     // Check the video resolution first
        //     const { height } = await getVideoMetadata(videoFile);
        //     if (height <= 720) {
        //         // No conversion needed
        //         return videoFile;
        //     }
        //     const ffmpeg = ffmpegRef.current;
        //     // Load ffmpeg if not already loaded
        //     // await ffmpeg.load({
        //     //     coreURL: await toBlobURL(`${baseURL}/ffmpeg-core.js`, 'text/javascript'),
        //     //     wasmURL: await toBlobURL(`${baseURL}/ffmpeg-core.wasm`, 'application/wasm'),
        //     // });
        //     // Write the input file to the ffmpeg virtual FS
        //     await ffmpeg.writeFile('input.mp4', await fetchFile(videoFile));
        //     // Convert video to 720p (scale filter maintains aspect ratio)
        //     await ffmpeg.exec(['-i', 'input.mp4', '-vf', 'scale=-1:720', 'output.mp4']);
        //     // Read the output file
        //     const data = await ffmpeg.readFile('output.mp4');
        //     console.log(data, 'data from convertVideoTo720p');
        //     const videoBlob = new Blob([data.buffer], { type: 'video/mp4' });
        //     return new File([videoBlob], 'output.mp4', { type: 'video/mp4' });
        // };
        const convertVideoTo720p = async (videoFile) => {
            console.log("Starting video conversion...");
    
            // Check the video resolution first
            const { height } = await getVideoMetadata(videoFile);
            console.log(`Video height: ${height}`);
    
            if (height <= 720) {
                console.log("No conversion needed. Returning original file.");
                return videoFile;
            }
    
            const ffmpeg = ffmpegRef.current;
            console.log("FFmpeg instance loaded. Writing file to memory...");
    
            // await ffmpeg.writeFile('input.mp4', await fetchFile(videoFile));
            // console.log("File written. Starting conversion...");
            console.log("Fetching file for FFmpeg:", videoFile);
            const fetchedFile = await fetchFile(videoFile);
            console.log("File fetched successfully:", fetchedFile);
            console.log("Checking FFmpeg memory before writing...");
            console.log(`File size: ${fetchedFile.length} bytes (~${(fetchedFile.length / 1024 / 1024).toFixed(2)} MB)`);
    
            if (fetchedFile.length > 50 * 1024 * 1024) { // 50MB limit
                console.error("File is too large for FFmpeg WebAssembly!");
                message.error("File too large. Try a smaller video.");
                return;
            }
    
            console.log("Memory seems okay. Writing file to FFmpeg...");
            const fileName = `video_${Date.now()}.mp4`; // Generate a unique name
            console.log(`Using filename: ${fileName}`);
    
            await ffmpeg.writeFile(fileName, fetchedFile);
            console.log(`File successfully written to FFmpeg memory as ${fileName}.`);
    
            await ffmpeg.exec(['-i', 'input.mp4', '-vf', 'scale=-1:720', 'output.mp4']);
            console.log("Conversion completed. Reading output file...");
    
            const data = await ffmpeg.readFile('output.mp4');
            console.log("File read successful. Creating new File object.");
    
            const videoBlob = new Blob([data.buffer], { type: 'video/mp4' });
            const convertedFile = new File([videoBlob], 'output.mp4', { type: 'video/mp4' });
    
            console.log(convertedFile, "converted video from convertVideoTo720p");
    
            return convertedFile;
        };
    
    
        const onFinish = async (values) => {
            // Ensure a video is selected
            if (!videoFileRef.current) {
                message.error("Please select a video file.");
                return;
            }
    
            // Create FormData
            const formData = new FormData();
            if (imageFile) {
                formData.append("image", imageFile);
            }
    
            try {
                message.info("Processing video. Please wait...");
    
                // Convert the video to 720p only if needed
                const convertedVideo = await convertVideoTo720p(videoFileRef.current);
                console.log(convertedVideo, 'convertedVideo from onFinish');
    
                formData.append("media", videoFileRef.current);
    
                formData.append("data", JSON.stringify(values));
    
                // Upload manually to the backend
                const response = await createWorkoutVideo(formData).unwrap();
                console.log(response, 'response from add video');
    
                message.success("Video added successfully!");
                form.resetFields(); // Reset form
                setVideoFile(null); // Clear file
    
            } catch (error) {
                message.error(error.data?.message || "Failed to add video.");
            }
    
            // if (videoFile) {
            //     message.info("Processing video. Please wait...");
            //     try {
            //         // Convert the video to 720p only if needed
            //         const convertedVideo = await convertVideoTo720p(videoFile);
            //         formData.append("media", convertedVideo);
            //     } catch (conversionError) {
            //         message.error("Video conversion failed.");
            //         return;
            //     }
            // }
            // formData.append("data", JSON.stringify(values)); // Convert text fields to JSON
    
            // try {
            //     const response = await createWorkoutVideo(formData).unwrap();
            //     console.log(response, 'response from add video');
    
            //     message.success("Video added successfully!");
            //     form.resetFields(); // Reset form
            //     setFile(null); // Clear file
            // } catch (error) {
            //     message.error(error.data?.message || "Failed to add video.");
            // }
        };
    
        const handleBackButtonClick = () => {
            navigate(-1); // This takes the user back to the previous page
        };
    
        const videoUploadProps = {
            name: 'video',
            // action: 'https://660d2bd96ddfa2943b33731c.mockapi.io/api/upload',
            // headers: {
            //     authorization: 'authorization-text',
            // },
            // beforeUpload: (file) => {
            //     const isVideo = file.type.startsWith('video/');
            //     if (!isVideo) {
            //         message.error('You can only upload video files!');
            //     }
            //     return isVideo;
            // },
            // onChange(info) {
            //     if (info.file.status === 'done') {
            //         message.success(`${info.file.name} video uploaded successfully`);
            //     } else if (info.file.status === 'error') {
            //         message.error(`${info.file.name} video upload failed.`);
            //     }
            // },
            beforeUpload: (file) => {
                const isVideo = file.type.startsWith('video/');
                if (!isVideo) {
                    message.error('You can only upload video files!');
                    return Upload.LIST_IGNORE; // Prevents the file from being added to the list
                }
                videoFileRef.current = file; // Store file in ref
                // setVideoFile(file); // Store the file in state instead of uploading it automatically
                return false; // Prevent auto-upload
            },
        };
    
        const imageUploadProps = {
            name: 'image',
            action: 'https://660d2bd96ddfa2943b33731c.mockapi.io/api/upload',
            headers: {
                authorization: 'authorization-text',
            },
            beforeUpload: (file) => {
                const isImage = file.type.startsWith('image/');
                if (!isImage) {
                    message.error('You can only upload image files!');
                }
                return isImage;
            },
            onChange(info) {
                if (info.file.status === 'done') {
                    message.success(`${info.file.name} image uploaded successfully`);
                } else if (info.file.status === 'error') {
                    message.error(`${info.file.name} image upload failed.`);
                }
            },
        };
        return (
            <>
                {/* The JSX markup was mangled when this post was scraped. In the original component it renders
                    an "Add Video" / "Adding Video" heading with a back button, an antd Form (maxWidth 600) with
                    "Upload Video" (name="media") and "Upload Image" (name="image") Upload fields, a "Video Title"
                    (name="name") input, and a submit button showing "Uploading..." while isLoading. */}
            </>
        );
    };

    export default AddWorkoutVideo;

    Would appreciate any insights or suggestions. Thanks!

  • ffmpeg conversion from .mov to .mp4 fails at 98%

    26 February, by user2241249

    Our code is below. When converting an uploaded .mov to .mp4, the conversion halts at 98%.

    Almost all of our other preferred formats encode flawlessly. Does anyone have any clue where we went wrong? We have been struggling with this for a while now, so we want to ask the experts; thanks in advance for any help.
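
    (As a point of reference, running a standalone command along the lines below can help isolate whether the stall happens inside ffmpeg itself or in the PHP progress tracking. It is only a sketch with commonly used options and assumes a reasonably recent ffmpeg build; the mp4 branch in the code below passes only -pix_fmt yuv420p -s 640x360 and leaves everything else at the defaults.)

    ffmpeg -y -i input.mov -c:v libx264 -preset fast -crf 23 -s 640x360 -pix_fmt yuv420p -c:a aac -movflags +faststart output.mp4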

    <?php
    $Path = dirname(__FILE__) . "/";
    $url = "http://" . $_SERVER['SERVER_NAME'] . str_replace('\\', '/', dirname($_SERVER['SCRIPT_NAME'])) . "/";
    
    session_start();
    include_once 'inc/config.inc.php';
    include_once 'common.php';
    include_once 'inc/ffmpegprogressbar.class.php';
    
    ob_flush();
    ?>
    
    
        
        
        <?php
    
    // Specifie Inputfile for FFMPEG
        $count = count($_SESSION['Files']);
    // $file  = array_pop($_SESSION['Files']);
    
        $FileInstance = array_pop($_SESSION['Files']);
        //$FileInstance = $_SESSION['Files'][0];
        $file = $FileInstance['FileName'];
        $passNeeded = $FileInstance['PassNeeded'];
        $fmt = $FileInstance['fmt'];
    
        $FFMPEGInput = $Path . 'data/' . $_SESSION['OldSession'] . "/" . $file;
    
        $timeUnique = time();
        //echo "Start:Session holds: ".$_SESSION["cstatus"].", cstatus var =   $cstatus";
    
    
      /*  if ($FileInstance['PassNeeded'] == 5) {
            $FFMPEGParams = "-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 1920x1080";
        } else if ($FileInstance['PassNeeded'] == 4) {
            $FFMPEGParams ="-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 1280x720";
        } else if ($FileInstance['PassNeeded'] == 3) {
            $FFMPEGParams = "-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 854x480";
        } else if ($FileInstance['PassNeeded'] == 2) {
            $FFMPEGParams = "-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 640x360";
        }
        */
    if($fmt=="flv"){
    if ($FileInstance['PassNeeded'] == 5) {
            $FFMPEGParams = "-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 1920x1080";
        } else if ($FileInstance['PassNeeded'] == 4) {
            $FFMPEGParams ="-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 1280x720";
        } else if ($FileInstance['PassNeeded'] == 3) {
            $FFMPEGParams = "-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 854x480";
        } else if ($FileInstance['PassNeeded'] == 2) {
            $FFMPEGParams = "-r 1000 -ar 11025 -ab 32 -f $fmt -sameq -g 15 -s 640x360";
        }
    }elseif($fmt=="mp4"){
    $FFMPEGParams = " -pix_fmt yuv420p -s 640x360";
    
    }elseif($fmt=="ogv"){
    $FFMPEGParams = " -acodec libvorbis -ac 2 -ab 96k -ar 44100 -b 125k -s 640x360";
    
    }elseif($fmt=="webm"){
    $FFMPEGParams = " -acodec libvorbis -ac 2 -ab 96k -ar 44100 -b 125k -s 640x360";
    //}elseif($fmt!==""){
    //-i %1 -s qvga -acodec libfaac -ar 22050 -ab 128k -vcodec libx264 -threads 0 -f   ipod %2
    //-vcodec mpeg4 -acodec aac output.mp4
    //$FFMPEGParams = "-r 20 -ar 44100 -ab 196 -f $fmt -s 480x351";
    }
        if ($FileInstance['PassNeeded'] > 2) {
            $FileInstance['PassNeeded'] = $FileInstance['PassNeeded'] - 1;
            array_unshift($_SESSION['Files'], $FileInstance);
        }
    
        $flv_rpath = 'data/' . $_SESSION['OldSession'] . "/" . $file. $timeUnique . ".".$fmt;
    
         $FFMPEGOutput = $FFMPEGInput . $timeUnique . ".$fmt";
    
        $_SESSION['ConvertedFiles'][$file][] = array('Pass' => $passNeeded, 'OutFile' => $file . $timeUnique . ".$fmt","fmt"=>$fmt);
    
    
    
        if (!$_GET["pkey"]) {
            $pkey = rand();
        } elseif (file_exists('log/' . $_GET["pkey"] . '.ffmpeg')) {
            $pkey = $_GET["pkey"];
        } else {
            $pkey = rand();
        }
    
    // initializing and create ProgressBar
        flush();
        $FFMPEGProgressBar = &new FFMPEGProgressBar();
        flush();
    // Show Progressbar
        if ($count > 0) {
            if ($FileInstance['PassNeeded'] < 3) {
            define('FFMPEG_LIBRARY', '/usr/local/bin/ffmpeg ');
            $extension = "ffmpeg";
            $extension_soname = $extension . "." . PHP_SHLIB_SUFFIX;
            $extension_fullname = PHP_EXTENSION_DIR . "/" . $extension_soname;
                if (!extension_loaded($extension)) {
                    dl($extension_soname) or die("Can't load extension $extension_fullname\n");
                }
                exec(FFMPEG_LIBRARY . " -y -i '" . $FFMPEGInput . "' -vframes 1 -ss 00:00:03 -an -vcodec png -f rawvideo -s 160x90 '$FFMPEGInput.png'");
            }
            $FFMPEGProgressBar->Show($pkey, $count, $url, $passNeeded, $file, $timeUnique,$fmt);
            if (!$_GET["pkey"] || !file_exists('log/' . $_GET["pkey"] . '.ffmpeg')) {
                flush();
                $FFMPEGProgressBar = &new FFMPEGProgressBar();
                flush();
                @$FFMPEGProgressBar->execFFMPEG($FFMPEGInput, $FFMPEGOutput, $FFMPEGParams, $pkey);
    
                flush();
                $_SESSION['new_space']["video"] = 'data/' . $_SESSION['OldSession'] . "/" . $file;
                $_SESSION['new_space']["{$FileInstance['name']}"] = $flv_rpath;
                $_SESSION['new_space']["session"] =   $_SESSION['OldSession'];
            }
        //echo "End:Session holds: ".$_SESSION["cstatus"].", cstatus var = $cstatus";
        } else {
    //  header("Location: $url" . "index.php" );
            echo "<script type=\"text/javascript\">window.location.href=&#x27;" . $url . "&#x27;;</script>No Input";
        }
    // ShowProgressbar
        ?>
    
  • Create an Android application to convert an RTSP URL into an SRT URL and play the stream using ExoPlayer or ffplay [closed]

    25 February, by Anuj Vaish

    I want to create an Android application that converts an RTSP URL into an SRT URL and plays the stream using ExoPlayer or ffplay:

    Step 1 (done): Docker converts an mp4 file into an RTSP stream (rtsp://10.10.15.25:8554/mystream), verified in ffplay with "ffplay -rtsp_transport tcp rtsp://10.10.15.24:8554/mystream".

    Step 2 (done): The RTSP URL plays in ExoPlayer.

    Step 3 (pending): Convert the RTSP URL into an SRT URL (srt://live.cloudflare.com:778?passphrase=&streamid=); its playback URL is srt://live.cloudflare.com:778?passphrase=&streamid=play.

    Step 4: Play the SRT URL with ExoPlayer or ffplay in the Android application.

    I want to use ffmpeg-kit or its ffmpeg-full variant (or a similar dependency), but the default builds do not support the SRT protocol, and it is not clear from the documentation whether SRT is supported on Android.
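
    Step 3 is typically handled by relaying the RTSP input to an SRT output with an ffmpeg build that was configured with libsrt, which matches the note above that the default packages do not support SRT. A rough sketch, reusing the stream from Step 1 and an arbitrarily chosen port 9000 for illustration:

    # relay RTSP to SRT (requires ffmpeg built with --enable-libsrt)
    ffmpeg -rtsp_transport tcp -i rtsp://10.10.15.25:8554/mystream -c copy -f mpegts "srt://0.0.0.0:9000?mode=listener"

    # quick playback test
    ffplay "srt://10.10.15.25:9000?mode=caller"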