Advanced search

Media (1)

Keyword: - Tags -/ogg

Other articles (92)

  • Automated installation script of MediaSPIP

    25 April 2011

    To overcome the difficulties caused mainly by the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to make this step easier on a server running a compatible Linux distribution.
    To use it, you must have SSH access to your server and a root account, which the script will use to install the dependencies. Contact your hosting provider if you do not have these.
    The documentation of the use of this installation script is available here.
    The code of this (...)

  • Contributing to its documentation

    10 April 2011

    Documentation is one of the most important and most demanding tasks in the development of a technical tool.
    Any outside contribution on this subject is essential: reviewing what already exists; helping to write articles aimed at users (MediaSPIP administrators or simply content producers) or at developers; creating explanatory screencasts; translating the documentation into a new language;
    To do so, you can register on (...)

  • Managing the farm

    2 March 2010

    The farm as a whole is managed by "super admins".
    Some settings can be adjusted in order to balance the needs of the different channels.
    To start with, it uses the "Gestion de mutualisation" plugin

On other sites (7833)

  • How to convert multiple jpg images to MP4 format using FFmpeg?

    30 January 2024, by PDLAOZEI
      

    • I first used the FFmpeg command line to implement this:
ffmpeg -framerate 1 -pattern_type glob -i "./image/*.jpg" -c:v libx264 -crf 25 -vf format=yuv420p -movflags +faststart test.mp4

      


    • Then I used FFmpeg's API to convert multiple jpg images into MP4. The process is roughly:

    1. Find the AV_CODEC_ID_MPEG4 encoder.
    2. Open the encoder.
    3. Set the video stream parameters.
    4. Open the output file.
    5. Write the file header.
    6. Allocate the image frame and buffer.
    7. Write each jpg image to the video (in a loop):
       7-1. Read the jpg image.
       7-2. Find the video stream information.
       7-3. Decode the image frame.
       7-4. Convert it to yuv420p.
       7-5. Encode and write the video frame.
    8. Finally, free the memory.
    


    Update: here is the reproducible example.

    


    #include <iostream>
    #include <vector>
    #include <string>
    #include <chrono>
    #include <thread>

    #include <filesystem>
    #include <algorithm>
    #include <iomanip>
    #include <sys/stat.h>

    extern "C"
    {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libswscale/swscale.h>
    #include <libavutil/imgutils.h>
    }

    namespace fs = std::filesystem;

    bool compareFilenames(const std::string &str1, const std::string &str2)
    {
        //ToDo Sort
    }

    int main()
    {
        fs::path folderPath = ""; //replace with your image folder path

        std::vector<std::string> imageFiles;
        for (const auto &entry : fs::directory_iterator(folderPath))
        {
            if (entry.is_regular_file() && entry.path().extension() == ".jpg")
            {
                imageFiles.push_back(entry.path());
            }
        }

        std::sort(imageFiles.begin(), imageFiles.end(), compareFilenames);

        std::string outputVideoFile = "output.mp4";

        AVFormatContext *formatContext = nullptr;
        AVCodecContext *codecContext = nullptr;
        AVStream *videoStream = nullptr;

        av_register_all();
        avformat_alloc_output_context2(&formatContext, nullptr, nullptr, outputVideoFile.c_str());

        AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MPEG4); //AV_CODEC_ID_H264
        videoStream = avformat_new_stream(formatContext, codec);
        codecContext = avcodec_alloc_context3(codec);

        codecContext->codec_id = codec->id;
        codecContext->codec_type = AVMEDIA_TYPE_VIDEO;
        codecContext->pix_fmt = AV_PIX_FMT_YUV420P;
        codecContext->width = 1920;
        codecContext->height = 1080;
        codecContext->time_base = {1, 2};
        codecContext->bit_rate = 100000000;
        codecContext->gop_size = 1;

        avcodec_open2(codecContext, codec, nullptr);
        avcodec_parameters_from_context(videoStream->codecpar, codecContext);

        avio_open(&formatContext->pb, outputVideoFile.c_str(), AVIO_FLAG_WRITE);
        avformat_write_header(formatContext, NULL);

        AVFrame *frame = av_frame_alloc();

        std::vector<uint8_t> imageBuffer(codecContext->width * codecContext->height * 3);
        AVPacket packet;

        for (const std::string &imageFile : imageFiles)
        {
            AVFormatContext *inputFormatContext = avformat_alloc_context();
            avformat_open_input(&inputFormatContext, imageFile.c_str(), nullptr, nullptr);

            avformat_find_stream_info(inputFormatContext, nullptr);

            AVCodec *inputCodec = nullptr;
            AVCodecContext *inputCodecContext = nullptr;
            int videoStreamIndex = -1;

            for (unsigned int i = 0; i < inputFormatContext->nb_streams; i++)
            {
                if (inputFormatContext->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO)
                {
                    videoStreamIndex = i;
                    inputCodec = avcodec_find_decoder(inputFormatContext->streams[i]->codecpar->codec_id);
                    inputCodecContext = avcodec_alloc_context3(inputCodec);
                    avcodec_parameters_to_context(inputCodecContext, inputFormatContext->streams[i]->codecpar);
                    avcodec_open2(inputCodecContext, inputCodec, NULL);
                    break;
                }
            }

            AVFrame *inputFrame = av_frame_alloc();
            while (av_read_frame(inputFormatContext, &packet) >= 0)
            {
                if (packet.stream_index == videoStreamIndex)
                {
                    int response = avcodec_send_packet(inputCodecContext, &packet);

                    while (response >= 0)
                    {
                        response = avcodec_receive_frame(inputCodecContext, inputFrame);
                        if (response == AVERROR(EAGAIN) || response == AVERROR_EOF)
                        {
                            break;
                        }
                        else if (response < 0)
                        {
                            std::cerr << "Failed to receive frame from input decoder" << std::endl;
                            break;
                        }

                        AVFrame *yuvFrame = av_frame_alloc();
                        av_image_alloc(yuvFrame->data, yuvFrame->linesize, codecContext->width, codecContext->height, codecContext->pix_fmt, 1);

                        SwsContext *yuvSwsContext = sws_getContext(inputFrame->width, inputFrame->height, AV_PIX_FMT_RGB24,
                                                                   codecContext->width, codecContext->height, codecContext->pix_fmt,
                                                                   0, nullptr, nullptr, nullptr);

                        sws_scale(yuvSwsContext, inputFrame->data, inputFrame->linesize, 0, inputFrame->height,
                                  yuvFrame->data, yuvFrame->linesize);

                        memcpy(imageBuffer.data(), yuvFrame->data[0], codecContext->width * codecContext->height);
                        memcpy(imageBuffer.data() + codecContext->width * codecContext->height, yuvFrame->data[1], codecContext->width * codecContext->height / 4);
                        memcpy(imageBuffer.data() + codecContext->width * codecContext->height * 5 / 4, yuvFrame->data[2], codecContext->width * codecContext->height / 4);

                        frame->data[0] = imageBuffer.data();
                        frame->data[1] = frame->data[0] + codecContext->width * codecContext->height;
                        frame->data[2] = frame->data[1] + codecContext->width * codecContext->height / 4;
                        frame->linesize[0] = codecContext->width;
                        frame->linesize[1] = codecContext->width / 2;
                        frame->linesize[2] = codecContext->width / 2;
                        frame->width = codecContext->width;
                        frame->height = codecContext->height;
                        frame->format = codecContext->pix_fmt;
                        frame->pts = av_rescale_q(videoStream->nb_frames, videoStream->time_base, videoStream->codec->time_base);

                        av_init_packet(&packet);
                        packet.data = nullptr;
                        packet.size = 0;

                        avcodec_send_frame(codecContext, frame);

                        while (avcodec_receive_packet(codecContext, &packet) >= 0)
                        {
                            av_packet_rescale_ts(&packet, codecContext->time_base, videoStream->time_base);
                            packet.stream_index = videoStream->index;

                            av_write_frame(formatContext, &packet);
                            av_packet_unref(&packet);
                        }

                        av_freep(&yuvFrame->data[0]);
                        av_freep(&yuvFrame);
                        sws_freeContext(yuvSwsContext);
                    }
                }

                av_packet_unref(&packet);
            }

            av_frame_free(&inputFrame);
            avcodec_close(inputCodecContext);
            avformat_close_input(&inputFormatContext);
        }

        av_write_trailer(formatContext);

        av_frame_free(&frame);
        avcodec_close(codecContext);
        av_free(codecContext);
        avio_close(formatContext->pb);
        avformat_free_context(formatContext);

        return 0;
    }

    But the encoded output is MPEG-4, which cannot be opened in the player included with Windows; it reports an unsupported format.

    VLC can open and play it normally, but I need it to play in the player that comes with Windows, because the output video will be played in the browser.

    So I changed AV_CODEC_ID_MPEG4 to AV_CODEC_ID_H264, and the result is that, although the output file is very large, only one picture is displayed during playback. I don't know what the problem is.

    I was thinking about writing a new version, but I had no idea how to write one. I would appreciate it if someone could write me an example.
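    For comparison, a minimal hedged sketch of the per-image timestamp handling that a fixed-rate slideshow normally needs with these APIs; the function name encodeSlide, a fixed encoder time_base such as {1, 25}, and the caller-maintained frameIndex are assumptions, not code from the question:

    // Hedged sketch: encode one slideshow image, assuming codecContext->time_base
    // was set to a real frame rate (e.g. {1, 25}) and frameIndex is incremented by
    // the caller once per input image.
    static void encodeSlide(AVFormatContext *formatContext, AVStream *videoStream,
                            AVCodecContext *codecContext, AVFrame *frame, int64_t frameIndex)
    {
        frame->pts = frameIndex;  // one tick per image, in encoder time_base units

        avcodec_send_frame(codecContext, frame);

        AVPacket *pkt = av_packet_alloc();
        while (avcodec_receive_packet(codecContext, pkt) >= 0)
        {
            av_packet_rescale_ts(pkt, codecContext->time_base, videoStream->time_base);
            pkt->stream_index = videoStream->index;
            av_interleaved_write_frame(formatContext, pkt);  // interleaved write keeps timestamps monotonic
            av_packet_unref(pkt);
        }
        av_packet_free(&pkt);
    }

    With a time_base of {1, 25}, each increment of frameIndex advances the display time by 1/25 s, so every jpg gets its own moment on screen instead of collapsing onto a single picture.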

  • Use ffmpeg to pan right over an image and then pan left to the original location

    4 February 2024, by user1517922

    I'm using this command to pan to the right for 5 seconds on an image (like a 1080x1080 window moving across a larger image, 500px from the top):

    ffmpeg -loop 1 -r 30 -i image.jpg -vf crop=w=1080:h=1080:x=n:y=500 -frames:v 150 -pix_fmt yuv420p -preset fast -crf 30 video.mp4

    I'd like to pan back to the left for 5 seconds to end up at the same spot.

    The desired effect is that when the video is looped, the square window pans right for 5 seconds, then back left for 5 seconds, then repeats, smoothly and without flicker.

    I suspect I need to use zoompan, but I haven't had success trying that.
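    A hedged sketch of one way to get the round trip with the same crop filter, driving x with a frame-number expression instead of zoompan; the 1 px-per-frame step and the 300-frame total are assumptions carried over from the command above:

    ffmpeg -loop 1 -r 30 -i image.jpg -vf "crop=w=1080:h=1080:x='if(lt(n,150),n,300-n)':y=500" -frames:v 300 -pix_fmt yuv420p -preset fast -crf 30 video.mp4

    Frames 0-149 move the window right one pixel per frame and frames 150-299 move it back, so a looping player restarts almost exactly at the starting position.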

  • Use FFmpeg to crop a video to square format in Android

    22 August 2016, by Cédric Portmann

    I am currently working on an app that is supposed to crop a 16:9 video into a 1:1 video. However, I don't get the code working. If possible, the software should convert the input video as fast as possible. The resolution can vary between 480x480 and 720x720.

    If I could choose the position of the crop frame, it would be perfect.
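    For reference, the position of the crop window is what the crop filter's x:y arguments control; a hedged command-line sketch of a horizontally centered square crop (file names are placeholders, not from the question):

    ffmpeg -i input.mp4 -vf "crop=in_h:in_h:(in_w-in_h)/2:0" -preset ultrafast output.mp4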

    The error I get:

    E/FFmpeg: Exception while trying to run: [Ljava.lang.String;@f0c91b8
    java.io.IOException: Error running exec(). Command: [/data/user/0/com.android.grafika/files/ffmpeg, -i /storage/emulated/0/Alphacrypt1.mp4 -vf crop=640:256:0:400 -threads 5 -preset ultrafast -strict -2 /storage/emulated/0/YourCroppedMovie.mp4] Working Directory: null Environment: null
    at java.lang.ProcessManager.exec(ProcessManager.java:215)
    at java.lang.Runtime.exec(Runtime.java:174)
    at java.lang.Runtime.exec(Runtime.java:129)
    at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
    at android.os.AsyncTask$2.call(AsyncTask.java:295)
    at java.util.concurrent.FutureTask.run(FutureTask.java:237)
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
    at java.lang.Thread.run(Thread.java:818)
    Caused by: java.io.IOException: No such file or directory
    at java.lang.ProcessManager.exec(Native Method)
    at java.lang.ProcessManager.exec(ProcessManager.java:213)
    at java.lang.Runtime.exec(Runtime.java:174) 
    at java.lang.Runtime.exec(Runtime.java:129) 
    at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10) 
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38) 
    at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10) 
    at android.os.AsyncTask$2.call(AsyncTask.java:295) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:237) 
    at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588) 
    at java.lang.Thread.run(Thread.java:818) 

    The code I am using:

    final String[] cmd = new String[]{"-i /storage/emulated/0/Alphacrypt1.mp4 -vf crop=640:256:0:400 -threads 5 -preset ultrafast -strict -2 /storage/emulated/0/YourCroppedMovie.mp4"};

       try {
           final FFmpeg ffmpeg = FFmpeg.getInstance(this);
           ffmpeg.execute(cmd, new FFmpegExecuteResponseHandler() {
               @Override
               public void onSuccess(String message) {
                   Toast.makeText(getApplicationContext(), "Successfully converted!",
                           Toast.LENGTH_LONG).show();
               }

               @Override
               public void onProgress(String message) {

               }

               @Override
               public void onFailure(String message) {
                   Toast.makeText(getApplicationContext(), "Fail!"+ message,
                           Toast.LENGTH_LONG).show();
               }

               @Override
               public void onStart() {
                   Toast.makeText(getApplicationContext(), "Started!",
                           Toast.LENGTH_LONG).show();
               }

               @Override
               public void onFinish() {
                   Toast.makeText(getApplicationContext(), "Stopped!",
                           Toast.LENGTH_LONG).show();
               }
           });
       } catch (FFmpegCommandAlreadyRunningException e) {

       }

    Thank you for your help.

    SOLUTION:

    1. I did not add the "Load Binary" part, which is necessary to run the
      FFmpeg library (http://writingminds.github.io/ffmpeg-android-java/).
    2. The command needs to be split using .split(" "), as printfmyname already pointed out (see the sketch below).
    3. For now I use -vf crop=1080:1080:0:0 to crop the video to a square (without messing up the aspect ratio).
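    A hedged sketch of how points 1 and 2 fit together with the library linked above; the paths and crop values are taken from the question, while the exact error handling is an assumption:

    // Hedged sketch: load the ffmpeg binary once, then pass the command as a
    // String[] split on spaces rather than as a single string.
    final FFmpeg ffmpeg = FFmpeg.getInstance(this);

    try {
        // the "Load Binary" step required before any execute() call
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onFailure() {
                // the device/ABI is not supported by the bundled binary
            }
        });
    } catch (FFmpegNotSupportedException e) {
        // no usable ffmpeg binary on this device
    }

    // split the command so every argument becomes its own array element
    String command = "-i /storage/emulated/0/Alphacrypt1.mp4 -vf crop=1080:1080:0:0 "
            + "-threads 5 -preset ultrafast -strict -2 /storage/emulated/0/YourCroppedMovie.mp4";
    final String[] cmd = command.split(" ");

    try {
        ffmpeg.execute(cmd, new FFmpegExecuteResponseHandler() {
            @Override
            public void onSuccess(String message) { /* conversion finished */ }

            @Override
            public void onProgress(String message) { /* ffmpeg console output */ }

            @Override
            public void onFailure(String message) { /* ffmpeg reported an error */ }

            @Override
            public void onStart() { }

            @Override
            public void onFinish() { }
        });
    } catch (FFmpegCommandAlreadyRunningException e) {
        // a previous command is still running
    }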