
Media (1)
-
MediaSPIP Simple: future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
Other articles (111)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...) -
MediaSPIP version 0.1 Beta
16 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm" mode, you will also need to make other modifications (...) -
Improvements to the base version
13 September 2013
Nicer multiple-select fields
The Chosen plugin improves the usability of multiple-select form fields. Compare the two following images.
To use it, enable the Chosen plugin (Site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)
On other sites (11807)
-
Use FFmpeg to crop a video to square format in Android
22 August 2016, by Cédric Portmann
I am currently working on an app that is supposed to crop a 16:9 video into a 1:1 video. However, I don't get the code working. If possible, the software should convert the input video as fast as possible. The resolution can vary between 480x480 and 720x720.
If I could choose the position of the crop frame, it would be perfect.
The error I get:
E/FFmpeg: Exception while trying to run: [Ljava.lang.String;@f0c91b8
java.io.IOException: Error running exec(). Command: [/data/user/0/com.android.grafika/files/ffmpeg, -i /storage/emulated/0/Alphacrypt1.mp4 -vf crop=640:256:0:400 -threads 5 -preset ultrafast -strict -2 /storage/emulated/0/YourCroppedMovie.mp4] Working Directory: null Environment: null
at java.lang.ProcessManager.exec(ProcessManager.java:215)
at java.lang.Runtime.exec(Runtime.java:174)
at java.lang.Runtime.exec(Runtime.java:129)
at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.io.IOException: No such file or directory
at java.lang.ProcessManager.exec(Native Method)
at java.lang.ProcessManager.exec(ProcessManager.java:213)
at java.lang.Runtime.exec(Runtime.java:174)
at java.lang.Runtime.exec(Runtime.java:129)
at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)

The code I am using:
final String[] cmd = new String[]{"-i /storage/emulated/0/Alphacrypt1.mp4 -vf crop=640:256:0:400 -threads 5 -preset ultrafast -strict -2 /storage/emulated/0/YourCroppedMovie.mp4"};
try {
    final FFmpeg ffmpeg = FFmpeg.getInstance(this);
    ffmpeg.execute(cmd, new FFmpegExecuteResponseHandler() {
        @Override
        public void onSuccess(String message) {
            Toast.makeText(getApplicationContext(), "Successfully converted!",
                    Toast.LENGTH_LONG).show();
        }

        @Override
        public void onProgress(String message) {
        }

        @Override
        public void onFailure(String message) {
            Toast.makeText(getApplicationContext(), "Fail! " + message,
                    Toast.LENGTH_LONG).show();
        }

        @Override
        public void onStart() {
            Toast.makeText(getApplicationContext(), "Started!",
                    Toast.LENGTH_LONG).show();
        }

        @Override
        public void onFinish() {
            Toast.makeText(getApplicationContext(), "Stopped!",
                    Toast.LENGTH_LONG).show();
        }
    });
} catch (FFmpegCommandAlreadyRunningException e) {
}

Thank you for your help.
SOLUTION:
- I did not add the "Load Binary" part, which is necessary to run the FFmpeg library (http://writingminds.github.io/ffmpeg-android-java/).
- The command needs to be split using .split(" ");, as printfmyname already said.
- For now I use -vf crop=1080:1080:0:0 to crop the video to a square (without messing up the ratio).
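The wrapper hands the array to exec() with one argument per element, so a single string containing the whole command line is treated as one (nonexistent) executable path, which explains the "No such file or directory" error. A minimal sketch of the tokenization fix (the paths are the ones from the question; the resulting array is what would then be passed to ffmpeg.execute(cmd, handler)):

```java
// Sketch: build the ffmpeg command as one line, then split it so each
// flag and value becomes its own array element, as the wrapper expects.
public class CmdSplit {
    public static String[] buildCommand(String input, String output) {
        String line = "-i " + input
                + " -vf crop=1080:1080:0:0 -threads 5 -preset ultrafast -strict -2 "
                + output;
        return line.split(" ");
    }

    public static void main(String[] args) {
        String[] cmd = buildCommand("/storage/emulated/0/Alphacrypt1.mp4",
                "/storage/emulated/0/YourCroppedMovie.mp4");
        // Each token is now its own element, e.g. cmd[0] is "-i".
        System.out.println(cmd.length); // 11
    }
}
```

Note that this naive split breaks if a path contains spaces; in that case the arguments should be built as an array directly instead of splitting a string.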
-
Use ffmpeg to pan right over an image and then pan left to the original location
4 February 2024, by user1517922
I'm using this command to pan to the right for 5 seconds on an image (like a 1080x1080 window moving across a larger image, 500px from the top):


ffmpeg -loop 1 -r 30 -i image.jpg -vf crop=w=1080:h=1080:x=n:y=500 -frames:v 150 -pix_fmt yuv420p -preset fast -crf 30 video.mp4


I'd like to pan back to the left for 5 seconds to end up at the same spot.


The desired effect is that when the video loops, the square window pans right for 5 seconds, then back left for 5 seconds, then repeats, smoothly and without flicker.


I suspect I need to use zoompan, but I haven't had success trying that.
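One possible approach (a sketch, untested, assuming image.jpg is wider than 1080 px): stay with crop, but drive its x with a triangle-wave expression of t, so the window ramps from the left edge to the right edge over the first 5 seconds and back over the next 5. Because x is the same at t=0 and t=10, the loop should be seamless:

```shell
# Pan right for 5 s, then back left for 5 s (300 frames at 30 fps).
# x follows a triangle wave of t: 0 -> iw-1080 -> 0.
ffmpeg -loop 1 -r 30 -i image.jpg \
  -vf "crop=w=1080:h=1080:y=500:x='(iw-1080)*(1-abs(1-t/5))'" \
  -frames:v 300 -pix_fmt yuv420p -preset fast -crf 30 video.mp4
```

crop evaluates x per frame and exposes t (timestamp in seconds) and iw (input width), so no zoompan is needed for a pure pan.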


-
How to convert multiple jpg images to MP4 format using FFmpeg ?
30 January 2024, by PDLAOZEI

-
I first used the FFmpeg command line to implement this function.

ffmpeg -framerate 1 -pattern_type glob -i "./image/*.jpg" -c:v libx264 -crf 25 -vf format=yuv420p -movflags +faststart test.mp4


-
Then I use FFmpeg's API to convert multiple jpgs into MP4 format; the process is roughly:


1. Find the AV_CODEC_ID_MPEG4 encoder.
2. Open the encoder.
3. Set the video stream parameters.
4. Open the output file.
5. Write the file header.
6. Allocate the image frame and buffer.
7. Write each jpg image to the video (in a loop):
   7-1. Read the jpg image.
   7-2. Find the video stream information.
   7-3. Decode the image frame.
   7-4. Convert to yuv420p.
   7-5. Encode and write the video frame.
8. Finally, free the memory.
Update: here is the reproducible example:


#include <iostream>
#include <vector>
#include <string>
#include <chrono>
#include <thread>

#include <filesystem>
#include <algorithm>
#include <iomanip>
#include <sys/stat.h>

extern "C"
{
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <libavutil/imgutils.h>
}

namespace fs = std::filesystem;

bool compareFilenames(const std::string &str1, const std::string &str2)
{
 // Plain lexicographic order; replace with a natural sort if needed.
 return str1 < str2;
}

int main()
{
 
 fs::path folderPath = ""; // replace with your image folder path

 std::vector<std::string> imageFiles;
 for (const auto &entry : fs::directory_iterator(folderPath))
 {
 if (entry.is_regular_file() && entry.path().extension() == ".jpg")
 {
 imageFiles.push_back(entry.path());
 }
 }

 std::sort(imageFiles.begin(), imageFiles.end(), compareFilenames);

 std::string outputVideoFile = "output.mp4";

 AVFormatContext *formatContext = nullptr;
 AVCodecContext *codecContext = nullptr;
 AVStream *videoStream = nullptr;

 av_register_all();
 avformat_alloc_output_context2(&formatContext, nullptr, nullptr, outputVideoFile.c_str());


 AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MPEG4);//AV_CODEC_ID_H264
 videoStream = avformat_new_stream(formatContext, codec);
 codecContext = avcodec_alloc_context3(codec);

 codecContext->codec_id = codec->id;
 codecContext->codec_type = AVMEDIA_TYPE_VIDEO;
 codecContext->pix_fmt = AV_PIX_FMT_YUV420P;
 codecContext->width = 1920;
 codecContext->height = 1080;
 codecContext->time_base = {1, 2}; 
 codecContext->bit_rate = 100000000;
 codecContext->gop_size = 1;

 avcodec_open2(codecContext, codec, nullptr);
 avcodec_parameters_from_context(videoStream->codecpar, codecContext);

 avio_open(&formatContext->pb, outputVideoFile.c_str(), AVIO_FLAG_WRITE);
 avformat_write_header(formatContext, NULL);

 AVFrame *frame = av_frame_alloc();

 std::vector<uint8_t> imageBuffer(codecContext->width * codecContext->height * 3);
 AVPacket packet;

 for (const std::string &imageFile : imageFiles)
 {

 AVFormatContext *inputFormatContext = avformat_alloc_context();
 avformat_open_input(&inputFormatContext, imageFile.c_str(), nullptr, nullptr);

 avformat_find_stream_info(inputFormatContext, nullptr);

 AVCodec *inputCodec = nullptr;
 AVCodecContext *inputCodecContext = nullptr;
 int videoStreamIndex = -1;

 for (unsigned int i = 0; i < inputFormatContext->nb_streams; i++)
 {
 if (inputFormatContext->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO)
 {
 videoStreamIndex = i;
 inputCodec = avcodec_find_decoder(inputFormatContext->streams[i]->codecpar->codec_id);
 inputCodecContext = avcodec_alloc_context3(inputCodec);
 avcodec_parameters_to_context(inputCodecContext, inputFormatContext->streams[i]->codecpar);
 avcodec_open2(inputCodecContext, inputCodec, NULL);
 break;
 }
 }

 AVFrame *inputFrame = av_frame_alloc();
 while (av_read_frame(inputFormatContext, &packet) >= 0)
 {
 if (packet.stream_index == videoStreamIndex)
 {
 int response = avcodec_send_packet(inputCodecContext, &packet);

 while (response >= 0)
 {
 response = avcodec_receive_frame(inputCodecContext, inputFrame);
 if (response == AVERROR(EAGAIN) || response == AVERROR_EOF)
 {
 break;
 }
 else if (response < 0)
 {
 std::cerr << "Failed to receive frame from input decoder" << std::endl;
 break;
 }

 AVFrame *yuvFrame = av_frame_alloc();
 av_image_alloc(yuvFrame->data, yuvFrame->linesize, codecContext->width, codecContext->height, codecContext->pix_fmt, 1);

 SwsContext *yuvSwsContext = sws_getContext(inputFrame->width, inputFrame->height, AV_PIX_FMT_RGB24,
 codecContext->width, codecContext->height, codecContext->pix_fmt,
 0, nullptr, nullptr, nullptr);

 sws_scale(yuvSwsContext, inputFrame->data, inputFrame->linesize, 0, inputFrame->height,
 yuvFrame->data, yuvFrame->linesize);


 memcpy(imageBuffer.data(), yuvFrame->data[0], codecContext->width * codecContext->height);
 memcpy(imageBuffer.data() + codecContext->width * codecContext->height, yuvFrame->data[1], codecContext->width * codecContext->height / 4);
 memcpy(imageBuffer.data() + codecContext->width * codecContext->height * 5 / 4, yuvFrame->data[2], codecContext->width * codecContext->height / 4);

 frame->data[0] = imageBuffer.data();
 frame->data[1] = frame->data[0] + codecContext->width * codecContext->height;
 frame->data[2] = frame->data[1] + codecContext->width * codecContext->height / 4;
 frame->linesize[0] = codecContext->width;
 frame->linesize[1] = codecContext->width / 2;
 frame->linesize[2] = codecContext->width / 2;
 frame->width = codecContext->width;
 frame->height = codecContext->height;
 frame->format = codecContext->pix_fmt;
 frame->pts = av_rescale_q(videoStream->nb_frames, videoStream->time_base, videoStream->codec->time_base);

 av_init_packet(&packet);
 packet.data = nullptr;
 packet.size = 0;

 avcodec_send_frame(codecContext, frame);

 while (avcodec_receive_packet(codecContext, &packet) >= 0)
 {
 av_packet_rescale_ts(&packet, codecContext->time_base, videoStream->time_base);
 packet.stream_index = videoStream->index;

 av_write_frame(formatContext, &packet);
 av_packet_unref(&packet);
 }

 av_freep(&yuvFrame->data[0]);
 av_freep(&yuvFrame);
 sws_freeContext(yuvSwsContext);
 }
 }

 av_packet_unref(&packet);
 }

 av_frame_free(&inputFrame);
 avcodec_close(inputCodecContext);
 avformat_close_input(&inputFormatContext);
 }

 av_write_trailer(formatContext);

 av_frame_free(&frame);
 avcodec_close(codecContext);
 av_free(codecContext);
 avio_close(formatContext->pb);
 avformat_free_context(formatContext);

 return 0;
}




But the encoded file is MPEG-4 Part 2, which the player included with Windows cannot open; it reports an unsupported format.


Although VLC can open and play it normally, I need it to play in the built-in Windows player, because the output video will be played in a browser.


So I changed AV_CODEC_ID_MPEG4 to AV_CODEC_ID_H264. The result: although the output file is very large, only one picture is displayed during playback. I don't know what the problem is.


I was thinking about rewriting it from scratch, but I had no idea how.
I would appreciate it if someone could write me an example.

