
Media (1)
-
The conservation of net art in museums. The strategies at work
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (84)
-
Updating from version 0.1 to 0.2
24 June 2013
An explanation of the notable changes made when moving from version 0.1 of MediaSPIP to version 0.3. What are the new features?
Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Customising by adding your logo, banner or background image
5 September 2013
Some themes take into account three customisation elements: adding a logo; adding a banner; adding a background image.
-
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)
On other sites (12424)
-
Getting log line for each extracted frame from FFMPEG
3 February 2016, by wpfwannabe
I am using FFMPEG.exe to extract frames from various videos. As this is a programmatic solution, and getting the total frame count and/or duration can prove tricky (with ffprobe), I am thinking I could use the console output to detect individual frames' timestamps, but I am only getting a single output line every N frames, like this:
frame= 20 fps=0.0 q=0.0 size= 0kB time=00:00:01.72 bitrate= 0.0kbits/s
frame= 40 fps= 38 q=0.0 size= 0kB time=00:00:04.02 bitrate= 0.0kbits/s
frame= 60 fps= 39 q=0.0 size= 0kB time=00:00:06.14 bitrate= 0.0kbits/s
frame= 70 fps= 38 q=0.0 Lsize= 0kB time=00:00:07.86 bitrate= 0.0kbits/s
Is there a command-line option to force output for each and every frame? If so, I could extract the time= portion. This is the command line currently used:
ffmpeg.exe -i video.avi -y -threads 0 -vsync 2 %10d.jpeg
Ideally, I would replace %10d.jpeg with some other format that writes the frame's timestamp, but I don't think such a format exists.
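A hedged note, not from the original post: one way to get a line of console output per frame is ffmpeg's showinfo filter, which logs each frame's index and timestamp as it passes through the filter chain. A minimal sketch based on the command above:
ffmpeg.exe -i video.avi -y -threads 0 -vsync 2 -vf showinfo %10d.jpeg
Every extracted frame then adds a [Parsed_showinfo_0 ...] line to the console output containing n: and pts_time: fields, which the calling program can parse instead of the periodic frame=/time= progress lines.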
-
C# get dominant color in an image
22 January 2017, by CK13
I'm building a program that makes screenshots from a video. It extracts frames from the video (with ffmpeg) and then combines them into one file. All works fine, except that sometimes I get (almost) black images, mostly at the beginning and end of the video.
A possible solution I can think of is to detect if the extracted frame is dark. If it is dark, extract another frame from a slightly different time.
How can I detect if the extracted frame is dark/black? Or is there another way I can solve this?
private void getScreenshots_Click(object sender, EventArgs e)
{
    int index = 0;
    foreach (string value in this.filesList.Items)
    {
        string file = selectedFiles[index] + "\\" + value;
        // ------------------------------------------------
        // MediaInfo
        // ------------------------------------------------
        // https://github.com/Nicholi/MediaInfoDotNet
        //
        // get file width, height, and frame count
        //
        // get aspect ratio of the video
        // and calculate height of thumbnail
        // using width and aspect ratio
        //
        MediaInfo MI = new MediaInfo();
        MI.Open(file);
        var width = MI.Get(StreamKind.Video, 0, "Width");
        var height = MI.Get(StreamKind.Video, 0, "Height");
        decimal d = Decimal.Parse(MI.Get(StreamKind.Video, 0, "Duration"));
        decimal frameCount = Decimal.Parse(MI.Get(StreamKind.Video, 0, "FrameCount"));
        MI.Close();

        decimal ratio = Decimal.Divide(Decimal.Parse(width), Decimal.Parse(height));
        int newHeight = Decimal.ToInt32(Decimal.Divide(newWidth, ratio));
        decimal startTime = Decimal.Divide(d, totalImages);

        // totalImages - number of thumbnails the final image will have
        for (int x = 0; x < totalImages; x++)
        {
            // increase the time where the thumbnail is taken on each iteration
            decimal newTime = Decimal.Multiply(startTime, x);
            string time = TimeSpan.FromMilliseconds(double.Parse(newTime.ToString())).ToString(@"hh\:mm\:ss");
            string outputFile = this.tmpPath + "img-" + index + x + ".jpg";

            // create individual thumbnails with ffmpeg
            proc = new Process();
            proc.StartInfo.FileName = "ffmpeg.exe";
            proc.StartInfo.Arguments = "-y -seek_timestamp 1 -ss " + time + " -i \"" + file + "\" -frames:v 1 -qscale:v 3 \"" + outputFile + "\"";
            proc.Start();
            proc.WaitForExit();
        }

        // set width and height of final image
        int w = (this.cols * newWidth) + (this.spacing * this.cols + this.spacing);
        int h = (this.rows * newHeight) + (this.spacing * this.rows + this.spacing);
        int left, top, i = 0;

        // combine individual thumbnails into one image
        using (Bitmap bmp = new Bitmap(w, h))
        {
            using (Graphics g = Graphics.FromImage(bmp))
            {
                g.Clear(this.backgroundColor);
                // this.rows - number of rows
                for (int y = 0; y < this.rows; y++)
                {
                    // put images on a column
                    // this.cols - number of columns
                    // when x = number of columns go to next row
                    for (int x = 0; x < this.cols; x++)
                    {
                        Image imgFromFile = Image.FromFile(this.tmpPath + "img-" + index + i + ".jpg");
                        MemoryStream imgFromStream = new MemoryStream();
                        imgFromFile.Save(imgFromStream, imgFromFile.RawFormat);
                        imgFromFile.Dispose();
                        left = (x * newWidth) + ((x + 1) * this.spacing);
                        top = (this.spacing * (y + 1)) + (newHeight * y);
                        g.DrawImage(Image.FromStream(imgFromStream), left, top, newWidth, newHeight);
                        i++;
                    }
                }
            }
            // save the final image
            bmp.Save(selectedFiles[index] + "\\" + value + ".jpg");
        }
        index++;
    }
}
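A hedged sketch of one way to tackle the dark-frame question above, not taken from the original thread: load the extracted JPEG, compute its average luma, and treat the frame as dark when that average falls below a threshold. IsFrameDark is a hypothetical helper and the threshold of 40 is an arbitrary assumption:

// Hypothetical helper, not part of the original code: returns true when the
// average luma (0-255) of the image is below the given threshold.
private static bool IsFrameDark(string path, double threshold = 40.0)
{
    using (Bitmap bmp = new Bitmap(path))
    {
        double sum = 0;
        long samples = 0;
        // Sample every 4th pixel; GetPixel is slow but keeps the sketch simple.
        for (int y = 0; y < bmp.Height; y += 4)
        {
            for (int x = 0; x < bmp.Width; x += 4)
            {
                Color c = bmp.GetPixel(x, y);
                // Rec. 601 luma approximation
                sum += 0.299 * c.R + 0.587 * c.G + 0.114 * c.B;
                samples++;
            }
        }
        return samples > 0 && (sum / samples) < threshold;
    }
}

The thumbnail loop could then re-run ffmpeg with a slightly later -ss value whenever IsFrameDark(outputFile) returns true. Another option, also an assumption rather than something from the thread, is ffmpeg's blackframe filter, which logs frames whose proportion of near-black pixels exceeds a threshold.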
-
Difference between 'display_aspect_ratio' and 'sample_aspect_ratio' in ffprobe [duplicate]
18 June 2018, by John Allard
This question already has an answer here:
-
ffmpeg scaling not working for video
1 answer
I have an issue where a video plays in the correct 16:9 aspect ratio in VLC or the QuickTime player, but when I attempt to extract individual frames with ffmpeg, the frames come out with a 4:3 aspect ratio.
The ffprobe output for the video in question is as follows:
$ ffprobe -v error -select_streams v:0 -show_entries stream -of default=noprint_wrappers=1 -print_format json movie.mp4
{
    "programs": [

    ],
    "streams": [
        {
            "index": 0,
            "codec_name": "h264",
            "codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
            "profile": "Main",
            "codec_type": "video",
            "codec_time_base": "126669/6400000",
            "codec_tag_string": "avc1",
            "codec_tag": "0x31637661",
            "width": 2592,
            "height": 1944,
            "coded_width": 2592,
            "coded_height": 1944,
            "has_b_frames": 0,
            "sample_aspect_ratio": "4:3",
            "display_aspect_ratio": "16:9",
            "pix_fmt": "yuvj420p",
            "level": 50,
            "color_range": "pc",
            "color_space": "bt709",
            "color_transfer": "bt709",
            "color_primaries": "bt709",
            "chroma_location": "left",
            "refs": 1,
            "is_avc": "true",
            "nal_length_size": "4",
            "r_frame_rate": "25/1",
            "avg_frame_rate": "3200000/126669",
            "time_base": "1/12800",
            "start_pts": 0,
            "start_time": "0.000000",
            "duration_ts": 126682,
            "duration": "9.897031",
            "bit_rate": "4638928",
            "bits_per_raw_sample": "8",
            "nb_frames": "250",
            "disposition": {
                "default": 1,
                "dub": 0,
                "original": 0,
                "comment": 0,
                "lyrics": 0,
                "karaoke": 0,
                "forced": 0,
                "hearing_impaired": 0,
                "visual_impaired": 0,
                "clean_effects": 0,
                "attached_pic": 0,
                "timed_thumbnails": 0
            },
            "tags": {
                "language": "und",
                "handler_name": "VideoHandler"
            }
        }
    ]
}
So it says:
"width": 2592,
"height": 1944,
"coded_width": 2592,
"coded_height": 1944,
"has_b_frames": 0,
"sample_aspect_ratio": "4:3",
"display_aspect_ratio": "16:9",which seems odd to me. The width/height are in 4:3, the sample aspect ratio is 4:3, the display is 16:9 ?
Now, when I play this through VLC or QuickTime the video looks fine, but if I run an ffmpeg command to extract individual frames from this video, they come out in 4:3:
ffmpeg -y -hide_banner -nostats -loglevel error -i movie.mp4 -vf select='eq(n\,10)+eq(n\,20)+eq(n\,30)+eq(n\,40)',scale=-1:640 -vsync 0 /tmp/ffmpeg_image_%04d.jpg
So I guess my questions are as follows:
- What is the relation between the display aspect ratio, the sample aspect ratio, and the width/height ratio?
- How do I get ffmpeg to output frames in the correct aspect ratio?
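A general note, not part of the original thread: for an anamorphic stream like this one the three numbers are consistent, because the display aspect ratio is the storage ratio (width/height) multiplied by the sample (pixel) aspect ratio:

DAR = (width / height) * SAR = (2592 / 1944) * (4 / 3) = (4 / 3) * (4 / 3) = 16 / 9

Players such as VLC honour the SAR and stretch each pixel horizontally, while most image viewers ignore any pixel-aspect metadata in a JPEG, so the extracted frames keep the 4:3 storage shape. A hedged sketch of a fix, assuming the rest of the original command is kept: resample to square pixels (setsar=1) before the final scale, for example

ffmpeg -y -hide_banner -nostats -loglevel error -i movie.mp4 -vf "select='eq(n\,10)+eq(n\,20)+eq(n\,30)+eq(n\,40)',scale=iw*sar:ih,setsar=1,scale=-1:640" -vsync 0 /tmp/ffmpeg_image_%04d.jpg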