
Other articles (40)
-
APPENDIX: Extensions, the SPIP plugins of the channels
11 February 2010, by
A plugin is a functional addition to the main SPIP core. MediaSPIP consists of a deliberate selection of plugins, some of which already existed in the SPIP community and some of which did not; in some cases they had to be created from scratch, in others existing plugins needed added features.
The extensions MediaSPIP needs in order to work
Since version 2.1.0, SPIP has allowed plugins to be added to the extensions/ directory.
"Extensions" are nothing more nor less than plugins whose particularity is that they (...) -
Changing your graphic theme
22 February 2011, by
The graphic theme does not touch the actual layout of the elements on the page. It only changes the appearance of the elements.
The placement can indeed be altered, but that change is purely visual and does not affect the semantic representation of the page.
Changing the graphic theme in use
To change the graphic theme in use, the zen-garden plugin must be enabled on the site.
Then simply go to the configuration area of the (...) -
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including:
- critique of existing features and functions
- articles contributed by developers, administrators, content producers and editors
- screenshots to illustrate the above
- translations of existing documentation into other languages
To contribute, register for the project users' mailing (...)
On other sites (5030)
-
Difference between 'display_aspect_ratio' and 'sample_aspect_ratio' in ffprobe [duplicate]
18 June 2018, by John Allard
This question already has an answer here:
-
ffmpeg scaling not working for video
1 answer
I have an issue where a video plays at the correct 16:9 aspect ratio in VLC or QuickTime Player, but when I attempt to extract individual frames with ffmpeg, the frames come out at a 4:3 aspect ratio.
The ffprobe output for the video in question is as follows:
$ ffprobe -v error -select_streams v:0 -show_entries stream -of default=noprint_wrappers=1 -print_format json movie.mp4
{
    "programs": [
    ],
    "streams": [
        {
            "index": 0,
            "codec_name": "h264",
            "codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
            "profile": "Main",
            "codec_type": "video",
            "codec_time_base": "126669/6400000",
            "codec_tag_string": "avc1",
            "codec_tag": "0x31637661",
            "width": 2592,
            "height": 1944,
            "coded_width": 2592,
            "coded_height": 1944,
            "has_b_frames": 0,
            "sample_aspect_ratio": "4:3",
            "display_aspect_ratio": "16:9",
            "pix_fmt": "yuvj420p",
            "level": 50,
            "color_range": "pc",
            "color_space": "bt709",
            "color_transfer": "bt709",
            "color_primaries": "bt709",
            "chroma_location": "left",
            "refs": 1,
            "is_avc": "true",
            "nal_length_size": "4",
            "r_frame_rate": "25/1",
            "avg_frame_rate": "3200000/126669",
            "time_base": "1/12800",
            "start_pts": 0,
            "start_time": "0.000000",
            "duration_ts": 126682,
            "duration": "9.897031",
            "bit_rate": "4638928",
            "bits_per_raw_sample": "8",
            "nb_frames": "250",
            "disposition": {
                "default": 1,
                "dub": 0,
                "original": 0,
                "comment": 0,
                "lyrics": 0,
                "karaoke": 0,
                "forced": 0,
                "hearing_impaired": 0,
                "visual_impaired": 0,
                "clean_effects": 0,
                "attached_pic": 0,
                "timed_thumbnails": 0
            },
            "tags": {
                "language": "und",
                "handler_name": "VideoHandler"
            }
        }
    ]
}
So it says
"width": 2592,
"height": 1944,
"coded_width": 2592,
"coded_height": 1944,
"has_b_frames": 0,
"sample_aspect_ratio": "4:3",
"display_aspect_ratio": "16:9",which seems odd to me. The width/height are in 4:3, the sample aspect ratio is 4:3, the display is 16:9 ?
Now, when I play this through VLC/QuickTime the video looks fine (screenshot below), but if I run an ffmpeg command to extract individual frames from this video, they come out in 4:3:
ffmpeg -y -hide_banner -nostats -loglevel error -i movie.mp4 -vf select='eq(n\,10)+eq(n\,20)+eq(n\,30)+eq(n\,40)',scale=-1:640 -vsync 0 /tmp/ffmpeg_image_%04d.jpg
So I guess my questions are as follows:
- what is the relation between the display aspect ratio, the sample aspect ratio, and the width/height ratio?
- how do I get ffmpeg to output frames in the correct aspect ratio? (see the sketch after this list)
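For reference (my own note, not part of the original question): the three values are consistent, because the display aspect ratio is the stored width/height ratio multiplied by the sample aspect ratio, i.e. (2592/1944) × (4/3) = (4/3) × (4/3) = 16/9. The stored pixels are anamorphic, and image viewers treat JPEG pixels as square, so a filter chain like scale=-1:640, which only follows the stored dimensions, produces frames that look 4:3. One possible fix (an untested sketch; dar is a standard scale-filter expression variable and setsar a standard filter, the rest mirrors the command above) is to derive the output width from the display aspect ratio and mark the result as square-pixel:
ffmpeg -y -hide_banner -nostats -loglevel error -i movie.mp4 -vf "select='eq(n\,10)+eq(n\,20)+eq(n\,30)+eq(n\,40)',scale=trunc(640*dar/2)*2:640,setsar=1" -vsync 0 /tmp/ffmpeg_image_%04d.jpg
Here scale=trunc(640*dar/2)*2:640 picks an even output width close to 640 × 16/9 (1136 in this case), and setsar=1 marks the output frames as having square pixels.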
-
C# get dominant color in an image
22 January 2017, by CK13
I'm building a program that makes screenshots from a video. It extracts frames from the video (with ffmpeg) and then combines them into one file. All works fine, except that sometimes I get (almost) black images, mostly at the beginning and end of the video.
A possible solution I can think of is to detect if the extracted frame is dark. If it is dark, extract another frame from a slightly different time.
How can I detect if the extracted frame is dark/black? Or is there another way I can solve this?
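One way to check this (a rough sketch of my own, not something from the original post; IsFrameDark and the 0.1 threshold are made-up names/values) is to open the saved JPEG, sample its pixels and compare the average luminance against a threshold:
using System.Drawing;
// Returns true when the average luminance of the image falls below the threshold.
// Samples every 8th pixel to keep GetPixel reasonably cheap.
private bool IsFrameDark(string imagePath, double threshold = 0.1)
{
    using (Bitmap bmp = new Bitmap(imagePath))
    {
        double sum = 0;
        int samples = 0;
        for (int y = 0; y < bmp.Height; y += 8)
        {
            for (int x = 0; x < bmp.Width; x += 8)
            {
                Color c = bmp.GetPixel(x, y);
                // Perceived luminance (BT.601 weights), normalized to 0..1
                sum += (0.299 * c.R + 0.587 * c.G + 0.114 * c.B) / 255.0;
                samples++;
            }
        }
        return samples > 0 && (sum / samples) < threshold;
    }
}
The extraction loop below could call such a helper right after ffmpeg exits and, when the frame is too dark, retry with a slightly shifted -ss value.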
private void getScreenshots_Click(object sender, EventArgs e)
{
int index = 0;
foreach (string value in this.filesList.Items)
{
string file = selectedFiles[index] + "\\" + value;
// ------------------------------------------------
// MediaInfo
// ------------------------------------------------
// https://github.com/Nicholi/MediaInfoDotNet
//
// get file width, height, and frame count
//
// get aspect ratio of the video
// and calculate height of thumbnail
// using width and aspect ratio
//
MediaInfo MI = new MediaInfo();
MI.Open(file);
var width = MI.Get(StreamKind.Video, 0, "Width");
var height = MI.Get(StreamKind.Video, 0, "Height");
decimal d = Decimal.Parse(MI.Get(StreamKind.Video, 0, "Duration"));
decimal frameCount = Decimal.Parse(MI.Get(StreamKind.Video, 0, "FrameCount"));
MI.Close();
decimal ratio = Decimal.Divide(Decimal.Parse(width), Decimal.Parse(height));
int newHeight = Decimal.ToInt32(Decimal.Divide(newWidth, ratio));
decimal startTime = Decimal.Divide(d, totalImages);
//totalImages - number of thumbnails the final image will have
for (int x = 0; x < totalImages; x++)
{
// increase the time where the thumbnail is taken on each iteration
decimal newTime = Decimal.Multiply(startTime, x);
string time = TimeSpan.FromMilliseconds(double.Parse(newTime.ToString())).ToString(@"hh\:mm\:ss");
string outputFile = this.tmpPath + "img-" + index + x + ".jpg";
// create individual thumbnails with ffmpeg
proc = new Process();
proc.StartInfo.FileName = "ffmpeg.exe";
proc.StartInfo.Arguments = "-y -seek_timestamp 1 -ss " + time + " -i \"" + file + "\" -frames:v 1 -qscale:v 3 \"" + outputFile + "\"";
proc.Start();
proc.WaitForExit();
}
// set width and height of final image
int w = (this.cols * newWidth) + (this.spacing * this.cols + this.spacing);
int h = (this.rows * newHeight) + (this.spacing * this.rows + this.spacing);
int left, top, i = 0;
// combine individual thumbnails into one image
using (Bitmap bmp = new Bitmap(w, h))
{
using (Graphics g = Graphics.FromImage(bmp))
{
g.Clear(this.backgroundColor);
// this.rows - number of rows
for (int y = 0; y < this.rows; y++)
{
// put images on a column
// this.cols - number of columns
// when x = number of columns go to next row
for (int x = 0; x < this.cols; x++)
{
Image imgFromFile = Image.FromFile(this.tmpPath + "img-" + index + i + ".jpg");
MemoryStream imgFromStream = new MemoryStream();
imgFromFile.Save(imgFromStream, imgFromFile.RawFormat);
imgFromFile.Dispose();
left = (x * newWidth) + ((x + 1) * this.spacing);
top = (this.spacing * (y + 1)) + (newHeight * y);
g.DrawImage(Image.FromStream(imgFromStream), left, top, newWidth, newHeight);
i++;
}
}
}
// save the final image
bmp.Save(selectedFiles[index] + "\\" + value + ".jpg");
}
index++;
}
} -
Getting log line for each extracted frame from FFMPEG
3 February 2016, by wpfwannabe
I am using FFMPEG.exe to extract frames from various videos. As this is a programmatic solution and getting the total frame count and/or duration can prove tricky (with ffprobe), I am thinking I could use the console output to detect individual frames' timestamps, but I am getting a single output line every N frames, like this:
frame= 20 fps=0.0 q=0.0 size= 0kB time=00:00:01.72 bitrate= 0.0kbits/s
frame= 40 fps= 38 q=0.0 size= 0kB time=00:00:04.02 bitrate= 0.0kbits/s
frame= 60 fps= 39 q=0.0 size= 0kB time=00:00:06.14 bitrate= 0.0kbits/s
frame= 70 fps= 38 q=0.0 Lsize= 0kB time=00:00:07.86 bitrate= 0.0kbits/s
Is there a command line option to force output for each and every frame? If so, I could extract the time= portion. This is the command line currently used:
ffmpeg.exe -i video.avi -y -threads 0 -vsync 2 %10d.jpeg
Ideally, I would replace %10d.jpeg with some other format that writes the frame's timestamp, but I don't think that exists.
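One option (my own suggestion, not from the original post) is ffmpeg's showinfo filter, which logs one line per filtered frame at the default log level, including the frame number (n) and its timestamp (pts_time); those lines are easier to parse than the periodic progress line:
ffmpeg.exe -i video.avi -y -threads 0 -vsync 2 -vf showinfo %10d.jpeg
Alternatively, ffprobe -select_streams v:0 -show_frames video.avi prints a block of metadata for every video frame, including its timestamp, without having to parse the encoder's progress output.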