
Media (1)
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (64)
-
No talk of "market", "cloud", etc.
10 April 2011
The vocabulary used on this site tries to avoid any reference to the buzzwords that flourish so freely on the web 2.0 scene and in the companies that make a living from it.
You are therefore invited to avoid using terms such as "Brand", "Cloud", "Market", etc.
Our motivation is above all to create a simple tool, accessible to everyone, that encourages the sharing of creative work on the Internet and lets authors keep as much autonomy as possible.
No "Gold or Premium contract" is therefore planned, no (...)
-
Mediabox: opening images in the maximum space available to the user
8 February 2011
The display of images is constrained by the width allowed by the site's design (which depends on the theme in use), so they are shown at a reduced size. To take advantage of all the space available on the user's screen, a feature can be added that displays the image in a multimedia box appearing above the rest of the content.
To do this, the "Mediabox" plugin must be installed.
Configuring the multimedia box
As soon as (...)
-
Enabling visitor registration
12 April 2011
It is also possible to enable visitor registration, which lets anyone open an account on the channel in question themselves, for example in the context of open projects.
To do this, go to the site's configuration area and choose the "Gestion des utilisateurs" (user management) submenu. The first form shown corresponds to this feature.
By default, during its initialisation MediaSPIP created a menu item in the top menu of the page leading to (...)
On other sites (8877)
-
JSON Python check for data
3 March 2017, by Georgе Stoyanov
So I am trying to write a script which looks into a JSON file and checks whether the stream with index 1 exists. Right now my program doesn't do that. Here is the program itself. I want to check if
(data["streams"][1])
exists and, if it does, print the codec, sample rate and the bitrate of all available audio streams.
#!/usr/bin/env python
import subprocess
import json
import os.path
# Saving the file path and the name path into input_file
input_file = raw_input("Please enter the input file path: ")
# Loop until the user enters a valid input file
while os.path.isfile(input_file) == False:
    print "Please try again, the specified file / path doesn't exist!"
    input_file = raw_input("Please enter the input file path again: ")
# Execution of the ffprobe command to list the general statistics of the file. I have separated both scripts because the script for analyzing the frames takes longer.
returned_data = subprocess.check_output(['ffprobe', '-v', 'quiet', '-print_format', 'json', '-show_format', '-show_streams', input_file])
# Loading of the json file
data = json.loads(returned_data.decode('utf-8'))
t = (data["streams"][0]["avg_frame_rate"])
fps = [float(x) for x in t.split('/')]
# Printing of the general information about the video file
print "========================== Video ============================="
print
print "Codec: %s" %(data["streams"][0]["codec_long_name"])
print "Profile: %s" %(data["streams"][0]["profile"])
print "Resolution: %d x %d" %((data["streams"][0]["width"]), (data["streams"][0]["height"]))
print "Pixel Format: %s" %(data["streams"][0]["pix_fmt"])
print "Bits per sample: %s" %(data["streams"][0]["bits_per_raw_sample"])
print
print "========================== Audio ============================="
print
print "Codec: %s" %(data["streams"][1]["codec_name"])
print "Sample Rate: %.3f KHz" %(int(data["streams"][1]["sample_rate"])/1000)
print "Bitrate: %d Kbps" %(int(data["streams"][1]["bit_rate"])/1000)
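A minimal sketch of one way to do the check the question asks about, assuming the data dict loaded by the script above: instead of hard-coding index 1, filter the stream list on the codec_type field (visible in the ffprobe output below) and loop over whatever audio streams are actually present. The .get() fallbacks are defensive assumptions, since ffprobe does not always report every field.

# Sketch: collect only the audio streams instead of assuming streams[1] exists
audio_streams = [s for s in data["streams"] if s.get("codec_type") == "audio"]

if not audio_streams:
    print("No audio stream found in this file.")

for stream in audio_streams:
    # .get() avoids a KeyError when a field is missing from the ffprobe output
    print("Codec: %s" % stream.get("codec_name", "unknown"))
    print("Sample Rate: %.3f kHz" % (float(stream.get("sample_rate", 0)) / 1000))
    print("Bitrate: %d kbps" % (int(stream.get("bit_rate", 0)) / 1000))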
And here is the output of the JSON file.
{
"streams": [
{
"index": 0,
"codec_name": "mpeg4",
"codec_long_name": "MPEG-4 part 2",
"profile": "Simple Profile",
"codec_type": "video",
"codec_time_base": "1/24",
"codec_tag_string": "FMP4",
"codec_tag": "0x34504d46",
"width": 854,
"height": 480,
"coded_width": 854,
"coded_height": 480,
"has_b_frames": 0,
"sample_aspect_ratio": "1:1",
"display_aspect_ratio": "427:240",
"pix_fmt": "yuv420p",
"level": 1,
"chroma_location": "left",
"refs": 1,
"quarter_sample": "false",
"divx_packed": "false",
"r_frame_rate": "24/1",
"avg_frame_rate": "24/1",
"time_base": "1/24",
"start_pts": 0,
"start_time": "0.000000",
"duration_ts": 14315,
"duration": "596.458333",
"bit_rate": "2500431",
"nb_frames": "14315",
"disposition": {
"default": 0,
"dub": 0,
"original": 0,
"comment": 0,
"lyrics": 0,
"karaoke": 0,
"forced": 0,
"hearing_impaired": 0,
"visual_impaired": 0,
"clean_effects": 0,
"attached_pic": 0,
"timed_thumbnails": 0
}
},
{
"index": 1,
"codec_name": "ac3",
"codec_long_name": "ATSC A/52A (AC-3)",
"codec_type": "audio",
"codec_time_base": "1/48000",
"codec_tag_string": "[0] [0][0]",
"codec_tag": "0x2000",
"sample_fmt": "fltp",
"sample_rate": "48000",
"channels": 6,
"channel_layout": "5.1(side)",
"bits_per_sample": 0,
"dmix_mode": "-1", -
How can this YUV420 video have uneven dimensions?
22 December 2020, by Xelpi
Just trying to cement my understanding of video codecs and can't seem to find any information about this.


https://www.mediafire.com/file/cbx8sciq5mie94m/arrow.webm/file contains a tiny vp9 webm. This was generated by me via ffmpeg by doing a simple gif -> webm conversion.

The source gif had a 13x11 resolution. Somehow, the output video also has a 13x11 resolution. I'm trying to understand how that is possible.

As far as I understand:

- The YUV420 pixel format would make this impossible due to the chroma subsampling factor of 2 forcing a divisibility-by-two requirement.
- VP9 itself has a minimum block size of 16x16(?) so at least that much data must be encoded(?)

Consequently, it's my assumption we have either a 14x12 or 16x16 video stream encoded here that is being somehow scaled or cropped down to 13x11.


The problem is I can't find any explanation as to how this is working.


Here's the ffprobe output for the stream:


[STREAM]
index=0
codec_name=vp9
codec_long_name=Google VP9
profile=Profile 0
codec_type=video
codec_time_base=1/60
codec_tag_string=[0][0][0][0]
codec_tag=0x0000
width=13
height=11
coded_width=13
coded_height=11
closed_captions=0
has_b_frames=0
sample_aspect_ratio=1:1
display_aspect_ratio=13:11
pix_fmt=yuv420p
level=-99
color_range=tv
color_space=unknown
color_transfer=unknown
color_primaries=unknown
chroma_location=unspecified
field_order=unknown
timecode=N/A
refs=1
id=N/A
r_frame_rate=60/1
avg_frame_rate=60/1
time_base=1/1000
start_pts=0
start_time=0.000000
duration_ts=N/A
duration=N/A
bit_rate=N/A
max_bit_rate=N/A
bits_per_raw_sample=N/A
nb_frames=N/A
nb_read_frames=N/A
nb_read_packets=N/A
DISPOSITION:default=1
DISPOSITION:dub=0
DISPOSITION:original=0
DISPOSITION:comment=0
DISPOSITION:lyrics=0
DISPOSITION:karaoke=0
DISPOSITION:forced=0
DISPOSITION:hearing_impaired=0
DISPOSITION:visual_impaired=0
DISPOSITION:clean_effects=0
DISPOSITION:attached_pic=0
DISPOSITION:timed_thumbnails=0
TAG:alpha_mode=1
TAG:ENCODER=Lavc58.91.100 libvpx-vp9
TAG:DURATION=00:00:00.600000000
[/STREAM]



and for the last frame:


[FRAME]
media_type=video
stream_index=0
key_frame=0
pkt_pts=583
pkt_pts_time=0.583000
pkt_dts=583
pkt_dts_time=0.583000
best_effort_timestamp=583
best_effort_timestamp_time=0.583000
pkt_duration=16
pkt_duration_time=0.016000
pkt_pos=3639
pkt_size=15
width=13
height=11
pix_fmt=yuv420p
sample_aspect_ratio=1:1
pict_type=P
coded_picture_number=0
display_picture_number=0
interlaced_frame=0
top_field_first=0
repeat_pict=0
color_range=tv
color_space=unknown
color_primaries=unknown
color_transfer=unknown
chroma_location=unspecified
[/FRAME]



Going off the coded_width and coded_height values (supposed to represent the "true" width/height before any scaling(?)) plus a sar value of 1, as far as I can tell this is genuinely a 13x11 video stream, but that should be impossible, no?

My question is, why is this a valid video file?

If I try to e.g. zscale something to an uneven resolution in YUV420 pixel format I hit the expected chroma subsampling errors.
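As far as I can tell (a hedged reading, not a quote from the VP9 spec), this is legal because the bitstream signals the exact width and height, the encoder pads internally to whole superblocks, and the decoder crops back to the signalled size; for 4:2:0 the chroma planes round up to ceil(w/2) x ceil(h/2), so odd luma dimensions never require a fractional chroma sample. A small sketch of that arithmetic for the 13x11 case (the 64x64 superblock size is an assumption about VP9's internal tiling):

# Rough sketch: how a 13x11 yuv420p frame can be stored without fractional chroma samples
def plane_and_coded_sizes(width, height, sb=64):
    # 4:2:0 chroma planes round up, so odd luma sizes still map to whole chroma samples
    chroma = ((width + 1) // 2, (height + 1) // 2)       # 13x11 -> 7x6
    # assumed internal padding to whole superblocks; the decoder crops back to width x height
    coded = (((width + sb - 1) // sb) * sb,
             ((height + sb - 1) // sb) * sb)             # 13x11 -> 64x64
    return chroma, coded

print(plane_and_coded_sizes(13, 11))   # ((7, 6), (64, 64))

On that reading, the 13x11 reported by ffprobe is the real display size, and the "divisible by two" rule only bites when a tool, like the zscale attempt above, refuses to produce odd dimensions for a subsampled format.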

-
Difference between 'display_aspect_ratio' and 'sample_aspect_ratio' in ffprobe [duplicate]
18 June 2018, by John Allard
This question already has an answer here:
-
ffmpeg scaling not working for video
1 answer
I have an issue where a video plays in the correct 16:9 aspect ratio in VLC or QuickTime Player, but when I attempt to extract individual frames with ffmpeg, the frames come out with a 4:3 aspect ratio.
The ffprobe output on the video in question is as follows
$ ffprobe -v error -select_streams v:0 -show_entries stream -of default=noprint_wrappers=1 -print_format json movie.mp4
{
"programs": [
],
"streams": [
{
"index": 0,
"codec_name": "h264",
"codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
"profile": "Main",
"codec_type": "video",
"codec_time_base": "126669/6400000",
"codec_tag_string": "avc1",
"codec_tag": "0x31637661",
"width": 2592,
"height": 1944,
"coded_width": 2592,
"coded_height": 1944,
"has_b_frames": 0,
"sample_aspect_ratio": "4:3",
"display_aspect_ratio": "16:9",
"pix_fmt": "yuvj420p",
"level": 50,
"color_range": "pc",
"color_space": "bt709",
"color_transfer": "bt709",
"color_primaries": "bt709",
"chroma_location": "left",
"refs": 1,
"is_avc": "true",
"nal_length_size": "4",
"r_frame_rate": "25/1",
"avg_frame_rate": "3200000/126669",
"time_base": "1/12800",
"start_pts": 0,
"start_time": "0.000000",
"duration_ts": 126682,
"duration": "9.897031",
"bit_rate": "4638928",
"bits_per_raw_sample": "8",
"nb_frames": "250",
"disposition": {
"default": 1,
"dub": 0,
"original": 0,
"comment": 0,
"lyrics": 0,
"karaoke": 0,
"forced": 0,
"hearing_impaired": 0,
"visual_impaired": 0,
"clean_effects": 0,
"attached_pic": 0,
"timed_thumbnails": 0
},
"tags": {
"language": "und",
"handler_name": "VideoHandler"
}
}
]
}
So it says
"width": 2592,
"height": 1944,
"coded_width": 2592,
"coded_height": 1944,
"has_b_frames": 0,
"sample_aspect_ratio": "4:3",
"display_aspect_ratio": "16:9",
which seems odd to me. The width/height are in 4:3, the sample aspect ratio is 4:3, the display is 16:9?
Now, when I play this through VLC/Quicktime the video looks fine (screenshot below)
but now, if I run an ffmpeg command to extract individual frames from this video, they come out in 4:3
ffmpeg -y -hide_banner -nostats -loglevel error -i movie.mp4 -vf select='eq(n\,10)+eq(n\,20)+eq(n\,30)+eq(n\,40)',scale=-1:640 -vsync 0 /tmp/ffmpeg_image_%04d.jpg
So I guess my questions are as follows:
- what is the relation between display aspect ratio, sample aspect ratio, and the width/height ratio? (see the sketch below)
- how do I get ffmpeg to output in the correct aspect ratio?
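A sketch of the relationship asked about in the first question, using the numbers from the ffprobe output above: the stored frame is 2592x1944 (a 4:3 shape), each sample is flagged as 4/3 as wide as it is tall, and the display aspect ratio is the product of the two.

from fractions import Fraction

# Values taken from the ffprobe output above
width, height = 2592, 1944
sar = Fraction(4, 3)                  # sample_aspect_ratio: each pixel is 4/3 as wide as it is tall
storage = Fraction(width, height)     # 2592/1944 reduces to 4/3

dar = storage * sar                   # display_aspect_ratio = (width/height) * SAR
print(dar)                            # 16/9, matching "display_aspect_ratio": "16:9"

For the second question, one common approach (an assumption about the intended output, not the only way) is to resample to square pixels before the final resize, e.g. inserting scale=iw*sar:ih,setsar=1 ahead of the existing scale=-1:640 in the filter chain, so the extracted JPEGs are 16:9 in actual pixels instead of relying on viewers to honour a stored pixel aspect ratio.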