
Other articles (27)

  • Support for all media types

    10 April 2011

    Unlike many other modern software packages and document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and more (OpenOffice, Microsoft Office (spreadsheets, presentations), web (HTML, CSS), LaTeX, Google Earth) (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash fallback is used.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver both video and sound to conventional computers (...)

On other sites (6747)

  • Getting log line for each extracted frame from FFMPEG

    3 February 2016, by wpfwannabe

    I am using FFMPEG.exe to extract frames from various videos. Since this is a programmatic solution and getting the total frame count and/or duration (with ffprobe) can prove tricky, I am thinking I could use the console output to detect the individual frames' timestamps, but I only get one output line every N frames, like this:

    frame=   20 fps=0.0 q=0.0 size=       0kB time=00:00:01.72 bitrate=   0.0kbits/s
    frame=   40 fps= 38 q=0.0 size=       0kB time=00:00:04.02 bitrate=   0.0kbits/s
    frame=   60 fps= 39 q=0.0 size=       0kB time=00:00:06.14 bitrate=   0.0kbits/s
    frame=   70 fps= 38 q=0.0 Lsize=       0kB time=00:00:07.86 bitrate=   0.0kbits/s

    Is there a command line option to force output for each and every frame? If so, I could extract the time= portion. This is the command line currently used:

    ffmpeg.exe -i video.avi -y -threads 0 -vsync 2 %10d.jpeg

    Ideally, I would replace %10d.jpeg with some format specifier that writes each frame's timestamp, but I don't think one exists.
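    A rough sketch of the time= extraction, assuming the progress lines look like the samples above (the progress-line format is console output, not a stable API, so this is brittle by design):

```python
import re

# Matches the time=HH:MM:SS.ss field in ffmpeg's progress lines.
TIME_RE = re.compile(r"time=(\d{2}):(\d{2}):(\d{2}\.\d+)")

def parse_time_seconds(line):
    """Return the time= value of an ffmpeg progress line in seconds, or None."""
    m = TIME_RE.search(line)
    if m is None:
        return None
    hours, minutes, seconds = int(m.group(1)), int(m.group(2)), float(m.group(3))
    return hours * 3600 + minutes * 60 + seconds

line = "frame=   40 fps= 38 q=0.0 size=       0kB time=00:00:04.02 bitrate=   0.0kbits/s"
print(parse_time_seconds(line))  # 4.02
```

    For genuinely per-frame logging, ffmpeg's showinfo filter (-vf showinfo) prints one log line per frame, including its pts_time, which may be a more direct route than parsing the progress line.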

  • C# get dominant color in an image

    22 January 2017, by CK13

    I’m building a program that makes screenshots from a video.
    It extracts frames from the video (with ffmpeg) and then combines them into one file.

    All works fine, except sometimes I get (almost) black images, mostly in the beginning and ending of the video.

    A possible solution I can think of is to detect if the extracted frame is dark. If it is dark, extract another frame from a slightly different time.

    How can I detect if the extracted frame is dark/black? Or is there another way I can solve this?

    private void getScreenshots_Click(object sender, EventArgs e)
    {
       int index = 0;
       foreach (string value in this.filesList.Items)
       {
           string file = selectedFiles[index] + "\\" + value;

           // ------------------------------------------------
           //   MediaInfo
           // ------------------------------------------------
           // https://github.com/Nicholi/MediaInfoDotNet
           //
           // get file width, height, and frame count
           //
           // get aspect ratio of the video
           // and calculate height of thumbnail
           // using width and aspect ratio
           //
           MediaInfo MI = new MediaInfo();
           MI.Open(file);
           var width = MI.Get(StreamKind.Video, 0, "Width");
           var height = MI.Get(StreamKind.Video, 0, "Height");
           decimal d = Decimal.Parse(MI.Get(StreamKind.Video, 0, "Duration"));
           decimal frameCount = Decimal.Parse(MI.Get(StreamKind.Video, 0, "FrameCount"));
           MI.Close();
           decimal ratio = Decimal.Divide(Decimal.Parse(width), Decimal.Parse(height));
           int newHeight = Decimal.ToInt32(Decimal.Divide(newWidth, ratio));
           decimal startTime = Decimal.Divide(d, totalImages);
           //totalImages - number of thumbnails the final image will have
           for (int x = 0; x < totalImages; x++)
           {
               // increase the time where the thumbnail is taken on each iteration
               decimal newTime = Decimal.Multiply(startTime, x);
               string time = TimeSpan.FromMilliseconds(double.Parse(newTime.ToString())).ToString(@"hh\:mm\:ss");

               string outputFile = this.tmpPath + "img-" + index + x + ".jpg";

               // create individual thumbnails with ffmpeg
               proc = new Process();
               proc.StartInfo.FileName = "ffmpeg.exe";
               proc.StartInfo.Arguments = "-y -seek_timestamp 1 -ss " + time + " -i \"" + file + "\" -frames:v 1 -qscale:v 3 \"" + outputFile + "\"";
               proc.Start();
               proc.WaitForExit();
           }

           // set width and height of final image
           int w = (this.cols * newWidth) + (this.spacing * this.cols + this.spacing);
           int h = (this.rows * newHeight) + (this.spacing * this.rows + this.spacing);

           int left, top, i = 0;
           // combine individual thumbnails into one image
           using (Bitmap bmp = new Bitmap(w, h))
           {
               using (Graphics g = Graphics.FromImage(bmp))
               {
                   g.Clear(this.backgroundColor);
                   // this.rows - number of rows
                   for (int y = 0; y < this.rows; y++)
                   {
                       // put images on a column
                       // this.cols - number of columns
                       // when x = number of columns go to next row
                       for (int x = 0; x < this.cols; x++)
                       {
                           Image imgFromFile = Image.FromFile(this.tmpPath + "img-" + index + i + ".jpg");
                           MemoryStream imgFromStream = new MemoryStream();
                           imgFromFile.Save(imgFromStream, imgFromFile.RawFormat);
                           imgFromFile.Dispose();

                           left = (x * newWidth) + ((x + 1) * this.spacing);
                           top = (this.spacing * (y + 1)) + (newHeight * y);
                           g.DrawImage(Image.FromStream(imgFromStream), left, top, newWidth, newHeight);
                           i++;
                       }
                   }
               }

               // save the final image
               bmp.Save(selectedFiles[index] + "\\" + value + ".jpg");
           }
           index++;
       }
    }
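    One way to implement the dark-frame check, sketched here in Python rather than C# for brevity: average the frame's pixel luminance and re-extract when it falls below a cutoff (the 0.1 threshold is an arbitrary assumption to tune):

```python
def mean_luminance(pixels):
    """Average Rec. 601 luma of an iterable of (r, g, b) tuples, scaled to 0..1."""
    total = 0.0
    count = 0
    for r, g, b in pixels:
        # Rec. 601 weights approximate perceived brightness.
        total += 0.299 * r + 0.587 * g + 0.114 * b
        count += 1
    return total / (255 * count) if count else 0.0

def is_dark(pixels, threshold=0.1):
    """True when the frame is (almost) black; threshold is a tunable guess."""
    return mean_luminance(pixels) < threshold

# A near-black frame is flagged; a mid-gray one is not.
print(is_dark([(0, 0, 0)] * 4))        # True
print(is_dark([(128, 128, 128)] * 4))  # False
```

    In the C# code above, the same idea could be applied by reading each thumbnail's pixels (e.g. via Bitmap.LockBits) and averaging before accepting it; an alternative is ffmpeg's blackframe filter, which flags mostly-black frames during extraction.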
  • Difference between 'display_aspect_ratio' and 'sample_aspect_ratio' in ffprobe [duplicate]

    18 June 2018, by John Allard

    I have an issue where a video displays in the correct 16:9 aspect ratio in VLC or QuickTime Player, but when I attempt to extract individual frames with ffmpeg, the frames come out with a 4:3 aspect ratio.

    The ffprobe output on the video in question is as follows:

    $ ffprobe -v error -select_streams v:0 -show_entries stream -of default=noprint_wrappers=1 -print_format json movie.mp4

    {
    "programs": [

    ],
    "streams": [
       {
           "index": 0,
           "codec_name": "h264",
           "codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
           "profile": "Main",
           "codec_type": "video",
           "codec_time_base": "126669/6400000",
           "codec_tag_string": "avc1",
           "codec_tag": "0x31637661",
           "width": 2592,
           "height": 1944,
           "coded_width": 2592,
           "coded_height": 1944,
           "has_b_frames": 0,
           "sample_aspect_ratio": "4:3",
           "display_aspect_ratio": "16:9",
           "pix_fmt": "yuvj420p",
           "level": 50,
           "color_range": "pc",
           "color_space": "bt709",
           "color_transfer": "bt709",
           "color_primaries": "bt709",
           "chroma_location": "left",
           "refs": 1,
           "is_avc": "true",
           "nal_length_size": "4",
           "r_frame_rate": "25/1",
           "avg_frame_rate": "3200000/126669",
           "time_base": "1/12800",
           "start_pts": 0,
           "start_time": "0.000000",
           "duration_ts": 126682,
           "duration": "9.897031",
           "bit_rate": "4638928",
           "bits_per_raw_sample": "8",
           "nb_frames": "250",
           "disposition": {
               "default": 1,
               "dub": 0,
               "original": 0,
               "comment": 0,
               "lyrics": 0,
               "karaoke": 0,
               "forced": 0,
               "hearing_impaired": 0,
               "visual_impaired": 0,
               "clean_effects": 0,
               "attached_pic": 0,
               "timed_thumbnails": 0
           },
           "tags": {
               "language": "und",
               "handler_name": "VideoHandler"
           }
       }
    ]
    }

    So it says

       "width": 2592,
       "height": 1944,
       "coded_width": 2592,
       "coded_height": 1944,
       "has_b_frames": 0,
       "sample_aspect_ratio": "4:3",
       "display_aspect_ratio": "16:9",

    which seems odd to me. The width/height ratio is 4:3 and the sample aspect ratio is 4:3, yet the display aspect ratio is 16:9?
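    The numbers are in fact consistent, because DAR = (width / height) × SAR: the sample aspect ratio describes the shape of each stored pixel, not of the frame. Checking with the values from the ffprobe output above:

```python
from fractions import Fraction

# Values from the ffprobe output above.
storage = Fraction(2592, 1944)   # width / height of the stored pixels -> 4/3
sar = Fraction(4, 3)             # sample_aspect_ratio: each pixel is 4/3 wider than tall

dar = storage * sar              # display_aspect_ratio
print(dar)  # 16/9
```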

    Now, when I play this through VLC/QuickTime the video looks fine.

    but if I run an ffmpeg command to extract individual frames from this video, they come out in 4:3:

    ffmpeg -y -hide_banner -nostats -loglevel error -i movie.mp4 -vf select='eq(n\,10)+eq(n\,20)+eq(n\,30)+eq(n\,40)',scale=-1:640 -vsync 0 /tmp/ffmpeg_image_%04d.jpg

    So I guess my questions are as follows:

    1. What is the relation between the display aspect ratio, the sample aspect ratio, and the width/height ratio?
    2. How do I get ffmpeg to output frames in the correct aspect ratio?
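    For the second question, one common approach (an assumption about the intended behavior, not taken from the post) is to let ffmpeg apply the SAR while scaling, e.g. -vf "scale=iw*sar:ih,setsar=1"; setsar=1 then marks the rescaled pixels as square so downstream tools do not re-stretch them. A small sketch of the arithmetic behind that filter, using the stream values above:

```python
from fractions import Fraction

def display_size(coded_w, coded_h, sar):
    """Dimensions of a square-pixel frame that preserves the display aspect ratio."""
    # scale=iw*sar:ih stretches the width by the sample aspect ratio.
    return (round(coded_w * sar), coded_h)

w, h = display_size(2592, 1944, Fraction(4, 3))
print(w, h)            # 3456 1944
print(Fraction(w, h))  # 16/9
```

    With the filter in place, the extracted JPEGs should come out with the expected 16:9 shape.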