Other articles (12)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images (png, gif, jpg, bmp and more); audio (MP3, Ogg, Wav and more); video (AVI, MP4, OGV, mpg, mov, wmv and more); text, code and other data (OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth) and (...)

  • Adding notes and captions to images

    7 February 2011

    To add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
    Editing when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • Accepted formats

    28 January 2010

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Video formats accepted as input
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    First, we (...)

On other sites (3008)

  • Cutting a live stream into separate mp4 files

    9 June 2017, by Fearhunter

    I am doing research on cutting a live stream into pieces and saving them as mp4 files. I am using this source for the proof of concept:

    https://docs.microsoft.com/en-us/azure/media-services/media-services-dotnet-creating-live-encoder-enabled-channel#download-sample

    And this is the example code I use:

    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.IO;
    using System.Linq;
    using System.Net;
    using System.Security.Cryptography;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.MediaServices.Client;
    using Newtonsoft.Json.Linq;

    namespace AMSLiveTest
    {
       class Program
       {
           private const string StreamingEndpointName = "streamingendpoint001";
           private const string ChannelName = "channel001";
           private const string AssetlName = "asset001";
           private const string ProgramlName = "program001";

           // Read values from the App.config file.
           private static readonly string _mediaServicesAccountName =
           ConfigurationManager.AppSettings["MediaServicesAccountName"];
           private static readonly string _mediaServicesAccountKey =
           ConfigurationManager.AppSettings["MediaServicesAccountKey"];

           // Field for service context.
           private static CloudMediaContext _context = null;
           private static MediaServicesCredentials _cachedCredentials = null;

           static void Main(string[] args)
           {
               // Create and cache the Media Services credentials in a static class variable.
               _cachedCredentials = new MediaServicesCredentials(
               _mediaServicesAccountName,
               _mediaServicesAccountKey);
               // Use the cached credentials to create a CloudMediaContext.
               _context = new CloudMediaContext(_cachedCredentials);

               IChannel channel = CreateAndStartChannel();

               // Set the Live Encoder to point to the channel's input endpoint:
               string ingestUrl = channel.Input.Endpoints.FirstOrDefault().Url.ToString();

               // Use the previewEndpoint to preview and verify
               // that the input from the encoder is actually reaching the Channel.
               string previewEndpoint = channel.Preview.Endpoints.FirstOrDefault().Url.ToString();

               IProgram program = CreateAndStartProgram(channel);
               ILocator locator = CreateLocatorForAsset(program.Asset, program.ArchiveWindowLength);
               IStreamingEndpoint streamingEndpoint = CreateAndStartStreamingEndpoint();
               GetLocatorsInAllStreamingEndpoints(program.Asset);

               // Once you are done streaming, clean up your resources.
               Cleanup(streamingEndpoint, channel);
           }

           public static IChannel CreateAndStartChannel()
           {
               //If you want to change the Smooth fragments to HLS segment ratio, you would set the ChannelCreationOptions’s Output property.

               IChannel channel = _context.Channels.Create(
                   new ChannelCreationOptions
                   {
                       Name = ChannelName,
                       Input = CreateChannelInput(),
                       Preview = CreateChannelPreview()
                   });

               //Starting and stopping Channels can take some time to execute. To determine the state of operations after calling Start or Stop, query IChannel.State.

               channel.Start();

               return channel;
           }

           private static ChannelInput CreateChannelInput()
           {
               return new ChannelInput
               {
                   StreamingProtocol = StreamingProtocol.RTMP,
                   AccessControl = new ChannelAccessControl
                   {
                       IPAllowList = new List<IPRange>
                       {
                           new IPRange
                           {
                               Name = "TestChannelInput001",
                               // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
                               // allows access from all IP addresses.
                               Address = IPAddress.Parse("0.0.0.0"),
                               SubnetPrefixLength = 0
                           }
                       }
                   }
               };
           }

           private static ChannelPreview CreateChannelPreview()
           {
               return new ChannelPreview
               {
                   AccessControl = new ChannelAccessControl
                   {
                       IPAllowList = new List<IPRange>
                       {
                           new IPRange
                           {
                               Name = "TestChannelPreview001",
                               // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
                               // allows access from all IP addresses.
                               Address = IPAddress.Parse("0.0.0.0"),
                               SubnetPrefixLength = 0
                           }
                       }
                   }
               };
           }

           public static void UpdateCrossSiteAccessPoliciesForChannel(IChannel channel)
           {
               var clientPolicy =
                   @"<?xml version=""1.0"" encoding=""utf-8""?>
                   <access-policy>
                     <cross-domain-access>
                       <policy>
                         <allow-from http-methods=""*"" http-request-headers=""*"">
                           <domain uri=""*""/>
                         </allow-from>
                         <grant-to>
                           <resource path=""/"" include-subpaths=""true""/>
                         </grant-to>
                       </policy>
                     </cross-domain-access>
                   </access-policy>";

               var xdomainPolicy =
                   @"<?xml version=""1.0"" ?>
                   <cross-domain-policy>
                     <allow-access-from domain=""*"" />
                   </cross-domain-policy>";

               channel.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
               channel.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;

               channel.Update();
           }

           public static IProgram CreateAndStartProgram(IChannel channel)
           {
               IAsset asset = _context.Assets.Create(AssetlName, AssetCreationOptions.None);

               // Create a Program on the Channel. You can have multiple Programs that overlap or are sequential;
               // however each Program must have a unique name within your Media Services account.
               IProgram program = channel.Programs.Create(ProgramlName, TimeSpan.FromHours(3), asset.Id);
               program.Start();

               return program;
           }

           public static ILocator CreateLocatorForAsset(IAsset asset, TimeSpan ArchiveWindowLength)
           {
               // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.            

               var locator = _context.Locators.CreateLocator
                   (
                       LocatorType.OnDemandOrigin,
                       asset,
                       _context.AccessPolicies.Create
                       (
                           "Live Stream Policy",
                           ArchiveWindowLength,
                           AccessPermissions.Read
                       )
                   );

               return locator;
           }

           public static IStreamingEndpoint CreateAndStartStreamingEndpoint()
           {
               var options = new StreamingEndpointCreationOptions
               {
                   Name = StreamingEndpointName,
                   ScaleUnits = 1,
                   AccessControl = GetAccessControl(),
                   CacheControl = GetCacheControl()
               };

               IStreamingEndpoint streamingEndpoint = _context.StreamingEndpoints.Create(options);
               streamingEndpoint.Start();

               return streamingEndpoint;
           }

           private static StreamingEndpointAccessControl GetAccessControl()
           {
               return new StreamingEndpointAccessControl
               {
                   IPAllowList = new List<IPRange>
                   {
                       new IPRange
                       {
                           Name = "Allow all",
                           Address = IPAddress.Parse("0.0.0.0"),
                           SubnetPrefixLength = 0
                       }
                   },

                   AkamaiSignatureHeaderAuthenticationKeyList = new List<AkamaiSignatureHeaderAuthenticationKey>
                   {
                       new AkamaiSignatureHeaderAuthenticationKey
                       {
                           Identifier = "My key",
                           Expiration = DateTime.UtcNow + TimeSpan.FromDays(365),
                           Base64Key = Convert.ToBase64String(GenerateRandomBytes(16))
                       }
                   }
               };
           }

           private static byte[] GenerateRandomBytes(int length)
           {
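               // Fill the buffer with cryptographically strong random bytes;
               // used above to derive the Akamai signing key.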
               var bytes = new byte[length];
               using (var rng = new RNGCryptoServiceProvider())
               {
                   rng.GetBytes(bytes);
               }

               return bytes;
           }

           private static StreamingEndpointCacheControl GetCacheControl()
           {
               return new StreamingEndpointCacheControl
               {
                   MaxAge = TimeSpan.FromSeconds(1000)
               };
           }

           public static void UpdateCrossSiteAccessPoliciesForStreamingEndpoint(IStreamingEndpoint streamingEndpoint)
           {
               var clientPolicy =
                   @"<?xml version=""1.0"" encoding=""utf-8""?>
                   <access-policy>
                     <cross-domain-access>
                       <policy>
                         <allow-from http-methods=""*"" http-request-headers=""*"">
                           <domain uri=""*""/>
                         </allow-from>
                         <grant-to>
                           <resource path=""/"" include-subpaths=""true""/>
                         </grant-to>
                       </policy>
                     </cross-domain-access>
                   </access-policy>";

               var xdomainPolicy =
                   @"<?xml version=""1.0"" ?>
                   <cross-domain-policy>
                     <allow-access-from domain=""*"" />
                   </cross-domain-policy>";

               streamingEndpoint.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
               streamingEndpoint.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;

               streamingEndpoint.Update();
           }

           public static void GetLocatorsInAllStreamingEndpoints(IAsset asset)
           {
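               // Find the asset's on-demand locators and build the smooth streaming
               // manifest URL ({contentAccessComponent}/{ismFileName}/manifest) for
               // every streaming endpoint that is currently running.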
               var locators = asset.Locators.Where(l => l.Type == LocatorType.OnDemandOrigin);
               var ismFile = asset.AssetFiles.AsEnumerable().FirstOrDefault(a => a.Name.EndsWith(".ism"));
               var template = new UriTemplate("{contentAccessComponent}/{ismFileName}/manifest");
               var urls = locators.SelectMany(l =>
                           _context
                               .StreamingEndpoints
                               .AsEnumerable()
                               .Where(se => se.State == StreamingEndpointState.Running)
                               .Select(
                                   se =>
                                       template.BindByPosition(new Uri("http://" + se.HostName),
                                       l.ContentAccessComponent,
                                           ismFile.Name)))
                           .ToArray();

           }

           public static void Cleanup(IStreamingEndpoint streamingEndpoint,
                                       IChannel channel)
           {
               if (streamingEndpoint != null)
               {
                   streamingEndpoint.Stop();
                   streamingEndpoint.Delete();
               }

               IAsset asset;
               if (channel != null)
               {

                   foreach (var program in channel.Programs)
                   {
                       asset = _context.Assets.Where(se => se.Id == program.AssetId)
                                               .FirstOrDefault();

                       program.Stop();
                       program.Delete();

                       if (asset != null)
                       {
                           foreach (var l in asset.Locators)
                               l.Delete();

                           asset.Delete();
                       }
                   }

                   channel.Stop();
                   channel.Delete();
               }
           }
       }
    }

    Now I want to make a method that cuts the live stream into pieces, for example every 15 minutes, and saves each piece as an mp4 file, but I don't know where to start.

    Can someone point me in the right direction?

    Kind regards

    UPDATE:

    I want to save the mp4 files on my hard disk.
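
    One possible direction (a sketch, not part of the Microsoft sample): once the program's output is published through a locator, hand a playable URL to ffmpeg and let its segment muxer split the stream into 15-minute mp4 files on the local disk. The ffmpeg invocation and the streamUrl value are assumptions, not code from the sample above.

        using System.Diagnostics;
        using System.IO;

        static class StreamSegmenter
        {
            // Hypothetical sketch: run ffmpeg with the segment muxer so the incoming
            // stream is written to disk as part000.mp4, part001.mp4, ... where each
            // part is at most 900 seconds (15 minutes) long.
            public static void SaveStreamInSegments(string streamUrl, string outputDir)
            {
                string outputPattern = Path.Combine(outputDir, "part%03d.mp4");
                var psi = new ProcessStartInfo
                {
                    FileName = "ffmpeg", // assumes ffmpeg is on the PATH
                    Arguments = $"-i \"{streamUrl}\" -c copy -f segment " +
                                $"-segment_time 900 -reset_timestamps 1 \"{outputPattern}\"",
                    UseShellExecute = false
                };
                using (var ffmpeg = Process.Start(psi))
                {
                    ffmpeg.WaitForExit(); // runs until the stream ends or ffmpeg is stopped
                }
            }
        }

    Whether -c copy works depends on the protocol and codecs of the published stream; re-encoding (for example -c:v libx264 -c:a aac) is the fallback if stream copy fails.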

  • FFMPeg generated video : Audio has 'glitches' when uploaded to YouTube

    7 October 2023, by CularBytes

    I've generated a voice with Azure AI Speech at a 48 kHz sample rate and 96k bit rate, generated a video from some stock footage, and I'm trying to combine all of that with background music. The voice-over is generated per sentence, so that I know how long each sentence is and can include relevant video footage.

    I'm using FFMpeg through the FFMpegCore nuget package.

    The problem

    After the video is complete with background music, I play it on my computer and it's perfect (no audio glitches, the music keeps playing). But when uploaded to YouTube it has 'breaks' in the music in between sentences (basically every time a new voice fragment starts).

    Example : https://www.youtube.com/watch?v=ieNvQ2TNq44

    The code

    All of the footage is combined mostly with FFMpeg.Join(string output, string[] videos). These video files also contain the voice-overs (per sentence).
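
    For context, a minimal sketch of that join step with FFMpegCore (the paths are placeholders):

        using FFMpegCore;

        // Join the per-sentence fragments into a single timeline.
        var fragments = new[] { @"C:\work\clip_000.mp4", @"C:\work\clip_001.mp4" };
        bool joined = FFMpeg.Join(@"C:\work\timeline.mp4", fragments);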

    After that I try to add the music like this:

        string outputTimelineWithMusicPath = _workingDir + $@"\{videoTitle}_withmusic.mp4";
        FFMpegArguments
            .FromFileInput(inputVideoPath)
            .AddFileInput(musicPath)
            .OutputToFile(outputTimelineWithMusicPath, true, options => options
                .CopyChannel()
                .WithAudioCodec(AudioCodec.Aac)
                .WithAudioBitrate(AudioQuality.Good)
                .UsingShortest(true)
                .WithCustomArgument("-filter_complex \"[0:a]aformat=fltp:44100:stereo,apad[0a];[1]aformat=fltp:44100:stereo,volume=0.05[1a];[0a][1a]amerge[a]\" -map 0:v -map \"[a]\" -ac 2"))
            .ProcessSynchronously();

    I've tried to mess around with the CustomArgument, but so far no success.

    For example, I thought removing apad from the argument, so that no 'blank spots' are added, might fix the issue. I also tried amix instead of amerge.

    Last try

    I first tried to make sure both files had the same sample rate, hoping that would fix the issue. So far, no success:

        string outputVideoVoicePath = _workingDir + $@"\{title}_voiceonly_formatting.mp4";
        string musicReplacePath = _workingDir + $@"\{title}_music_formatted.aac";
        FFMpegArguments
            .FromFileInput(inputVideoPath)
            .OutputToFile(outputVideoVoicePath, true, options => options
                .WithAudioCodec(AudioCodec.Aac)
                .WithAudioBitrate(128)
                .WithAudioSamplingRate(44100)
            )
            .ProcessSynchronously();

        FFMpegArguments
            .FromFileInput(music.FilePath)
            .OutputToFile(musicReplacePath, true, options => options
                .WithAudioCodec(AudioCodec.Aac)
                .WithAudioBitrate(256) // also tried 96 (the original format)
                .WithAudioSamplingRate(44100)
            )
            .ProcessSynchronously();

        Console.WriteLine("Add music...");
        var videoTitle = Regex.Replace(title, "[^a-zA-Z]+", "");
        string outputTimelineWithMusicPath = _workingDir + $@"\{videoTitle}_withmusic.mp4";
        FFMpegArguments
            .FromFileInput(outputVideoVoicePath)
            .AddFileInput(musicReplacePath)
            .OutputToFile(outputTimelineWithMusicPath, true, options => options
                .CopyChannel()
                .WithAudioCodec(AudioCodec.Aac)
                .WithAudioBitrate(AudioQuality.Good)
                .UsingShortest(true)
                .WithCustomArgument("-filter_complex \"[0:a]aformat=fltp:44100:stereo[0a];[1]aformat=fltp:44100:stereo,volume=0.05[1a];[0a][1a]amix=inputs=2[a]\" -map 0:v -map \"[a]\" -ac 2"))
            .ProcessSynchronously();
        return outputTimelineWithMusicPath;

    I'm not much of an expert when it comes to audio/video codecs. I do scale each stock video to 24 fps, 1920x1080, and the music has an original bitrate of 256 kbps at a 44100 Hz sample rate (so I probably don't even have to convert the audio file).
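
    One more thing that might be worth trying (an assumption, not a verified fix): forcing a continuous audio timeline onto the finished file, so that YouTube's re-encode doesn't see timestamp gaps at the fragment boundaries. ffmpeg's aresample filter with async=1 inserts or trims samples to close small gaps:

        // Hypothetical final pass: keep the video, rebuild the audio timeline.
        FFMpegArguments
            .FromFileInput(outputTimelineWithMusicPath)
            .OutputToFile(_workingDir + @"\final_fixed_audio.mp4", true, options => options
                .CopyChannel()                  // video stream is left untouched
                .WithAudioCodec(AudioCodec.Aac) // audio is re-encoded
                .WithCustomArgument("-af aresample=async=1:first_pts=0"))
            .ProcessSynchronously();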

  • Error in streaming video over java socket using xuggler

    18 April 2014, by user3548066

    Although the video is streaming, the console is full of repetitive errors and the video freezes after about 20 seconds.

    CLIENT SIDE:

    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 121% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!
    [Thread-4] ERROR org.ffmpeg - [vfwcap @ 0551F420] real-time buffer 90% full! frame dropped!

    SERVER SIDE:

    Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException
       at com.xuggle.xuggler.demos.VideoImage$ImageComponent$ImageRunnable.run(VideoImage.java:103)
       at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:251)
       at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:733)
       at java.awt.EventQueue.access$200(EventQueue.java:103)
       at java.awt.EventQueue$3.run(EventQueue.java:694)
       at java.awt.EventQueue$3.run(EventQueue.java:692)
       at java.security.AccessController.doPrivileged(Native Method)
       at java.security.ProtectionDomain$1.doIntersectionPrivilege(ProtectionDomain.java:76)
       at java.awt.EventQueue.dispatchEvent(EventQueue.java:703)
       at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:242)
       at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:161)
       at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:150)
       at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:146)
       at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:138)
       at java.awt.EventDispatchThread.run(EventDispatchThread.java:91)