Advanced search

Media (0)

Word: - Tags -/logo

No media matching your criteria is available on the site.

Other articles (78)

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes when moving from MediaSPIP version 0.1 to version 0.3. What's new?
    Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe for metadata retrieval; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customizing by adding your logo, banner, or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News creation form: for a document of type news, the default fields are: Publication date (customize the publication date) (...)

On other sites (14139)

  • How can I get a ".264" codec-format video working in the VideoView API?

    31 July 2018, by lscodex

    I have a big problem. I have a video, "tc10.264", that I downloaded from live555.com. I cannot play it with VideoView, ExoPlayer, or the Vitamio SDK
    on Android.
    I know that the file is a raw codec stream; it is not a container like MP4 or FLV.
    Later, I played the video with "ffplay -f h264 tc10.264" on a Windows console,
    and ffmpeg showed me these specs:
    h264 (baseline), yuv420p, and 25 fps.

    Okay, so the video itself is valid.
    Next, I downloaded FFmpeg on Ubuntu in a virtual machine and built it together with x264, as shown below.

    My directory layout is ffmpeg > x264, and I compiled libx264.a for FFmpeg from the x264 folder.

    Note: my Android phone uses the armeabi-v7a architecture.

    Here is the build_android_arm.sh script:

    #!/bin/bash
    echo starting building ....
    NDK=/home/lscodex/android-sdk/ndk-bundle
    PLATFORM=$NDK/platforms/android-19/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    PREFIX=./android/arm

    function build_one
    {
     ./configure \
     --prefix=$PREFIX \
     --enable-static \
     --enable-pic \
     --host=arm-linux \
     --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
     --sysroot=$PLATFORM

     make clean
     make
     make install
    }

    build_one

    echo Android ARM builds finished....

    Afterwards I had a folder called android containing lib, include, and bin directories. Then, from the ffmpeg folder, I ran the script shown below to obtain the shared libraries (".so").
    Here is the build_android_armeabi_v7a.sh script:

    #!/bin/bash

    echo Android starting armeabi_v7a
    NDK=/home/lscodex/android-sdk/ndk-bundle
    PLATFORM=$NDK/platforms/android-19/arch-arm/
    PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    CPU=armeabi_v7a
    PREFIX=$(pwd)/android/$CPU

    GENERAL="\
    --enable-small \
    --enable-cross-compile \
    --extra-libs="-lgcc" \
    --arch=arm \
    --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
    --extra-cflags="-I../x264/android/arm/include" \
    --extra-ldflags="-L../x264/android/arm/lib" "


    MODULES="\
    --enable-gpl \
    --enable-libx264"

    H264TEST="\
    --enable-encoder=libx264 \
    --enable-encoder=libx264rgb \
    --enable-decoder=h264 \
    --enable-muxer=h264 \
    --enable-demuxer=h264 \
    --enable-parser=h264"

    function build_ARMv7
    {
     ./configure \
     --target-os=android \
     --prefix=$PREFIX \
     ${GENERAL} \
     --sysroot=$PLATFORM \
     --enable-shared \
     ${H264TEST} \
     --disable-static \
     --extra-cflags="-march=armv7-a -mfloat-abi=softfp -mfpu=vfpv3-d16 -fomit-frame-pointer -fstrict-aliasing -funswitch-loops -finline-limit=300" \
     --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
     --enable-zlib \
     ${MODULES} \
     --disable-doc \
     --enable-neon

     make clean
     make
     make install
    }


    build_ARMv7
    echo Android ARMv7-a builds finished

    Finally, I obtained another android folder containing the shared libraries.
    I integrated the files into Android Studio: I created a CMakeLists.txt and a cpp folder.
    So everything is working perfectly, I think.
    This NDK code retrieves the duration of a bugs_bunny.mp4 video:

    extern "C"
    JNIEXPORT jint JNICALL
    Java_com_lscodex_just_videoplayertesting2_VideoProcessing_videoDuration(
           JNIEnv *env,
           jobject obj,
           jstring input) {


       AVFormatContext *pFormatCtx = NULL;

        if(avformat_open_input(&pFormatCtx,convertStringFileToChar(env,input),NULL,NULL)<0){
            throwException(env,"Could not open input file ");
            loge("Could not open input file ");
            return 0;
        }
        if (avformat_find_stream_info(pFormatCtx,NULL)<0){
            throwException(env,"Failed to retrieve input stream information");
            loge("Failed to retrieve input stream information");
            return 0;
        }

       logd("I reached it here :) ");


       int64_t duration = pFormatCtx->duration;
       // avformat_close_input() also frees the AVFormatContext, so a
       // separate avformat_free_context() call is not needed afterwards.
       avformat_close_input(&pFormatCtx);
       return (jint) (duration / AV_TIME_BASE);
    }

    At least, it does for the MP4 format.
    So my question is: how can I play the tc10.264 codec-format video via FFmpeg in ExoPlayer or the VideoView API?
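
    One hedged starting point, assuming any working ffmpeg binary is available: VideoView and ExoPlayer expect a container, so the raw Annex-B stream can be remuxed into MP4 without re-encoding. A minimal sketch (the -framerate value simply repeats the 25 fps that ffplay reported, since a raw elementary stream carries no timing of its own):

     # Sketch: wrap the raw H.264 elementary stream in an MP4 container.
     # -c copy copies the bitstream as-is instead of re-encoding it.
     ffmpeg -f h264 -framerate 25 -i tc10.264 -c copy tc10.mp4

    The resulting tc10.mp4 should then play in VideoView or ExoPlayer like any other MP4 file.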

  • "File descriptor in bad state" error while running ffmpeg on android device and selecting an input device

    25 August 2012, by user1545779

    Below is the output of the ffmpeg command # ./ffmpeg -y -f s16le -i /dev/snd/pcmC3D0c 1640.wmv, used to create an audio file from a Logitech webcam on an Android device.

    As shown in the output, I received a "File descriptor in bad state" error when referring to the mic input as /dev/snd/pcmC3D0c. I determined the device (the webcam mic) by reviewing the contents of /proc/asound: the webcam mic was card3, and its STREAM0 file indicated that the mic has an audio format of S16_LE.

    It was also confirmed that it is a capture device and that its PCM id was pcmC3D0c (C3 being the card number and D0 being the device number). I then confirmed the correct device by checking the /dev/snd/ directory for its proper and full description; the /dev/snd folder confirmed that the mic was /dev/snd/pcmC3D0c.

    I then checked the permissions and ownership to make sure that I could use the device. So, as far as identifying the correct device goes, I do believe that /dev/snd/pcmC3D0c is the correct one. This error could possibly have something to do with the OS, but after all these checks I still cannot figure out what is causing the bad file descriptor state.

    Please note that I tested different output formats, etc., and that made no difference. Any leads or suggestions?

    # ./ffmpeg -y -f s16le -i /dev/snd/pcmC3D0c 1640.wmv

    ffmpeg version N-43170-gd84dd35 Copyright (c) 2000-2012 the FFmpeg developers
    built on Aug 24 2012 09:16:05 with gcc 4.4.3 (GCC) configuration: --enable-cross-compile --arch=arm --cpu=cortex-a9 --target-os=linux --enable-runtime-cpudetect --prefix=/output --enable-pic --cross-prefix=/home/jasongipsyblues/Desktop/apps/android-ndk-r8b/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/arm-linux-androideabi- --sysroot=/home/jasongipsyblues/Desktop/apps/android-ndk-r8b/platforms/android-14/arch-arm --enable-version3 --enable-gpl --enable-memalign-hack --disable-doc --enable-yasm --enable-libx264 --enable-zlib --extra-cflags=-I../x264 --extra-ldflags='-L../x264 -lc'

    libavutil 51. 66.100 / 51. 66.100
    libavcodec 54. 48.100 / 54. 48.100
    libavformat 54. 22.100 / 54. 22.100
    libavdevice 54. 2.100 / 54. 2.100
    libavfilter 3. 5.102 / 3. 5.102
    libswscale 2. 1.100 / 2. 1.100
    libswresample 0. 15.100 / 0. 15.100
    libpostproc 52. 0.100 / 52. 0.100

    [s16le @ 0xfd84f0] Invalid sample rate 0 specified using default of 44100
    [s16le @ 0xfd84f0] Estimating duration from bitrate, this may be inaccurate
    Guessed Channel Layout for Input Stream #0.0 : mono
    Input #0, s16le, from '/dev/snd/pcmC3D0c' :
    Duration : N/A, bitrate : 705 kb/s
    Stream #0:0 : Audio : pcm_s16le, 44100 Hz, mono, s16, 705 kb/s
    Output #0, asf, to '1640.wmv' :
    Metadata :
    WM/EncodingSettings : Lavf54.22.100
    Stream #0:0 : Audio : wmav2 (a[1][0][0] / 0x0161), 44100 Hz, mono, s16, 128 kb/s
    Stream mapping :
    Stream #0:0 -> #0:0 (pcm_s16le -> wmav2)
    Press [q] to stop, [?] for help

    /dev/snd/pcmC3D0c : File descriptor in bad state

    size= 1kB time=00:00:00.00 bitrate= 0.0kbits/s
    video:0kB audio:0kB subtitle:0 global headers:0kB muxing overhead 5340.000000%
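
    For what it's worth, ffmpeg cannot normally read an ALSA PCM device node such as /dev/snd/pcmC3D0c as if it were a raw file; the usual route is the ALSA input device, which addresses the capture device as hw:<card>,<device>. A sketch, assuming a build that includes the alsa indev (the configuration shown above does not list one):

     # Sketch: capture from card 3, device 0 through ffmpeg's ALSA demuxer
     # instead of opening the PCM device node directly.
     ./ffmpeg -y -f alsa -i hw:3,0 1640.wmv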

  • Cutting a live stream into separate mp4 files

    9 June 2017, by Fearhunter

    I am doing research on cutting a live stream into pieces and saving them as mp4 files. I am using this source for the proof of concept:

    https://docs.microsoft.com/en-us/azure/media-services/media-services-dotnet-creating-live-encoder-enabled-channel#download-sample

    And this is the example code I use:

    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.IO;
    using System.Linq;
    using System.Net;
    using System.Security.Cryptography;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.MediaServices.Client;
    using Newtonsoft.Json.Linq;

    namespace AMSLiveTest
    {
       class Program
       {
           private const string StreamingEndpointName = "streamingendpoint001";
           private const string ChannelName = "channel001";
           private const string AssetlName = "asset001";
           private const string ProgramlName = "program001";

           // Read values from the App.config file.
           private static readonly string _mediaServicesAccountName =
           ConfigurationManager.AppSettings["MediaServicesAccountName"];
           private static readonly string _mediaServicesAccountKey =
           ConfigurationManager.AppSettings["MediaServicesAccountKey"];

           // Field for service context.
           private static CloudMediaContext _context = null;
           private static MediaServicesCredentials _cachedCredentials = null;

           static void Main(string[] args)
           {
               // Create and cache the Media Services credentials in a static class variable.
               _cachedCredentials = new MediaServicesCredentials(
               _mediaServicesAccountName,
               _mediaServicesAccountKey);
               // Used the cached credentials to create CloudMediaContext.
               _context = new CloudMediaContext(_cachedCredentials);

               IChannel channel = CreateAndStartChannel();

               // Set the Live Encoder to point to the channel's input endpoint:
               string ingestUrl = channel.Input.Endpoints.FirstOrDefault().Url.ToString();

               // Use the previewEndpoint to preview and verify
               // that the input from the encoder is actually reaching the Channel.
               string previewEndpoint = channel.Preview.Endpoints.FirstOrDefault().Url.ToString();

               IProgram program = CreateAndStartProgram(channel);
               ILocator locator = CreateLocatorForAsset(program.Asset, program.ArchiveWindowLength);
               IStreamingEndpoint streamingEndpoint = CreateAndStartStreamingEndpoint();
               GetLocatorsInAllStreamingEndpoints(program.Asset);

               // Once you are done streaming, clean up your resources.
               Cleanup(streamingEndpoint, channel);
           }

           public static IChannel CreateAndStartChannel()
           {
               //If you want to change the Smooth fragments to HLS segment ratio, you would set the ChannelCreationOptions’s Output property.

               IChannel channel = _context.Channels.Create(
               new ChannelCreationOptions
               {
               Name = ChannelName,
               Input = CreateChannelInput(),
               Preview = CreateChannelPreview()
               });

            //Starting and stopping Channels can take some time to execute. To determine the state of operations after calling Start or Stop, query the IChannel.State.

               channel.Start();

               return channel;
           }

           private static ChannelInput CreateChannelInput()
           {
               return new ChannelInput
               {
                   StreamingProtocol = StreamingProtocol.RTMP,
                   AccessControl = new ChannelAccessControl
                   {
                    IPAllowList = new List<IPRange>
                               {
                               new IPRange
                           {
                               Name = "TestChannelInput001",
                               // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
                            // will allow access to all IP addresses.
                               Address = IPAddress.Parse("0.0.0.0"),
                               SubnetPrefixLength = 0
                           }
                       }
                   }
               };
           }

           private static ChannelPreview CreateChannelPreview()
           {
               return new ChannelPreview
               {
                   AccessControl = new ChannelAccessControl
                   {
                    IPAllowList = new List<IPRange>
                       {
                           new IPRange
                           {
                               Name = "TestChannelPreview001",
                               // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
                            // will allow access to all IP addresses.
                               Address = IPAddress.Parse("0.0.0.0"),
                               SubnetPrefixLength = 0
                           }
                       }
                   }
               };
           }

           public static void UpdateCrossSiteAccessPoliciesForChannel(IChannel channel)
           {
            var clientPolicy =
                @"<?xml version=""1.0"" encoding=""utf-8""?>
                <access-policy>
                    <cross-domain-access>
                        <policy>
                            <allow-from http-methods=""*"" http-request-headers=""*"">
                                <domain uri=""*""/>
                            </allow-from>
                            <grant-to>
                                <resource path=""/"" include-subpaths=""true""/>
                            </grant-to>
                        </policy>
                    </cross-domain-access>
                </access-policy>";

            var xdomainPolicy =
                @"<?xml version=""1.0"" ?>
                <cross-domain-policy>
                    <allow-access-from domain=""*"" />
                </cross-domain-policy>";

               channel.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
               channel.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;

               channel.Update();
           }

           public static IProgram CreateAndStartProgram(IChannel channel)
           {
               IAsset asset = _context.Assets.Create(AssetlName, AssetCreationOptions.None);

               // Create a Program on the Channel. You can have multiple Programs that overlap or are sequential;
               // however each Program must have a unique name within your Media Services account.
               IProgram program = channel.Programs.Create(ProgramlName, TimeSpan.FromHours(3), asset.Id);
               program.Start();

               return program;
           }

           public static ILocator CreateLocatorForAsset(IAsset asset, TimeSpan ArchiveWindowLength)
           {
               // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.            

               var locator = _context.Locators.CreateLocator
                   (
                       LocatorType.OnDemandOrigin,
                       asset,
                       _context.AccessPolicies.Create
                       (
                           "Live Stream Policy",
                           ArchiveWindowLength,
                           AccessPermissions.Read
                       )
                   );

               return locator;
           }

           public static IStreamingEndpoint CreateAndStartStreamingEndpoint()
           {
               var options = new StreamingEndpointCreationOptions
               {
                   Name = StreamingEndpointName,
                   ScaleUnits = 1,
                   AccessControl = GetAccessControl(),
                   CacheControl = GetCacheControl()
               };

               IStreamingEndpoint streamingEndpoint = _context.StreamingEndpoints.Create(options);
               streamingEndpoint.Start();

               return streamingEndpoint;
           }

           private static StreamingEndpointAccessControl GetAccessControl()
           {
               return new StreamingEndpointAccessControl
               {
                IPAllowList = new List<IPRange>
                   {
                       new IPRange
                       {
                           Name = "Allow all",
                           Address = IPAddress.Parse("0.0.0.0"),
                           SubnetPrefixLength = 0
                       }
                   },

                AkamaiSignatureHeaderAuthenticationKeyList = new List<AkamaiSignatureHeaderAuthenticationKey>
                   {
                       new AkamaiSignatureHeaderAuthenticationKey
                       {
                           Identifier = "My key",
                           Expiration = DateTime.UtcNow + TimeSpan.FromDays(365),
                           Base64Key = Convert.ToBase64String(GenerateRandomBytes(16))
                       }
                   }
               };
           }

           private static byte[] GenerateRandomBytes(int length)
           {
               var bytes = new byte[length];
               using (var rng = new RNGCryptoServiceProvider())
               {
                   rng.GetBytes(bytes);
               }

               return bytes;
           }

           private static StreamingEndpointCacheControl GetCacheControl()
           {
               return new StreamingEndpointCacheControl
               {
                   MaxAge = TimeSpan.FromSeconds(1000)
               };
           }

           public static void UpdateCrossSiteAccessPoliciesForStreamingEndpoint(IStreamingEndpoint streamingEndpoint)
           {
                var clientPolicy =
                    @"<?xml version=""1.0"" encoding=""utf-8""?>
                    <access-policy>
                        <cross-domain-access>
                            <policy>
                                <allow-from http-methods=""*"" http-request-headers=""*"">
                                    <domain uri=""*""/>
                                </allow-from>
                                <grant-to>
                                    <resource path=""/"" include-subpaths=""true""/>
                                </grant-to>
                            </policy>
                        </cross-domain-access>
                    </access-policy>";

                var xdomainPolicy =
                    @"<?xml version=""1.0"" ?>
                    <cross-domain-policy>
                        <allow-access-from domain=""*"" />
                    </cross-domain-policy>";

               streamingEndpoint.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
               streamingEndpoint.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;

               streamingEndpoint.Update();
           }

           public static void GetLocatorsInAllStreamingEndpoints(IAsset asset)
           {
               var locators = asset.Locators.Where(l => l.Type == LocatorType.OnDemandOrigin);
               var ismFile = asset.AssetFiles.AsEnumerable().FirstOrDefault(a => a.Name.EndsWith(".ism"));
               var template = new UriTemplate("{contentAccessComponent}/{ismFileName}/manifest");
               var urls = locators.SelectMany(l =>
                           _context
                               .StreamingEndpoints
                               .AsEnumerable()
                               .Where(se => se.State == StreamingEndpointState.Running)
                               .Select(
                                   se =>
                                       template.BindByPosition(new Uri("http://" + se.HostName),
                                       l.ContentAccessComponent,
                                           ismFile.Name)))
                           .ToArray();

           }

           public static void Cleanup(IStreamingEndpoint streamingEndpoint,
                                       IChannel channel)
           {
               if (streamingEndpoint != null)
               {
                   streamingEndpoint.Stop();
                   streamingEndpoint.Delete();
               }

               IAsset asset;
               if (channel != null)
               {

                   foreach (var program in channel.Programs)
                   {
                       asset = _context.Assets.Where(se => se.Id == program.AssetId)
                                               .FirstOrDefault();

                       program.Stop();
                       program.Delete();

                       if (asset != null)
                       {
                           foreach (var l in asset.Locators)
                               l.Delete();

                           asset.Delete();
                       }
                   }

                   channel.Stop();
                   channel.Delete();
               }
           }
       }
    }

    Now I want to make a method that cuts the live stream, for example every 15 minutes, and saves each piece as an mp4 file, but I don't know where to start.

    Can someone point me in the right direction?

    Kind regards

    UPDATE:

    I want to save the mp4 files on my hard disk.
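
    For reference, one possible starting point that sidesteps the Azure SDK itself: pull the published stream with ffmpeg and let its segment muxer write fixed-length MP4 files to disk. A sketch, where the manifest URL is a placeholder standing in for the locator URL built in the code above:

     # Sketch: copy the live stream into 15-minute MP4 segments on disk.
     # 900 seconds = 15 minutes; -reset_timestamps makes each file start at 0;
     # the aac_adtstoasc bitstream filter is commonly needed when copying
     # AAC audio from an HLS source into MP4.
     ffmpeg -i "http://example-endpoint/asset/manifest(format=m3u8-aapl)" \
         -c copy -bsf:a aac_adtstoasc \
         -f segment -segment_time 900 -segment_format mp4 \
         -reset_timestamps 1 out%03d.mp4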