
Media (91)

Other articles (92)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    Explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.3. What's new?
    Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favor of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present the changes in your MediaSPIP, or news about your projects, via the news section of your MediaSPIP.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News-item creation form: for a document of type "news item", the fields offered by default are: publication date (customize the publication date) (...)

On other sites (12475)

  • Why is one ffmpeg webm dash stream much larger than the others?

    5 January 2017, by ranvel

    Over the summer, I worked on putting together a script which took an x264 video/MP3 stream and broke it up into separate streams so that it would work via MSE-DASH (based heavily on the instructions on the webmproject.org website). Those same scripts have ceased to work, turning a 6 GB video into several 25 GB videos. I kept up with ffmpeg updates, so I don't know when it stopped working, but I am guessing it was due to the way their WebM DASH implementation was updated.

    I found a new method which works better, but it still has a major problem with one stream. I was hoping someone could explain how this encoding works so that I can understand the underlying cause.

    #!/bin/bash
    COMMON_OPTS="-map 0:0 -an -threads 11 -cpu-used 4 -cmp chroma"
    WEBM_OPTS="-f webm -c:v vp9 -keyint_min 50 -g 50 -dash 1"

    ffmpeg -i $1 -vn -acodec libvorbis -ab 128k audio.webm &
    ffmpeg -i $1 $COMMON_OPTS $WEBM_OPTS -b:v 500k -vf scale=1280:720 -y vid-500k.webm &
    ffmpeg -i $1 $COMMON_OPTS $WEBM_OPTS -b:v 700k -vf scale=1280:720 -y vid-700k.webm &
    ffmpeg -i $1 $COMMON_OPTS $WEBM_OPTS -b:v 1000k -vf scale=1280:720 -y vid-1000k.webm &
    ffmpeg -i $1 $COMMON_OPTS $WEBM_OPTS -b:v 1500k -vf scale=1280:720 -y vid-1500k.webm  

    The transcode is not yet complete, but you can see where this is headed:

    -rw-r--r--  1 user  staff    87M Jan  4 23:27 audio.webm
    -rw-r--r--  1 user  staff    27M Jan  4 23:42 vid-1000k.webm
    -rw-r--r--  1 user  staff   285M Jan  4 23:42 vid-1500k.webm
    -rw-r--r--  1 user  staff    15M Jan  4 23:42 vid-500k.webm
    -rw-r--r--  1 user  staff    20M Jan  4 23:42 vid-700k.webm

    The 1500k variant is disproportionately larger than the other streams.

    The other problem is that when I use a shorter video, let's say eight or nine minutes, the above configuration runs as expected and everything is perfect. I don't know where the limit is, since each test costs a lot of processing power and time, but if the video is less than ten minutes it works, and if it's longer than an hour it produces massive files.
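
    One detail worth noting when reading the sizes above: with libvpx-vp9, specifying only -b:v requests plain average-bitrate control, which can overshoot heavily on long or complex sources. The command below is a hedged, constrained-quality sketch of the 1500k rendition, not part of the original script; the -crf, -maxrate and -bufsize values are illustrative assumptions for capping the rate.

    # Hedged sketch: constrained-quality VP9 so the 1500k rendition cannot balloon.
    # -crf sets the quality target, -b:v acts as a ceiling, and -maxrate/-bufsize
    # bound short-term overshoot. Values are illustrative, not tuned.
    ffmpeg -i "$1" -map 0:0 -an -threads 11 \
           -c:v libvpx-vp9 -crf 31 -b:v 1500k -maxrate 1500k -bufsize 3000k \
           -keyint_min 50 -g 50 -dash 1 \
           -vf scale=1280:720 -f webm -y vid-1500k.webm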

  • Low latency video streaming on Android

    17 May 2021, by Louis Blenner

    I'd like to be able to stream the video from my webcam to an Android app with a latency below 500ms, on my local network.

    To capture and send the video over the network, I use ffmpeg.

    ffmpeg -f v4l2 -i /dev/video0 -preset ultrafast -tune zerolatency -vcodec libx264 -an -vf format=yuv420p -f mpegts  udp://192.168.1.155:5000

    This command takes the webcam as input, converts it, and sends it to a device as an MPEG-TS stream over UDP.

    I am able to read the video on another PC with a latency below 500 ms, using commands like

    gst-launch-1.0 -v udpsrc port=5000 ! video/mpegts ! tsdemux ! h264parse ! avdec_h264 ! fpsdisplaysink sync=false

    or

    mpv udp://0.0.0.0:5000 --no-cache --untimed --no-demuxer-thread --video-sync=audio --vd-lavc-threads=1 
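
    (For comparison, ffplay can also be tuned toward a similarly aggressive low-latency read of the same UDP stream; the command below is a hedged sketch with assumed flag values, not one of the commands from the original tests.)

    # Hedged sketch: disable input buffering and probing delays, drop late frames.
    ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 \
           -framedrop -sync ext udp://0.0.0.0:5000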

    So it is possible to have this range of latency. I'd like to have the same thing on Android.

    Here are my attempts to do that.

    ExoPlayer

    After looking at the different players available in Android Studio, it seems like ExoPlayer is the go-to choice. I tried the different options indicated in the live-streaming documentation, but I always end up with a stream that takes seconds to start and has a latency of several seconds. I also tried adding a button that seeks to the default position of the window, but it results in a load of several seconds.

    DefaultExtractorsFactory extractorsFactory =
                new DefaultExtractorsFactory()
                        .setTsExtractorFlags(DefaultTsPayloadReaderFactory.FLAG_IGNORE_AAC_STREAM);

        player = new SimpleExoPlayer.Builder(this)
                .setMediaSourceFactory(
                        new DefaultMediaSourceFactory(this, extractorsFactory))
                .setLoadControl(new DefaultLoadControl.Builder()
                        .setBufferDurationsMs(DefaultLoadControl.DEFAULT_MIN_BUFFER_MS, DefaultLoadControl.DEFAULT_MAX_BUFFER_MS, 200, 200)
                        .build())
                .build();
        MyPlayerView playerView = findViewById(R.id.player_view);
        // Bind the player to the view.
        playerView.setPlayer(player);
        // Build the media item.
        MediaItem mediaItem = new MediaItem.Builder()
                .setUri(Uri.parse("udp://0.0.0.0:5000"))
                .setLiveMaxOffsetMs(500)
                .setLiveTargetOffsetMs(0)
                .setLiveMinOffsetMs(0)
                .build();
        // Set the media item to be played.
        player.setMediaItem(mediaItem);
        // Prepare the player.
        player.setPlayWhenReady(true);
        player.prepare();
        //player.seekToDefaultPosition();

    This issue is about the same problem, and the conclusion was that ExoPlayer is not a good fit for this use case:

    "I'll be honest, ultra low-latency like this isn't ExoPlayer's main use-case"

    VLC

    Another attempt was to use the VLC library. But I was unable to get the same low-latency stream as with the two previous examples. I tried changing VLC's preferences so that it streams as fast as possible.

    Input/Codecs -> x264 preset: ultrafast - zerolatency
    Input/Codecs -> Access Module: UDP input
    Input/Codecs -> Clock Jitter: 500
    Audio: disable audio

    I also tried reducing the different buffers. However, I still get a latency of more than one second with that.

    GStreamer

    Another attempt was to create a React Native project in order to try the different players available for it. One player that seemed promising was react-native-gstreamer, because it uses GStreamer, which is able to stream with low latency (the gst-launch command above). But that library is now outdated.

    Question

    There were other attempts, but none were successful. Is there a problem with one of my approaches? And if not, is there a player on Android (that I missed) which can achieve a low-latency stream like GStreamer or mpv do on Linux?

  • Resize videos and keep codecs using PHP-FFMpeg

    30 December 2019, by tmp_hallenser

    I’m trying to resize a couple of videos (with different codecs) using PHP-FFMpeg. The official documentation gives the following example (see https://github.com/PHP-FFMpeg/PHP-FFMpeg)

    $ffmpeg = FFMpeg\FFMpeg::create();
    $video = $ffmpeg->open('video.mpg');
    $video
       ->filters()
       ->resize(new FFMpeg\Coordinate\Dimension(320, 240))
       ->synchronize();
    $video
       ->frame(FFMpeg\Coordinate\TimeCode::fromSeconds(10))
       ->save('frame.jpg');
    $video
       ->save(new FFMpeg\Format\Video\X264(), 'export-x264.mp4')
       ->save(new FFMpeg\Format\Video\WMV(), 'export-wmv.wmv')
       ->save(new FFMpeg\Format\Video\WebM(), 'export-webm.webm');

    Since I don't want to specify the codec (I want to keep the same one as the source file), I tried to create a "copy" format and use that format in the save command instead.

    class MOVFormat extends FFMpeg\Format\Video\DefaultVideo
    {
       public function __construct($audioCodec = 'copy', $videoCodec = 'copy')
       {
           $this
               ->setAudioCodec($audioCodec)
               ->setVideoCodec($videoCodec);
       }

       public function supportBFrames()
       {
           return false;
       }

       public function getAvailableAudioCodecs()
       {
           return ['copy'];
       }

       public function getAvailableVideoCodecs()
       {
           return ['copy'];
       }
    }

    This results in Filtergraph '[in]scale=320:240 [out]' was defined for video output stream 0:0 but codec copy was selected, followed by Filtering and streamcopy cannot be used together.

    Technically, it should be possible to resize without specifying the codec, judging by this reply: https://superuser.com/a/624564
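
    The second error is expected at the ffmpeg level: -vf scale forces a decode and re-encode, so it cannot be combined with stream copy (-c:v copy). One way to approximate "keep the same codec as the source" is to probe the source codec first and then re-encode with a matching encoder. The shell sketch below illustrates that idea only; the codec-to-encoder mapping and file names are assumptions, and it bypasses PHP-FFMpeg entirely.

    # Probe the video codec of the source (prints e.g. "h264").
    codec=$(ffprobe -v error -select_streams v:0 \
            -show_entries stream=codec_name -of default=nw=1:nk=1 video.mpg)

    # Map decoder names to encoder names where they differ, then resize while
    # re-encoding video with the matching encoder and stream-copying audio.
    case "$codec" in
      h264) enc=libx264 ;;
      vp9)  enc=libvpx-vp9 ;;
      *)    enc="$codec" ;;
    esac
    ffmpeg -i video.mpg -vf scale=320:240 -c:v "$enc" -c:a copy resized.mp4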

    Any help is appreciated!