Other articles (75)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Personalise by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    Explanation of the notable changes involved in moving from MediaSPIP version 0.1 to version 0.2. What's new?
    Software dependencies: use of the latest versions of FFmpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

On other sites (9807)

  • calculate the real duration of a broken movie with ffmpeg/ffprobe

    26 October 2012, by jAckOdE

    I have a broken video file: ffprobe (ffprobe -i movie.mkv) says its duration is about 2 hours, but the file only contains video data for the first 8 minutes (checked with VLC Player).

    How can I get the real duration (8 minutes) using ffmpeg or ffprobe? (One possible approach is sketched after the log below.)

    Here is the output:

    ffprobe version 0.7.13, Copyright (c) 2007-2011 the FFmpeg developers
     built on Aug  1 2012 21:08:35 with clang 3.1 (tags/Apple/clang-318.0.58)
     configuration: --prefix=/opt/local --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libdirac --enable-libschroedinger --enable-libopenjpeg --enable-libvpx --enable-libspeex --disable-libopencore-amrnb --disable-libopencore-amrwb --mandir=/opt/local/share/man --enable-shared --enable-pthreads --cc=/usr/bin/clang --arch=x86_64 --enable-yasm --enable-gpl --enable-postproc --enable-libx264 --enable-libxvid
     libavutil    50. 43. 0 / 50. 43. 0
     libavcodec   52.123. 0 / 52.123. 0
     libavformat  52.111. 0 / 52.111. 0
     libavdevice  52.  5. 0 / 52.  5. 0
     libavfilter   1. 80. 0 /  1. 80. 0
     libswscale    0. 14. 1 /  0. 14. 1
     libpostproc  51.  2. 0 / 51.  2. 0
    [matroska,webm @ 0x7f93dc01e600] Estimating duration from bitrate, this may be inaccurate
    Input #0, matroska,webm, from 'Mission.mkv':
     Duration: 02:12:56.21, start: 0.000000, bitrate: 448 kb/s
       Stream #0.0: Video: mpeg4, yuv420p, 1280x528 [PAR 1:1 DAR 80:33], 23.98 fps, 23.98 tbr, 1k tbn, 23.98 tbc (default)
       Stream #0.1: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s (default)
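
    One common workaround, sketched here as an assumption rather than taken from the post, is to let ffmpeg decode the whole file into a null muxer and keep the last progress timestamp it prints on stderr: the point where decoding stops is the real playable duration. A minimal Node.js sketch under that assumption (ffmpeg on the PATH; realDuration is a hypothetical helper name):

    // Sketch: estimate the real playable duration of a damaged file by decoding it
    // completely and keeping the last "time=" value ffmpeg reports on stderr.
    // Assumes ffmpeg is on the PATH; 'movie.mkv' is the broken input from the question.
    const { spawn } = require('child_process');

    function realDuration(file, callback) {
      const proc = spawn('ffmpeg', ['-i', file, '-f', 'null', '-']);
      let lastTime = null;
      proc.stderr.on('data', (chunk) => {
        // Progress lines look like "... time=00:08:02.15 bitrate=...".
        const matches = chunk.toString().match(/time=(\d+:\d+:\d+\.\d+)/g);
        if (matches) lastTime = matches[matches.length - 1].slice('time='.length);
      });
      proc.on('close', () => callback(lastTime)); // e.g. "00:08:02.15"
    }

    realDuration('movie.mkv', (d) => console.log('real duration:', d));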
  • Capture raw video byte stream for real-time transcoding

    9 September 2012, by user1145905

    I would like to achieve the following:

    Set up a proxy server to handle video requests by clients (for now, say all video requests from any Android video client) from a remote video server like YouTube, Vimeo, etc. I don't have access to the video files being requested, hence the need for a proxy server; I have settled on Squid. This proxy should process the video signal/stream coming from the remote server before relaying it back to the requesting client.

    To achieve the above, I would either:

    1. Need to figure out the precise location (URL) of the video resource being requested, download it really fast, and modify it as I want before HTTP-streaming it back to the client as the transcoding continues (simultaneously, with some latency); or

    2. Access the raw byte stream, pipe it into a transcoder (I'm thinking ffmpeg), and proceed with streaming to the client (also with some expected latency).

    Option #2 seems tricky but gives more flexibility in the kind of transcoding I would like to perform. I would have to handle raw data/packets directly, and I don't know whether ffmpeg accepts such input (see the sketch after this question).

    In short, I'm looking for a way to implement real-time transcoding of videos that I do not have direct access to from my proxy. Any suggestions on the tools or approaches I could use? I have also read about GStreamer (but could not tell whether it applies to my situation) and MPlayer/MEncoder.

    And finally, a rather specific question: are there any tools out there that, given a YouTube video URL, can download the byte stream for further processing? That is, something similar to the Chrome YouTube downloader, but one that can be integrated with a server-side script?

    Thanks for any pointers/suggestions!
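
    On the specific point of whether ffmpeg accepts such input: it can read from a pipe (-i pipe:0) and write a streamable container such as MPEG-TS to another pipe, which makes option #2 workable for container formats that can be parsed progressively (a plain MP4 with its moov atom at the end is a known exception). A rough Node.js sketch under those assumptions; upstream and clientRes are hypothetical stream names, not taken from the post:

    // Sketch: pipe an upstream video byte stream through ffmpeg and back to the client.
    // 'upstream' is the readable stream fetched from the remote server and 'clientRes'
    // the proxy's HTTP response object; both are hypothetical names used for illustration.
    const { spawn } = require('child_process');

    function relayTranscoded(upstream, clientRes) {
      // Read raw bytes from stdin (pipe:0), transcode, and emit MPEG-TS on stdout (pipe:1).
      const ff = spawn('ffmpeg', [
        '-i', 'pipe:0',
        '-c:v', 'libx264', '-preset', 'veryfast',
        '-c:a', 'aac',
        '-f', 'mpegts', 'pipe:1'
      ]);
      upstream.pipe(ff.stdin);        // bytes arriving from the remote server
      ff.stdout.pipe(clientRes);      // transcoded stream relayed to the requesting client
      ff.stderr.on('data', () => {}); // drain stderr so ffmpeg is never blocked on it
    }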

  • error in HLS video streaming playback with real-time ffmpeg manipulation

    4 August 2019, by Mohsen Rahnamaei

    I am using fluent-ffmpeg to write something onto a video stream in real time.
    That is, I am using this tool to draw text on every TS file the server is about to serve in response to the client's HLS requests.
    On the server I am using this code:

    // fluent-ffmpeg is assumed to have been required earlier in this module,
    // e.g. var ffmpeg = require('fluent-ffmpeg');
    // CONTENT_TYPE.SEGMENT is defined elsewhere in the poster's code (an MPEG-TS MIME type).
    res.setHeader('Content-Type', CONTENT_TYPE.SEGMENT)
    res.statusCode = 200

    // Re-encode the requested TS segment, drawing a text overlay on the video,
    // and pipe the resulting MPEG-TS stream straight into the HTTP response.
    var proc = ffmpeg(req.filePath)
      .videoFilters({
        filter: 'drawtext',
        options: {
          text: 'VERY LONG TEXT VERY VERY VERY VERY LOL!!!',
          fontsize: 36,
          fontcolor: 'white',
          x: '(main_w/2-text_w/2)',
          y: '(text_h/2)+15',
          shadowcolor: 'black',
          shadowx: 2,
          shadowy: 2
        }
      })
      .videoCodec('libx264')
      .audioCodec('aac')
      .format('mpegts')
      .on('end', function (stdout, stderr) {
        console.log('Transcoding succeeded!', req.filePath)
      })
      .on('error', function (err) {
        console.log('an error happened: ' + err.message)
      })
      .pipe(res, { end: true })

    but the client only plays the first TS file, and after that I get this log in the console:

    [log] > AVC:6798 ms overlapping between fragments detected
    blob:http://demo.jwp…a8dc-56b513684988:1
    [log] > Video/PTS/DTS adjusted: 6798/6798,delta:-6798 ms
    blob:http://demo.jwp…a8dc-56b513684988:1

    and a bunch of these logs:

    [warn] > Dropping 1 audio frame @ 6.805s due to 6797 ms overlap.

    What should I do? (One possible adjustment is sketched below.)
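
    A plausible cause of these overlap warnings, offered as an assumption the post does not confirm, is that each re-encoded TS segment comes out with timestamps restarting at zero, so every new segment appears to overlap the previous one in the player. One adjustment to try is keeping the source timestamps with ffmpeg's -copyts (plus -muxdelay 0), passed through fluent-ffmpeg's outputOptions. A sketch based on the code above, where req and res are the request/response objects from that handler:

    // Sketch: preserve the source segment's PTS/DTS instead of restarting at zero,
    // so consecutive TS files keep a continuous timeline for the player.
    // Whether -copyts/-muxdelay are sufficient here is an assumption, not confirmed by the post.
    var ffmpeg = require('fluent-ffmpeg')

    var proc = ffmpeg(req.filePath)
      .videoFilters({
        filter: 'drawtext',
        options: { text: 'VERY LONG TEXT', fontsize: 36, fontcolor: 'white',
                   x: '(main_w/2-text_w/2)', y: '(text_h/2)+15' }
      })
      .videoCodec('libx264')
      .audioCodec('aac')
      .outputOptions(['-copyts', '-muxdelay 0']) // keep the original timestamps in the output
      .format('mpegts')
      .on('error', function (err) { console.log('an error happened: ' + err.message) })
      .pipe(res, { end: true })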