Advanced search


Other articles (43)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; a link to the site / page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • Libraries and binaries specific to video and audio processing

    31 January 2010, by

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFmpeg: the main encoder, which transcodes almost every type of video and audio file into formats playable on the Internet. See this tutorial for its installation; Oggz-tools: tools for inspecting ogg files; Mediainfo: retrieves information from most video and audio formats;
    Complementary and optional binaries: flvtool2: (...)
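
    As a rough illustration of how these binaries are typically invoked (a sketch only; the file names are placeholders and the exact options depend on the MediaSPIP configuration):

    # Transcode a source file into a web-playable MP4 with FFmpeg
    ffmpeg -i source.avi -c:v libx264 -c:a aac output.mp4
    # Inspect an ogg file (the tool may be 'oggz-info' or 'oggz info' depending on the oggz-tools version)
    oggz-info video.ogv
    # Retrieve technical metadata from most video and audio formats
    mediainfo source.avi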

On other sites (8976)

  • Using Gstreamer or ffmpeg to create an RTSP client on Android

    9 December 2014, by Pankaj Bansal

    I want to stream an RTSP stream on Android and I have finally come to the
    conclusion that I can't use the Android APIs MediaPlayer, VideoView etc. because
    latency is a big issue for me. I need a latency of <500 ms. Now I am
    planning to use Gstreamer or ffmpeg to create an Android RTSP client. I just
    have a few questions:

    1. Will the Gstreamer or ffmpeg client be able to provide latency <500 ms? I read
      there are some parameters which I can tweak to get very low latency (see the
      sketch after these questions); I just want to confirm. I have very good network
      bandwidth. The frame size is generally 1920x1080.

    2. I read that Gstreamer is built one level above ffmpeg and uses ffmpeg
      codecs to work. I want to know which one is easier to work with for creating an Android client: working on Gstreamer or working directly on ffmpeg.

    3. If I use a Gstreamer Android client, will I have to use a Gstreamer server as well to stream the data? Currently I am using the Live555 RTSP server to stream data.
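
    Regarding the first question, a minimal sketch of the kind of latency knobs both stacks expose (the URL is a placeholder, an H.264 stream is assumed, and on Android the same GStreamer pipeline would be built programmatically rather than with gst-launch):

    # GStreamer: rtspsrc exposes a jitterbuffer 'latency' property in milliseconds; lowering it reduces buffering
    gst-launch-1.0 rtspsrc location=rtsp://example.com/stream latency=100 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
    # ffmpeg side: ffplay with input buffering disabled and low-delay flags, over TCP
    ffplay -fflags nobuffer -flags low_delay -rtsp_transport tcp rtsp://example.com/stream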

  • Using a custom hardware decoder with the chromium browser?

    2 June 2016, by Sunny Shukla

    I am working on custom hardware that has a hardware decoder. This hardware decoder works fine with Linux applications and gstreamer. Now we are planning to extend hardware decoding support to the chromium browser.

    To the best of my knowledge, the chromium browser uses the ffmpeg libraries for demuxing and decoding.

    So if I add support for our custom hardware decoder to the ffmpeg libraries, how will the chromium browser know to use our custom hardware decoder while playing videos?

    Note: we have only one hardware decoder on our custom hardware.
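
    As a side note on the ffmpeg half of that question (a sketch, not Chromium-specific guidance): Chromium bundles its own copy of ffmpeg, so a decoder added to a system ffmpeg build is not automatically visible to the browser. Checking what a given ffmpeg build exposes looks like this:

    # List the decoders compiled into this ffmpeg build (the grep pattern is just an example)
    ffmpeg -decoders | grep -i h264
    # List the hardware acceleration methods this build knows about
    ffmpeg -hwaccels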

  • How to generate a waveform from video & show it with the video

    25 November 2016, by Salil

    We are using Rails as the back end & AngularJS on the front end in my app, where we need to show the video & the audio waveform of that video.

    We are using ’wavesurfer.js’ to show the waveform on the front end & ’node-pcm’ to generate PCM from the video file on the back end.

    This works as expected, but for some videos, while creating the waveform from the PCM data, we get a flat line instead of small sine waves.
    It also takes too much time to show the waveform on every page reload.

    To overcome this we are planning to create the waveform image using ffmpeg:

    ffmpeg -i 'https://s3.amazonaws.com/aadasdsadsadasdas/xyz.mp4' -filter_complex showwavespic -frames:v 1 output.png

    This works fine, but it also takes too much time (of course only once) to generate the image for a remote video (i.e. we are saving the videos on S3).

    The problem with this is that I can't find any library to integrate the waveform image with the video.

    Can someone suggest a better approach for this?
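
    One possible direction for the last point, sketched with ffmpeg's showwaves and overlay filters rather than a separate library (the file names and waveform size are placeholders):

    # Draw the audio waveform and burn it onto the bottom of the video frame
    ffmpeg -i input.mp4 \
      -filter_complex "[0:a]showwaves=s=1280x120:mode=line[wave];[0:v][wave]overlay=0:main_h-overlay_h[out]" \
      -map "[out]" -map 0:a -c:v libx264 -c:a copy output_with_waveform.mp4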