Other articles (47)

  • Customizing categories

    21 June 2013

    Category creation form
    For those who know SPIP well, a category can be thought of as a section (rubrique).
    For a document of type “category”, the fields offered by default are: Text
    This form can be modified under:
    Administration > Configuration des masques de formulaire.
    For a document of type “media”, the fields not displayed by default are: Descriptif rapide (short description)
    It is also in this configuration section that you can specify the (...)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Support for all types of media

    10 April 2011

    Unlike many modern document-sharing platforms and other software, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

On other sites (7286)

  • How can I store a live stream when the internet connection is interrupted?

    5 June 2019, by Marcello Moreira

    I’m building a solution using a drone and a 3G/4G connection.
    I have an IP camera encoded in H.264 by a hardware encoder connected to a Raspberry Pi and a 3G/4G modem. The hardware encoder livestreams the video via RTMP to a remote server I have. All these devices are on a moving platform, and sometimes the modem loses its internet connection for a few seconds or minutes. When this happens, I want to store the live footage on the Raspberry Pi with ffmpeg, and when the connection is restored I can send it back to the server. I have access to the encoded livestream from the Raspberry Pi over LAN even when the internet is down.

    I do not know how or where I should start.
    I see two approaches for this.

    First approach

    One is to do all the streaming via ffmpeg and disable the automatic hardware stream; when ffmpeg detects that it can’t send the stream to the remote server, it starts to store the video (like a buffer) until the connection is restored. The issue with this is that I don’t know whether ffmpeg can detect that the internet connection is down, nor how I can buffer the video. Also, by doing this, the live video would have a huge delay once the connection is restored, and I can’t have lots of delay in my solution.
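
    One way to approximate this fallback, as a minimal sketch: a small shell loop on the Raspberry Pi pulls the encoder’s LAN stream and pushes it to the server, and records locally whenever the push fails. The RTSP and RTMP URLs and the 60-second chunk length are placeholders, not values from the question.

    while true; do
      # Pull the encoder's LAN stream and push it to the remote ingest.
      # ffmpeg exits when the RTMP connection drops.
      ffmpeg -rtsp_transport tcp -i rtsp://192.168.1.10/stream \
             -c copy -f flv rtmp://example.com/live/drone
      # Push failed: record a local chunk for 60 seconds, then retry the push.
      ffmpeg -rtsp_transport tcp -i rtsp://192.168.1.10/stream \
             -c copy -t 60 "outage_$(date +%s).ts"
    done

    Note that this does not buffer the live feed for later catch-up; it only keeps a local copy of what was missed, so the resumed live stream carries no extra delay.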

    Second approach

    The second is to simultaneously store the live video with ffmpeg; when the internet goes down, a process records the timestamp and keeps watching until the connection is restored. Then it sends only the missing piece to my server. At my server I would need to figure out a way to join those streams back up (I would gladly accept tips on that too). The issue with this is that there’s limited space on my Raspberry Pi, so I can only store a limited amount. Also, my device may be turned off when it lands, so I need to send the video recording ASAP after the connection is restored.
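
    A minimal sketch of this “always record, upload the gaps later” idea, using ffmpeg’s tee muxer (URLs and paths are placeholders): one input is copied both to the RTMP push and to short local MPEG-TS segments, with the RTMP slave marked onfail=ignore so a dropped connection does not stop the local recording.

    ffmpeg -rtsp_transport tcp -i rtsp://192.168.1.10/stream -c copy -map 0 \
      -f tee "[f=flv:onfail=ignore]rtmp://example.com/live/drone|[f=segment:segment_time=10:reset_timestamps=1]/home/pi/rec/seg_%05d.ts"

    After an outage, only the segments covering the gap need to be uploaded; on the server, MPEG-TS segments can simply be concatenated (or joined with ffmpeg’s concat demuxer) to rebuild the missing piece, and old segments can be deleted on the Pi to keep the limited storage under control.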

    So, which approach seems to be the better one?

  • ffmpeg with Axis P1347 returns 400 Bad Request, but Axis P1357 works

    3 May 2016, by steampowered

    I have two cameras: an Axis P1347 and an Axis P1357. ffmpeg gets a 400 Bad Request on the P1347, but everything works fine with the P1357.

    I am able to successfully stream RTSP video in VLC from the Axis P1347 camera using the following URL:

    rtsp://10.8.3.85:554/axis-media/media.amp?videocodec=h264&audio=1

    However, the same URL in ffmpeg gives the following for the Axis P1347 camera:

    root@ubuntu4-virtual-machine:/home/ubuntu4# ffmpeg -re -v verbose  -i "rtsp://10.8.3.85:554/axis-media/media.amp?videocodec=h264&audio=1"
    ffmpeg version git-2016-05-02-9fcb59c Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.1)
     configuration: --enable-gpl --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-version3
     libavutil      55. 23.100 / 55. 23.100
     libavcodec     57. 38.100 / 57. 38.100
     libavformat    57. 35.100 / 57. 35.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 44.100 /  6. 44.100
     libswscale      4.  1.100 /  4.  1.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    rtsp://10.8.3.85:554/axis-media/media.amp?videocodec=h264&audio=1: Server returned 400 Bad Request

    The corresponding ffmpeg command (here with -rtsp_transport tcp added) works great with the nearly identical Axis P1357 camera:

    root@ubuntu4-virtual-machine:/home/ubuntu4# ffmpeg -re -rtsp_transport tcp -i "rtsp://10.8.3.90:554/axis-media/media.amp?videocodec=h264&audio=1"
    ffmpeg version git-2016-05-02-9fcb59c Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.1)
     configuration: --enable-gpl --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-version3
     libavutil      55. 23.100 / 55. 23.100
     libavcodec     57. 38.100 / 57. 38.100
     libavformat    57. 35.100 / 57. 35.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 44.100 /  6. 44.100
     libswscale      4.  1.100 /  4.  1.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, rtsp, from 'rtsp://10.8.3.90:554/axis-media/media.amp?videocodec=h264&audio=1':
     Metadata:
       title           : Media Presentation
     Duration: N/A, start: 0.083300, bitrate: N/A
       Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709), 2592x1944 [SAR 1:1 DAR 4:3], 12 tbr, 90k tbn
       Stream #0:1: Audio: aac (LC), 16000 Hz, mono, fltp

    Obviously RTSP is enabled and working if VLC can display video, correct? So why does VLC work with the camera, but not ffmpeg? Note that ffmpeg is installed and works correctly with a similar camera.
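
    One difference visible in the two transcripts above is that the working P1357 command forces TCP with -rtsp_transport tcp, while the failing P1347 command does not. As a first check (a sketch based only on that observation, not a confirmed fix), the same flag could be tried against the P1347:

    ffmpeg -re -rtsp_transport tcp -v verbose \
      -i "rtsp://10.8.3.85:554/axis-media/media.amp?videocodec=h264&audio=1"

    If that still returns 400 Bad Request, comparing the RTSP exchange (for example with -v trace, or a packet capture) against VLC’s session would show which request the camera is rejecting.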

  • #include file from library returns "file not found"

    16 June 2015, by Yossi Tsafar

    I’m getting this error from multiple .a files

    'libavcodec/avcodec.h' file not found

    Even though I added the library to the project and also to "Library Search Paths":

    $(PROJECT_DIR)/Project/Custom/RTPlayer/3rd-party/ffmpeg/lib

    This is how I include it in the project:

    #include "libavcodec/avcodec.h"

    Can someone please tell me what the problem seems to be?

    Thanks in advance!
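
    A note on this, as a sketch rather than a confirmed fix: the path shown above goes into "Library Search Paths", which only tells the linker where to find the .a files; the compiler resolves #include "libavcodec/avcodec.h" through "Header Search Paths". Assuming the ffmpeg headers were installed alongside the libraries in an include/ directory (a guess, since only the lib path appears in the question), the corresponding settings would look like:

    Header Search Paths:  $(PROJECT_DIR)/Project/Custom/RTPlayer/3rd-party/ffmpeg/include
    Library Search Paths: $(PROJECT_DIR)/Project/Custom/RTPlayer/3rd-party/ffmpeg/lib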