
Media (91)

Other articles (39)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, as long as your MédiaSpip installation is at version 0.2 or later. If necessary, contact the administrator of your MédiaSpip to find out.

  • Automatic backup of SPIP channels

    1 April 2010, by

    As part of setting up an open platform, it is important for hosts to have fairly regular backups available in order to guard against any potential problem.
    To carry out this task, two SPIP plugins are used: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site's important data (the documents, the elements (...)
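
    By way of illustration, the two-part backup described above (a regular MySQL dump plus a zip archive of the site's files) can be approximated with a short script run from cron. This is only a rough sketch with placeholder credentials and paths, not the Saveauto or mes_fichiers_2 plugins themselves:

    import subprocess
    import zipfile
    from datetime import date
    from pathlib import Path

    # Placeholder values; adjust to the actual SPIP installation.
    DB_NAME = "spip"
    DB_USER = "spip"
    SITE_DIR = Path("/var/www/spip/IMG")      # documents uploaded to the site
    BACKUP_DIR = Path("/var/backups/spip")

    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = date.today().isoformat()

    # 1. Database dump, comparable to what Saveauto produces (loadable in phpMyAdmin).
    dump_path = BACKUP_DIR / ("%s-%s.sql" % (DB_NAME, stamp))
    with open(dump_path, "wb") as out:
        subprocess.check_call(["mysqldump", "-u", DB_USER, DB_NAME], stdout=out)

    # 2. Zip archive of the important site files, comparable to mes_fichiers_2.
    zip_path = BACKUP_DIR / ("files-%s.zip" % stamp)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in SITE_DIR.rglob("*"):
            if path.is_file():
                zf.write(path, path.relative_to(SITE_DIR))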

  • Libraries and binaries specific to video and audio processing

    31 January 2010, by

    The following software and libraries are used in one way or another by SPIPmotion.
    Required binaries: FFmpeg, the main encoder, which transcodes almost every type of video and audio file into formats playable on the Internet (cf. this tutorial for its installation); Oggz-tools, inspection tools for ogg files; MediaInfo, which retrieves information from most video and audio formats;
    Complementary, optional binaries: flvtool2: (...)
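
    As a small aside, it can be useful to check which of these binaries are already available on the server before installing SPIPmotion. A minimal sketch (the command names are assumptions based on the list above, e.g. oggz-info for Oggz-tools; nothing here is part of SPIPmotion itself):

    import shutil

    # Binaries mentioned above: the first three are required, the last one optional.
    REQUIRED = ["ffmpeg", "oggz-info", "mediainfo"]
    OPTIONAL = ["flvtool2"]

    for name in REQUIRED + OPTIONAL:
        path = shutil.which(name)
        label = "required" if name in REQUIRED else "optional"
        print("%-10s (%s): %s" % (name, label, path or "not found"))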

On other sites (6489)

  • Playing RTSP stream in Android

    9 December 2014, by Kamil

    I'm trying to play a video stream on an Android device. Unfortunately I keep getting the same problem with MediaPlayer/VideoView. I've been searching for a few days, but still haven't found any working solution.
    For test purposes I'm using the MediaPlayer app from the API Demos (API Demos/Media/MediaPlayer/Play Streaming Video).
    Here is the code snippet for playing the stream:

    mMediaPlayer = new MediaPlayer();
    mMediaPlayer.setDataSource(path);
    mMediaPlayer.setDisplay(holder);
    mMediaPlayer.prepare();
    mMediaPlayer.setOnBufferingUpdateListener(this);
    mMediaPlayer.setOnCompletionListener(this);
    mMediaPlayer.setOnPreparedListener(this);
    mMediaPlayer.setOnVideoSizeChangedListener(this);

    When I try to play the stream I get this info from logcat:
    http://pastebin.com/5Uib5CH5

    This is the configuration of the ffserver instance streaming the video:

    Port 8090
    BindAddress 0.0.0.0

    RTSPPort 7654
    RTSPBindAddress 0.0.0.0

    MaxHTTPConnections 2000
    MaxClients 1000
    MaxBandwidth 10000

    CustomLog -
    NoDaemon
    <Feed feed1.ffm>

    File /tmp/feed1.ffm
    FileMaxSize 5M

    Launch ffmpeg -i mmsh://tempserv.cam/vid1

    ACL allow 127.0.0.1

    </Feed>

    <Stream rat1.mpg>
    Feed feed1.ffm
    Format rtp
    NoAudio
    VideoBitRate 56k
    VideoBufferSize 40
    VideoFrameRate 12
    VideoSize 176x144
    VideoGopSize 12
    VideoCodec libx264
    AVPresetVideo baseline
    </Stream>

    If anyone can advise me how to fix this, or at least point out a mistake, I will be grateful.
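
    As an aside, a stream defined this way can be sanity-checked from a desktop machine before involving Android at all. The sketch below simply shells out to ffprobe (assumed to be installed alongside ffmpeg); the rtsp://host:7654/rat1.mpg URL is a guess based on the RTSPPort and <Stream> name in the configuration above:

    import json
    import subprocess

    # Hypothetical URL, derived from RTSPPort 7654 and the <Stream rat1.mpg> section above.
    STREAM_URL = "rtsp://192.168.0.10:7654/rat1.mpg"

    def probe_stream(url):
        """Ask ffprobe to describe the stream; raises CalledProcessError if it cannot be opened."""
        out = subprocess.check_output([
            "ffprobe", "-v", "error",
            "-show_format", "-show_streams",
            "-of", "json",        # machine-readable output
            url,
        ])
        return json.loads(out.decode("utf-8"))

    info = probe_stream(STREAM_URL)
    for s in info.get("streams", []):
        print(s.get("codec_type"), s.get("codec_name"), s.get("width"), s.get("height"))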

  • Getting realtime output from ffmpeg to be used in progress bar (PyQt4, stdout)

    8 March 2017, by Jason O'Neil

    I’ve looked at a number of questions but still can’t quite figure this out. I’m using PyQt, and am hoping to run ffmpeg -i file.mp4 file.avi and get the output as it streams so I can create a progress bar.

    I've looked at these questions:
    Can ffmpeg show a progress bar?
    http://stackoverflow.com/questions/1606795/catching-stdout-in-realtime-from-subprocess

    I'm able to see the output of an rsync command, using this code:

    import subprocess, time, os, sys

    cmd = "rsync -vaz -P source/ dest/"
    p, line = True, 'start'


    p = subprocess.Popen(cmd,
                        shell=True,
                        bufsize=64,
                        stdin=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        stdout=subprocess.PIPE)

    for line in p.stdout:
       print("OUTPUT>>> " + str(line.rstrip()))
       p.stdout.flush()

    But when I change the command to ffmpeg -i file.mp4 file.avi I receive no output. I’m guessing this has something to do with stdout / output buffering, but I’m stuck as to how to read the line that looks like

    frame=   51 fps= 27 q=31.0 Lsize=     769kB time=2.04 bitrate=3092.8kbits/s

    Which I could use to figure out progress.

    Can someone show me an example of how to get this info from ffmpeg into Python, with or without the use of PyQt (if possible)?


    EDIT:
    I ended up going with jlp's solution; my code looked like this:

    #!/usr/bin/python
    import pexpect

    cmd = 'ffmpeg -i file.MTS file.avi'
    thread = pexpect.spawn(cmd)
    print "started %s" % cmd
    cpl = thread.compile_pattern_list([
       pexpect.EOF,
       "frame= *\d+",
       '(.+)'
    ])
    while True:
       i = thread.expect_list(cpl, timeout=None)
       if i == 0: # EOF
           print "the sub process exited"
           break
       elif i == 1:
           frame_number = thread.match.group(0)
           print frame_number
           thread.close
       elif i == 2:
           #unknown_line = thread.match.group(0)
           #print unknown_line
           pass

    Which gives this output:

    started ffmpeg -i file.MTS file.avi
    frame=   13
    frame=   31
    frame=   48
    frame=   64
    frame=   80
    frame=   97
    frame=  115
    frame=  133
    frame=  152
    frame=  170
    frame=  188
    frame=  205
    frame=  220
    frame=  226
    the sub process exited

    Perfect!
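
    For anyone staying with the standard library, the same progress information can also be read straight from ffmpeg's stderr, which is where the frame=/time= stats are written and refreshed with carriage returns. This is only a rough sketch with placeholder file names, not the pexpect approach used above:

    import re
    import subprocess

    # Placeholder file names; substitute your own input and output paths.
    cmd = ["ffmpeg", "-i", "file.MTS", "file.avi"]

    frame_re = re.compile(r"frame=\s*(\d+)")

    # ffmpeg writes its progress to stderr (not stdout) and refreshes the line
    # with '\r'; universal_newlines gives a text stream in which those '\r'
    # separators come back as ordinary line breaks.
    proc = subprocess.Popen(cmd, stderr=subprocess.PIPE, universal_newlines=True)

    for line in proc.stderr:
        match = frame_re.search(line)
        if match:
            print("frame", match.group(1))

    proc.wait()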

  • Decode android's hardware encoded H264 camera feed using ffmpeg in real time

    31 October 2012, by user971871

    I'm trying to use the hardware H264 encoder on Android to create video from the camera, and use FFmpeg to mux in audio (all on the Android phone itself).

    What I've accomplished so far is packetizing the H264 video into rtsp packets, and decoding it using VLC (over UDP), so I know the video is at least correctly formatted. However, I'm having trouble getting the video data to ffmpeg in a format it can understand.

    I've tried sending the same rtsp packets to port 5006 on localhost (over UDP), then providing ffmpeg with the sdp file that tells it which local port the video stream is coming in on and how to decode the video, if I understand rtsp streaming correctly. However, this doesn't work and I'm having trouble diagnosing why, as ffmpeg just sits there waiting for input.

    For reasons of latency and scalability I can't just send the video and audio to the server and mux it there, it has to be done on the phone, in as lightweight a manner as possible.

    What I guess I'm looking for are suggestions as to how this can be accomplished. The optimal solution would be sending the packetized H264 video to ffmpeg over a pipe, but then I can't send ffmpeg the sdp file parameters it needs to decode the video.

    I can provide more information on request, like how ffmpeg is compiled for Android, but I doubt that's necessary.

    Oh, and the way I start ffmpeg is through the command line; I would really rather avoid mucking about with JNI if that's at all possible.

    Any help would be much appreciated, thanks.
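
    For what it's worth, the SDP route described above usually boils down to handing ffmpeg a small text file describing the RTP session. The sketch below is only an illustration of that idea, not a tested recipe: the SDP fields are assumptions (UDP port 5006, payload type 96, SPS/PPS expected in-band), and recent ffmpeg builds additionally need a -protocol_whitelist option before they will accept SDP input:

    import subprocess

    # Minimal SDP describing an H264 RTP stream arriving on UDP port 5006.
    # Payload type 96 and in-band SPS/PPS are assumptions for this sketch.
    SDP_LINES = [
        "v=0",
        "o=- 0 0 IN IP4 127.0.0.1",
        "s=Android camera",
        "c=IN IP4 127.0.0.1",
        "t=0 0",
        "m=video 5006 RTP/AVP 96",
        "a=rtpmap:96 H264/90000",
        "a=fmtp:96 packetization-mode=1",
    ]

    with open("camera.sdp", "w") as f:
        f.write("\n".join(SDP_LINES) + "\n")

    # ffmpeg then reads the RTP stream described by the SDP and remuxes it to MP4.
    # -protocol_whitelist is required by newer ffmpeg builds when reading SDP files.
    subprocess.call([
        "ffmpeg",
        "-protocol_whitelist", "file,udp,rtp",
        "-i", "camera.sdp",
        "-c", "copy",
        "out.mp4",
    ])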