Advanced search

Media (91)

Other articles (102)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you need to install all of the software dependencies manually on the server.
    If you want to use this archive for a "farm mode" installation, you will also need to make other modifications (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all of the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below to compare.
    To do this, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to improve, for example select[multiple] for multiple-selection lists (...)

On other sites (10424)

  • How can I create a side-by-side horizontally stacked video with a delay in video and audio using ffmpeg-python?

    3 September 2021, by kup

    I am trying to create a video from two videos that will be stacked horizontally, but the resulting video is not what I expected: the same video appears on both sides (left and right). How can this be achieved?

    The ffmpeg command I tried, which kind of worked:

    ffmpeg -i left.mp4 -i right.mp4 -filter_complex "
[0:v]setpts=PTS-STARTPTS[l];
[1:v]setpts=PTS-STARTPTS,tpad=start_duration=14:start_mode=add:color=black[r];
[l][r]hstack=inputs=2[stacked]; [0:a][1:a]amix=inputs=2[a]
" -map "[stacked]" -map "[a]" -c:a aac -preset superfast result.mp4

    The ffmpeg-python script that's not working:

import ffmpeg

in0 = ffmpeg.input('0.mp4')
in1 = ffmpeg.input('right.mkv')
# Delay the second audio track by 5 s (adelay takes milliseconds per channel) and mix both tracks
aout = ffmpeg.filter([in0.audio, in1.audio.filter('adelay', '5000|5000')], 'amix')
# Pad the start of the second video with 5 s of black, then stack the two videos horizontally
vout = ffmpeg.filter([in0.video, in1.video.filter('tpad', start_duration=5, start_mode='add', color='black')], 'hstack')

(
    ffmpeg
    .concat(vout, aout, v=1, a=1)
    .output("out.mkv")
    .run()
)

    This script does not produce the expected output, which is the two videos stacked side by side horizontally.
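
    For reference, here is a minimal sketch of how the working CLI command above might translate to ffmpeg-python, mapping the stacked video and the mixed audio to the output instead of concatenating them (untested; it assumes the same left.mp4 / right.mp4 inputs and keeps the 5-second offset used in the script):

import ffmpeg

left = ffmpeg.input('left.mp4')
right = ffmpeg.input('right.mp4')

# Pad the start of the right-hand video with 5 s of black so it starts later
right_v = right.video.filter('tpad', start_duration=5, start_mode='add', color='black')
# Delay the right-hand audio by the same 5 s (adelay takes milliseconds per channel)
right_a = right.audio.filter('adelay', '5000|5000')

# Stack the two videos horizontally and mix the two audio tracks
v = ffmpeg.filter([left.video, right_v], 'hstack', inputs=2)
a = ffmpeg.filter([left.audio, right_a], 'amix', inputs=2)

# Map both the stacked video and the mixed audio into a single output file,
# mirroring -map "[stacked]" -map "[a]" in the CLI command
ffmpeg.output(v, a, 'result.mp4', acodec='aac').run()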

  • How can I stream a video from localhost to a web browser using the video tag?

    3 September 2021, by kup

    Basically, what I want is to stream a video to my browser, with both running on localhost, using Flask, OpenCV, or FFmpeg.

    But I am not sure how to do it.

    I tried this:

#!/usr/bin/env python
from flask import Flask, Response
import cv2

app = Flask(__name__)

def get_frame():
    # Read the video file frame by frame and re-encode every frame as JPEG
    c = cv2.VideoCapture("output.mkv")

    while True:
        retval, im = c.read()
        if not retval:
            break
        imgencode = cv2.imencode('.jpg', im)[1]
        string_data = imgencode.tobytes()
        # Emit each frame as one part of a multipart (MJPEG-style) response
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + string_data + b'\r\n')

    c.release()

@app.route('/vid')
def vid():
    return Response(get_frame(), mimetype='multipart/x-mixed-replace; boundary=frame')


if __name__ == '__main__':
    app.run(host='localhost', port=5000, debug=True, threaded=True)

    But it's not working, maybe because it is sending individual images. How can I stream video?
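
    For what it's worth, a minimal sketch of one common approach: serve the file itself over HTTP and let the browser's <video> element play it. This assumes a browser-friendly container and codec (for example MP4 with H.264/AAC rather than MKV); the file name output.mp4 and the /video route are only illustrative.

#!/usr/bin/env python
from flask import Flask, send_file

app = Flask(__name__)

@app.route('/video')
def video():
    # Serve the file directly; conditional=True makes Flask honour HTTP Range
    # requests, which browsers use for seeking inside a <video> element.
    return send_file("output.mp4", mimetype="video/mp4", conditional=True)

@app.route('/')
def index():
    # Minimal page with a <video> tag pointing at the streaming endpoint
    return '<video src="/video" controls autoplay muted></video>'

if __name__ == '__main__':
    app.run(host='localhost', port=5000)

    If the source really is the MKV produced earlier, it usually needs to be remuxed or re-encoded to MP4 (H.264/AAC) first, since browsers generally do not play MKV in the <video> tag.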

  • Gstreamer, x264enc "Redistribute latency..." error

    20 August 2021, by Jason

    I'm trying to set up a video pipeline with very limited bandwidth. I was able to do it with two Raspberry Pis using the lines below. The first is for the camera Pi and the second is for watching the stream:

    gst-launch-1.0 rpicamsrc preview=false !  'video/x-h264, width=800, height=600, framerate=30/1' ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink host=YOUR_PC_IP port=5000
gst-launch-1.0 udpsrc port=5000 ! gdpdepay ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=false

    It works, but I go over my bandwidth limit if there is movement. I'm not sure if there is a way to limit the bandwidth by setting a parameter here:

    'video/x-h264, width=800, height=600, framerate=30/1'

    From what I can find online, I have to use something like x264enc. I've followed tutorials, but I can't get x264enc to work: it always outputs "Redistribute latency..." on both machines when run, and then it just stays there.

    I've tried using x264enc as follows:

    gst-launch-1.0 rpicamsrc preview=false !  'video/x-raw, width=800, height=600, framerate=30/1' ! x264enc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink host=YOUR_PC_IP port=5000
gst-launch-1.0 rpicamsrc preview=false !  'video/x-raw, width=800, height=600, framerate=30/1' ! x264enc tune=zerolatency ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink host=YOUR_PC_IP port=5000
gst-launch-1.0 rpicamsrc preview=false ! x264enc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink host=YOUR_PC_IP port=5000
gst-launch-1.0 rpicamsrc preview=false ! x264enc tune=zerolatency ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink host=YOUR_PC_IP port=5000

    Based on tutorials, I would think some of those should work. Other threads say that tune=zerolatency fixes this problem, at least in the ones with the same "Redistribute latency..." output. I don't know what I'm doing wrong.

    Any help would be appreciated. Thanks!
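
    As an untested point of reference (a sketch, not something verified on a Pi): x264enc exposes a bitrate property in kbit/s, and combining it with tune=zerolatency and speed-preset=ultrafast keeps the software encoder from buffering long runs of frames. The "Redistribute latency..." line itself is only an informational message; a pipeline that then sits there is usually stalled by the encoder's default latency settings rather than by an actual error. Something along these lines might cap the bandwidth:

# Sender side: raw frames from the camera, software-encoded with a hard bitrate cap of 500 kbit/s
gst-launch-1.0 rpicamsrc preview=false ! 'video/x-raw, width=800, height=600, framerate=30/1' ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=500 key-int-max=30 ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink host=YOUR_PC_IP port=5000

    Alternatively, since the original working pipeline already takes hardware-encoded H.264 straight out of rpicamsrc, it may be simpler to keep that pipeline and lower rpicamsrc's own bitrate property (specified in bits per second, if memory serves) instead of adding x264enc at all.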