Advanced search

Media (0)

Keyword: - Tags - / single page

No media matching your criteria is available on this site.

Other articles (62)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm relies on running several recurring tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of all the instances of the shared-hosting farm on a regular basis. Combined with a system Cron on the farm's central site, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)

  • Automated installation script of MediaSPIP

    25 April 2011, by

    To overcome the difficulties, which are mainly due to installing server-side software dependencies, an "all-in-one" installation script written in Bash was created to ease this step on a server running a compatible Linux distribution.
    To use it, you must have SSH access to your server and a root account, which the script uses to install the dependencies. Contact your provider if you do not have these.
    Documentation on using this installation script is available here.
    The code of this (...)

  • Automatic backup of SPIP channels

    1 April 2010, by

    When setting up an open platform, it is important for hosts to have reasonably regular backups on hand to deal with any problem that may arise.
    This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (the documents, the elements (...)

On other sites (6102)

  • Python ffmpeg subprocess never exits on Linux, works on Windows

    24 May 2021, by Chris

    I wonder if someone can help explain what is happening?

    


    I run 2 subprocesses, 1 for ffprobe and 1 for ffmpeg.

    


    popen = subprocess.Popen(ffprobecmd, stderr=subprocess.PIPE, shell=True)


    


    And

    


    popen = subprocess.Popen(ffmpegcmd, shell=True, stdout=subprocess.PIPE)


    


    On both Windows and Linux the ffprobe command fires, finishes and disappears from Task Manager/htop. But only on Windows does the same happen to ffmpeg; on Linux the process remains in htop...

    



    


    Can anyone explain what is going on, whether it matters, and how I can stop it from happening, please?

    


    EDIT: Here are the commands...

    


    ffprobecmd = 'ffprobe' + \
' -user_agent "' + request.headers['User-Agent'] + '"' + \
' -headers "Referer: ' + request.headers['Referer'] + '"' + \
' -timeout "5000000"' + \
' -v error -select_streams v -show_entries stream=height -of default=nw=1:nk=1' + \
' -i "' + request.url + '"'


    


    and

    


    ffmpegcmd = 'ffmpeg' + \
' -re' + \
' -user_agent "' + r.headers['User-Agent'] + '"' + \
' -headers "Referer: ' + r.headers['Referer'] + '"' + \
' -timeout "10"' + \
' -i "' + r.url + '"' + \
' -c copy' + \
' -f mpegts' + \
' pipe:'
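
    As a side note, not part of the original question: the same invocations can be written as argument lists, which avoids both the intermediate shell started by shell=True and the hand-rolled quoting of the header values and URL. A sketch for the ffprobe call, reusing the question's request object (the ffmpeg command would be built the same way from r):

import subprocess

# The ffprobe invocation from above as an argument list: no /bin/sh in between,
# and the User-Agent/Referer/URL values need no manual quoting.
ffprobecmd = [
    "ffprobe",
    "-user_agent", request.headers["User-Agent"],
    "-headers", "Referer: " + request.headers["Referer"],
    "-timeout", "5000000",
    "-v", "error",
    "-select_streams", "v",
    "-show_entries", "stream=height",
    "-of", "default=nw=1:nk=1",
    "-i", request.url,
]
popen = subprocess.Popen(ffprobecmd, stderr=subprocess.PIPE)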


    


    EDIT: Here is an example that behaves as described...

    


    import flask
from flask import Response
import subprocess

app = flask.Flask(__name__)

@app.route('/', methods=['GET'])
def go():
    def stream(ffmpegcmd):
        popen = subprocess.Popen(ffmpegcmd, stdout=subprocess.PIPE, shell=True)
        try:
            for stdout_line in iter(popen.stdout.readline, ""):
                yield stdout_line
        except GeneratorExit:
            raise

    url = "https://bitdash-a.akamaihd.net/content/MI201109210084_1/m3u8s/f08e80da-bf1d-4e3d-8899-f0f6155f6efa.m3u8"

    ffmpegcmd = 'ffmpeg' + \
                ' -re' + \
                ' -timeout "10"' + \
                ' -i "' + url + '"' + \
                ' -c copy' + \
                ' -f mpegts' + \
                ' pipe:'
    return Response(stream(ffmpegcmd))

if __name__ == '__main__':
    app.run(host= '0.0.0.0', port=5000)
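
    Two details in this example are worth flagging (observations about the snippet, not claims from the post): popen.stdout is a binary pipe, so iter(popen.stdout.readline, "") compares bytes against the str sentinel "" and never terminates, and the child is never killed or wait()ed when the client disconnects, so an exited ffmpeg can stay visible in htop as a defunct entry. A sketch of a stream() generator that cleans up after itself, assuming the same ffmpegcmd string:

import shlex
import subprocess

def stream(ffmpegcmd):
    # Split the command string so ffmpeg is the direct child (no /bin/sh),
    # then make sure it is terminated and reaped when the generator closes,
    # whether the stream ended normally or the HTTP client went away.
    popen = subprocess.Popen(shlex.split(ffmpegcmd), stdout=subprocess.PIPE)
    try:
        # b"" is the EOF sentinel for a binary pipe ("" would never match).
        for chunk in iter(lambda: popen.stdout.read(4096), b""):
            yield chunk
    finally:
        popen.kill()
        popen.wait()   # reap the child so no defunct process is left behind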


    


  • How to use ffmpeg to record video from SDP?

    17 August 2024, by Thomas Carlton

    I'm running Ubuntu 18.04 and I have the following SDP Offer from Kurento Media Server. The offer is saved to the file a.sdp :

    


        v=0
    o=- 3831476180 3831476180 IN IP4 172.31.46.122
    s=Kurento Media Server
    c=IN IP4 172.31.46.122
    t=0 0
    m=audio 28460 RTP/AVPF 96 0 97
    a=setup:actpass
    a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
    a=rtpmap:96 opus/48000/2
    a=rtpmap:97 AMR/8000
    a=rtcp:28461
    a=sendrecv
    a=mid:audio0
    a=ssrc:1797155263 cname:user1913428254@host-e7ab0454
    m=video 18122 RTP/AVPF 102 103
    a=setup:actpass
    a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
    a=rtpmap:102 VP8/90000
    a=rtpmap:103 H264/90000
    a=fmtp:103 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f
    a=rtcp:18123
    a=sendrecv
    a=mid:video0
    a=rtcp-fb:102 nack
    a=rtcp-fb:102 nack pli
    a=rtcp-fb:102 goog-remb
    a=rtcp-fb:102 ccm fir
    a=rtcp-fb:103 nack
    a=rtcp-fb:103 nack pli
    a=rtcp-fb:103 ccm fir
    a=ssrc:2822275190 cname:user1913428254@host-e7ab0454


    


    Then I call ffmpeg to record this stream to a file:

    


    ffmpeg -max_delay 5000 -reorder_queue_size 16384 -protocol_whitelist file,crypto,udp,rtp -re -i a.sdp -vcodec copy -acodec aac -y output.mp4


    


    172.31.46.122 is the local IP address, and I'm running ffmpeg on the same machine as the SDP offer, so ffmpeg has access to this IP address.

    


    I'm getting the error :

    


    


    bind failed: Address already in use.

    


    a.sdp: Invalid data found when processing input

    


    


    How can I solve this?
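
    For what it's worth (an observation, not something stated in the question): when ffmpeg is given an SDP file as input it binds the RTP/RTCP ports declared in that SDP, and in a Kurento offer those are typically ports Kurento itself has already bound, so running ffmpeg on the same machine can fail with exactly this "Address already in use". A small standard-library sketch to check which of the declared ports are actually free locally:

import re
import socket

def declared_ports(sdp_path):
    # RTP ports from the m= lines plus any explicit a=rtcp: ports.
    with open(sdp_path) as f:
        text = f.read()
    rtp = [int(p) for p in re.findall(r"^m=\w+ (\d+) ", text, re.M)]
    rtcp = [int(p) for p in re.findall(r"^a=rtcp:(\d+)", text, re.M)]
    return rtp + rtcp

for port in declared_ports("a.sdp"):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.bind(("172.31.46.122", port))   # same address as the c= line
        print(f"UDP {port}: free")
    except OSError as exc:
        print(f"UDP {port}: {exc}")          # e.g. Address already in use
    finally:
        sock.close()

    If the ports turn out to be held by the media server, one common way out is to record against an SDP whose ports are free on the recording host (with the sender directed at those ports), or to run the recorder on another machine.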

    


  • Live555 Multicast RTSP Server - Display in FFMPEG

    31 May 2021, by BanB

    I made a multicast RTSP server using the Live555 library.
    It is displayed by VLC, but not by VMS (NX Witness) or FFmpeg.
    If I use unicast, I can display it, but I do not know how to change the file that is being streamed.
    Is there a way to change the streaming file with unicast, or to make the multicast stream viewable in FFmpeg?

    


    This is my code.

    


    


    Header

    


    


#include <iostream>
#include <string>
#include <filesystem>
#include <fstream>
#include <ctime>
#include 
#include 
#include 
#include <opencv2/videoio.hpp>
#include <opencv2/imgcodecs.hpp>

void Start(const char* streamPath, int port);
void AddDataL(BYTE* data, int length);
void AddDataHW(BYTE* data, int height, int width);

void Play();
void AfterPlaying(void*);

void EncodeToH264(cv::Mat image);
long long GetTimeMs();
int GetFileCount();
const char* GetFirstFilePath();

const int fourcc = cv::VideoWriter::fourcc('x', '2', '6', '4');

UsageEnvironment* env;
H264ServerMediaSubsession* h264;
H264VideoStreamFramer* videoSource;
RTPSink* videoSink;

const char* dataPath = "data/";
const char* extension = ".h264";


    CPP


void Start()
{
    TaskScheduler *scheduler = BasicTaskScheduler::createNew();
    env = BasicUsageEnvironment::createNew(*scheduler);

    RTSPServer *rtspServer = RTSPServer::createNew(*env, 8554);

    if (rtspServer == NULL)
        throw;

    struct sockaddr_storage destinationAddress;
    destinationAddress.ss_family = AF_INET;
    ((struct sockaddr_in &)destinationAddress).sin_addr.s_addr = chooseRandomIPv4SSMAddress(*env);

    const unsigned short rtpPortNum = 18888;
    const unsigned short rtcpPortNum = rtpPortNum + 1;
    const unsigned char ttl = 255;

    const Port rtpPort(rtpPortNum);
    const Port rtcpPort(rtcpPortNum);

    Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
    rtpGroupsock.multicastSendOnly();
    Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
    rtcpGroupsock.multicastSendOnly();

    OutPacketBuffer::maxSize = 300000;
    videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

    const unsigned estimatedSessionBandwidth = 5000;
    const unsigned maxCNAMElen = 100;
    unsigned char CNAME[maxCNAMElen + 1];
    gethostname((char *)CNAME, maxCNAMElen);
    CNAME[maxCNAMElen] = '\0';

    RTCPInstance *rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock, estimatedSessionBandwidth, CNAME, videoSink, NULL, True);

    ServerMediaSession *sms = ServerMediaSession::createNew(*env, "live", "live", "Session streamed by \"Test Server\"", True);
    sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
    rtspServer->addServerMediaSession(sms);

    *env << "Start > " << rtspServer->rtspURL(sms) << "\n";

    Play();

    env->taskScheduler().doEventLoop();
}

void AddData(BYTE* data, int length)
{
    EncodeToH264(cv::imdecode(cv::Mat(1, length, CV_8UC3, data), 1));
}

void EncodeToH264(cv::Mat image)
{
    std::string fileName = dataPath + std::to_string(GetTimeMs()) + extension;
    cv::VideoWriter writer(fileName.c_str(), fourcc, 1, image.size());
    writer.write(image);
    writer.write(image);
    writer.release();
}

void Play()
{
    while (GetFileCount() == 0);

    const char* fileName = GetFirstFilePath();

    ByteStreamFileSource* fileSource;
    while ((fileSource = ByteStreamFileSource::createNew(*env, fileName)) == NULL);

    FramedSource* videoES = fileSource;

    videoSource = H264VideoStreamFramer::createNew(*env, videoES);

    videoSink->startPlaying(*videoSource, AfterPlaying, videoSink);
}

void AfterPlaying(void*)
{
    videoSink->stopPlaying();
    Medium::close(videoSource);

    if (GetFileCount() > 1)
        std::filesystem::remove(GetFirstFilePath());

    Play();
}

long long GetTimeMs()
{
    return std::chrono::duration_cast<std::chrono::milliseconds>(std::chrono::system_clock::now().time_since_epoch()).count();
}

int GetFileCount()
{
    return std::distance(std::filesystem::directory_iterator(dataPath), std::filesystem::directory_iterator());
}

const char* GetFirstFilePath()
{
    for (const auto& entry : std::filesystem::directory_iterator(dataPath))
        return entry.path().string().c_str();
}
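
    One thing worth checking when FFmpeg-based clients fail on a multicast session that VLC plays fine (a suggestion, not something confirmed in the post): FFmpeg's RTSP client does not always pick the multicast transport on its own, but it can be forced with -rtsp_transport udp_multicast. A minimal probe sketch in Python, with a hypothetical URL standing in for whatever the server prints after "Start > ":

import subprocess

# Hypothetical URL; substitute the rtsp:// address the server prints at startup.
url = "rtsp://192.168.1.10:8554/live"

# Ask for the multicast transport explicitly; by default ffprobe/ffmpeg try
# unicast UDP (then TCP) when negotiating the RTSP SETUP.
subprocess.run([
    "ffprobe",
    "-rtsp_transport", "udp_multicast",
    "-v", "error",
    "-show_streams",
    url,
])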
