
Other articles (71)

  • Update from version 0.1 to 0.2

    24 June 2013

    Explanation of the various notable changes when moving from version 0.1 of MediaSPIP to version 0.3. What's new?
    At the level of software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe for metadata retrieval; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
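
    For reference, a minimal sketch (not taken from the article) of how metadata can be retrieved with these two tools, assuming a hypothetical input file named input.mp4:

     # Dump container and stream metadata as JSON with ffprobe
     ffprobe -v quiet -print_format json -show_format -show_streams input.mp4

     # Inspect the same file with MediaInfo (plain-text report)
     mediainfo input.mp4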

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Authorisations overridden by plugins

    27 April 2010

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

On other sites (12794)

  • ffmpeg is not executing clip.run() when compiled with pyinstaller

    24 September 2018, by Thriskel

    When executing the .exe file on another machine that doesn't have its requirements installed, running the program gives this error:

    Exception in thread Thread-1:
    Traceback (most recent call last):
     File "threading.py", line 916, in _bootstrap_inner
     File "threading.py", line 864, in run
     File "y2m.py", line 80, in workit
     File "site-packages\ffmpeg\_run.py", line 202, in run
     File "subprocess.py", line 707, in __init__
     File "subprocess.py", line 992, in _execute_child
    FileNotFoundError: [WinError 2] The system cannot find the file specified

    I’m compiling the exe file using the command:

    pyinstaller y2m.py

    I’ve been reading every other post on this and other forums referring to this kind of problem, but I can’t seem to find or understand a way of fixing this.

    The source code is here

    and the warny2m.txt file is here

    Things that I have tried:

    • using --paths=pathToLibFolder
    • using -w
    • using an older pyinstaller version
    • adding python3 to path
    • using -p DIR to add the C:\Python3\Lib location, then its sub-locations, and also individual scripts (those 3 ways, to see if it would work)

    EDIT:

    When using ffmpeg_run.py the code fails; the failing line is clip.run(), where clip represents the input and output file.
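
    One likely cause (an assumption, the question does not confirm it): the FileNotFoundError is raised by subprocess when it tries to launch the external ffmpeg executable, which is neither installed on the other machine nor bundled by PyInstaller. A minimal sketch of a possible fix, assuming a Windows build with an ffmpeg.exe sitting next to y2m.py (the ";" separator in --add-binary is Windows-specific):

     pyinstaller --add-binary "ffmpeg.exe;." y2m.py

    The program would then have to invoke the bundled copy (for example by passing its runtime path to ffmpeg-python's run() via its cmd parameter, if the installed version exposes it) rather than relying on ffmpeg being on the target machine's PATH.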

  • Bash: sort find results using part of a filename

    13 November 2015, by utt50

    I have 3 webcams set up in a building, uploading still images to a webserver. I’m using ffmpeg to encode the jpgs to mp4 video.

    The directories are set up like this:

    Cam1/201504
    Cam1/201505
    Cam2/201504
    Cam2/201505
    Cam3/201504
    Cam3/201505

    I’m using the following bash loop/ffmpeg parameters to make one video per camera, per year. This works well so far (well... except that my SSD is rapidly degrading in performance - too many simultaneous read/write operations?):

    find Cam2/2013* -name "*.jpg" -print0 | xargs -0 cat | ffmpeg -f image2pipe -framerate 30 -vcodec mjpeg -i - -vcodec libx264 -profile:v baseline -level 3.0 -movflags +faststart -crf 19 -pix_fmt yuv420p -r 30 "Cam2-2013-30fps-19crf.mp4"

    The individual files are named like this (confusing ffmpeg’s built-in file sequencer):

    Cam1_2015052413543201.jpg
    Cam1_2015052413544601.jpg
    Cam2_2015052413032601.jpg
    Cam2_2015052413544901.jpg

    I now need to create one video for an entire year across all 3 cameras, ordered by timestamp. To accomplish this, I need to sort the find results by the segment of the filename after the underscore.

    What do I pipe the find output to in order to accomplish this? For example, the files above would be ordered like this:

    Cam2_2015052413032601.jpg
    Cam1_2015052413543201.jpg
    Cam1_2015052413544601.jpg
    Cam2_2015052413544901.jpg

    Any help is very much appreciated!
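
    One possible answer, assuming GNU sort (for the -z, -t and -k options) and the fixed-width, zero-padded timestamps shown above: sort the NUL-delimited find output on the second underscore-separated field before piping it on to xargs/ffmpeg. The output filename below is only illustrative:

     find Cam*/2015* -name "*.jpg" -print0 | sort -z -t_ -k2 | xargs -0 cat | ffmpeg -f image2pipe -framerate 30 -vcodec mjpeg -i - -vcodec libx264 -profile:v baseline -level 3.0 -movflags +faststart -crf 19 -pix_fmt yuv420p -r 30 "AllCams-2015-30fps-19crf.mp4"

    Because the timestamps are all the same width, a plain lexical sort on that field yields chronological order across the three cameras.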

  • displaying a stream of baseline h264 frames in browsers

    6 August 2021, by Thabet Sabha

    So, I have a server that receives a live rtsp stream and generates baseline h264 frames with ffmpeg, which are then sent to the browser over an rtcDataChannel. The frames arrive as intended, but I can't figure out a way to display them in my html5 videoElement.
    Here is a simplified version of my current approach:

    


     const remoteStream = new MediaSource();
     myVideoElement.src = window.URL.createObjectURL(remoteStream);

     // called when remoteStream.readyState === "open"
     let sourceBuffer = remoteStream.addSourceBuffer('video/mp4; codecs="avc1.4d002a"');

     // this gets called whenever a new frame is received from the webrtc data channel.
     function onFrame(frame) {
           sourceBuffer.appendBuffer(new Uint8Array(frame));

           /*
           console.log(frame) ==> <Buffer 00 01 41 9b a0 22 80 a5 d7 42 ea 34 14 85 ba bc 1b f2 71 0d 8b e1 3c 52 d5 8c ef c1 89 10 c5 05 78 ee 1d 03 8d ... 2896 more bytes>
           */
     }


    ffmpeg options:


     [
         "-rtsp_transport", "tcp",
         "-i", `${rtspCamURL}`,
         "-framerate", "15",
         "-c:v", "libx264",
         "-vprofile", "baseline",
         "-b:v", "600k",
         "-bufsize", "600k",
         "-pix_fmt", "yuv420p",
         '-tune', 'zerolatency',
         "-preset", "ultrafast",
         "-f", "rawvideo",
         '-'
     ];


    The ffmpeg stream is then split on the NAL delimiter (to generate individual frames), and each frame is sent over the data channel like so: Buffer.concat([nalDelimiter, frame]).


    I am not sure if I'm missing something, as I'm not getting any helpful errors; for some reason the MediaSource closes as soon as the first frame arrives.


    Or does the media source just not support raw h264 frames, and if so, is there a workaround for this issue (even if it means changing the ffmpeg params)?
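
    A SourceBuffer created with 'video/mp4; ...' generally expects fragmented MP4 (ISO BMFF) segments rather than raw Annex B h264 NAL units, which would explain the MediaSource closing on the first append. One common workaround (a sketch, not the only option) is to have ffmpeg mux the stream into fragmented MP4 instead of rawvideo and append those fragments to the SourceBuffer. A rough command-line equivalent of the options above, with only the output format changed ($RTSP_CAM_URL stands in for rtspCamURL):

     ffmpeg -rtsp_transport tcp -i "$RTSP_CAM_URL" -framerate 15 -c:v libx264 -profile:v baseline -b:v 600k -bufsize 600k -pix_fmt yuv420p -tune zerolatency -preset ultrafast -f mp4 -movflags frag_keyframe+empty_moov+default_base_moof -

    The codecs string passed to addSourceBuffer would also need to match what ffmpeg actually produces; with a baseline profile it would likely be closer to avc1.42E01E than avc1.4d002a.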
