
Media (91)

Other articles (111)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    To do this, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling the use of Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Helping to translate it

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to subscribe to the translators' mailing list to ask for more information.
    At the moment MediaSPIP is only available in French and (...)

  • The plugin: mutualisation management

    2 March 2010

    The Gestion de mutualisation plugin makes it possible to manage the various MediaSPIP channels from a master site. Its purpose is to provide a pure SPIP solution to replace the previous one.
    Basic installation
    Install the SPIP files on the server.
    Then add the "mutualisation" plugin at the root of the site, as described here.
    Customise the central mes_options.php file as you wish. As an example, here is the one from the mediaspip.net platform:
    <?php (...)

On other sites (11791)

  • Is it possible to send a temporary slate (image or video) into a running Azure Live Event RTMP stream?

    15 November 2020, by Brian Frisch

    I'm currently building a video streaming app which leverages Azure Media Services Live Events.

    It consists of:

    1. a mobile app that can stream live video
    2. a web client that plays the live event video
    3. a producer screen with controls to start and stop the web client access to the video
    4. a server that handles various operations around the entire system

    It's working very well, but I would like to add a feature that lets the producer add some elegance to the experience. I'm therefore trying to work out how the producer could switch the incoming source of the stream to a pre-recorded video or even a still image at any point during the recording, and also switch back to live video. A kill switch of some kind: it would cover waiting time if there are technical difficulties on the set, and it could also be used for pre-/post-roll branding slates when introing and outroing a video event. I would like this source switch to be embedded in the video stream, so that it could also end up in the final video product if I need it in an archive for later playback.

    I'm trying to do it in a way where the producer can set a timestamp for when the video override should come in and when it should stop. Then I want my server to respond to these timestamps and send the instructions over RTMP to the Azure Live Event. Is it possible to send such an instruction ("Hey, play this video clip / show this image in the stream for x seconds") in the RTMP protocol? I've tried to figure it out, and I've read about SCTE-35 markers and such, but I have not been able to find any examples of how to do it, so I'm a bit stuck.

    My plan B is to make it possible to stream an image from the mobile application that already handles the live video stream, but I'm initially targeting an architecture where the mobile app is unaware of anything other than live streaming; this override switch should preferably be handled by the server, which is a Firebase Functions setup.

    If you are able to see other ways of doing it, I'm all ears.

    I've already tried to build an ffmpeg method that listens to updates to the producer-set state and then streams an image to the same RTMP URL the video from the mobile app goes to. But it only works when the live video isn't already streaming; it seems I cannot take over an RTMP stream that is already running.

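    For reference, here is a minimal sketch of the ffmpeg approach described above, spawned from a Node/TypeScript server. The ingest URL and slate path are placeholders, and, consistent with the behaviour described above, an RTMP ingest normally accepts only one publisher at a time, so the live feed has to stop before a slate pushed this way can take over.

    // Hedged sketch: push a looping slate image to an RTMP(S) ingest with ffmpeg.
    // Assumes ffmpeg is on PATH; the URL and image path are placeholders.
    import { spawn, ChildProcess } from "child_process";

    export function startSlate(ingestUrl: string, slatePath: string): ChildProcess {
      const args = [
        "-re",                                // pace reading at native frame rate
        "-loop", "1",                         // repeat the still image indefinitely
        "-framerate", "30",
        "-i", slatePath,                      // e.g. "slate.png" (placeholder)
        "-f", "lavfi",
        "-i", "anullsrc=r=44100:cl=stereo",   // silent audio, since ingests often expect an audio track
        "-c:v", "libx264",
        "-tune", "stillimage",
        "-pix_fmt", "yuv420p",
        "-g", "60",                           // keyframe every 2 seconds at 30 fps
        "-c:a", "aac",
        "-f", "flv",
        ingestUrl,                            // e.g. "rtmps://.../live/<token>" (placeholder)
      ];
      return spawn("ffmpeg", args, { stdio: "inherit" });
    }

    // Usage idea: start the slate when the producer flips the override flag,
    // and call slate.kill("SIGINT") when the live encoder should take over again.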

  • How to create a readable file stream from multiple files

    9 September 2020, by lowcrawler

    Related to question: Add specific image files to ffmpeg video with fluent-ffmpeg

    I have an array of file names that reside on network shares and I would like to turn them into a timelapse video with a process on a Node.js server. I plan to use ffmpeg to do this. FFmpeg doesn't allow file lists as input (as far as I can tell). It does, however, allow a 'readable file stream'.

    Is there a mechanism to create a readable file stream from an array of filenames?

    The command they use in the fluent-ffmpeg example is: var command = ffmpeg(fs.createReadStream('/path/to/file.avi'));

    ... but obviously, this is only a single file.

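    One possible direction, sketched here under assumptions (Node 12+, JPEG frames fed to ffmpeg's image2pipe demuxer; the file paths are illustrative): wrap an async generator that reads each file in turn in Readable.from, which yields a single concatenated stream that fluent-ffmpeg can consume.

    // Hedged sketch: one readable stream built from an array of file names.
    import { createReadStream } from "fs";
    import { Readable } from "stream";

    function streamFromFiles(paths: string[]): Readable {
      async function* chunks() {
        for (const p of paths) {
          // Emit every chunk of the current file before moving on to the next one.
          for await (const chunk of createReadStream(p)) {
            yield chunk;
          }
        }
      }
      return Readable.from(chunks());
    }

    // Illustrative use with fluent-ffmpeg for a timelapse from JPEG frames:
    // const command = ffmpeg(streamFromFiles(["/mnt/share/img001.jpg", "/mnt/share/img002.jpg"]))
    //   .inputFormat("image2pipe")
    //   .fps(25)
    //   .save("timelapse.mp4");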

  • Trying to stream to multiple RTMPs with tee throws: Output file #0 does not contain any stream

    27 August 2020, by user33276346

    I'm trying to stream one file to multiple RTMP endpoints. The following command streams OK to one endpoint:

    ffmpeg -re -stream_loop 10 -i input.mp4 -c copy -f flv rtmps://x4t-myamsacc-usct.channel.media.azure.net:2935/live/x4t/x4t

    This one does not:

    ffmpeg -re -i input.mp4 -c copy -f tee "[f=flv]rtmps://x4t-myamsacc-usct.channel.media.azure.net:2935/live/x4t/x4t"

    It throws this error:

    Output file #0 does not contain any stream

    Once I can do it for one, I plan to do it for more. What could I be doing wrong? This is the console log:

    ffmpeg version git-2020-08-24-3477feb Copyright (c) 2000-2020 the FFmpeg developers
      built with gcc 10.2.1 (GCC) 20200805
      configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libgsm --enable-librav1e --enable-libsvtav1 --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
      libavutil      56. 58.100 / 56. 58.100
      libavcodec     58.101.100 / 58.101.100
      libavformat    58. 51.100 / 58. 51.100
      libavdevice    58. 11.101 / 58. 11.101
      libavfilter     7. 87.100 /  7. 87.100
      libswscale      5.  8.100 /  5.  8.100
      libswresample   3.  8.100 /  3.  8.100
      libpostproc    55.  8.100 / 55.  8.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'input.mp4':
      Metadata:
        major_brand     : mp42
        minor_version   : 0
        compatible_brands: isommp42
        creation_time   : 2018-07-13T08:13:55.000000Z
      Duration: 00:02:15.49, start: 0.000000, bitrate: 368 kb/s
        Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, bt709), 1280x720 [SAR 1:1 DAR 16:9], 239 kb/s, 30 fps, 30 tbr, 90k tbn, 60 tbc (default)
        Metadata:
          creation_time   : 2018-07-13T08:13:55.000000Z
          handler_name    : ISO Media file produced by Google Inc. Created on: 07/13/2018.
        Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 125 kb/s (default)
        Metadata:
          creation_time   : 2018-07-13T08:13:55.000000Z
          handler_name    : ISO Media file produced by Google Inc. Created on: 07/13/2018.
    Output #0, tee, to '[f=flv]rtmps://x4t-myamsacc-usct.channel.media.azure.net:2935/live/x4t/x4t':
    Output file #0 does not contain any stream

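    A likely cause, noted here for reference: the tee muxer cannot auto-select output streams, so the output needs explicit -map options. A hedged variant of the failing command (same placeholder URL) would be:

    ffmpeg -re -i input.mp4 -c copy -map 0:v -map 0:a -f tee "[f=flv]rtmps://x4t-myamsacc-usct.channel.media.azure.net:2935/live/x4t/x4t"

    Additional endpoints can then be appended inside the quotes, separated by "|", each with its own [f=flv] prefix.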