Advanced search

Media (0)

No media matching your criteria is available on this site.

Other articles (102)

  • Writing a news item

    21 June 2013

    Present the changes on your MédiaSPIP, or news about your projects, through the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form: For a document of the news type, the fields offered by default are: Publication date (customize the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • Use, discuss, criticize

    13 April 2011

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

On other sites (13733)

  • ffmpeg cannot open connection tcp://a.rtmp.youtube.com

    13 March 2024, by Hiji Deui

    I want to live stream using ffmpeg. When streaming to Facebook it runs normally, but when I stream to YouTube there is an error. Is there anything wrong with the command I entered? The command is the same as the one I use for Facebook; only the RTMP link has been changed.

    ffmpeg -re -i out.mp4 -c:v copy -c:a aac -ar 44100 -ab 128k -ac 2 -strict -2 -flags +global_header -bsf:a aac_adtstoasc -bufsize 3000k -f flv "rtmp://a.rtmp.youtube.com/live2/my-key-streaming"

    and the output is:

    ffmpeg version N-55112-g7eb9cf593e-static https://johnvansickle.com/ffmpeg/  Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 8 (Debian 8.3.0-6)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg
  libavutil      56. 61.100 / 56. 61.100
  libavcodec     58.114.100 / 58.114.100
  libavformat    58. 64.100 / 58. 64.100
  libavdevice    58. 11.103 / 58. 11.103
  libavfilter     7. 91.100 /  7. 91.100
  libswscale      5.  8.100 /  5.  8.100
  libswresample   3.  8.100 /  3.  8.100
  libpostproc    55.  8.100 / 55.  8.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'out.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 0
    compatible_brands: isommp42
    creation_time   : 2020-12-26T11:13:27.000000Z
    com.android.version: 10
  Duration: 00:00:03.27, start: 0.000000, bitrate: 21344 kb/s
    Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, unknown/bt470bg/unknown), 1920x1080, 20225 kb/s, SAR 1:1 DAR 16:9, 29.99 fps, 30.01 tbr, 90k tbn, 180k tbc (default)
    Metadata:
      rotate          : 90
      creation_time   : 2020-12-26T11:13:27.000000Z
      handler_name    : VideoHandle
    Side data:
      displaymatrix: rotation of -90.00 degrees
    Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, mono, fltp, 128 kb/s (default)
    Metadata:
      creation_time   : 2020-12-26T11:13:27.000000Z
      handler_name    : SoundHandle
[tcp @ 0x58bf880] Connection to tcp://a.rtmp.youtube.com:1935 failed: Connection timed out
[rtmp @ 0x5893140] Cannot open connection tcp://a.rtmp.youtube.com:1935
rtmp://a.rtmp.youtube.com/live2/my-key: Connection timed out

    How do I fix this? By the way, I use a VPS. Sorry, my English is not very good, and this is the first time I have asked a question on this website.
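
    Since the log ends with "Connection timed out" rather than an authentication error, a first step is to verify that the VPS can reach YouTube's ingest host on outbound TCP port 1935 at all, as some hosting providers filter RTMP. Below is a minimal Node.js probe for that check; it is only an illustration of the idea, not part of the original question.

    // Hypothetical connectivity probe: open a plain TCP connection to
    // YouTube's RTMP ingest host on port 1935 and report the outcome.
    const net = require("net");

    const socket = net.connect(1935, "a.rtmp.youtube.com", () => {
      console.log("TCP 1935 is reachable -- the problem lies elsewhere");
      socket.end();
    });

    socket.setTimeout(5000, () => {
      console.log("Timed out -- outbound port 1935 is likely blocked on the VPS");
      socket.destroy();
    });

    socket.on("error", (err) => console.log("Connection error:", err.message));

    If the probe times out, opening outbound port 1935 in the VPS firewall, or trying YouTube's RTMPS ingest URL instead of plain RTMP, are the usual remedies; both depend on the hosting environment, which the log alone cannot confirm.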

  • How do I use FFmpeg to fetch audio from a local network and decode it to PCM?

    26 May 2020, by Yousef Alaqra

    Currently, I have a Node.js server connected to a specific IP address on the local network (the source of the audio) to receive the audio using the VBAN protocol. VBAN basically uses UDP to send audio over the local network.

    Node.js implementation:

    http.listen(3000, () => {
  console.log("Server running on port 3000");
});

let PORT = 6980;
let HOST = "192.168.1.244";

io.on("connection", (socket) => {
  console.log("a user connected");
  socket.on("disconnect", () => {
    console.log("user disconnected");
  });
});

io.on("connection", () => {

  let dgram = require("dgram");
  let server = dgram.createSocket("udp4");

  server.on("listening", () => {
    let address = server.address();
    console.log("server host", address.address);
    console.log("server port", address.port);
  });

  server.on("message", function (message, remote) {
    let audioData = vban.ProcessPacket(message);
    io.emit("audio", audioData); // console.log(`Received packet: ${remote.address}:${remote.port}`)
  });
  server.bind({
    address: "192.168.1.230",
    port: PORT,
    exclusive: false,
  });
});

    Once the server receives a packet from the local network, it processes it and then emits the processed data to the client using socket.io.

    An example of the processed audio data that is emitted from the socket and received in the Angular client:

     audio {
       format: {
         channels: 2,
         sampleRate: 44100,
         interleaved: true,
         float: false,
         signed: true,
         bitDepth: 16,
         byteOrder: 'LE'
       },
       sampleRate: 44100,
       buffer: <Buffer 2e 00 ce ff 3d bd 44 b6 48 c3 32 d3 31 d4 30 dd 38 34 e5 1d c6 25 ... 974 more bytes>,
       channels: 2,
     }

    On the client side (Angular), after receiving a packet via socket.io-client, an AudioContext is used to decode the audio and play it:

    playAudio(audioData) {
      let audioCtx = new AudioContext();
      let count = 0;
      let offset = 0;
      let msInput = 1000;
      let msToBuffer = Math.max(50, msInput);
      let bufferX = 0;
      let audioBuffer;
      let prevFormat = {};
      let source;

      if (!audioBuffer || Object.keys(audioData.format).some((key) => prevFormat[key] !== audioData.format[key])) {
        prevFormat = audioData.format;
        bufferX = Math.ceil(((msToBuffer / 1000) * audioData.sampleRate) / audioData.samples);
        if (bufferX < 3) {
          bufferX = 3;
        }
        audioBuffer = audioCtx.createBuffer(audioData.channels, audioData.samples * bufferX, audioData.sampleRate);
        if (source) {
          source.disconnect();
        }
        source = audioCtx.createBufferSource();
        console.log("source", source);
        source.connect(audioCtx.destination);
        source.loop = true;
        source.buffer = audioBuffer;
        source.start();
      }
    }

    Regardless of the fact that the audio isn't playing on the client side and something is clearly wrong, this isn't the correct implementation.
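
    For reference, a more conventional Web Audio approach creates the AudioContext once, copies each incoming chunk into its own AudioBuffer, and schedules the chunks back to back. The sketch below is only an illustration under the assumption that the payload is interleaved signed 16-bit little-endian PCM and that audioData.buffer arrives as an ArrayBuffer; it is not taken from the question or its answers.

    // Hypothetical playback sketch (assumes interleaved s16le PCM chunks).
    const audioCtx = new AudioContext(); // create once, not on every chunk
    let playhead = 0;                    // time at which the next chunk should start

    function playChunk(audioData) {
      const int16 = new Int16Array(audioData.buffer);
      const frames = int16.length / audioData.channels;
      const buf = audioCtx.createBuffer(audioData.channels, frames, audioData.sampleRate);

      // De-interleave and scale 16-bit samples to the [-1, 1] range Web Audio expects.
      for (let ch = 0; ch < audioData.channels; ch++) {
        const out = buf.getChannelData(ch);
        for (let i = 0; i < frames; i++) {
          out[i] = int16[i * audioData.channels + ch] / 32768;
        }
      }

      const src = audioCtx.createBufferSource();
      src.buffer = buf;
      src.connect(audioCtx.destination);

      // Schedule chunks contiguously so playback does not restart on every packet.
      playhead = Math.max(playhead, audioCtx.currentTime);
      src.start(playhead);
      playhead += buf.duration;
    }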

    Brad mentioned in the comments below that I could implement this correctly and with less complexity using an FFmpeg child process, and I'm very interested to know how to fetch the audio locally using FFmpeg.
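
    A rough sketch of that suggestion, spawning ffmpeg from Node.js to read the UDP stream and emit raw PCM on stdout, could look like the following. The input flags are assumptions based on the format object shown above, the io instance is reused from the earlier snippet, and the addresses are simply those from the question. Note that VBAN adds its own header to each packet, so a plain udp:// input may first need that header stripped by a small relay.

    // Hypothetical ffmpeg child-process sketch: read raw PCM from the local
    // UDP source and forward decoded chunks to clients over socket.io.
    const { spawn } = require("child_process");

    const ffmpeg = spawn("ffmpeg", [
      "-f", "s16le",                    // assumed: payload is signed 16-bit LE PCM
      "-ar", "44100",                   // sample rate from the format object above
      "-ac", "2",                       // channel count from the format object above
      "-i", "udp://192.168.1.230:6980", // listen address/port used in the question
      "-f", "s16le",                    // output raw PCM
      "-acodec", "pcm_s16le",
      "pipe:1",                         // write the PCM to stdout
    ]);

    ffmpeg.stdout.on("data", (pcmChunk) => {
      io.emit("audio", pcmChunk);       // forward each decoded chunk to clients
    });

    ffmpeg.stderr.on("data", (line) => console.error(line.toString()));
    ffmpeg.on("close", (code) => console.log("ffmpeg exited with code", code));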
