
Other articles (32)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions beyond the normal behavior are executed: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
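
    As a rough illustration of the two extra actions just described (generic ffprobe/ffmpeg invocations, not MediaSPIP's actual implementation):

    # Hypothetical sketch: read the technical stream info, then grab one frame
    ffprobe -v quiet -show_streams source.mp4
    ffmpeg -i source.mp4 -ss 1 -frames:v 1 thumbnail.jpg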

On other sites (4223)

  • Why is the audio recording on Chrome missing duration

    16 August 2019, by Ivan Sedelkin

    I’m sending some user-created audio to a server, which later passes it to Google’s Speech-to-Text API for transcription. Everything works perfectly on Firefox, but when I try it on Chrome it doesn’t work. I then installed FFmpeg to check whether the metadata somehow changed for the file on Chrome and noticed that the file’s duration is tagged "N/A". This does not happen on Firefox.

    This is the audio recorded from Chrome

    Input #0, matroska,webm, from 'PATH_TO_FILE':
     Metadata:
       encoder         : Chrome
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #0:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)

    This is the audio recorded from Firefox

    Input #0, ogg, from 'PATH_TO_FILE':
     Duration: 00:00:01.26, start: 0.000000, bitrate: 53 kb/s
       Stream #0:0: Audio: opus, 48000 Hz, mono, fltp
       Metadata:
         ENCODER         : Mozilla68.0

    The audio itself is recorded using the MediaRecorder API; the blob is then converted to a Base64 data URL and sent to my server.

    This is the code that I use to record the audio:

    navigator.mediaDevices
      .getUserMedia(
        // constraints - only audio needed for this app
        {
          audio: true
        }
      )

      // Success callback
      .then(function(stream) {
        console.log(navigator.mediaDevices.getSupportedConstraints());
        // Note: sampleRate is not a MediaRecorder option; sample-rate
        // constraints belong in the getUserMedia() audio constraints above.
        var mediaRecorder = new MediaRecorder(stream);
        var chunks = [];

        // Register the handler before recording starts so the final
        // dataavailable event fired by stop() cannot be missed.
        mediaRecorder.ondataavailable = function(e) {
          chunks.push(e.data);
          var blob = new Blob(chunks, { type: "audio/ogg; codecs=opus" });
          var player = document.getElementById("player");
          player.src = URL.createObjectURL(blob);
          chunks = [];
          var reader = new window.FileReader();
          reader.readAsDataURL(blob);
          reader.onloadend = function() {
            var base64 = reader.result;
            var audioArr = {
              audio: base64
            };
            $.ajax({
              url: "http://localhost:4242/api/1.0/post",
              type: "POST",
              contentType: "application/json",
              dataType: "json",
              data: JSON.stringify(audioArr),
              success: function(response) {
                console.log(response);
              },
              error: function(err) {
                console.log(err);
              }
            });
          };
        };

        $(".rec-button")
          .mousedown(function() {
            console.log("rec start");
            $(".rec-button i").addClass("recStart");
            mediaRecorder.start();
            console.log(mediaRecorder.state);
            console.log("recorder started");
          })
          .mouseup(function() {
            console.log("rec end");
            $(".rec-button i").removeClass("recStart");
            mediaRecorder.stop();
            console.log(mediaRecorder.state);
            console.log("recorder stopped");
          });
      })

      // Error callback
      .catch(function(err) {
        console.log("The following getUserMedia error occurred: " + err);
      });

    My goal is for the audio file recorded from Chrome to have a duration so that the Google API can transcribe it. If you have any way of overcoming this problem I would be grateful.
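
    A commonly suggested server-side workaround, shown here only as a sketch (the file paths and the fixDuration helper are hypothetical, not part of the question): remux the uploaded WebM once with ffmpeg, so the container gains the duration header that Chrome's streamed recording omits.

    // Sketch: "-c copy" remuxes without re-encoding the Opus stream;
    // ffmpeg recomputes and writes the duration while rewriting the container.
    const { execFile } = require("child_process");

    function fixDuration(inputPath, outputPath, callback) {
      execFile(
        "ffmpeg",
        ["-y", "-i", inputPath, "-c", "copy", outputPath],
        function(err) {
          callback(err, outputPath);
        }
      );
    }

    Running this on the uploaded file before calling the Speech-to-Text API should produce a container whose duration is no longer "N/A".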

  • HTTP Live video streaming with ffmpeg and Node.js via Socket.IO

    4 April 2024, by dintel

    I can't transmit the webcam to the server and to the website. Before, I was able to transmit the webcam (I don't remember how), but now I only see frames continuously sent in cmd and see nothing at localhost:3000 or 127.0.0.1:3000.
    I installed socket.io and ffmpeg, and added Node.js to my PATH.
    When I try the following, everything works fine over UDP, but getting it to the server and the website is impossible:
    ffplay udp://127.0.0.1:23000
    ffmpeg -f dshow -framerate 30 -video_size 640x480 -i video="Integrated Camera" -vcodec mpeg4 -f mpegts udp://127.0.0.1:23000

    My server.js code:

const http = require('http');
const fs = require('fs');
const server = http.createServer((req, res) => {
  fs.readFile(__dirname + '/index.html', (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end(JSON.stringify(err));
      return;
    }
    // Write the status line once, only after the file read has succeeded.
    res.writeHead(200, {'Content-Type': 'text/html'});
    res.end(data);
  });
});
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  console.log('A user connected');

  socket.on('disconnect', () => {
    console.log('User disconnected');
  });
});

server.listen(3000, () => {
  console.log('Server is running on http://localhost:3000');
});

const spawn = require('child_process').spawn;

const ffmpeg = spawn('ffmpeg', [
  '-f', 'dshow',
  // spawn() does no shell parsing, so the device name must not carry its
  // own quotes; they would be passed to dshow literally.
  '-i', 'video=Integrated Camera', // my webcam device name
  '-vcodec', 'libx264',
  '-preset', 'veryfast',
  '-tune', 'zerolatency',
  '-f', 'hls',
  '-hls_time', '2',
  '-hls_list_size', '6',
  '-hls_flags', 'delete_segments',
  '-hls_segment_filename', 'public/hls/stream%03d.ts',
  'public/hls/index.m3u8'
]);

ffmpeg.stderr.on('data', (data) => {
  console.error(`ffmpeg error: ${data}`);
});
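
One likely gap in the code above is that the HTTP server only ever returns index.html, so the playlist and segments that ffmpeg writes to public/hls are never reachable from the browser. A minimal sketch of serving them (the route handling is an assumption, not from the original post; it reuses the fs module already required above):

const path = require('path');

// Sketch: call this from the createServer callback for URLs under /hls/
// before falling back to index.html.
function serveHls(req, res) {
  const file = path.join(__dirname, 'public', req.url);
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); res.end(); return; }
    const type = file.endsWith('.m3u8')
      ? 'application/vnd.apple.mpegurl'   // HLS playlist
      : 'video/mp2t';                     // MPEG-TS segment
    res.writeHead(200, { 'Content-Type': type });
    res.end(data);
  });
}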

    My index.html:

    <video id="videoPlayer" width="640" height="480" controls="controls"></video>
    <script src="/socket.io/socket.io.js"></script>
    <script>
      const socket = io();

      // The script expects an element with id "videoPlayer"; the original
      // markup was missing the id attribute.
      const video = document.getElementById('videoPlayer');
      const stream = new MediaStream();

      video.srcObject = stream;
      video.play();

      socket.on('stream', (data) => {
        const byteArray = new Uint8Array(data);
        const blob = new Blob([byteArray], { type: 'video/mp4' });
        const url = window.URL.createObjectURL(blob); // unused in the original

        const reader = new FileReader();
        reader.readAsDataURL(blob);
        reader.onloadend = () => {
          video.src = reader.result;
        };
      });
    </script>

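    For comparison, a more conventional way to show the HLS output in the page, sketched here with the third-party hls.js library (an assumption, not part of the original code), is to point a player at the playlist instead of reassembling blobs over the socket:

    <video id="videoPlayer" width="640" height="480" controls></video>
    <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
    <script>
      // Sketch: fetch the /hls/index.m3u8 playlist written by ffmpeg above.
      const video = document.getElementById('videoPlayer');
      if (Hls.isSupported()) {
        const hls = new Hls();
        hls.loadSource('/hls/index.m3u8');
        hls.attachMedia(video);
      } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
        video.src = '/hls/index.m3u8'; // native HLS support (Safari)
      }
    </script>
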
  • Raspberry Pi : Playing multiple video files in mkfifo pipe

    19 January 2017, by user3246167

    I have 2 files, test.mp4 and test2.mp4, that I want to play back to back with no noticeable break in the middle. Currently I’m using

    mkfifo test
    cat test.mp4 > test &
    cat test2.mp4 > test &
    omxplayer test

    However, when I do this, omxplayer just returns data and doesn’t play the file. But if I just put a single file into the pipe, omxplayer shows it normally. I’ve also tried using the copy command in ffmpeg, and that also just returns data, doesn’t play the file.

    I understand that I can just concatenate the 2 files together, but that will not work for my purposes, because I will need to be able to feed files to the pipe while omxplayer is running.
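
    A sketch of one possible direction (an assumption, not a verified fix; it presumes streams omxplayer can decode, e.g. H.264 video): MP4 files cannot be byte-concatenated, because each moov index describes a single file, whereas MPEG-TS is designed to be joined mid-stream. Remuxing each file into the FIFO as MPEG-TS, with a dummy writer holding the pipe open between files, keeps the pipe feedable while the player runs:

    mkfifo test
    # Hold the FIFO open so omxplayer doesn't see EOF between files
    sleep infinity > test &
    omxplayer test &
    # Remux (no re-encode) into MPEG-TS; h264_mp4toannexb converts H.264
    # to the Annex B form that MPEG-TS requires
    ffmpeg -i test.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts - > test
    ffmpeg -i test2.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts - > test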