Advanced search

Media (0)

Keyword: - Tags - /xmlrpc

No media matching your criteria is available on this site.

Other articles (61)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Writing a news item

    21 June 2013, by

    Present changes to your MédiaSPIP, or news about your projects, on your MédiaSPIP using the news section.
    In the default MédiaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the news-item creation form.
    News-item creation form: for a document of the "news" type, the default fields are: Publication date (customize the publication date) (...)

On other sites (7208)

  • How to add audio using ffmpeg when recording video from the browser and streaming to YouTube/Twitch?

    26 July 2021, by Tosh Velaga

    I have a web application I am working on that allows the user to stream video from their browser and simultaneously livestream to both YouTube and Twitch using ffmpeg. The application works fine when I don't need to send any of the audio. Currently I am getting the error below when I try to record video and audio. I am new to using ffmpeg, so any help would be greatly appreciated. Here is my repo if needed: https://github.com/toshvelaga/livestream

    


    Here is my Node.js server with ffmpeg:

    


const child_process = require('child_process') // To be used later for running FFmpeg
const express = require('express')
const http = require('http')
const WebSocketServer = require('ws').Server
const NodeMediaServer = require('node-media-server')
const app = express()
const cors = require('cors')
const path = require('path')
const logger = require('morgan')
require('dotenv').config()

app.use(logger('dev'))
app.use(cors())

app.use(express.json({ limit: '200mb', extended: true }))
app.use(
  express.urlencoded({ limit: '200mb', extended: true, parameterLimit: 50000 })
)

var authRouter = require('./routes/auth')
var compareCodeRouter = require('./routes/compareCode')

app.use('/', authRouter)
app.use('/', compareCodeRouter)

if (process.env.NODE_ENV === 'production') {
  // serve static content
  // npm run build
  app.use(express.static(path.join(__dirname, 'client/build')))

  app.get('*', (req, res) => {
    res.sendFile(path.join(__dirname, 'client/build', 'index.html'))
  })
}

const PORT = process.env.PORT || 8080

app.listen(PORT, () => {
  console.log(`Server is starting on port ${PORT}`)
})

const server = http.createServer(app).listen(3000, () => {
  console.log('Listening on PORT 3000...')
})


const wss = new WebSocketServer({
  server: server,
})

wss.on('connection', (ws, req) => {
  const ffmpeg = child_process.spawn('ffmpeg', [
    // works fine when I use this but when I need audio problems arise
    // '-f',
    // 'lavfi',
    // '-i',
    // 'anullsrc',

    '-i',
    '-',

    '-f',
    'flv',
    '-c',
    'copy',
    `${process.env.TWITCH_STREAM_ADDRESS}`,
    '-f',
    'flv',
    '-c',
    'copy',
    `${process.env.YOUTUBE_STREAM_ADDRESS}`,
    // '-f',
    // 'flv',
    // '-c',
    // 'copy',
    // `${process.env.FACEBOOK_STREAM_ADDRESS}`,
  ])

  ffmpeg.on('close', (code, signal) => {
    console.log(
      'FFmpeg child process closed, code ' + code + ', signal ' + signal
    )
    ws.terminate()
  })

  ffmpeg.stdin.on('error', (e) => {
    console.log('FFmpeg STDIN Error', e)
  })

  ffmpeg.stderr.on('data', (data) => {
    console.log('FFmpeg STDERR:', data.toString())
  })

  ws.on('message', (msg) => {
    console.log('DATA', msg)
    ffmpeg.stdin.write(msg)
  })

  ws.on('close', (e) => {
    console.log('kill: SIGINT')
    ffmpeg.kill('SIGINT')
  })
})

const config = {
  rtmp: {
    port: 1935,
    chunk_size: 60000,
    gop_cache: true,
    ping: 30,
    ping_timeout: 60,
  },
  http: {
    port: 8000,
    allow_origin: '*',
  },
}

var nms = new NodeMediaServer(config)
nms.run()


    


    Here is my frontend code that records the video/audio and sends it to the server:

    


import React, { useState, useEffect, useRef } from 'react'
import Navbar from '../../components/Navbar/Navbar'
import './Dashboard.css'

const CAPTURE_OPTIONS = {
  audio: true,
  video: true,
}

function Dashboard() {
  const [mute, setMute] = useState(false)
  const videoRef = useRef()
  const ws = useRef()
  const mediaStream = useUserMedia(CAPTURE_OPTIONS)

  let liveStream
  let liveStreamRecorder

  if (mediaStream && videoRef.current && !videoRef.current.srcObject) {
    videoRef.current.srcObject = mediaStream
  }

  const handleCanPlay = () => {
    videoRef.current.play()
  }

  useEffect(() => {
    ws.current = new WebSocket(
      window.location.protocol.replace('http', 'ws') +
        '//' + // http: -> ws:, https: -> wss:
        'localhost:3000'
    )

    ws.current.onopen = () => {
      console.log('WebSocket Open')
    }

    return () => {
      ws.current.close()
    }
  }, [])

  const startStream = () => {
    liveStream = videoRef.current.captureStream(30) // 30 FPS
    liveStreamRecorder = new MediaRecorder(liveStream, {
      mimeType: 'video/webm;codecs=h264',
      videoBitsPerSecond: 3 * 1024 * 1024,
    })
    liveStreamRecorder.ondataavailable = (e) => {
      ws.current.send(e.data)
      console.log('send data', e.data)
    }
    // Start recording, and dump data every second
    liveStreamRecorder.start(1000)
  }

  const stopStream = () => {
    liveStreamRecorder.stop()
    ws.current.close()
  }

  const toggleMute = () => {
    setMute(!mute)
  }

  return (
    <>
      <Navbar />
      <div className='main'>
        <div>
          <video ref={videoRef} onCanPlay={handleCanPlay} autoPlay muted={mute} />
        </div>
        <div className='button-container'>
          <button onClick={startStream}>Go Live</button>
          <button onClick={stopStream}>Stop Recording</button>
          <button>Share Screen</button>
          <button onClick={toggleMute}>Mute</button>
        </div>
      </div>
    </>
  )
}

const useUserMedia = (requestedMedia) => {
  const [mediaStream, setMediaStream] = useState(null)

  useEffect(() => {
    async function enableStream() {
      try {
        const stream = await navigator.mediaDevices.getUserMedia(requestedMedia)
        setMediaStream(stream)
      } catch (err) {
        console.log(err)
      }
    }

    if (!mediaStream) {
      enableStream()
    } else {
      return function cleanup() {
        mediaStream.getVideoTracks().forEach((track) => {
          track.stop()
        })
      }
    }
  }, [mediaStream, requestedMedia])

  return mediaStream
}

export default Dashboard
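    The excerpt leaves the question open, but one plausible culprit is the `-c copy` in the ffmpeg arguments: MediaRecorder's WebM output normally carries Opus audio, and the FLV container used by RTMP ingest does not accept Opus, so stream-copying the audio fails once audio is present. A minimal sketch of building the argument list with the audio re-encoded to AAC instead (the helper name and structure are illustrative assumptions, not code from the question):

```javascript
// Sketch only: copy the H.264 video but transcode audio to AAC, since FLV
// does not support the Opus audio MediaRecorder typically produces.
// buildRtmpArgs is a hypothetical helper, not part of the original repo.
function buildRtmpArgs(outputs) {
  const args = ['-i', '-'] // read the browser's WebM chunks from stdin
  for (const url of outputs) {
    args.push(
      '-f', 'flv',
      '-c:v', 'copy', // keep the H.264 video untouched
      '-c:a', 'aac',  // re-encode Opus -> AAC for FLV
      '-b:a', '128k',
      url
    )
  }
  return args
}
```

    An array built this way could replace the literal argument list passed to `child_process.spawn('ffmpeg', ...)` above.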

  • Conflicting headers between ffmpeg and S3

    2 August 2021, by Juliette

    I have the code below in my server.js file for a web app that uses Express for the backend and serves a built Create React App frontend.


require('rootpath')();
const path = require('path');
const express = require('express');
const app = express();
const bodyParser = require('body-parser');
const cookieParser = require('cookie-parser');
const cors = require('cors');

app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.use(cookieParser());

app.use(cors());

app.use(function(req, res, next) {
  res.header("Cross-Origin-Embedder-Policy", "require-corp");
  res.header("Cross-Origin-Opener-Policy", "same-origin");
  next();
});

// Have Node serve the files for our built React app
app.use(express.static(path.resolve(__dirname, 'build')));

// account api routes
app.use('/accounts', require('./accounts/accounts.controller'));

// file api routes
app.use('/files', require('./files/files.controller'));

// All other GET requests not handled before will return our React app
app.get('*', (req, res) => {
    res.sendFile(path.resolve(__dirname, 'build', 'index.html'));
});

// start server
const port = process.env.PORT || 2002;
app.listen(port, () => console.log('Server listening on port ' + port));


    The issue here is that I need this segment of code for my ffmpeg file upload to work; otherwise it throws a SharedArrayBuffer error:


app.use(function(req, res, next) {
  res.header("Cross-Origin-Embedder-Policy", "require-corp");
  res.header("Cross-Origin-Opener-Policy", "same-origin");
  next();
});


    However, when I leave this code in, another part of my program breaks: the part that gets presigned URLs from S3 and plays audio. Whenever I play audio from my S3 bucket, I get this:


ERR_BLOCKED_BY_RESPONSE.NotSameOriginAfterDefaultedToSameOriginByCoep


    The S3 code is this:


function getTemporaryURL({ data }) {

  const customer_id = data['customer-id'];
  const sound_id = data['sound-id'];

  return new Promise((resolve, reject) => {
    // get presigned url

    var myBucket = process.env.NODE_APP_BUCKET_NAME;
    var myKey = "sounds/" + customer_id + "/" + sound_id + ".wav";
    const signedUrlExpireSeconds = 120;
    try {
      const url = s3.getSignedUrl('getObject', {
        Bucket: myBucket,
        Key: myKey,
        ResponseContentDisposition: 'attachment',
        Expires: signedUrlExpireSeconds
      });
      resolve(url)
    }
    catch {
      console.log('S3 Object does not exist');
      resolve('');
    }
  });
}


    How can I modify my server.js to accommodate both of my needs?
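    The excerpt ends with the question unanswered, but the error name hints at the mechanism: under `Cross-Origin-Embedder-Policy: require-corp`, every cross-origin response must either be loaded with CORS or carry a `Cross-Origin-Resource-Policy` header, which the presigned S3 responses do not. One possible workaround is to serve the audio from the same origin by proxying S3 through Express and attaching that header; the helper below only computes the headers such a route would send (the helper and route name are illustrative assumptions, not code from the question):

```javascript
// Sketch only: headers a same-origin media route (or proxy in front of S3)
// could send so that playback works under COEP: require-corp.
// corpHeaders is a hypothetical helper, not from the original server.js.
function corpHeaders(contentType) {
  return {
    'Cross-Origin-Resource-Policy': 'cross-origin',
    'Content-Type': contentType || 'audio/wav',
  };
}

// Possible Express usage (assumed shape; Node 18+ global fetch):
// app.get('/audio-proxy', async (req, res) => {
//   const upstream = await fetch(req.query.url); // the presigned S3 URL
//   res.set(corpHeaders(upstream.headers.get('content-type')));
//   res.send(Buffer.from(await upstream.arrayBuffer()));
// });
```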


  • node.js ffmpeg spawn child_process unexpected data output

    5 September 2021, by PLNR

    I'm rather new to backend stuff, so please excuse me if my question is trivial.
    For an intranet project, I want to present a video element on a web page (React, HLS.js player).
    The video sources are MPEG-TS streams delivered as UDP multicast.
    A small Node.js / Express server should handle the ffmpeg commands to transcode the multicast streams to HLS so they can be displayed in a browser.

    The problem is the output: it is emitted on stderr, even though the process is working as expected.
    Here is the respective code I wrote for transcoding so far:


const express = require("express");
const { spawn, exec } = require("child_process");
const process = require("process")

let ls;

const app = express();

app.get('/cam/:source', (body) => {
    const cam = body.params.source;
    console.log(cam);

    let source = "udp://239.1.1.1:4444";
    let stream = "/var/www/html/streams/tmp/cam1.m3u8"

    stream = spawn("ffmpeg", ["-re", "-i", source, "-c:v", "libx264", "-crf", "21", "-preset", "veryfast", "-c:a", "aac", "-b:a", "128k", "-ac", "2", "-f", "hls", "-hls_list_size", "5", "-hls_flags", "delete_segments", stream], {detached: true});

    stream.stdout.on("data", data => {
        console.log(`stdout: ${data}`);
    });

    stream.stderr.on("data", data => {
        console.log(`stderr: ${data}`);
    });

    stream.on("error", error => {
        console.log(`error: ${error.message}`);
    });

    stream.on("close", code => {
        console.log(`child process exited with code ${code}`);
    });
})

app.listen(5000, ()=> {
    console.log('Listening');
})


    This may be only cosmetic, but it makes me wonder.
    Here is the terminal output:


[nodemon] starting `node server.js`
Listening
camera stream reloaded
stderr: ffmpeg version 4.3.2-0+deb11u1ubuntu1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 10 (Ubuntu 10.2.1-20ubuntu1)

  --shortend--

pid:  4206
stderr: frame=    8 fps=0.0 q=0.0 size=N/A time=00:00:00.46 bitrate=N/A speed=0.931x
pid:  4206
stderr: frame=   21 fps= 21 q=26.0 size=N/A time=00:00:00.96 bitrate=N/A speed=0.95x
pid:  4206
stderr: frame=   33 fps= 22 q=26.0 size=N/A time=00:00:01.49 bitrate=N/A speed=0.982x
pid:  4206
stderr: frame=   46 fps= 23 q=26.0 size=N/A time=00:00:02.00 bitrate=N/A speed=0.989x
pid:  4206
stderr: frame=   58 fps= 23 q=26.0 size=N/A time=00:00:02.49 bitrate=N/A speed=0.986x
pid:  4206


    and so on...

    Any helpful information would be highly appreciated!
    Many thanks in advance.
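    As for why everything lands on stderr: ffmpeg deliberately writes its banner, progress lines, and warnings to stderr, keeping stdout free for piped media data, so the output shown above is normal rather than a failure. If the noise matters, flags like `-nostats` or `-loglevel error` can quiet it. A small sketch of one way to tell routine progress apart from real problems in the stderr handler (the regex heuristic is an assumption of mine, not anything ffmpeg defines):

```javascript
// Sketch only: classify ffmpeg stderr lines so that routine progress
// ("frame= ...") is not mistaken for an error. Heuristic, not an ffmpeg API.
function classifyFfmpegLine(line) {
  if (/^frame=\s*\d+/.test(line.trim())) return 'progress';
  if (/error|invalid|failed/i.test(line)) return 'error';
  return 'info';
}
```

    Such a helper could be used inside the existing `stream.stderr.on("data", ...)` callback to log only the lines classified as errors.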