Advanced search

Media (1)

Keyword: - Tags -/book

Other articles (59)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.

  • Customizing by adding your own logo, banner, or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media": a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image, or text; only one document can be linked to a given "media" article.

On other sites (9322)

  • Problems decoding a gstreamer pipeline into images using node.js

    18 September 2023, by JK2018

    I have this GStreamer pipeline that generates a test video, encodes it as H.264, and sends it via RTP over UDP to localhost port 5000.

    gst-launch-1.0 -v videotestsrc ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! rtph264pay ! udpsink host=127.0.0.1 port=5000

    I run this pipeline in my terminal.

    Then I have a minimal Node.js server that is supposed to receive the UDP stream coming from the GStreamer pipeline and decode the video into images.
Finally, I emit each complete image (not a fragment of an image) over a socket.

    I have tried several approaches, all unsuccessfully.
My first attempt used a Node GStreamer binding to decode the UDP stream and re-encode it as images.

    My second attempt uses the fluent-ffmpeg library.
Here is the code:

    const express = require("express");
const http = require("http");
const socketIO = require("socket.io");
const cors = require("cors");
const dgram = require("dgram");
const { Readable } = require("stream");
const ffmpeg = require("fluent-ffmpeg");

const app = express();
const server = http.createServer(app);

const io = socketIO(server, {
  cors: {
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  },
});

app.use(
  cors({
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  })
);

const udpServer = dgram.createSocket("udp4");

io.on("connection", (socket) => {
  console.log("A client connected");

  socket.on("disconnect", () => {
    console.log("A client disconnected");
  });
});

// Decode a buffer of H.264 data into images.
// fluent-ffmpeg takes a file path or a readable stream as input,
// so the incoming Buffer is wrapped in a Readable.
function decodeH264Video(inputData, callback) {
  const inputStream = Readable.from([inputData]);
  const outputStream = ffmpeg()
    .input(inputStream)
    .inputFormat("h264")
    .toFormat("image2pipe")
    .on("end", () => {
      console.log("Decoding complete");
    })
    .on("error", (err) => {
      console.error("Error decoding video:", err);
    })
    .pipe();

  callback(outputStream);
}

// Collect a decoded image stream into a single Buffer (JPEG data).
function convertVideoFrameToImage(decodedStream, callback) {
  const imageBuffer = [];

  decodedStream.on("data", (chunk) => {
    imageBuffer.push(chunk);
  });

  decodedStream.on("end", () => {
    callback(Buffer.concat(imageBuffer));
  });
}

udpServer.on("message", (message) => {
  // Decode the UDP packet containing H.264 encoded video
  decodeH264Video(message, (decodedVideo) => {
    // Process the decoded video frame and convert it to an image
    convertVideoFrameToImage(decodedVideo, (imageData) => {
      // Send the image data to connected clients
      io.sockets.emit("image", { imageData: imageData.toString("base64") });
    });
  });
});

udpServer.bind(5000);

server.listen(3000, () => {
  console.log("Server is running on port 3000");
});

    Any help is more than welcome.
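For the "complete image, not a fragment" requirement: if ffmpeg is asked to write MJPEG to a pipe (the `image2pipe` format), one way to emit whole images is to scan the output byte stream for the JPEG SOI (FF D8) and EOI (FF D9) markers and only emit a frame once both have arrived. `extractJpegFrames` below is a hypothetical helper, and this naive scan can be fooled by marker-like byte pairs inside entropy-coded data, so treat it as a sketch rather than a robust parser:

```javascript
// Hypothetical helper: split a continuous MJPEG byte stream into
// complete JPEG frames using the SOI (FF D8) / EOI (FF D9) markers.
function extractJpegFrames(buffer) {
  const frames = [];
  let start = -1;
  for (let i = 0; i < buffer.length - 1; i++) {
    if (buffer[i] === 0xff && buffer[i + 1] === 0xd8 && start === -1) {
      start = i; // start of a new image
    } else if (buffer[i] === 0xff && buffer[i + 1] === 0xd9 && start !== -1) {
      frames.push(buffer.slice(start, i + 2)); // EOI marker is inclusive
      start = -1;
    }
  }
  // Keep any trailing partial frame so it can be prepended to the next chunk.
  const rest = start === -1 ? Buffer.alloc(0) : buffer.slice(start);
  return { frames, rest };
}
```

Each element of `frames` is then a complete image that can be base64-encoded and emitted over the socket, while `rest` is carried over and prepended to the next `data` event's chunk.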

  • Raspberry Pi and FFmpeg live streaming video to a backend Node.js server, but how do I deliver it to the front end?

    2 August 2023, by qwet142

    I'm attempting to live stream video from a Raspberry Pi to a publicly available website. From the backend Node.js/Express server, I would like to serve it to the front end with minimal latency for many viewers (the front end will be hosted somewhere like Netlify to handle distribution).

    WebRTC is too complex for my use case and timeframe. HLS appears to have too high a latency, as I need under 4 seconds consistently. I would like something compatible with Node.js. I am a beginner in this domain, familiar only with general programming and web development.

  • How to set H.264 NVENC encoding parameters to output a high-quality video stream?

    8 February 2021, by fredirty2017

    • I use the NVENC (nvEncodeAPI) hardware encoder for H.264 and then stream out over RTMP. When I pull the stream with ffplay, the picture is visibly distorted: the green lines look sharper, and text colors become lighter and blurred. It does not improve even when I raise the video bitrate to 3 Mbps (720p). I know H.264 encoding is lossy and some information is discarded.

    • I then tried adjusting the NV_ENC_CONFIG_H264_VUI_PARAMETERS: setting the color space to "pc, bt709" brings some minor improvement. Maybe the color-space settings have more potential? For NVENC encoding, is there a setting scheme that gives higher video clarity? How much room for technical improvement is there?
      Based on the code: https://github.com/obsproject/obs-studio/blob/master/plugins/obs-ffmpeg/obs-ffmpeg-nvenc.c