Advanced search

Media (1)

Keyword: - Tags -/artwork

Other articles (73)

  • Managing object creation and editing rights

    8 February 2011, by

    By default, many features are restricted to administrators, but each remains independently configurable so that the minimum status required to use it can be changed, in particular: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Uploading media and themes via FTP

    31 May 2013, by

    The MediaSPIP tool also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
    From the start, you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the site's cache directory; themes/: custom themes and stylesheets; tmp/: working folder (...)
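    As a rough illustration of that workflow (the host, credentials and the local my-theme folder below are hypothetical placeholders), uploading a custom theme with a command-line client such as lftp could look like:

    lftp -u user,password -e "mirror -R my-theme themes/my-theme; bye" ftp.example.org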

On other sites (7291)

  • ts video via UDP from ffmpeg to gstreamer [closed]

    23 January 2024, by aron.h

    Hardware: Jetson AGX ORIN
Software: Jetpack 5.0.2

    


    I have been attempting to send a video file locally via UDP using ffmpeg:

    


    ffmpeg -stream_loop -1 -re -i test.ts -map 0 -c copy -preset ultrafast -f mpegts "udp://127.0.0.1:5000"

    


    And receiving the same stream via UDP using gstreamer:

    


    gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264 ! rtph264depay ! decodebin ! videoconvert ! aasink

    


    But I get an error on the receiving gstreamer end:

    


    /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: Could not decode stream.
Additional debug info:
gstrtpbasedepayload.c(505): gst_rtp_base_depayload_handle_buffer (): 
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
Received invalid RTP payload, dropping
ERROR: from element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
The stream is in the wrong format.
Additional debug info:
gstrtph264depay.c(1298): gst_rtp_h264_depay_process ():
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
NAL unit type 27 not supported yet


    


    More detailed information on the video file:

    


    Original ID: 1002
Codec: H264 - MPEG-4 AVC (part 10) (h264)
Type: Video
Video resolution: 1920x1080
Buffer dimensions: 1920x1088
Frame rate: 30
Decoded format: 
Orientation: Top left
Chroma location: left


    


    When I listen with the command gst-launch-1.0 -v udpsrc port=5000 ! fakesink dump=1, it is quite apparent that the packets from FFMPEG are being received.
I am not sure why gstreamer's rtph264depay says the stream is in the wrong format.
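
    For comparison, since -f mpegts sends a raw MPEG transport stream rather than RTP, a receiver that demuxes the transport stream directly would presumably look something like the untested sketch below (assuming the tsdemux, h264parse and avdec_h264 elements are available on this Jetson build):

    gst-launch-1.0 udpsrc port=5000 caps="video/mpegts,systemstream=(boolean)true,packetsize=(int)188" ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! aasink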

    


    Would I have to check some details on the FFMPEG side?
This is the information FFMPEG shows by default while running:

    


    Input #0, mpegts, from 'test.ts':
  Duration: 00:00:57.36, start: 20902.827056, bitrate: 2504 kb/s
  Program 1
    Stream #0:0[0x3ea]: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
Output #0, mpegts, to 'udp://127.0.0.1:5000':
  Metadata:
    encoder         : Lavf58.29.100
    Stream #0:0: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 29.97 fps, 29.97 tbr, 90k tbn, 90k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame=  611 fps= 30 q=-1.0 Lsize=    5847kB time=00:00:20.62 bitrate=2323.0kbits/s speed=   1x
video:5350kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 9.301388%
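
    Alternatively, if the receiving side has to stay RTP-based, my understanding is that ffmpeg itself would need to packetize the video into RTP, roughly along these untested lines (single video stream only; dynamic payload type 96 assumed):

    ffmpeg -re -stream_loop -1 -i test.ts -map 0:v -c copy -f rtp "rtp://127.0.0.1:5000"

    with the udpsrc caps adjusted to something like application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96.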


    


    Any advice would be appreciated.

    


  • Problems decoding a gstreamer pipeline into images using node.js

    18 September 2023, by JK2018

    I have this GStreamer pipeline that generates a test video, encodes it in H.264 and sends it via UDP to localhost port 5000.

    


    gst-launch-1.0 -v videotestsrc ! videoconvert ! x264enc tune=zerolatency ! h264parse ! queue ! rtph264pay ! udpsink host=127.0.0.1 port=5000


    


    I run this pipeline in my terminal.

    


    Then I have a minimalistic node.js server that is supposed to receive the UDP stream coming from the GStreamer pipeline and decode the video into images.
Finally, I emit each image (not a fragment of an image) over a socket.

    


    I have tried several approaches, unsuccessfully.
My first attempt used a Node GStreamer library, letting GStreamer decode the UDP stream and re-encode it as images.

    


    My second attempt uses the fluent-ffmpeg library.
Here is the code below:

    


    const express = require("express");
const http = require("http");
const socketIO = require("socket.io");
const cors = require("cors");
const app = express();
const server = http.createServer(app);
const dgram = require("dgram");
const ffmpeg = require("fluent-ffmpeg");

const io = socketIO(server, {
  cors: {
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  },
});

app.use(
  cors({
    origin: "http://localhost:3001",
    methods: ["GET", "POST"],
  })
);

const udpServer = dgram.createSocket("udp4");

io.on("connection", (socket) => {
  console.log("A client connected");

  socket.on("disconnect", () => {
    console.log("A client disconnected");
  });
});

// Function to decode an H.264 video frame
function decodeH264Video(inputData, callback) {
  const command = ffmpeg()
    .input(inputData)
    .inputFormat("h264")
    .inputOptions(["-c:v h264"])
    .toFormat("image2")
    .on("end", () => {
      console.log("Decoding complete");
    })
    .on("error", (err) => {
      console.error("Error decoding video:", err);
    })
    .pipe();

  callback(command);
}

// Function to convert a decoded video frame to an image (JPEG format)
function convertVideoFrameToImage(decodedData, callback) {
  const imageStream = decodedData.pipe();
  const imageBuffer = [];

  imageStream.on("data", (chunk) => {
    imageBuffer.push(chunk);
  });

  imageStream.on("end", () => {
    const imageData = Buffer.concat(imageBuffer);
    callback(imageData);
  });
}

udpServer.on("message", (message) => {
  // Decode the UDP packet containing H.264 encoded video
  decodeH264Video(message, (decodedVideo) => {
    // Process the decoded video frame and convert it to an image
    convertVideoFrameToImage(decodedVideo, (imageData) => {
      // Send the image data to connected clients
      io.sockets.emit("image", { imageData: imageData.toString("base64") });
    });
  });
});

udpServer.bind(5000);

server.listen(3000, () => {
  console.log("Server is running on port 3000");
});
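
    A different direction I have been considering (a rough, untested sketch: it assumes a stream.sdp file describing the H.264/RTP stream on port 5000, and it reuses the io instance created above) is to keep one long-lived ffmpeg child process that decodes the stream to MJPEG on stdout, and to split that byte stream into complete JPEG frames:

const { spawn } = require("child_process");

// One long-lived ffmpeg process: read the RTP stream described by stream.sdp
// (hypothetical SDP file for the H.264/RTP stream on port 5000) and write a
// continuous MJPEG stream to stdout.
const ffmpegProc = spawn("ffmpeg", [
  "-protocol_whitelist", "file,udp,rtp",
  "-i", "stream.sdp",
  "-f", "image2pipe",
  "-vcodec", "mjpeg",
  "-q:v", "5",
  "pipe:1",
]);

const SOI = Buffer.from([0xff, 0xd8]); // JPEG start-of-image marker
const EOI = Buffer.from([0xff, 0xd9]); // JPEG end-of-image marker
let pending = Buffer.alloc(0);

ffmpegProc.stdout.on("data", (chunk) => {
  pending = Buffer.concat([pending, chunk]);
  // Emit every complete JPEG currently sitting in the buffer.
  let start, end;
  while ((start = pending.indexOf(SOI)) !== -1 &&
         (end = pending.indexOf(EOI, start + 2)) !== -1) {
    const frame = pending.subarray(start, end + 2);
    pending = pending.subarray(end + 2);
    io.sockets.emit("image", { imageData: frame.toString("base64") });
  }
});

ffmpegProc.stderr.on("data", () => {
  // ffmpeg writes its progress/log output to stderr; ignore it here.
});

    As far as I understand, each UDP datagram here is a single RTP packet, usually only a fragment of an H.264 frame, so decoding each datagram on its own cannot produce a full image; a single decoder running over the whole stream avoids that problem.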


    


    Any help is more than welcome

    


  • FFmpeg zoompan animation results in zig-zag pattern [closed]

    2 February 2024, by kregus

    Please assist, as I am pulling my hair out!

    


    The goal is for a user to specify a zoomPoint in a video as follows:

    


    {
   "scale":4,        # Scale factor, here x4
   "offset":0,       # Start zooming in after 0ms
   "duration":5000,  # Stay zoomed in for 5000ms
   "marginX":80,     # Zoom x to 80% of video width 
   "marginY":10      # Zoom y to 10% of video height
}


    


    I am running the following simplified FFmpeg command:

    


    ffmpeg -i "/tmp/input.mp4" -c:v libx264 -r 30 -vf "scale=iw*2:ih*2,zoompan=fps=30:z='if(gte(it,0.0)*lt(it,5.0), min(pzoom+0.2, 4),if(gte(it,5.0)*lt(it,5.5), max(pzoom-0.2, 1), 1))':d=1:s=1512x982:x='(iw - iw/4)*(80/100)':y='(ih - ih/4)*(10/100)'" /tmp/output.mp4


    


    The animation duration is 0.5s, the video framerate is 30fps, and so there are 0.5 / (1/30) = 15 animation frames.

    


    Zoom distance is scale - 1 = 3 in this case, which makes the zoom increment 3 / 15 = 0.2.

    


    This results in the following example video: click here

    


    While the zoom animation ends in the correct position, you will notice it arrives at that position in a zig-zag pattern, where it starts by zooming into the top-right corner, before changing direction towards the correct position.

    


    I cannot seem to figure out how to get it to animate the zoom in a straight line to the specified x/y position.
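
    One detail I have not verified yet: the x and y expressions above are written against the final scale factor (iw/4, ih/4), whereas zoompan also exposes the per-frame zoom value as the variable zoom. Anchoring the crop window on that variable instead, as in the untested variant below, might keep the pan on a straight line while the zoom ramps:

    ffmpeg -i "/tmp/input.mp4" -c:v libx264 -r 30 -vf "scale=iw*2:ih*2,zoompan=fps=30:z='if(gte(it,0.0)*lt(it,5.0), min(pzoom+0.2, 4),if(gte(it,5.0)*lt(it,5.5), max(pzoom-0.2, 1), 1))':d=1:s=1512x982:x='(iw - iw/zoom)*(80/100)':y='(ih - ih/zoom)*(10/100)'" /tmp/output.mp4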

    


    Any tips are welcome, thanks!