
Media (2)

- Core Media Video
4 April 2013, by
Updated: June 2013
Language: French
Type: Video

- Video d’abeille en portrait
14 May 2011, by
Updated: February 2012
Language: French
Type: Video

Other articles (109)

- Les autorisations surchargées par les plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

- Contribute to a better visual interface
13 April 2011
MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

- XMP PHP
13 May 2011, by
According to Wikipedia, XMP means:
Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it handles a set of dynamic tags for use in the Semantic Web.
XMP makes it possible to record, in the form of an XML document, information about a file: title, author, history (...)

On other sites (10775)

- How to receive a byte-stream using gstreamer with the python subprocess module or the gst-launch-1.0 command?
21 April 2022, by yuniversi

I want to receive a byte-stream using GStreamer with Python's subprocess module.
Right now I can successfully pull the byte-stream with ffmpeg, as shown below.


import cv2
import numpy as np          # needed for np.frombuffer below
import subprocess as sp


height = 714
width = 420
rtsp_url = 'rtsp://127.0.0.1:8554/video'

# ffmpeg command: decode the RTSP stream and write raw BGR frames to stdout
command = ['ffmpeg',
           '-i', rtsp_url,
           '-f', 'rawvideo',
           '-s', '{}x{}'.format(width, height),   # frame size as WIDTHxHEIGHT
           '-pix_fmt', 'bgr24',
           '-fflags', 'nobuffer',
           '-']

p = sp.Popen(command, stdout=sp.PIPE, bufsize=10**8)

while True:
    raw_image = p.stdout.read(width * height * 3)    # one frame's worth of bytes
    image = np.frombuffer(raw_image, dtype='uint8')  # np.fromstring is deprecated
    image = image.reshape((height, width, 3)).copy()
    cv2.imshow('image', image)
    key = cv2.waitKey(20)



I want to use a GStreamer command instead of ffmpeg. So far, I have managed to write the byte-stream to a file using the gst-launch-1.0 command line.


gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/video latency=0 drop-on-latency=true ! rtph264depay ! video/x-h264, stream-format='byte-stream' ! filesink location=/home/name/stdout



But that command writes to a file rather than to a pipe, so the terminal doesn't show the byte-stream the way the ffmpeg command does. How can I change this command to output the byte-stream through a pipe so that I can read it from the pipe?
Thank you for taking the time to answer!
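
One possible approach (a minimal, untested sketch): GStreamer's fdsink element writes to a file descriptor (fd=1 is stdout), and the -q option keeps gst-launch-1.0's own status messages out of that stream, so the filesink at the end of the pipeline can be replaced with fdsink and the pipe read from Python much like the ffmpeg version above.


import subprocess as sp

rtsp_url = 'rtsp://127.0.0.1:8554/video'

# -q suppresses gst-launch-1.0's status output on stdout; fdsink fd=1 sends the
# depayloaded H.264 byte-stream to stdout instead of to a file.
command = ['gst-launch-1.0', '-q',
           'rtspsrc', 'location=' + rtsp_url, 'latency=0', 'drop-on-latency=true',
           '!', 'rtph264depay',
           '!', 'video/x-h264,stream-format=byte-stream',
           '!', 'fdsink', 'fd=1']

p = sp.Popen(command, stdout=sp.PIPE)

while True:
    chunk = p.stdout.read(4096)   # still H.264 byte-stream, not decoded BGR frames
    if not chunk:
        break
    # hand each chunk to a parser/decoder here


Note that this pipeline outputs the H.264 byte-stream itself; to read decoded BGR frames as in the ffmpeg example, the pipeline would also have to decode and convert (for instance h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR, assuming the gst-libav plugins are installed) before fdsink.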


This is the RTSP streaming (sender) code.


import cv2
import time
import subprocess as sp
import numpy as np


rtsp_url = 'rtsp://127.0.0.1:8554/video'
video_path = r'test.mp4'
cap = cv2.VideoCapture(video_path)

# Get video information
fps = int(cap.get(cv2.CAP_PROP_FPS))
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print('fps={}'.format(fps))

# ffmpeg command: read raw BGR frames from stdin and publish them over RTSP
command = ['ffmpeg',
           '-re',
           '-y',
           '-stream_loop', '-1',
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(width, height),
           '-r', str(fps),
           '-i', '-',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           # '-flags2', 'local_header',
           '-bsf:v', 'dump_extra=freq=k',   # no extra quotes: Popen passes argv directly, without a shell
           '-keyint_min', '60',
           '-g', '60',
           '-sc_threshold', '0',
           '-f', 'rtsp',
           '-rtsp_transport', 'tcp',
           '-muxdelay', '0.1',
           rtsp_url]

p = sp.Popen(command, stdin=sp.PIPE)

cnt = 0
t_start = time.time()
while cap.isOpened():
    t_cur = time.time() - t_start

    ret, frame = cap.read()
    if not ret:
        # end of file: reopen the video so it loops forever
        cnt += 1
        print("count: {}".format(cnt))
        cap = cv2.VideoCapture(video_path)
        continue

    p.stdin.write(frame.tobytes())

    cv2.imshow('real_time', frame)

    key = cv2.waitKey(20)
    if key == 27:  # Esc stops the stream
        p.terminate()
        break



- Sending a file using axios with the PassThrough stream module in Node.js
12 August 2022, by Harikrushna Patel

Imports


const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
const FfmpegCommand = require('fluent-ffmpeg');
const fs = require('fs');
const path = require('path');
const streamNode = require('stream');
const FormData = require('form-data');
const axios = require('axios').default;



Code here


async function audios() {
  let stream = fs.createReadStream(path.join(__dirname, '../videos/video.mp4'));
  let writeStream = fs.createWriteStream(path.join(__dirname, '../response/audios/' + +new Date() + '.wav'));
  let pass = new streamNode.PassThrough();
  let outputFile = path.join(__dirname, '../response/audios/' + +new Date() + '.wav');
  const ffmpeg = FfmpegCommand(stream); // was FfmpegCommand(file); "file" is undefined, so feed the read stream created above

  ffmpeg
    .setFfmpegPath(ffmpegPath)
    .format('mp4')    // format() and toFormat() both set the *output* format, so the later call (wav) wins
    .toFormat('wav')
    .on('end', function () {
      console.log('file has been converted successfully');
    })
    .on('error', function (err, stdout, stderr) {
      console.log('an error happened: ' + err.message);
      console.log('ffmpeg stdout: ' + stdout);
      console.log('ffmpeg stderr: ' + stderr);
    })
    .on('end', function () {
      console.log('Processing finished!');
    })
    .stream(pass, { end: false });

  var bodyFormData = new FormData();
  bodyFormData.append('file', pass);
  let headers = bodyFormData.getHeaders();

  try {
    const jdata = await axios.post('http://localhost:4080/video', bodyFormData, {
      maxContentLength: Infinity,
      maxBodyLength: Infinity,
      validateStatus: (status) => true,
      headers: headers
    });
    console.log(jdata.data);
  } catch (error) {
    console.log("error", error.message);
  }
}



I am getting errors when sending the PassThrough stream through FormData.
The issue is that ffmpeg does not give me a readable stream directly, so I created a PassThrough stream from it and passed that to FormData, but it is not working at the moment.


- ffmpeg module saving not working after compression
27 August 2022, by nickcoding2

I'm just trying to save an mp4 to a different mp4 (before I even start playing around with the different compression settings). What exactly is going wrong here?


const ffmpeg = require('ffmpeg');

try {
  var process = new ffmpeg('./original.mp4');
  process.then(function (video) {
    video.save('./new.mp4', function (error, file) {
      if (!error) {
        console.log('Video file: ' + file);
      } else {
        console.log(error);
      }
    });
  }, function (err) {
    console.log('Error: ' + err);
  });
} catch (e) {
  console.log(e.code);
  console.log(e.msg);
}



I get the following error:


Error: Command failed: ffmpeg -i ./original.mp4 ./new.mp4
/bin/sh: ffmpeg: command not found

 at ChildProcess.exithandler (child_process.js:390:12)
 at ChildProcess.emit (events.js:400:28)
 at maybeClose (internal/child_process.js:1055:16)
 at Process.ChildProcess._handle.onexit (internal/child_process.js:288:5) {
 killed: false,
 code: 127,
 signal: null,
 cmd: 'ffmpeg -i ./original.mp4 ./new.mp4'
}