
Advanced search
Other articles (8)
-
Support for all media types
10 April 2011 — Unlike many modern document-sharing platforms and other software, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others); audio (MP3, Ogg, Wav and others); video (AVI, MP4, OGV, mpg, mov, wmv and others); or textual content, code and other data (OpenOffice, Microsoft Office (spreadsheets, presentations), web (HTML, CSS), LaTeX, Google Earth) (...)
-
Supporting all media types
13 April 2011 — Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)
-
Contribute to a better visual interface
13 April 2011 — MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.
On other sites (4054)
-
avformat/matroskaenc: support writing Dynamic HDR10+ packet side data
19 March 2023, by James Almer — avformat/matroskaenc: support writing Dynamic HDR10+ packet side data
Signed-off-by: James Almer <jamrial@gmail.com>
- [DH] libavformat/matroskaenc.c
- [DH] tests/ref/fate/aac-autobsf-adtstoasc
- [DH] tests/ref/fate/matroska-avoid-negative-ts
- [DH] tests/ref/fate/matroska-dovi-write-config7
- [DH] tests/ref/fate/matroska-dovi-write-config8
- [DH] tests/ref/fate/matroska-dvbsub-remux
- [DH] tests/ref/fate/matroska-encoding-delay
- [DH] tests/ref/fate/matroska-flac-extradata-update
- [DH] tests/ref/fate/matroska-h264-remux
- [DH] tests/ref/fate/matroska-mastering-display-metadata
- [DH] tests/ref/fate/matroska-move-cues-to-front
- [DH] tests/ref/fate/matroska-mpegts-remux
- [DH] tests/ref/fate/matroska-ms-mode
- [DH] tests/ref/fate/matroska-ogg-opus-remux
- [DH] tests/ref/fate/matroska-opus-remux
- [DH] tests/ref/fate/matroska-pgs-remux
- [DH] tests/ref/fate/matroska-pgs-remux-durations
- [DH] tests/ref/fate/matroska-qt-mode
- [DH] tests/ref/fate/matroska-spherical-mono-remux
- [DH] tests/ref/fate/matroska-vp8-alpha-remux
- [DH] tests/ref/fate/matroska-zero-length-block
- [DH] tests/ref/fate/rgb24-mkv
- [DH] tests/ref/fate/shortest-sub
- [DH] tests/ref/lavf-fate/av1.mkv
- [DH] tests/ref/lavf/mka
- [DH] tests/ref/lavf/mkv
- [DH] tests/ref/lavf/mkv_attachment
- [DH] tests/ref/seek/lavf-mkv
-
Can build & make video call with pjsip and ffmpeg
10 May 2023, by QViet — I am trying to build PJSIP with ffmpeg using this config:


I followed these steps:


- build the needed libraries and place them in the thirdparty folder named ffmpeg
- set up the library and header links
- run the build with "$configure --with-ffmpeg=""
- add the following to config_site.h:

#define PJMEDIA_HAS_OPENH264_CODEC 1
#define PJMEDIA_HAS_VIDEO 1
#define PJMEDIA_VIDEO_DEV_HAS_FFMPEG 1
#define PJMEDIA_HAS_FFMPEG_VID_CODEC 1
#define PJMEDIA_HAS_FFMPEG 1
#define PJMEDIA_HAS_FFMPEG_CODEC_H264 1
#define PJMEDIA_HAS_LIBAVDEVICE 1





I found I have to enable PJMEDIA_HAS_OPENH264_CODEC; without it the build succeeds, but importing fails with this error:


Undefined symbol: _WelsCreateDecoder



because the missing Wels* symbols live in the openh264 library, which was not linked.


The build eventually succeeds with the above config, but in this call:


pj_status_t status = pjsua_vid_enum_codecs(videoCodecInfo, &videoCodecCount);



the codec info shows only one codec, "H264/97", which is the OpenH264 codec; I can't see ffmpeg here.
When I start a call as normal, the log shows OpenH264 initializing and the camera opening.


What steps do I need to take to use ffmpeg? I can't find any docs about it.


Can you help me?


**This is the call log:**


2023-04-24 10:17:21.522976+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.523 [SIPSample -[SIPSample startEndpointWithEndpointConfiguration:error:]:272] Creating new PSUASIP Endpoint instance.
10:17:21.525 os_core_unix.c !pjlib 2.13-dev for POSIX initialized
10:17:21.526 sip_endpoint.c .Creating endpoint instance...
10:17:21.527 pjlib .select() I/O Queue created (0x1050a32c8)
10:17:21.527 sip_endpoint.c .Module "mod-msg-print" registered
10:17:21.527 sip_transport.c .Transport manager created.
10:17:21.527 pjsua_core.c .PJSUA state changed: NULL --> CREATED
2023-04-24 10:17:21.528077+0700 PSUAKitSample[83000:15642817] 10:17:21.528 sip_endpoint.c .Module "mod-pjsua-log" registered

2023-04-24 10:17:21.528204+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.528 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-PSUA-log" registered
2023-04-24 10:17:21.529375+0700 PSUAKitSample[83000:15642817] 10:17:21.529 sip_endpoint.c .Module "mod-tsx-layer" registered

2023-04-24 10:17:21.529477+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.529 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-tsx-layer" registered
2023-04-24 10:17:21.529491+0700 PSUAKitSample[83000:15642817] 10:17:21.529 sip_endpoint.c .Module "mod-stateful-util" registered

2023-04-24 10:17:21.529592+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-stateful-util" registered
2023-04-24 10:17:21.529895+0700 PSUAKitSample[83000:15642817] 10:17:21.529 sip_endpoint.c .Module "mod-ua" registered

2023-04-24 10:17:21.530024+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-ua" registered
2023-04-24 10:17:21.530068+0700 PSUAKitSample[83000:15642817] 10:17:21.530 sip_endpoint.c .Module "mod-100rel" registered

2023-04-24 10:17:21.530181+0700 PSUAKitSample[83000:15642817] 10:17:21.530 sip_endpoint.c .Module "mod-pjsua" registered

2023-04-24 10:17:21.530217+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-100rel" registered
2023-04-24 10:17:21.530283+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-PSUA" registered
2023-04-24 10:17:21.530865+0700 PSUAKitSample[83000:15642817] 10:17:21.530 sip_endpoint.c .Module "mod-invite" registered

2023-04-24 10:17:21.530970+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.531 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-invite" registered
2023-04-24 10:17:21.677206+0700 PSUAKitSample[83000:15642817] 10:17:21.677 coreaudio_dev.c .. dev_id 0: iPhone IO device (in=1, out=1) 8000Hz

2023-04-24 10:17:21.677497+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.677 [SIPSample void logCallBack(int, const char *, int):1034] coreaudio_dev.c .. dev_id 0: iPhone IO device (in=1, out=1) 8000Hz
2023-04-24 10:17:21.677588+0700 PSUAKitSample[83000:15642817] 10:17:21.677 coreaudio_dev.c ..core audio initialized

2023-04-24 10:17:21.677804+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.678 [SIPSample void logCallBack(int, const char *, int):1034] coreaudio_dev.c ..core audio initialized
2023-04-24 10:17:21.678538+0700 PSUAKitSample[83000:15642817] 10:17:21.678 pjlib ..select() I/O Queue created (0x1060684a8)

2023-04-24 10:17:21.678743+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.679 [SIPSample void logCallBack(int, const char *, int):1034] pjlib ..select() I/O Queue created (0x1060684a8)
2023-04-24 10:17:21.683380+0700 PSUAKitSample[83000:15642817] 10:17:21.683 pjsua_vid.c ..Initializing video subsystem..

2023-04-24 10:17:21.683585+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.683 [SIPSample void logCallBack(int, const char *, int):1034] PSUA_vid.c ..Initializing video subsystem..
2023-04-24 10:17:21.684058+0700 PSUAKitSample[83000:15642817] 10:17:21.684 vid_conf.c ...Created video conference bridge with 32 ports

2023-04-24 10:17:21.684260+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.684 [SIPSample void logCallBack(int, const char *, int):1034] vid_conf.c ...Created video conference bridge with 32 ports
2023-04-24 10:17:21.684983+0700 PSUAKitSample[83000:15642817] 10:17:21.684 openh264.cpp ...OpenH264 codec initialized

2023-04-24 10:17:21.685168+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.685 [SIPSample void logCallBack(int, const char *, int):1034] openh264.cpp ...OpenH264 codec initialized
2023-04-24 10:17:21.685237+0700 PSUAKitSample[83000:15642817] 10:17:21.685 opengl_dev.c ...OpenGL device initialized

2023-04-24 10:17:21.685370+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.685 [SIPSample void logCallBack(int, const char *, int):1034] opengl_dev.c ...OpenGL device initialized
2023-04-24 10:17:21.715616+0700 PSUAKitSample[83000:15642817] 10:17:21.715 darwin_dev.m ...Darwin video initialized with 5 devices:

2023-04-24 10:17:21.715796+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034] darwin_dev.m ...Darwin video initialized with 5 devices:
2023-04-24 10:17:21.715812+0700 PSUAKitSample[83000:15642817] 10:17:21.715 darwin_dev.m ... 0: [Renderer] iOS - UIView

2023-04-24 10:17:21.715917+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034] darwin_dev.m ... 0: [Renderer] iOS - UIView
2023-04-24 10:17:21.715921+0700 PSUAKitSample[83000:15642817] 10:17:21.715 darwin_dev.m ... 1: [Capturer] AVF - Front Camera

2023-04-24 10:17:21.716006+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034] darwin_dev.m ... 1: [Capturer] AVF - Front Camera
2023-04-24 10:17:21.716033+0700 PSUAKitSample[83000:15642817] 10:17:21.716 darwin_dev.m ... 2: [Capturer] AVF - Back Camera

2023-04-24 10:17:21.716137+0700 PSUAKitSample[83000:15642817] 10:17:21.716 darwin_dev.m ... 3: [Capturer] AVF - Back Dual Camera

2023-04-24 10:17:21.716152+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034] darwin_dev.m ... 2: [Capturer] AVF - Back Camera
2023-04-24 10:17:21.716218+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034] darwin_dev.m ... 3: [Capturer] AVF - Back Dual Camera
2023-04-24 10:17:21.716247+0700 PSUAKitSample[83000:15642817] 10:17:21.716 darwin_dev.m ... 4: [Capturer] AVF - Back Telephoto Camera

2023-04-24 10:17:21.716375+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034] darwin_dev.m ... 4: [Capturer] AVF - Back Telephoto Camera
2023-04-24 10:17:21.716409+0700 PSUAKitSample[83000:15642817] 10:17:21.716 colorbar_dev.c ...Colorbar video src initialized with 2 device(s):

2023-04-24 10:17:21.716673+0700 PSUAKitSample[83000:15642817] 10:17:21.716 colorbar_dev.c ... 0: Colorbar generator

2023-04-24 10:17:21.716764+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.717 [SIPSample void logCallBack(int, const char *, int):1034] colorbar_dev.c ...Colorbar video src initialized with 2 device(s):
2023-04-24 10:17:21.716918+0700 PSUAKitSample[83000:15642817] 10:17:21.716 colorbar_dev.c ... 1: Colorbar-active

2023-04-24 10:17:21.716938+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.717 [SIPSample void logCallBack(int, const char *, int):1034] colorbar_dev.c ... 0: Colorbar generator
2023-04-24 10:17:21.717192+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.717 [SIPSample void logCallBack(int, const char *, int):1034] colorbar_dev.c ... 1: Colorbar-active
2023-04-24 10:17:21.717528+0700 PSUAKitSample[83000:15642817] 10:17:21.717 sip_endpoint.c .Module "mod-evsub" registered

2023-04-24 10:17:21.717645+0700 PSUAKitSample[83000:15642975] 💚 DEBUG 10:17:21.718 [SIPSample void logCallBack(int, const char *, int):1034] sip_endpoint.c .Module "mod-evsub" registered
2023-04-24 10:17:21.717710+0700 PSUAKitSample[83000:15642817] 10:17:21.717 sip_endpoint.c .Module "mod-presence" registered



-
How do you use Node.js to stream an MP4 file with ffmpeg ?
27 April 2023, by LaserJesus — I've been trying to solve this problem for several days now and would really appreciate any help on the subject.



I'm able to successfully stream an mp4 audio file stored on a Node.js server using fluent-ffmpeg by passing the location of the file as a string and transcoding it to mp3. If I create a file stream from the same file and pass that to fluent-ffmpeg instead, it works for an mp3 input file, but not for an mp4 file. In the mp4 case no error is thrown and it claims the stream completed successfully, but nothing plays in the browser. I'm guessing this has to do with the metadata being stored at the end of an mp4 file, but I don't know how to code around it. It's the exact same file that works correctly when its location is passed to ffmpeg rather than the stream. When I try to pass a stream to the mp4 file on s3, again no error is thrown, but nothing streams to the browser. This isn't surprising: ffmpeg won't work with the file locally as a stream, so expecting it to handle the stream from s3 is wishful thinking.
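If the metadata (the moov atom) really is at the end of the file, a common fix is to rewrite the file once with ffmpeg's `-movflags +faststart` option, which relocates the moov atom to the front without re-encoding. A minimal sketch, with placeholder file names (this requires the ffmpeg binary to be installed):

```shell
# Remux only (-c copy): no transcoding, just move the moov atom to the
# front so the file can start playing before it has fully downloaded.
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
```

After this one-time rewrite, the resulting file can usually be piped as a stream, because the demuxer no longer needs to seek to the end for the metadata.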



How can I stream the mp4 file from s3 without first storing it locally as a file? And how do I get ffmpeg to do this without transcoding the file? The following is the code I have at the moment, which isn't working. Note that it attempts to pass the s3 file as a stream to ffmpeg, and it also transcodes it into an mp3, which I'd prefer not to do.



.get(function (req, res) {
  aws.s3(s3Bucket).getFile(s3Path, function (err, result) {
    if (err) {
      return next(err);
    }
    var proc = new ffmpeg(result)
      .withAudioCodec('libmp3lame')
      .format('mp3')
      .on('error', function (err, stdout, stderr) {
        console.log('an error happened: ' + err.message);
        console.log('ffmpeg stdout: ' + stdout);
        console.log('ffmpeg stderr: ' + stderr);
      })
      .on('end', function () {
        console.log('Processing finished!');
      })
      .on('progress', function (progress) {
        console.log('Processing: ' + progress.percent + '% done');
      })
      .pipe(res, {end: true});
  });
});




This is using the knox library when it calls aws.s3... I've also tried writing it using the standard AWS SDK for Node.js, as shown below, but I get the same outcome as above.



var AWS = require('aws-sdk');

var s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_KEY,
  region: process.env.AWS_REGION_ID
});
var fileStream = s3.getObject({
  Bucket: s3Bucket,
  Key: s3Key
}).createReadStream();
var proc = new ffmpeg(fileStream)
  .withAudioCodec('libmp3lame')
  .format('mp3')
  .on('error', function (err, stdout, stderr) {
    console.log('an error happened: ' + err.message);
    console.log('ffmpeg stdout: ' + stdout);
    console.log('ffmpeg stderr: ' + stderr);
  })
  .on('end', function () {
    console.log('Processing finished!');
  })
  .on('progress', function (progress) {
    console.log('Processing: ' + progress.percent + '% done');
  })
  .pipe(res, {end: true});




=====================================



Updated



I placed an mp3 file in the same s3 bucket and the code I have here worked and was able to stream the file through to the browser without storing a local copy. So the streaming issues I face have something to do with the mp4/aac container/encoder format.



I'm still interested in a way to bring the m4a file down from s3 to the Node.js server in its entirety and then pass it to ffmpeg for streaming, without actually storing the file in the local file system.



=====================================



Updated Again



I've managed to get the server streaming the file, as mp4, straight to the browser. This half answers my original question. My only issue now is that I have to download the file to a local store first, before I can stream it. I'd still like to find a way to stream from s3 without needing the temporary file.



aws.s3(s3Bucket).getFile(s3Path, function (err, result) {
  result.pipe(fs.createWriteStream(file_location));
  result.on('end', function () {
    console.log('File Downloaded!');
    var proc = new ffmpeg(file_location)
      .outputOptions(['-movflags isml+frag_keyframe'])
      .toFormat('mp4')
      .withAudioCodec('copy')
      .seekInput(offset)
      .on('error', function (err, stdout, stderr) {
        console.log('an error happened: ' + err.message);
        console.log('ffmpeg stdout: ' + stdout);
        console.log('ffmpeg stderr: ' + stderr);
      })
      .on('end', function () {
        console.log('Processing finished!');
      })
      .on('progress', function (progress) {
        console.log('Processing: ' + progress.percent + '% done');
      })
      .pipe(res, {end: true});
  });
});




On the receiving side I just have the following JavaScript in an empty HTML page:



window.AudioContext = window.AudioContext || window.webkitAudioContext;
context = new AudioContext();

function process(Data) {
  source = context.createBufferSource(); // Create Sound Source
  context.decodeAudioData(Data, function (buffer) {
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(context.currentTime);
  });
};

function loadSound() {
  var request = new XMLHttpRequest();
  request.open("GET", "/stream/", true);
  request.responseType = "arraybuffer";

  request.onload = function () {
    var Data = request.response;
    process(Data);
  };

  request.send();
};

loadSound();




=====================================



The Answer



The code above under the title 'Updated Again' will stream an mp4 file from s3, via a Node.js server, to a browser without using Flash. It does require that the file be stored temporarily on the Node.js server, so that the metadata in the file can be moved from the end of the file to the front.

To stream without storing the temporary file, you need to modify the file on S3 first and make this metadata change there. If you have changed the file in this way on S3, you can then modify the code under 'Updated Again' so that the result from S3 is piped straight into the ffmpeg constructor, rather than into a file stream on the Node.js server with that file location then provided to ffmpeg, as the code does now. You can change the final 'pipe' command to 'save(location)' to get a version of the mp4 file locally with the metadata moved to the front, then upload that new version to S3 and try out the end-to-end streaming.

Personally, I'm now going to create a task that modifies the files in this way as they are uploaded to s3 in the first place. This allows me to record and stream in mp4 without transcoding or storing a temp file on the Node.js server.
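To sanity-check whether a given mp4 already has its metadata at the front before deciding to rewrite it, a small stdlib-only Node.js sketch can scan the file's top-level boxes. The helper names here are mine, and 64-bit box sizes are deliberately not handled:

```javascript
// Sketch: detect whether an MP4 buffer is "faststart", i.e. the moov box
// (the metadata) comes before the mdat box (the media data).
// ISO BMFF top-level box layout: 4-byte big-endian size, then 4-byte ASCII type.
// Caveat: 64-bit sizes (size === 1) and size === 0 ("extends to EOF") stop the scan.
function topLevelBoxTypes(buf) {
  var types = [];
  var off = 0;
  while (off + 8 <= buf.length) {
    var size = buf.readUInt32BE(off);
    types.push(buf.toString('ascii', off + 4, off + 8));
    if (size < 8) break; // malformed or unsupported size: stop scanning
    off += size;
  }
  return types;
}

function isFaststart(buf) {
  var types = topLevelBoxTypes(buf);
  var moov = types.indexOf('moov');
  var mdat = types.indexOf('mdat');
  return moov !== -1 && (mdat === -1 || moov < mdat);
}
```

Since a faststart file has moov near the beginning, reading only the first few kilobytes of the S3 object is typically enough to run this check.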