
On other sites (10313)
-
node.js ffmpeg spawn child_process unexpected data output
5 September 2021, by PLNR
I'm rather new to backend stuff, so please excuse me if my question is trivial.

For an intranet project, I want to present a video element in a webpage (React, HLS.js player).

The video sources are MPEG-TS streams delivered as UDP multicast.

A small node.js / express server should handle the ffmpeg commands to transcode the multicast streams to HLS for display in a browser.



The problem is the output:

The output is emitted on stderr... even though the process is working as expected.

Here is the code I have written for transcoding so far:

const express = require("express");
const { spawn } = require("child_process");

const app = express();

app.get('/cam/:source', (req, res) => {
 const cam = req.params.source;
 console.log(cam);

 const source = "udp://239.1.1.1:4444";
 const playlist = "/var/www/html/streams/tmp/cam1.m3u8";

 // Spawn ffmpeg to transcode the UDP multicast stream to HLS.
 const ffmpeg = spawn("ffmpeg", [
  "-re", "-i", source,
  "-c:v", "libx264", "-crf", "21", "-preset", "veryfast",
  "-c:a", "aac", "-b:a", "128k", "-ac", "2",
  "-f", "hls", "-hls_list_size", "5", "-hls_flags", "delete_segments",
  playlist
 ], { detached: true });

 ffmpeg.stdout.on("data", data => {
  console.log(`stdout: ${data}`);
 });

 ffmpeg.stderr.on("data", data => {
  console.log(`stderr: ${data}`);
 });

 ffmpeg.on("error", error => {
  console.log(`error: ${error.message}`);
 });

 ffmpeg.on("close", code => {
  console.log(`child process exited with code ${code}`);
 });

 res.sendStatus(200);
})

app.listen(5000, () => {
 console.log('Listening');
})



This may be only cosmetics, but it makes me wonder.

Here is the terminal output:

[nodemon] starting `node server.js`
Listening
camera stream reloaded
stderr: ffmpeg version 4.3.2-0+deb11u1ubuntu1 Copyright (c) 2000-2021 the FFmpeg developers
 built with gcc 10 (Ubuntu 10.2.1-20ubuntu1)

 --shortened--


pid: 4206
stderr: frame= 8 fps=0.0 q=0.0 size=N/A time=00:00:00.46 bitrate=N/A speed=0.931x 
pid: 4206
stderr: frame= 21 fps= 21 q=26.0 size=N/A time=00:00:00.96 bitrate=N/A speed=0.95x 
pid: 4206
stderr: frame= 33 fps= 22 q=26.0 size=N/A time=00:00:01.49 bitrate=N/A speed=0.982x 
pid: 4206
stderr: frame= 46 fps= 23 q=26.0 size=N/A time=00:00:02.00 bitrate=N/A speed=0.989x 
pid: 4206
stderr: frame= 58 fps= 23 q=26.0 size=N/A time=00:00:02.49 bitrate=N/A speed=0.986x 
pid: 4206



and so on...



Any helpful information would be highly appreciated!

Many thanks in advance
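The behaviour described above is expected: ffmpeg reserves stdout for media data (so output can be piped), and writes its banner, stream info, and progress statistics to stderr. One way to avoid treating those lines as errors is to classify them before logging. A minimal sketch, assuming ffmpeg's default `frame= ... fps= ...` progress format; both helper functions (`isProgressLine`, `routeFfmpegStderr`) are illustrative, not part of any library:

```javascript
// ffmpeg keeps stdout free for piped media data, so all diagnostics
// (banner, codec info, progress) arrive on stderr even on success.

// Hypothetical helper: true for ffmpeg's default progress lines.
function isProgressLine(line) {
  return /^frame=\s*\d+/.test(line.trim());
}

// Hypothetical helper: route a chunk of ffmpeg stderr to a normal logger
// instead of reporting it as an error.
function routeFfmpegStderr(chunk, info = console.log) {
  for (const line of chunk.toString().split("\n")) {
    if (line.trim() === "") continue;       // skip blank fragments
    if (isProgressLine(line)) {
      info(`progress: ${line.trim()}`);     // periodic statistics, not an error
    } else {
      info(`ffmpeg: ${line.trim()}`);       // banner / stream info, still normal
    }
  }
}

// Usage with the spawned process:
//   ffmpegProcess.stderr.on("data", chunk => routeFfmpegStderr(chunk));
```

If only real failures matter, ffmpeg's verbosity can also be reduced at the source with a flag such as `-loglevel error`, leaving the exit code in the `close` handler as the success signal.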

-
avfilter/x86/vf_blend: use unaligned movs for output
19 January 2022, by Marton Balint -
How to manipulate files with ffmpeg on Android 31: permission denied
17 September 2021, by Omid.N
I am trying to use FFmpeg in my Android app, so I want to test whether it works before moving on. I use an external library: github link

The code looks like this:

package net.omidn.aslanmediaconverter;

import android.net.Uri;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;

import androidx.appcompat.app.AppCompatActivity;

import com.arthenica.ffmpegkit.ExecuteCallback;
import com.arthenica.ffmpegkit.FFmpegKit;
import com.arthenica.ffmpegkit.FFmpegSession;
import com.arthenica.ffmpegkit.Session;

import java.io.File;


public class MainActivity extends AppCompatActivity {

    private static final String TAG = "MainActivity";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        TextView textView = (TextView) findViewById(R.id.text_view);

        File inFile = new File("/storage/emulated/0/video_2021-05-29_17-50-20.mp4");
        String inputName = Uri.fromFile(inFile).toString();
        Log.d(TAG, inputName);
        Log.d(TAG, "file exists : " + inFile.exists());
        Log.d(TAG, "file canRead : " + inFile.canRead());

        // Transcode the input file to MPEG-4 video asynchronously.
        FFmpegSession fFmpegSession = FFmpegKit.executeAsync(
                "-i file:///storage/emulated/0/video_2021-05-29_17-50-20.mp4 -c:v mpeg4 file:///storage/emulated/0/out.mp4",
                new ExecuteCallback() {
                    @Override
                    public void apply(Session session) {
                    }
                });

        // Crude wait so the session has time to run (for testing only).
        try {
            Thread.sleep(5000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        textView.setText(fFmpegSession.getState().name() + " " + fFmpegSession.getOutput());
    }
}




As you can see, I pass the files with the file:/// protocol. If I don't use it, the result is the same. The three Log.d(...) lines print:

2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file:///storage/emulated/0/video_2021-05-29_17-50-20.mp4
2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file exists : true
2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file canRead : false



The video file has read access on the storage:
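A note on the failure mode: on Android 10 and later, scoped storage restricts direct java.io.File access to shared storage paths such as /storage/emulated/0, and canRead() returning false typically means the READ_EXTERNAL_STORAGE permission has not been granted at runtime (on API 23+ a manifest entry alone is not enough). A sketch of the manifest entries involved, as an assumption about this app's setup rather than a confirmed fix:

```xml
<!-- Sketch only: storage declarations for an app reading shared media. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="net.omidn.aslanmediaconverter">

    <!-- Must also be requested at runtime on API 23+. -->
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

    <application
        android:requestLegacyExternalStorage="true">
        <!-- requestLegacyExternalStorage only affects Android 10 (API 29);
             it is ignored from Android 11 (API 30) onward. -->
    </application>
</manifest>
```

Alternatively, ffmpeg-kit documents a Storage Access Framework helper, FFmpegKitConfig.getSafParameterForRead(...), which turns a content:// URI obtained from the system file picker into a parameter ffmpeg can read without raw file-path access.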