
Media (1)
-
ED-ME-5 1-DVD
11 October 2011
Updated: October 2011
Language: English
Type: Audio
Other articles (34)
-
From upload to the final video [standalone version]
31 January 2010. The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information from the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
-
HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
Libraries and binaries specific to video and audio processing
31 January 2010. The following software and libraries are used by SPIPmotion in one way or another.
Required binaries:
- FFMpeg: the main encoder; transcodes almost all types of video and audio files into formats playable on the Internet. See this tutorial for its installation.
- Oggz-tools: tools for inspecting ogg files.
- Mediainfo: retrieves information from most video and audio formats.
Complementary and optional binaries:
- flvtool2: (...)
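To illustrate the role these binaries play, a few typical invocations follow. These are hypothetical examples, not the exact commands SPIPmotion runs, and the file names are made up:

```shell
# Inspect the audio/video streams of a source file (printed to stderr)
ffmpeg -i source.avi
# Transcode to a web-playable format
ffmpeg -i source.avi -c:v libx264 -c:a aac output.mp4
# Extract a single frame as a thumbnail
ffmpeg -i source.avi -ss 00:00:01 -frames:v 1 thumb.jpg
# Dump technical metadata with Mediainfo
mediainfo source.avi
```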
On other sites (5974)
-
Can only save animation in matplotlib with default parameters
20 August 2019, by J.Doe. I keep getting this error when trying to save my animations in matplotlib:
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
plt.rcParams['animation.ffmpeg_path'] = 'C:\FFmpeg\bin'
fig, ax = plt.subplots()
ax.set_xlim(-0.1 ,2*np.pi + 0.1)
ax.set_ylim(-1.1 ,1.1)
ln, = plt.plot([], [], '-')
x = np.linspace(0,2*np.pi,1000)
def update(frame):
    y = frame*np.sin(x)
    ln.set_data(x,y)
    return ln,

ani = FuncAnimation(fig,
                    update,
                    frames=np.linspace(1,-1,1000),
                    interval=1000/144)
ani.save('lol.gif')

MovieWriter ffmpeg unavailable. Trying to use pillow instead.
This is a repetition of an unanswered question: UserWarning: MovieWriter ffmpeg unavailable
I tried running a sample code from here, but it still says ffmpeg isn't available, even though I installed and activated it according to wikiHow. So even setting the path to the binary doesn't seem to work.
I can't set the fps, the dpi, or anything else, since the save function just falls back to its defaults. What can I do so that Python finally starts using ffmpeg?
-
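For what it's worth, this warning usually means matplotlib cannot execute the configured path. A hedged sketch of the two most likely fixes follows; the C:\FFmpeg\bin location comes from the question, while the ffmpeg.exe file name is an assumption about the install:

```python
import shutil

# Two likely problems with the original snippet:
#  1. 'animation.ffmpeg_path' must point at the ffmpeg *executable*,
#     not the bin directory that contains it.
#  2. In a plain string literal, '\b' is the backspace escape, so a
#     path ending in '\bin' is silently mangled; use a raw string.
plain = 'C:\\FFmpeg\bin'           # ends in a backspace character, not "\bin"
raw = r'C:\FFmpeg\bin\ffmpeg.exe'  # raw string, and names the executable

# Hypothetical fix (commented out so this sketch stays self-contained):
# import matplotlib
# matplotlib.rcParams['animation.ffmpeg_path'] = raw

# If ffmpeg is on PATH, shutil.which can locate it portably:
ffmpeg_on_path = shutil.which('ffmpeg')  # None if not installed
```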
How to get output from ffmpeg process in c#
13 July 2018, by Anirudha Gupta. In the code I wrote in WPF, I run some filters in FFmpeg. If I run the command in a terminal (PowerShell or cmd prompt), it gives me line-by-line information about what's going on.
I am calling the process from C# code and it works fine. The problem with my code is that I am not able to get any output from the process I run.
I have tried some answers from Stack Overflow for the FFmpeg process. I see two options in my code: I can either fix it with a Timer approach or hook an event to OutputDataReceived.
I tried the OutputDataReceived event, but my code never got it working. I tried the Timer approach, but it's still not hitting my code. Please check the code below:
_process = new Process
{
    StartInfo = new ProcessStartInfo
    {
        FileName = ffmpeg,
        Arguments = arguments,
        UseShellExecute = false,
        RedirectStandardOutput = true,
        RedirectStandardError = true,
        CreateNoWindow = true,
    },
    EnableRaisingEvents = true
};
_process.OutputDataReceived += Proc_OutputDataReceived;
_process.Exited += (a, b) =>
{
    System.Threading.Tasks.Task.Run(() =>
    {
        System.Threading.Tasks.Task.Delay(5000);
        System.IO.File.Delete(newName);
    });
    //System.IO.File.Delete()
};
_process.Start();
_timer = new Timer();
_timer.Interval = 500;
_timer.Start();
_timer.Tick += Timer_Tick;
}

private void Timer_Tick(object sender, EventArgs e)
{
    while (_process.StandardOutput.EndOfStream)
    {
        string line = _process.StandardOutput.ReadLine();
    }
    // Check the process.
} -
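Two things commonly bite here, sketched below under the question's own variable names (ffmpeg and arguments are assumed to be defined as in the original code): FFmpeg writes its progress log to standard error, not standard output, so OutputDataReceived stays silent; and neither event fires until the corresponding Begin...ReadLine() call is made after Start().

```csharp
_process = new Process
{
    StartInfo = new ProcessStartInfo
    {
        FileName = ffmpeg,
        Arguments = arguments,
        UseShellExecute = false,
        RedirectStandardOutput = true,
        RedirectStandardError = true,  // ffmpeg logs progress on stderr
        CreateNoWindow = true,
    },
    EnableRaisingEvents = true
};
// Subscribe to stderr, where ffmpeg actually writes its log lines
_process.ErrorDataReceived += (s, e) =>
{
    if (e.Data != null)
        Console.WriteLine(e.Data);  // each ffmpeg log line arrives here
};
_process.Start();
_process.BeginErrorReadLine();   // required, or ErrorDataReceived never fires
_process.BeginOutputReadLine();  // likewise for OutputDataReceived
```

With the asynchronous events wired up this way, the Timer polling StandardOutput should no longer be needed.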
Is it feasible to create FFmpegFrameGrabbers one by one for a single FFmpegFrameRecorder and keep the video stream alive?
12 July 2023, by zhoutian. I ask because I receive byte[] chunks of container data (format name "dhav") one by one, and I need to push that data continuously to RTMP for playback.


What progress have I made so far?


For now, I can push data to RTMP and play it with VLC for just a few seconds; then the RTMP stream ends.


This is because the grabber created from the InputStream only contains the small amount of data that came from the ByteBuffer; when that InputStream ends, the RTMP stream is closed.


synchronized (buffer) {
    buffer.flip();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    buffer.clear();
    isByteBufferFull[0] = false;
    try {
        grabAndPush(bytes, SRS_PUSH_ADDRESS);
    } catch (Exception e) {
        //throw new RuntimeException(e);
    }
}



private static synchronized void grabAndPush(byte[] bytes, String pushAddress) throws Exception {
    avutil.av_log_set_level(avutil.AV_LOG_INFO);
    FFmpegLogCallback.set();

    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(new ByteArrayInputStream(bytes));
    ...
}



So, can anyone tell me how to keep the RTMP stream alive with FFmpegFrameGrabber and FFmpegFrameRecorder when the source data arrives piece by piece?
I'd really appreciate it 😃


This is my code:


import lombok.extern.slf4j.Slf4j;
import org.bytedeco.ffmpeg.avcodec.AVCodecParameters;
import org.bytedeco.ffmpeg.avformat.AVFormatContext;
import org.bytedeco.ffmpeg.avformat.AVStream;
import org.bytedeco.ffmpeg.global.avcodec;
import org.bytedeco.ffmpeg.global.avutil;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FFmpegLogCallback;
import org.bytedeco.javacv.Frame;
import org.jfjy.ch2ji.ecctv.dh.api.ApiService;
import org.jfjy.ch2ji.ecctv.dh.callback.RealPlayCallback;

import java.io.ByteArrayInputStream;
import java.nio.ByteBuffer;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

@Slf4j
public class GetBytes2PushRTMPNew2 {

    private static final String SRS_PUSH_ADDRESS = "rtmp://127.0.0.1:1935/live/livestream";

    static int BUFFER_CAPACITY = 1 * 1024 * 1024;

    public static void main(String[] args) throws Exception {
        FFmpegLogCallback.set();
        ApiService apiService = new ApiService();
        Long login = apiService.login("10.3.0.54", 8801, "admin", "xxxx");
        ByteBuffer buffer = ByteBuffer.allocate(BUFFER_CAPACITY);
        final boolean[] isByteBufferFull = {false};
        apiService.startRealPlay(new RealPlayCallback() {
            @Override
            public void apply(Long aLong, Integer integer, byte[] bytes) {
                try {
                    // push data to the ByteBuffer
                    synchronized (buffer) {
                        if (buffer.remaining() > bytes.length) {
                            buffer.put(bytes);
                        } else {
                            isByteBufferFull[0] = true;
                        }
                    }
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        }, 0, 0);

        ExecutorService executorService = Executors.newFixedThreadPool(1);
        executorService.execute(new Runnable() {
            @Override
            public void run() {
                while (true) {
                    // get data from the ByteBuffer when it is full
                    synchronized (isByteBufferFull) {
                        if (isByteBufferFull[0]) {
                            synchronized (buffer) {
                                buffer.flip();
                                byte[] bytes = new byte[buffer.remaining()];
                                buffer.get(bytes);
                                buffer.clear();
                                isByteBufferFull[0] = false;
                                try {
                                    // use grabber and recorder to push RTMP
                                    grabAndPush(bytes, SRS_PUSH_ADDRESS);
                                } catch (Exception e) {
                                    //throw new RuntimeException(e);
                                }
                            }
                        }
                    }
                    try {
                        Thread.sleep(500);
                    } catch (InterruptedException e) {
                        throw new RuntimeException(e);
                    }
                }
            }
        });
        while (true) {
        }
    }

    private static synchronized void grabAndPush(byte[] bytes, String pushAddress) throws Exception {
        avutil.av_log_set_level(avutil.AV_LOG_INFO);
        FFmpegLogCallback.set();

        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(new ByteArrayInputStream(bytes));

        grabber.setFormat("dhav");
        grabber.start();

        AVFormatContext avFormatContext = grabber.getFormatContext();

        int streamNum = avFormatContext.nb_streams();

        if (streamNum < 1) {
            log.error("no media!");
            return;
        }

        int frameRate = (int) grabber.getVideoFrameRate();
        if (0 == frameRate) {
            frameRate = 15;
        }
        log.info("frameRate[{}],duration[{}]s,nb_streams[{}]",
                frameRate,
                avFormatContext.duration() / 1000000,
                avFormatContext.nb_streams());

        for (int i = 0; i < streamNum; i++) {
            AVStream avStream = avFormatContext.streams(i);
            AVCodecParameters avCodecParameters = avStream.codecpar();
            log.info("stream index[{}],codec type[{}],codec ID[{}]", i, avCodecParameters.codec_type(), avCodecParameters.codec_id());
        }

        int frameWidth = grabber.getImageWidth();
        int frameHeight = grabber.getImageHeight();
        int audioChannels = grabber.getAudioChannels();

        log.info("frameWidth[{}],frameHeight[{}],audioChannels[{}]",
                frameWidth,
                frameHeight,
                audioChannels);

        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(pushAddress,
                frameWidth,
                frameHeight,
                audioChannels);

        recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        recorder.setInterleaved(true);
        recorder.setFormat("flv");
        recorder.setFrameRate(frameRate);
        recorder.setGopSize(frameRate);
        recorder.setAudioChannels(grabber.getAudioChannels());

        recorder.start();

        Frame frame;

        log.info("start push");

        int videoFrameNum = 0;
        int audioFrameNum = 0;
        int dataFrameNum = 0;

        int interVal = 1000 / frameRate;
        interVal /= 8;

        while (null != (frame = grabber.grab())) {
            if (null != frame.image) {
                videoFrameNum++;
            }
            if (null != frame.samples) {
                audioFrameNum++;
            }
            if (null != frame.data) {
                dataFrameNum++;
            }
            recorder.record(frame);
            Thread.sleep(interVal);
        }

        log.info("push complete,videoFrameNum[{}],audioFrameNum[{}],dataFrameNum[{}]",
                videoFrameNum,
                audioFrameNum,
                dataFrameNum);

        recorder.close();
        grabber.close();
    }
}
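One way to avoid recreating the grabber per buffer is to construct a single FFmpegFrameGrabber over an InputStream that blocks until the next chunk arrives, and have the capture callback feed it. The sketch below shows only that stream; ChunkQueueInputStream is a name invented here, not a JavaCV class, and it assumes the dhav data can be parsed as one continuous stream:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

/**
 * An InputStream whose read() blocks until more chunks are offered,
 * so one long-lived FFmpegFrameGrabber can be fed byte[] by byte[]
 * instead of being recreated (and the RTMP push ended) per buffer.
 */
class ChunkQueueInputStream extends InputStream {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<>();
    private byte[] current = new byte[0];
    private int pos = 0;
    private volatile boolean closed = false;

    /** Called from the capture callback with each new chunk. */
    public void offer(byte[] chunk) {
        queue.offer(chunk);
    }

    @Override
    public int read() throws IOException {
        while (pos >= current.length) {
            if (closed && queue.isEmpty()) {
                return -1; // end of stream only after close()
            }
            try {
                current = queue.take(); // blocks until a chunk arrives
                pos = 0;
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new IOException("interrupted while waiting for data", e);
            }
        }
        return current[pos++] & 0xFF;
    }

    @Override
    public void close() {
        closed = true;
        queue.offer(new byte[0]); // wake up a blocked reader
    }
}
```

The grabber would then be built once, e.g. new FFmpegFrameGrabber(in) with setFormat("dhav"), and the grab/record loop keeps running for as long as the callback keeps calling offer(), instead of ending when a ByteArrayInputStream is exhausted.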