
On other sites (4961)
-
How to Output Mjpeg from Kokorin Jaffree FFmpeg via UDP to a Localhost Port?
14 October 2022, by roba

I have a Java program which displays two webcams and records them to file in FHD 30 fps H264/H265. It uses Sarxos Webcam for the initial setup and display, but switches to Jaffree FFmpeg for recording. While recording, Sarxos must release its webcam access and cannot display anything until recording stops.


I have tried recording with Xuggler/Sarxos, but Sarxos appears to access only raw video from the webcams, which limits the achievable frame rate and resolution. At 1920x1080 the cameras can deliver only 5 fps of raw video.


I am trying to direct mjpeg streams from Jaffree to local ports for display during recording, but I cannot figure out how to do it.


Simultaneous recording plus sending to a port can be done from the terminal with the following:


ffmpeg -f dshow -video_size 1920x1080 -rtbufsize 944640k -framerate 25 -vcodec mjpeg -i "video=Logitech Webcam C930e" -pix_fmt yuv420p -c:v libx264 outFHDx25.mp4 -f mpegts "udp://localhost:1234?pkt_size=188&buffer_size=65535"

(The UDP URL is quoted so the shell does not treat the & as a command separator.)



and viewed from the port in a different terminal like this:


ffplay -i udp://localhost:1234



The video which displays is a little blocky compared with the video recorded to file. Any suggestions on how to improve this would be appreciated.


Note that ffplay is not included in Jaffree FFmpeg.


I would like to send the mjpeg to a port and then read it into the Sarxos Webcam viewer to display while recording is in progress.


The Jaffree Java code for recording the output of one webcam to file follows. It takes the mjpeg/yuv422p output from the webcam and normally encodes it to file as H264/yuv420p:


public static FFmpeg createTestFFmpeg() {
    String camera1Ref = "video=" + cam1Vid + ":audio=" + cam1Aud;
    return FFmpeg.atPath()
        .addArguments("-f", "dshow")                   // selects dshow for Windows
        .addArguments("-video_size", resString)        // video resolution, e.g. 1920x1080
        .addArguments("-rtbufsize", rtBufResultString)
        .addArguments("-thread_queue_size", threadQ)
        .addArguments("-framerate", fpsString)         // capture frame rate, e.g. 30 fps
        .addArguments(codec, vidString)                // set capture encode mode from camera
        .addArgument(audio)                            // on or off
        .addArguments("-i", camera1Ref)                // name of camera to capture
        .addArguments("-pix_fmt", pixFmt)
        .addArguments("-c:v", enc2)                    // e.g. enc2 = "libx264", "h264_nvenc"
        .addArguments(enc3, enc4)                      // enc3 = "-crf", enc4 = "20"
        .addArguments(enc5, enc6)                      // enc5 = "-gpu:v", enc6 = "0"
        .addArguments(enc7, enc8)                      // enc7 = "-cq:v", enc8 = "20"
        .addArguments(enc9, enc10)                     // enc9 = "-rc:v", enc10 = "vbr"
        .addArguments(enc11, enc12)                    // enc11 = "-tune:v", enc12 = "ll"
        .addArguments(enc13, enc14)                    // enc13 = "-preset:v", enc14 = "p1"
        .addArguments(enc15, enc16)                    // enc15 = "-b:v", enc16 = "0"
        .addArguments(enc17, enc18)                    // enc17 = "-maxrate:v", enc18 = "5000k"
        .addArguments(enc19, enc20)                    // enc19 = "-bufsize:v", enc20 = "5000k"
        .addArguments(enc21, enc22)                    // enc21 = "-profile:v", enc22 = "main"
        .addArgument(noFFStats)                        // "-nostats", stops logging progress/statistics
        .addArguments("-loglevel", ffLogLevel)         // error logging
        .addArgument(bannerResultString)               // "-hide_banner"
        .setOverwriteOutput(true)                      // overwrite the output file if it exists
        .addOutput(UrlOutput.toUrl(filePathL))
        .setProgressListener(new ProgressListener() {
            @Override
            public void onProgress(FFmpegProgress progress) {
                if (ffProgress) {
                    System.out.println(progress);
                }
            }
        });
}



How and where do I add the code to output mjpeg via UDP to a local port while simultaneously writing H264 to a file, and what is the syntax? I am sure it must be simple, but I seem to have tried every permutation without success. I can write to a file OR I can output to a port, but I cannot do both.
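For reference, the ffmpeg CLI handles this simply by listing two outputs, as in the command shown earlier, and Jaffree's builder can likewise be given more than one output. The following is a minimal, untested sketch of that two-output idea; the UDP URL is an example value, the codec/format choices are assumptions, and the variable names (camera1Ref, filePathL) come from the original code:

```java
// Untested sketch: mirrors the two-output CLI command, assuming Jaffree's
// builder accepts repeated addOutput() calls (input arguments elided).
return FFmpeg.atPath()
        .addArguments("-f", "dshow")
        .addArguments("-i", camera1Ref)
        // first output: H264 to file, as in the original code
        .addOutput(UrlOutput.toUrl(filePathL)
                .setCodec(StreamType.VIDEO, "libx264")
                .addArguments("-pix_fmt", "yuv420p"))
        // second output: raw mjpeg over UDP for live display
        .addOutput(UrlOutput.toUrl("udp://127.0.0.1:1234?pkt_size=188")
                .setFormat("mjpeg")
                .setCodec(StreamType.VIDEO, "mjpeg"));
```

If this works, a viewer should be able to read the stream with something like `ffplay -f mjpeg udp://127.0.0.1:1234`, since the raw mjpeg muxer carries no container metadata.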


-
How to use Jaffree with Spring Boot for streaming an RTSP flow
25 August 2022, by Jmarchi

I'm trying to build a REST API, and one of the things I want to do is relay the RTSP video provided by some security cameras to the frontend.


I have found Jaffree, a dependency that integrates ffmpeg into Spring; up to that point everything is fine.


The problem is that when I try to send the video to the frontend (made in React) I receive this error:




Starting process: ffmpeg
Waiting for process to finish
...
Input #0, mpjpeg, from __________
  Duration: N/A, bitrate: N/A
  Stream #0:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 1920x1080 [SAR 1:1 DAR 16:9], 25 tbr, 25 tbn
[warning] Codec AVOption b (set bitrate (in bits/s)) specified for output file #0 (tcp://127.0.0.1:52225) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
Output #0, ismv, to 'tcp://127.0.0.1:52225':
  Metadata:
    encoder: Lavf59.27.100
  Stream #0:0: Video: mjpeg (Baseline) (mp4v / 0x7634706D), yuvj420p(pc, bt470bg/unknown/unknown), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 25 tbr, 10000k tbn
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
frame=   21 fps=7.2 q=-1.0 size=     252kB time=00:00:00.80 bitrate=2580.9kbits/s speed=0.275x
...
: Interrupting starter thread (task-1) because of exception: TCP negotiation failed




The code in the backend is this:


@GetMapping(value = "/{id}/video")
public ResponseEntity<StreamingResponseBody> getVideo() {
    String url = "**********";

    return ResponseEntity.ok()
        .contentType(MediaType.APPLICATION_OCTET_STREAM)
        .body(os -> {
            FFmpeg.atPath()
                .addArgument("-re")
                .addArguments("-acodec", "pcm_s16le")
                // .addArguments("-rtsp_transport", "tcp")
                .addArguments("-i", url)
                .addArguments("-vcodec", "copy")
                .addArguments("-af", "asetrate=22050")
                .addArguments("-acodec", "aac")
                .addArguments("-b:a", "96k")
                .addOutput(PipeOutput.pumpTo(os)
                    .disableStream(StreamType.AUDIO)
                    .disableStream(StreamType.SUBTITLE)
                    .disableStream(StreamType.DATA)
                    .setFrameCount(StreamType.VIDEO, 100L)
                    // 1 frame every 10 seconds
                    .setFrameRate(0.1)
                    .setDuration(1, TimeUnit.HOURS)
                    .setFormat("ismv"))
                .addArgument("-nostdin")
                .execute();
        });
}


And this is the HTML part:


<video width="100%" height="auto" controls="controls" autoplay="autoplay" muted="muted" src="http://localhost:7500/***/1/video">
 Sorry, your browser doesn't support embedded videos.
 </video>



What is missing for the TCP negotiation?
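No answer is recorded here, but the log offers two hints. First, the tcp://127.0.0.1:52225 address in the log appears to be Jaffree's internal pipe mechanism (PipeOutput is implemented over a local TCP socket), not the frontend connection. Second, the stream is being copied as mjpeg into an ismv container, which browsers generally cannot play from a plain video tag. One common alternative is to re-encode to H264 in fragmented MP4, which HTML5 video can play progressively. The following is a hedged, untested sketch of that idea, reusing the url and os variables from the controller above:

```java
// Untested sketch: re-encode the RTSP input to H264 in fragmented MP4
// instead of copying mjpeg into ismv.
FFmpeg.atPath()
        .addArguments("-rtsp_transport", "tcp") // often more reliable than UDP for RTSP
        .addArguments("-i", url)
        .addOutput(PipeOutput.pumpTo(os)
                .disableStream(StreamType.AUDIO)
                .disableStream(StreamType.SUBTITLE)
                .disableStream(StreamType.DATA)
                .setCodec(StreamType.VIDEO, "libx264")
                // fragmented MP4: playable before the stream has ended
                .addArguments("-movflags", "frag_keyframe+empty_moov")
                .setFormat("mp4"))
        .execute();
```

If ffmpeg exits early because of a bad codec/container combination, Jaffree's internal pipe never connects, which could plausibly surface as the "TCP negotiation failed" message seen here.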


-
Unable to open FFmpegFrameRecorder stream on a UDP port on Windows
4 June 2022, by DeoBelbert

I am trying to stream webcam video with FFmpegFrameRecorder and play it with the ffplay command, but ffplay fails with the following message: 'udp://127.0.0.1:25777: Invalid data found when processing input/0'.


The code I am using for streaming the video,


public void startStreaming() throws Exception {
    OpenCVFrameGrabber grabber = new OpenCVFrameGrabber(0);
    grabber.setVideoCodec(27);
    grabber.setImageMode(FrameGrabber.ImageMode.COLOR);
    grabber.setBitsPerPixel(10);
    grabber.setFormat("yuvj420p");
    grabber.start();

    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("udp://127.0.0.1:25777",
            grabber.getImageWidth(), grabber.getImageHeight());
    recorder.setFrameRate(grabber.getFrameRate());
    recorder.setVideoBitrate(grabber.getVideoBitrate());
    recorder.setOption("listen", "1");
    recorder.setFormat("h264");
    recorder.setPixelFormat(AV_PIX_FMT_YUV420P);
    recorder.setVideoCodecName("h264_videotoolbox");
    recorder.setVideoCodec(27);
    recorder.setOption("keyint", "10");
    recorder.start();
}



And I am calling the recorder.record in a thread as follows,


frame = grabber.grabFrame(); 
recorder.record(frame);



After starting the stream, I opened the command line and tried to play the stream with the following command,


ffplay udp://127.0.0.1:25777



And it is failing with the following message,


udp://127.0.0.1:25777: Invalid data found when processing input/0



To get the stream information, I used the following command,


ffmpeg -i udp://127.0.0.1:25777



And it showed the following output,


[h264 @ 0000014dd1424880] non-existing PPS 0 referenced
[h264 @ 0000014dd1424880] decode_slice_header error
[h264 @ 0000014dd1424880] non-existing PPS 0 referenced
[h264 @ 0000014dd1424880] decode_slice_header error
[h264 @ 0000014dd1424880] non-existing PPS 0 referenced
[h264 @ 0000014dd1424880] decode_slice_header error
[h264 @ 0000014dd1424880] non-existing PPS 0 referenced
[h264 @ 0000014dd1424880] decode_slice_header error
[h264 @ 0000014dd1424880] no frame!



But when I try the TCP protocol it works fine and I can play the video.
How can I fix this issue? This is the first time I am using FFmpeg and JavaCV.
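The decoder errors are a useful hint: "non-existing PPS 0 referenced" typically means the player joined a raw H264 stream without ever receiving the SPS/PPS parameter sets. Raw h264 over UDP has no container to repeat them, whereas over TCP the player receives the stream from the very beginning, which would explain why TCP works. A hedged, untested sketch of one common workaround is to wrap the stream in an MPEG-TS container, which carries the parameter sets periodically; width and height stand in for grabber.getImageWidth()/getImageHeight():

```java
// Untested sketch: use an MPEG-TS container instead of raw "h264" so that
// SPS/PPS headers are repeated and a player can join mid-stream.
FFmpegFrameRecorder recorder =
        new FFmpegFrameRecorder("udp://127.0.0.1:25777", width, height);
recorder.setFormat("mpegts");                      // container instead of raw h264
recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);  // JavaCV constant for codec 27
recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder.setFrameRate(30);
recorder.start();
```

Alternatively, keeping the raw format and telling the player what it is receiving (ffplay -f h264 udp://127.0.0.1:25777) is sometimes enough. Note also that h264_videotoolbox is a macOS hardware encoder and is unlikely to be available on Windows, so setVideoCodecName may be silently failing here.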