
Media (91)
-
Richard Stallman and free software
19 October 2011
Updated: May 2013
Language: French
Type: Text
-
Stereo master soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011
Updated: February 2013
Language: English
Type: Audio
Other articles (94)
-
Improving the base version
13 September 2013 - Nicer multiple selects
The Chosen plugin improves the usability of multiple-select fields. See the two images below to compare.
To use it, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)
-
Custom menus
14 November 2010 - MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
This lets channel administrators fine-tune the configuration of these menus.
Menus created when the site is initialized
By default, three menus are created automatically when the site is initialized: The main menu; Identifier: barrenav; This menu is generally inserted at the top of the page after the header block, and its identifier makes it compatible with templates based on Zpip; (...)
-
Publishing on MédiaSpip
13 June 2013 - Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact your MediaSPIP administrator to find out.
Sur d’autres sites (8812)
-
There was an error in the HLS video stream endpoint in Spring Boot
12 August 2024, by Abir Sarkar
This is my controller. When I call the actual endpoint for the first time with the proper ID, it returns the output, but the requests that follow fail: the video ID in the URL is automatically replaced with segment_000.ts.


@GetMapping("/stream/{videoId}")
public ResponseEntity<Resource> streamVideo(
        @PathVariable String videoId,
        @RequestHeader(value = HttpHeaders.RANGE, required = false) String rangeHeader) {

    try {
        System.out.println("Video Id : " + videoId);
        // Fetch the video metadata
        Video video = videoService.findById(videoId);

        if (video == null) {
            return ResponseEntity.notFound().build();
        }

        // Construct the path to the HLS playlist
        Path playlistPath = Paths.get(video.getFilePath());

        // Check if the playlist file exists
        if (!Files.exists(playlistPath)) {
            return ResponseEntity.notFound().build();
        }

        // Load the file as a resource
        Resource resource = new FileSystemResource(playlistPath);
        String contentType = "application/vnd.apple.mpegurl";
        long fileLength = Files.size(playlistPath);

        if (rangeHeader != null) {
            try {
                // Handle range requests for seeking
                String[] ranges = rangeHeader.replace("bytes=", "").split("-");
                long rangeStart = Long.parseLong(ranges[0]);
                long rangeEnd = ranges.length > 1 ? Long.parseLong(ranges[1]) : fileLength - 1;

                // Validate range end
                if (rangeEnd >= fileLength) {
                    rangeEnd = fileLength - 1;
                }

                // Validate range start
                if (rangeStart > rangeEnd) {
                    return ResponseEntity.status(HttpStatus.REQUESTED_RANGE_NOT_SATISFIABLE)
                            .header(HttpHeaders.CONTENT_RANGE, "bytes */" + fileLength)
                            .build();
                }

                // Calculate content length
                long contentLength = rangeEnd - rangeStart + 1;

                // Prepare headers
                HttpHeaders headers = new HttpHeaders();
                headers.add(HttpHeaders.CONTENT_RANGE, "bytes " + rangeStart + "-" + rangeEnd + "/" + fileLength);
                headers.add(HttpHeaders.CONTENT_LENGTH, String.valueOf(contentLength));
                headers.add(HttpHeaders.CACHE_CONTROL, "no-cache, no-store, must-revalidate");
                headers.add(HttpHeaders.PRAGMA, "no-cache");
                headers.add(HttpHeaders.EXPIRES, "0");
                headers.add(HttpHeaders.CONTENT_TYPE, contentType);

                // Serve the partial content
                InputStream inputStream = Files.newInputStream(playlistPath);
                inputStream.skip(rangeStart);

                return ResponseEntity.status(HttpStatus.PARTIAL_CONTENT)
                        .headers(headers)
                        .body(new InputStreamResource(inputStream));
            } catch (NumberFormatException e) {
                return ResponseEntity.status(HttpStatus.REQUESTED_RANGE_NOT_SATISFIABLE)
                        .header(HttpHeaders.CONTENT_RANGE, "bytes */" + fileLength)
                        .build();
            }
        } else {
            // Serve the full content
            HttpHeaders headers = new HttpHeaders();
            headers.add(HttpHeaders.CONTENT_TYPE, contentType);
            headers.add(HttpHeaders.CONTENT_LENGTH, String.valueOf(fileLength));
            headers.add(HttpHeaders.CACHE_CONTROL, "no-cache, no-store, must-revalidate");
            headers.add(HttpHeaders.PRAGMA, "no-cache");
            headers.add(HttpHeaders.EXPIRES, "0");
            System.out.println(resource.toString());
            return ResponseEntity.ok()
                    .headers(headers)
                    .body(resource);
        }

    } catch (IOException e) {
        // Handle IOException
        e.printStackTrace();
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
    } catch (Exception e) {
        // Handle other exceptions
        e.printStackTrace();
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
    }
}


I am attaching the error image: IMAGE
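
For context: an HLS playlist lists its media segments by relative URI, and the player resolves those against the playlist's own URL. Since the playlist here is served at /api/v1/video/stream/{videoId}, a line like segment_000.ts resolves to /api/v1/video/stream/segment_000.ts, which would explain why the ID appears to be replaced by the segment name. An ffmpeg-generated index.m3u8 typically looks roughly like this (values are illustrative):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.000000,
segment_000.ts
#EXTINF:10.000000,
segment_001.ts
#EXT-X-ENDLIST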


On the front end I am using a React application with Video.js:


This is the App.jsx file:

import "./App.css";
import VideoPlayer from "./VideoPlayer";
import { useRef } from "react";

function App() {
 const playerRef = useRef(null);
 const videoLink =
 "http://localhost:8080/api/v1/video/stream/66b9e7853c9b530810bdf4f4";
 const videoPlayerOptions = {
 controls: true,
 responsive: true,
 fluid: true,
 sources: [
 {
 src: videoLink,
 type: "application/x-mpegURL",
 },
 ],
 };
 const handlePlayerReady = (player) => {
 playerRef.current = player;

 // You can handle player events here, for example:
 player.on("waiting", () => {
 videojs.log("player is waiting");
 });

 player.on("dispose", () => {
 videojs.log("player will dispose");
 });
 };
 return (
 <>
 <div>
 <h1>Video player</h1>
 </div>

 
 >
 );
}

export default App;



This is the VideoPlayer.jsx file:

import React, { useRef, useEffect } from "react";
import videojs from "video.js";
import "video.js/dist/video-js.css";

export const VideoPlayer = (props) => {
  const videoRef = useRef(null);
  const playerRef = useRef(null);
  const { options, onReady } = props;

  useEffect(() => {
    // Make sure Video.js player is only initialized once
    if (!playerRef.current) {
      // The Video.js player needs to be _inside_ the component el for React 18 Strict Mode.
      const videoElement = document.createElement("video-js");

      videoElement.classList.add("vjs-big-play-centered");
      videoRef.current.appendChild(videoElement);

      const player = (playerRef.current = videojs(videoElement, options, () => {
        videojs.log("player is ready");
        onReady && onReady(player);
      }));

      // You could update an existing player in the `else` block here
      // on prop change, for example:
    } else {
      const player = playerRef.current;

      player.autoplay(options.autoplay);
      player.src(options.sources);
    }
  }, [options, videoRef]);

  // Dispose the Video.js player when the functional component unmounts
  useEffect(() => {
    const player = playerRef.current;

    return () => {
      if (player && !player.isDisposed()) {
        player.dispose();
        playerRef.current = null;
      }
    };
  }, [playerRef]);

  return (
    <div data-vjs-player>
      <div ref={videoRef} />
    </div>
  );
};

export default VideoPlayer;



I am trying to play the video in the player, but it does not play in the browser. When I hit the endpoint with Postman, it returns the content of the index.m3u8 file. In the player, the video duration shows up, but the video does not play. Please help me get the video to play.
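
For illustration, one way to keep the playlist URL and the segment URLs apart is to serve both under a per-video path, so that the relative segment URIs in index.m3u8 resolve to /stream/{videoId}/segment_000.ts. The following is only a sketch, with an assumed directory layout, paths, and method names that are not taken from my project:

// Sketch only (assumed layout: videos_hls/<videoId>/index.m3u8 plus its .ts segments).
@GetMapping("/stream/{videoId}/index.m3u8")
public ResponseEntity<Resource> servePlaylist(@PathVariable String videoId) {
    Path playlist = Paths.get("videos_hls", videoId, "index.m3u8"); // assumed location
    if (!Files.exists(playlist)) {
        return ResponseEntity.notFound().build();
    }
    return ResponseEntity.ok()
            .header(HttpHeaders.CONTENT_TYPE, "application/vnd.apple.mpegurl")
            .body(new FileSystemResource(playlist));
}

@GetMapping("/stream/{videoId}/{segment}")
public ResponseEntity<Resource> serveSegment(@PathVariable String videoId,
                                             @PathVariable String segment) {
    // A real implementation should validate 'segment' to prevent path traversal.
    Path ts = Paths.get("videos_hls", videoId, segment); // assumed location
    if (!Files.exists(ts)) {
        return ResponseEntity.notFound().build();
    }
    return ResponseEntity.ok()
            .header(HttpHeaders.CONTENT_TYPE, "video/mp2t") // MIME type for MPEG-TS segments
            .body(new FileSystemResource(ts));
}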


GitHub project link: GITHUB

-
Need help using libavfilter for adding overlay to frames [closed]
30 July 2024, by Michael Werner
Hello gentlemen and ladies,


I am working with libavfilter and it is driving me crazy.


On Windows 11, with the latest libav (full build), a C/C++ app reads YUV420P frames from a frame grabber card.


I want to draw a bitmap (BGR24) overlay image, loaded from a file, onto every frame via libavfilter. First I convert the BGR24 overlay image to YUV420P via the format filter; then I feed the YUV420P frame from the frame grabber and the converted overlay into the overlay filter.
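
Written as a filtergraph string, the intended chain is simply the following (the pad labels are illustrative; the actual code below links the filters by hand):

/* Intended topology, expressed as a filtergraph string for clarity only;
 * the pad labels (main, ovl, out) are illustrative. */
static const char *overlay_graph_desc =
    "[ovl] format=yuv420p [ovl_yuv];"
    "[main][ovl_yuv] overlay=W-w:H-h:enable='between(t,0,20)':format=yuv420 [out]";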


Everything seems to be fine, but when I try to get the frame out of the filter graph I always get a "Resource temporarily unavailable" (EAGAIN) return code, no matter how many frames I put into the graph.


The frames from the frame grabber card are fine; I can encode them or write them to a .yuv file. The overlay frame looks fine too.


My initialization code does not report any errors or warnings, but when I try to get the filtered frame out of the graph via av_buffersink_get_frame I always get an EAGAIN return code.

Here is my current initialization code:


int init_overlay_filter(AVFilterGraph** graph, AVFilterContext** src_ctx, AVFilterContext** overlay_src_ctx,
                        AVFilterContext** sink_ctx)
{
    AVFilterGraph* filter_graph;
    AVFilterContext* buffersrc_ctx;
    AVFilterContext* overlay_buffersrc_ctx;
    AVFilterContext* buffersink_ctx;
    AVFilterContext* overlay_ctx;
    AVFilterContext* format_ctx;
    const AVFilter *buffersrc, *buffersink, *overlay_buffersrc, *overlay_filter, *format_filter;
    int ret;

    // Create the filter graph
    filter_graph = avfilter_graph_alloc();
    if (!filter_graph)
    {
        fprintf(stderr, "Unable to create filter graph.\n");
        return AVERROR(ENOMEM);
    }

    // Create buffer source filter for main video
    buffersrc = avfilter_get_by_name("buffer");
    if (!buffersrc)
    {
        fprintf(stderr, "Unable to find buffer filter.\n");
        return AVERROR_FILTER_NOT_FOUND;
    }

    // Create buffer source filter for overlay image
    overlay_buffersrc = avfilter_get_by_name("buffer");
    if (!overlay_buffersrc)
    {
        fprintf(stderr, "Unable to find buffer filter.\n");
        return AVERROR_FILTER_NOT_FOUND;
    }

    // Create buffer sink filter
    buffersink = avfilter_get_by_name("buffersink");
    if (!buffersink)
    {
        fprintf(stderr, "Unable to find buffersink filter.\n");
        return AVERROR_FILTER_NOT_FOUND;
    }

    // Create overlay filter
    overlay_filter = avfilter_get_by_name("overlay");
    if (!overlay_filter)
    {
        fprintf(stderr, "Unable to find overlay filter.\n");
        return AVERROR_FILTER_NOT_FOUND;
    }

    // Create format filter
    format_filter = avfilter_get_by_name("format");
    if (!format_filter)
    {
        fprintf(stderr, "Unable to find format filter.\n");
        return AVERROR_FILTER_NOT_FOUND;
    }

    // Initialize the main video buffer source
    char args[512];
    snprintf(args, sizeof(args),
             "video_size=1920x1080:pix_fmt=yuv420p:time_base=1/25:pixel_aspect=1/1");
    ret = avfilter_graph_create_filter(&buffersrc_ctx, buffersrc, "in", args, NULL, filter_graph);
    if (ret < 0)
    {
        fprintf(stderr, "Unable to create buffer source filter for main video.\n");
        return ret;
    }

    // Initialize the overlay buffer source
    snprintf(args, sizeof(args),
             "video_size=165x165:pix_fmt=bgr24:time_base=1/25:pixel_aspect=1/1");
    ret = avfilter_graph_create_filter(&overlay_buffersrc_ctx, overlay_buffersrc, "overlay_in", args, NULL,
                                       filter_graph);
    if (ret < 0)
    {
        fprintf(stderr, "Unable to create buffer source filter for overlay.\n");
        return ret;
    }

    // Initialize the format filter to convert overlay image to yuv420p
    snprintf(args, sizeof(args), "pix_fmts=yuv420p");
    ret = avfilter_graph_create_filter(&format_ctx, format_filter, "format", args, NULL, filter_graph);

    if (ret < 0)
    {
        fprintf(stderr, "Unable to create format filter.\n");
        return ret;
    }

    // Initialize the buffer sink
    ret = avfilter_graph_create_filter(&buffersink_ctx, buffersink, "out", NULL, NULL, filter_graph);
    if (ret < 0)
    {
        fprintf(stderr, "Unable to create buffer sink filter.\n");
        return ret;
    }

    // Initialize the overlay filter
    ret = avfilter_graph_create_filter(&overlay_ctx, overlay_filter, "overlay", "W-w:H-h:enable='between(t,0,20)':format=yuv420", NULL, filter_graph);
    if (ret < 0)
    {
        fprintf(stderr, "Unable to create overlay filter.\n");
        return ret;
    }

    // Connect the filters
    ret = avfilter_link(overlay_buffersrc_ctx, 0, format_ctx, 0);

    if (ret >= 0)
    {
        ret = avfilter_link(buffersrc_ctx, 0, overlay_ctx, 0);
    }
    else
    {
        fprintf(stderr, "Unable to configure filter graph.\n");
        return ret;
    }

    if (ret >= 0)
    {
        ret = avfilter_link(format_ctx, 0, overlay_ctx, 1);
    }
    else
    {
        fprintf(stderr, "Unable to configure filter graph.\n");
        return ret;
    }

    if (ret >= 0)
    {
        if ((ret = avfilter_link(overlay_ctx, 0, buffersink_ctx, 0)) < 0)
        {
            fprintf(stderr, "Unable to link filter graph.\n");
            return ret;
        }
    }
    else
    {
        fprintf(stderr, "Unable to configure filter graph.\n");
        return ret;
    }

    // Configure the filter graph
    if ((ret = avfilter_graph_config(filter_graph, NULL)) < 0)
    {
        fprintf(stderr, "Unable to configure filter graph.\n");
        return ret;
    }

    *graph = filter_graph;
    *src_ctx = buffersrc_ctx;
    *overlay_src_ctx = overlay_buffersrc_ctx;
    *sink_ctx = buffersink_ctx;

    return 0;
}



Feeding the filter graph is done this way:


av_buffersrc_add_frame_flags(buffersrc_ctx, pFrameGrabberFrame, AV_BUFFERSRC_FLAG_KEEP_REF)
av_buffersink_get_frame(buffersink_ctx, filtered_frame)
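
For context, here is a condensed, untested sketch of the push/pull pattern around those two calls; the helper name, the EOF push on the overlay input, and the assumption that the overlay only needs to be sent once are illustrative and not taken from my project:

#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>
#include <libavutil/frame.h>

/* Illustrative helper (filter_one_frame is a made-up name). Both inputs need
 * frames with valid, increasing pts in the 1/25 time base; EAGAIN from the
 * sink only means the graph wants more input before it can emit a frame. */
static int filter_one_frame(AVFilterContext* src_ctx, AVFilterContext* overlay_src_ctx,
                            AVFilterContext* sink_ctx, AVFrame* video_frame,
                            AVFrame* overlay_frame /* NULL once already sent */,
                            AVFrame* out_frame)
{
    int ret = av_buffersrc_add_frame_flags(src_ctx, video_frame, AV_BUFFERSRC_FLAG_KEEP_REF);
    if (ret < 0)
        return ret;

    if (overlay_frame)
    {
        ret = av_buffersrc_add_frame_flags(overlay_src_ctx, overlay_frame, AV_BUFFERSRC_FLAG_KEEP_REF);
        if (ret < 0)
            return ret;
        /* Signal EOF on the overlay input so the overlay filter knows it can
         * keep repeating the last (only) overlay frame instead of waiting. */
        ret = av_buffersrc_add_frame_flags(overlay_src_ctx, NULL, 0);
        if (ret < 0)
            return ret;
    }

    /* 0 on success, AVERROR(EAGAIN) if more input is needed, AVERROR_EOF at end. */
    return av_buffersink_get_frame(sink_ctx, out_frame);
}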



av_buffersink_get_frame always returns EAGAIN, no matter how many frames I feed into the graph. The frames themselves (from the frame grabber, and the overlay frame) look fine.

I set the libav logging level to maximum, but I do not see any warnings, errors, or otherwise helpful related information in the log.
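
For reference, a minimal way to request the most verbose libav logging in code (assuming that is what "maximum" refers to here):

#include <libavutil/log.h>

static void enable_verbose_libav_logging(void)
{
    /* AV_LOG_TRACE is the most verbose level; AV_LOG_DEBUG is one step below. */
    av_log_set_level(AV_LOG_TRACE);
}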


Here is the log output related to the filter configuration:


[in @ 00000288ee494f40] Setting 'video_size' to value '1920x1080'
[in @ 00000288ee494f40] Setting 'pix_fmt' to value 'yuv420p'
[in @ 00000288ee494f40] Setting 'time_base' to value '1/25'
[in @ 00000288ee494f40] Setting 'pixel_aspect' to value '1/1'
[in @ 00000288ee494f40] w:1920 h:1080 pixfmt:yuv420p tb:1/25 fr:0/1 sar:1/1 csp:unknown range:unknown
[overlay_in @ 00000288ff1013c0] Setting 'video_size' to value '165x165'
[overlay_in @ 00000288ff1013c0] Setting 'pix_fmt' to value 'bgr24'
[overlay_in @ 00000288ff1013c0] Setting 'time_base' to value '1/25'
[overlay_in @ 00000288ff1013c0] Setting 'pixel_aspect' to value '1/1'
[overlay_in @ 00000288ff1013c0] w:165 h:165 pixfmt:bgr24 tb:1/25 fr:0/1 sar:1/1 csp:unknown range:unknown
[format @ 00000288ff1015c0] Setting 'pix_fmts' to value 'yuv420p'
[overlay @ 00000288ff101880] Setting 'x' to value 'W-w'
[overlay @ 00000288ff101880] Setting 'y' to value 'H-h'
[overlay @ 00000288ff101880] Setting 'enable' to value 'between(t,0,20)'
[overlay @ 00000288ff101880] Setting 'format' to value 'yuv420'
[auto_scale_0 @ 00000288ff101ec0] w:iw h:ih flags:'' interl:0
[format @ 00000288ff1015c0] auto-inserting filter 'auto_scale_0' between the filter 'overlay_in' and the filter 'format'
[auto_scale_1 @ 00000288ee4a4cc0] w:iw h:ih flags:'' interl:0
[overlay @ 00000288ff101880] auto-inserting filter 'auto_scale_1' between the filter 'format' and the filter 'overlay'
[AVFilterGraph @ 00000288ee495c80] query_formats: 5 queried, 6 merged, 6 already done, 0 delayed
[auto_scale_0 @ 00000288ff101ec0] w:165 h:165 fmt:bgr24 csp:gbr range:pc sar:1/1 -> w:165 h:165 fmt:yuv420p csp:unknown range:unknown sar:1/1 flags:0x00000004
[auto_scale_1 @ 00000288ee4a4cc0] w:165 h:165 fmt:yuv420p csp:unknown range:unknown sar:1/1 -> w:165 h:165 fmt:yuva420p csp:unknown range:unknown sar:1/1 flags:0x00000004
[overlay @ 00000288ff101880] main w:1920 h:1080 fmt:yuv420p overlay w:165 h:165 fmt:yuva420p
[overlay @ 00000288ff101880] [framesync @ 00000288ff1019a8] Selected 1/25 time base
[overlay @ 00000288ff101880] [framesync @ 00000288ff1019a8] Sync level 2



-
avdevice/dshow: Cleanup also on av_log case
26 May 2024, by Michael Niedermayer