
Media (2)
-
Example of action buttons for a collaborative collection
27 February 2013
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013
Updated: February 2013
Language: English
Type: Image
Other articles (91)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
The plugin: Gestion de la mutualisation (shared-hosting management)
2 March 2010. The Gestion de mutualisation plugin manages the various MediaSPIP channels from a master site. Its purpose is to provide a pure-SPIP solution to replace the old one.
Basic installation
Install the SPIP files on the server.
Then add the "mutualisation" plugin at the root of the site, as described here.
Customise the central mes_options.php file as you wish. As an example, here is the one from the mediaspip.net platform:
<?php (...) -
The farm's regular Cron tasks
1 December 2010. Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the mutualisation on a regular basis. Combined with a system Cron on the central site of the mutualisation, it generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)
On other sites (14426)
-
How to play streaming media over the rtmp/rtsp protocols in .NET MAUI Android/iOS? [closed]
26 April 2024, by han zhu
As far as I know, there are many solutions on Windows (WinForms/WPF), the most typical being the LibVLCSharp and FFmpeg libraries. For .NET MAUI Android/iOS, however, I have not yet found a corresponding support library. My goal is simple: I want to use MAUI to build a real-time streaming media player that supports the rtmp/rtsp protocols. Does anyone know of a good solution? Thanks


I tried the MediaElement from the Community Toolkit, but it doesn't support rtmp, so I'm hoping to find a solution for rtmp live streaming in .NET MAUI Android/iOS.


-
Media-players show longest duration of multi-track MP4 files from FFmpeg [closed]
9 July 2024, by EasonWaii
I need to merge two videos into one using FFmpeg. The resulting file should have two video tracks and one audio track, with the audio taken from the longer video.


When the merged video is played in a player such as PotPlayer or VLC, it should default to the shorter video track, but users should be able to switch to the other track if they want.


The problem I am facing:
Everything works, except that when the player defaults to the shorter video track, it still shows the timeline of the longer video track.


built with Apple clang version 15.0.0 (clang-1500.3.9.4)
 configuration: --prefix=/usr/local/Cellar/ffmpeg/7.0.1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox
 libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100
 libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'output21.mp4':
 Metadata:
 major_brand : isom
 minor_version : 512
 compatible_brands: isomiso2avc1mp41
 encoder : Lavf59.16.100
 Duration: 00:00:12.84, start: 0.000000, bitrate: 2207 kb/s
 Stream #0:0[0x1](und): Video: hevc (Main) (hev1 / 0x31766568), yuv420p(tv, progressive), 720x1280 [SAR 1:1 DAR 9:16], 1052 kb/s, 30 fps, 30 tbr, 15360 tbn
 Metadata:
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 Stream #0:1[0x2](eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709, progressive), 720x1280, 1293 kb/s, 29.83 fps, 29.83 tbr, 11456 tbn (default)
 Metadata:
 handler_name : ?Mainconcept Video Media Handler
 vendor_id : [0][0][0][0]
 Stream #0:2[0x3](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 130 kb/s (default)
 Metadata:
 handler_name : SoundHandler
 vendor_id : [0][0][0][0]
{
 "streams": [
 {
 "index": 0,
 "codec_name": "hevc",
 "codec_long_name": "H.265 / HEVC (High Efficiency Video Coding)",
 "profile": "Main",
 "codec_type": "video",
 "codec_tag_string": "hev1",
 "codec_tag": "0x31766568",
 "width": 720,
 "height": 1280,
 "coded_width": 720,
 "coded_height": 1280,
 "closed_captions": 0,
 "film_grain": 0,
 "has_b_frames": 2,
 "sample_aspect_ratio": "1:1",
 "display_aspect_ratio": "9:16",
 "pix_fmt": "yuv420p",
 "level": 93,
 "color_range": "tv",
 "chroma_location": "left",
 "field_order": "progressive",
 "refs": 1,
 "id": "0x1",
 "r_frame_rate": "30/1",
 "avg_frame_rate": "30/1",
 "time_base": "1/15360",
 "start_pts": 0,
 "start_time": "0.000000",
 "duration_ts": 196096,
 "duration": "12.766667",
 "bit_rate": "1052926",
 "nb_frames": "383",
 "extradata_size": 2480,
 "disposition": {
 "default": 0,
 "dub": 0,
 "original": 0,
 "comment": 0,
 "lyrics": 0,
 "karaoke": 0,
 "forced": 0,
 "hearing_impaired": 0,
 "visual_impaired": 0,
 "clean_effects": 0,
 "attached_pic": 0,
 "timed_thumbnails": 0,
 "non_diegetic": 0,
 "captions": 0,
 "descriptions": 0,
 "metadata": 0,
 "dependent": 0,
 "still_image": 0
 },
 "tags": {
 "language": "und",
 "handler_name": "VideoHandler",
 "vendor_id": "[0][0][0][0]"
 }
 },
 {
 "index": 1,
 "codec_name": "h264",
 "codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
 "profile": "High",
 "codec_type": "video",
 "codec_tag_string": "avc1",
 "codec_tag": "0x31637661",
 "width": 720,
 "height": 1280,
 "coded_width": 720,
 "coded_height": 1280,
 "closed_captions": 0,
 "film_grain": 0,
 "has_b_frames": 2,
 "pix_fmt": "yuv420p",
 "level": 31,
 "color_range": "tv",
 "color_space": "bt709",
 "color_transfer": "bt709",
 "color_primaries": "bt709",
 "chroma_location": "left",
 "field_order": "progressive",
 "refs": 1,
 "is_avc": "true",
 "nal_length_size": "4",
 "id": "0x2",
 "r_frame_rate": "179/6",
 "avg_frame_rate": "3448256/115597",
 "time_base": "1/11456",
 "start_pts": 0,
 "start_time": "0.000000",
 "duration_ts": 115597,
 "duration": "10.090520",
 "bit_rate": "1293284",
 "bits_per_raw_sample": "8",
 "nb_frames": "301",
 "extradata_size": 46,
 "disposition": {
 "default": 1,
 "dub": 0,
 "original": 0,
 "comment": 0,
 "lyrics": 0,
 "karaoke": 0,
 "forced": 0,
 "hearing_impaired": 0,
 "visual_impaired": 0,
 "clean_effects": 0,
 "attached_pic": 0,
 "timed_thumbnails": 0,
 "non_diegetic": 0,
 "captions": 0,
 "descriptions": 0,
 "metadata": 0,
 "dependent": 0,
 "still_image": 0
 },
 "tags": {
 "language": "eng",
 "handler_name": "\u001fMainconcept Video Media Handler",
 "vendor_id": "[0][0][0][0]"
 }
 },
 {
 "index": 2,
 "codec_name": "aac",
 "codec_long_name": "AAC (Advanced Audio Coding)",
 "profile": "LC",
 "codec_type": "audio",
 "codec_tag_string": "mp4a",
 "codec_tag": "0x6134706d",
 "sample_fmt": "fltp",
 "sample_rate": "44100",
 "channels": 2,
 "channel_layout": "stereo",
 "bits_per_sample": 0,
 "initial_padding": 0,
 "id": "0x3",
 "r_frame_rate": "0/0",
 "avg_frame_rate": "0/0",
 "time_base": "1/44100",
 "start_pts": 0,
 "start_time": "0.000000",
 "duration_ts": 566244,
 "duration": "12.840000",
 "bit_rate": "130447",
 "nb_frames": "554",
 "extradata_size": 5,
 "disposition": {
 "default": 1,
 "dub": 0,
 "original": 0,
 "comment": 0,
 "lyrics": 0,
 "karaoke": 0,
 "forced": 0,
 "hearing_impaired": 0,
 "visual_impaired": 0,
 "clean_effects": 0,
 "attached_pic": 0,
 "timed_thumbnails": 0,
 "non_diegetic": 0,
 "captions": 0,
 "descriptions": 0,
 "metadata": 0,
 "dependent": 0,
 "still_image": 0
 },
 "tags": {
 "language": "und",
 "handler_name": "SoundHandler",
 "vendor_id": "[0][0][0][0]"
 }
 }
 ],
 "format": {
 "filename": "output21.mp4",
 "nb_streams": 3,
 "nb_programs": 0,
 "nb_stream_groups": 0,
 "format_name": "mov,mp4,m4a,3gp,3g2,mj2",
 "format_long_name": "QuickTime / MOV",
 "start_time": "0.000000",
 "duration": "12.840000",
 "size": "3543155",
 "bit_rate": "2207573",
 "probe_score": 100,
 "tags": {
 "major_brand": "isom",
 "minor_version": "512",
 "compatible_brands": "isomiso2avc1mp41",
 "encoder": "Lavf59.16.100"
 }
 }
}




I want the timeline to reflect the shorter video track while it is playing, without truncating the longer track's timeline or affecting its playback.
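The mismatch is visible in the ffprobe JSON above: the container duration (12.84 s) is the maximum across all streams, and players size the timeline from it rather than from the default video track (~10.09 s). A small check makes this explicit; this is only a sketch, with the probe object reduced to the fields it uses (taken from the output above):

```javascript
// Sketch: read per-stream durations from ffprobe's JSON output
// (e.g. ffprobe -print_format json -show_streams output21.mp4).
// The probe object is reduced to the fields this check uses.
const probe = {
  streams: [
    { index: 0, codec_type: "video", duration: "12.766667", disposition: { default: 0 } },
    { index: 1, codec_type: "video", duration: "10.090520", disposition: { default: 1 } },
    { index: 2, codec_type: "audio", duration: "12.840000", disposition: { default: 1 } },
  ],
};

function streamDurations(probe) {
  return probe.streams.map((s) => ({
    index: s.index,
    type: s.codec_type,
    isDefault: s.disposition.default === 1,
    seconds: parseFloat(s.duration),
  }));
}

const streams = streamDurations(probe);
const containerSeconds = Math.max(...streams.map((s) => s.seconds));
const defaultVideo = streams.find((s) => s.type === "video" && s.isDefault);

// Players size the timeline from the container duration (the longest
// stream, ~12.84 s here), not from the default video track (~10.09 s).
console.log(containerSeconds, defaultVideo.seconds);
```

Any fix therefore has to change what the container reports, not which track is marked default.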


-
How to Record Video of a Dynamic Div Containing Multiple Media Elements in React Konva?
14 September 2024, by Humayoun Saeed
I'm working on a React application where I need to record a video of a specific div with the class name "layout". This div contains multiple media elements (such as images and videos) that are rendered dynamically inside divisions. I've tried several approaches, including MediaRecorder, canvas-based recording with html2canvas, RecordRTC, and even FFmpeg, but none capture the entire div together with its dynamic content effectively.


What would be the best approach? How can I record a video of this dynamically rendered div, including all its media elements, while ensuring a smooth capture of the transitions?


What I've tried:
MediaRecorder API: didn't capture the entire div and its elements effectively.
html2canvas: captures snapshots but struggles with smooth transitions between media elements.
RecordRTC HTML element recording: attempts to capture the canvas, but the output video is 0 bytes.
CanvasRecorder, FFmpeg, and various other libraries also didn't produce the desired result.


import React, { useEffect, useState, useRef } from "react";

const Preview = ({ layout, onClose }) => {
 const [currentContent, setCurrentContent] = useState([]);
 const totalDuration = useRef(0);
 const videoRefs = useRef([]); // Store refs to each video element
 const [totalTime, setTotalTime] = useState(0); // Add this line
 const [elapsedTime, setElapsedTime] = useState(0); // Track elapsed time in seconds

 // video recording variables and state declarations
 // video recorder end
 // video-recording useEffect
 // Function to capture the renderDivision content

 const handleDownload = async () => {
 console.log("Video download function is still in development.");
 };

 // end video-recording useEffect

 // apply motion and switch media in each division: start
 useEffect(() => {
 if (layout && layout.divisions) {
 const content = layout.divisions.map((division) => {
 let divisionDuration = 0;

 division.imageSrcs.forEach((src, index) => {
 const mediaDuration = division.durations[index]
 ? division.durations[index] * 1000 // Convert to milliseconds
 : 5000; // Fallback to 5 seconds if duration is missing
 divisionDuration += mediaDuration;
 });

 return {
 division,
 contentIndex: 0,
 divisionDuration,
 };
 });

 // Find the maximum duration
 const maxDuration = Math.max(...content.map((c) => c.divisionDuration));

 // Filter divisions that have the max duration
 const maxDurationDivisions = content.filter(
 (c) => c.divisionDuration === maxDuration
 );

 // Select the first one if there are multiple with the same max duration
 const selectedMaxDurationDivision = maxDurationDivisions[0];

 totalDuration.current = selectedMaxDurationDivision.divisionDuration; // Update the total duration in milliseconds

 setTotalTime(Math.floor(totalDuration.current / 1000)); // Convert ms to seconds and set in state

 // console.log(
 // "Division with max duration (including ties):",
 // selectedMaxDurationDivision
 // );

 setCurrentContent(content);
 }
 }, [layout]);

 useEffect(() => {
 if (currentContent.length > 0) {
 const timers = currentContent.map(({ division, contentIndex }, i) => {
 const duration = division.durations[contentIndex]
 ? division.durations[contentIndex] // Duration is already in ms
 : 5000; // Default to 5000ms if no duration is defined

 const mediaElement = videoRefs.current[i];
 if (mediaElement && mediaElement.pause) {
 mediaElement.pause();
 }

 // Set up a timeout for each division to move to the next media after duration
 const timeoutId = setTimeout(() => {
 // Update content for each division independently
 updateContent(i, division, contentIndex, duration); // Move to the next content after duration

 // Ensure proper cleanup
 if (contentIndex + 1 >= division.imageSrcs.length) {
 clearTimeout(timeoutId); // Clear timeout to stop looping
 }
 }, duration);

 // Cleanup timers on component unmount
 return timeoutId;
 });

 // Return cleanup function to clear all timeouts
 return () => timers.forEach((timer) => clearTimeout(timer));
 }
 }, [currentContent]);
 // apply motion and switch media in each division: end

 // Handle video updates when the duration is changed or a new video starts
 const updateContent = (i, division, contentIndex, duration) => {
 const newContent = [...currentContent];

 // Check if we are on the last media item
 if (contentIndex + 1 < division.imageSrcs.length) {
 // Move to next media if not the last one
 newContent[i].contentIndex = contentIndex + 1;
 } else {
 // If this is the last media item, pause here
 newContent[i].contentIndex = contentIndex; // Keep it at the last item
 setCurrentContent(newContent);

 // Handle video pause if the last media is a video
 const mediaElement = videoRefs.current[i];
 if (mediaElement && mediaElement.tagName === "VIDEO") {
 mediaElement.pause();
 mediaElement.currentTime = mediaElement.duration; // Pause at the end of the video
 }
 return; // Exit the function as we don't want to loop anymore
 }

 // Update state to trigger rendering of the next media
 setCurrentContent(newContent);

 // Handle video playback for the next media item
 const mediaElement = videoRefs.current[i];
 if (mediaElement) {
 mediaElement.pause();
 mediaElement.currentTime = 0;
 mediaElement
 .play()
 .catch((error) => console.error("Error playing video:", error));
 }
 };

 const renderDivision = (division, contentIndex, index) => {
 const mediaSrc = division.imageSrcs[contentIndex];

 if (!division || !division.imageSrcs || division.imageSrcs.length === 0) {
 return (
 
 <p>No media available</p>
 
 );
 }

 if (!mediaSrc) {
 return (
 
 <p>No media available</p>
 
 );
 }

 if (mediaSrc.endsWith(".mp4")) {
 return (
 <video
 ref={(el) => (videoRefs.current[index] = el)}
 src={mediaSrc}
 autoPlay
 controls={false}
 style={{
 width: "100%",
 height: "100%",
 objectFit: "cover",
 pointerEvents: "none",
 }}
 onLoadedData={() => {
 // Ensure video is properly loaded
 const mediaElement = videoRefs.current[index];
 if (mediaElement && mediaElement.readyState >= 3) {
 mediaElement.play().catch((error) => {
 console.error("Error attempting to play the video:", error);
 });
 }
 }}
 />
 );
 } else {
 return (
 <img
 src={mediaSrc}
 alt=""
 style={{ width: "100%", height: "100%", objectFit: "cover" }}
 />
 );
 }
 };

 // progress bar code start
 useEffect(() => {
 if (totalDuration.current > 0) {
 // Reset elapsed time at the start
 setElapsedTime(0);

 const interval = setInterval(() => {
 setElapsedTime((prevTime) => {
 // Increment the elapsed time by 1 second if it's less than the total time
 if (prevTime < totalTime) {
 return prevTime + 1;
 } else {
 clearInterval(interval); // Clear the interval when totalTime is reached
 return prevTime;
 }
 });
 }, 1000); // Update every second

 // Clean up the interval on component unmount
 return () => clearInterval(interval);
 }
 }, [totalTime]);

 // progress bar code end

 return (
 <div>
 <button onClick={onClose}>
 Close
 </button>
 <h2>Preview Layout: {layout.name}</h2>
 <div className="layout">
 {currentContent.map(({ division, contentIndex }, i) => (
 <div key={i}>
 {renderDivision(division, contentIndex, i)}
 </div>
 ))}
 </div>
 {/* canvas code for video start */}
 {/* canvas code for video end */}
 {/* Progress Bar and Time */}
 <div
 style={{
 // Background color for progress bar track
 display: "flex",
 justifyContent: "space-between",
 alignItems: "center",
 }}
 >
 <div
 style={{
 width: `calc(${(elapsedTime / totalTime) * 100}%)`,
 backgroundColor: "#28a745", // Green color for progress bar
 transition: "width 0.5s linear", // Smooth transition
 }}
 />

 {/* Time display */}
 {/*
 <span
 style={{
 // Fixed right margin
 zIndex: 1, // Ensure it's above the progress bar
 padding: "5px",
 fontSize: "18px",
 fontWeight: "600",
 color: "#333",
 // backgroundColor: "rgba(255, 255, 255, 0.8)", // Add a subtle background for readability
 }}
 >
 {elapsedTime} / {totalTime}s
 </span>
 */}
 </div>

 {/* Download button */}
 <button
 onClick={handleDownload}
 onMouseOver={(e) => (e.target.style.backgroundColor = "#218838")}
 onMouseOut={(e) => (e.target.style.backgroundColor = "#28a745")}
 >
 Download Video
 </button>
 {/* {recording && <p>Recording in progress...</p>} */}
 </div>
 );
};

export default Preview;
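The duration bookkeeping in the first useEffect above can be isolated into a pure helper, which also makes the unit conversion explicit (milliseconds become seconds by dividing by 1000). This is a sketch: computeTotalSeconds is an illustrative name, and the sample divisions are made up but follow the shape the component uses.

```javascript
// Sketch of the duration logic from the first useEffect: each division's
// length is the sum of its media durations (given in seconds, converted
// to milliseconds, with a 5000 ms fallback), and the preview runs for as
// long as the longest division.
function computeTotalSeconds(divisions) {
  const divisionDurationsMs = divisions.map((division) =>
    division.imageSrcs.reduce((total, _src, index) => {
      const ms = division.durations[index]
        ? division.durations[index] * 1000 // seconds -> milliseconds
        : 5000; // fallback when a duration is missing
      return total + ms;
    }, 0)
  );
  return Math.floor(Math.max(...divisionDurationsMs) / 1000); // ms -> s
}

// Example: two divisions; the second (4 s + 5 s fallback = 9 s) wins.
const total = computeTotalSeconds([
  { imageSrcs: ["a.mp4", "b.jpg"], durations: [2, 3] }, // 5 s
  { imageSrcs: ["c.mp4", "d.jpg"], durations: [4] },    // 9 s
]);
console.log(total); // 9
```

A helper like this is also easy to unit-test, unlike the same arithmetic buried inside a useEffect.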




I tried several methods to record the content of the div with the class "layout", which contains dynamic media elements such as images and videos. The approaches I attempted include:


MediaRecorder API: I expected it to capture the entire div and its contents, but it didn't handle the rendering of all the dynamic media elements properly.


html2canvas: I used it to capture the layout as a canvas and then tried to convert that into a video stream, but it could not capture smooth transitions between media elements, producing choppy or incomplete video.


RecordRTC: I integrated RecordRTC to capture the div's canvas stream, but despite setting up the recorder, the resulting file was either 0 bytes or captured only parts of the content inconsistently.


FFmpeg and other libraries: I explored these tools hoping for a seamless capture of the dynamic content, but they also failed to capture the full media elements, including videos playing within the layout.


In all cases, I expected to get a complete video recording of the div, including all media transitions, but the results were incomplete or not functional.


Now, I'm seeking an approach or best practice for recording the entire div with its dynamic content and media playback.
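One workable direction is the canvas route done by hand: paint the layout's media elements onto a canvas on a timer, then record that canvas with canvas.captureStream() and MediaRecorder. The sketch below makes several assumptions (recordLayout and its fps/duration options are illustrative, and only direct video/img children are painted); note that cross-origin media will taint the canvas and make recording fail, and CSS effects are not reproduced by drawImage.

```javascript
// Sketch: record a DOM node by painting its <video>/<img> children onto a
// canvas and feeding canvas.captureStream() into MediaRecorder.
// Browser-only APIs; recordLayout and its options are illustrative names.
function recordLayout(layoutEl, { fps = 30, durationMs = 5000 } = {}) {
  const canvas = document.createElement("canvas");
  canvas.width = layoutEl.clientWidth;
  canvas.height = layoutEl.clientHeight;
  const ctx = canvas.getContext("2d");

  // Redraw every media element at its on-screen position on each tick.
  const drawFrame = () => {
    const base = layoutEl.getBoundingClientRect();
    for (const media of layoutEl.querySelectorAll("video, img")) {
      const r = media.getBoundingClientRect();
      ctx.drawImage(media, r.left - base.left, r.top - base.top, r.width, r.height);
    }
  };

  const stream = canvas.captureStream(fps);
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  const chunks = [];
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) chunks.push(e.data);
  };

  return new Promise((resolve) => {
    const timer = setInterval(drawFrame, 1000 / fps);
    recorder.onstop = () => {
      clearInterval(timer);
      // The Blob can be offered for download via URL.createObjectURL.
      resolve(new Blob(chunks, { type: "video/webm" }));
    };
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs);
  });
}
```

A handleDownload like the one in the component could then await recordLayout(document.querySelector(".layout"), { durationMs: totalTime * 1000 }) and save the resulting Blob as a file.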