
Media (3)
-
Example of action buttons for a collaborative collection
27 February 2013
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013
Updated: February 2013
Language: English
Type: Image
-
Collections - quick creation form
19 February 2013
Updated: February 2013
Language: French
Type: Image
Other articles (108)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
Adding user-specific information and other author-related behaviour changes
12 April 2011. The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the "champs extras 2" and "Interface pour champs extras" plugins.
-
Publishing on MédiaSpip
13 June 2013. Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or later. If in doubt, contact the administrator of your MédiaSpip to find out.
Sur d’autres sites (14596)
-
ffmpeg does not produce a duration value when converting a media file; if the media is used in an HTML player, seeking fails in some browsers
4 December 2020, by David
I'm asking this question because I didn't find information on the web, and this seems to be an uncommon issue. After a lot of trial and error, I'm posting this answer for reference.


I'm using Python to run the ffmpeg command that converts an MP4 into a WEBM file. I noticed that some of the converted files wouldn't seek when displayed in the webpage, while other webm files worked on the same server. I was converting from two different machines. The core problem: on one machine the ffmpeg command piped its output during the conversion process, while on the other it wrote directly to a file.


Apparently, piping the output doesn't populate the duration attribute in the file, but if I ran ffmpeg with the same switches in the same order and wrote the output to a file on disk, the duration attribute would be populated.


I was converting from MP4 to WEBM format, using Opus for the audio. This issue may only affect certain container formats, such as WebM, and not others.
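A minimal sketch of the difference (file names and codec choices here are assumptions; the point is only the output target): WebM stores its duration in the header, and ffmpeg can only seek back to fill that in when it is writing to a real file, not a pipe.

```shell
# Piped output: ffmpeg cannot seek back into an already-written stream,
# so the WebM header is left with an unknown duration and browser
# seeking can break.
ffmpeg -i input.mp4 -c:v libvpx-vp9 -c:a libopus -f webm pipe:1 > output.webm

# File output: on close, ffmpeg seeks back and writes the duration,
# so the otherwise-identical command produces a seekable file.
ffmpeg -i input.mp4 -c:v libvpx-vp9 -c:a libopus output.webm

# An already-affected file can usually be repaired by remuxing it to a
# file without re-encoding:
ffmpeg -i output.webm -c copy fixed.webm
```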


-
FFmpeg - how to wait until all blobs are written before finishing the ffmpeg process when receiving them from the MediaRecorder API
7 November 2020, by Caio Nakai
I'm using the MediaRecorder API to record a video of the user's screen and sending the blobs through a WebSocket to a Node.js server. The server uses the blobs to create a webm video file. The video is created fine, but with a problem: after the user clicks the stop-recording button, the MediaRecorder stops, but the server hasn't finished processing all the blobs (at least that's what I think is happening), and when I check the generated video file, the last few seconds of the recording are missing. I wonder if there's a way to solve this. Any help is appreciated :)


This is the front-end code that sends the blobs to the Node.js server:


const startScreenCapture = async () => {
  try {
    let screenStream;
    videoElem = document.getElementById("myscreen");
    screenStream = await navigator.mediaDevices.getDisplayMedia(
      displayMediaOptions
    );

    const recorderOptions = {
      mimeType: "video/webm;codecs=vp9",
      videoBitsPerSecond: 3 * 1024 * 1024,
    };

    screenMediaRecorder = new MediaRecorder(screenStream, recorderOptions);
    screenMediaRecorder.start(1); // timeslice in ms: how often a Blob is emitted
    screenMediaRecorder.ondataavailable = (event) => {
      console.debug("Got blob data:", event.data);
      if (event.data && event.data.size > 0) {
        socket.emit("screen_stream", event.data);
      }
    };

    videoElem.srcObject = screenStream;
  } catch (err) {
    console.error("Error: " + err);
  }
};

const stopCapture = (evt) => {
  let tracks = videoElem.srcObject.getTracks();

  tracks.forEach((track) => track.stop());
  videoElem.srcObject = null;
  screenMediaRecorder.stop();
  socket.emit("stop_screen");
  socket.close();
};



This is the Node.js back end that handles the blobs and generates the video file:


const ffmpeg2 = child_process.spawn("ffmpeg", [
  "-i",
  "-",
  "-c:v",
  "copy",
  "-c:a",
  "copy",
  "screen.webm",
]);

socket.on("screen_stream", (msg) => {
  console.log("Writing screen blob!");
  ffmpeg2.stdin.write(msg);
});

socket.on("stop_screen", () => {
  console.log("Stop recording..");
});



-
How to stream RTMP to Azure Media Services?
26 October 2020, by Abbas
I'm trying to stream my camera to an Azure Media Services LiveEvent. I'm using the Media Services REST API to obtain the ingest URL; however, the docs don't mention how to stream RTMP from an Android phone.


So far I've tried quite a few Android RTMP publishing libraries available on GitHub, but each one of them fails to establish a connection. The list of libraries I've tried so far:


- https://github.com/TakuSemba/RtmpPublisher (fails inside the NDK while opening a connection)
- https://github.com/pedroSG94/rtmp-rtsp-stream-client-java (fails while expecting to receive a header from the ingest URL)
- Several others, all exhibiting one of the two behaviours above.

I've also tried streaming from an MP4 video file using ffmpeg, inspired by this SO answer:


ffmpeg -re -i video.mp4 -vcodec libx264 -profile:v main -preset:v medium -r 30 -g 60 -keyint_min 60 -sc_threshold 0 -b:v 2500k -maxrate 2500k -bufsize 2500k -filter:v scale="trunc(oh*a/2)*2:720" -sws_flags lanczos+accurate_rnd -acodec aac -b:a 96k -ar 48000 -ac 2 -f flv rtmp://



But I'm getting:


rtmp://: I/O error



Am I missing something?


Is it even possible at all to stream to an ingest URL without a middle tier, which many Azure people suggest is the way to go?


Edit: I've successfully streamed to YouTube Live using two of the RTMP libraries, so I'm now fairly sure the issue is not with the RTMP streaming libraries but with how Azure Live Streaming works. I'm definitely missing a step here.