
Media (39)
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
ED-ME-5 1-DVD
11 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
1,000,000
27 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Corona Radiata
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (108)
-
Updating from version 0.1 to 0.2
24 June 2013, by
An explanation of the notable changes made when moving from version 0.1 of MediaSPIP to version 0.3. What's new?
Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Customizing by adding your logo, banner or background image
5 September 2013, by
Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013, by
Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News item creation form: for a document of type "news item", the fields offered by default are: publication date (customize the publication date) (...)
On other sites (15700)
-
How to work with data received from streaming services in my Java application?
24 November 2020, by gabriel garcia
I'm currently trying to develop a "streaming client" as a way to organize multiple streaming services (twitch, yt, mitele...) in a single desktop application written in Java.


It basically relies on streamlink (which in turn relies on ffmpeg) for all of its features, so my project could be described as a frontend for streamlink.


Straight to the point: one of the features I'd like to add is the option to programmatically record streams in the background and show the video stream to the user when it is requested. Since the user may also want to watch the stream without recording it, I have to work with the raw byte data sent by those streaming sources.


So the problem is basically that I do not know much about video encoding/decoding/muxing/demuxing, nor about video theory such as container structure and video formats.


But the idea is to work with all the data sent from the stream source (let's say twitch, for example), read these bytes (I'm not sure what kind of information is sent to the client, nor in what format) from the java.lang.Process's stdout, and then present it to the client.

Here's another problem: I don't know how to play video streams in JavaFX, and I don't think it's even supported right now. So I would have to extract each frame and its associated sound from stdout and show them to the user each time a new frame is received (oops, yet another problem, since I don't know where each frame starts/ends while reading stdout line by line).

In summary:

- What kind of data am I receiving from the streaming source?
- How can I know when each frame starts/stops?
- How can I extract the image and sound from each frame?








I hope I'm not asking too much and that you could shed some light upon my darkness.


-
Workflow and data format for sending MediaRecorder output to express server
30 April 2021, by Max
I've been trying to figure this out for a while but got lost between the different ways of sending files and the different data formats.


I am recording the stream of a canvas animation with MediaRecorder. As far as I understand, this returns a Blob with the video in binary format. Now I want to send this data to my Express server and convert it to an H.264-encoded MP4 file. My first impulse was to use ffmpeg on the server. Unfortunately, I'm struggling with the details of the implementation: I am unsure how best to transmit the data, in what format, and how to feed it to ffmpeg.


This is what I have on the client side:


// Get the stream from the canvas element
const stream = element.captureStream(30)

// Create a media recorder for the stream
const recorder = new MediaRecorder(stream)

// Upload the recorded data to the server
recorder.ondataavailable = ({ data }) => {
  const formData = new FormData()
  formData.append("file", data)

  const options = {
    method: "POST",
    body: formData,
  }

  fetch("http://localhost:3001/api/blob_to_mp4", options).then((res) => {
    console.log(res)
  })
}



And this is what I have on the server side:


"use strict";

const express = require("express");
const cors = require("cors");
const ffmpeg = require("fluent-ffmpeg");
const fs = require("fs");

const port = process.env.PORT || 3001;
const app = express();
var command = ffmpeg();

app.use(cors());
app.use(express.urlencoded({ extended: true }));
app.use(express.json());

app.post("/api/blob_to_mp4", function (req, res) {
 var data = Buffer.from("");

 // Add data
 req.on("data", function (chunk) {
 data = Buffer.concat([data, chunk]);
 });

 // Full data available
 req.on("end", () => {
 req.rawBody = data;
 });

 res.send("hello world");
});

app.listen(port);
console.log(`Server running on port ${port}`);
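
A minimal sketch of one way the uploaded data might be handed to ffmpeg on the server, assuming multer (not part of the original code) is added to parse the multipart "file" field sent by the client, and that writing the blob to a temporary file first is acceptable; the output path and JSON response are illustrative only:

"use strict";

const express = require("express");
const multer = require("multer");
const ffmpeg = require("fluent-ffmpeg");
const os = require("os");
const path = require("path");

const app = express();
// multer (assumed here) stores the uploaded "file" field as a temporary file
const upload = multer({ dest: os.tmpdir() });

app.post("/api/blob_to_mp4", upload.single("file"), (req, res) => {
  // MediaRecorder typically produces WebM; transcode the temporary file to an
  // H.264 MP4 and answer the request once the conversion has finished.
  // Requires the ffmpeg binary to be installed on the server.
  const outputPath = path.join(os.tmpdir(), `${req.file.filename}.mp4`);

  ffmpeg(req.file.path)
    .videoCodec("libx264")
    .on("end", () => res.json({ mp4: outputPath }))
    .on("error", (err) => res.status(500).send(err.message))
    .save(outputPath);
});

app.listen(3001);

Whether the request should wait for the conversion like this, or the transcode should run in the background with the client notified later, is a design choice that depends on how long the recordings are expected to be.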