
Media (17)
All items: published 15 September 2011, updated September 2011, language English, type Audio.
- Matmos - Action at a Distance
- DJ Dolores - Oslodum 2004 (includes (cc) sample of “Oslodum” by Gilberto Gil)
- Danger Mouse & Jemini - What U Sittin’ On? (starring Cee Lo and Tha Alkaholiks)
- Cornelius - Wataridori 2
- The Rapture - Sister Saviour (Blackstrobe Remix)
- Chuck D with Fine Arts Militia - No Meaning No
Other articles (54)
-
Improving the basic version
13 September 2013
A nicer multiple select
The Chosen plugin improves the usability of multiple-select fields. See the two images below to compare.
All it takes is to enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)
Custom menus
14 November 2010
MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
This lets channel administrators fine-tune the configuration of these menus.
Menus created when the site is initialised
By default, three menus are created automatically when the site is initialised: the main menu; identifier: barrenav; this menu is generally inserted at the top of the page after the header block, and its identifier makes it compatible with Zpip-based templates; (...)
The plugin: Podcasts
14 July 2010
The podcasting problem is, once again, a problem that reveals how data transport on the Internet gets standardised.
Two interesting formats exist: the one developed by Apple, heavily geared towards iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "free", notably backed by Yahoo and the Miro software.
File types supported in feeds
Apple's format only allows the following formats in its feeds: .mp3 (audio/mpeg), .m4a (audio/x-m4a), .mp4 (...)
On other sites (9373)
-
OpenGL and ffmpeg: making a video with a stable fps
27 August 2022, by Turgut
I've made a program that takes multiple videos as input, has ffmpeg decode them, sends the frames to OpenGL, creates a window using GLFW, and draws those videos as textures on the screen (editing the textures along the way). I then read the screen back using glReadPixels so that ffmpeg can encode it: I send the read frames to the encoder and it encodes them. I specify the fps at start-up, but the problem is that the output video is faster than it's supposed to be. Now I can do something like this:

// Sleep until this frame's presentation time is reached
double pt_in_seconds = pts * (double)time_base.num / (double)time_base.den;
while (pt_in_seconds > glfwGetTime()) {
 glfwWaitEventsTimeout(pt_in_seconds - glfwGetTime());
}



But the problem with this approach is that it makes the run time really long: if I input a one-hour video, I have to wait for one hour. If I don't use this snippet, the program generates the output as fast as it can, but, as I said, the output video is then faster than it's supposed to be. What's shown in the GLFW window is irrelevant; it's hidden anyway, and it's only there to manipulate/merge the input videos.


Is there a better way to get ffmpeg to stabilise the timing of the encoded output? At the end of the day GLFW just displays the decoded videos, since decoding and encoding both happen in the same iteration.


It looks roughly like this:


...
while(true)
{
 // The actual program reads every input from a vector here, but since
 // the full program is really long, this is just a simplified representation.
 uint8_t* decoded_data = decoder.decode_one_frame();
 
 // draw_frame_on_screen returns glReadPixels result.
 uint8_t* screen_data = opengl_engine.draw_frame_on_screen(decoded_data);

 encoder.encode_one_frame(screen_data);
}



The encoder is essentially muxing.c from ffmpeg's official examples; I've just removed the dummy image and fed my screen_data in as input.
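For what it's worth, a muxing.c-style encoder keeps the playback speed of the output independent of how fast frames are generated by stamping each frame with a monotonically increasing pts against a 1/fps time_base, rather than by pacing the render loop against the wall clock. A minimal sketch of that idea, not the program's actual code (fps, codec_ctx, and the two helpers are illustrative names):

// Fix the output rate by assigning pts manually; how fast the render
// loop runs then no longer affects how fast the encoded video plays.
codec_ctx->time_base = AVRational{1, fps};  // one tick per frame
codec_ctx->framerate = AVRational{fps, 1};

int64_t frame_index = 0;
while (AVFrame *frame = next_rendered_frame()) {  // hypothetical frame source
    frame->pts = frame_index++;                   // pts counted in 1/fps ticks
    avcodec_send_frame(codec_ctx, frame);
    drain_packets(codec_ctx);                     // hypothetical: receive and mux packets
}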


Using Ubuntu, GLFW, GLAD, and ffmpeg.


-
Qt6.4.1 QProcess cannot call external program FFmpeg [closed]
21 December 2022, by Xingchen
Based on the official Qt example, I slightly modified the code to call the external ffmpeg.exe for transcoding operations:


QProcess *p = new QProcess(this);
QString program = "C:\\Users\\kyrio\\Documents\\Qt_Project\\build-test-Desktop_Qt_6_4_1_MinGW_64_bit-Debug\\debug";
QStringList arguments;
arguments << "ffmpeg" << "-i" << "C:\\Users\\kyrio\\Videos\\222.mp4" << "C:\\Users\\kyrio\\Videos\\223.mov";
p->start(program, arguments);



Running it produces no result; I tried a variety of ways of writing it, also with no result. The output is empty, and there is no FFmpeg-related process in Task Manager.

When I try calling cmd instead, Task Manager shows the child processes under the new cmd.exe.

The command itself is fine: it can be run successfully in a terminal, though there it needs a ./ or .\ prefix.

I also tried prefixing the arguments with .\ or ./; still no response.

I tried putting the exact path to the ffmpeg.exe file in the program string; still no response.

I want to be able to successfully call ffmpeg.exe so that I can transcode the video.

I have made "program" point exactly to ffmpeg.exe and it is still unresponsive: no process, no errorString output, and no exitCode output either:


QString program = "C:\\Users\\kyrio\\Documents\\Qt_Project\\build-test-Desktop_Qt_6_4_1_MinGW_64_bit-Debug\\debug\\ffmpeg.exe";





I tried starting cmd and connected the errorOccurred signal, but there is no output; it is worth noting that the cmd process does appear in Task Manager.

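For reference, QProcess expects program to be the path of the executable itself, not the directory containing it, and its diagnostics are easiest to see when the signals are connected before start(). A sketch of that shape, using the paths from the question and assuming it runs inside a QObject subclass with <QProcess> and <QDebug> included:

QProcess *p = new QProcess(this);
// Point at the executable itself; "ffmpeg" is then no longer passed as an argument.
QString program = "C:\\Users\\kyrio\\Documents\\Qt_Project\\build-test-Desktop_Qt_6_4_1_MinGW_64_bit-Debug\\debug\\ffmpeg.exe";
QStringList arguments{"-i", "C:\\Users\\kyrio\\Videos\\222.mp4", "C:\\Users\\kyrio\\Videos\\223.mov"};

connect(p, &QProcess::errorOccurred, this, [p](QProcess::ProcessError) {
    qDebug() << "errorOccurred:" << p->errorString();
});
connect(p, &QProcess::readyReadStandardError, this, [p]() {
    qDebug() << p->readAllStandardError();  // ffmpeg writes its log to stderr
});

p->start(program, arguments);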

-
I want to convert a Node.js HTTP->WebSocket relay server to Java with netty and spring-websocket
20 August 2019, by rura6502
I want to rewrite this example (https://github.com/phoboslab/jsmpeg/blob/master/websocket-relay.js) in Java using netty and spring-websocket.
In this Node.js example, the HTTP server receives the media data from FFmpeg and relays it to the WebSocket; a JavaScript library then draws it on an HTML canvas.
But my problem is that when I use netty and spring-websocket, some of the data cannot be read by the JavaScript library and there is a lot of data loss.
Here is what I think is the main part of the Node.js code in the example:
http = require('http'),
WebSocket = require('ws');

// setting websocket server ..............

var streamServer = http.createServer(function(request, response) {
    // ....................
    response.connection.setTimeout(0);
    request.on('data', function(data) {
        socketServer.broadcast(data);
        // .....
    });
    // .................
}).listen(STREAM_PORT);

So I have already tried to change this: I took the netty code from the official documentation (https://netty.io/wiki/user-guide-for-4.x.html) and changed the sending part to go through the WebSocket:
// this code is in the channelRead method
ByteBuf buf = (ByteBuf) msg;
try {
    while (buf.isReadable()) { // (1)
        byte[] bytes = new byte[buf.readableBytes()];
        buf.readBytes(bytes);
        WSHandler.wsSessions.stream().forEach(wsSession -> {
            try {
                wsSession.sendMessage(new BinaryMessage(bytes));
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
    }
} finally {
    ReferenceCountUtil.release(msg); // (2)
}

Please tell me what I missed. Help me. Thanks.