
Media (91)
-
Head down (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Echoplex (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Discipline (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Letting you (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
1 000 000 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
999 999 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
Other articles (108)
-
Accepted formats
28 January 2010, by
The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
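Each line of `ffmpeg -codecs` output begins with a six-character flag field (D = decoding supported, E = encoding supported, V/A/S = video/audio/subtitle codec). As an illustration only (this helper is not part of the original article), a small Java sketch that interprets that flag column might look like this:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CodecFlags {
    // Interprets the leading flag field of one `ffmpeg -codecs` line,
    // e.g. " DEV.LS h264  H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10".
    // Position 0: 'D' = decoding supported; 1: 'E' = encoding supported;
    // 2: 'V'/'A'/'S' = video/audio/subtitle codec.
    static Map<String, Object> parse(String line) {
        String trimmed = line.trim();
        String flags = trimmed.substring(0, 6);
        String name = trimmed.substring(6).trim().split("\\s+")[0];
        Map<String, Object> info = new LinkedHashMap<>();
        info.put("name", name);
        info.put("decode", flags.charAt(0) == 'D');
        info.put("encode", flags.charAt(1) == 'E');
        info.put("video", flags.charAt(2) == 'V');
        return info;
    }
}
```

Feeding this parser a line such as the h264 entry from a local build confirms whether that build can both decode and encode the codec in question.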
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
As a first step, we (...)
-
Adding notes and captions to images
7 February 2011, by
To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
-
Media-specific libraries and software
10 December 2010, by
For correct and optimal operation, several things must be taken into consideration.
After installing apache2, mysql and php5, it is important to install other required software whose installation is described in the related links: a set of multimedia libraries (x264, libtheora, libvpx) used for encoding and decoding video and audio, in order to support as many file types as possible (cf. this tutorial); FFMpeg with the maximum number of decoders and (...)
On other sites (9341)
-
Java - RTSP save snapshot from Stream Packets
9 August 2016, by Guerino Rodella
I'm developing an application that requests snapshots from DVRs and IP cameras. The device I'm working with only offers RTSP for this, so I implemented the necessary RTSP methods and started receiving the stream packets over the established UDP connection. My question is: how can I save the received data to a JPEG file? Where are the beginning and end of the image bytes?
I searched for libraries that implement this kind of service in Java, such as Xuggler (which is no longer maintained) and javacpp-presets (which bundles the ffmpeg and opencv libraries), but I had some environment problems with the latter. If anyone knows an easy, solid one that saves snapshots from streams, let me know.
My code:
final long timeout = System.currentTimeMillis() + 3000;
byte[] fullImage = new byte[ 1024 * 1024 ];
DatagramSocket udpSocket = new DatagramSocket( 8000 );
int lastByte = 0;

// Skip the first 2 packets because I think they are HEADERS.
// Since I don't know what they mean, I just print them in hex.
for( int i = 0; i < 2; i++ ){
    byte[] buffer = new byte[ 1024 ];
    DatagramPacket dataPacket = new DatagramPacket( buffer, buffer.length );
    udpSocket.receive( dataPacket );
    int dataLength = dataPacket.getLength();
    buffer = Arrays.copyOf( buffer, dataLength );
    System.out.println( "RECEIVED[" + DatatypeConverter.printHexBinary( buffer ) + " L: " + dataLength + "]" );
}

do{
    byte[] buffer = new byte[ 1024 ];
    DatagramPacket dataPacket = new DatagramPacket( fullImage, fullImage.length );
    udpSocket.receive( dataPacket );
    System.out.println( "RECEIVED: " + new String( fullImage ) );
    for( int i = 0; i < buffer.length; i++ ){
        fullImage[ i + lastByte ] = buffer[ i ];
        lastByte++;
    }
} while( System.currentTimeMillis() < timeout );
// I know this timeout is wrong; I should stop after getting the full image bytes.

The output:
RECEIVED : 80E0000100004650000000006742E01FDA014016C4 L : 21
RECEIVED : 80E00002000046500000000068CE30A480 L : 17
RECEIVED : Tons of data from the streaming...
RECEIVED : Tons of data from the streaming...
RECEIVED : Tons of data from the streaming...
[...]
As you might suppose, the image I'm saving to a file is not readable, because I'm doing it wrong. I think the headers give me some info about the packets the server will send, telling me the start and end of the image within the stream, but I didn't understand them. Does anyone know how to solve this? Any tips are welcome!
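For what it's worth, the two short packets printed above look like RTP packets (RFC 3550): `80` is RTP version 2 with no CSRC entries, the second byte encodes payload type 96 (dynamic, commonly H.264), and payloads beginning `67…` and `68…` are H.264 SPS and PPS parameter sets rather than image data. A hedged sketch of the first step, stripping the 12-byte RTP header to reach the payload (the `payload` helper is hypothetical, not from the question), could look like this:

```java
import java.util.Arrays;

public class RtpPacket {
    // Strips the fixed 12-byte RTP header (RFC 3550) plus any CSRC
    // entries and returns the payload. For H.264 over RTP (RFC 6184)
    // the payload is a NAL unit (or a fragment of one, e.g. FU-A),
    // not a ready-to-save JPEG: the NAL units must be depacketized
    // and decoded into a full frame before a snapshot can be written.
    static byte[] payload(byte[] datagram) {
        int version = (datagram[0] & 0xC0) >> 6;   // should be 2
        if (version != 2) {
            throw new IllegalArgumentException("not an RTP v2 packet");
        }
        int csrcCount = datagram[0] & 0x0F;        // CC field
        int headerLen = 12 + 4 * csrcCount;
        return Arrays.copyOfRange(datagram, headerLen, datagram.length);
    }
}
```

Applied to the first printed packet, this would yield the `6742…` bytes, i.e. the SPS, which is configuration data the decoder needs rather than part of any single image.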
-
Recording a webpage stream with multiple requests using PhantomJS & ffmpeg to /dev/stdout leads to ffmpeg error
2 September 2016, by Allisson Ferreira
First of all, sorry for my English.
I've been on a quest for days. I've researched everywhere and couldn't find an answer to my problem.
I'm using Node.js, PhantomJS and ffmpeg in this scenario:
- A user enters the site, logs in with Facebook, and can ask for a video with his name and some random photos (gathered from /me/ and sent via JSON POST);
- Node receives the user data, creates a child process (PhantomJS + ffmpeg) and waits for a response, then sends the video URL to the user.
When I run a single instance of this request, everything works fine. BUT when two or more users make the request, only one video is produced and the other processes end with an ffmpeg stream error.
I think the reason is that all the ffmpeg processes are using the same place (/dev/stdout). Since one process is already using it, the others hit a "can't access" error. But that is an assumption; I don't know how /dev/stdout really works.
Here is my code. (I have removed some lines and renamed some variables for better understanding; sorry for any mistakes.)
index.js:
var generateVideo = 'phantomjs phantom.js ' + videoID + ' ' + userID + ' | ffmpeg -vcodec png -f image2pipe -r 30 -i - -pix_fmt yuv420p public/videos/' + userID + '/' + videoID + '.mp4 -y';
childProcess.exec(generateVideo, function(err, stdout, stderr) {
    var json = {};
    json.video = '/videos/' + userID + '/' + videoID + '.mp4';
    res.send(json);
});

phantom.js:
var page = require('webpage').create();
page.viewportSize = { width: 1366, height: 768 };
page.settings.resourceTimeout = 10000;

var args = require('system').args;
var videoID = args[1];
var userID = args[2];

page.open('http://localhost:3000/recordvideo/' + videoID, 'post', function(status){
    var frame = 0;
    var target_fps = 30;
    var maxframes = page.evaluate(function () {
        return getTotalDurationInSeconds();
    }) * target_fps;

    setInterval(function(){
        page.render('/dev/stdout', { format: "png" });
        if( frame >= maxframes ){
            phantom.exit();
        }
        frame++;
    }, (1000 / target_fps));
});

And the error:
[Error: Command failed: /bin/sh -c phantomjs phantom.js XXXXXXXX XXXXXXXX | ffmpeg -vcodec png -f image2pipe -r 30 -i - -pix_fmt yuv420p public/videos/XXXXXXXX/XXXXXXXX.mp4 -y
www-0 ffmpeg version N-80901-gfebc862 Copyright (c) 2000-2016 the FFmpeg developers
www-0 built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
www-0 configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab
www-0 libavutil 55. 28.100 / 55. 28.100
www-0 libavcodec 57. 48.101 / 57. 48.101
www-0 libavformat 57. 41.100 / 57. 41.100
www-0 libavdevice 57. 0.102 / 57. 0.102
www-0 libavfilter 6. 47.100 / 6. 47.100
www-0 libavresample 3. 0. 0 / 3. 0. 0
www-0 libswscale 4. 1.100 / 4. 1.100
www-0 libswresample 2. 1.100 / 2. 1.100
www-0 libpostproc 54. 0.100 / 54. 0.100
www-0 [png @ 0x3d7c4a0] Invalid PNG signature 0x46726F6D20506861.
www-0 [image2pipe @ 0x3d72780] decoding for stream 0 failed
www-0 [image2pipe @ 0x3d72780] Could not find codec parameters for stream 0 (Video: png, none(pc)): unspecified size
www-0 Consider increasing the value for the 'analyzeduration' and 'probesize' options
www-0 Input #0, image2pipe, from 'pipe:':
www-0 Duration: N/A, bitrate: N/A
www-0 Stream #0:0: Video: png, none(pc), 30 tbr, 30 tbn, 30 tbc
www-0 [buffer @ 0x3d81540] Unable to parse option value "0x0" as image size
www-0 [buffer @ 0x3d81540] Unable to parse option value "-1" as pixel format
www-0 [buffer @ 0x3d81540] Unable to parse option value "0x0" as image size
www-0 [buffer @ 0x3d81540] Error setting option video_size to value 0x0.
www-0 [graph 0 input from stream 0:0 @ 0x3d72600] Error applying options to the filter.
www-0 Error opening filters!
www-0 ]
I really hope I can find an answer here!
And sorry if there is already an answer for this, but I researched for days. Thank you in advance!
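Regarding the guess about /dev/stdout above: inside each process, /dev/stdout refers to that process's own file descriptor 1, and the pipe created by `phantomjs … | ffmpeg …` is anonymous and private to that one pipeline, so concurrent pipelines do not normally share a stream. That private-pipe behaviour can be sketched as follows (Java is used here purely for illustration; it is not part of the question's Node setup):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.List;

public class PipelineDemo {
    // Starts `echo hi | tr a-z A-Z` as a two-stage pipeline.
    // ProcessBuilder.startPipeline (Java 9+) wires each stage's stdout
    // to the next stage's stdin with a fresh anonymous pipe, so two
    // pipelines running at the same time never share a stream.
    static String runPipeline() throws Exception {
        List<Process> stages = ProcessBuilder.startPipeline(List.of(
                new ProcessBuilder("echo", "hi"),
                new ProcessBuilder("tr", "a-z", "A-Z")));
        Process last = stages.get(stages.size() - 1);
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(last.getInputStream()))) {
            String line = out.readLine();
            last.waitFor();
            return line;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runPipeline()); // prints HI
    }
}
```

Running two such pipelines concurrently yields two independent results, which suggests the error in the question stems from something other than shared pipes between the pipelines themselves.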
-
Streaming video with node
14 September 2016, by Kei Taylor
I am attempting to stream video from my built-in Mac OS X webcam to a Node server. My intent is to parse the video data I receive with ffmpeg once it reaches the server. This is my first time trying to manipulate video.
My problem is that right now I am unable to use VLC to send data to a Node server. I am opening a stream with VLC using the "Open Network" option and streaming through what I think is my local IP, on port 3000. However, I am not sure how to get from that to opening the stream in a Node file. I am also not able to open the stream on the same computer and view it (when I use VLC's open-stream option with my local IP and port 3000, I can't view the stream I'm sending out).
Clearly, I am doing something wrong. As this is my first experience with VLC and video transmission, it's possible I'm missing something important. My impression is that I should be able to stream to an IP and port using VLC, then set up a Node server that listens on that same IP and port, receives chunks of video data, and formats them using ffmpeg.
My questions are:
1) Is this understanding accurate?
2) Does anyone have guidance on how I would transmit with VLC and read with ffmpeg in Node (and then send to another client)?
3) Failing that, any guidance on the simpler question of how to transmit a video stream using VLC and then open it with VLC on the same computer?
4) Any resources explaining how to do this?
I'd really appreciate layman's explanations. I've found a few blog posts that have illuminated some issues, but they've been a bit difficult to follow.
Thanks, as always.