
Media (16)
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#4 Emo Creates
15 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#2 Typewriter Dance
15 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (39)
-
The SPIPmotion queue
28 November 2010, by
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.
-
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers - including:
critique of existing features and functions
articles contributed by developers, administrators, content producers and editors
screenshots to illustrate the above
translations of existing documentation into other languages
To contribute, register to the project users' mailing (...)
On other sites (5202)
-
How to set time_base when muxing AVI with libavformat?
18 May 2012, by Anastasia
In my application I receive synchronized video (mpeg4) and audio (mp3) from a remote server and mux them into an avi file. The video comes frame by frame, and the audio comes in small chunks, say 200-250 ms. Both video frames and audio chunks have timestamps in milliseconds.
My question is how to set the audio time_base and pts correctly.
For video I set time_base.num = framerate, time_base.den = 1; and calculate pts as follows:
AVRational time_base_1kHz;
time_base_1kHz.num = 1;
time_base_1kHz.den = 1000;
packet.pts = av_rescale_q(timeStamp - baseTimeStamp_, time_base_1kHz, videoStream_->time_base);
where baseTimeStamp_ is the first timestamp of the stream.
But if I try to do a similar calculation for the audio, I don't get a playable avi. If I always set the audio pts to AV_NOPTS_VALUE, the avi has playable video but no audio.
So what are the correct values for the audio time_base and pts? And do I set the time_base and pts for the video correctly?
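A note on the question above (not part of the original post): a common approach is to rescale the millisecond audio timestamps into whatever time_base the audio stream actually ends up with, exactly as the question already does for video. The sketch below is only an illustration; audioStream_, timeStamp and baseTimeStamp_ are assumed to mirror the names used in the question.

#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>
#include <stdint.h>

/* Sketch only: rescale a millisecond timestamp into the audio stream's
 * time_base before handing the packet to av_interleaved_write_frame(). */
static void set_audio_pts(AVPacket *packet, AVStream *audioStream_,
                          int64_t timeStamp, int64_t baseTimeStamp_)
{
    AVRational time_base_1kHz = { 1, 1000 };   /* incoming timestamps are in ms */

    /* Rescale against the stream's actual time_base, read back from the
     * stream, since the muxer may adjust it in avformat_write_header(). */
    packet->pts = av_rescale_q(timeStamp - baseTimeStamp_,
                               time_base_1kHz, audioStream_->time_base);
    packet->dts = packet->pts;                 /* mp3 packets are not reordered */
    packet->stream_index = audioStream_->index;
}

If the container still rejects the timestamps, it is worth inspecting audioStream_->time_base after avformat_write_header(), because the muxer is free to change the value requested when the stream was created.
-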
SOLVED - Compiling FFMPEG on Windows with Cygwin and NDK r5
19 May 2012, by protectedmember
This isn't a question - it's an answer for all of you who have been facing the same problems as I have. I've been trying to compile this thing for a while now and I know of the numerous posts floating around the internet offering help. I have read and tried most of the suggestions and wanted to collate my success into this single post for others to benefit from.
Since I don't have a blog, I thought it wouldn't hurt to post on here instead.
I have managed to compile FFMPEG 0.10.3 (Freedom) on Windows 7 (32 bit) using NDK r5 and Cygwin. The steps:
1 - Download/install Cygwin in the root of your C drive. I'm not going to give instructions on this, it's simple enough and there are plenty of tutorials on this.
2 - Download NDK r5 from here and extract to the root of your C drive.
3 - Download FFMPEG 0.10.3 from here and extract to the root of your C drive.
4 - Open the file 'configure' in the root of the FFMPEG directory in a text editor.
5 - Comment out lines 2073, 2074 and 2075.
6 - Below 2075, add the following line:
TMPDIR=c:/cygwin/tmp
7 - Download this script (thank you roman10) and place it inside your FFMPEG root directory. Rename the file to
build_android.sh
8 - Open the script in a text editor and edit line 17 to read
c:/android-ndk-r5
9 - Click start > run and type "bash" (without the speech marks) and press enter.
10 - Type the following and press enter:
cd /cygdrive/c/ffmpeg-0.10.3
11 - Type the following and press enter:
./build_android.sh
12 - Sit back and wait... libffmpeg.so will soon appear in your "c:\ffmpeg-0.10.3\android\" directory (the exact output location is defined at the bottom of the script from roman10's blog). The default architecture is armv7-a.
The script from roman10's blog actually compiles quite a large shared object (.so) file; the compiler flags in the script can be adjusted to suit your needs.
I hope this helps,
P.
-
How to send MPEGTS streams over UDP
24 December 2015, by pAkY88
I am developing a real-time video-streaming system which is basically composed of a server and several clients.
For now, let's ignore how packets are forwarded between the server and the clients, and focus just on how the server can send an MPEGTS stream over UDP packets.
The stream is encoded in MPEGTS format.
What I'm trying to do is to read some packets (the main question is "how many?") and encapsulate them in UDP packets. The destination (a client) reads these UDP packets and then forwards them to VLC, which is able to play MPEGTS network streams by reading UDP packets.
If I send only video packets, everything works fine; however, if I try to encapsulate both some video packets and some audio packets in the same UDP packet, VLC is not able to decode and play the stream.
I read somewhere that each UDP packet should contain 7 TS packets, but unfortunately even when I comply with this rule, VLC doesn't decode the stream correctly.
Here is a sample of my program's code: http://pastebin.com/evMi6FkY
How should I encapsulate MPEGTS packets in UDP packets?
Thanks!
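A note on the question above (not part of the original post): the usual practice is to send the muxed transport stream exactly as it comes, audio and video TS packets interleaved in mux order, grouping whole 188-byte packets into datagrams of 7 (7 x 188 = 1316 bytes, which fits a standard 1500-byte MTU) and never splitting a TS packet across two datagrams. A minimal sketch using POSIX sockets, in which the input file name and destination address are placeholders:

#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

#define TS_PACKET_SIZE  188
#define TS_PER_DATAGRAM 7            /* 7 * 188 = 1316 bytes per UDP payload */

int main(void)
{
    FILE *in = fopen("stream.ts", "rb");       /* muxed TS: audio and video together */
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in dst;

    if (!in || sock < 0)
        return 1;

    memset(&dst, 0, sizeof dst);
    dst.sin_family = AF_INET;
    dst.sin_port = htons(1234);
    inet_pton(AF_INET, "127.0.0.1", &dst.sin_addr);

    unsigned char buf[TS_PER_DATAGRAM * TS_PACKET_SIZE];
    size_t got;
    /* Read whole TS packets and send them in mux order, up to 7 per datagram;
     * a short final read still sends only complete 188-byte packets. */
    while ((got = fread(buf, TS_PACKET_SIZE, TS_PER_DATAGRAM, in)) > 0)
        sendto(sock, buf, got * TS_PACKET_SIZE, 0,
               (struct sockaddr *)&dst, sizeof dst);

    fclose(in);
    close(sock);
    return 0;
}

With this framing VLC can play the stream directly from udp://@:1234, and a relaying client can simply forward each received datagram unchanged.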