
Media (9)
-
Stereo master soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011
Updated: February 2013
Language: English
Type: Audio
Other articles (39)
-
What is an editorial?
21 June 2013 — Write your point of view in an article. It will be filed in a section set aside for that purpose.
An editorial is a text-only article. Its purpose is to gather points of view in a dedicated section. A single editorial is featured on the home page; to read earlier ones, consult the dedicated section.
You can customize the form used to create an editorial.
Editorial creation form: in the case of an editorial-type document, the (...) -
HTML5 audio and video support
10 April 2011 — MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...) -
HTML5 audio and video support
13 April 2011 — MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The HTML5 player has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
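As an aside, here is a minimal, hypothetical markup sketch of the pattern described above (not MediaSPIP's actual player code): browsers that support HTML5 pick one of the source entries, while older ones fall back to the nested Flowplayer Flash object; file names and dimensions are placeholders.
<video controls width="640" height="360">
  <source src="movie.mp4" type="video/mp4" />
  <source src="movie.webm" type="video/webm" />
  <!-- Flash fallback for browsers without HTML5 video support -->
  <object type="application/x-shockwave-flash" data="flowplayer.swf" width="640" height="360">
    <param name="movie" value="flowplayer.swf" />
    <param name="flashvars" value='config={"clip":{"url":"movie.mp4"}}' />
  </object>
</video>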
On other sites (5704)
-
Separate Audio and Video in Android
13 February 2014, by Manoj — I am a beginner in Android application development. The aim of my project is to separate the audio and the video from a video file. After searching the internet, I learned that this can be done with FFmpeg.
ffmpeg -i input.mkv # show stream numbers and formats
ffmpeg -i input.mkv -c copy audio.m4a # AAC
ffmpeg -i input.mkv -c copy audio.mp3 # MP3
ffmpeg -i input.mkv -c copy audio.ac3 # AC3
ffmpeg -i input.mkv -an -c copy video.mkv
ffmpeg -i input.mkv -map 0:1 -c copy audio.m4a # stream 1
The separated video file shouldn't contain any audio; it must contain only the video stream.
Is this possible with ffmpeg?
Is there any other alternative?
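For reference, a minimal sketch of one way to do this with explicit stream selection (the file names are only placeholders): -map picks exactly the streams to keep, so the video-only output is guaranteed to carry no audio track, and -c copy avoids re-encoding.
ffmpeg -i input.mkv -map 0:v -an -c copy video_only.mkv # keep only the video stream(s), drop audio
ffmpeg -i input.mkv -map 0:a:0 -vn -c copy audio_only.m4a # keep only the first audio stream (the extension must match its codec, here AAC)
-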
How to properly open libvpx for encoding via avcodec_open2()?
10 June 2014, by Edison — With a valid AVCodec and a valid AVCodecContext, calling avcodec_open2() returns EINVAL (-22, Invalid argument). It turns out the EINVAL is triggered by
ret = avctx->codec->init(avctx);
if (ret < 0) {
    goto free_and_end;
}
inside avcodec_open2().
The same code that I have right now can open mpeg4 and h264 just fine.
Are there any special option parameters that have to be set (e.g. for h263, the image size has to be a multiple of certain numbers) to open the libvpx codec?
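For what it's worth, a frequent cause of EINVAL at this point is a codec context that still lacks the fields every encoder needs before avcodec_open2(). The following is a hypothetical sketch of opening the VP8 encoder with those fields set; the function name and the concrete values are illustrative, not a guaranteed fix for this particular failure.
#include <libavcodec/avcodec.h>
/* Minimal sketch: open the libvpx VP8 encoder. width, height, pix_fmt and
 * time_base have to be filled in before avcodec_open2(); the values below
 * are purely illustrative. */
static AVCodecContext *open_vp8_encoder(int width, int height, int fps)
{
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_VP8);
    if (!codec)
        return NULL;                        /* libvpx encoder not available */
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return NULL;
    ctx->width     = width;                 /* frame size in pixels */
    ctx->height    = height;
    ctx->pix_fmt   = AV_PIX_FMT_YUV420P;    /* planar YUV input */
    ctx->time_base = (AVRational){1, fps};  /* e.g. 1/30 for 30 fps */
    ctx->bit_rate  = 400000;                /* example target bitrate */
    if (avcodec_open2(ctx, codec, NULL) < 0) {
        avcodec_free_context(&ctx);         /* open failed, clean up */
        return NULL;
    }
    return ctx;
}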
-
Jitsi and ffplay
15 June 2014, by Kotkot — I'm playing with Jitsi. I took the examples from the source code and modified them a bit.
Here is what I've got.
I am trying to play the transmitted stream in VLC, ffplay, or any other player,
but I cannot. I use these application parameters to run the code:
--local-port-base=5000 --remote-host=localhost --remote-port-base=10000
What am I doing wrong?
package com.company;
/*
* Jitsi, the OpenSource Java VoIP and Instant Messaging client.
*
* Distributable under LGPL license.
* See terms of license at gnu.org.
*/
import org.jitsi.service.libjitsi.LibJitsi;
import org.jitsi.service.neomedia.*;
import org.jitsi.service.neomedia.device.MediaDevice;
import org.jitsi.service.neomedia.format.MediaFormat;
import org.jitsi.service.neomedia.format.MediaFormatFactory;
import java.io.PrintStream;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.util.HashMap;
import java.util.Map;
/**
* Implements an example application in the fashion of JMF's AVTransmit2 example
* which demonstrates the use of the <tt>libjitsi</tt> library for the purposes
* of transmitting audio and video via RTP means.
*
* @author Lyubomir Marinov
*/
public class VideoTransmitter {
/**
* The port which is the source of the transmission i.e. from which the
* media is to be transmitted.
*
* @see #LOCAL_PORT_BASE_ARG_NAME
*/
private int localPortBase;
/**
* The <tt>MediaStream</tt> instances initialized by this instance indexed
* by their respective <tt>MediaType</tt> ordinal.
*/
private MediaStream[] mediaStreams;
/**
* The <tt>InetAddress</tt> of the host which is the target of the
* transmission i.e. to which the media is to be transmitted.
*
* @see #REMOTE_HOST_ARG_NAME
*/
private InetAddress remoteAddr;
/**
* The port which is the target of the transmission i.e. to which the media
* is to be transmitted.
*
* @see #REMOTE_PORT_BASE_ARG_NAME
*/
private int remotePortBase;
/**
* Initializes a new <tt>AVTransmit2</tt> instance which is to transmit
* audio and video to a specific host and a specific port.
*
* @param localPortBase the port which is the source of the transmission
* i.e. from which the media is to be transmitted
* @param remoteHost the name of the host which is the target of the
* transmission i.e. to which the media is to be transmitted
* @param remotePortBase the port which is the target of the transmission
* i.e. to which the media is to be transmitted
* @throws Exception if any error arises during the parsing of the specified
* <tt>localPortBase</tt>, <tt>remoteHost</tt> and <tt>remotePortBase</tt>
*/
private VideoTransmitter(
String localPortBase,
String remoteHost, String remotePortBase)
throws Exception {
this.localPortBase
= (localPortBase == null)
? -1
: Integer.valueOf(localPortBase).intValue();
this.remoteAddr = InetAddress.getByName(remoteHost);
this.remotePortBase = Integer.valueOf(remotePortBase).intValue();
}
/**
* Starts the transmission. Returns null if transmission started ok.
* Otherwise it returns a string with the reason why the setup failed.
*/
private String start()
throws Exception {
/*
* Prepare for the start of the transmission i.e. initialize the
* MediaStream instances.
*/
MediaType[] mediaTypes = MediaType.values();
MediaService mediaService = LibJitsi.getMediaService();
int localPort = localPortBase;
int remotePort = remotePortBase;
mediaStreams = new MediaStream[mediaTypes.length];
for (MediaType mediaType : mediaTypes) {
if(mediaType != MediaType.VIDEO) continue;
/*
* The default MediaDevice (for a specific MediaType) is configured
* (by the user of the application via some sort of UI) into the
* ConfigurationService. If there is no ConfigurationService
* instance known to LibJitsi, the first available MediaDevice of
* the specified MediaType will be chosen by MediaService.
*/
MediaDevice device
= mediaService.getMediaDeviceForPartialDesktopStreaming(100,100,100,100);
if (device == null) {
continue;
}
MediaStream mediaStream = mediaService.createMediaStream(device);
// direction
/*
* The AVTransmit2 example sends only and the AVReceive2 receives
* only. In a call, the MediaStream's direction will most commonly
* be set to SENDRECV.
*/
mediaStream.setDirection(MediaDirection.SENDONLY);
// format
String encoding;
double clockRate;
/*
* The AVTransmit2 and AVReceive2 examples use the H.264 video
* codec. Its RTP transmission has no static RTP payload type number
* assigned.
*/
byte dynamicRTPPayloadType;
switch (device.getMediaType()) {
case AUDIO:
encoding = "PCMU";
clockRate = 8000;
/* PCMU has a static RTP payload type number assigned. */
dynamicRTPPayloadType = -1;
break;
case VIDEO:
encoding = "H264";
clockRate = MediaFormatFactory.CLOCK_RATE_NOT_SPECIFIED;
/*
* The dynamic RTP payload type numbers are usually negotiated
* in the signaling functionality.
*/
dynamicRTPPayloadType = 99;
break;
default:
encoding = null;
clockRate = MediaFormatFactory.CLOCK_RATE_NOT_SPECIFIED;
dynamicRTPPayloadType = -1;
}
if (encoding != null) {
MediaFormat format
= mediaService.getFormatFactory().createMediaFormat(
encoding,
clockRate);
/*
* The MediaFormat instances which do not have a static RTP
* payload type number association must be explicitly assigned
* a dynamic RTP payload type number.
*/
if (dynamicRTPPayloadType != -1) {
mediaStream.addDynamicRTPPayloadType(
dynamicRTPPayloadType,
format);
}
mediaStream.setFormat(format);
}
// connector
StreamConnector connector;
if (localPortBase == -1) {
connector = new DefaultStreamConnector();
} else {
int localRTPPort = localPort++;
int localRTCPPort = localPort++;
connector
= new DefaultStreamConnector(
new DatagramSocket(localRTPPort),
new DatagramSocket(localRTCPPort));
}
mediaStream.setConnector(connector);
// target
/*
* The AVTransmit2 and AVReceive2 examples follow the common
* practice that the RTCP port is right after the RTP port.
*/
int remoteRTPPort = remotePort++;
int remoteRTCPPort = remotePort++;
mediaStream.setTarget(
new MediaStreamTarget(
new InetSocketAddress(remoteAddr, remoteRTPPort),
new InetSocketAddress(remoteAddr, remoteRTCPPort)));
// name
/*
* The name is completely optional and it is not being used by the
* MediaStream implementation at this time, it is just remembered so
* that it can be retrieved via MediaStream#getName(). It may be
* integrated with the signaling functionality if necessary.
*/
mediaStream.setName(mediaType.toString());
mediaStreams[mediaType.ordinal()] = mediaStream;
}
/*
* Do start the transmission i.e. start the initialized MediaStream
* instances.
*/
for (MediaStream mediaStream : mediaStreams) {
if (mediaStream != null) {
mediaStream.start();
}
}
return null;
}
/**
* Stops the transmission if already started
*/
private void stop() {
if (mediaStreams != null) {
for (int i = 0; i < mediaStreams.length; i++) {
MediaStream mediaStream = mediaStreams[i];
if (mediaStream != null) {
try {
mediaStream.stop();
} finally {
mediaStream.close();
mediaStreams[i] = null;
}
}
}
mediaStreams = null;
}
}
/**
* The name of the command-line argument which specifies the port from which
* the media is to be transmitted. The command-line argument value will be
* used as the port to transmit the audio RTP from, the next port after it
* will be used to transmit the audio RTCP from. Respectively, the subsequent
* ports will be used to transmit the video RTP and RTCP from.
*/
private static final String LOCAL_PORT_BASE_ARG_NAME
= "--local-port-base=";
/**
* The name of the command-line argument which specifies the name of the
* host to which the media is to be transmitted.
*/
private static final String REMOTE_HOST_ARG_NAME = "--remote-host=";
/**
* The name of the command-line argument which specifies the port to which
* the media is to be transmitted. The command-line argument value will be
* used as the port to transmit the audio RTP to, the next port after it
* will be used to transmit the audio RTCP to. Respectively, the subsequent ports
* will be used to transmit the video RTP and RTCP to.
*/
private static final String REMOTE_PORT_BASE_ARG_NAME
= "--remote-port-base=";
/**
* The list of command-line arguments accepted as valid by the
* <tt>AVTransmit2</tt> application along with their human-readable usage
* descriptions.
*/
private static final String[][] ARGS
= {
{
LOCAL_PORT_BASE_ARG_NAME,
"The port which is the source of the transmission i.e. from"
+ " which the media is to be transmitted. The specified"
+ " value will be used as the port to transmit the audio"
+ " RTP from, the next port after it will be used to"
+ " transmit the audio RTCP from. Respectively, the"
+ " subsequent ports will be used to transmit the video RTP"
+ " and RTCP from."
},
{
REMOTE_HOST_ARG_NAME,
"The name of the host which is the target of the transmission"
+ " i.e. to which the media is to be transmitted"
},
{
REMOTE_PORT_BASE_ARG_NAME,
"The port which is the target of the transmission i.e. to which"
+ " the media is to be transmitted. The specified value"
+ " will be used as the port to transmit the audio RTP to"
+ " the next port after it will be used to transmit the"
+ " audio RTCP to. Respectively, the subsequent ports will"
+ " be used to transmit the video RTP and RTCP to."
}
};
public static void main(String[] args)
throws Exception {
// We need two parameters to do the transmission. For example,
// ant run-example -Drun.example.name=AVTransmit2 -Drun.example.arg.line="--remote-host=127.0.0.1 --remote-port-base=10000"
if (args.length < 2) {
prUsage();
} else {
Map<String, String> argMap = parseCommandLineArgs(args);
LibJitsi.start();
try {
// Create a video transmitter object with the specified parameters.
VideoTransmitter at
= new VideoTransmitter(
argMap.get(LOCAL_PORT_BASE_ARG_NAME),
argMap.get(REMOTE_HOST_ARG_NAME),
argMap.get(REMOTE_PORT_BASE_ARG_NAME));
// Start the transmission
String result = at.start();
// result will be non-null if there was an error. The return
// value is a String describing the possible error. Print it.
if (result == null) {
System.err.println("Start transmission for 600 seconds...");
// Transmit for 600 seconds and then close the processor
// This is a safeguard when using a capture data source
// so that the capture device will be properly released
// before quitting.
// The right thing to do would be to have a GUI with a
// "Stop" button that would call stop on AVTransmit2
try {
Thread.sleep(600_000);
} catch (InterruptedException ie) {
}
// Stop the transmission
at.stop();
System.err.println("...transmission ended.");
} else {
System.err.println("Error : " + result);
}
} finally {
LibJitsi.stop();
}
}
}
/**
* Parses the arguments specified to the <tt>AVTransmit2</tt> application on
* the command line.
*
* @param args the arguments specified to the <tt>AVTransmit2</tt>
* application on the command line
* @return a <tt>Map</tt> containing the arguments specified to the
* <tt>AVTransmit2</tt> application on the command line in the form of
* name-value associations
*/
static Map<String, String> parseCommandLineArgs(String[] args) {
Map<String, String> argMap = new HashMap<String, String>();
for (String arg : args) {
int keyEndIndex = arg.indexOf('=');
String key;
String value;
if (keyEndIndex == -1) {
key = arg;
value = null;
} else {
key = arg.substring(0, keyEndIndex + 1);
value = arg.substring(keyEndIndex + 1);
}
argMap.put(key, value);
}
return argMap;
}
/**
* Outputs human-readable description about the usage of the
* <tt>AVTransmit2</tt> application and the command-line arguments it
* accepts as valid.
*/
private static void prUsage() {
PrintStream err = System.err;
err.println("Usage: " + VideoTransmitter.class.getName() + " <args>");
err.println("Valid args:");
for (String[] arg : ARGS)
err.println(" " + arg[0] + " " + arg[1]);
}
}
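A note on the question itself (this is an assumption about the setup, not part of the original post): ffplay and VLC cannot pick up a raw RTP stream on their own, because the transmitter above does no signaling, so the player has to be given an SDP description of the session. A minimal sketch that mirrors the code above (H264 with dynamic payload type 99) and the example arguments (--remote-host=localhost --remote-port-base=10000), saved as receive.sdp:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=libjitsi H264 stream
c=IN IP4 127.0.0.1
t=0 0
m=video 10000 RTP/AVP 99
a=rtpmap:99 H264/90000
Start the player before the transmitter, for example with ffplay receive.sdp (recent ffmpeg builds may also need -protocol_whitelist file,udp,rtp). This only describes the video stream and is a sketch, not a guaranteed fix.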