Advanced search

Media (1)

Keyword: - Tags -/intégration

Other articles (78)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • (De)activating features (plugins)

    18 February 2011

    To manage the addition and removal of extra features (plugins), MediaSPIP has used SVP since version 0.2.
    SVP makes it easy to enable plugins from the MediaSPIP configuration area.
    To access it, go to the configuration area and open the "Gestion des plugins" (plugin management) page.
    By default MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work seamlessly with each (...)

On other sites (8888)

  • Unable to stream file onto localhost - ffmpeg

    18 October 2013, by trueblue

    I am new to ffmpeg/ffserver. I am trying to stream a local file named Trial to localhost using ffserver, so that I can play it in a browser as http://localhost:8090/feed1.ffm.
    I am executing the command below on Ubuntu (Trial is an MPEG-TS file):

     ffmpeg -i Trial http://localhost:8090/feed1.ffm

    Running the above command produces the following error:

    FFmpeg version SVN-r0.5.9-4:0.5.9-0ubuntu0.10.04.3, Copyright (c) 2000-2009 Fabrice Bellard, et al.
     configuration: --extra-version=4:0.5.9-0ubuntu0.10.04.3 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static
     libavutil     49.15. 0 / 49.15. 0
     libavcodec    52.20. 1 / 52.20. 1
     libavformat   52.31. 0 / 52.31. 0
     libavdevice   52. 1. 0 / 52. 1. 0
     libavfilter    0. 4. 0 /  0. 4. 0
     libswscale     0. 7. 1 /  0. 7. 1
     libpostproc   51. 2. 0 / 51. 2. 0
     built on Jan 24 2013 19:42:59, gcc: 4.4.3

    Seems stream 0 codec frame rate differs from container frame rate: 119.88 (120000/1001) -> 59.94 (60000/1001)
    Input #0, mpegts, from 'Trial':
     Duration: 00:00:04.22, start: 0.177633, bitrate: 40368 kb/s
     Program 2
       Stream #0.0[0x21]: Video: mpeg2video, yuv420p, 1280x720 [PAR 1:1 DAR 16:9], 45000 kb/s, 59.94 tbr, 90k tbn, 119.88 tbc
    Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
       Stream #0.0: Video: flv, yuv420p, 352x288, q=1-5, 100 kb/s, 1000k tbn, 15 tbc
       Stream #0.1: Audio: mp2, 44100 Hz, mono, s16, 32 kb/s
       Stream #0.2: Video: mpeg1video, yuv420p, 160x128, q=3-31, 64 kb/s, 1000k tbn, 3 tbc
       Stream #0.3: Audio: mp2, 22050 Hz, mono, s16, 64 kb/s
       Stream #0.4: Video: msmpeg4, yuv420p, 352x240, q=3-31, 256 kb/s, 1000k tbn, 15 tbc
    Could not find input stream matching output stream #0.1
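    For context: the probe output above shows that Trial contains a single video stream and no audio, while the feed expects the audio streams declared in ffserver.conf; that is what "Could not find input stream matching output stream #0.1" refers to, #0.1 being an audio output. A minimal sketch of a commonly suggested workaround, reusing the names from the question, is to supply silence as a dummy second input (the alternative is to add NoAudio to every <Stream> section):

     # ffserver must already be running with this configuration:
     #   ffserver -f ffserver.conf
     # Then feed it the video plus silence read from /dev/zero as raw PCM,
     # capped at roughly the clip duration:
     ffmpeg -i Trial -f s16le -ar 44100 -ac 1 -i /dev/zero -t 4 \
            http://localhost:8090/feed1.ffm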

    My ffserver.conf file looks like this:

    # Port on which the server is listening. You must select a different
    # port from your standard HTTP web server if it is running on the same
    # computer.
    Port 8090

    # Address on which the server is bound. Only useful if you have
    # several network interfaces.
    BindAddress 0.0.0.0

    # Number of simultaneous HTTP connections that can be handled. It has
    # to be defined *before* the MaxClients parameter, since it defines the
    # MaxClients maximum limit.
    MaxHTTPConnections 2000

    # Number of simultaneous requests that can be handled. Since FFServer
    # is very fast, it is more likely that you will want to leave this high
    # and use MaxBandwidth, below.
    MaxClients 1000

    # This is the maximum amount of kbit/sec that you are prepared to
    # consume when streaming to clients.
    MaxBandwidth 1000

    # Access log file (uses standard Apache log file format)
    # '-' is the standard output.
    CustomLog -

    # Suppress that if you want to launch ffserver as a daemon.
    NoDaemon


    ##################################################################
    # Definition of the live feeds. Each live feed contains one video
    # and/or audio sequence coming from an ffmpeg encoder or another
    # ffserver. This sequence may be encoded simultaneously with several
    # codecs at several resolutions.

    <Feed feed1.ffm>

    # You must use 'ffmpeg' to send a live feed to ffserver. In this
    # example, you can type:
    #
    # ffmpeg http://localhost:8090/feed1.ffm

    # ffserver can also do time shifting. It means that it can stream any
    # previously recorded live stream. The request should contain:
    # "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
    # a path where the feed is stored on disk. You also specify the
    # maximum size of the feed, where zero means unlimited. Default:
    # File=/tmp/feed_name.ffm FileMaxSize=5M
    File /tmp/feed1.ffm
    FileMaxSize 5M

    # You could specify
    # ReadOnlyFile /saved/specialvideo.ffm
    # This marks the file as readonly and it will not be deleted or updated.

    # Specify launch in order to start ffmpeg automatically.
    # First ffmpeg must be defined with an appropriate path if needed,
    # after that options can follow, but avoid adding the http:// field
    #Launch ffmpeg

    # Only allow connections from localhost to the feed.
    ACL allow 127.0.0.1

    </Feed>



    <Stream test.swf>
    Feed feed1.ffm
    Format swf
    VideoCodec flv
    VideoFrameRate 15
    VideoBufferSize 80000
    VideoBitRate 100
    VideoQMin 1
    VideoQMax 5
    VideoSize 352x288
    PreRoll 0
    Noaudio
    </Stream>

    ##################################################################
    # Now you can define each stream which will be generated from the
    # original audio and video stream. Each format has a filename (here
    # 'test1.mpg'). FFServer will send this stream when answering a
    # request containing this filename.

    <Stream test1.mpg>

    # coming from live feed 'feed1'
    Feed feed1.ffm

    # Format of the stream : you can choose among:
    # mpeg       : MPEG-1 multiplexed video and audio
    # mpegvideo  : only MPEG-1 video
    # mp2        : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
    # ogg        : Ogg format (Vorbis audio codec)
    # rm         : RealNetworks-compatible stream. Multiplexed audio and video.
    # ra         : RealNetworks-compatible stream. Audio only.
    # mpjpeg     : Multipart JPEG (works with Netscape without any plugin)
    # jpeg       : Generate a single JPEG image.
    # asf        : ASF compatible streaming (Windows Media Player format).
    # swf        : Macromedia Flash compatible stream
    # avi        : AVI format (MPEG-4 video, MPEG audio sound)
    Format mpeg

    # Bitrate for the audio stream. Codecs usually support only a few
    # different bitrates.
    AudioBitRate 32

    # Number of audio channels: 1 = mono, 2 = stereo
    AudioChannels 1

    # Sampling frequency for audio. When using low bitrates, you should
    # lower this frequency to 22050 or 11025. The supported frequencies
    # depend on the selected audio codec.
    AudioSampleRate 44100

    # Bitrate for the video stream
    VideoBitRate 64


    # Ratecontrol buffer size
    VideoBufferSize 40

    # Number of frames per second
    VideoFrameRate 3

    # Size of the video frame: WxH (default: 160x128)
    # The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
    # qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
    # wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
    # hd1080
    VideoSize 160x128

    # Transmit only intra frames (useful for low bitrates, but kills frame rate).
    #VideoIntraOnly

    # If non-intra only, an intra frame is transmitted every VideoGopSize
    # frames. Video synchronization can only begin at an intra frame.
    VideoGopSize 12

    # More MPEG-4 parameters
    # VideoHighQuality
    # Video4MotionVector

    # Choose your codecs:
    #AudioCodec mp2
    #VideoCodec mpeg1video

    # Suppress audio
    #NoAudio

    # Suppress video
    #NoVideo

    #VideoQMin 3
    #VideoQMax 31

    # Set this to the number of seconds backwards in time to start. Note that
    # most players will buffer 5-10 seconds of video, and also you need to allow
    # for a keyframe to appear in the data stream.
    #Preroll 15

    # ACL:

    # You can allow ranges of addresses (or single addresses)
    #ACL ALLOW <first address> <last address>

    # You can deny ranges of addresses (or single addresses)
    #ACL DENY <first address> <last address>

    # You can repeat the ACL allow/deny as often as you like. It is on a per
    # stream basis. The first match defines the action. If there are no matches,
    # then the default is the inverse of the last ACL statement.
    #
    # Thus 'ACL allow localhost' only allows access from localhost.
    # 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
    # allow everybody else.

    </Stream>


    ##################################################################
    # Example streams


    # Multipart JPEG

    #<stream>
    #Feed feed1.ffm
    #Format mpjpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #Strict -1
    #</stream>


    # Single JPEG

    #<stream>
    #Feed feed1.ffm
    #Format jpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    ##VideoSize 352x240
    #NoAudio
    #Strict -1
    #</stream>



    # Flash

    #<stream>
    #Feed feed1.ffm
    #Format swf
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #</stream>


    # ASF compatible

    <Stream test.asf>
    Feed feed1.ffm
    Format asf
    VideoFrameRate 15
    VideoSize 352x240
    VideoBitRate 256
    VideoBufferSize 40
    VideoGopSize 30
    AudioBitRate 64
    StartSendOnKey
    </Stream>


    # MP3 audio

    #<stream>
    #Feed feed1.ffm
    #Format mp2
    #AudioCodec mp3
    #AudioBitRate 64
    #AudioChannels 1
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Ogg Vorbis audio

    #<stream>
    #Feed feed1.ffm
    #Title "Stream title"
    #AudioBitRate 64
    #AudioChannels 2
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Real with audio only at 32 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #NoVideo
    #NoAudio
    #</stream>


    # Real with audio and video at 64 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #VideoBitRate 128
    #VideoFrameRate 25
    #VideoGopSize 25
    #NoAudio
    #</stream>


    ##################################################################
    # A stream coming from a file: you only need to set the input
    # filename and optionally a new format. Supported conversions:
    #    AVI -> ASF

    #<stream>
    #File "/usr/local/httpd/htdocs/tlive.rm"
    #NoAudio
    #</stream>

    #<stream>
    #File "/usr/local/httpd/htdocs/test.asf"
    #NoAudio
    #Author "Me"
    #Copyright "Super MegaCorp"
    #Title "Test stream from disk"
    #Comment "Test comment"
    #</stream>


    ##################################################################
    # RTSP examples
    #
    # You can access this stream with the RTSP URL:
    #   rtsp://localhost:5454/test1-rtsp.mpg
    #
    # A non-standard RTSP redirector is also created. Its URL is:
    #   http://localhost:8090/test1-rtsp.rtsp

    #<Stream test1-rtsp.mpg>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #</Stream>


    ##################################################################
    # SDP/multicast examples
    #
    # If you want to send your stream in multicast, you must set the
    # multicast address with MulticastAddress. The port and the TTL can
    # also be set.
    #
    # An SDP file is automatically generated by ffserver by adding the
    # 'sdp' extension to the stream name (here
    # http://localhost:8090/test1-sdp.sdp). You should usually give this
    # file to your player to play the stream.
    #
    # The 'NoLoop' option can be used to avoid looping when the stream is
    # terminated.

    #<Stream test1-sdp.sdp>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #MulticastAddress 224.124.0.1
    #MulticastPort 5000
    #MulticastTTL 16
    #NoLoop
    #</Stream>


    ##################################################################
    # Special streams

    # Server status

    <Stream stat.html>
    Format status

    # Only allow local people to get the status
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255

    #FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
    </Stream>


    # Redirect index.html to the appropriate site

    <Redirect index.html>
    URL http://www.ffmpeg.org/
    </Redirect>

    Could anyone please tell me whether I am missing something, or whether I need to change my ffserver.conf file? I have consulted many websites but am still unable to fix it. Thanks in advance.
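    For reference: ffserver serves each <Stream> section under its stream name, so once the feed accepts input the streams should be reachable directly. A quick sanity check, assuming the stock stream names used in the configuration above:

     ffplay http://localhost:8090/test.swf     # the Flash stream
     curl http://localhost:8090/stat.html      # the server status page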

  • Jitsi and ffplay

    15 June 2014, by Kotkot

    I’m playing with Jitsi. I took the examples from the source code and modified them a bit.
    Here is what I’ve got.
    I am trying to play the transmitted stream in VLC, ffplay or any other player,
    but I cannot.

    I use these application parameters to run the code:

    --local-port-base=5000 --remote-host=localhost --remote-port-base=10000

    What am I doing wrong?

    package com.company;


       /*
        * Jitsi, the OpenSource Java VoIP and Instant Messaging client.
        *
        * Distributable under LGPL license.
        * See terms of license at gnu.org.
        */

    import org.jitsi.service.libjitsi.LibJitsi;
    import org.jitsi.service.neomedia.*;
    import org.jitsi.service.neomedia.device.MediaDevice;
    import org.jitsi.service.neomedia.format.MediaFormat;
    import org.jitsi.service.neomedia.format.MediaFormatFactory;

    import java.io.PrintStream;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.net.InetSocketAddress;
    import java.util.HashMap;
    import java.util.Map;

    /**
    * Implements an example application in the fashion of JMF's AVTransmit2 example
    * which demonstrates the use of the <tt>libjitsi</tt> library for the purposes
    * of transmitting audio and video via RTP means.
    *
    * @author Lyubomir Marinov
    */
    public class VideoTransmitter {
       /**
        * The port which is the source of the transmission i.e. from which the
        * media is to be transmitted.
        *
        * @see #LOCAL_PORT_BASE_ARG_NAME
        */
       private int localPortBase;

       /**
        * The <tt>MediaStream</tt> instances initialized by this instance indexed
        * by their respective <tt>MediaType</tt> ordinal.
        */
       private MediaStream[] mediaStreams;

       /**
        * The <tt>InetAddress</tt> of the host which is the target of the
        * transmission i.e. to which the media is to be transmitted.
        *
        * @see #REMOTE_HOST_ARG_NAME
        */
       private InetAddress remoteAddr;

       /**
        * The port which is the target of the transmission i.e. to which the media
        * is to be transmitted.
        *
        * @see #REMOTE_PORT_BASE_ARG_NAME
        */
       private int remotePortBase;

       /**
        * Initializes a new <tt>AVTransmit2</tt> instance which is to transmit
        * audio and video to a specific host and a specific port.
        *
        * @param localPortBase  the port which is the source of the transmission
        *                       i.e. from which the media is to be transmitted
        * @param remoteHost     the name of the host which is the target of the
        *                       transmission i.e. to which the media is to be transmitted
        * @param remotePortBase the port which is the target of the transmission
        *                       i.e. to which the media is to be transmitted
        * @throws Exception if any error arises during the parsing of the specified
        *                   <tt>localPortBase</tt>, <tt>remoteHost</tt> and <tt>remotePortBase</tt>
        */
       private VideoTransmitter(
               String localPortBase,
               String remoteHost, String remotePortBase)
               throws Exception {
           this.localPortBase
                   = (localPortBase == null)
                   ? -1
                   : Integer.valueOf(localPortBase).intValue();
           this.remoteAddr = InetAddress.getByName(remoteHost);
           this.remotePortBase = Integer.valueOf(remotePortBase).intValue();
       }

       /**
        * Starts the transmission. Returns null if transmission started ok.
        * Otherwise it returns a string with the reason why the setup failed.
        */
       private String start()
               throws Exception {
           /*
            * Prepare for the start of the transmission i.e. initialize the
            * MediaStream instances.
            */
           MediaType[] mediaTypes = MediaType.values();
           MediaService mediaService = LibJitsi.getMediaService();
           int localPort = localPortBase;
           int remotePort = remotePortBase;

           mediaStreams = new MediaStream[mediaTypes.length];
           for (MediaType mediaType : mediaTypes) {
               // This modified example transmits video only; skip all other media types.
               if (mediaType != MediaType.VIDEO) continue;
               /*
                * The default MediaDevice (for a specific MediaType) is configured
                * (by the user of the application via some sort of UI) into the
                * ConfigurationService. If there is no ConfigurationService
                * instance known to LibJitsi, the first available MediaDevice of
                * the specified MediaType will be chosen by MediaService.
                */
               MediaDevice device
                       = mediaService.getMediaDeviceForPartialDesktopStreaming(100,100,100,100);
               if (device == null) {
                   continue;
               }
               MediaStream mediaStream = mediaService.createMediaStream(device);

               // direction
               /*
                * The AVTransmit2 example sends only and the AVReceive2 receives
                * only. In a call, the MediaStream's direction will most commonly
                * be set to SENDRECV.
                */
               mediaStream.setDirection(MediaDirection.SENDONLY);

               // format
               String encoding;
               double clockRate;
               /*
                * The AVTransmit2 and AVReceive2 examples use the H.264 video
                * codec. Its RTP transmission has no static RTP payload type number
                * assigned.
                */
               byte dynamicRTPPayloadType;

               switch (device.getMediaType()) {
                   case AUDIO:
                       encoding = "PCMU";
                       clockRate = 8000;
                   /* PCMU has a static RTP payload type number assigned. */
                       dynamicRTPPayloadType = -1;
                       break;
                   case VIDEO:
                       encoding = "H264";
                       clockRate = MediaFormatFactory.CLOCK_RATE_NOT_SPECIFIED;
                   /*
                 * The dynamic RTP payload type numbers are usually negotiated
                    * in the signaling functionality.
                    */
                       dynamicRTPPayloadType = 99;
                       break;
                   default:
                       encoding = null;
                       clockRate = MediaFormatFactory.CLOCK_RATE_NOT_SPECIFIED;
                       dynamicRTPPayloadType = -1;
               }

               if (encoding != null) {
                   MediaFormat format
                           = mediaService.getFormatFactory().createMediaFormat(
                           encoding,
                           clockRate);

                   /*
                    * The MediaFormat instances which do not have a static RTP
                    * payload type number association must be explicitly assigned
                    * a dynamic RTP payload type number.
                    */
                   if (dynamicRTPPayloadType != -1) {
                       mediaStream.addDynamicRTPPayloadType(
                               dynamicRTPPayloadType,
                               format);
                   }

                   mediaStream.setFormat(format);
               }

               // connector
               StreamConnector connector;

               if (localPortBase == -1) {
                   connector = new DefaultStreamConnector();
               } else {
                   int localRTPPort = localPort++;
                   int localRTCPPort = localPort++;

                   connector
                           = new DefaultStreamConnector(
                           new DatagramSocket(localRTPPort),
                           new DatagramSocket(localRTCPPort));
               }
               mediaStream.setConnector(connector);

               // target
               /*
                * The AVTransmit2 and AVReceive2 examples follow the common
                * practice that the RTCP port is right after the RTP port.
                */
               int remoteRTPPort = remotePort++;
               int remoteRTCPPort = remotePort++;

               mediaStream.setTarget(
                       new MediaStreamTarget(
                               new InetSocketAddress(remoteAddr, remoteRTPPort),
                               new InetSocketAddress(remoteAddr, remoteRTCPPort)));

               // name
               /*
                * The name is completely optional and it is not being used by the
                * MediaStream implementation at this time, it is just remembered so
                * that it can be retrieved via MediaStream#getName(). It may be
                * integrated with the signaling functionality if necessary.
                */
               mediaStream.setName(mediaType.toString());

               mediaStreams[mediaType.ordinal()] = mediaStream;
           }

           /*
            * Do start the transmission i.e. start the initialized MediaStream
            * instances.
            */
           for (MediaStream mediaStream : mediaStreams) {
               if (mediaStream != null) {

                   mediaStream.start();
               }
           }



           return null;
       }

       /**
        * Stops the transmission if already started
        */
       private void stop() {
           if (mediaStreams != null) {
               for (int i = 0; i < mediaStreams.length; i++) {
                   MediaStream mediaStream = mediaStreams[i];

                   if (mediaStream != null) {
                       try {
                           mediaStream.stop();
                       } finally {
                           mediaStream.close();
                           mediaStreams[i] = null;
                       }
                   }
               }

               mediaStreams = null;
           }
       }

       /**
        * The name of the command-line argument which specifies the port from which
        * the media is to be transmitted. The command-line argument value will be
        * used as the port to transmit the audio RTP from, the next port after it
        * will be to transmit the audio RTCP from. Respectively, the subsequent
        * ports will be used to transmit the video RTP and RTCP from.
        */
       private static final String LOCAL_PORT_BASE_ARG_NAME
               = "--local-port-base=";

       /**
        * The name of the command-line argument which specifies the name of the
        * host to which the media is to be transmitted.
        */
       private static final String REMOTE_HOST_ARG_NAME = "--remote-host=";

       /**
        * The name of the command-line argument which specifies the port to which
        * the media is to be transmitted. The command-line argument value will be
        * used as the port to transmit the audio RTP to, the next port after it
        * will be to transmit the audio RTCP to. Respectively, the subsequent ports
        * will be used to transmit the video RTP and RTCP to.
        */
       private static final String REMOTE_PORT_BASE_ARG_NAME
               = "--remote-port-base=";

       /**
        * The list of command-line arguments accepted as valid by the
        * <tt>AVTransmit2</tt> application along with their human-readable usage
        * descriptions.
        */
       private static final String[][] ARGS
               = {
               {
                       LOCAL_PORT_BASE_ARG_NAME,
                       "The port which is the source of the transmission i.e. from"
                               + " which the media is to be transmitted. The specified"
                               + " value will be used as the port to transmit the audio"
                               + " RTP from, the next port after it will be used to"
                               + " transmit the audio RTCP from. Respectively, the"
                               + " subsequent ports will be used to transmit the video RTP"
                               + " and RTCP from."
               },
               {
                       REMOTE_HOST_ARG_NAME,
                       "The name of the host which is the target of the transmission"
                               + " i.e. to which the media is to be transmitted"
               },
               {
                       REMOTE_PORT_BASE_ARG_NAME,
                       "The port which is the target of the transmission i.e. to which"
                               + " the media is to be transmitted. The specified value"
                               + " will be used as the port to transmit the audio RTP to"
                               + " the next port after it will be used to transmit the"
                               + " audio RTCP to. Respectively, the subsequent ports will"
                               + " be used to transmit the video RTP and RTCP to."
               }
       };

       public static void main(String[] args)
               throws Exception {
           // We need two parameters to do the transmission. For example,
           // ant run-example -Drun.example.name=AVTransmit2 -Drun.example.arg.line="--remote-host=127.0.0.1 --remote-port-base=10000"
           if (args.length < 2) {
               prUsage();
           } else {
               Map<String, String> argMap = parseCommandLineArgs(args);

               LibJitsi.start();
               try {
                   // Create a video transmitter object with the specified params.
                   VideoTransmitter at
                           = new VideoTransmitter(
                           argMap.get(LOCAL_PORT_BASE_ARG_NAME),
                           argMap.get(REMOTE_HOST_ARG_NAME),
                           argMap.get(REMOTE_PORT_BASE_ARG_NAME));
                   // Start the transmission
                   String result = at.start();

                   // result will be non-null if there was an error. The return
                   // value is a String describing the possible error. Print it.
                   if (result == null) {
                       System.err.println("Start transmission for 600 seconds...");

                       // Transmit for 600 seconds and then close the processor.
                       // This is a safeguard when using a capture data source
                       // so that the capture device will be properly released
                       // before quitting.
                       // The right thing to do would be to have a GUI with a
                       // "Stop" button that would call stop on AVTransmit2
                       try {
                           Thread.sleep(600_000);
                       } catch (InterruptedException ie) {
                       }

                       // Stop the transmission
                       at.stop();

                       System.err.println("...transmission ended.");
                   } else {
                       System.err.println("Error : " + result);
                   }
               } finally {
                   LibJitsi.stop();
               }
           }
       }

       /**
        * Parses the arguments specified to the <tt>AVTransmit2</tt> application on
        * the command line.
        *
        * @param args the arguments specified to the <tt>AVTransmit2</tt>
        *             application on the command line
        * @return a <tt>Map</tt> containing the arguments specified to the
        * <tt>AVTransmit2</tt> application on the command line in the form of
        * name-value associations
        */
       static Map<String, String> parseCommandLineArgs(String[] args) {
           Map<String, String> argMap = new HashMap<>();

           for (String arg : args) {
               int keyEndIndex = arg.indexOf('=');
               String key;
               String value;

               if (keyEndIndex == -1) {
                   key = arg;
                   value = null;
               } else {
                   // Keep the trailing '=' in the key so that it matches the
                   // *_ARG_NAME constants (e.g. "--remote-host=").
                   key = arg.substring(0, keyEndIndex + 1);
                   value = arg.substring(keyEndIndex + 1);
               }
               argMap.put(key, value);
           }
           return argMap;
       }

       /**
        * Outputs human-readable description about the usage of the
        * <tt>AVTransmit2</tt> application and the command-line arguments it
        * accepts as valid.
        */
       private static void prUsage() {
           PrintStream err = System.err;

           err.println("Usage: " + VideoTransmitter.class.getName() + " <args>");
           err.println("Valid args:");
           for (String[] arg : ARGS)
               err.println("  " + arg[0] + " " + arg[1]);
       }
    }
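    For reference: a bare RTP stream carries no session description, so VLC or ffplay cannot guess the codec, clock rate or payload type by themselves. A minimal sketch, assuming the parameters above (H.264 with dynamic payload type 99, sent to --remote-port-base=10000 on localhost), is to save the following as a hypothetical stream.sdp:

     v=0
     o=- 0 0 IN IP4 127.0.0.1
     s=libjitsi H264 test
     c=IN IP4 127.0.0.1
     t=0 0
     m=video 10000 RTP/AVP 99
     a=rtpmap:99 H264/90000

    and then open it with the player (newer FFmpeg builds also require -protocol_whitelist file,udp,rtp):

     ffplay stream.sdp
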
  • FFMPEG: Offsetting & merging audios [migrated]

    5 November 2014, by user1064504

    I am trying to mix multiple audio files into one, each with a different offset.

     ffmpeg -i a.ogg -i 1.ogg -filter_complex "amix=inputs=2[op];[op]adelay=5000|15000" out.ogg
    

    Can someone help me understand how to use adelay with amix correctly for multiple files? I am trying to achieve something like this:

     <-1st audio->    <---2nd audio--->
     <---------------------------------------------->
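    A minimal sketch of the usual approach: apply adelay to each input separately (per-channel delays are separated by "|") and mix the delayed streams afterwards. Filenames are those from the question; offsets of 5 and 15 seconds are assumed:

     ffmpeg -i a.ogg -i 1.ogg -filter_complex \
       "[0:a]adelay=5000|5000[a0];[1:a]adelay=15000|15000[a1];[a0][a1]amix=inputs=2[out]" \
       -map "[out]" out.ogg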