Other articles (52)

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes made when upgrading MediaSPIP from version 0.1 to version 0.2. What's new?
    Software dependencies: the latest versions of FFMpeg are now used (>= v1.2.1); the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customising your site by adding a logo, a banner, or a background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Use the news section to present changes to your MediaSPIP site or news about your projects.
    In spipeo, the default MediaSPIP theme, news items are displayed at the bottom of the main page, below the editorial content.
    You can customise the form used to create a news item.
    News item creation form: for a document of type news item, the fields offered by default are: publication date (customise the publication date) (...)

On other sites (10428)

  • streaming H.264 over RTP with libavformat

    16 April 2012, by Jacob Peddicord

    I've been trying over the past week to implement H.264 streaming over RTP, using x264 as an encoder and libavformat to pack and send the stream. The problem is, as far as I can tell, it's not working correctly.

    Right now I'm just encoding random data (x264_picture_alloc) and extracting NAL frames from libx264. This is fairly simple:

    // encode one frame; x264 hands back the NAL units that make it up
    x264_picture_t pic_out;
    x264_nal_t* nals;
    int num_nals;
    int frame_size = x264_encoder_encode(this->encoder, &nals, &num_nals, this->pic_in, &pic_out);

    if (frame_size <= 0)
    {
       return frame_size;
    }

    // push NALs into the queue
    for (int i = 0; i < num_nals; i++)
    {
       // create a NAL storage unit
       NAL nal;
       nal.size = nals[i].i_payload;
       nal.payload = new uint8_t[nal.size];
       memcpy(nal.payload, nals[i].p_payload, nal.size);

       // push the storage into the NAL queue
       {
           // lock and push the NAL to the queue
           boost::mutex::scoped_lock lock(this->nal_lock);
           this->nal_queue.push(nal);
       }
    }

    nal_queue is used for safely passing frames over to a Streamer class which will then send the frames out. Right now it's not threaded, as I'm just testing to try to get this to work. Before encoding individual frames, I've made sure to initialize the encoder.
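
    For completeness, the matching nal_pop used by the Streamer below is roughly this sketch: take the same nal_lock, pop one NAL off nal_queue, and hand it to the caller:

    NAL Encoder::nal_pop()
    {
       // guard the queue with the same mutex the encoder-side push uses
       boost::mutex::scoped_lock lock(this->nal_lock);

       NAL nal = this->nal_queue.front();
       this->nal_queue.pop();
       return nal;
    }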

    But I don't believe x264 is the issue, as I can see frame data in the NALs it returns.
    Streaming the data is accomplished with libavformat, which is first initialized in a Streamer class:

    Streamer::Streamer(Encoder* encoder, string rtp_address, int rtp_port, int width, int height, int fps, int bitrate)
    {
       this->encoder = encoder;

       // initialize the AV context
       this->ctx = avformat_alloc_context();
       if (!this->ctx)
       {
           throw runtime_error("Couldn't initalize AVFormat output context");
       }

       // get the output format
       this->fmt = av_guess_format("rtp", NULL, NULL);
       if (!this->fmt)
       {
           throw runtime_error("Unsuitable output format");
       }
       this->ctx->oformat = this->fmt;

       // try to open the RTP stream
       snprintf(this->ctx->filename, sizeof(this->ctx->filename), "rtp://%s:%d", rtp_address.c_str(), rtp_port);
       if (url_fopen(&(this->ctx->pb), this->ctx->filename, URL_WRONLY) < 0)
       {
           throw runtime_error("Couldn't open RTP output stream");
       }

       // add an H.264 stream
       this->stream = av_new_stream(this->ctx, 1);
       if (!this->stream)
       {
           throw runtime_error("Couldn't allocate H.264 stream");
       }

       // initialize codec
       AVCodecContext* c = this->stream->codec;
       c->codec_id = CODEC_ID_H264;
       c->codec_type = AVMEDIA_TYPE_VIDEO;
       c->bit_rate = bitrate;
       c->width = width;
       c->height = height;
       c->time_base.den = fps;
       c->time_base.num = 1;

       // write the header
       av_write_header(this->ctx);
    }

    This is where things seem to go wrong. av_write_header above seems to do absolutely nothing; I've used Wireshark to verify this. For reference, I use Streamer streamer(&enc, "10.89.6.3", 49990, 800, 600, 30, 40000); to initialize the Streamer instance, with enc being a reference to an Encoder object used to handle x264 previously.
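
    One thing worth noting here: the RTP muxer has no in-band container header to write, so seeing nothing in Wireshark at av_write_header time is not by itself the failure. The stream description is meant to travel out-of-band as an SDP file, which is exactly what receivers such as VLC ask for. A minimal sketch of dumping one, assuming a libavformat new enough to provide av_sdp_create() (older releases call it avf_sdp_create()), with write_sdp being a hypothetical helper rather than part of the code above:

    #include <fstream>

    // Hypothetical helper: dump the out-of-band SDP that RTP receivers need.
    void Streamer::write_sdp(const string& path)
    {
       char sdp[2048];
       AVFormatContext* contexts[1] = { this->ctx };

       // av_sdp_create() fills 'sdp' with a session description covering the
       // given output contexts; it returns 0 on success.
       if (av_sdp_create(contexts, 1, sdp, sizeof(sdp)) == 0)
       {
           std::ofstream out(path.c_str());
           out << sdp;   // then, for example: vlc stream.sdp
       }
    }

    For H.264 the generated SDP only carries usable sprop-parameter-sets if the codec context's extradata holds the SPS/PPS, which x264_encoder_headers() can supply.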

    Now when I want to stream out a NAL, I use this:

    // grab a NAL
    NAL nal = this->encoder->nal_pop();
    cout << "NAL popped with size " << nal.size << endl;

    // initialize a packet
    AVPacket p;
    av_init_packet(&p);
    p.data = nal.payload;
    p.size = nal.size;
    p.stream_index = this->stream->index;

    // send it out
    av_write_frame(this->ctx, &p);

    At this point, I can see RTP data appearing over the network, and it looks like the frames I've been sending, even including a little copyright blob from x264. But no player I've used has been able to make any sense of the data. VLC quits, wanting an SDP description, which apparently isn't required.

    I then tried to play it through gst-launch:

    gst-launch udpsrc port=49990 ! rtph264depay ! decodebin ! xvimagesink

    This will sit waiting for UDP data, but when it is received, I get:

    ERROR: element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: No RTP
    format was negotiated. Additional debug info:
    gstbasertpdepayload.c(372): gst_base_rtp_depayload_chain ():
    /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: Input buffers
    need to have RTP caps set on them. This is usually achieved by setting
    the 'caps' property of the upstream source element (often udpsrc or
    appsrc), or by putting a capsfilter element before the depayloader and
    setting the 'caps' property on that. Also see
    http://cgit.freedesktop.org/gstreamer/gst-plugins-good/tree/gst/rtp/README

    As I'm not using GStreamer to do the streaming itself, I'm not quite sure what it means by RTP caps. But it makes me wonder if I'm not sending enough information over RTP to describe the stream. I'm pretty new to video and I feel like there's some key thing I'm missing here. Any hints?
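
    As an aside on that error: the "caps" GStreamer is asking for carry the same information an SDP would. For a quick test it is usually enough to tell udpsrc explicitly what is arriving; the clock rate and payload type below are assumed values, not something read from this stream:

    gst-launch udpsrc port=49990 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! xvimagesink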

  • FFMpeg copy stream SEI type 162 error

    13 July 2015, by Vishal Shah

    I am using FFMpeg to copy an RTSP stream to an RTMP stream, both of which are network streams. This is the command I am using:

    ffmpeg -re -i rtsp://url.to/rtsp/stream -c copy -f flv rtmp://target.rtmp/uri/stream

    I am receiving data in the YouTube Live control room, but after a point I get an error like this in my FFMpeg console, and then the data transfer on YouTube Live stops:

    SEI type 162 size 1496 truncated at 1280

    Can someone shed light on what this error means and how to resolve it?

    Note that the RTSP stream is delivered over the network.
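
    For context: SEI NAL units are supplemental metadata carried inside the H.264 stream, and type 162 is not one of the standard payload types, so it is presumably vendor-private data inserted by the camera's encoder. The message means one such SEI declares 1496 bytes of payload but its NAL unit ends at 1280, i.e. the source emits truncated SEI that the copy path passes along untouched. One workaround, assuming an FFmpeg recent enough to ship the filter_units bitstream filter, is to strip SEI units (NAL unit type 6) while copying:

    ffmpeg -re -i rtsp://url.to/rtsp/stream -c copy -bsf:v 'filter_units=remove_types=6' -f flv rtmp://target.rtmp/uri/stream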

  • MediaCodec - save timing info for ffmpeg?

    18 November 2014, by Mark

    I have a requirement to encrypt video before it hits the disk. It seems on Android the only way to do this is to use MediaCodec, and encrypt and save the raw h264 elementary streams. (The MediaRecorder and Muxer classes operate on FileDescriptors, not an OutputStream, so I can’t wrap it with a CipherOutputStream).

    Using the grafika code as a base, I'm able to save a raw h264 elementary stream by replacing the Muxer in the VideoEncoderCore class with a WritableByteChannel, backed by a CipherOutputStream (code below, minus the CipherOutputStream).

    If I take the resulting output file over to the desktop, I'm able to use ffmpeg to mux the h264 stream into a playable mp4 file. What's missing, however, is timing information: ffmpeg always assumes 25fps. What I'm looking for is a way to save the timing info, perhaps to a separate file, that I can use to give ffmpeg the right information on the desktop (a sketch of one approach follows the code below).

    I’m not doing audio yet, but I can imagine I’ll need to do the same thing there, if I’m to have any hope of remotely accurate syncing.

    FWIW, I’m a total newbie here, and I really don’t know much of anything about SPS, NAL, Atoms, etc.

    /*
    * Copyright 2014 Google Inc. All rights reserved.
    *
    * Licensed under the Apache License, Version 2.0 (the "License");
    * you may not use this file except in compliance with the License.
    * You may obtain a copy of the License at
    *
    *      http://www.apache.org/licenses/LICENSE-2.0
    *
    * Unless required by applicable law or agreed to in writing, software
    * distributed under the License is distributed on an "AS IS" BASIS,
    * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    * See the License for the specific language governing permissions and
    * limitations under the License.
    */


    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.util.Log;
    import android.view.Surface;

    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.channels.Channels;
    import java.nio.channels.WritableByteChannel;

    /**
    * This class wraps up the core components used for surface-input video encoding.
    * <p>
    * Once created, frames are fed to the input surface.  Remember to provide the presentation
    * time stamp, and always call drainEncoder() before swapBuffers() to ensure that the
    * producer side doesn't get backed up.
    * <p>
    * This class is not thread-safe, with one exception: it is valid to use the input surface
    * on one thread, and drain the output on a different thread.
    */
    public class VideoEncoderCore {
       private static final String TAG = MainActivity.TAG;
       private static final boolean VERBOSE = false;

       // TODO: these ought to be configurable as well
       private static final String MIME_TYPE = "video/avc";    // H.264 Advanced Video Coding
       private static final int FRAME_RATE = 30;               // 30fps
       private static final int IFRAME_INTERVAL = 5;           // 5 seconds between I-frames

       private Surface mInputSurface;
       private MediaCodec mEncoder;
       private MediaCodec.BufferInfo mBufferInfo;
       private int mTrackIndex;
       //private MediaMuxer mMuxer;
       //private boolean mMuxerStarted;
       private WritableByteChannel outChannel;

       /**
        * Configures encoder and muxer state, and prepares the input Surface.
        */
       public VideoEncoderCore(int width, int height, int bitRate, File outputFile)
               throws IOException {
           mBufferInfo = new MediaCodec.BufferInfo();

           MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);

           // Set some properties.  Failing to specify some of these can cause the MediaCodec
           // configure() call to throw an unhelpful exception.
           format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                   MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
           format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
           format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
           format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
           if (VERBOSE) Log.d(TAG, "format: " + format);

           // Create a MediaCodec encoder, and configure it with our format.  Get a Surface
           // we can use for input and wrap it with a class that handles the EGL work.
           mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
           mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           mInputSurface = mEncoder.createInputSurface();
           mEncoder.start();

           // Create a MediaMuxer.  We can't add the video track and start() the muxer here,
           // because our MediaFormat doesn't have the Magic Goodies.  These can only be
           // obtained from the encoder after it has started processing data.
           //
           // We're not actually interested in multiplexing audio.  We just want to convert
           // the raw H.264 elementary stream we get from MediaCodec into a .mp4 file.
           //mMuxer = new MediaMuxer(outputFile.toString(),
           //        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

           mTrackIndex = -1;
           //mMuxerStarted = false;
           outChannel = Channels.newChannel(new BufferedOutputStream(new FileOutputStream(outputFile)));
       }

       /**
        * Returns the encoder's input surface.
        */
       public Surface getInputSurface() {
           return mInputSurface;
       }

       /**
        * Releases encoder resources.
        */
       public void release() {
           if (VERBOSE) Log.d(TAG, "releasing encoder objects");
           if (mEncoder != null) {
               mEncoder.stop();
               mEncoder.release();
               mEncoder = null;
           }
           try {
               outChannel.close();
           }
           catch (Exception e) {
               Log.e(TAG,"Couldn't close output stream.");
           }
       }

       /**
        * Extracts all pending data from the encoder and forwards it to the muxer.
        * <p>
        * If endOfStream is not set, this returns when there is no more data to drain.  If it
        * is set, we send EOS to the encoder, and then iterate until we see EOS on the output.
        * Calling this with endOfStream set should be done once, right before stopping the muxer.
        * <p>
        * We're just using the muxer to get a .mp4 file (instead of a raw H.264 stream).  We're
        * not recording audio.
        */
       public void drainEncoder(boolean endOfStream) {
           final int TIMEOUT_USEC = 10000;
           if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

           if (endOfStream) {
               if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
               mEncoder.signalEndOfInputStream();
           }

           ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
           while (true) {
               int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
               if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                   // no output available yet
                   if (!endOfStream) {
                       break;      // out of while
                   } else {
                       if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                   }
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                   // not expected for an encoder
                   encoderOutputBuffers = mEncoder.getOutputBuffers();
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                   // should happen before receiving buffers, and should only happen once
                   //if (mMuxerStarted) {
                   //    throw new RuntimeException("format changed twice");
                   //}
                   MediaFormat newFormat = mEncoder.getOutputFormat();
                   Log.d(TAG, "encoder output format changed: " + newFormat);

                   // now that we have the Magic Goodies, start the muxer
                   //mTrackIndex = mMuxer.addTrack(newFormat);
                   //mMuxer.start();
                   //mMuxerStarted = true;
               } else if (encoderStatus &lt; 0) {
                   Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                           encoderStatus);
                   // let's ignore it
               } else {
                   ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                   if (encodedData == null) {
                       throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                               " was null");
                   }

                   /*
                      FFMPEG needs this info.
                   if ((mBufferInfo.flags &amp; MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                       // The codec config data was pulled out and fed to the muxer when we got
                       // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                       if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                       mBufferInfo.size = 0;
                   }
                   */

                   if (mBufferInfo.size != 0) {
                       /*
                       if (!mMuxerStarted) {
                           throw new RuntimeException("muxer hasn't started");
                       }
                       */

                       // adjust the ByteBuffer values to match BufferInfo (not needed?)
                       encodedData.position(mBufferInfo.offset);
                       encodedData.limit(mBufferInfo.offset + mBufferInfo.size);

                       try {
                           outChannel.write(encodedData);
                       }
                       catch (Exception e) {
                           Log.e(TAG,"Error writing output.",e);
                       }
                       if (VERBOSE) {
                           Log.d(TAG, "sent " + mBufferInfo.size + " bytes to muxer, ts=" +
                                   mBufferInfo.presentationTimeUs);
                       }
                   }

                   mEncoder.releaseOutputBuffer(encoderStatus, false);

                   if ((mBufferInfo.flags &amp; MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                       if (!endOfStream) {
                           Log.w(TAG, "reached end of stream unexpectedly");
                       } else {
                           if (VERBOSE) Log.d(TAG, "end of stream reached");
                       }
                       break;      // out of while
                   }
               }
           }
       }
    }
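
    A sketch of the sidecar idea, under stated assumptions: TimestampSidecar is a made-up class, the encoder is assumed to emit no B-frames (so decode order equals presentation order), and the "timecode format v2" layout (one millisecond timestamp per line) is the one mkvmerge understands. record() would be called right next to outChannel.write(encodedData), skipping codec-config buffers:

    import java.io.BufferedWriter;
    import java.io.File;
    import java.io.FileWriter;
    import java.io.IOException;

    /**
     * Hypothetical companion to VideoEncoderCore: logs one presentationTimeUs
     * per encoded buffer so the desktop side can reconstruct per-frame timing.
     */
    public class TimestampSidecar {
       private final BufferedWriter writer;

       public TimestampSidecar(File file) throws IOException {
           writer = new BufferedWriter(new FileWriter(file));
           writer.write("# timecode format v2");   // header mkvmerge expects
           writer.newLine();
       }

       /** Call once per drained media buffer (not for BUFFER_FLAG_CODEC_CONFIG). */
       public void record(long presentationTimeUs) throws IOException {
           writer.write(Long.toString(presentationTimeUs / 1000));   // v2 wants milliseconds
           writer.newLine();
       }

       public void close() throws IOException {
           writer.close();
       }
    }

    On the desktop, mkvmerge can then apply the file to the raw stream (mkvmerge --timecodes 0:times.txt video.h264 -o video.mkv; newer releases spell the option --timestamps), or, if the rate is nearly constant, you can average the deltas and pass the result to ffmpeg as -r placed before -i.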