Other articles (48)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

On other sites (7224)

  • Muxing Android MediaCodec encoded H264 packets into RTMP

    31 December 2015, by Vadym

    I am coming from the thread Encoding H.264 from camera with Android MediaCodec. My setup is very similar. However, I attempt to mux the encoded frames with javacv and broadcast them via RTMP.

    RtmpClient.java

    ...
    private volatile BlockingQueue<byte[]> mFrameQueue = new LinkedBlockingQueue<byte[]>(MAXIMUM_VIDEO_FRAME_BACKLOG);
    ...
    private void startStream() throws FrameRecorder.Exception, IOException {
       if (TextUtils.isEmpty(mDestination)) {
           throw new IllegalStateException("Cannot start RtmpClient without destination");
       }

       if (mCamera == null) {
           throw new IllegalStateException("Cannot start RtmpClient without camera.");
       }

       Camera.Parameters cameraParams = mCamera.getParameters();

       mRecorder = new FFmpegFrameRecorder(
               mDestination,
               mVideoQuality.resX,
               mVideoQuality.resY,
               (mAudioQuality.channelType.equals(AudioQuality.CHANNEL_TYPE_STEREO) ? 2 : 1));

       mRecorder.setFormat("flv");

       mRecorder.setFrameRate(mVideoQuality.frameRate);
       mRecorder.setVideoBitrate(mVideoQuality.bitRate);
       mRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);

       mRecorder.setSampleRate(mAudioQuality.samplingRate);
       mRecorder.setAudioBitrate(mAudioQuality.bitRate);
       mRecorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);

       mVideoStream = new VideoStream(mRecorder, mVideoQuality, mFrameQueue, mCamera);
       mAudioStream = new AudioStream(mRecorder, mAudioQuality);

       mRecorder.start();

       // Set up a buffered preview callback
       setupCameraCallback(mCamera, mRtmpClient, DEFAULT_PREVIEW_CALLBACK_BUFFERS,
               mVideoQuality.resX * mVideoQuality.resY * ImageFormat.getBitsPerPixel(
                       cameraParams.getPreviewFormat())/8);

       try {
           mVideoStream.start();
           mAudioStream.start();
       }
       catch(Exception e) {
           e.printStackTrace();
           stopStream();
       }
    }
    ...
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
       boolean frameQueued = false;

       if (mRecorder == null || data == null) {
           return;
       }

       frameQueued = mFrameQueue.offer(data);

       // return the buffer to be reused - done in videostream
       //camera.addCallbackBuffer(data);
    }
    ...

    VideoStream.java

    ...
    @Override
    public void run() {
       try {
           mMediaCodec = MediaCodec.createEncoderByType("video/avc");
           MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", mVideoQuality.resX, mVideoQuality.resY);
           mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mVideoQuality.bitRate);
           mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, mVideoQuality.frameRate);
           mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
           mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
           mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           mMediaCodec.start();
       }
       catch(IOException e) {
           e.printStackTrace();
       }

       long startTimestamp = System.currentTimeMillis();
       long frameTimestamp = 0;
       byte[] rawFrame = null;

       try {
           while (!Thread.interrupted()) {
               rawFrame = mFrameQueue.take();

               // elapsed time in microseconds (milliseconds x 1000), used as MediaCodec's presentationTimeUs
               frameTimestamp = 1000 * (System.currentTimeMillis() - startTimestamp);

               encodeFrame(rawFrame, frameTimestamp);

               // return the buffer to be reused
               mCamera.addCallbackBuffer(rawFrame);
           }
       }
       catch (InterruptedException ignore) {
           // interrupted while waiting on the queue; fall through to clean-up
       }

       // Clean up video stream allocations
       try {
           mMediaCodec.stop();
           mMediaCodec.release();
           mOutputStream.flush();
           mOutputStream.close();
       } catch (Exception e){
           e.printStackTrace();
       }
    }
    ...
    private void encodeFrame(byte[] input, long timestamp) {
       try {
           ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
           ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();

           int inputBufferIndex = mMediaCodec.dequeueInputBuffer(0);

           // dequeueInputBuffer(0) returns a negative index when no input
           // buffer is free; in that case this raw frame is silently dropped
           if (inputBufferIndex >= 0) {
               ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
               inputBuffer.clear();
               inputBuffer.put(input);
               mMediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, timestamp, 0);
           }

           MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();

           int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);

           if (outputBufferIndex >= 0) {
               while (outputBufferIndex >= 0) {
                   ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];

                   // Should this be a direct byte buffer?
                   byte[] outData = new byte[bufferInfo.size - bufferInfo.offset];
                   outputBuffer.get(outData);

                   mFrameRecorder.record(outData, bufferInfo.offset, outData.length, timestamp);

                   mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                   outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
               }
           }
           else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
               outputBuffers = mMediaCodec.getOutputBuffers();
           } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
               // ignore for now (see the sketch after this excerpt)
           }
       } catch (Throwable t) {
           t.printStackTrace();
       }

    }
    ...
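
    For what it's worth, the INFO_OUTPUT_FORMAT_CHANGED case that I currently ignore is where the encoder exposes the SPS and PPS under the standard "csd-0"/"csd-1" keys of its output format. A hypothetical helper (untested; the method name is made up) could read them there:

    private void onOutputFormatChanged(MediaCodec codec) {
       // The encoder reports its negotiated output format once, before
       // the first encoded frame is returned.
       MediaFormat newFormat = codec.getOutputFormat();
       ByteBuffer sps = newFormat.getByteBuffer("csd-0"); // SPS, Annex B start code included
       ByteBuffer pps = newFormat.getByteBuffer("csd-1"); // PPS, Annex B start code included
       // These could seed the muxer's global extradata instead of (or in
       // addition to) being sent in-band -- an assumption, not verified here.
    }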

    FFmpegFrameRecorder.java

    ...
    // Hackish codec copy frame recording function
    public boolean record(byte[] encodedData, int offset, int length, long frameCount) throws Exception {
       int ret;

       if (encodedData == null) {
           return false;
       }

       av_init_packet(video_pkt);

       // this is why I wondered whether I should get the output buffer data into a direct byte buffer
       video_outbuf.put(encodedData, 0, encodedData.length);

       video_pkt.data(video_outbuf);
       video_pkt.size(video_outbuf_size);

       video_pkt.pts(frameCount);
       video_pkt.dts(frameCount);

       video_pkt.stream_index(video_st.index());

       synchronized (oc) {
           /* write the compressed frame in the media file */
           if (interleaved && audio_st != null) {
               if ((ret = av_interleaved_write_frame(oc, video_pkt)) < 0) {
                   throw new Exception("av_interleaved_write_frame() error " + ret + " while writing interleaved video frame.");
               }
           } else {
               if ((ret = av_write_frame(oc, video_pkt)) < 0) {
                   throw new Exception("av_write_frame() error " + ret + " while writing video frame.");
               }
           }
       }
       return (video_pkt.flags() & AV_PKT_FLAG_KEY) == 1;
    }
    ...

    When I try to stream the video and run ffprobe on it, I get the following output:

    ffprobe version 2.5.3 Copyright (c) 2007-2015 the FFmpeg developers
     built on Jan 19 2015 12:56:57 with gcc 4.1.2 (GCC) 20080704 (Red Hat 4.1.2-55)
     configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic' --enable-bzlib --disable-crystalhd --enable-libass --enable-libdc1394 --enable-libfaac --enable-nonfree --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopencv --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --enable-libcaca --shlibdir=/usr/lib64 --enable-runtime-cpudetect
     libavutil      54. 15.100 / 54. 15.100
     libavcodec     56. 13.100 / 56. 13.100
     libavformat    56. 15.102 / 56. 15.102
     libavdevice    56.  3.100 / 56.  3.100
     libavfilter     5.  2.103 /  5.  2.103
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Metadata:
     Server                NGINX RTMP (github.com/arut/nginx-rtmp-module)
     width                 320.00
     height                240.00
     displayWidth          320.00
     displayHeight         240.00
     duration              0.00
     framerate             0.00
     fps                   0.00
     videodatarate         261.00
     videocodecid          7.00
     audiodatarate         62.00
     audiocodecid          10.00
     profile
     level
    [live_flv @ 0x1edb0820] Could not find codec parameters for stream 0 (Video: none, none, 267 kb/s): unknown codec
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    Input #0, live_flv, from 'rtmp://<server>/input/<stream>':
     Metadata:
       Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
       displayWidth    : 320
       displayHeight   : 240
       fps             : 0
       profile         :
       level           :
     Duration: 00:00:00.00, start: 16.768000, bitrate: N/A
       Stream #0:0: Video: none, none, 267 kb/s, 1k tbr, 1k tbn, 1k tbc
       Stream #0:1: Audio: aac (LC), 16000 Hz, mono, fltp, 63 kb/s
    Unsupported codec with id 0 for input stream 0

    I am not, by any means, an expert in H264 or video encoding. I know that the encoded frames that come out of MediaCodec contain SPS NAL, PPS NAL, and frame NAL units. I've also written the MediaCodec output to a file and was able to play it back (I did have to specify the format and framerate, as otherwise it would play too fast).

    My assumption is that things should work (see how little I know :)). Since the SPS and PPS are written out, the decoder should know enough. Yet ffprobe fails to recognize the codec, fps, and other video information. Do I need to pass packet flag information to the record() function in FFmpegFrameRecorder.java? Or should I use a direct buffer? Any suggestion will be appreciated! I should be able to figure things out with a hint.
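
    To make the packet-flag question concrete, here is a hypothetical variant of record() (the extra boolean parameter and the flag handling are my own guesses, untested):

    public boolean record(byte[] encodedData, int offset, int length,
            long timestamp, boolean isKeyFrame) throws Exception {
       if (encodedData == null) {
           return false;
       }

       av_init_packet(video_pkt);

       video_outbuf.position(0); // rewind before reuse
       video_outbuf.put(encodedData, offset, length);

       video_pkt.data(video_outbuf);
       video_pkt.size(length); // actual payload size, not the buffer capacity

       // assumes the caller supplies pts/dts in the stream's time_base units
       video_pkt.pts(timestamp);
       video_pkt.dts(timestamp);

       if (isKeyFrame) {
           // let the muxer mark this packet as a keyframe
           video_pkt.flags(video_pkt.flags() | AV_PKT_FLAG_KEY);
       }

       video_pkt.stream_index(video_st.index());

       synchronized (oc) {
           return av_interleaved_write_frame(oc, video_pkt) >= 0;
       }
    }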

    PS: I know that some codecs use planar and others semi-planar color formats. That distinction will come later if I get past this. Also, I didn't take the Surface-to-MediaCodec route because I need to support API 17, and it requires more changes than this approach, which I think helps me understand the more basic flow. Again, I appreciate any suggestions. Please let me know if something needs to be clarified.

    Update #1

    So, having done more testing, I see that my encoder outputs the following frames:

    000000016742800DDA0507E806D0A1350000000168CE06E2
    0000000165B840A6F1E7F0EA24000AE73BEB5F51CC7000233A84240...
    0000000141E2031364E387FD4F9BB3D67F51CC7000279B9F9CFE811...
    0000000141E40304423FFFFF0B7867F89FAFFFFFFFFFFCBE8EF25E6...
    0000000141E602899A3512EF8AEAD1379F0650CC3F905131504F839...
    ...

    The very first frame contains the SPS and PPS. From what I was able to see, these are transmitted only once. The rest are NAL types 1 and 5. So my assumption is that, for ffprobe to see the stream info even when it joins after the stream has started, I should capture the SPS and PPS and re-transmit them myself periodically, after a certain number of frames, or perhaps before every I-frame. What do you think?
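
    Something along these lines is what I have in mind (hypothetical and untested; the field and method names are made up):

    private byte[] mConfigData; // SPS+PPS from the encoder's first output

    private void writeEncodedFrame(ByteBuffer outputBuffer,
            MediaCodec.BufferInfo bufferInfo, long timestamp) throws Exception {
       byte[] outData = new byte[bufferInfo.size];
       outputBuffer.position(bufferInfo.offset);
       outputBuffer.limit(bufferInfo.offset + bufferInfo.size);
       outputBuffer.get(outData);

       if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
           // The very first output holds only SPS+PPS; cache it rather
           // than muxing it as a frame.
           mConfigData = outData;
           return;
       }

       if (mConfigData != null
               && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0) {
           // Re-send SPS+PPS in front of every I-frame so that clients
           // (and ffprobe) joining mid-stream can pick up the parameters.
           byte[] prefixed = new byte[mConfigData.length + outData.length];
           System.arraycopy(mConfigData, 0, prefixed, 0, mConfigData.length);
           System.arraycopy(outData, 0, prefixed, mConfigData.length, outData.length);
           outData = prefixed;
       }

       mFrameRecorder.record(outData, 0, outData.length, timestamp);
    }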

    Update #2

    I am unable to validate that I'm writing frames successfully. When I try to read back the written packet, I cannot validate the written bytes. Strangely, even on a successful write of an IPL image while streaming, I also cannot print out the bytes of the encoded packet after avcodec_encode_video2. I've hit the official dead end.

  • iOS Recorded Video Playback on Android

    1 March 2015, by Nirav

    I am trying to record video on an iPhone using UIImagePickerController and I am able to store it in MP4 format. The same video is uploaded to the Amazon S3 cloud.

    When I try to play the same video on Android devices, it fails with a "cannot play" error.

    I searched forums/Google and found that ffmpeg should be used to compress the video before uploading. I want to do the compression on the phone itself rather than on the server. What is the best way to achieve this?
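
    For reference, the kind of ffmpeg invocation usually suggested for broadly Android-compatible output (H.264 baseline profile with yuv420p; file names are placeholders):

    ffmpeg -i input.mp4 -c:v libx264 -profile:v baseline -level 3.0 \
           -pix_fmt yuv420p -c:a aac -strict -2 output.mp4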

    Regards,

    Nirav

  • In Android, how can I display mp4 files captured by Android itself without running into the "sorry, this video cannot be played" error?

    31 October 2013, by fercis

    I am new to Android and have an application that works as follows:

    1) I record videos with my phone (Samsung Galaxy Note 2).
    2) I send the video files (mp4) to a cloud service (Dropbox in my example) without any change in encoding, and store the URL of the uploaded video.

    And when the application is in video-viewing mode,
    3) I want to view the videos from that URL (which may have been captured by other client applications) in a VideoView.

    I cannot view the videos captured by the phone itself. There are many discussions on SO about encoding compatibility (H.263, H.264, etc.).

    Video recoding with ffmpeg

    However, I could not find a way to CHANGE the encoding of the video recorded by the application. The Camera.Parameters class has encoding information only for pictures. Quality is not important; I only want to capture videos that can be played directly in the same Android application without any change.
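
    For illustration, a minimal sketch (using the standard android.media.MediaRecorder API; the output path is a placeholder) of how the recording encoding is normally chosen, via MediaRecorder and CamcorderProfile rather than Camera.Parameters:

    private MediaRecorder startH264Recording() throws IOException {
       Camera camera = Camera.open();
       camera.unlock(); // hand the camera over to MediaRecorder

       MediaRecorder recorder = new MediaRecorder();
       recorder.setCamera(camera);
       recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
       recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
       // QUALITY_480P bundles H.264 video and AAC audio, which most
       // devices can both record and play back directly
       recorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_480P));
       recorder.setOutputFile("/sdcard/capture.mp4"); // placeholder path
       recorder.prepare();
       recorder.start();
       return recorder;
    }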