
Other articles (61)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Apache-specific configuration

    4 February 2011

    Specific modules
    For the Apache configuration, it is advisable to enable certain modules that are not specific to MediaSPIP but that improve performance: mod_deflate and mod_headers, to have Apache compress pages automatically (see this tutorial); mod_expires, to handle hit expiration correctly (see this tutorial).
    It is also advisable to add Apache support for the WebM MIME type, as described in this tutorial.
    Creating a (...)
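
    As an illustrative sketch only (the module activation command and the cache lifetime are examples, not values prescribed by the article), the corresponding Apache configuration could look like this:

    # Enable the modules first, e.g. on Debian/Ubuntu:
    #   a2enmod deflate headers expires

    # Serve WebM files with the correct MIME type
    AddType video/webm .webm

    <IfModule mod_deflate.c>
        # Compress text-based responses automatically
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

    <IfModule mod_expires.c>
        # Let browsers cache static media (example lifetime)
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
    </IfModule>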

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (6546)

  • How to play raw h264 produced by MediaCodec encoder?

    1 November 2014, by jackos2500

    I’m a bit new when it comes to MediaCodec (and video encoding/decoding in general), so correct me if anything I say here is wrong.

    I want to play the raw h264 output of MediaCodec with VLC/ffplay. I need this to play because my end goal is to stream some live video to a computer, and MediaMuxer only produces a file on disk rather than something I can stream with (very) low latency to a desktop. (I’m open to other solutions, but I have not found anything else that fits the latency requirement)

    Here is the code I’m using to encode the video and write it to a file (it’s based on the MediaCodec example found here, only with the MediaMuxer part removed):

    package com.jackos2500.droidtop;

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.opengl.EGL14;
    import android.opengl.EGLConfig;
    import android.opengl.EGLContext;
    import android.opengl.EGLDisplay;
    import android.opengl.EGLExt;
    import android.opengl.EGLSurface;
    import android.opengl.GLES20;
    import android.os.Environment;
    import android.util.Log;
    import android.view.Surface;

    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    public class StreamH264 {
       private static final String TAG = "StreamH264";
       private static final boolean VERBOSE = true;           // lots of logging

       // where to put the output file (note: /sdcard requires WRITE_EXTERNAL_STORAGE permission)
       private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();

       public static int MEGABIT = 1000 * 1000;
       private static final int IFRAME_INTERVAL = 10;

       private static final int TEST_R0 = 0;
       private static final int TEST_G0 = 136;
       private static final int TEST_B0 = 0;
       private static final int TEST_R1 = 236;
       private static final int TEST_G1 = 50;
       private static final int TEST_B1 = 186;

       private MediaCodec codec;
       private CodecInputSurface inputSurface;
       private BufferedOutputStream out;

       private MediaCodec.BufferInfo bufferInfo;
       public StreamH264() {

       }

       private void prepareEncoder() throws IOException {
           bufferInfo = new MediaCodec.BufferInfo();

           MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
           format.setInteger(MediaFormat.KEY_BIT_RATE, 2 * MEGABIT);
           format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
           format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
           format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

           codec = MediaCodec.createEncoderByType("video/avc");
           codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
           inputSurface = new CodecInputSurface(codec.createInputSurface());
           codec.start();

           File dst = new File(OUTPUT_DIR, "test.264");
           out = new BufferedOutputStream(new FileOutputStream(dst));
       }
       private void releaseEncoder() throws IOException {
           if (VERBOSE) Log.d(TAG, "releasing encoder objects");
           if (codec != null) {
               codec.stop();
               codec.release();
               codec = null;
           }
           if (inputSurface != null) {
               inputSurface.release();
               inputSurface = null;
           }
           if (out != null) {
               out.flush();
               out.close();
               out = null;
           }
       }
       public void stream() throws IOException {
           try {
               prepareEncoder();
               inputSurface.makeCurrent();
               for (int i = 0; i < (30 * 5); i++) {
                   // Feed any pending encoder output into the file.
                   drainEncoder(false);

                   // Generate a new frame of input.
                   generateSurfaceFrame(i);
                   inputSurface.setPresentationTime(computePresentationTimeNsec(i, 30));

                   // Submit it to the encoder.  The eglSwapBuffers call will block if the input
                   // is full, which would be bad if it stayed full until we dequeued an output
                   // buffer (which we can't do, since we're stuck here).  So long as we fully drain
                   // the encoder before supplying additional input, the system guarantees that we
                   // can supply another frame without blocking.
                   if (VERBOSE) Log.d(TAG, "sending frame " + i + " to encoder");
                   inputSurface.swapBuffers();
               }
               // send end-of-stream to encoder, and drain remaining output
               drainEncoder(true);
           } finally {
               // release encoder, muxer, and input Surface
               releaseEncoder();
           }
       }

       private void drainEncoder(boolean endOfStream) throws IOException {
           final int TIMEOUT_USEC = 10000;
           if (VERBOSE) Log.d(TAG, "drainEncoder(" + endOfStream + ")");

           if (endOfStream) {
               if (VERBOSE) Log.d(TAG, "sending EOS to encoder");
               codec.signalEndOfInputStream();
           }
           ByteBuffer[] outputBuffers = codec.getOutputBuffers();
           while (true) {
               int encoderStatus = codec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC);
               if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                   // no output available yet
                   if (!endOfStream) {
                       break;      // out of while
                   } else {
                       if (VERBOSE) Log.d(TAG, "no output available, spinning to await EOS");
                   }
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                   // not expected for an encoder
                   outputBuffers = codec.getOutputBuffers();
               } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                   // should happen before receiving buffers, and should only happen once
                   MediaFormat newFormat = codec.getOutputFormat();
                   Log.d(TAG, "encoder output format changed: " + newFormat);
               } else if (encoderStatus < 0) {
                   Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " + encoderStatus);
                   // let's ignore it
               } else {
                   ByteBuffer encodedData = outputBuffers[encoderStatus];
                   if (encodedData == null) {
                       throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");
                   }

                   if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                       // The codec config data was pulled out and fed to the muxer when we got
                       // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                       if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                       bufferInfo.size = 0;
                   }

                   if (bufferInfo.size != 0) {
                       // adjust the ByteBuffer values to match BufferInfo (not needed?)
                       encodedData.position(bufferInfo.offset);
                       encodedData.limit(bufferInfo.offset + bufferInfo.size);

                       byte[] data = new byte[bufferInfo.size];
                       encodedData.get(data);
                       out.write(data);
                       if (VERBOSE) Log.d(TAG, "sent " + bufferInfo.size + " bytes to file");
                   }

                   codec.releaseOutputBuffer(encoderStatus, false);

                   if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                       if (!endOfStream) {
                           Log.w(TAG, "reached end of stream unexpectedly");
                       } else {
                           if (VERBOSE) Log.d(TAG, "end of stream reached");
                       }
                       break;      // out of while
                   }
               }
           }
       }
       private void generateSurfaceFrame(int frameIndex) {
           frameIndex %= 8;

           int startX, startY;
           if (frameIndex < 4) {
               // (0,0) is bottom-left in GL
               startX = frameIndex * (1280 / 4);
               startY = 720 / 2;
           } else {
               startX = (7 - frameIndex) * (1280 / 4);
               startY = 0;
           }

           GLES20.glClearColor(TEST_R0 / 255.0f, TEST_G0 / 255.0f, TEST_B0 / 255.0f, 1.0f);
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

           GLES20.glEnable(GLES20.GL_SCISSOR_TEST);
           GLES20.glScissor(startX, startY, 1280 / 4, 720 / 2);
           GLES20.glClearColor(TEST_R1 / 255.0f, TEST_G1 / 255.0f, TEST_B1 / 255.0f, 1.0f);
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
           GLES20.glDisable(GLES20.GL_SCISSOR_TEST);
       }
       private static long computePresentationTimeNsec(int frameIndex, int frameRate) {
           final long ONE_BILLION = 1000000000;
           return frameIndex * ONE_BILLION / frameRate;
       }

       /**
        * Holds state associated with a Surface used for MediaCodec encoder input.
        * <p>
        * The constructor takes a Surface obtained from MediaCodec.createInputSurface(), and uses that
        * to create an EGL window surface.  Calls to eglSwapBuffers() cause a frame of data to be sent
        * to the video encoder.
        * </p><p>
        * This object owns the Surface -- releasing this will release the Surface too.
        */
       private static class CodecInputSurface {
           private static final int EGL_RECORDABLE_ANDROID = 0x3142;

           private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
           private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
           private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;

           private Surface mSurface;

           /**
            * Creates a CodecInputSurface from a Surface.
            */
           public CodecInputSurface(Surface surface) {
               if (surface == null) {
                   throw new NullPointerException();
               }
               mSurface = surface;

               eglSetup();
           }

           /**
            * Prepares EGL.  We want a GLES 2.0 context and a surface that supports recording.
            */
           private void eglSetup() {
               mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
               if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
                   throw new RuntimeException("unable to get EGL14 display");
               }
               int[] version = new int[2];
               if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
                   throw new RuntimeException("unable to initialize EGL14");
               }

               // Configure EGL for recording and OpenGL ES 2.0.
               int[] attribList = {
                       EGL14.EGL_RED_SIZE, 8,
                       EGL14.EGL_GREEN_SIZE, 8,
                       EGL14.EGL_BLUE_SIZE, 8,
                       EGL14.EGL_ALPHA_SIZE, 8,
                       EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                       EGL_RECORDABLE_ANDROID, 1,
                       EGL14.EGL_NONE
               };
               EGLConfig[] configs = new EGLConfig[1];
               int[] numConfigs = new int[1];
               EGL14.eglChooseConfig(mEGLDisplay, attribList, 0, configs, 0, configs.length,
                       numConfigs, 0);
               checkEglError("eglCreateContext RGB888+recordable ES2");

               // Configure context for OpenGL ES 2.0.
               int[] attrib_list = {
                       EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                       EGL14.EGL_NONE
               };
               mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                       attrib_list, 0);
               checkEglError("eglCreateContext");

               // Create a window surface, and attach it to the Surface we received.
               int[] surfaceAttribs = {
                       EGL14.EGL_NONE
               };
               mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], mSurface,
                       surfaceAttribs, 0);
               checkEglError("eglCreateWindowSurface");
           }

           /**
            * Discards all resources held by this class, notably the EGL context.  Also releases the
            * Surface that was passed to our constructor.
            */
           public void release() {
               if (mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
                   EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                           EGL14.EGL_NO_CONTEXT);
                   EGL14.eglDestroySurface(mEGLDisplay, mEGLSurface);
                   EGL14.eglDestroyContext(mEGLDisplay, mEGLContext);
                   EGL14.eglReleaseThread();
                   EGL14.eglTerminate(mEGLDisplay);
               }

               mSurface.release();

               mEGLDisplay = EGL14.EGL_NO_DISPLAY;
               mEGLContext = EGL14.EGL_NO_CONTEXT;
               mEGLSurface = EGL14.EGL_NO_SURFACE;

               mSurface = null;
           }

           /**
            * Makes our EGL context and surface current.
            */
           public void makeCurrent() {
               EGL14.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext);
               checkEglError("eglMakeCurrent");
           }

           /**
            * Calls eglSwapBuffers.  Use this to "publish" the current frame.
            */
           public boolean swapBuffers() {
               boolean result = EGL14.eglSwapBuffers(mEGLDisplay, mEGLSurface);
               checkEglError("eglSwapBuffers");
               return result;
           }

           /**
            * Sends the presentation time stamp to EGL.  Time is expressed in nanoseconds.
            */
           public void setPresentationTime(long nsecs) {
               EGLExt.eglPresentationTimeANDROID(mEGLDisplay, mEGLSurface, nsecs);
               checkEglError("eglPresentationTimeANDROID");
           }

           /**
            * Checks for EGL errors.  Throws an exception if one is found.
            */
           private void checkEglError(String msg) {
               int error;
               if ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) {
                   throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error));
               }
           }
       }
    }

    However, the file produced from this code does not play with VLC or ffplay. Can anyone tell me what I’m doing wrong? I believe it is due to an incorrect format (or total lack) of headers required for the playing of raw h264, as I have had success playing .264 files downloaded from the internet with ffplay. Also, I’m not sure exactly how I’m going to stream this video to a computer, so if somebody could give me some suggestions as to how I might do that, I would be very grateful! Thanks!
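
     A note on a likely cause, consistent with that hypothesis: drainEncoder() zeroes bufferInfo.size whenever it sees BUFFER_FLAG_CODEC_CONFIG (a leftover from the muxer-based example this was adapted from), so the SPS/PPS parameter sets are never written to test.264, and a raw Annex-B h264 stream without those headers is exactly the kind of file VLC/ffplay tends to reject. A minimal, untested sketch of the change in drainEncoder():

     if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
         // With no MediaMuxer in the pipeline, this buffer holds the only copy
         // of the SPS/PPS headers; leave bufferInfo.size intact so the write
         // below sends them to the file like any other NAL unit.
         if (VERBOSE) Log.d(TAG, "writing BUFFER_FLAG_CODEC_CONFIG (SPS/PPS)");
     }

     With the headers in place, forcing the format when testing can also help: ffplay -f h264 test.264 (this assumes the encoder emits Annex-B start codes, which is the common behaviour of "video/avc" encoders).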

  • FFMPEG Encoding Issues

    16 October 2014, by madprogrammer2015

    I am having issues encoding screen captures into an h.264 file for viewing. The program below is cobbled together from examples here and here. The first example uses an older version of the ffmpeg api, so I tried to update that example for use in my program. The file is created and has something written to it, but when I view the file, the encoded images are all distorted. I am able to run the video encoding example from the ffmpeg api successfully. This is my first time posting, so if I missed anything please let me know.

    I appreciate any assistance that is given.

    My program:

     // NOTE: several #include targets were lost to HTML escaping in the original
     // post; the headers below are reconstructed from what the code actually uses
     // (Win32 GDI, clock(), iostream, and the FFmpeg libraries).
     #include <Windows.h>
     #include <string>
     #include <sstream>
     #include <ctime>
     #include <iostream>
     #include <cstdint>

     extern "C"{
     #include <libavcodec/avcodec.h>
     #include <libavutil/imgutils.h>
     #include <libswscale/swscale.h>
     #include <libavutil/opt.h>
     }

    using namespace std;

    void ScreenShot(const char* BmpName, uint8_t *frame)
    {
       HWND DesktopHwnd = GetDesktopWindow();
       RECT DesktopParams;
       HDC DevC = GetDC(DesktopHwnd);
       GetWindowRect(DesktopHwnd,&DesktopParams);
       DWORD Width = DesktopParams.right - DesktopParams.left;
       DWORD Height = DesktopParams.bottom - DesktopParams.top;

       DWORD FileSize = sizeof(BITMAPFILEHEADER)+sizeof(BITMAPINFOHEADER)+(sizeof(RGBTRIPLE)+1*(Width*Height*4));
       char *BmpFileData = (char*)GlobalAlloc(0x0040,FileSize);

       PBITMAPFILEHEADER BFileHeader = (PBITMAPFILEHEADER)BmpFileData;
       PBITMAPINFOHEADER  BInfoHeader = (PBITMAPINFOHEADER)&BmpFileData[sizeof(BITMAPFILEHEADER)];

       BFileHeader->bfType = 0x4D42; // BM
       BFileHeader->bfSize = sizeof(BITMAPFILEHEADER);
       BFileHeader->bfOffBits = sizeof(BITMAPFILEHEADER)+sizeof(BITMAPINFOHEADER);

       BInfoHeader->biSize = sizeof(BITMAPINFOHEADER);
       BInfoHeader->biPlanes = 1;
       BInfoHeader->biBitCount = 32;
       BInfoHeader->biCompression = BI_RGB;
       BInfoHeader->biHeight = Height;
       BInfoHeader->biWidth = Width;

       RGBTRIPLE *Image = (RGBTRIPLE*)&BmpFileData[sizeof(BITMAPFILEHEADER)+sizeof(BITMAPINFOHEADER)];
       RGBTRIPLE color;
       //pPixels = (RGBQUAD **)new RGBQUAD[sizeof(BITMAPFILEHEADER) + sizeof(BITMAPINFOHEADER)];
       int start = clock();

       HDC CaptureDC = CreateCompatibleDC(DevC);
       HBITMAP CaptureBitmap = CreateCompatibleBitmap(DevC,Width,Height);
       SelectObject(CaptureDC,CaptureBitmap);
       BitBlt(CaptureDC,0,0,Width,Height,DevC,0,0,SRCCOPY|CAPTUREBLT);
       GetDIBits(CaptureDC,CaptureBitmap,0,Height,frame,(LPBITMAPINFO)BInfoHeader, DIB_RGB_COLORS);

       int end = clock();

       cout << "it took " << end - start << " to capture a frame" << endl;

       DWORD Junk;
       HANDLE FH = CreateFileA(BmpName,GENERIC_WRITE,FILE_SHARE_WRITE,0,CREATE_ALWAYS,0,0);
       WriteFile(FH,BmpFileData,FileSize,&Junk,0);
       CloseHandle(FH);
       GlobalFree(BmpFileData);
    }

    void video_encode_example(const char *filename, int codec_id)
    {
       AVCodec *codec;
       AVCodecContext *c= NULL;
       int i, ret, x, y, got_output;
       FILE *f;
       AVFrame *frame;
       AVPacket pkt;
       uint8_t endcode[] = { 0, 0, 1, 0xb7 };

       printf("Encode video file %s\n", filename);

       /* find the mpeg1 video encoder */
       codec = avcodec_find_encoder(AV_CODEC_ID_H264);
       if (!codec) {
           fprintf(stderr, "Codec not found\n");
           cin.get();
           exit(1);
       }

       c = avcodec_alloc_context3(codec);
       if (!c) {
           fprintf(stderr, "Could not allocate video codec context\n");
           cin.get();
           exit(1);
       }

       /* put sample parameters */
       c->bit_rate = 400000;
       /* resolution must be a multiple of two */
       c->width = 352;
       c->height = 288;
       /* frames per second */
       c->time_base.num=1;
       c->time_base.den = 25;
       c->gop_size = 10; /* emit one intra frame every ten frames */
       c->max_b_frames=1;
       c->pix_fmt = AV_PIX_FMT_YUV420P;

       if(codec_id == AV_CODEC_ID_H264)
           av_opt_set(c->priv_data, "preset", "slow", 0);

       /* open it */
       if (avcodec_open2(c, codec, NULL) < 0) {
           fprintf(stderr, "Could not open codec\n");
           exit(1);
       }

       f = fopen(filename, "wb");
       if (!f) {
           fprintf(stderr, "Could not open %s\n", filename);
           exit(1);
       }

       frame = av_frame_alloc();
       if (!frame) {
           fprintf(stderr, "Could not allocate video frame\n");
           exit(1);
       }

       frame->format = c->pix_fmt;
       frame->width  = c->width;
       frame->height = c->height;

       /* the image can be allocated by any means and av_image_alloc() is
       just the most convenient way if av_malloc() is to be used */
       ret = av_image_alloc(frame->data, frame->linesize, c->width, c->height, c->pix_fmt, 32);

       if (ret < 0) {
           fprintf(stderr, "Could not allocate raw picture buffer\n");
           exit(1);
       }

       /* encode 1 second of video */
       for(i=0;i<250;i++) {
           av_init_packet(&pkt);
           pkt.data = NULL;    // packet data will be allocated by the encoder
           pkt.size = 0;

           fflush(stdout);
           /* prepare a dummy image */
           /* Y */
           for(y=0;y<c->height;y++) {
               for(x=0;x<c->width;x++) {
                   frame->data[0][y * frame->linesize[0] + x] = x + y + i * 3;
               }
           }

           /* Cb and Cr */
           for(y=0;y<c->height/2;y++) {
               for(x=0;x<c->width/2;x++) {
                   frame->data[1][y * frame->linesize[1] + x] = 128 + y + i * 2;
                   frame->data[2][y * frame->linesize[2] + x] = 64 + x + i * 5;
               }
           }

           frame->pts = i;

           /* encode the image */
           ret = avcodec_encode_video2(c, &pkt, frame, &got_output);
           if (ret < 0) {
               fprintf(stderr, "Error encoding frame\n");
               exit(1);
           }

           if (got_output) {
               printf("Write frame %3d (size=%5d)\n", i, pkt.size);
               fwrite(pkt.data, 1, pkt.size, f);
               av_free_packet(&pkt);
           }
       }

       /* get the delayed frames */
       for (got_output = 1; got_output; i++) {
           fflush(stdout);

           ret = avcodec_encode_video2(c, &pkt, NULL, &got_output);

           if (ret < 0) {
               fprintf(stderr, "Error encoding frame\n");
               exit(1);
           }

           if (got_output) {
               printf("Write frame %3d (size=%5d)\n", i, pkt.size);
               fwrite(pkt.data, 1, pkt.size, f);
               av_free_packet(&pkt);
           }
       }

       /* add sequence end code to have a real mpeg file */
       fwrite(endcode, 1, sizeof(endcode), f);
       fclose(f);

        avcodec_close(c);
        av_free(c);
        av_freep(&frame->data[0]);
        av_frame_free(&frame);
        printf("\n");
    }

    void write_video_frame()
    {
    }

    int lineSizeOfFrame(int width)
    {
       return  (width*24 + 31)/32 * 4;//((width*24 / 8) + 3) & ~3;//(width*24 + 31)/32 * 4;
    }

    int getScreenshotWithCursor(uint8_t* frame)
    {
       int successful = 0;
           HDC screen, bitmapDC;
           HBITMAP screen_bitmap;

           screen = GetDC(NULL);
           RECT DesktopParams;

           HWND desktop = GetDesktopWindow();
           GetWindowRect(desktop, &DesktopParams);

           int width = DesktopParams.right;
           int height = DesktopParams.bottom;

           bitmapDC = CreateCompatibleDC(screen);
           screen_bitmap = CreateCompatibleBitmap(screen, width, height);
           SelectObject(bitmapDC, screen_bitmap);
           if (BitBlt(bitmapDC, 0, 0, width, height, screen, 0, 0, SRCCOPY))
           {
                   int pos_x, pos_y;
                   HICON hcur;
                   ICONINFO icon_info;
                   CURSORINFO cursor_info;
                   cursor_info.cbSize = sizeof(CURSORINFO);
                   if (GetCursorInfo(&cursor_info))
                   {
                           if (cursor_info.flags == CURSOR_SHOWING)
                           {
                                   hcur = CopyIcon(cursor_info.hCursor);
                                   if (GetIconInfo(hcur, &icon_info))
                                   {
                                           pos_x = cursor_info.ptScreenPos.x - icon_info.xHotspot;
                                           pos_y = cursor_info.ptScreenPos.y - icon_info.yHotspot;
                                           DrawIcon(bitmapDC, pos_x, pos_y, hcur);
                                           if (icon_info.hbmColor) DeleteObject(icon_info.hbmColor);
                                           if (icon_info.hbmMask) DeleteObject(icon_info.hbmMask);
                                   }
                           }
                   }
                   int header_size = sizeof(BITMAPINFOHEADER) + 256*sizeof(RGBQUAD);
                   size_t line_size = lineSizeOfFrame(width);
                   PBITMAPINFO lpbi = (PBITMAPINFO) malloc(header_size);
                   lpbi->bmiHeader.biSize = header_size;
                   lpbi->bmiHeader.biWidth = width;
                   lpbi->bmiHeader.biHeight = height;
                   lpbi->bmiHeader.biPlanes = 1;
                   lpbi->bmiHeader.biBitCount = 24;
                   lpbi->bmiHeader.biCompression = BI_RGB;
                   lpbi->bmiHeader.biSizeImage = height*line_size;
                   lpbi->bmiHeader.biXPelsPerMeter = 0;
                   lpbi->bmiHeader.biYPelsPerMeter = 0;
                   lpbi->bmiHeader.biClrUsed = 0;
                   lpbi->bmiHeader.biClrImportant = 0;
                   if (GetDIBits(bitmapDC, screen_bitmap, 0, height, (LPVOID)frame, lpbi, DIB_RGB_COLORS))
                   {
                       int i;
                       uint8_t *buf_begin = frame;
                       uint8_t *buf_end = frame + line_size*(lpbi->bmiHeader.biHeight - 1);
                       void *temp = malloc(line_size);
                       for (i = 0; i < lpbi->bmiHeader.biHeight / 2; ++i)
                       {
                           memcpy(temp, buf_begin, line_size);
                           memcpy(buf_begin, buf_end, line_size);
                           memcpy(buf_end, temp, line_size);
                           buf_begin += line_size;
                           buf_end -= line_size;
                       }
                       cout << *buf_begin << endl;
                       free(temp);
                       successful = 1;
                   }
                   free(lpbi);
           }
           DeleteObject(screen_bitmap);
           DeleteDC(bitmapDC);
           ReleaseDC(NULL, screen);
           return successful;
    }

    int main()
    {
       RECT DesktopParams;

       HWND desktop = GetDesktopWindow();
       GetWindowRect(desktop, &DesktopParams);

       int width = DesktopParams.right;
       int height = DesktopParams.bottom;
       uint8_t *frame = (uint8_t *)malloc(width * height);

       AVCodec *codec;
       AVCodecContext *codecContext = NULL;
       AVPacket packet;
       FILE *f;
       AVFrame *pictureYUV = NULL;
       AVFrame *pictureRGB;

       avcodec_register_all();

       codec = avcodec_find_encoder(AV_CODEC_ID_H264);

       if(!codec)
       {
           cout << "codec not found!" << endl;
           cin.get();
           return 1;
       }
       else
       {
           cout << "codec h264 found!" << endl;
       }

       codecContext = avcodec_alloc_context3(codec);

       codecContext->bit_rate = width * height * 4;
       codecContext->width = width;
       codecContext->height = height;
       codecContext->time_base.num = 1;
       codecContext->time_base.den = 250;
       codecContext->gop_size = 10;
       codecContext->max_b_frames = 1;
       codecContext->keyint_min = 1;
       codecContext->i_quant_factor = (float)0.71;                        // qscale factor between P and I frames
       codecContext->b_frame_strategy = 20;                               ///// find out exactly what this does
       codecContext->qcompress = (float)0.6;                              ///// find out exactly what this does
       codecContext->qmin = 20;                                           // minimum quantizer
       codecContext->qmax = 51;                                           // maximum quantizer
       codecContext->max_qdiff = 4;                                       // maximum quantizer difference between frames
       codecContext->refs = 4;                                            // number of reference frames
       codecContext->trellis = 1;
       codecContext->pix_fmt = AV_PIX_FMT_YUV420P;
       codecContext->codec_id = AV_CODEC_ID_H264;
       codecContext->codec_type = AVMEDIA_TYPE_VIDEO;

       if(avcodec_open2(codecContext, codec, NULL) < 0)
       {
           cout << "couldn't open codec" << endl;
           cout << stderr << endl;
           cin.get();
           return 1;
       }
       else
       {
           cout << "opened h264 codec!" << endl;
           cin.get();
       }

       f = fopen("test.h264", "wb");

       if(!f)
       {
           cout << "Unable to open file" << endl;
           return 1;
       }



       struct SwsContext *img_convert_ctx = sws_getContext(codecContext->width, codecContext->height, PIX_FMT_RGB32, codecContext->width,
           codecContext->height, codecContext->pix_fmt, SWS_BILINEAR, NULL, NULL, NULL);

       int got_output = 0, i = 0;
       uint8_t encode[] = { 0, 0, 1, 0xb7 };

       try
       {
           for(i = 0; i < codecContext->time_base.den; i++)
           {
               av_init_packet(&packet);
               packet.data = NULL;
               packet.size = 0;

               pictureRGB = av_frame_alloc();
               pictureYUV = av_frame_alloc();

               getScreenshotWithCursor(frame);
               //ScreenShot("example.bmp", frame);

               int nbytes = avpicture_get_size(AV_PIX_FMT_YUV420P, codecContext->width, codecContext->height);                                    // allocating outbuffer
               uint8_t* outbuffer = (uint8_t*)av_malloc(nbytes*sizeof(uint8_t));

               pictureRGB = av_frame_alloc();
               pictureYUV = av_frame_alloc();

               avpicture_fill((AVPicture*)pictureRGB, frame, PIX_FMT_RGB32, codecContext->width, codecContext->height);                   // fill image with input screenshot
               avpicture_fill((AVPicture*)pictureYUV, outbuffer, PIX_FMT_YUV420P, codecContext->width, codecContext->height);
               av_image_alloc(pictureYUV->data, pictureYUV->linesize, codecContext->width, codecContext->height, codecContext->pix_fmt, 32);

               sws_scale(img_convert_ctx, pictureRGB->data, pictureRGB->linesize, 0, codecContext->height, pictureYUV->data, pictureYUV->linesize);

               pictureYUV->pts = i;

               avcodec_encode_video2(codecContext, &packet, pictureYUV, &got_output);

               if(got_output)
               {
                   printf("Write frame %3d (size=%5d)\n", i, packet.size);
                   fwrite(packet.data, 1, packet.size, f);
                   av_free_packet(&packet);
               }

               //av_frame_free(&pictureRGB);
               //av_frame_free(&pictureYUV);
           }

           for(got_output = 1; got_output; i++)
           {
               fflush(stdout);

               avcodec_encode_video2(codecContext, &packet, NULL, &got_output);

               if (got_output) {
                   printf("Write frame %3d (size=%5d)\n", i, packet.size);
                   fwrite(packet.data, 1, packet.size, f);
                   av_free_packet(&packet);
               }
           }
       }
       catch(std::exception ex)
       {
           cout << ex.what() << endl;
       }

       avcodec_close(codecContext);
       av_free(codecContext);
       av_freep(&pictureYUV->data[0]);
       //av_frame_free(&amp;picture);
       fwrite(encode, 1, sizeof(encode), f);
       fclose(f);

       cin.get();

       return 0;
    }
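
     For what it’s worth, two details in the posted code could plausibly explain the distortion (what follows is a sketch, not a verified fix). First, main() allocates the capture buffer as malloc(width * height), one byte per pixel, while getScreenshotWithCursor() writes padded 24-bit rows into it. Second, that 24-bit, BGR, already row-flipped bitmap is then described to swscale as PIX_FMT_RGB32. A consistent pixel path, under those assumptions:

     // Size the buffer for padded 24-bit BMP rows; the original
     // malloc(width * height) is far too small for this data.
     int srcLineSize = (width * 24 + 31) / 32 * 4;  // same rule as lineSizeOfFrame()
     uint8_t *frame = (uint8_t *)malloc((size_t)srcLineSize * height);

     // The DIB rows are BGR, so convert from BGR24 rather than RGB32.
     struct SwsContext *ctx = sws_getContext(width, height, AV_PIX_FMT_BGR24,
                                             width, height, AV_PIX_FMT_YUV420P,
                                             SWS_BILINEAR, NULL, NULL, NULL);

     // getScreenshotWithCursor() already flips the image top-down, so a plain
     // positive stride is correct here.
     const uint8_t *srcSlice[1]  = { frame };
     int            srcStride[1] = { srcLineSize };
     sws_scale(ctx, srcSlice, srcStride, 0, height,
               pictureYUV->data, pictureYUV->linesize);

     Separately, the encode loop allocates pictureRGB and pictureYUV twice per frame and never frees them; allocating the frames once before the loop, as video_encode_example() does, would avoid the leak.
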
  • How to make your plugin configurable – Introducing the Piwik Platform

    18 September 2014, by Thomas Steur — Development

    This is the next post of our blog series where we introduce the capabilities of the Piwik platform (our previous post was How to add new pages and menu items to Piwik). This time you will learn how to define settings for your plugin. For this tutorial you will need to have basic knowledge of PHP.

    What can I do with settings?

    The Settings API offers you a simple way to make your plugin configurable within the Admin interface of Piwik without having to deal with HTML, JavaScript, CSS or CSRF tokens. There are many things you can do with settings, for instance letting users configure:

    • connection info for a third-party system such as a WordPress installation
    • select a metric to be displayed in your widget
    • select a refresh interval for your widget
    • which menu items, reports or widgets should be displayed
    • and much more

    Getting started

    In this series of posts, we assume that you have already set up your development environment. If not, visit the Piwik Developer Zone where you’ll find the tutorial Setting up Piwik.

    To summarize the things you have to do to get set up:

    • Install Piwik (for instance via git).
    • Activate the developer mode: ./console development:enable --full.
    • Generate a plugin: ./console generate:plugin --name="MySettingsPlugin". There should now be a folder plugins/MySettingsPlugin.
    • And activate the created plugin under Settings => Plugins.

    Let’s start creating settings

    We start by using the Piwik Console to create a settings template:

    ./console generate:settings

    The command will ask you to enter the name of the plugin the settings should belong to. I will simply use the plugin name chosen above, “MySettingsPlugin”. There should now be a file plugins/MySettingsPlugin/Settings.php which already contains some examples to get you started easily. To see the settings in action, go to Settings => Plugin settings in your Piwik installation.

    Adding one or more settings

    Settings are added in the init() method of the settings class by calling the method addSetting() and passing an instance of a UserSetting or SystemSetting object. How to create a setting is explained in the next chapter.

    Customising a setting

    To create a setting you have to define a name along with some options, for instance which input field should be displayed, what type of value you expect, a validator, and more. Depending on the input field, we might automatically validate the values for you. For example, if you define the available values for a select field, we make sure to validate and store only a valid value, which provides good security out of the box.

    For a list of possible properties have a look at the SystemSetting and UserSetting API reference.

    class Settings extends \Piwik\Plugin\Settings
    {
       public $refreshInterval;

       protected function init()
       {
           $this->setIntroduction('Here you can specify the settings for this plugin.');

           $this->createRefreshIntervalSetting();
       }

       private function createRefreshIntervalSetting()
       {
           $this->refreshInterval = new UserSetting('refreshInterval', 'Refresh Interval');
           $this->refreshInterval->type = static::TYPE_INT;
           $this->refreshInterval->uiControlType = static::CONTROL_TEXT;
           $this->refreshInterval->uiControlAttributes = array('size' => 3);
           $this->refreshInterval->description = 'How often the value should be updated';
           $this->refreshInterval->inlineHelp = 'Enter a number which is >= 15';
           $this->refreshInterval->defaultValue = '30';
           $this->refreshInterval->validate = function ($value, $setting) {
               if ($value < 15) {
                   throw new \Exception('Value is invalid');
               }
           };

           $this->addSetting($this->refreshInterval);
       }
    }

    In this example you can see some of those properties. Here we create a setting named “refreshInterval” with the display name “Refresh Interval”. We want the setting value to be an integer, and the user should enter this value in a text input field of size 3. There is a description, an inline help text and a default value of 30. The validate function makes sure to accept only integers that are at least 15; otherwise an error is shown in the UI.

    You do not always have to specify a PHP type and a uiControlType. For instance, if you specify the PHP type boolean, we automatically display a checkbox by default. Similarly, if you specify that a checkbox should be displayed, we assume that you want a boolean value.
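
     As a sketch, a boolean setting relying on this inference could look like the following (the autoRefresh name is made up for illustration):

     $this->autoRefresh = new UserSetting('autoRefresh', 'Auto Refresh');
     $this->autoRefresh->type = static::TYPE_BOOL;  // a checkbox is implied
     $this->autoRefresh->description = 'Refresh the widget automatically';
     $this->autoRefresh->defaultValue = false;
     $this->addSetting($this->autoRefresh);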

    Accessing settings values

    You can access the value of a setting in a widget, in a controller, in a report or anywhere you want. To access the value, create an instance of your settings class and read the value like this:

    $settings = new Settings();
     $interval = $settings->refreshInterval->getValue();

    Types of settings

    The Piwik platform differentiates between UserSetting and SystemSetting. User settings can be configured by any logged-in user, and each user can configure the setting independently. The Piwik platform makes sure that settings are stored per user and that a user cannot see another user’s configuration.

    A system setting applies to all of your users. It can be configured only by a user who has super user access. By default, the value can be read only by a super user as well, but often you want it to be readable by anyone, or at least by logged-in users. If you make a setting readable, its value will still be displayed only to super users, but you will always be able to access it in the background.
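
     For instance, a system setting is declared the same way as a user setting, only with the SystemSetting class (the apiUrl name here is made up for illustration):

     $this->apiUrl = new SystemSetting('apiUrl', 'API URL');
     $this->apiUrl->type = static::TYPE_STRING;
     $this->apiUrl->uiControlType = static::CONTROL_TEXT;
     $this->apiUrl->description = 'URL of the third party system to fetch data from';
     $this->addSetting($this->apiUrl);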

    Imagine you are building a widget that fetches data from a third-party system and needs a configured API URL and token. While no regular user should see the values of these settings, they should still be readable by any logged-in user; otherwise the data cannot be fetched in the background when such a user wants to see the content of the widget. Solve this by making the setting readable by the current user:

     $setting->readableByCurrentUser = !Piwik::isUserIsAnonymous();

    Publishing your Plugin on the Marketplace

    In case you want to share your settings or your plugin with other Piwik users, you can do this by pushing your plugin to a public GitHub repository and creating a tag. Easy as that. Read more about how to distribute a plugin.

    Advanced features

    Isn’t it easy to create settings for plugins? We never even created a file! The Settings API already offers many possibilities, but it might not yet be as flexible as your use case requires. So let us know in case you are missing something, and we hope to add this feature at some point in the future.

    If you have any feedback regarding our APIs or our guides in the Developer Zone, feel free to send it to us.