Advanced search

Media (1)

Word: - Tags -/ticket

Other articles (62)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable MediaSPIP release.
    It was officially released on June 21, 2013, as announced here.
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Final creation of the channel

    12 March 2010, by

    Once your request has been validated, you can proceed with the actual creation of the channel. Each channel is a full site in its own right, placed under your responsibility. The platform administrators have no access to it.
    Upon validation, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point you are asked for a password; simply (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player used by MediaSPIP was created specifically for it and can easily be adapted to fit a particular theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows media playback on the major mobile platforms with the above (...)

On other sites (8437)

  • ffmpeg : unsync audio after processing

    18 November 2013, by QuickSilver

    I am recording video using RecordRTC (WebRTC). After receiving the webm video and wav audio on the server, I encode them to an mp4 file using ffmpeg (executing a shell command via PHP). After the encoding process, however, the audio is out of sync with the video (the audio ends before the video). How can I fix this?

    The JS code is here.

    The ffmpeg command used is:

    ffmpeg -y -i 166890589.wav -i 166890589.webm -vcodec libx264 166890589.mp4

    Console output:

    ffmpeg version 0.8.9-6:0.8.9-0ubuntu0.13.04.1, Copyright (c) 2000-2013 the Libav developers
     built on Nov  9 2013 19:15:52 with gcc 4.7.3
    *** THIS PROGRAM IS DEPRECATED ***
    This program is only provided for compatibility and will be removed in a future release. Please use avconv instead.
    [wav @ 0x81caa60] max_analyze_duration reached
    Input #0, wav, from '166890589.wav':
     Duration: 00:00:07.05, bitrate: 1411 kb/s
       Stream #0.0: Audio: pcm_s16le, 44100 Hz, 2 channels, s16, 1411 kb/s
    [matroska,webm @ 0x823c340] Unknown entry 0x63C5
    [matroska,webm @ 0x823c340] Estimating duration from bitrate, this may be inaccurate

    Seems stream 0 codec frame rate differs from container frame rate: 1000.00 (1000/1) -> 10.00 (10/1)
    Input #1, matroska,webm, from '166890589.webm':
     Duration: 00:00:08.40, start: 0.000000, bitrate: N/A
       Stream #1.0: Video: vp8, yuv420p, 320x240, PAR 1:1 DAR 4:3, 10 tbr, 1k tbn, 1k tbc (default)
    [buffer @ 0x8245620] w:320 h:240 pixfmt:yuv420p
    [libx264 @ 0x82618a0] using SAR=1/1
    [libx264 @ 0x82618a0] using cpu capabilities: MMX2 SSE2Fast SSSE3 FastShuffle SSE4.2 AVX
    [libx264 @ 0x82618a0] profile Main, level 1.1
    [libx264 @ 0x82618a0] 264 - core 123 r2189 35cf912 - H.264/MPEG-4 AVC codec - Copyleft 2003-2012 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x1:0x111 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=1 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=0 b_adapt=1 b_bias=0 direct=1 weightb=0 open_gop=1 weightp=2 keyint=250 keyint_min=10 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.25 aq=1:1.00
    Output #0, mp4, to '166890589.mp4':
     Metadata:
       encoder         : Lavf53.21.1
       Stream #0.0: Video: libx264, yuv420p, 320x240 [PAR 1:1 DAR 4:3], q=-1--1, 10 tbn, 10 tbc (default)
       Stream #0.1: Audio: libvo_aacenc, 44100 Hz, 2 channels, s16, 200 kb/s
    Stream mapping:
     Stream #1.0 -> #0.0
     Stream #0.0 -> #0.1
    Press ctrl-c to stop encoding
    frame=   84 fps=  0 q=25.0 Lsize=     260kB time=7.06 bitrate= 301.3kbits/s    
    video:83kB audio:172kB global headers:0kB muxing overhead 1.783102%
    frame I:1     Avg QP:17.52  size:  6554
    [libx264 @ 0x82618a0] frame P:41    Avg QP:19.07  size:  1555
    [libx264 @ 0x82618a0] frame B:42    Avg QP:20.01  size:   325
    [libx264 @ 0x82618a0] consecutive B-frames: 33.3%  0.0%  0.0% 66.7%
    [libx264 @ 0x82618a0] mb I  I16..4: 36.7%  0.0% 63.3%
    [libx264 @ 0x82618a0] mb P  I16..4:  3.8%  0.0%  5.0%  P16..4: 34.3%  9.8%  7.1%  0.0%  0.0%    skip:40.0%
    [libx264 @ 0x82618a0] mb B  I16..4:  1.4%  0.0%  0.1%  B16..8: 37.5%  5.5%  0.4%  direct: 2.5%  skip:52.5%  L0:41.4% L1:51.5% BI: 7.2%
    [libx264 @ 0x82618a0] coded y,uvDC,uvAC intra: 40.7% 76.2% 26.8% inter: 10.9% 22.5% 2.7%
    [libx264 @ 0x82618a0] i16 v,h,dc,p: 28% 34% 23% 15%
    [libx264 @ 0x82618a0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 29% 28% 21%  3%  4%  4%  5%  3%  3%
    [libx264 @ 0x82618a0] i8c dc,h,v,p: 49% 20% 26%  4%
    [libx264 @ 0x82618a0] Weighted P-Frames: Y:0.0% UV:0.0%
    [libx264 @ 0x82618a0] ref P L0: 62.7%  4.6% 16.3% 16.4%
    [libx264 @ 0x82618a0] ref B L0: 79.0% 21.0%
    [libx264 @ 0x82618a0] kb/s:80.00
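    Not part of the original question, but a common first attempt for this kind of WebRTC recording is to let ffmpeg stretch the audio against its timestamps and stop at the shorter input. A hedged sketch (these flags exist in ffmpeg of that era, but exact behavior depends on the build):

```shell
# -async 1 resamples the audio to match timestamps;
# -shortest ends the output when the shorter input ends.
ffmpeg -y -i 166890589.wav -i 166890589.webm \
       -vcodec libx264 -async 1 -shortest 166890589.mp4
```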
  • Multiple Video Streams in one Feed ffmpeg

    1 February 2014, by trueblue

    We are trying to send a live stream from two webcams as below:

    ffmpeg -f video4linux2 -i /dev/video0 -f video4linux2 -i /dev/video1 http://127.0.0.1:8090/feed1.ffm

    We want to play both streams using any available player. When we open the stream with VLC, we get only one stream (from /dev/video0). The command is as below:

    vlc http://127.0.0.1:8090/test.mpg

    Here I am running ffserver on my machine and accessing it as localhost. My ffserver config is as below:

    Port 8090
    BindAddress 0.0.0.0
    MaxHTTPConnections 2000
    MaxClients 1000
    MaxBandwidth 1000
    CustomLog -
    NoDaemon

    <feed>
    File /tmp/feed1.ffm
    FileMaxSize 1G
    ACL allow 127.0.0.1
    </feed>

    <stream>

    # coming from live feed 'feed1'
    Feed feed1.ffm
    Format mpeg
    VideoBufferSize 40000
    VideoSize 1280x720
    VideoCodec mpeg1video
    NoAudio
    ACL ALLOW 127.0.0.1
    </stream>

    <stream>
    Format status

    # Only allow local people to get the status
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255

    #FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
    </stream>


    # Redirect index.html to the appropriate site

    <redirect>
    URL http://www.ffmpeg.org/
    </redirect>

    If we try the command below to save the two streams to a file, we get two instances of the VLC player and both streams can be seen:

    ffmpeg -f video4linux2 -i /dev/video0 -f video4linux2 -i /dev/video1 /home/2Streams.mpg

    It is strange behavior: I am able to save two video streams to a file, but I am unable to send two video streams in one single feed. Kindly help me achieve this.

    Regards
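    Not from the original question: each ffserver `<Stream>` section carries a single video stream pulled from a feed, so one commonly suggested workaround is to declare a second feed and a second stream, and push each camera with its own ffmpeg process. A hedged sketch of the extra config (the section names `feed2.ffm` and `test2.mpg` are hypothetical):

```
<Feed feed2.ffm>
File /tmp/feed2.ffm
FileMaxSize 1G
ACL allow 127.0.0.1
</Feed>

<Stream test2.mpg>
Feed feed2.ffm
Format mpeg
VideoSize 1280x720
VideoCodec mpeg1video
NoAudio
</Stream>
```

    Each camera would then be pushed separately, e.g. `ffmpeg -f video4linux2 -i /dev/video1 http://127.0.0.1:8090/feed2.ffm`, and played at its own URL.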

  • Android bytedeco javacpp ffmpeg decode h264 bytes to yuv and render with openGL ES 2.0. Wrong colors

    9 June 2015, by tema_man

    Hi there! I am trying to display a video stream that comes from a server as a byte array.
    The data in this array is an h264-encoded image, and I decode it with the bytedeco javacpp-presets library in this way:

    public class DMDecoder {

    private static final String LOG_TAG = "DMDecoder";

    private AVCodec avCodec;
    private AVCodecContext avCodecContext;
    private AVFrame avFrame;
    private AVPacket avPacket;
    private boolean wasIFrame;
    private long IFrameTimeStampMs;
    private int maxFps;
    private int codecId;
    private int frames;

    private DMDecoderCallback callback;

    public DMDecoder(DMDecoderCallback cb) {
       this.callback = cb;
       this.codecId = AV_CODEC_ID_H264;
       avcodec_register_all();
       restart();
    }

    public void restart() {
       stop();
       start();
    }

    public void stop() {
       frames = 0;
       if (avCodecContext != null) {
           avcodec_close(avCodecContext);
           avcodec_free_context(avCodecContext);
           avCodecContext = null;
       }

       if (avCodec != null) {
           av_free(avCodec);
           avCodec = null;
       }

       if (avFrame != null) {
           av_frame_free(avFrame);
           avFrame = null;
       }

       if (avPacket != null) {
           av_free_packet(avPacket);
           avPacket = null;
       }
    }

    public void start() {
       avCodec = avcodec_find_decoder(codecId);

       avCodecContext = avcodec_alloc_context3(avCodec);
       AVDictionary opts = new AVDictionary();
       avcodec_open2(avCodecContext, avCodec, opts);

       avFrame = av_frame_alloc();
       avPacket = new AVPacket();
       av_init_packet(avPacket);
    }

    public VideoFrame decode(byte[] data, int dataOffset, int dataSize) {
       avPacket.pts(AV_NOPTS_VALUE);
       avPacket.dts(AV_NOPTS_VALUE);
       avPacket.data(new BytePointer(data).position(dataOffset));
       avPacket.size(dataSize);
       avPacket.pos(-1);

       IntBuffer gotPicture = IntBuffer.allocate(1);

       int processedBytes = avcodec_decode_video2(
               avCodecContext, avFrame, gotPicture, avPacket);

       if (avFrame.width() == 0 || avFrame.height() == 0) return null;

       VideoFrame frame = new VideoFrame();

       frame.colorPlane0 = new byte[avFrame.width() * avFrame.height()];
       frame.colorPlane1 = new byte[avFrame.width() / 2 * avFrame.height() / 2];
       frame.colorPlane2 = new byte[avFrame.width() / 2 * avFrame.height() / 2];

       if (avFrame.data(0) != null) avFrame.data(0).get(frame.colorPlane0);
       if (avFrame.data(1) != null) avFrame.data(1).get(frame.colorPlane1);
       if (avFrame.data(2) != null) avFrame.data(2).get(frame.colorPlane2);

       frame.lineSize0 = avFrame.width();
       frame.lineSize1 = avFrame.width() / 2;
       frame.lineSize2 = avFrame.width() / 2;

       frame.width = avFrame.width();
       frame.height = avFrame.height();

       return frame;
     }
    }
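    Not part of the original question, but a frequent cause of skewed or wrong-looking frames with this kind of code is that `avFrame.linesize(n)` can be larger than the visible width (row padding), while the decoder above copies exactly `width * height` bytes and hard-codes `lineSize0 = width`. A minimal sketch of a linesize-aware plane copy (`copyPlane` is a hypothetical helper, not from the post):

```java
import java.util.Arrays;

public class PlaneCopy {
    // Copies `height` rows of `width` visible bytes out of a buffer whose
    // rows are `lineSize` bytes apart (lineSize >= width).
    static byte[] copyPlane(byte[] padded, int lineSize, int width, int height) {
        byte[] out = new byte[width * height];
        for (int row = 0; row < height; row++) {
            System.arraycopy(padded, row * lineSize, out, row * width, width);
        }
        return out;
    }

    public static void main(String[] args) {
        // A 2x2 plane stored with lineSize 4: visible bytes are 1,2 and 5,6.
        byte[] padded = {1, 2, 0, 0, 5, 6, 0, 0};
        System.out.println(Arrays.toString(copyPlane(padded, 4, 2, 2)));
    }
}
```

    In the decoder this would mean reading `avFrame.linesize(n)` for each plane instead of assuming it equals the width.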

    The VideoFrame class is just a simple POJO:

    public class VideoFrame {
       public byte[] colorPlane0;
       public byte[] colorPlane1;
       public byte[] colorPlane2;
       public int lineSize0;
       public int lineSize1;
       public int lineSize2;
       public int width;
       public int height;
       public long presentationTime;
    }

    After decoding, I send the frame to my GLRenderer class:

    public class GLRenderer implements GLSurfaceView.Renderer {

       private static final String LOG_TAG = "GLRenderer";

       private TexturePlane plane;

       private ConcurrentLinkedQueue<VideoFrame> frames;
       private int maxFps = 30;
       private VideoFrame currentFrame;
       private long startTime, endTime;
       private int viewWidth, viewHeight;
       private boolean isFirstFrameProcessed;

       public GLRenderer(int viewWidth, int viewHeight) {
           frames = new ConcurrentLinkedQueue<>();
           this.viewWidth = viewWidth;
           this.viewHeight = viewHeight;
       }

       // mMVPMatrix is an abbreviation for "Model View Projection Matrix"
       private final float[] mMVPMatrix = new float[16];
       private final float[] mProjectionMatrix = new float[16];
       private final float[] mViewMatrix = new float[16];

       @Override
       public void onSurfaceCreated(GL10 unused, EGLConfig config) {
           // Set the background frame color
           GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1.0f);

           plane = new TexturePlane();
       }

       public void setMaxFps(int maxFps) {
           this.maxFps = maxFps;
       }

       @Override
       public void onDrawFrame(GL10 unused) {


           // Draw background color
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

           // Set the camera position (View matrix)
           Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);

           // Calculate the projection and view transformation
           Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);

           if (!isFirstFrameProcessed) checkViewPort(viewWidth, viewHeight);

           if (maxFps > 0 && startTime > 0) {
               endTime = System.currentTimeMillis();
               long time = endTime - startTime;
               long wantedTime = 1000 / maxFps;
               long wait;
               if (time < wantedTime) {
                   wait = wantedTime - time;
                   try {
                       Thread.sleep(wait);
                   } catch (InterruptedException e) {
                       Log.e(LOG_TAG, "thread interrupted exception");
                   }
               }
           }
           }
           startTime = System.currentTimeMillis();
           tick();
           plane.draw(mMVPMatrix);
       }

       private void updateFrame(VideoFrame frame) {
           plane.updateTexture(frame.colorPlane0, frame.width, frame.height, 0);
           plane.updateTexture(frame.colorPlane1, frame.width / 2, frame.height / 2, 1);
           plane.updateTexture(frame.colorPlane2, frame.width / 2, frame.height / 2, 2);
           plane.setTextureWidth(frame.width);
           plane.setTextureHeight(frame.height);
       }

       private void tick() {

           if (frames.isEmpty()) return;

           VideoFrame frame = frames.peek();
           if (frame == null) return;

           long tms = System.currentTimeMillis();
           if (frame.presentationTime <= tms) {
               updateFrame(frame);
               currentFrame = frame;
               frames.remove(frame);
           }
       }

       @Override
       public void onSurfaceChanged(GL10 unused, int width, int height) {
           checkViewPort(width, height);
           viewWidth = width;
           viewHeight = height;
           plane.setTextureWidth(width);
           plane.setTextureHeight(height);
       }

       private void checkViewPort(int width, int height) {
           float viewRatio = (float) width / height;
           if (currentFrame != null) {
               float targetRatio = (float) currentFrame.width / currentFrame.height;
               int x, y, newWidth, newHeight;
               if (targetRatio > viewRatio) {
                   newWidth = width;
                   newHeight = (int) (width / targetRatio);
                   x = 0;
                   y = (height - newHeight) / 2;
               } else {
                   newHeight = height;
                   newWidth = (int) (height * targetRatio);
                   y = 0;
                   x = (width - newWidth) / 2;
               }
               GLES20.glViewport(x, y, newWidth, newHeight);
           } else {
               GLES20.glViewport(0, 0, width, height);
           }

           Matrix.frustumM(mProjectionMatrix, 0, 1, -1, -1, 1, 3, 4);
       }

       public void addFrame(VideoFrame frame) {
           if (frame != null) {
               frames.add(frame);
           }
       }
    }

    GLRenderer works with a simple OpenGL polygon on which I draw all the textures:

       public class TexturePlane {

       private static final String LOG_TAG = "TexturePlane";

       private final String vertexShaderCode = "" +
       "uniform mat4 uMVPMatrix;" +
       "attribute vec4 vPosition;" +
       "attribute vec2 a_TexCoordinate;" +
       "varying vec2 v_TexCoordinate;" +

       "void main() {" +
       "  gl_Position = uMVPMatrix * vPosition;" +
       "  v_TexCoordinate = a_TexCoordinate;" +
       "}";

       private final String fragmentShaderCode = "" +
       "precision mediump float;" +
       "varying vec2 v_TexCoordinate;" +
       "uniform sampler2D s_texture_y;" +
       "uniform sampler2D s_texture_u;" +
       "uniform sampler2D s_texture_v;" +

       "void main() {" +
       "   float y = texture2D(s_texture_y, v_TexCoordinate).r;" +
       "   float u = texture2D(s_texture_u, v_TexCoordinate).r - 0.5;" +
       "   float v = texture2D(s_texture_v, v_TexCoordinate).r - 0.5;" +

       "   float r = y + 1.13983 * v;" +
       "   float g = y - 0.39465 * u - 0.58060 * v;" +
       "   float b = y + 2.03211 * u;" +

       "   gl_FragColor = vec4(r, g, b, 1.0);" +

       "}";

       private final FloatBuffer vertexBuffer;
       private final FloatBuffer textureBuffer;
       private final ShortBuffer drawListBuffer;
       private final int mProgram;
       private int mPositionHandle;
       private int mMVPMatrixHandle;

           // number of coordinates per vertex in this array
       private static final int COORDS_PER_VERTEX = 3;
       private static final int COORDS_PER_TEXTURE = 2;

       private static float squareCoords[] = {
           -1f, 1f, 0.0f,
           -1f, -1f, 0.0f,
           1f, -1f, 0.0f,
           1f, 1f, 0.0f
       };

       private static float uvs[] = {
           0.0f, 0.0f,
           0.0f, 1.0f,
           1.0f, 1.0f,
           1.0f, 0.0f
       };

       private final short drawOrder[] = {0, 1, 2, 0, 2, 3}; // order to draw vertices
       private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

       private int textureWidth = 640;
       private int textureHeight = 480;

       private int yTextureUniformHandle;
       private int uTextureUniformHandle;
       private int vTextureUniformHandle;

       private int yTextureHandle;
       private int uTextureHandle;
       private int vTextureHandle;

       private int mTextureCoordinateHandle;

       public void setTextureWidth(int textureWidth) {
           this.textureWidth = textureWidth;
       }

       public int getTextureWidth() {
           return textureWidth;
       }

       public void setTextureHeight(int textureHeight) {
           this.textureHeight = textureHeight;
       }

       public int getTextureHeight() {
           return textureHeight;
       }

       /**
        * Sets up the drawing object data for use in an OpenGL ES context.
        */
       public TexturePlane() {
               // initialize vertex byte buffer for shape coordinates
           ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
           bb.order(ByteOrder.nativeOrder());
           vertexBuffer = bb.asFloatBuffer();
           vertexBuffer.put(squareCoords);
           vertexBuffer.position(0);

               // initialize byte buffer for the draw list
           ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
           dlb.order(ByteOrder.nativeOrder());
           drawListBuffer = dlb.asShortBuffer();
           drawListBuffer.put(drawOrder);
           drawListBuffer.position(0);

               // initialize byte buffer for the draw list
           ByteBuffer tbb = ByteBuffer.allocateDirect(uvs.length * 4);
           tbb.order(ByteOrder.nativeOrder());
           textureBuffer = tbb.asFloatBuffer();
           textureBuffer.put(uvs);
           textureBuffer.position(0);

               mProgram = GLES20.glCreateProgram();             // create empty OpenGL Program
               compileShaders();
               setupTextures();
           }

           public void setupTextures() {
               yTextureHandle = setupTexture(null, textureWidth, textureHeight, 0);
               uTextureHandle = setupTexture(null, textureWidth, textureHeight, 1);
               vTextureHandle = setupTexture(null, textureWidth, textureHeight, 2);
           }

           public int setupTexture(ByteBuffer data, int width, int height, int index) {
               final int[] textureHandle = new int[1];

               GLES20.glGenTextures(1, textureHandle, 0);

               if (textureHandle[0] != 0) {
                       // Bind to the texture in OpenGL
                   GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + index);
                   GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);

                   updateTexture(data, width, height, index);

                       // Set filtering
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);

                       // Set wrapping mode
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
               }

               if (textureHandle[0] == 0) {
                   Log.e(LOG_TAG, "Error loading texture.");
               }

               return textureHandle[0];
           }

            public void updateTexture(byte[] data, int width, int height, int index) {

                if (data == null) {
                    if (width == 0 || height == 0) {
                        width = textureWidth;
                        height = textureHeight;
                    }

                    // y, u and v are byte fill constants (e.g. 0, 128, 128 for black)
                    data = new byte[width * height];
                    if (index == 0) {
                        Arrays.fill(data, y);
                    } else if (index == 1) {
                        Arrays.fill(data, u);
                    } else {
                        Arrays.fill(data, v);
                    }
                }

                ByteBuffer byteBuffer = ByteBuffer.wrap(data);
                byteBuffer.position(0);

                GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + index);

                GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                    width, height, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, byteBuffer);
            }

           private void compileShaders() {
               // prepare shaders and OpenGL program
               int vertexShader = loadShader(
                   GLES20.GL_VERTEX_SHADER,
                   vertexShaderCode);
               int fragmentShader = loadShader(
                   GLES20.GL_FRAGMENT_SHADER,
                   fragmentShaderCode);

               GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
               GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
               GLES20.glLinkProgram(mProgram);                  // create OpenGL program executables
               checkGlError("glLinkProgram");

               // Add program to OpenGL environment
               GLES20.glUseProgram(mProgram);

               mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
               mTextureCoordinateHandle = GLES20.glGetAttribLocation(mProgram, "a_TexCoordinate");

               GLES20.glEnableVertexAttribArray(mPositionHandle);
               GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);

               yTextureUniformHandle = GLES20.glGetUniformLocation(mProgram, "s_texture_y");
                uTextureUniformHandle = GLES20.glGetUniformLocation(mProgram, "s_texture_u");
                vTextureUniformHandle = GLES20.glGetUniformLocation(mProgram, "s_texture_v");

               mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
               checkGlError("glGetUniformLocation");
           }

       /**
        * Utility method for compiling a OpenGL shader.
        * <p></p>
        * <p><strong>Note:</strong> When developing shaders, use the checkGlError()
        * method to debug shader coding errors.</p>
        *
        * @param type       - Vertex or fragment shader type.
        * @param shaderCode - String containing the shader code.
        * @return - Returns an id for the shader.
        */
       public int loadShader(int type, String shaderCode) {

               // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
               // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
           int shader = GLES20.glCreateShader(type);

               // add the source code to the shader and compile it
           GLES20.glShaderSource(shader, shaderCode);
           GLES20.glCompileShader(shader);

           return shader;
       }

       /**
        * Utility method for debugging OpenGL calls. Provide the name of the call
        * just after making it:
        * <p></p>
        * <pre>
        * mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
        * MyGLRenderer.checkGlError("glGetUniformLocation");</pre>
        *
        * If the operation is not successful, the check throws an error.
        *
        * @param glOperation - Name of the OpenGL call to check.
        */
       public void checkGlError(String glOperation) {
           int error;
           String errorString;
           while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
               errorString = GLU.gluErrorString(error);
               String message = glOperation + ": glError " + error + ": " + errorString;
               Log.e(LOG_TAG, message);
               throw new RuntimeException(message);
           }
       }

       public void draw(float[] mvpMatrix) {

               // Prepare the triangle coordinate data
           GLES20.glVertexAttribPointer(
               mPositionHandle, COORDS_PER_VERTEX,
               GLES20.GL_FLOAT, false,
               vertexStride, vertexBuffer);

           GLES20.glVertexAttribPointer(
               mTextureCoordinateHandle, COORDS_PER_TEXTURE,
               GLES20.GL_FLOAT, false,
               0, textureBuffer);

           GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
           checkGlError("glUniformMatrix4fv");

           GLES20.glUniform1i(yTextureUniformHandle, 0);
           GLES20.glUniform1i(uTextureUniformHandle, 1);
           GLES20.glUniform1i(vTextureUniformHandle, 2);

               // Draw the square
           GLES20.glDrawElements(
               GLES20.GL_TRIANGLES, drawOrder.length,
               GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
       }
    }

    But I have a problem here: my GL surface displays the image with the wrong colors.

    What am I doing wrong?
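    Not part of the original question: one way to rule out the conversion matrix itself is to run the shader's BT.601-style formula on the CPU. A minimal sketch using the same coefficients as the fragment shader above:

```java
public class YuvCheck {
    // Same formula as the fragment shader (u and v already centered on 0).
    static float[] yuvToRgb(float y, float u, float v) {
        float r = y + 1.13983f * v;
        float g = y - 0.39465f * u - 0.58060f * v;
        float b = y + 2.03211f * u;
        return new float[]{r, g, b};
    }

    public static void main(String[] args) {
        // Mid gray: u = v = 0 must give r = g = b = y.
        float[] rgb = yuvToRgb(0.5f, 0f, 0f);
        System.out.println(rgb[0] + " " + rgb[1] + " " + rgb[2]);
    }
}
```

    If the matrix checks out, the usual remaining suspects are texture binding, uniform names that must match the shader exactly, and row alignment (`GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1)` for single-byte luminance planes with odd widths).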

    UPDATE:

    As Ronald S. Bultje suggested, I added a glBindTexture(...) call to my code. The updateTexture(...) method now looks like this:

    public void updateTexture(byte[] data, int width, int height, int index) {

       if (data == null) {
           if (width == 0 || height == 0) {
               width = textureWidth;
               height = textureHeight;
           }

           data = new byte[width * height];
           if (index == 0) {
               Arrays.fill(data, y);
           } else if (index == 1) {
               Arrays.fill(data, u);
           } else {
               Arrays.fill(data, v);
           }
       }

       ByteBuffer byteBuffer = ByteBuffer.wrap(data);
       byteBuffer.position(0);

       GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + index);

       int textureHandle = index == 0 ? yTextureHandle : index == 1 ? uTextureHandle : vTextureHandle;
       GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);

       GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
           width, height, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, byteBuffer);
    }