Media (0)

No media matching your criteria is available on the site.

Other articles (95)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including:
    - critique of existing features and functions
    - articles contributed by developers, administrators, content producers and editors
    - screenshots to illustrate the above
    - translations of existing documentation into other languages
    To contribute, register for the project users’ mailing (...)

  • Use, discuss, criticize

    13 April 2011, by

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

On other sites (8049)

  • logos and subtitles in reverse using ffmpeg and OpenGL in iOS 5.0

    6 February 2012, by resident_

    I am using ffmpeg to play video on iOS 5.0. My app decodes video frames with ffmpeg and uses OpenGL to display them.

    But I have a problem I can't resolve: the channel logos and subtitles in the video image are displayed in reverse. I think the problem is either in the OpenGL ES 2.0 rendering or in the ffmpeg decoding.

    Can you tell me what is wrong, and how can I fix it?

    Many thanks,

    Edit: I changed my prepareTexture method to this:

    - (void)prepareTextureW:(GLuint)texW textureHeight:(GLuint)texH frameWidth:(GLuint)frameW frameHeight:(GLuint)frameH
    {
        float aspect = (float)frameW / (float)frameH;
        float minX = -1.f, minY = -1.f, maxX = 1.f, maxY = 1.f;
        float scale;
        if (aspect >= (float)backingHeight / (float)backingWidth) {
            // Aspect ratio will retain width.
            scale = (float)backingHeight / (float)frameW;
            maxY = ((float)frameH * scale) / (float)backingWidth;
            minY = -maxY;
        } else {
            // Retain height.
            scale = (float)backingWidth / (float)frameW;
            maxX = ((float)frameW * scale) / (float)backingHeight;
            minX = -maxX;
        }

        // (Re)create the texture that will hold each decoded frame.
        if (frameTexture) glDeleteTextures(1, &frameTexture);
        glEnable(GL_TEXTURE_2D);
        glGenTextures(1, &frameTexture);
        glBindTexture(GL_TEXTURE_2D, frameTexture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
        // Allocate RGB565 storage; texW/texH may be larger than the frame itself.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texW, texH, 0, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);

        // Quad vertices in triangle-strip order.
        verts[0] = maxX;  verts[1] = maxY;
        verts[2] = minX;  verts[3] = maxY;
        verts[4] = maxX;  verts[5] = minY;
        verts[6] = minX;  verts[7] = minY;

        // Fraction of the texture actually covered by the frame.
        float s = (float)frameW / (float)texW;
        float t = (float)frameH / (float)texH;

        texCoords[0] = 0.f;  texCoords[1] = 1.f;
        texCoords[2] = 1.f;  texCoords[3] = 1.f;
        texCoords[4] = 0.f;  texCoords[5] = 0.f;
        texCoords[6] = 1.f;  texCoords[7] = 0.f;

        mFrameH = frameH;
        mFrameW = frameW;
        mTexH = texH;
        mTexW = texW;
        maxS = s;
        maxT = t;

        // Just supporting one rotation direction, landscape left. Rotate Z by 90 degrees.
        matSetRotZ(&rot, M_PI_2);
        matMul(&mvp, &rot, &rot);

        [self setupShader];
    }

    And now this is my result: link image
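
    A common cause of this kind of mirroring is the Y-axis mismatch between ffmpeg, which stores decoded frames top-down, and OpenGL, whose texture origin is the bottom-left corner. Below is a minimal sketch of the usual adjustment, assuming the texCoords layout and the s/t scale factors from the method above; the helper itself is hypothetical, not from the original post. Flip T if the image is mirrored vertically, or S if it is mirrored horizontally.

    /* Illustrative helper: fill an 8-float texcoord array so row 0 of a
       top-down decoded frame lands at the top of the quad, sampling only
       the s-by-t region of the texture actually covered by the frame. */
    static void setFrameTexCoords(float *texCoords, float s, float t)
    {
        texCoords[0] = 0.f;  texCoords[1] = 0.f;   /* pairs with (maxX, maxY) */
        texCoords[2] = s;    texCoords[3] = 0.f;   /* pairs with (minX, maxY) */
        texCoords[4] = 0.f;  texCoords[5] = t;     /* pairs with (maxX, minY) */
        texCoords[6] = s;    texCoords[7] = t;     /* pairs with (minX, minY) */
    }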

  • HLS playlist of self-contained fmp4 segments

    16 October 2020, by Mathieu

    I am working on a VMS that stores 10-second-long video segments in MPEGTS format. Those segments can then be streamed using HLS, with playlists that look like this:

#EXTM3U
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:11
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-START:TIME-OFFSET=1.0,PRECISE=YES
#EXTINF:10,
1602816779831000000.ts
#EXTINF:10,
1602816789831000000.ts
#EXT-X-ENDLIST

    This works great as long as the files are encoded in H.264. However, if I try creating a similar playlist with H.265 segments, it works only in our Android client, since Apple and hls.js have decided to support H.265 over HLS in fragmented MP4 only.

    "Natively" supporting H.265 by storing fMP4 files directly isn't an option for me, so I would like to remux those MPEGTS files to fMP4 on demand.

    So what I have attempted to do is to return this playlist instead, changing only the file extension:

#EXTM3U
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:11
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-START:TIME-OFFSET=1.0,PRECISE=YES
#EXTINF:10,
1602816779831000000.mp4
#EXTINF:10,
1602816789831000000.mp4
#EXT-X-ENDLIST

    and then lazily remux those MPEGTS files to fMP4 one by one with FFmpeg as they are requested:

ffmpeg -i 1602816779831000000.ts -c copy -movflags frag_keyframe+empty_moov+default_base_moof 1602816779831000000.mp4
ffmpeg -i 1602816789831000000.ts -c copy -movflags frag_keyframe+empty_moov+default_base_moof 1602816789831000000.mp4

    Unfortunately, this seems to work only for playlists with a single segment (which means up to 10 seconds in my case). As soon as I have two or more files, it doesn't work, with behavior that changes depending on the client: some play the first file then stop, some fast-forward to the last file and play that one instead, some won't play at all...

    I understand that the "normal" approach to fMP4 streaming over HLS is to use a "media initialization" segment referenced by an #EXT-X-MAP tag for the segments that follow, with the media segments then usually stored as *.m4s files instead of *.mp4. However, is it possible to make fMP4 work over HLS with self-contained segments, similar to what we can do with MPEGTS? Since playlists with a single entry seem to work, I would assume there is probably a way to do so.

    Also, I know Apple was inspired by MPEG-DASH for this part of the HLS spec, and from what I understand, this is possible in MPEG-DASH.
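
    For reference, the HLS specification requires fMP4 media segments to be preceded by an initialization section declared with #EXT-X-MAP. Because each segment written with empty_moov is self-contained and starts with its own moov box, one possible workaround is to declare each segment as its own initialization section. This is a sketch only, with no guarantee that every player accepts it:

#EXTM3U
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:11
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-MAP:URI="1602816779831000000.mp4"
#EXTINF:10,
1602816779831000000.mp4
#EXT-X-MAP:URI="1602816789831000000.mp4"
#EXTINF:10,
1602816789831000000.mp4
#EXT-X-ENDLIST

    Failing that, FFmpeg's hls muxer can emit a spec-conformant init segment plus media parts directly, e.g. ffmpeg -i 1602816779831000000.ts -c copy -f hls -hls_segment_type fmp4 out.m3u8.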

  • How can I record audio along with Aforge player video recording ?

    4 June 2019, by H.NS

    My app plays webcam video using the AForge player, and now I want to record that video. Audio recording is not possible with the AForge player alone.

    Is there any way to record audio separately and merge it with the video recorded through the AForge player?

    I found that the DirectShow architecture is the usual way to achieve this, but it would be very difficult to change architectures this late in development: I'm unfamiliar with DirectShow concepts, and almost 90 percent of my project is already built on the AForge player.

    My current code is below. It records video from the selected webcam using the AForge player, but the audio is missing.

    using AForge.Video;
    using AForge.Video.DirectShow;
    using Accord.Video.FFMPEG;

    private FilterInfoCollection VideoCaptureDevices;
    private VideoCaptureDevice FinalVideo = null;
    private VideoCaptureDeviceForm captureDevice;
    private Bitmap video;
    private VideoFileWriter FileWriter = new VideoFileWriter();
    private SaveFileDialog saveAvi;

    private void VideoRecord_Load(object sender, EventArgs e)
    {
        // Enumerate the available webcams and prepare the device-picker dialog.
        VideoCaptureDevices = new FilterInfoCollection(FilterCategory.VideoInputDevice);
        captureDevice = new VideoCaptureDeviceForm();
    }

    private void play_Click(object sender, EventArgs e)
    {
        if (captureDevice.ShowDialog(this) == DialogResult.OK)
        {
            FinalVideo = captureDevice.VideoDevice;
            FinalVideo.NewFrame += new NewFrameEventHandler(FinalVideo_NewFrame);
            FinalVideo.Start();
        }
    }

    void FinalVideo_NewFrame(object sender, NewFrameEventArgs eventArgs)
    {
        // Keep the latest frame; write it to the file only while recording.
        video = (Bitmap)eventArgs.Frame.Clone();
        if (butStop.Text == "Stop Record")
        {
            FileWriter.WriteVideoFrame(video);
        }
    }

    private void Record_Click(object sender, EventArgs e)
    {
        saveAvi = new SaveFileDialog();
        saveAvi.Filter = "Avi Files (*.avi)|*.avi";
        if (saveAvi.ShowDialog() == System.Windows.Forms.DialogResult.OK)
        {
            int h = captureDevice.VideoDevice.VideoResolution.FrameSize.Height;
            int w = captureDevice.VideoDevice.VideoResolution.FrameSize.Width;
            FileWriter.Open(saveAvi.FileName, w, h, 25, VideoCodec.Default, 5000000);
            FileWriter.WriteVideoFrame(video);
            butStop.Text = "Stop Record";
        }
    }

    private void stopRecord_Click(object sender, EventArgs e)
    {
        if (butStop.Text == "Stop Record")
        {
            butStop.Text = "Stop";
            if (FinalVideo == null) { return; }
            if (FinalVideo.IsRunning)
            {
                FileWriter.Close();
            }
        }
        else
        {
            this.FinalVideo.Stop();
            FileWriter.Close();
        }
    }
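
    Since AForge only captures video, one pragmatic route is to record the microphone in parallel with a separate audio library and mux the two files with ffmpeg afterwards. Below is a minimal sketch assuming the NAudio NuGet package and an ffmpeg binary on the PATH; the StartAudio/StopAndMerge helpers and the file names are hypothetical, not part of the original code.

    using System.Diagnostics;
    using NAudio.Wave;   // assumed dependency: the NAudio NuGet package

    private WaveInEvent waveIn;
    private WaveFileWriter audioWriter;

    // Hypothetical helper: start capturing the default microphone to a WAV
    // file at the same moment video recording starts (call it right after
    // FileWriter.Open in Record_Click).
    private void StartAudio(string wavPath)
    {
        waveIn = new WaveInEvent();
        waveIn.WaveFormat = new WaveFormat(44100, 16, 1);   // 44.1 kHz, 16-bit, mono
        audioWriter = new WaveFileWriter(wavPath, waveIn.WaveFormat);
        waveIn.DataAvailable += (s, a) => audioWriter.Write(a.Buffer, 0, a.BytesRecorded);
        waveIn.StartRecording();
    }

    // Hypothetical helper: stop the audio capture and mux audio + video
    // with ffmpeg (call it after FileWriter.Close in stopRecord_Click).
    private void StopAndMerge(string aviPath, string wavPath, string outPath)
    {
        waveIn.StopRecording();
        waveIn.Dispose();
        audioWriter.Dispose();

        // -c:v copy keeps the video exactly as AForge wrote it; the WAV
        // track is encoded to AAC. -shortest trims any length mismatch.
        var psi = new ProcessStartInfo("ffmpeg",
            $"-y -i \"{aviPath}\" -i \"{wavPath}\" -c:v copy -c:a aac -shortest \"{outPath}\"");
        psi.UseShellExecute = false;
        Process.Start(psi)?.WaitForExit();
    }

    Starting both captures as close together as possible keeps audio/video drift small; anything tighter requires timestamping individual frames, which is where DirectShow would come in.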