
Other articles (97)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once it is enabled, a preconfiguration is automatically put in place by MediaSPIP init so that the new feature is immediately operational. No separate configuration step is therefore required.

  • MediaSPIP Core: Configuration

    9 November 2010

    By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin): a page for the general configuration of the templates; a page for configuring the site's home page; a page for configuring the sectors.
    It also provides an additional page, which only appears when certain plugins are enabled, for controlling the display and the plugin-specific features (...)

On other sites (7704)

  • Syncing Audio with Video iOS RTSP Player

    3 October 2015, by Dave Thomas

    I am combining two different classes from two different git projects to create an RTSP streamer for an iOS live streaming application.

    edit: I agree with the -1; this question is probably a shot in the dark. But, to answer the question "why am I not using the DFURTSPPlayer library entirely?": because I would rather use the OpenGL YUV display of the second project, hackcam, than decode the video frames into UIImages as DFURTSPPlayer does. Hackcam does not have audio.

    Also, please comment if you downvote; at least help me find an answer by telling me what I need to refine to be clear, or point out why this question is inappropriate.

    My current issue is that the audio playback has about one second of latency and is out of sync with the video, which is close to real time.

    I know that the audio is in sync at the source because I've tested the RTSP streams in VLC, so something is wrong with my implementation. I am mostly frankensteining these two projects together, and I am not familiar with the ffmpeg C library or AudioQueue on iOS.
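    An offset like this is easier to reason about once packet timestamps are converted to seconds. ffmpeg stores pts/dts as integers in the stream's time base, and av_rescale_q converts between bases. A minimal plain-C sketch of the arithmetic it performs (the 90 kHz time base and the pts value here are hypothetical, chosen for illustration):

    ```c
    #include <stdint.h>
    #include <stdio.h>

    typedef struct { int num, den; } Rational;   /* stand-in for AVRational */

    /* Convert a timestamp from one time base to another: the core of what
       av_rescale_q computes (truncating division; ffmpeg also handles rounding). */
    static int64_t rescale_q(int64_t ts, Rational from, Rational to) {
        return ts * from.num * to.den / ((int64_t)from.den * to.num);
    }

    int main(void) {
        Rational stream_tb = {1, 90000};  /* typical RTSP/MPEG-TS time base */
        Rational seconds   = {1, 1};
        int64_t pts = 450000;             /* hypothetical packet pts */
        /* 450000 ticks at 1/90000 s per tick -> 5 s */
        printf("%lld s\n", (long long)rescale_q(pts, stream_tb, seconds));
        return 0;
    }
    ```

    Logging the rescaled audio and video pts at the point where buffers are enqueued is a quick way to measure which side is ahead and by how much.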

    Any help would be greatly appreciated !

    I’ve taken the AudioStreamer class from this repository :
    https://github.com/durfu/DFURTSPPlayer

    https://github.com/durfu/DFURTSPPlayer/blob/master/DFURTSPPlayer/DFURTSPPlayer/FFMpegDecoder/AudioStreamer.m

    And I am trying to get it to work with this one :
    https://github.com/hackacam/ios_rtsp_player/blob/master/src/FfmpegWrapper.m

    I can post more code if needed, but my main loop in FfmpegWrapper now looks like this (_audioController is a reference to the AudioStreamer instance):

    -(int) startDecodingWithCallbackBlock: (void (^) (AVFrameData *frame)) frameCallbackBlock
                         waitForConsumer: (BOOL) wait
                      completionCallback: (void (^)()) completion
    {
       OSMemoryBarrier();
       _stopDecode=false;
       dispatch_queue_t decodeQueue = dispatch_queue_create("decodeQueue", NULL);
       dispatch_async(decodeQueue, ^{
           int frameFinished;
           OSMemoryBarrier();
           while (self->_stopDecode==false){
               @autoreleasepool {
                   CFTimeInterval currentTime = CACurrentMediaTime();
                   if ((currentTime-_previousDecodedFrameTime) > MIN_FRAME_INTERVAL &&
                       av_read_frame(_formatCtx, &_packetFFmpeg)>=0) {

                       _previousDecodedFrameTime = currentTime;
                       // Is this a packet from the video stream?
                       if(_packetFFmpeg.stream_index==_videoStream) {
                           // Decode video frame
                           avcodec_decode_video2(_codecCtx, _frame, &frameFinished,
                                                 &_packetFFmpeg);

                           // Did we get a video frame?
                           if(frameFinished) {
                               // create a frame object and call the block;
                               AVFrameData *frameData = [self createFrameData:_frame trimPadding:YES];
                               frameCallbackBlock(frameData);
                           }

                           // Free the packet that was allocated by av_read_frame
                           av_free_packet(&_packetFFmpeg);

                       } else if (_packetFFmpeg.stream_index==audioStream) {

                           // NSLog(@"audio stream");
                           [audioPacketQueueLock lock];

                           audioPacketQueueSize += _packetFFmpeg.size;
                           [audioPacketQueue addObject:[NSMutableData dataWithBytes:&_packetFFmpeg length:sizeof(_packetFFmpeg)]];

                           [audioPacketQueueLock unlock];

                           if (!primed) {
                               primed=YES;
                               [_audioController _startAudio];
                           }

                           if (emptyAudioBuffer) {
                               [_audioController enqueueBuffer:emptyAudioBuffer];
                           }

                           //av_free_packet(&_packetFFmpeg);

                       } else {

                           // Free the packet that was allocated by av_read_frame
                           av_free_packet(&_packetFFmpeg);
                       }


                   } else{
                       usleep(1000);
                   }
               }
           }
           completion();
       });
       return 0;
    }

    enqueueBuffer in AudioStreamer:

    - (OSStatus)enqueueBuffer:(AudioQueueBufferRef)buffer
    {
       OSStatus status = noErr;

       if (buffer) {
           AudioTimeStamp bufferStartTime;
           buffer->mAudioDataByteSize = 0;
           buffer->mPacketDescriptionCount = 0;

           if (_streamer.audioPacketQueue.count <= 0) {
               _streamer.emptyAudioBuffer = buffer;
               return status;
           }

           _streamer.emptyAudioBuffer = nil;

           while (_streamer.audioPacketQueue.count && buffer->mPacketDescriptionCount < buffer->mPacketDescriptionCapacity) {
               AVPacket *packet = [_streamer readPacket];

               if (buffer->mAudioDataBytesCapacity - buffer->mAudioDataByteSize >= packet->size) {
                   if (buffer->mPacketDescriptionCount == 0) {
                       bufferStartTime.mSampleTime = packet->dts * _audioCodecContext->frame_size;
                       bufferStartTime.mFlags = kAudioTimeStampSampleTimeValid;
                   }

                   memcpy((uint8_t *)buffer->mAudioData + buffer->mAudioDataByteSize, packet->data, packet->size);
                   buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mStartOffset = buffer->mAudioDataByteSize;
                   buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mDataByteSize = packet->size;
                   buffer->mPacketDescriptions[buffer->mPacketDescriptionCount].mVariableFramesInPacket = _audioCodecContext->frame_size;

                   buffer->mAudioDataByteSize += packet->size;
                   buffer->mPacketDescriptionCount++;


                   _streamer.audioPacketQueueSize -= packet->size;

                   av_free_packet(packet);
               }
               else {

                   //av_free_packet(packet);
                   break;
               }
           }

           [decodeLock_ lock];
           if (buffer->mPacketDescriptionCount > 0) {
               status = AudioQueueEnqueueBuffer(audioQueue_, buffer, 0, NULL);
               if (status != noErr) {
                   NSLog(@"Could not enqueue buffer.");
               }
           } else {
               AudioQueueStop(audioQueue_, NO);
               finished_ = YES;
           }

           [decodeLock_ unlock];
       }

       return status;
    }

    readPacket in FfmpegWrapper:

    - (AVPacket*)readPacket
    {
       if (_currentPacket.size > 0 || _inBuffer) return &_currentPacket;

       NSMutableData *packetData = [audioPacketQueue objectAtIndex:0];
       _packet = [packetData mutableBytes];

       if (_packet) {
           if (_packet->dts != AV_NOPTS_VALUE) {
               // note: av_rescale_q(0, ...) is always 0, so this adds nothing
               _packet->dts += av_rescale_q(0, AV_TIME_BASE_Q, _audioStream->time_base);
           }

           if (_packet->pts != AV_NOPTS_VALUE) {
               _packet->pts += av_rescale_q(0, AV_TIME_BASE_Q, _audioStream->time_base);
           }

           [audioPacketQueueLock lock];
           audioPacketQueueSize -= _packet->size;
           if ([audioPacketQueue count] > 0) {
               [audioPacketQueue removeObjectAtIndex:0];
           }
           [audioPacketQueueLock unlock];

           _currentPacket = *(_packet);
       }

       return &_currentPacket;
    }
  • ffmpeg streaming camera with directshow

    17 novembre 2015, par atu0830

    I am trying to use ffmpeg to stream one camera. The command is:

    ffmpeg.exe -y  -f dshow -i video="AmCam" -c:v copy  -framerate 7.5 -map 0:0 -f ssegment -segment_time 4 -segment_format mpegts  -segment_list "web\stream.m3u8"  -segment_list_size 720  -segment_list_flags live  -segment_wrap 10 -segment_list_type m3u8  "web\segments\s%%d.ts"

    And I created an HTML page in the web folder:

       <video controls="controls" width="720" height="405" autoplay="autoplay">
           <source src="stream.m3u8" type="application/x-mpegURL"></source>
       </video>

    All the ts files are generated, but when I load the page in Safari on an iPad it always shows a dark player that keeps loading.

  • Catalyst Open Source Academy

    16 January 2015, by Matthieu Aubry (Community)

    The Open Source Academy is an initiative designed to provide training and work experience for young New Zealand technologists. Catalyst organises the Academy to show young technologists how to participate in open source communities and to fully explore their passion for IT through freely available open source tools.

    It has been running annually since 2011. We are proud that the Piwik project could participate in the Academy again this year!

    What students got done

    It’s amazing what a few young students can get done in four days of participating in an open source project like Piwik! They were able to get started with Piwik quickly, and went on to make useful contributions to the Piwik analytics platform.

    New Darkness theme

    Liam has created a new dark theme for Piwik called Darkness.

    Darkness theme

    To create the theme, Liam had to improve the Piwik core stylesheets and created this pull request: Reuse the LESS variable for white color across all stylesheets.

    Accessibility improvements

    We were lucky to spend time with Julius, a Catalyst employee who is blind. He showed us in great detail how difficult and time-consuming it can be for a blind user to use Piwik. For example, we noticed how complicated it was for Julius to navigate the menus, get to the main content, and use the calendar and the website selector. During this presentation we also noticed that Piwik was not yet usable with the keyboard.

    As a result of this session with Julius we got to work with the students to improve accessibility in Piwik.

    Accessibility session on Piwik

    (photo source)

    List of accessibility improvements

    All these pull requests were created by the students and have been successfully merged into Piwik:

    To learn more about accessibility in Piwik check out this issue on our tracker.

    Summary

    Working with young students was fun and interesting. We were excited to see how much they got done in such a short time!

    At Piwik and Piwik PRO we are committed to building the best open analytics platform, and we will continue to support students who want to take part in the Piwik adventure.

    Be well,