Other articles (45)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • The plugin: Podcasts

    14 July 2010

    The problem of podcasting is once again one that highlights the standardization of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly geared towards iTunes, whose spec is here; the "Media RSS Module" format, which is more "free" and notably backed by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

On other sites (6597)

  • Using FFmpeg with Android-NDK

    4 February 2014, by iSun

    I want to use FFmpeg with the Android NDK. I am a complete beginner in this area; I need some help, and I have already researched this topic on the net without finding a satisfactory solution.

    1 - How can I use FFmpeg on Windows? (I do not want to use Ubuntu, so please do not suggest it.)

    2 - How can I invoke FFmpeg commands via the NDK?

    One last note: I am using android-ndk-r7b and Cygwin.

    Any suggestion would be appreciated.
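
    For what it's worth, here is a minimal sketch (not from the original post) of the usual approach: compile FFmpeg's ffmpeg.c into a shared library with its main() renamed, then call it through a small JNI bridge. The class name FFmpegBridge and the renamed entry point ffmpeg_main below are hypothetical:

    // jni_bridge.cpp - hypothetical JNI glue; assumes ffmpeg.c was compiled
    // into the shared library with its main() renamed to ffmpeg_main.
    #include <jni.h>
    #include <string>
    #include <vector>

    extern "C" int ffmpeg_main(int argc, char **argv);  // hypothetical rename of ffmpeg.c's main()

    extern "C" JNIEXPORT jint JNICALL
    Java_com_example_FFmpegBridge_run(JNIEnv *env, jclass, jobjectArray args)
    {
        // Copy the Java String[] into a char*[] that ffmpeg_main can consume.
        int argc = env->GetArrayLength(args);
        std::vector<std::string> storage(argc);
        std::vector<char *> argv(argc);
        for (int i = 0; i < argc; ++i) {
            jstring s = (jstring) env->GetObjectArrayElement(args, i);
            const char *utf = env->GetStringUTFChars(s, NULL);
            storage[i] = utf;                   // keep a stable copy
            env->ReleaseStringUTFChars(s, utf);
            argv[i] = &storage[i][0];
        }
        return ffmpeg_main(argc, argc ? &argv[0] : NULL);
    }

    From Java this would be reached with System.loadLibrary() and a native method declaration, passing the same argument array one would give the ffmpeg command line (starting with "ffmpeg" as argv[0]).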

  • Build Live Audio Stream Player

    5 December 2011, by Kurt

    For an internship project I've been trying to develop a simple player for a live audio stream.

    Currently I'm using a homemade triple-buffering solution (three buffers of 1/3 s each) played through QAudioOutput, which re-triggers itself after it finishes reading each buffer.

    void VideoServer::getBuf(QBuffer * p_buf)
    {
       audio_chunk*        ac = NULL;
       std::vector<short>  v;  // 16-bit samples, hence the * 2 byte count below

       // Collect 20 audio chunks, polling the shared queue every 50 ms when empty.
       for (int i = 0; i < 20;)
       {
           ac = _audioPreviewSharedData->deQueueAudio();
           if (ac)
           {
               v.insert(v.end(), ac->v_buf.begin(), ac->v_buf.end());
               i++;
               delete ac;
           }
           else
               usleep(50000);
       }
       // Reload the QBuffer with the freshly collected samples.
       p_buf->close();
       p_buf->setData(reinterpret_cast<const char *>(&v[0]), v.size() * 2);
       p_buf->open(QIODevice::ReadOnly);
    }

    -

    void VideoServer::slot_launchAudioPreviewBuffering()
    {
       // Pre-fill two of the three buffers before playback starts.
       getBuf(_buf1);
       getBuf(_buf2);
       _state = 2;
       connect(_audioPreviewTimer, SIGNAL(timeout()), this, SLOT(slot_audioPreviewBuffering()));
       _audioPreviewTimer->start(0);
       connect(_audioOut, SIGNAL(stateChanged(QAudio::State)), this, SLOT(finishedPlaying(QAudio::State)));
    }

    -

    void VideoServer::finishedPlaying(QAudio::State state)
    {
       // QAudioOutput goes idle once its current QBuffer is exhausted:
       // rotate to the next buffer right away.
       if (state == QAudio::IdleState) {
           slot_audioPreviewBuffering();
       }
    }

    -

    void VideoServer::slot_audioPreviewBuffering()
    {
       // Rotate through the three buffers: start playing the next full one,
       // then refill the buffer that was just consumed.
       switch (_state) {
       case 0:
           {
               _audioOut->start(_buf2);
               getBuf(_buf1);
               _state = 1;
               break;
           }
       case 1:
           {
               _audioOut->start(_buf3);
               getBuf(_buf2);
               _state = 2;
               break;
           }
       case 2:
           {
               _audioOut->start(_buf1);
               getBuf(_buf3);
               _state = 0;
               break;
           }
       }
    }

    But I'm getting choppy sound (small interruptions between audio chunks).

    How can I play this stream without interruptions and with a reasonable delay between audio and video (less than 1 s)? Is there a better way? Am I doing something wrong?

    Thank you!
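
    For reference, a minimal sketch of an alternative pull-mode design (the AudioQueueDevice class and its popBytes() helper are hypothetical stand-ins around _audioPreviewSharedData): instead of restarting QAudioOutput on three rotating QBuffers, which stops the device between buffers, expose the chunk queue as one sequential QIODevice that QAudioOutput reads from continuously:

    // Hypothetical sketch: a single endless QIODevice over the shared audio
    // queue, so QAudioOutput never reaches the end of a buffer.
    #include <QIODevice>

    class AudioQueueDevice : public QIODevice
    {
    public:
        explicit AudioQueueDevice(SharedAudioData *queue, QObject *parent = 0)
            : QIODevice(parent), _queue(queue) {}

        bool isSequential() const { return true; }  // live stream: no seeking

    protected:
        // QAudioOutput calls this whenever it needs more samples.
        qint64 readData(char *data, qint64 maxlen)
        {
            // popBytes() is a hypothetical helper copying up to maxlen bytes
            // out of the queue; returning 0 (rather than -1) signals "nothing
            // yet, try again" instead of end-of-stream.
            return _queue->popBytes(data, maxlen);
        }
        qint64 writeData(const char *, qint64) { return -1; }  // read-only

    private:
        SharedAudioData *_queue;
    };

    Opened with QIODevice::ReadOnly and handed once to _audioOut->start(), this would replace the timer and the three-buffer state machine; the remaining tuning knob is QAudioOutput::setBufferSize(), which bounds the audio latency.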

  • Webcam stream with FFMpeg on iPhone

    6 December 2011, by Saphrosit

    I'm trying to send and show a webcam stream from a Linux server to an iPhone app. I don't know if it's the best solution, but I downloaded and installed FFMpeg on the Linux server (following, for those who want to know, this tutorial).
    FFMpeg is working fine. After a lot of wandering, I managed to send a stream to the client by launching

    ffmpeg  -s 320x240 -f video4linux2 -i /dev/video0 -f mpegts -vcodec libx264 udp://192.168.1.34:1234

    where 192.168.1.34 is the address of the client. Actually the client is a Mac, but it is supposed to be an iPhone. I know the stream is sent and received correctly (tested in different ways).
    However, I haven't managed to watch the stream directly on the iPhone.
    I thought of several (possible) solutions:

    • First solution: store incoming data in an NSMutableData object. Then, when the stream ends, save it to disk and play it using an MPMoviePlayerController. Here's the code:

      // Dump the accumulated stream bytes to a file, then play that file.
      [video writeToFile:@"videoStream.m4v" atomically:YES];
      NSURL *url = [NSURL fileURLWithPath:@"videoStream.m4v"];

      MPMoviePlayerController *videoController = [[MPMoviePlayerController alloc] initWithContentURL:url];

      [videoController.view setFrame:CGRectMake(100, 100, 150, 150)];

      [self.view addSubview:videoController.view];

      [videoController play];

      the problem with this solution is that nothing is played (I only see a black square), even though the video is saved correctly (I can play it directly from my disk using VLC). Besides, it's not such a great idea; it's just to get things working.

    • Second solution: use CMSampleBufferRef to store the incoming video. Many more problems come with this solution: first of all, there's no CoreMedia.framework on my system. Besides, I don't fully understand what this class represents or what I should do to make it work: I mean, if I start (somehow) filling this "SampleBuffer" with the bytes I receive over the UDP connection, will it automatically call the CMSampleBufferMakeDataReadyCallback function I set during creation? If so, when? When a single frame is complete, or when the whole stream has been received?

    • Third solution: use the AVFoundation framework (this isn't actually available on my Mac either). I could not work out whether it's actually possible to start recording from a remote source, or even from an NSMutableData, a char* or something like that. In the AVFoundation Programming Guide I didn't find any reference saying whether it's possible or not.

    I don't know which of these solutions is best for my purpose. ANY suggestion would be appreciated.

    Besides, there's also another problem: I didn't use any segmenter program to send the video. Now, if I'm not mistaken, a segmenter needs to split the source video into smaller/shorter videos that are easier to send. If that's right, then maybe it's not strictly necessary to get things working (it could be added later). However, since the server runs Linux, I cannot use Apple's mediastreamsegmenter. Can someone suggest an open-source segmenter to use together with FFMpeg?


    UPDATE: I edited my question, adding more information on what I have done so far and what my doubts are.
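
    For reference, a minimal sketch of a fourth option (assuming FFmpeg's own libavformat can be cross-compiled for iOS; the function name open_udp_stream is made up here): open the UDP MPEG-TS stream directly and pull demuxed packets, instead of accumulating raw bytes into an NSMutableData:

    // Hypothetical sketch using the 2011-era libavformat API: demux the
    // incoming MPEG-TS over UDP; each packet could then be fed to a decoder.
    extern "C" {
    #include <libavformat/avformat.h>
    }

    int open_udp_stream(const char *url)    // e.g. "udp://192.168.1.34:1234"
    {
        av_register_all();                  // register demuxers/decoders
        avformat_network_init();            // enable the udp:// protocol

        AVFormatContext *fmt = NULL;
        if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
            return -1;                      // could not open the stream
        if (avformat_find_stream_info(fmt, NULL) < 0)
            return -1;                      // could not probe the streams

        AVPacket pkt;
        while (av_read_frame(fmt, &pkt) >= 0) {
            // pkt holds one demuxed (still encoded) packet; a decoder or a
            // frame queue would consume it here.
            av_free_packet(&pkt);
        }
        avformat_close_input(&fmt);
        return 0;
    }

    This sidesteps both the segmenter question (no HTTP Live Streaming involved) and the need to write the stream to a file before playing it.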