Advanced search

Media (1)

Word: - Tags -/intégration

Other articles (99)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (8581)

  • Live555 : X264 Stream Live source based on "testOnDemandRTSPServer"

    12 January 2017, by user2660369

    I am trying to create an RTSP server that streams the OpenGL output of my program. I had a look at "How to write a Live555 FramedSource to allow me to stream H.264 live", but I need the stream to be unicast, so I had a look at testOnDemandRTSPServer. Using the same code fails. To my understanding, I need to provide memory in which I store my H.264 frames so that the OnDemandServer can read them on demand.

    H264VideoStreamServerMediaSubsession.cpp

    H264VideoStreamServerMediaSubsession*
    H264VideoStreamServerMediaSubsession::createNew(UsageEnvironment& env,
                             Boolean reuseFirstSource) {
     return new H264VideoStreamServerMediaSubsession(env, reuseFirstSource);
    }

    H264VideoStreamServerMediaSubsession::H264VideoStreamServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
     : OnDemandServerMediaSubsession(env, reuseFirstSource), fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {
    }

    H264VideoStreamServerMediaSubsession::~H264VideoStreamServerMediaSubsession() {
     delete[] fAuxSDPLine;
    }

    static void afterPlayingDummy(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->afterPlayingDummy1();
    }

    void H264VideoStreamServerMediaSubsession::afterPlayingDummy1() {
     // Unschedule any pending 'checking' task:
     envir().taskScheduler().unscheduleDelayedTask(nextTask());
     // Signal the event loop that we're done:
     setDoneFlag();
    }

    static void checkForAuxSDPLine(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->checkForAuxSDPLine1();
    }

    void H264VideoStreamServerMediaSubsession::checkForAuxSDPLine1() {
     char const* dasl;

     if (fAuxSDPLine != NULL) {
       // Signal the event loop that we're done:
       setDoneFlag();
     } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
       fAuxSDPLine = strDup(dasl);
       fDummyRTPSink = NULL;

       // Signal the event loop that we're done:
       setDoneFlag();
     } else {
       // try again after a brief delay:
       int uSecsToDelay = 100000; // 100 ms
       nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
                     (TaskFunc*)checkForAuxSDPLine, this);
     }
    }

    char const* H264VideoStreamServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
     if (fAuxSDPLine != NULL) return fAuxSDPLine; // it's already been set up (for a previous client)

     if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
       // Note: For H264 video files, the 'config' information ("profile-level-id" and "sprop-parameter-sets") isn't known
       // until we start reading the file.  This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
       // and we need to start reading data from our file until this changes.
       fDummyRTPSink = rtpSink;

       // Start reading the file:
       fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);

       // Check whether the sink's 'auxSDPLine()' is ready:
       checkForAuxSDPLine(this);
     }

     envir().taskScheduler().doEventLoop(&fDoneFlag);

     return fAuxSDPLine;
    }

    FramedSource* H264VideoStreamServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
     estBitrate = 500; // kb
     megamol::remotecontrol::View3D_MRC *parent = (megamol::remotecontrol::View3D_MRC*)this->parent;
     return H264VideoStreamFramer::createNew(envir(), parent->h264FramedSource);
    }

    RTPSink* H264VideoStreamServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {
     return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }

    FramedSource.cpp

    H264FramedSource* H264FramedSource::createNew(UsageEnvironment& env,
                                             unsigned preferredFrameSize,
                                             unsigned playTimePerFrame)
    {
       return new H264FramedSource(env, preferredFrameSize, playTimePerFrame);
    }

    H264FramedSource::H264FramedSource(UsageEnvironment& env,
                                  unsigned preferredFrameSize,
                                  unsigned playTimePerFrame)
       : FramedSource(env),
       fPreferredFrameSize(fMaxSize),
       fPlayTimePerFrame(playTimePerFrame),
       fLastPlayTime(0),
       fCurIndex(0)
    {

       x264_param_default_preset(&param, "veryfast", "zerolatency");
       param.i_threads = 1;
       param.i_width = 1024;
       param.i_height = 768;
       param.i_fps_num = 30;
       param.i_fps_den = 1;
       // Intra refres:
       param.i_keyint_max = 60;
       param.b_intra_refresh = 1;
       //Rate control:
       param.rc.i_rc_method = X264_RC_CRF;
       param.rc.f_rf_constant = 25;
       param.rc.f_rf_constant_max = 35;
       param.i_sps_id = 7;
       //For streaming:
       param.b_repeat_headers = 1;
       param.b_annexb = 1;
       x264_param_apply_profile(&param, "baseline");

       param.i_log_level = X264_LOG_ERROR;

       encoder = x264_encoder_open(&param);
       pic_in.i_type            = X264_TYPE_AUTO;
       pic_in.i_qpplus1         = 0;
       pic_in.img.i_csp         = X264_CSP_I420;
       pic_in.img.i_plane       = 3;


       x264_picture_alloc(&pic_in, X264_CSP_I420, 1024, 768);

       convertCtx = sws_getContext(1024, 768, PIX_FMT_RGBA, 1024, 768, PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);
       eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }

    H264FramedSource::~H264FramedSource()
    {
       envir().taskScheduler().deleteEventTrigger(eventTriggerId);
       eventTriggerId = 0;
    }

    void H264FramedSource::AddToBuffer(uint8_t* buf, int surfaceSizeInBytes)
    {
       uint8_t* surfaceData = (new uint8_t[surfaceSizeInBytes]);

       memcpy(surfaceData, buf, surfaceSizeInBytes);

       int srcstride = 1024*4;
       sws_scale(convertCtx, &surfaceData, &srcstride,0, 768, pic_in.img.plane, pic_in.img.i_stride);
       x264_nal_t* nals = NULL;
       int i_nals = 0;
       int frame_size = -1;


       frame_size = x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

       static bool finished = false;

       if (frame_size >= 0)
       {
       static bool alreadydone = false;
       if(!alreadydone)
       {

           x264_encoder_headers(encoder, &nals, &i_nals);
           alreadydone = true;
       }
       for(int i = 0; i < i_nals; ++i)
       {
           m_queue.push(nals[i]);
       }
       }
       delete [] surfaceData;
       surfaceData = nullptr;

       envir().taskScheduler().triggerEvent(eventTriggerId, this);
    }

    void H264FramedSource::doGetNextFrame()
    {
       deliverFrame();
    }

    void H264FramedSource::deliverFrame0(void* clientData)
    {
       ((H264FramedSource*)clientData)->deliverFrame();
    }

    void H264FramedSource::deliverFrame()
    {
       x264_nal_t nalToDeliver;

       if (fPlayTimePerFrame > 0 && fPreferredFrameSize > 0) {
       if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
           // This is the first frame, so use the current time:
           gettimeofday(&fPresentationTime, NULL);
       } else {
           // Increment by the play time of the previous data:
           unsigned uSeconds   = fPresentationTime.tv_usec + fLastPlayTime;
           fPresentationTime.tv_sec += uSeconds/1000000;
           fPresentationTime.tv_usec = uSeconds%1000000;
       }

       // Remember the play time of this data:
       fLastPlayTime = (fPlayTimePerFrame*fFrameSize)/fPreferredFrameSize;
       fDurationInMicroseconds = fLastPlayTime;
       } else {
       // We don't know a specific play time duration for this data,
       // so just record the current time as being the 'presentation time':
       gettimeofday(&fPresentationTime, NULL);
       }

       if(!m_queue.empty())
       {
       m_queue.wait_and_pop(nalToDeliver);

       uint8_t* newFrameDataStart = (uint8_t*)0xD15EA5E;

       newFrameDataStart = (uint8_t*)(nalToDeliver.p_payload);
       unsigned newFrameSize = nalToDeliver.i_payload;

       // Deliver the data here:
       if (newFrameSize > fMaxSize) {
           fFrameSize = fMaxSize;
           fNumTruncatedBytes = newFrameSize - fMaxSize;
       }
       else {
           fFrameSize = newFrameSize;
       }

       memcpy(fTo, nalToDeliver.p_payload, nalToDeliver.i_payload);

       FramedSource::afterGetting(this);
       }
    }

    Relevant part of the RTSP server thread

     RTSPServer* rtspServer = RTSPServer::createNew(*(parent->env), 8554, NULL);
     if (rtspServer == NULL) {
       *(parent->env) << "Failed to create RTSP server: " << (parent->env)->getResultMsg() << "\n";
       exit(1);
     }
     char const* streamName = "Stream";
     parent->h264FramedSource = H264FramedSource::createNew(*(parent->env), 0, 0);
     H264VideoStreamServerMediaSubsession *h264VideoStreamServerMediaSubsession = H264VideoStreamServerMediaSubsession::createNew(*(parent->env), true);
     h264VideoStreamServerMediaSubsession->parent = parent;
     sms->addSubsession(h264VideoStreamServerMediaSubsession);
     rtspServer->addServerMediaSession(sms);

     parent->env->taskScheduler().doEventLoop(); // does not return

    Once a connection exists, the render loop calls

    h264FramedSource->AddToBuffer(videoData, 1024*768*4);
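
    The m_queue member used in FramedSource.cpp is not shown in the question. A minimal sketch of a thread-safe queue offering the push / wait_and_pop / empty interface the code assumes might look like this (the class name and locking strategy are assumptions, not the asker's actual implementation):

     #include <condition_variable>
     #include <mutex>
     #include <queue>

     template <typename T>
     class concurrent_queue {
     public:
      // Add an element and wake one waiting consumer.
      void push(T const& value) {
        std::lock_guard<std::mutex> lock(fMutex);
        fQueue.push(value);
        fCondition.notify_one();
      }

      // Block until an element is available, then copy it into 'out'.
      void wait_and_pop(T& out) {
        std::unique_lock<std::mutex> lock(fMutex);
        fCondition.wait(lock, [this] { return !fQueue.empty(); });
        out = fQueue.front();
        fQueue.pop();
      }

      bool empty() const {
        std::lock_guard<std::mutex> lock(fMutex);
        return fQueue.empty();
      }

     private:
      mutable std::mutex fMutex;
      std::queue<T> fQueue;
      std::condition_variable fCondition;
     };

    Note that the p_payload pointer inside an x264_nal_t is only guaranteed to stay valid until the next call into the encoder, so a production queue would normally copy the NAL bytes rather than store the struct by value.
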
  • rtp mpg stream using jrtplib occurred error "illegal timestamp unit"

    13 April 2013, by socket

    I want to write an RTP video stream sender and use VLC to play the stream on the LAN. I just have an MPEG file, "a.mpg". In my opinion, jrtplib can pack the packets well, so I think I can use the SendPacket() function directly to send the mpg file. Code below:

    #include "rtpsession.h"
    #include "rtpsessionparams.h"
    #include "rtpudpv4transmitter.h"
    #include "rtptimeutilities.h"
    #include "rtperrors.h"
    #include <cstdio>
    #include <cstdlib>
    #include <iostream>

    using namespace jrtplib;
    using namespace std;
    int main()
    {
    RTPSession s;
    RTPSessionParams sparam;
    s.SetTimestampUnit(160.0/104857.0);//~
    RTPUDPv4TransmissionParams tparam;
    tparam.SetPortbase(9000);
    int status=s.Create(sparam,&tparam);
    if(status<0)
    {
       cerr<<"error 1"<<endl;
       cerr<<RTPGetErrorString(status)<<endl;
       exit(-1);
    }
    //~
    s.SetDefaultMark(false);
    s.SetDefaultTimestampIncrement(160);//~~

    unsigned char buf[160];
    FILE*f=fopen("a.mpg","rb");
    while(!feof(f))
    {
       fread(buf,1,160,f);
       RTPTime delay(0.02);//~
       bool ret=false;
       status=s.SendPacket(buf,160);
       if(status<0)
       {
           cerr<<"error 3"<<endl;
           cerr<<RTPGetErrorString(status)<<endl;
           exit(-1);
       }
       RTPTime::Wait(delay);
    }
    fclose(f);
    return 0;
    }

    Then, I compile it:

    g++ jrtpsend.cpp -ljrtp -I /usr/local/include/jrtplib3

    After that, I run the program, but an error occurs:

    kl@kl-ubuntu:~/code/RTP$ ./a.out
    error 1
    Speficied an illegal timestamp unit for the the RTCP packet builder

    Actually, I don't understand the timestamp exactly. When I use ffplay to play a.mpg, it shows:

    Input #0, mpeg, from 'a.mpg':q=    0KB vq=    0KB sq=    0B f=0/0
    Duration: 00:01:00.69, start: 0.529089, bitrate: 556 kb/s
    Stream #0:0[0x1e0]: Video: mpeg1video, yuv420p, 640x480 [SAR 239:182 DAR 478:273],   104857 kb/s, 25 fps, 25 tbr, 90k tbn, 25 tbc
    Stream #0:1[0x1c0]: Audio: mp2, 44100 Hz, stereo, s16p, 128 kb/s

    2.46 A-V : -0.003 fd= 7 aq= 2KB vq= 33KB sq= 0B f=0/0

    So I used the 104857 kb/s figure in the code (SetTimestampUnit()), but I have no idea whether I used it correctly.
    All in all, please give me some suggestions on how to get the program running properly, or ...
    Best regards :-)
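
    In jrtplib the timestamp unit is expressed in seconds per RTP timestamp tick, i.e. 1/clock-rate, and RTP video normally runs on a 90 kHz clock, so the unit would be 1.0/90000.0 rather than anything derived from the bitrate; the "illegal timestamp unit" error from Create() typically means no positive unit was set on the RTPSessionParams it was given. A minimal sketch of session setup along those lines (payload type and timestamp increment are illustrative assumptions):

     #include "rtpsession.h"
     #include "rtpsessionparams.h"
     #include "rtpudpv4transmitter.h"
     #include "rtperrors.h"
     #include <iostream>

     using namespace jrtplib;

     int main()
     {
        RTPSession session;
        RTPSessionParams sessionParams;
        RTPUDPv4TransmissionParams transmissionParams;

        // 90 kHz is the standard RTP clock for video payloads,
        // so one timestamp tick is 1/90000 of a second.
        sessionParams.SetOwnTimestampUnit(1.0/90000.0);
        transmissionParams.SetPortbase(9000);

        int status = session.Create(sessionParams, &transmissionParams);
        if (status < 0)
        {
            std::cerr << RTPGetErrorString(status) << std::endl;
            return 1;
        }

        // At 25 fps one frame spans 90000/25 = 3600 timestamp ticks.
        session.SetDefaultPayloadType(96);   // dynamic payload type (assumption)
        session.SetDefaultMark(false);
        session.SetDefaultTimestampIncrement(3600);

        // ... packetize and send the elementary stream here ...
        return 0;
     }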

  • Android FFmpeg reports "file protocol not found"

    27 April 2017, by Abdul Mateen

    I am trying to convert a sequence of images to a video using an ffmpeg command on Android.
    I have built ffmpeg for Android and put the ffmpeg binary in the /data/local folder.
    I am running the ffmpeg command using Runtime.getRuntime().exec("mycommand") in Android.

    The structure of mycommand is like this:

    "/data/local/ffmpeg -r 40 qscale 2 -i /mnt/sdcard/images/img%d.jpg /mnt/sdcard/images/finalvideo.mp4"

    mycommand contains the path of the folder where I placed all my sequence images. Running this command gives me the error "Protocol not found".

    I also read in the ffmpeg documentation that the file protocol is the default.

    How can I run such commands?

    The error I am getting after running a simple command like ffmpeg -i /mnt/sdcard/vid.mp4:

    ffmpeg version 0.11.1.git Copyright (c) 2000-2012 the FFmpeg developers
    built on Sep 26 2012 11:58:35 with gcc 4.4.3 (GCC)
    configuration: --target-os=linux --prefix=/home/esteves/android-ffmpeg --enable-cross-compile --enable-runtime-cpudetect --disable-asm --arch=arm --cc=/home/esteves/android-ndk/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/arm-linux-androideabi-gcc --cross-prefix=/home/esteves/android-ndk/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/arm-linux-androideabi- --disable-stripping --nm=/home/esteves/android-ndk/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/arm-linux-androideabi-nm --sysroot=/home/esteves/android-ndk/platforms/android-8/arch-arm --enable-nonfree --enable-version3 --disable-everything --enable-gpl --disable-doc --enable-avresample --enable-demuxer=amr --enable-demuxer=aac --enable-demuxer=rawvideo --enable-demuxer=rtsp --enable-muxer=rtsp --enable-muxer=flv --enable-muxer=mpegts --enable-muxer=mp4 --enable-demuxer=rtp --enable-demuxer=rtp --disable-ffplay --disable-ffserver --enable-ffmpeg --disable-ffprobe --enable-libx264 --enable-encoder=libx264   libavutil      51. 72.100 / 51. 72.100
    libavcodec     54. 55.100 / 54. 55.100
    libavformat    54. 25.105 / 54. 25.105
    libswscale      2.  1.101 /  2.  1.101
    error of main   libswresample   0. 15.100 /  0. 15.100
    error of main   libpostproc    52.  0.100 / 52.  0.100
    /mnt/sdcard/vid.mp4: Protocol not found
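
    For comparison, the conventional form of an image-sequence command puts a dash in front of every option (the question's command has "qscale" without one) and keeps the input frame rate before -i, for example:

    /data/local/ffmpeg -r 40 -i /mnt/sdcard/images/img%d.jpg -qscale 2 /mnt/sdcard/images/finalvideo.mp4

    Note as well that the configuration shown above was built with --disable-everything and never re-enables any protocols (e.g. --enable-protocol=file), which would leave the file protocol out of the binary and is consistent with getting "Protocol not found" even for a plain local -i input.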