Other articles (71)

  • Installation in farm mode

    4 February 2011

    Farm mode makes it possible to host several MediaSPIP sites while installing the functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which requires no real SPIP-specific knowledge since the usual SPIP private area is no longer used.
    To begin with, you must have installed the same files as the installation (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites that publish documents of all types online.
    It creates "media" items, where a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article;

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded as OGV and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded as OGG (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
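
    The article does not show the underlying commands, but conversions of this kind are typically driven by ffmpeg. A sketch of what an equivalent command set might look like (filenames, codec choices and quality settings are illustrative assumptions, not MediaSPIP's actual configuration):

    ffmpeg -i input.avi -c:v libtheora -q:v 7 -c:a libvorbis output.ogv
    ffmpeg -i input.avi -c:v libvpx -b:v 1M -c:a libvorbis output.webm
    ffmpeg -i input.avi -c:v libx264 -crf 23 -c:a aac output.mp4
    ffmpeg -i input.wav -c:a libmp3lame -q:a 4 output.mp3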

On other sites (13074)

  • Live555: X264 Stream Live source based on "testOnDemandRTSPServer"

    12 January 2017, by user2660369

    I am trying to create an RTSP server that streams the OpenGL output of my program. I had a look at How to write a Live555 FramedSource to allow me to stream H.264 live, but I need the stream to be unicast, so I had a look at testOnDemandRTSPServer. Using the same code fails. As I understand it, I need to provide memory in which to store my H.264 frames so the on-demand server can read them on demand.

    H264VideoStreamServerMediaSubsession.cpp

    #include "H264VideoStreamServerMediaSubsession.hh" // assumed name of the project header declaring this class
    #include "liveMedia.hh"

    H264VideoStreamServerMediaSubsession*
    H264VideoStreamServerMediaSubsession::createNew(UsageEnvironment& env,
                             Boolean reuseFirstSource) {
     return new H264VideoStreamServerMediaSubsession(env, reuseFirstSource);
    }

    H264VideoStreamServerMediaSubsession::H264VideoStreamServerMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
     : OnDemandServerMediaSubsession(env, reuseFirstSource), fAuxSDPLine(NULL), fDoneFlag(0), fDummyRTPSink(NULL) {
    }

    H264VideoStreamServerMediaSubsession::~H264VideoStreamServerMediaSubsession() {
     delete[] fAuxSDPLine;
    }

    static void afterPlayingDummy(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->afterPlayingDummy1();
    }

    void H264VideoStreamServerMediaSubsession::afterPlayingDummy1() {
     // Unschedule any pending 'checking' task:
     envir().taskScheduler().unscheduleDelayedTask(nextTask());
     // Signal the event loop that we're done:
     setDoneFlag();
    }

    static void checkForAuxSDPLine(void* clientData) {
     H264VideoStreamServerMediaSubsession* subsess = (H264VideoStreamServerMediaSubsession*)clientData;
     subsess->checkForAuxSDPLine1();
    }

    void H264VideoStreamServerMediaSubsession::checkForAuxSDPLine1() {
     char const* dasl;

     if (fAuxSDPLine != NULL) {
       // Signal the event loop that we're done:
       setDoneFlag();
     } else if (fDummyRTPSink != NULL && (dasl = fDummyRTPSink->auxSDPLine()) != NULL) {
       fAuxSDPLine = strDup(dasl);
       fDummyRTPSink = NULL;

       // Signal the event loop that we're done:
       setDoneFlag();
     } else {
       // try again after a brief delay:
       int uSecsToDelay = 100000; // 100 ms
       nextTask() = envir().taskScheduler().scheduleDelayedTask(uSecsToDelay,
                     (TaskFunc*)checkForAuxSDPLine, this);
     }
    }

    char const* H264VideoStreamServerMediaSubsession::getAuxSDPLine(RTPSink* rtpSink, FramedSource* inputSource) {
     if (fAuxSDPLine != NULL) return fAuxSDPLine; // it's already been set up (for a previous client)

     if (fDummyRTPSink == NULL) { // we're not already setting it up for another, concurrent stream
       // Note: For H264 video files, the 'config' information ("profile-level-id" and "sprop-parameter-sets") isn't known
       // until we start reading the file.  This means that "rtpSink"s "auxSDPLine()" will be NULL initially,
       // and we need to start reading data from our file until this changes.
       fDummyRTPSink = rtpSink;

       // Start reading the file:
       fDummyRTPSink->startPlaying(*inputSource, afterPlayingDummy, this);

       // Check whether the sink's 'auxSDPLine()' is ready:
       checkForAuxSDPLine(this);
     }

     envir().taskScheduler().doEventLoop(&fDoneFlag);

     return fAuxSDPLine;
    }

    FramedSource* H264VideoStreamServerMediaSubsession::createNewStreamSource(unsigned /*clientSessionId*/, unsigned& estBitrate) {
     estBitrate = 500; // kbps
     megamol::remotecontrol::View3D_MRC *parent = (megamol::remotecontrol::View3D_MRC*)this->parent;
     return H264VideoStreamFramer::createNew(envir(), parent->h264FramedSource);
    }

    RTPSink* H264VideoStreamServerMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* /*inputSource*/) {
     return H264VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
    }

    FramedSource.cpp

    #include "H264FramedSource.hh" // assumed name of the project header declaring this class
    extern "C" {                   // x264 and libswscale are C libraries
    #include <x264.h>
    #include <libswscale/swscale.h>
    }

    H264FramedSource* H264FramedSource::createNew(UsageEnvironment& env,
                                             unsigned preferredFrameSize,
                                             unsigned playTimePerFrame)
    {
       return new H264FramedSource(env, preferredFrameSize, playTimePerFrame);
    }

    H264FramedSource::H264FramedSource(UsageEnvironment& env,
                                  unsigned preferredFrameSize,
                                  unsigned playTimePerFrame)
       : FramedSource(env),
       fPreferredFrameSize(preferredFrameSize), // the original initialized this from fMaxSize, which has no meaningful value yet at construction time
       fPlayTimePerFrame(playTimePerFrame),
       fLastPlayTime(0),
       fCurIndex(0)
    {

       x264_param_default_preset(&param, "veryfast", "zerolatency");
       param.i_threads = 1;
       param.i_width = 1024;
       param.i_height = 768;
       param.i_fps_num = 30;
       param.i_fps_den = 1;
       // Intra refresh:
       param.i_keyint_max = 60;
       param.b_intra_refresh = 1;
       //Rate control:
       param.rc.i_rc_method = X264_RC_CRF;
       param.rc.f_rf_constant = 25;
       param.rc.f_rf_constant_max = 35;
       param.i_sps_id = 7;
       //For streaming:
       param.b_repeat_headers = 1;
       param.b_annexb = 1;
       x264_param_apply_profile(&param, "baseline");

       param.i_log_level = X264_LOG_ERROR;

       encoder = x264_encoder_open(&param);
       // x264_picture_alloc() re-initializes the picture, so allocate first and
       // set the per-picture fields afterwards (the original set them before the
       // alloc, which silently discarded the assignments):
       x264_picture_alloc(&pic_in, X264_CSP_I420, 1024, 768);
       pic_in.i_type      = X264_TYPE_AUTO;
       pic_in.i_qpplus1   = 0;
       pic_in.img.i_csp   = X264_CSP_I420;
       pic_in.img.i_plane = 3;

       // PIX_FMT_* are the pre-AV_PIX_FMT_* names used by older libswscale versions:
       convertCtx = sws_getContext(1024, 768, PIX_FMT_RGBA, 1024, 768, PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);
       eventTriggerId = envir().taskScheduler().createEventTrigger(deliverFrame0);
    }

    H264FramedSource::~H264FramedSource()
    {
       envir().taskScheduler().deleteEventTrigger(eventTriggerId);
       eventTriggerId = 0;
       // Note: encoder and convertCtx are never released; x264_encoder_close()
       // and sws_freeContext() would be the matching cleanup calls.
    }

    void H264FramedSource::AddToBuffer(uint8_t* buf, int surfaceSizeInBytes)
    {
       uint8_t* surfaceData = new uint8_t[surfaceSizeInBytes];
       memcpy(surfaceData, buf, surfaceSizeInBytes);

       // Convert the RGBA surface to I420 for the encoder:
       int srcstride = 1024 * 4;
       sws_scale(convertCtx, &surfaceData, &srcstride, 0, 768, pic_in.img.plane, pic_in.img.i_stride);

       x264_nal_t* nals = NULL;
       int i_nals = 0;
       int frame_size = x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

       if (frame_size >= 0)
       {
           static bool alreadydone = false;
           if (!alreadydone)
           {
               // Note: this overwrites 'nals' with the SPS/PPS headers, so the
               // NALs of this first encoded frame are lost:
               x264_encoder_headers(encoder, &nals, &i_nals);
               alreadydone = true;
           }
           for (int i = 0; i < i_nals; ++i)
           {
               // Caution: x264_nal_t::p_payload points into encoder-owned
               // memory that is only valid until the next encode call:
               m_queue.push(nals[i]);
           }
       }
       delete[] surfaceData;
       surfaceData = nullptr;

       envir().taskScheduler().triggerEvent(eventTriggerId, this);
    }

    void H264FramedSource::doGetNextFrame()
    {
       deliverFrame();
    }

    void H264FramedSource::deliverFrame0(void* clientData)
    {
       ((H264FramedSource*)clientData)->deliverFrame();
    }

    void H264FramedSource::deliverFrame()
    {
       x264_nal_t nalToDeliver;

       if (fPlayTimePerFrame > 0 && fPreferredFrameSize > 0) {
           if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
               // This is the first frame, so use the current time:
               gettimeofday(&fPresentationTime, NULL);
           } else {
               // Increment by the play time of the previous data:
               unsigned uSeconds = fPresentationTime.tv_usec + fLastPlayTime;
               fPresentationTime.tv_sec += uSeconds / 1000000;
               fPresentationTime.tv_usec = uSeconds % 1000000;
           }

           // Remember the play time of this data:
           fLastPlayTime = (fPlayTimePerFrame * fFrameSize) / fPreferredFrameSize;
           fDurationInMicroseconds = fLastPlayTime;
       } else {
           // We don't know a specific play time duration for this data,
           // so just record the current time as being the 'presentation time':
           gettimeofday(&fPresentationTime, NULL);
       }

       // Note: if the queue is empty, this returns without calling
       // FramedSource::afterGetting(), so the pending read is never completed:
       if (!m_queue.empty())
       {
           m_queue.wait_and_pop(nalToDeliver);

           uint8_t* newFrameDataStart = (uint8_t*)nalToDeliver.p_payload;
           unsigned newFrameSize = nalToDeliver.i_payload;

           // Deliver the data here:
           if (newFrameSize > fMaxSize) {
               fFrameSize = fMaxSize;
               fNumTruncatedBytes = newFrameSize - fMaxSize;
           } else {
               fFrameSize = newFrameSize;
           }

           // Copy only fFrameSize bytes; the original copied the full payload,
           // which overruns the destination buffer when the NAL is truncated:
           memcpy(fTo, newFrameDataStart, fFrameSize);

           FramedSource::afterGetting(this);
       }
    }

    Relevant part of the RTSP server thread

     RTSPServer* rtspServer = RTSPServer::createNew(*(parent->env), 8554, NULL);
     if (rtspServer == NULL) {
       *(parent->env) << "Failed to create RTSP server: " << (parent->env)->getResultMsg() << "\n";
       exit(1);
     }
     char const* streamName = "Stream";
     // The excerpt uses 'sms' without creating it; a ServerMediaSession is
     // presumably created along these lines:
     ServerMediaSession* sms = ServerMediaSession::createNew(*(parent->env), streamName, streamName, "OpenGL H264 stream");
     parent->h264FramedSource = H264FramedSource::createNew(*(parent->env), 0, 0);
     H264VideoStreamServerMediaSubsession *h264VideoStreamServerMediaSubsession = H264VideoStreamServerMediaSubsession::createNew(*(parent->env), true);
     h264VideoStreamServerMediaSubsession->parent = parent;
     sms->addSubsession(h264VideoStreamServerMediaSubsession);
     rtspServer->addServerMediaSession(sms);

     parent->env->taskScheduler().doEventLoop(); // does not return

    Once a connection exists, the render loop calls

    h264FramedSource->AddToBuffer(videoData, 1024*768*4);
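
    For reference: the m_queue member used above is not shown in the question. It has to be a thread-safe queue offering push and wait_and_pop, since NALs are produced by the render loop and consumed by the Live555 event loop. A minimal sketch of such a type (the class name and details are assumptions, not the asker's actual code):

    #include <condition_variable>
    #include <mutex>
    #include <queue>

    template <typename T>
    class ConcurrentQueue {
    public:
     void push(T const& item) {
       std::lock_guard<std::mutex> lock(fMutex);
       fQueue.push(item);
       fCond.notify_one(); // wake a blocked consumer
     }

     // Blocks until an item is available, then pops it into 'out':
     void wait_and_pop(T& out) {
       std::unique_lock<std::mutex> lock(fMutex);
       fCond.wait(lock, [this] { return !fQueue.empty(); });
       out = fQueue.front();
       fQueue.pop();
     }

     bool empty() const {
       std::lock_guard<std::mutex> lock(fMutex);
       return fQueue.empty();
     }

    private:
     mutable std::mutex fMutex;
     std::condition_variable fCond;
     std::queue<T> fQueue;
    };
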
  • ffmpeg says: "option framerate not found"

    16 July 2013, by Jason

    Good afternoon.

    I've been trying to use ffmpeg to split a .mpeg into a sequence of .jpeg images. Please note that this is the inverse problem of the one specified here, but the problem I'm facing is not the same as the problem the author of that thread is facing.

    Specifically, I have tried all the following MPEG codecs available in my ffmpeg installation:

    DEV D  ffv1            FFmpeg video codec #1
    DEVSD  ffvhuff         Huffyuv FFmpeg variant
    DEVSDT mpeg1video      MPEG-1 video
    D V DT mpeg1video_vdpau MPEG-1 video (VDPAU acceleration)
    D V D  mpeg2_crystalhd MPEG-2 Video (CrystalHD acceleration)
    DEVSDT mpeg2video      MPEG-2 video
    DEVSDT mpeg4           MPEG-4 part 2
    D V D  mpeg4_crystalhd MPEG-4 Part 2 (CrystalHD acceleration)
    D V DT mpeg4_vdpau     MPEG-4 part 2 (VDPAU)
    D VSDT mpegvideo       MPEG-1 video
    D V DT mpegvideo_vdpau MPEG-1/2 video (VDPAU acceleration)
    D VSDT mpegvideo_xvmc  MPEG-1/2 video XvMC (X-Video Motion Compensation)
    DEVSD  msmpeg4         MPEG-4 part 2 Microsoft variant version 3
    D V D  msmpeg4_crystalhd MPEG-4 Part 2 Microsoft variant version 3 (CrystalHD acceleration)
    D VSD  msmpeg4v1       MPEG-4 part 2 Microsoft variant version 1
    DEVSD  msmpeg4v2       MPEG-4 part 2 Microsoft variant version 2

    I've tried supplying ffmpeg with all the different codecs to try to split an input .mpeg video file with 2548 frames into 2548 .jpegs, using the -vcodec option:

    ffmpeg -r 2548 -i frames.mpeg frames_%06d.jpg -vcodec $codec

    where $codec is any codec among the ones listed above, e.g. ffv1, mpeg4, etc. However, all my efforts result in the following error:

    [NULL @ 0x1948f20] Value 4707126720094797824.000000 for parameter 'probesize' out of range
    [NULL @ 0x1948f20] Value 4707126720094797824.000000 for parameter 'analyzeduration' out of range
    [NULL @ 0x1948f20] Value 4697254411347427328.000000 for parameter 'indexmem' out of range
    [NULL @ 0x1948f20] Value 4703785510416416768.000000 for parameter 'rtbufsize' out of range
    [NULL @ 0x1948f20] Value -4616189618054758400.000000 for parameter 'fpsprobesize' out of range
    [NULL @ 0x1948f20] Value 4607182418800017408.000000 for parameter 'f_err_detect' out of range
    [NULL @ 0x1948f20] Value 4607182418800017408.000000 for parameter 'err_detect' out of range
    Option framerate not found.  

    This is weird, because I've actually specified the frame rate through the -r switch. I googled a bit, and it appears this error was mentioned approximately a year and a half ago on the ffmpeg bug tracker, but that information makes little sense to me because it's aimed at the devs. Curiously, if I interchange the position of the parameters and put -r after -i, or even after -vcodec, the error message changes to:

    [NULL @ 0x24abf20] Value 4707126720094797824.000000 for parameter 'probesize' out of range
    [NULL @ 0x24abf20] Value 4707126720094797824.000000 for parameter 'analyzeduration' out of range
    [NULL @ 0x24abf20] Value 4697254411347427328.000000 for parameter 'indexmem' out of range
    [NULL @ 0x24abf20] Value 4703785510416416768.000000 for parameter 'rtbufsize' out of range
    [NULL @ 0x24abf20] Value -4616189618054758400.000000 for parameter 'fpsprobesize' out of range
    [NULL @ 0x24abf20] Value 4607182418800017408.000000 for parameter 'f_err_detect' out of range
    [NULL @ 0x24abf20] Value 4607182418800017408.000000 for parameter 'err_detect' out of range
    [NULL @ 0x24b23c0] Value 4686111960511545344.000000 for parameter 'b' out of range
    [NULL @ 0x24b23c0] Value 4683532506232782848.000000 for parameter 'ab' out of range
    [NULL @ 0x24b23c0] Value 4705844345939427328.000000 for parameter 'bt' out of range
    [NULL @ 0x24b23c0] Value 4617315517961601024.000000 for parameter 'me_method' out of range
    [NULL @ 0x24b23c0] Value 4622945017495814144.000000 for parameter 'g' out of range
    [NULL @ 0x24b23c0] Value 4611686018427387904.000000 for parameter 'qmin' out of range
    [NULL @ 0x24b23c0] Value 4629418941960159232.000000 for parameter 'qmax' out of range
    [NULL @ 0x24b23c0] Value 4613937818241073152.000000 for parameter 'qdiff' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'wpredp' out of range
    [NULL @ 0x24b23c0] Value 4607182418800017408.000000 for parameter 'bug' out of range
    [NULL @ 0x24b23c0] Value 4607182418800017408.000000 for parameter 'er' out of range
    [NULL @ 0x24b23c0] Value 4607182418800017408.000000 for parameter 'err_detect' out of range
    [NULL @ 0x24b23c0] Value 4613937818241073152.000000 for parameter 'ec' out of range
    [NULL @ 0x24b23c0] Value 4620693217682128896.000000 for parameter 'ildctcmp' out of range
    [NULL @ 0x24b23c0] Value 4620693217682128896.000000 for parameter 'subq' out of range
    [NULL @ 0x24b23c0] Value 4696837138094751744.000000 for parameter 'ibias' out of range
    [NULL @ 0x24b23c0] Value 4696837138094751744.000000 for parameter 'pbias' out of range
    [NULL @ 0x24b23c0] Value 4642507528377204736.000000 for parameter 'lmin' out of range
    [NULL @ 0x24b23c0] Value 4660262442142531584.000000 for parameter 'lmax' out of range
    [NULL @ 0x24b23c0] Value 4696130710463840256.000000 for parameter 'flags2' out of range
    [NULL @ 0x24b23c0] Value 4607182418800017408.000000 for parameter 'threads' out of range
    [NULL @ 0x24b23c0] Value 4620693217682128896.000000 for parameter 'nssew' out of range
    [NULL @ 0x24b23c0] Value -4586705114244317184.000000 for parameter 'profile' out of range
    [NULL @ 0x24b23c0] Value -4586705114244317184.000000 for parameter 'level' out of range
    [NULL @ 0x24b23c0] Value 4623507967449235456.000000 for parameter 'skipcmp' out of range
    [NULL @ 0x24b23c0] Value 4642507528377204736.000000 for parameter 'mblmin' out of range
    [NULL @ 0x24b23c0] Value 4660262442142531584.000000 for parameter 'mblmax' out of range
    [NULL @ 0x24b23c0] Value 4643211215818981376.000000 for parameter 'mepc' out of range
    [NULL @ 0x24b23c0] Value 4607182418800017408.000000 for parameter 'bidir_refine' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'cqp' out of range
    [NULL @ 0x24b23c0] Value 4627730092099895296.000000 for parameter 'keyint_min' out of range
    [NULL @ 0x24b23c0] Value 4607182418800017408.000000 for parameter 'refs' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'directpred' out of range
    [NULL @ 0x24b23c0] Value 4618441417868443648.000000 for parameter 'sc_factor' out of range
    [NULL @ 0x24b23c0] Value 4643211215818981376.000000 for parameter 'mv0_threshold' out of range
    [NULL @ 0x24b23c0] Value 4630826316843712512.000000 for parameter 'b_sensitivity' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'compression_level' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'min_prediction_order' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'max_prediction_order' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'prediction_order_method' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'min_partition_order' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'max_partition_order' out of range
    [NULL @ 0x24b23c0] Value 4607182418800017408.000000 for parameter 'ticks_per_frame' out of range
    [NULL @ 0x24b23c0] Value 4611686018427387904.000000 for parameter 'color_primaries' out of range
    [NULL @ 0x24b23c0] Value 4611686018427387904.000000 for parameter 'color_trc' out of range
    [NULL @ 0x24b23c0] Value 4611686018427387904.000000 for parameter 'colorspace' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'aq_mode' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'rc_lookahead' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'lpc_type' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'lpc_passes' out of range
    [NULL @ 0x24b23c0] Value 4613937818241073152.000000 for parameter 'thread_type' out of range
    [NULL @ 0x24b23c0] Value -4616189618054758400.000000 for parameter 'request_sample_fmt' out of range
    frames.mpeg: could not find codec parameters

    I'm at a loss as to what to do with this problem, and am looking for some assistance.

    Kind regards,

    Jason
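
    A note on option placement: ffmpeg applies each option to the next input or output file named on the command line, so a -vcodec given after the output filename is not applied to that output, and -r given before -i sets the input frame rate rather than the output rate. The entries listed above also come from the decoder table; for .jpg output the relevant encoder is mjpeg. A reordered invocation (a sketch, not tested against this particular build) would be:

    ffmpeg -i frames.mpeg -vcodec mjpeg frames_%06d.jpg

    The implausibly large parameter values in the logs would also be consistent with an ffmpeg binary linked against libav* libraries from a different version, though that is an assumption rather than a confirmed diagnosis.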

  • Combining Video and Audio of different length in bulk

    25 May 2017, by user2981223

    I am going through a TV series right now and editing the files to my liking. I have one set of files with the video I want and one set with the audio. I have a batch file that takes the video from every file in folder "A" and the audio from every file in folder "B" and outputs the result to a folder named "output". But with this particular series, that is only half of what I need done.

    At the end of every episode in the "B" folder there is some extra material. What I would like to do is take the audio and video from "A" and the audio from "B", combine it all into one file, and also compare the timestamps of the "A" and "B" files and append the extra video from "B" to the output file.

    Let me put it another way. Let's say "A" is 1080p with Japanese audio and is 20 minutes long, and "B" is 720p with English audio and is 23 minutes long. I want the whole 1080p video with both audio tracks, plus the 720p remainder spliced onto the end. Both files start at the same spot, so syncing isn't an issue. The issue is that the difference in length varies from episode to episode: some episodes are 3 minutes longer, some only 30 seconds. Is there a way to make ffmpeg or another tool look at the difference in durations and just append the excess to the output file?

    Sorry for being long-winded. Thanks for any help and guidance.
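
    A sketch of the first half of this task, keeping the full-length "A" video and carrying both audio tracks, assuming per-episode files a.mkv and b.mkv (the names and container are illustrative):

    ffmpeg -i a.mkv -i b.mkv -map 0:v -map 0:a -map 1:a -c copy output.mkv

    Here -map 0:v and -map 0:a take the video and audio from "A", -map 1:a adds the audio track from "B", and -c copy avoids re-encoding. For the tail, the leftover minutes of "B" could be cut with -ss at an offset equal to "A"'s duration (measured per episode, e.g. with ffprobe) and then joined with ffmpeg's concat demuxer, which is why a single fixed batch command does not cover it.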