
Other articles (21)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can edit their information on the authors page

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    As a first step, (...)

  • Support for all media types

    10 April 2011

    Unlike many modern programs and document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

On other sites (4139)

  • avfilter/vf_delogo : round to the closest value

    9 December 2015, by Jean Delvare

    When the interpolated value is divided by the sum of weights, no
    rounding is done, which means the value is truncated. This results in
    a slight bias towards dark green in the interpolated area. Rounding
    properly removes the bias.

    I measured this change to reduce the interpolation error by 1 to 2%
    on average on a number of sample input and logo area combinations.

    Signed-off-by: Jean Delvare <jdelvare@suse.de>
    Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>

    • [DH] libavfilter/vf_delogo.c
    • [DH] tests/ref/fate/filter-delogo
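
    The effect the commit describes can be illustrated with a generic integer-division sketch (this is not the actual vf_delogo code, just the arithmetic the commit message refers to): plain integer division floors the quotient, biasing results downward, while adding half the divisor first rounds to the closest value.

    ```cpp
    #include <cassert>

    // Truncating division: the fractional part is discarded,
    // so interpolated values are biased toward zero (slightly darker).
    int interp_truncate(int weighted_sum, int weight_total) {
        return weighted_sum / weight_total;
    }

    // Rounded division: adding half the divisor before dividing
    // yields the closest integer instead of the floor.
    int interp_round(int weighted_sum, int weight_total) {
        return (weighted_sum + weight_total / 2) / weight_total;
    }

    int main() {
        // 29 / 10 = 2.9: truncation gives 2, rounding gives 3.
        assert(interp_truncate(29, 10) == 2);
        assert(interp_round(29, 10) == 3);
        return 0;
    }
    ```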
  • Passing pointer to thread via message_loop

    21 August 2016, by Mohammad Abu Musa

    I am using FFMPEG to encode a video. I get a green screen for the packets, which I think means I am getting an empty buffer. I suspect I am passing the parameters incorrectly, and I would like help getting them passed correctly.

       vpx_codec_iter_t iter = NULL;
       const vpx_codec_cx_pkt_t *pkt;
       // copy the pixels into our "raw input" container.
       int bytes_filled = avpicture_fill(&pic_raw, data, AV_PIX_FMT_YUV420P, out_width, out_height);
       if(!bytes_filled) {
           Logger::Log("Cannot fill the raw input buffer");
           return;
       }

       if(vpx_codec_encode(&codec, &raw, frame_cnt, 1, flags, VPX_DL_REALTIME))
             die_codec(&codec, "Failed to encode frame");

       while( (pkt = vpx_codec_get_cx_data(&codec, &iter)) ) {
           switch(pkt->kind) {
               case VPX_CODEC_CX_FRAME_PKT:
                   glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::write_ivf_frame_header, pkt));
                   glb_app_thread.message_loop().PostWork(callback_factory_.NewCallback(&EncoderInstance::WriteFile, pkt));
                   break;
               default:break;
           }
       }

    For the event handler:

    void EncoderInstance::WriteFile(int32_t result, const vpx_codec_cx_pkt_t *pkt){
       lock_file_.Acquire();
       (void) fwrite(pkt->data.frame.buf, 1, pkt->data.frame.sz, outfile);
       Logger::Log("Packet written");
       lock_file_.Release();

    }

    My question is: am I passing the argument pkt correctly? Or should I put these packets into a queue and let the WriteFile function work on the queue?
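
    One likely hazard worth checking (an observation about the libvpx API, not a confirmed diagnosis of this code): the pkt pointer returned by vpx_codec_get_cx_data points at codec-owned memory that is only valid until the next call into the codec, so by the time a callback posted to another thread runs, the buffer may already have been reused. A common pattern is to deep-copy each packet's bytes into an owned buffer before posting it. A minimal, libvpx-free sketch of that copy-then-queue idea (all names here are illustrative):

    ```cpp
    #include <cassert>
    #include <cstdint>
    #include <cstring>
    #include <mutex>
    #include <queue>
    #include <vector>

    // An owned copy of a compressed frame, safe to hand to another thread.
    struct OwnedPacket {
        std::vector<uint8_t> bytes;
    };

    std::mutex queue_lock;
    std::queue<OwnedPacket> pending;  // drained by the writer thread

    // Called on the encoder thread: copy the codec-owned buffer
    // before the next encode call can invalidate it.
    void enqueue_packet(const void* buf, size_t sz) {
        OwnedPacket p;
        p.bytes.assign(static_cast<const uint8_t*>(buf),
                       static_cast<const uint8_t*>(buf) + sz);
        std::lock_guard<std::mutex> guard(queue_lock);
        pending.push(std::move(p));
    }

    int main() {
        uint8_t fake_frame[4] = {1, 2, 3, 4};
        enqueue_packet(fake_frame, sizeof(fake_frame));
        std::memset(fake_frame, 0, sizeof(fake_frame));  // simulate buffer reuse
        assert(pending.front().bytes[0] == 1);  // the owned copy is unaffected
        return 0;
    }
    ```

    The writer callback would then pop an OwnedPacket from the queue and fwrite its bytes, instead of dereferencing the original pkt pointer.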

  • Using ffmpeg on RTOS

    7 April 2015, by Dhiraj

    I am trying to capture video and audio from a webcam and stream it wirelessly through a software-defined radio. Essentially, I need to packetize the video stream so that it is suitable for the transport layer implemented in INTEGRITY OS running on an ARM processor. While I am able to capture the video and transmit it wirelessly, on the receiving end, when I try to view the video using ffplay, the quality is very bad: ugly green patches and video tearing. Do pardon my ignorance, but ffmpeg is not my forte. This is how I am sending the video:

    ffmpeg -rtbufsize 1500M -f dshow -i video="Vimicro USB Camera (Altair)":audio="Microphone (Realtek High Definition Audio)" -r 10 -vcodec libx264 -threads 0 -crf 23 -preset ultrafast -tune zerolatency -acodec libmp3lame -b 600k -flush_packets 0 -f mpegts udp://192.9.200.254:8000?pkt_size=1128

    On the receiver end, I run ffplay using the following command:

    ffplay udp://192.9.200.69:8000

    Importantly, the video from the USB camera is sent over ethernet to an ARM processor running INTEGRITY RTOS. The transport layer of the Software Defined Radio is implemented in the RTOS. This is where the video data is multiplexed with other application data being transmitted through the SDR, hence the stringent requirement on the packet size (1128 bytes). From the ARM processor, the data packet is sent on to a DSP, where the network and DLL layers are implemented, and finally on to an FPGA, where the PHY layer is implemented.
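    It may be worth noting (a general MPEG-TS observation, not a diagnosis of this setup) that the 1128-byte pkt_size is exactly six 188-byte MPEG-TS packets, so each UDP datagram carries whole TS packets; a payload size that is not a multiple of 188 would split TS packets across datagrams and make a single lost datagram corrupt more of the stream. The arithmetic:

    ```cpp
    #include <cassert>

    int main() {
        const int ts_packet_size = 188;  // fixed MPEG-TS packet length
        const int udp_payload = 1128;    // pkt_size used in the command above
        assert(udp_payload % ts_packet_size == 0);  // whole TS packets per datagram
        assert(udp_payload / ts_packet_size == 6);  // six packets each
        return 0;
    }
    ```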

    Apart from using ffplay, I have also tried mplayer; however, the video output is equally bad.
    Any help would be greatly appreciated.