Media (29)

Tag search: Musique

Other articles (72)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or later. If needed, contact your MédiaSpip administrator to find out.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (11564)

  • Optimize ffmpeg for text with mpeg1

    20 September 2022, by lmalves

    I'm fairly new to video encoding and decoding, but I'm currently working with NodeJS and I need to stream the content of a window (using Electron) to an Angular app.

    


    In my current implementation I'm "grabbing" the app window content:

    


    ffmpeg -f gdigrab -i title="Windows App" -vcodec mpeg1video -r 30 -b:v 10  -bufsize:v 327680 -f rtp rtp://127.0.0.1:1234

    


    Then I'm converting it to HTTP:

    


    ffmpeg -i rtp://127.0.0.1:1234 -protocol_whitelist file,crypto,udp,rtp,http,tcp -f mpegts -codec:v mpeg1video -s 500x515 -b:v 10 -bf 0 http://localhost:59655/

    


    On the client app, I'm using MpegJS to display it with almost the default configuration.

    


    It works relatively well, except that wherever the font is small the text is almost impossible to read.

    


    As I said, I'm new to this and I'm not sure how to optimize it for text.
    Any suggestions?
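    The bitrate in the quoted commands is the most likely culprit: `-b:v 10` asks ffmpeg for roughly 10 bits per second, and mpeg1video needs a few Mb/s before small glyphs survive quantization. A sketch of adjusted commands follows (window title, ports, and the 2500k figure are assumptions carried over from the question, not tested values):

    ```shell
    # Adjusted capture: raise the video bitrate from ~10 bit/s to 2.5 Mb/s.
    ffmpeg -f gdigrab -framerate 30 -i title="Windows App" \
           -vcodec mpeg1video -b:v 2500k -bufsize:v 327680 \
           -f rtp rtp://127.0.0.1:1234

    # Adjusted relay: -protocol_whitelist is an input option, so it must
    # appear before -i to take effect; dropping the -s 500x515 downscale
    # avoids blurring small text when the source window is larger.
    ffmpeg -protocol_whitelist file,crypto,udp,rtp,http,tcp -i rtp://127.0.0.1:1234 \
           -f mpegts -codec:v mpeg1video -b:v 2500k -bf 0 http://localhost:59655/
    ```

    Keeping the stream at the window's native resolution matters as much as bitrate here: any rescale resamples the glyph edges that make small text readable.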

    


  • How to Output Mjpeg from Kokorin Jaffree FFmpeg via UDP to a Localhost Port?

    14 October 2022, by roba

    I have a Java program which displays dual webcams and records them to file in FHD 30fps H264/H265. It uses Sarxos Webcam for the initial setup and display, but switches to Jaffree FFmpeg for recording. While recording, Sarxos Webcam must release its webcam access and cannot display.

    


    I have tried recording with Xuggler/Sarxos but Sarxos seems to only access raw video from the webcams which creates limitations in the frame rate and resolution which can be achieved. At 1920x1080 the cameras can only deliver 5 fps raw video.

    


    I am trying to direct mjpeg streams from Jaffree to localports for display purposes during recording but I cannot figure out how to do it.

    


    Simultaneous recording plus sending to a port can be done from the terminal with the following:

    


    ffmpeg -f  dshow  -video_size 1920x1080 -rtbufsize 944640k -framerate 25 -vcodec mjpeg  -i video="Logitech Webcam C930e" -pix_fmt yuv420p -c:v libx264 outFHDx25.mp4 -f mpegts udp://localhost:1234?pkt_size=188&buffer_size=65535


    


    and viewed from the port in a different terminal like this:

    


    ffplay -i udp://localhost:1234


    


    The video which displays is a little blocky compared with the video recorded to file. Any suggestions on how to improve this would be appreciated.
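    One likely cause of the blockiness: in the quoted terminal command the second (mpegts/UDP) output has no explicit codec, so ffmpeg falls back to the mpegts default, mpeg2video, at its default bitrate. A sketch that gives that output its own codec and bitrate (device name and ports copied from the question; the 6000k figure is an assumption):

    ```shell
    # Each output gets its own -c:v/-b:v; quoting the UDP URL keeps the
    # shell from treating '&' as a command separator.
    ffmpeg -f dshow -video_size 1920x1080 -rtbufsize 944640k -framerate 25 \
           -vcodec mjpeg -i video="Logitech Webcam C930e" \
           -pix_fmt yuv420p -c:v libx264 outFHDx25.mp4 \
           -c:v mpeg2video -b:v 6000k -f mpegts \
           "udp://localhost:1234?pkt_size=188&buffer_size=65535"
    ```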

    


    Note that FFPlay is not included in Jaffree FFMpeg.

    


    I would like to send the mjpeg to a port and then read it into the Sarxos Webcam viewer to display while recording is in progress.

    


    The Jaffree Java code for recording the output of one webcam to file follows. It takes the mjpeg/yuv422p output from the webcam and normally encodes it to file as H264/yuv420p:

    


    public static FFmpeg createTestFFmpeg() {
      String camera1Ref = "video=" + cam1Vid + ":audio=" + cam1Aud;
          return FFmpeg.atPath()
              .addArguments("-f", "dshow")  //selects dshow for Windows
              .addArguments("-video_size", resString)  //video resolution  eg 1920x1080          
              .addArguments("-rtbufsize", rtBufResultString) 
              .addArguments("-thread_queue_size", threadQ)
              .addArguments("-framerate", fpsString)   // capture frame rate  eg 30fps         
              .addArguments(codec, vidString)  //set capture encode mode from camera
              .addArgument(audio) //on or off
              .addArguments("-i", camera1Ref)   // name of camera to capture
              .addArguments("-pix_fmt", pixFmt)
              .addArguments("-c:v", enc2)  //eg enc2 = "libx264", "h264_nvenc"
              .addArguments(enc3, enc4)  //enc3 = "-crf", enc4 = "20"
              .addArguments(enc5, enc6)  //enc5 = "-gpu:v", enc6 = "0"
              .addArguments(enc7, enc8)  //enc7 = "-cq:v", enc8 = "20"
              .addArguments(enc9, enc10)  //enc9 = "-rc:v", enc10 = "vbr"
              .addArguments(enc11, enc12)  //enc11 = "-tune:v", enc12 = "ll"
              .addArguments(enc13, enc14)  //enc13 = "-preset:v", enc14 = "p1" 
              .addArguments(enc15,enc16)  //enc15 = "-b:v", enc16 = "0"
              .addArguments(enc17, enc18)  //enc17 = "-maxrate:v", enc18 = "5000k"
              .addArguments(enc19, enc20)  //enc19 = "-bufsize:v", enc20 = "5000k"
              .addArguments(enc21, enc22)  //enc21 = "-profile:v", enc22 = "main"
              .addArgument(noFFStats) //"-nostats", stops logging progress/statistics
              .addArguments("-loglevel", ffLogLevel)  //error logging
              .addArgument(bannerResultString)  // "-hide_banner"
              .addArguments("-rtbufsize", rtBufResultString) 
              .setOverwriteOutput(true)   // overwrite filename if it exists  Boolean = overwriteFile
              .addOutput(
                  UrlOutput
                      .toUrl(filePathL))                    
              .setProgressListener(new ProgressListener(){
                  @Override
                  public void onProgress(FFmpegProgress progress){
                     if(ffProgress){ 
                          System.out.println(progress);
                          
                     } 
                    }
            } );
            
   }


    


    How and where do I add the code to output mjpeg via UDP to a localport while simultaneously writing H264 to a file, and what is the syntax ? I am sure it must be simple but I seem to have tried all of the permutations without success. I can write to a file OR I can output to a port but I cannot do both.
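    Jaffree's builder mirrors ffmpeg's multiple-output syntax by accepting more than one addOutput(...) call, with per-output options attached to each UrlOutput rather than to the shared builder. A minimal sketch of the change (the UDP URL, MJPEG codec choice, and quality value are assumptions, not tested against these cameras):

    ```java
    // Sketch: keep the existing H264 file output unchanged, and append a
    // second, independently encoded MJPEG output for live display.
    .addOutput(
        UrlOutput.toUrl(filePathL))                        // H264 file, as before
    .addOutput(
        UrlOutput.toUrl("udp://127.0.0.1:1234?pkt_size=188")
            .setFormat("mpegts")                           // container for the UDP stream
            .addArguments("-c:v", "mjpeg")                 // re-encode this output as MJPEG
            .addArguments("-q:v", "5"))                    // MJPEG quality (lower = better)
    ```

    This corresponds to the terminal form `ffmpeg -i ... out.mp4 -c:v mjpeg -f mpegts udp://...`: options placed on an output apply only to that output, so the file keeps its H264 settings while the port receives MJPEG.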

    


  • ffmpeg only works with 2 instances

    31 January 2023, by Alex

    I have 3 USB web cameras. I'm using Windows 7.

    


    And I need to create 3 video streams:

    


    ffmpeg -f dshow -i video="Full HD webcam" -vcodec libx264 -tune zerolatency -threads 0 -b 900k -f mpegts udp://localhost:1234

    


    ffmpeg -f dshow -i video="@device_pnp_~?\usb#vid_1908&pid_2311&mi_00#8&134fde2a&0&0000#65e8773d-8f56-11d0-a3b9-00a0c9223196\global" -vcodec libx264 -tune zerolatency -threads 0 -b 900k -f mpegts udp://localhost:1233

    


    ffmpeg -f dshow -i video="@device_pnp_~?\usb#vid_1908&pid_2311&mi_00#8&962d85&0&0000#65e8773d-8f56-11d0-a3b9-00a0c9223196\global" -vcodec libx264 -tune zerolatency -threads 0 -b 900k -f mpegts udp://localhost:1232

    


    But only 2 of the video streams work at a time. When I try to create the third video stream, I get an error.

    



    


    Can anybody help me, please?

    


    I tried to create 3 video streams, but could only run 2 instances of the ffmpeg process.
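    When exactly two cameras work and the third fails, the bottleneck is often USB controller bandwidth rather than ffmpeg itself. If the cameras can deliver MJPEG instead of raw video, requesting it cuts the per-camera bandwidth considerably. A sketch of how to check and use this (device name copied from the question; whether these cameras offer MJPEG is an assumption):

    ```shell
    # List the pixel formats, resolutions and frame rates each camera offers;
    # look for an mjpeg entry in the output.
    ffmpeg -f dshow -list_options true -i video="Full HD webcam"

    # If MJPEG is offered, request it from the device (-vcodec before -i
    # selects the capture codec, not the output codec).
    ffmpeg -f dshow -vcodec mjpeg -i video="Full HD webcam" \
           -c:v libx264 -tune zerolatency -b:v 900k -f mpegts udp://localhost:1234
    ```

    Spreading the cameras across different USB host controllers (not just different ports on the same hub) is the other common fix for this symptom.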