Advanced search

Media (0)

Word: - Tags -/diogene

No media matching your criteria is available on the site.

Other articles (111)

  • Frequent problems

    10 March 2010, by

    PHP with safe_mode enabled
    One of the main sources of problems comes from the PHP configuration, in particular when safe_mode is enabled.
    The solution would be either to disable safe_mode or to place the script in a directory that Apache can access for the site (a minimal check/fix is sketched below).
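
    A minimal, untested sketch of checking and disabling safe_mode from the shell, assuming a Debian/Ubuntu-style PHP 5 install (the php.ini path differs on other systems):

    # check whether safe_mode is currently enabled
    php -i | grep safe_mode

    # in /etc/php5/apache2/php.ini set:
    #   safe_mode = Off
    # then reload Apache so the change takes effect
    sudo /etc/init.d/apache2 reload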

  • Prerequisites for installation

    31 January 2010, by

    Preamble
    This article does not aim to detail the installation of these programs, but rather to give information about their specific configuration.
    First of all, SPIPMotion, like MediaSPIP, is designed to run on Debian-type Linux distributions or their derivatives (Ubuntu...). The documentation on this site therefore refers to these distributions. It can also be used on other Linux distributions, but no guarantee of correct operation can be given.
    It (...)

  • Possible deployments

    31 January 2010, by

    Two types of deployment can be envisaged, depending on two aspects: the planned installation method (standalone or as a farm); and the expected number of daily encodings and the expected traffic.
    Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), and all of this must be taken into consideration. This system is therefore only possible on one or more dedicated servers.
    Single-server version
    The single-server version consists of using only one (...)

On other sites (8733)

  • ffmpeg command line for capturing audio and video in 720p from decklink card using Windows 7

    29 October 2013, by Huron

    I am trying to capture audio and video from a Blackmagic DeckLink capture card on Windows 7 at 720p, but I can't seem to get the ffmpeg command-line settings right.

    ffmpeg -list_devices true -f dshow -i dummy

    [dshow @ 02457a60] DirectShow video devices
    [dshow @ 02457a60]  "Blackmagic WDM Capture"
    [dshow @ 02457a60]  "Decklink Video Capture"
    [dshow @ 02457a60] DirectShow audio devices
    [dshow @ 02457a60]  "Decklink Audio Capture"

    ffmpeg -list_options true -f dshow -i video="Decklink Video Capture"

    [dshow @ 03c2ea20] DirectShow video device options
    [dshow @ 03c2ea20]  Pin "Capture"
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x486 fps=29.97 max s=720x486 fps=29.97
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x486 fps=23.976 max s=720x486 fps=23.976
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x576 fps=25 max s=720x576 fps=25
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x486 fps=59.9402 max s=720x486 fps=59.9402
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x576 fps=50 max s=720x576 fps=50
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=23.976 max s=1920x1080 fps=23.976
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=24 max s=1920x1080 fps=24
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=25 max s=1920x1080 fps=25
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=29.97 max s=1920x1080 fps=29.97
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=30 max s=1920x1080 fps=30
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1280x720 fps=50 max s=1280x720 fps=50
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1280x720 fps=59.9402 max s=1280x720 fps=59.9402
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1280x720 fps=60.0002 max s=1280x720 fps=60.0002

    ffmpeg -list_options true -f dshow -i audio="Decklink Audio Capture"

    [dshow @ 047fea20] DirectShow audio device options
    [dshow @ 047fea20]  Pin "Capture"
    [dshow @ 047fea20]   min ch=1 bits=16 rate= 48000 max ch=1 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=2 bits=16 rate= 48000 max ch=2 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=4 bits=16 rate= 48000 max ch=4 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=6 bits=16 rate= 48000 max ch=6 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=8 bits=16 rate= 48000 max ch=8 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=10 bits=16 rate= 48000 max ch=10 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=12 bits=16 rate= 48000 max ch=12 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=16 bits=16 rate= 48000 max ch=16 bits=16 rate= 48000

    This is the stream information for my current video/audio source, connected to the DeckLink card's HDMI port:

    Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422(tv), 1280x720, 59.94 tbr, 10000k tbn, 59.94 tbc
    Stream #0:1: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s

    Ultimately I will need to display this video on screen and, at the same time, be able to start and stop recording while the preview keeps playing.

    My idea was to use ffmpeg to capture the video and audio signal and transmit it to a network stream (e.g. 127.0.0.1:6666), then use VLC to display the stream (the preview), and ultimately start and/or stop another ffmpeg to save that same stream to disk.

    In my mind this works, but I am no audio/video expert so if anyone with more experience could help out, I would appreciate it.
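
    A minimal, untested sketch of the recording side of that plan, assuming an MPEG-TS stream is already being sent to udp://127.0.0.1:6666 (as attempted later in this post); a second ffmpeg can remux the incoming stream to disk without re-encoding:

    ffmpeg -i udp://127.0.0.1:6666 -c copy recording.ts

    Note that only one process can listen on a unicast UDP port at a time, so for VLC to preview while ffmpeg records, the stream would either have to go to a multicast address, or the capturing ffmpeg would have to write both outputs itself (for example with its tee muxer).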

    Update:

    I have been able to display the video using ffplay, with the following command:

    ffplay -f dshow -video_size 1280x720 -rtbufsize 702000k -framerate 60 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -threads 2

    The next step is streaming it so I can view the stream (the preview) with VLC.

    I tried to use this command:

    ffmpeg -f dshow -video_size 1280x720 -rtbufsize 702000k -framerate 60 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -threads 2 -f mpegts rtp://127.0.0.1:6666?pkt_size=188?buffer_size=65535

    This does not give any errors, so it appears to work. But when I try to open the stream in VLC I get the following error:

    SDP required:
    A description in SDP format is required to receive the RTP stream. Note that rtp:// URIs cannot work with dynamic RTP payload format (65).

    After a bit of reading, it seems I should not be streaming to rtp:// but rather to udp://.

    The command became:

    ffmpeg -f dshow -video_size 1280x720 -rtbufsize 702000k -framerate 60 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -threads 2 -f mpegts udp://127.0.0.1:6666?pkt_size=188?buffer_size=65535

    And when I try to open it now in VLC I get no error, no warning, but also no video.

    Time for some more reading.
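
    As an untested sketch of what usually fixes this kind of "no error, but no video" situation with MPEG-TS over UDP: send full TS-sized datagrams (pkt_size=1316, i.e. seven 188-byte TS packets per datagram), join multiple URL options with & rather than a second ?, and give VLC the @ form of the UDP URL so it binds and listens on that port instead of treating it as a remote address:

    ffmpeg -f dshow -video_size 1280x720 -rtbufsize 702000k -framerate 60 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -threads 2 -f mpegts "udp://127.0.0.1:6666?pkt_size=1316&buffer_size=65535"

    vlc udp://@127.0.0.1:6666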

  • lavc: do not implicitly share the frame pool between threads

    17 January 2017, by Anton Khirnov
    lavc: do not implicitly share the frame pool between threads
    

    Currently the frame pool used by the default get_buffer2()
    implementation is a single struct, allocated when opening the decoder.
    A pointer to it is simply copied to each frame thread and we assume that
    no thread attempts to modify it at an unexpected time. This is rather
    fragile and potentially dangerous.

    With this commit, the frame pool is made refcounted, with the reference
    being propagated across threads along with other context variables. The
    frame pool is now also immutable - when the stream parameters change we
    drop the old reference and create a new one.

    • [DH] libavcodec/decode.c
    • [DH] libavcodec/internal.h
    • [DH] libavcodec/pthread_frame.c
    • [DH] libavcodec/utils.c
  • Command of FFMPEG to make a video from Image(JPEG) + Audio(.mp3) & Share video in Whatsapp

    22 November 2018, by Neel Mevada

    I am trying to create an .mp4 video file from .mp3 audio and a .jpeg image.

    I am able to make the video and play it in video players on Android devices.

    But after the file is created, when I try to share that video in WhatsApp, it shows the message "The file format not supported".

    I am using the FFmpeg command below:

    "-loop 1 -r 1 -i " + imageFilePath + " -i " + audioFilePath + " -c:v libx264 -crf 27 -tune stillimage -c:a copy -pix_fmt yuv420p -preset ultrafast -shortest " + pathOutputVideo(sectionName);

    And for sharing the video, I am using the code below:

     MediaScannerConnection.scanFile(ShareQuestionAudioActivity.this, new String[]{FfmpegController.pathOutputVideo(qModel.getSectionName().toUpperCase().replaceAll(" ", "_"))},
                   null, new MediaScannerConnection.OnScanCompletedListener() {
                       public void onScanCompleted(String path, Uri uri) {
                           Intent shareIntent = new Intent(android.content.Intent.ACTION_SEND);
                           shareIntent.setType("video/*");
                           shareIntent.putExtra(Intent.EXTRA_STREAM, Uri.parse(FfmpegController.pathOutputVideo(qModel.getSectionName().toUpperCase().replaceAll(" ", "_"))));
                           startActivity(Intent.createChooser(shareIntent, "Share Question"));
                       }
                   });

    From this link I found that I need to use H.264 + AAC, but I am still not able to share the video in a supported file format.
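
    A minimal, untested sketch of the same command string with the audio re-encoded to AAC instead of copied (WhatsApp generally expects an MP4 with H.264 video and AAC audio, while -c:a copy keeps the source MP3 track), plus -movflags +faststart to move the MP4 index to the front of the file:

    "-loop 1 -r 1 -i " + imageFilePath + " -i " + audioFilePath + " -c:v libx264 -crf 27 -tune stillimage -c:a aac -b:a 128k -pix_fmt yuv420p -preset ultrafast -movflags +faststart -shortest " + pathOutputVideo(sectionName);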