Advanced search

Media (1)

Word: - Tags -/censure

Other articles (91)

  • What is an editorial

    21 June 2013, by

    Write your point of view in an article. It will be filed in a section set aside for this purpose.
    An editorial is a text-only article. Its purpose is to collect points of view in a dedicated section. A single editorial is featured on the home page. To read previous ones, browse the dedicated section.
    You can customize the editorial creation form.
    Editorial creation form In the case of a document of the editorial type, the (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, MediaSPIP init automatically puts a preconfiguration in place so that the new feature works right away. No configuration step is therefore required.

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

On other sites (12126)

  • Streaming raw sound with FFMPEG

    30 June 2020, by Dr_Click

    I'm using ffmpeg to decode / encode audio in Java, driving it through Process objects. The aim is to get the raw data decoded by ffmpeg into my Java code and then send it back to ffmpeg to generate a stream or create a file.

    Since Java Sound only accepts WAV audio data, I made a first try with this command line:

    ffmpeg -i http://199.180.75.118:80 -acodec pcm_s16le -ar 44100 -ac 2 "/home/dr_click/montest.wav"

    It works. But my goal is to get the frames on the fly, not a file. I thought about using a pipe, but it seems that only works on Linux, not on Windows (and I have no idea for macOS).
So I'd prefer to stream a WAV file locally and catch it in Java with an AudioInputStream.
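The catching side of that idea can be sketched without any pipe or RTP at all: if ffmpeg is started with `-f wav pipe:1`, its stdout carries an ordinary WAV stream that an AudioInputStream can wrap directly, on any OS. A minimal sketch (the ffmpeg invocation in the comment is an assumption, not tested here):

```java
// Minimal sketch: wrap any InputStream carrying WAV bytes in an
// AudioInputStream. Fed from a Process's stdout, this avoids named
// pipes entirely and works the same on Linux, Windows and macOS.
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import java.io.BufferedInputStream;
import java.io.InputStream;

public class WavCatcher {
    /** AudioSystem needs mark/reset support, hence the BufferedInputStream. */
    public static AudioInputStream wrap(InputStream in) throws Exception {
        return AudioSystem.getAudioInputStream(new BufferedInputStream(in));
    }

    // Hypothetical wiring to ffmpeg (assumed, not verified here):
    // Process p = new ProcessBuilder("ffmpeg", "-i", url,
    //         "-acodec", "pcm_s16le", "-ar", "44100", "-ac", "2",
    //         "-f", "wav", "pipe:1").start();
    // AudioInputStream audio = wrap(p.getInputStream());
}
```

The same wrap() call would accept any other byte stream carrying WAV data, such as a local socket.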

    



    I wrote this command:

    ffmpeg -re -i http://199.180.75.118:80 -acodec pcm_s16le -ar 44100 -ac 2 -f rtp rtp://127.0.0.1:1234

    It seems to work on the sending side. But when I enter the command line below on the receiving side:

    ffplay rtp://127.0.0.1:1234

    It doesn't work at all. I get this error:

    [rtp @ 0x7f29c8000b80] Unable to receive RTP payload type 97 without an SDP file describing it
    If I try the pair:

    ffmpeg -i http://199.180.75.118:80 -acodec pcm_s16le -ar 44100 -ac 2 -f rtp -sdp_file /home/dr_click/audio.sdp rtp://127.0.1.1:1234

    and

    ffplay /home/dr_click/audio.sdp

    I get the following error:

    [rtp @ 0x7f7d00008040] Protocol 'rtp' not on whitelist 'file,crypto'!

    And if I finally try:

    ffmpeg -protocol_whitelist file,http,rtp,tcp -i http://199.180.75.118:80 -acodec pcm_s16le -ar 44100 -ac 2 -f rtp -sdp_file /home/dr_click/audio.sdp rtp://127.0.1.1:1234

    I still get the same error.

    What am I missing to stream some raw data locally and catch it back in an AudioInputStream?

    Thank you for all your replies.
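One hedged observation on that last error: the 'file,crypto' whitelist is reported by ffplay while opening the local .sdp file, so the -protocol_whitelist option likely has to be passed to ffplay (the reader) as well, naming every protocol the SDP pulls in. A sketch of what that invocation might look like:

```shell
# Assumption: ffplay also needs the whitelist, because the protocols
# referenced inside audio.sdp (rtp carried over udp) are not in its
# default 'file,crypto' whitelist when it opens a local file.
ffplay -protocol_whitelist file,rtp,udp /home/dr_click/audio.sdp
```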

    


  • Concatenate mp4 files in Android using halfninja ffmpeg

    21 February 2015, by Budius

    I've managed to compile the halfninja ffmpeg scripts for the Android NDK using NDK version r5c (unfortunately, every attempt to compile with an earlier NDK generated errors). I'm also not very knowledgeable about the whole NDK process, so it's a bit hit-and-miss for me.

    His scripts compile ffmpeg version N-30996-gf925b24 (the specific commit he wrote the scripts for).

    Moving on to my actual app: I've managed to trim videos without problems. Now I need to join/concatenate them, but every attempt at the commands found in those three links (link1, link2, link3), in any of several combinations, produces errors such as cat is not valid, > is undefined, or unknown option filter_complex, or tries to overwrite some of the input files.

    Does anyone know whether it's possible (and how) to join/concatenate mp4 videos (all with the same codec, size, quality, etc.) using the halfninja build of ffmpeg on Android, or how to compile/get an ffmpeg for Android from the latest sources?

    I also gave mp4Parser a quick try, without much success.

    Ultimately, I was trying to get this pseudo-method to work:

    public static File concatenate(String[] inputPaths, String outputPath){

       // ... do stuff to generate ffmpeg commands ...
       VideoKit v = new VideoKit();
       v.run(cmds);

       File f = new File(outputPath);
       return f;
    }
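For reference, later ffmpeg versions ship a concat demuxer that joins same-codec mp4 inputs without re-encoding; the N-30996 build above predates it, which would explain the unknown-option errors. A hypothetical sketch of generating its list file and argument array (the VideoKit-style invocation and all paths are assumptions, not taken from a working setup):

```java
// Hypothetical sketch of ffmpeg's concat demuxer invocation (available
// in newer builds only). Writes the demuxer's list file and returns the
// argument array that a wrapper such as VideoKit.run() could be handed.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class ConcatHelper {
    public static String[] concatCommand(List<String> inputPaths, String outputPath)
            throws IOException {
        Path list = Files.createTempFile("concat", ".txt");
        StringBuilder sb = new StringBuilder();
        for (String p : inputPaths) {
            // Single quotes protect paths containing spaces.
            sb.append("file '").append(p).append("'\n");
        }
        Files.write(list, sb.toString().getBytes());
        // -c copy avoids re-encoding; it requires identical codecs/sizes,
        // which matches the "all same codec, size, quality" case above.
        return new String[] {
            "ffmpeg", "-f", "concat", "-safe", "0",
            "-i", list.toString(), "-c", "copy", outputPath
        };
    }
}
```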
  • Unable to find audio stream

    18 November 2011, by Kurt

    I'm trying to adapt some code to add a sound-capture feature (on a live stream) with the help of ffmpeg and DirectShow.

    When I play the AviSynth file with ffplay, everything works perfectly: I get both the audio and the video. But when I open this input file from code, I only find the video stream.

    The AVS file:

    V = DirectShowSource("Decklink_HDMI.grf", fps=10, framecount=1000000000, seek=false, audio=false)
    A = DirectShowSource("Decklink_Audio.grf", fps=1, framecount=1000000000, video=false)
    AudioDub(V, A)

    The opening code:

    ffmpeg::AVInputFormat * ifmt;

    ifmt = ffmpeg::av_find_input_format("avs");

    // Open input file
    if(ffmpeg::avformat_open_input(&pFormatCtx, filename.toStdString().c_str(), ifmt, NULL) != 0)

    When I inspect the variables in gdb just after opening, pFormatCtx->nb_streams is 1, and the only stream I can find in pFormatCtx->streams is a video one. That's why I'm not able to capture the sound.

    So I'm asking whether I'm doing something wrong when opening the file or whether I'm missing something.

    PS: I know the code shown is short, but the problem appears at this very step!

    Thank you

    Kurt

    — EDIT —

    I've also noticed that when I dump the AVFormatContext the video stream got a 456x32 size.
    And i can find this very same size of the window displayed when i try to launch a corrupted script with ffplay.

    The original video format when i play the correct script with ffplay is of 1920x1080

    I think my problem is maybe deeper than the simple fact of not being able to get the audio stream.

    I'm trying to find out how to know the error message that is displayed on this 456x32 windows

    — EDIT2 —

    I found out what is written in that image and my problem is solved: a badly placed avs script; an old one was in its place.

    I'm ashamed.