
Other articles (93)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • APPENDIX: Plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several plugins in addition to those of the channels in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a mutualisation instance at user sign-up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (2733)

  • While loop in bash to read a file skips first 2 characters of THIRD Line

    9 July 2018, by Yaser Sakkaf
    #!/bin/bash
    INPUT_DIR="$1"
    INPUT_VIDEO="$2"
    OUTPUT_PATH="$3"
    SOURCE="$4"
    DATE="$5"

    INPUT="$INPUT_DIR/sorted_result.txt"
    COUNT=1
    initial=00:00:00
    while IFS= read -r line; do
     OUT_DIR=$OUTPUT_PATH/$COUNT
     mkdir "$OUT_DIR"
     ffmpeg -nostdin -i $INPUT_VIDEO -vcodec h264 -vf fps=25 -ss $initial -to $line $OUT_DIR/$COUNT.avi
     ffmpeg -i $OUT_DIR/$COUNT.avi -acodec pcm_s16le -ar 16000 -ac 1 $OUT_DIR/$COUNT.wav
     python3.6 /home/Video_Audio_Chunks_1.py $OUT_DIR/$COUNT.wav
     python /home/transcribe.py  --decoder beam --cuda --source $SOURCE --date $DATE --video $OUT_DIR/$COUNT.avi --out_dir "$OUT_DIR"
     COUNT=$((COUNT + 1))
     echo "--------------------------------------------------"
     echo $initial
     echo $line
     echo "--------------------------------------------------"
     initial=$line
    done < "$INPUT"

    This is the code I am working on.
    The contents of the file sorted_result.txt are as follows:

    00:6:59
    00:7:55
    00:8:39
    00:19:17
    00:27:48
    00:43:27

    While reading the file, the loop skips the first two characters of the third line, i.e. it reads it as :8:39, which causes an ffmpeg error and the script stops.

    However, when I only print the variables $initial and $line, commenting out the ffmpeg commands, the values are printed correctly, i.e. the same as the file contents.

    I think the ffmpeg command is somehow affecting the file reading process or the variable values, but I can't understand how.

    Please help.
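For what it's worth, a frequent cause of this exact symptom is that a command inside a while-read loop inherits the loop's stdin, so an stdin-reading command such as ffmpeg can swallow bytes from the file being iterated (note that the first ffmpeg call above passes -nostdin but the second does not). A minimal sketch, using a hypothetical /tmp/times.txt rather than the asker's files, of keeping the loop's input intact by redirecting the inner command's stdin:

```shell
#!/bin/bash
# Sketch: commands inside a while-read loop share the loop's stdin.
# Redirecting the inner command's input from /dev/null (or, for ffmpeg,
# passing -nostdin) prevents it from consuming lines meant for `read`.
printf '00:6:59\n00:7:55\n00:8:39\n' > /tmp/times.txt
while IFS= read -r line; do
  # stand-in for ffmpeg: a command that would otherwise read from stdin
  cat </dev/null >/dev/null
  echo "$line"
done < /tmp/times.txt
```

With the redirection in place, every line of the file reaches `read` unmodified.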

  • Trying to convert mp4 to webm with beamcoder javascript

    8 February 2023, by DumbDev

    I am trying to convert a video from mp4 to webm with beamcoder js (Aerostat Beam Coder). The output video has no audio, and its duration is far longer than the input: a 19 s input produces about 5 minutes of video. I still don't understand what's wrong with my configuration.

    


    Goal is to convert mp4 videos into webm for now.

    


    Thanks for the help in advance.

    


    Here's my code.

    


    function mp4towebm(inputFile, outputFile) {
    return new Promise(async (res, rej) => {
        log("Demuxing Input");

        let dex = await beamcoder.demuxer(inputFile);
        // Dex Data
        writeFileJson(`${TEST_DIR_PATH}IN_dex_data.json`, dex);

        // Extract Video parameters (Header Info)
        // let VParams = dex.streams.find((x) => x.codecpar.codec_type === "video").codecpar;
        // writeFileJson(`${TEST_DIR_PATH}IN_dex_codex_params.json`, VParams);

        // Read First packet from demuxer
        let VPckt = {};
        let VPcts = [];
        log("Reading Video Packets");
        // Check for all the packets until there are none.
        while (VPckt !== null) {
            VPckt = await dex.read();
            if (VPckt && dex.streams[VPckt.stream_index].codecpar.codec_type === "video") {
                VPcts.push(VPckt);
            }
        }
        log("Reading audio packets");
        let APckt = {};
        let APckts = [];
        while (APckt !== null) {
            APckt = await dex.read();
            if (APckt && dex.streams[APckt.stream_index].codecpar.codec_type === "audio") {
                APckts.push(VPckt);
            }
        }
        log("Packet Reading Done");
        // log(JSON.stringify(videoPcts[0]));

        // Initialize a decoder
        log("Creating Decoder");
        let VDec = beamcoder.decoder({
            demuxer: dex,
            stream_index: 0,
        });
        let ADec = beamcoder.decoder({
            demuxer: dex,
            stream_index: 1,
        });
        log("Decoding Video Packets");
        let VDecRslt = await VDec.decode(VPcts);
        log("Decoding Audio Packets");
        let ADecRslt = await ADec.decode(APckts);

        log("Flushing Decoder Result");
        let VDecFR = await VDec.flush();
        let ADevFR = await ADec.flush();
        writeFileJson(`${TEST_DIR_PATH}IN_dec_video_output.json`, VDec);
        // writeFileJson(`${TEST_DIR_PATH}IN_dec_audio_output.json`, ADec);
        // log(decRslt.frames.length);
        log("Creating  Video Encoder");
        let VEnc = beamcoder.encoder({
            name: "libvpx-vp9",
            width: VDec.width,
            height: VDec.height,
            pix_fmt: VDec.pix_fmt,
            time_base: VDec.time_base,
            bit_rate: 512000,
            // bits_per_coded_sample: VDec.bits_per_coded_sample,
            // bits_per_raw_sample: VDec.bits_per_raw_sample,
            level: VDec.level,
            framerate: VDec.framerate,
        });
        log("Creating  Audio Encoder");

        let AEnc = beamcoder.encoder({
            name: "libopus",
            channel_layout: ADec.channel_layout,
            sample_rate: ADec.sample_rate,
            channels: ADec.channels,
            sample_fmt: "flt",
            bit_rate: 512000,
        });

        log("Encoding Video Frames");
        let VEncRslt = await VEnc.encode(VDecRslt.frames);
        log("Encoding Audio Frames");
        let AEncRslt = await AEnc.encode(ADecRslt.frames);
        log("Flushing Video  Encoder Result");
        let VEncFR = await VEnc.flush();
        log("Flushing Audio  Encoder Result");
        let AEncFR = await AEnc.flush();

        writeFileJson(`${TEST_DIR_PATH}OUT_enc_video_output.json`, VEnc);
        // writeFileJson(`${TEST_DIR_PATH}OUT_dex_codex_params.json`, VParams);

        log("Creating Muxer");
        let mux = beamcoder.muxer({
            filename: outputFile,
            duration: dex.duration,
            start_time: dex.start_time,
            packet_size: VEncRslt.packets[0].size,
            // width: dex.width,
            // height: dex.height,
            // interleaved: true,
        });
        await mux.openIO();
        log("Creating Video Stream");
        let OVStream = mux.newStream({
            name: "libvpx-vp9",
            duration: dex.duration,
            time_base: VEnc.time_base,
            avg_frame_rate: VDec.framerate,
            // pix_fmt: VEnc.pix_fmt,
            // bit_rate: VEnc.bit_rate,
            codecpar: VEnc.extractParams(),
            // duration: dex.duration,
            // framerate: VDec.framerate,
        });
        log("Creating Audio Stream");

        let OAStream = mux.newStream({
            name: "libopus",
            duration: dex.duration,
            channel_layout: ADec.channel_layout,
            sample_rate: ADec.sample_rate,
            channels: ADec.channels,
            sample_fmt: "flt",
            // bit_rate: ADec.bit_rate,
            // time_base: AEnc.time_base,

            codecpar: AEnc.extractParams(),
            // duration: dex.duration,
            // framerate: VDec.framerate,
        });
        log("Writing Header");
        await mux.writeHeader();

        log("Writing Packets");
        // // let OPacket = {};
        // // while(OPacket != null){
        // //     OPacket = VEnc.
        // // }
        for (let VEncPckt of VEncRslt.packets) {
            await mux.writeFrame({ packet: VEncPckt });
        }
        for (let AEncPckt of AEncRslt.packets) {
            await mux.writeFrame({ packet: AEncPckt });
        }
        // log("Adding Audio Stream From Dex");
        // let OAStream = mux.newStream(dex.streams[1]);

        writeFileJson(`${TEST_DIR_PATH}OUT_mux_data.json`, mux);

        await mux.writeTrailer();
    });
}


    


  • Revision 30164: Dead code (again)

    24 July 2009, by kent1@… — Log

    Dead code (again)