
Other articles (50)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Farm deployment option

    12 April 2011, by

    MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
    This makes it possible, for example: to share set-up costs between several projects/individuals; to deploy a multitude of unique sites quickly; to avoid having to dump every creation into a digital catch-all, as is the case on the big general-public platforms scattered across the (...)

  • HTML5 audio and video support

    13 avril 2011, par

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (11324)

  • Revision 3698: try to grab the postal code only if the accuracy > 5 ...

    9 July 2010, by b_b — Log

    try to grab the postal code only if the accuracy > 5 http://groups.google.com.jm/group/google-maps-api/msg/88b138b5458e3124

  • How can I run command-line FFmpeg and accept multiple pipes (video and audio) without blocking on the first input?

    18 February 2016, by Version135b

    I’m trying to mux h264 and aac created with MediaCodec using FFmpeg, and also use FFmpeg’s RTMP support to send to YouTube. I’ve created two pipes, and am writing from Java (Android) through WritableByteChannels. I can send to one pipe just fine (accepting null audio) like this:

    ./ffmpeg -f lavfi -i aevalsrc=0 -i "files/camera-test.h264" -acodec aac -vcodec copy -bufsize 512k -f flv "rtmp://a.rtmp.youtube.com/live2/XXXX"

    YouTube streaming works perfectly (but I have no audio). Using two pipes, this is my command:

    ./ffmpeg \
    -i "files/camera-test.h264" \
    -i "files/audio-test.aac" \
    -vcodec copy \
    -acodec copy \
    -map 0:v:0 -map 1:a:0 \
    -f flv "rtmp://a.rtmp.youtube.com/live2/XXXX"

    The pipes are created with mkfifo, and opened from Java like this:

    pipeWriterVideo = Channels.newChannel(new FileOutputStream(outputFileVideo.toString()));

    The order of execution (for now, in my test phase) is: create the files, start ffmpeg (through adb shell), then start recording, which opens the channels. ffmpeg immediately opens the h264 stream and then waits; since it is reading from that pipe, the first channel open (for video) succeeds. When I try to open the audio pipe the same way, it fails, because ffmpeg has not actually started reading from it. I can open a second terminal window and cat the audio file, and my app spits out what I hope is encoded AAC, but ffmpeg still fails, usually just sitting there waiting. Here is the verbose output:

    ffmpeg version N-78385-g855d9d2 Copyright (c) 2000-2016 the FFmpeg
    developers
     built with gcc 4.8 (GCC)
     configuration: --prefix=/home/dev/svn/android-ffmpeg-with-rtmp/src/ffmpeg/android/arm
       --enable-shared --disable-static --disable-doc --disable-ffplay
       --disable-ffprobe --disable-ffserver --disable-symver
       --cross-prefix=/home/dev/dev/android-ndk-r10e/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64/bin/arm-linux-androideabi-
       --target-os=linux --arch=arm --enable-cross-compile
       --enable-librtmp --enable-pic --enable-decoder=h264
       --sysroot=/home/dev/dev/android-ndk-r10e/platforms/android-19/arch-arm
       --extra-cflags='-Os -fpic -marm'
       --extra-ldflags='-L/home/dev/svn/android-ffmpeg-with-rtmp/src/openssl-android/libs/armeabi '
       --extra-ldexeflags=-pie --pkg-config=/usr/bin/pkg-config
     libavutil      55. 17.100 / 55. 17.100
     libavcodec     57. 24.102 / 57. 24.102
     libavformat    57. 25.100 / 57. 25.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 31.100 /  6. 31.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
    matched as AVOption 'debug' with argument 'verbose'.
    Trailing options were found on the commandline.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option async (audio sync method) with argument 1.
    Successfully parsed a group of options.
    Parsing a group of options: input file files/camera-test.h264.
    Successfully parsed a group of options.
    Opening an input file: files/camera-test.h264.
    [file @ 0xb503b100] Setting default whitelist 'file'

    I think if I could just get ffmpeg to start listening to both pipes, the rest would work out!

    Thanks for your time.

    EDIT:
    I’ve made progress by decoupling the audio pipe connection from the encoding, but now, as soon as the video stream has been consumed, it errors out on the audio. I started a separate thread to create the WritableByteChannel for audio, and it never gets past the FileOutputStream creation.
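
    That hang matches standard FIFO semantics: opening a FIFO for writing blocks until some process opens it for reading, so the FileOutputStream constructor on the audio pipe cannot return until ffmpeg (or anything else) opens the read end. A minimal shell sketch of the behaviour, assuming a POSIX system (the temp path here is illustrative, not from the project):

```shell
fifo=$(mktemp -u)        # illustrative temporary path; nothing created yet
mkfifo "$fifo"

# A background writer: its open() blocks until a reader appears.
( echo hello > "$fifo" ) &
writer=$!

sleep 0.2
# The writer is still alive, stuck inside open(), because nobody reads yet.
kill -0 "$writer" && writer_blocked=yes

# Opening the read end (same effect as cat-ing the pipe from a second
# terminal) immediately unblocks the writer.
msg=$(cat "$fifo")

wait "$writer"
rm "$fifo"
echo "$writer_blocked $msg"
```

    The same applies per pipe: each FIFO needs its own reader before the corresponding write-side open can return.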

    matched as AVOption 'debug' with argument 'verbose'.
    Trailing options were found on the commandline.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Successfully parsed a group of options.
    Parsing a group of options: input file files/camera-test.h264.
    Successfully parsed a group of options.
    Opening an input file: files/camera-test.h264.
    [file @ 0xb503b100] Setting default whitelist 'file'
    [h264 @ 0xb503c400] Format h264 probed with size=2048 and score=51
    [h264 @ 0xb503c400] Before avformat_find_stream_info() pos: 0 bytes read:15719 seeks:0
    [h264 @ 0xb5027400] Current profile doesn't provide more RBSP data in PPS, skipping
    [h264 @ 0xb503c400] max_analyze_duration 5000000 reached at 5000000 microseconds st:0
    [h264 @ 0xb503c400] After avformat_find_stream_info() pos: 545242 bytes read:546928 seeks:0 frames:127
    Input #0, h264, from 'files/camera-test.h264':
     Duration: N/A, bitrate: N/A
       Stream #0:0, 127, 1/1200000: Video: h264 (Baseline), 1 reference frame, yuv420p(left), 854x480 (864x480), 1/50, 25 fps, 25 tbr, 1200k tbn, 50 tbc
    Successfully opened the file.
    Parsing a group of options: input file files/audio-test.aac.
    Applying option vcodec (force video codec ('copy' to copy stream)) with argument copy.
    Successfully parsed a group of options.
    Opening an input file: files/audio-test.aac.
    Unknown decoder 'copy'
    [AVIOContext @ 0xb5054020] Statistics: 546928 bytes read, 0 seeks
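
    One detail in the log above is independent of the pipe problem: "Applying option vcodec ... with argument copy" appears while parsing the options for files/audio-test.aac, and "Unknown decoder 'copy'" is what ffmpeg prints when -vcodec copy sits before an -i and is therefore treated as a decoder choice for that input. ffmpeg options are positional: options placed before an -i apply to that input, and output options must come after the last input. A sketch of the ordering, assuming the run that produced this log had the codec flags before the second -i:

```shell
# Inputs first; all output options (codec copy, mapping, muxer) after them.
./ffmpeg \
  -i "files/camera-test.h264" \
  -i "files/audio-test.aac" \
  -map 0:v:0 -map 1:a:0 \
  -vcodec copy -acodec copy \
  -f flv "rtmp://a.rtmp.youtube.com/live2/XXXX"
```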

    Here is where I attempt to open the audio pipe.

    new Thread() {
        public void run() {
            Log.d("Audio", "pre thread");
            FileOutputStream fs = null;
            try {
                fs = new FileOutputStream("/data/data/android.com.android.grafika/files/audio-test.aac");
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            }
            Log.d("Audio", "made fileoutputstream");  // never hits here
            mVideoEncoder.pipeWriterAudio = Channels.newChannel(fs);
            Log.d("Audio", "made it past opening audio pipe");
        }
    }.start();

    Thanks.