

Other articles (54)

  • (De)Activating features (plugins)

    18 February 2011, by

    To manage adding and removing extra features (plugins), MediaSPIP relies on SVP as of version 0.2.
    SVP makes it easy to activate plugins from the MediaSPIP configuration area.
    To get there, simply go to the configuration area and open the "Plugin management" page.
    By default, MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...)

  • Enabling visitor registration

    12 April 2011, by

    It is also possible to enable visitor registration, which lets anyone open an account on the channel themselves, for instance in the context of open projects.
    To do so, simply go to the site configuration area and choose the "User management" submenu. The first form shown corresponds to this feature.
    By default, MediaSPIP created during its initialization a menu item in the top menu of the page leading (...)

  • MediaSPIP: Changing the rights to create objects and to publish definitively

    11 November 2010, by

    By default, MediaSPIP allows five types of objects to be created.
    Also by default, the rights to create these objects and to publish them definitively are reserved to administrators, but webmasters can of course configure them.
    These rights are locked down for several reasons: because allowing publication should be a deliberate choice of the webmaster rather than a platform-wide default; and because an account can also serve other purposes, (...)

On other sites (11675)

  • Android JavaCV FFmpeg webstream to local static website

    26 March 2017, by Thomas Devoogdt

    For my integration test I'm working on an application that needs to provide a live stream to a locally hosted website. I've already built a working site that runs on nanohttpd. The application also performs special image processing, for which I use JavaCV. The library works perfectly and all the C++ bindings work too.

    My question: how do I set up a live stream that can be played directly in a static site hosted by nanohttpd? Am I on the right track?

    My code:

    init:

    private void initLiveStream() throws FrameRecorder.Exception {
       /* ~~~ https://github.com/bytedeco/javacv/issues/598 ~~~ */
       frameRecorder = new FFmpegFrameRecorder("http://localhost:9090", imageWidth, imageHeight, 0);
       frameRecorder.setVideoOption("preset", "ultrafast");
       frameRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
       frameRecorder.setAudioCodec(0);
       frameRecorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
       frameRecorder.setFormat("webm");
       frameRecorder.setGopSize(10);
       frameRecorder.setFrameRate(frameRate);
       frameRecorder.setVideoBitrate(5000);
       frameRecorder.setOption("content_type","video/webm");
       frameRecorder.setOption("listen", "1");
       frameRecorder.start();
    }
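
    Not shown above: a recorder opened with start() should eventually be stopped and released. Below is a minimal shutdown sketch using JavaCV's FrameRecorder API; where exactly this hook lives in the app (e.g. onPause or onDestroy) is an assumption:

    private void releaseLiveStream() {
        if (frameRecorder != null) {
            try {
                frameRecorder.stop();     // flush pending frames and write the trailer
                frameRecorder.release();  // free the native FFmpeg resources
            } catch (FrameRecorder.Exception e) {
                e.printStackTrace();
            } finally {
                frameRecorder = null;
            }
        }
    }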

    In my CameraView:

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        Frame frame = new AndroidFrameConverter().convert(data, size.width, size.height);
        try {
            if (frameRecorder != null) {
                frameRecorder.record(frame);
            }
        } catch (FrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }
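
    As an aside, AndroidFrameConverter is allocated on every callback above; it can instead be held in a field and reused across frames. A sketch of the same handler, assuming the enclosing class owns the camera preview:

    // Reusable converter; avoids a fresh allocation on every preview frame.
    private final AndroidFrameConverter converter = new AndroidFrameConverter();

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        Frame frame = converter.convert(data, size.width, size.height);
        try {
            if (frameRecorder != null) {
                frameRecorder.record(frame);
            }
        } catch (FrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }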

    Here is one of the stack traces that showed up frequently while I was searching for a solution:

    org.bytedeco.javacv.FrameRecorder$Exception: avio_open error() error -111: Could not open 'http://localhost:9090'

    I couldn’t find any other thread addressing this specific issue.

    Thanks in advance

    EDIT

    Thanks to Chester Cobus, here is the code I ended up using:

    WebSocket:

    //Constructor
    AsyncHttpServer serverStream = new AsyncHttpServer();
    List<WebSocket> sockets = new ArrayList<>();

    //http://stackoverflow.com/a/33021907/5500092
    //I'm planning to use more sockets. This is the only uniform expression I found.
    serverStream.websocket("/((?:[^/]*/)*)(.*)", new AsyncHttpServer.WebSocketRequestCallback() {
        @Override
        public void onConnected(final WebSocket webSocket, AsyncHttpServerRequest request) {
            String uri = request.getPath();
            if (uri.equals("/live")) {
                sockets.add(webSocket);

                //Use this to clean up any references to your websocket
                webSocket.setClosedCallback(new CompletedCallback() {
                    @Override
                    public void onCompleted(Exception ex) {
                        try {
                            if (ex != null)
                                Log.e("WebSocket", "Error");
                        } finally {
                            sockets.remove(webSocket);
                        }
                    }
                });
            }
        }
    });

    //Updater (Observer pattern)
    @Override
    public void updated(byte[] data) {
       for (WebSocket socket : sockets) {
            socket.write(new ByteBufferList(data));
       }
    }
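    One detail the snippet above does not show: the AndroidAsync server still has to be bound to a port before the browser can connect. That is a single call; the port 8090 here is an assumption, chosen only to match the ws:// URL in the JavaScript below:

    // Bind the WebSocket/HTTP server to the port the page connects to.
    serverStream.listen(8090);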

    Record Activity:

    private long start_time = System.currentTimeMillis();

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
       long now_time = System.currentTimeMillis();
       if ((now_time - start_time) > 250) {
           start_time = now_time;
           //https://forums.xamarin.com/discussion/40991/onpreviewframe-issue-converting-preview-byte-to-android-graphics-bitmap
           Camera.Size size = camera.getParameters().getPreviewSize();
           YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
           ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
           image.compressToJpeg(new Rect(0, 0, size.width, size.height), 60, byteArrayOutputStream);
           MainActivity.getWebStreamer().updated(byteArrayOutputStream.toByteArray());
       }
    }

    JavaScript

    var socket;
    var imageElement;

    /**
    * path - String.Format("ws://{0}:8090/live", Window.Location.HostName)
    * image - HTMLImageElement
    */
    function imageStreamer(path, image) {
       imageElement = image;
       socket = new WebSocket(path);

       socket.onmessage = function(msg) {
           var blob = msg.data; // arrives as a Blob by default; FileReader accepts it directly
           var reader = new FileReader();
           reader.onload = function(e) {
               imageElement.src = e.target.result;
           };
           reader.readAsDataURL(blob);
       };
    }

  • ffmpeg pipe Invalid data found when processing input

    28 March 2021, by Ankit Maheshwari
    Here is my configuration:

    const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
    const spawn = require('child_process').spawn;

    ffmpeg = spawn(ffmpegPath, [
        // Remove this line, as well as '-shortest', if you send audio from the browser.
        // '-f', 'lavfi', '-i', 'anullsrc',

        // FFmpeg will read the input video from STDIN.
        '-i', '-',

        // -re means read input at the native frame rate.
        '-re', '-y',

        // thread_queue_size avoids "Thread message queue blocking; consider raising the
        // thread_queue_size option"; required before each input - this is for image2.
        // '-thread_queue_size', '2048',

        // REF TO OVERLAY:
        // https://stackoverflow.com/questions/10438713/overlay-animated-images-with-transparency-over-a-static-background-image-using-f?rq=1
        // -loop loops the background image input so that we don't just have one frame, crucial!

        // The image file muxer writes video frames to image files, http://underpop.online.fr/f/ffmpeg/help/image2-1.htm.gz
        '-f', 'image2',

        // The -loop option is specific to the image file demuxer and gif muxer, so it can't
        // be used for typical video files, but it can infinitely loop a series of input images.
        '-loop', '1',

        // pattern_type determines the format of the image file names:
        // read images matching the "*.png" glob pattern, i.e. files ending in ".png".
        '-pattern_type', 'glob',

        // '-i', `images/${streamConfigData.youtube_key}/destination/image-*.png`,

        '-i', `images/${streamConfigData.youtube_key}/overlay/abc.png`,

        // '-vf', 'scale=1920x1080:flags=lanczos',

        // -shortest ends encoding when the shortest input ends, which is necessary as
        // looping the background means that input will never end.
        // 'overlay=shortest=1',

        '-filter_complex', '[1:v]format=rgba,colorchannelmixer=aa=1[fg];[0][fg]overlay',

        // Because we're using a generated audio source which never ends, specify that we'll
        // stop at the end of the other input. Remove this line if you send audio from the browser.
        // '-shortest',

        // If we're encoding H.264 in-browser, we can set the video codec to 'copy' so that we
        // don't waste any CPU and quality with unnecessary transcoding. If the browser doesn't
        // support H.264, set the video codec to 'libx264' or similar to transcode to H.264 here
        // on the server.
        // '-vcodec', 'libx264',
        // It is not possible to filter and stream copy the same stream at the same time:
        // https://stackoverflow.com/a/53526514/4057143
        '-vcodec', 'copy',

        // If the browser cannot encode AAC, we must transcode the audio to AAC here on the server.
        // '-acodec', 'aac',

        // Rate control mode for keeping the best quality while caring less about file size.
        // The CRF scale is 0-51: 0 is lossless, 23 is the default and 51 is the worst quality;
        // a lower value generally means higher quality. https://trac.ffmpeg.org/wiki/Encode/H.264
        '-crf', '23',

        // A preset trades encoding speed against compression ratio; a slower preset gives
        // better compression. The default is medium. https://trac.ffmpeg.org/wiki/Encode/H.264
        '-preset', 'ultrafast',

        // -r sets the frame rate. Use the filter when you need to change the framerate
        // before applying further filters.
        // '-r', '30',
        // '-framerate', '30',

        // Debug-level logs.
        '-loglevel', 'debug',
        '-v', 'verbose',

        // -g GOP_LEN_IN_FRAMES sets the keyframe interval. https://superuser.com/a/908325
        '-g', '60',

        // Video timescale, not sure what it is!
        '-video_track_timescale', '1000',

        // A live stream with a more or less constant, limited bit rate, to control the bandwidth used.
        '-b:v', '15000k',
        // '-maxrate', '4000k',
        // '-bufsize', '8000k',

        // FLV is the container format used in conjunction with RTMP.
        '-f', 'flv',

        // The output RTMP URL. For debugging, you could set this to a filename like
        // 'test.flv' and play the resulting file with VLC.
        rtmpUrl
    ], {
        env: {
            NODE_ENV: 'production',
            PATH: process.env.PATH
        }
    });

  • Undefined references while using ffmpeg 2.1.1 for Android

    2 December 2013, by Kernald

    I'm building ffmpeg following the same pattern as halfninja's build: making ffmpeg's main available from Java through JNI. I built ffmpeg with the following options:

    ./configure \
       --prefix=$PREFIX \
       --disable-shared \
       --enable-static \
       --disable-doc \
       --disable-ffmpeg \
       --disable-ffplay \
       --disable-ffprobe \
       --disable-ffserver \
       --disable-doc \
       --disable-symver \
       --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
       --target-os=linux \
       --arch=arm \
       --enable-cross-compile \
       --sysroot=$SYSROOT \
       --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
       --extra-ldflags="$ADDI_LDFLAGS"

    $PREFIX, $TOOLCHAIN, … are set to the corresponding folders of my NDK. $ADDI_CFLAGS is set to -marm, and $ADDI_LDFLAGS and $ADDITIONAL_CONFIGURE_FLAGS are both unset. The resulting static libraries are created:

    $ ls -1 android/arm/lib/
    libavcodec.a
    libavdevice.a
    libavfilter.a
    libavformat.a
    libavutil.a
    libswresample.a
    libswscale.a
    pkgconfig

    I expose them through an Android.mk file, ffmpeg having been built in $NDK/sources/ffmpeg-2.1.1:

    LOCAL_PATH:= $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavdevice
    LOCAL_SRC_FILES:= lib/libavdevice.a
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavcodec
    LOCAL_SRC_FILES:= lib/libavcodec.a
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavformat
    LOCAL_SRC_FILES:= lib/libavformat.a
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libswscale
    LOCAL_SRC_FILES:= lib/libswscale.a
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavutil
    LOCAL_SRC_FILES:= lib/libavutil.a
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavfilter
    LOCAL_SRC_FILES:= lib/libavfilter.a
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libwsresample
    LOCAL_SRC_FILES:= lib/libswresample.a
    LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/include
    include $(PREBUILT_STATIC_LIBRARY)

    So far, everything looks good. Now I'm trying to build the equivalent of the ffmpeg binary as a library. I copied (without changing a single character for now) ffmpeg.c, ffmpeg.h, cmdutils.c, cmdutils.h, ffmpeg_opt.c and ffmpeg_filter.c into my jni folder. I also have two directly JNI-related files in this folder (copied from halfninja's build; I just changed the package name). Here's the relevant Android.mk:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_C_INCLUDES := /Applications/android-ndk-r9b/sources/ffmpeg-2.1.1
    LOCAL_CFLAGS := -Wdeprecated-declarations
    LOCAL_MODULE := videokit
    ANDROID_LIB := -landroid
    LOCAL_LDLIBS += -llog -ljnigraphics -lz
    LOCAL_SRC_FILES := videokit/com_rfc_video_ffmpeg_Videokit.c videokit/ffmpeg.c videokit/cmdutils.c videokit/ffmpeg_opt.c videokit/ffmpeg_filter.c
    LOCAL_SHARED_LIBRARIES := libavdevice libavformat libavfilter libavcodec libwscale libavutil libswresample libswscale libpostproc
    include $(BUILD_SHARED_LIBRARY)
    $(call import-module,ffmpeg-2.1.1/android/arm)
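
    For reference, the Java counterpart that would load and call into this module might look like the minimal sketch below; the package and method names are assumptions inferred from the com_rfc_video_ffmpeg_Videokit.c file name and halfninja's wrapper, not something defined by the build files above:

    package com.rfc.video_ffmpeg;

    public class Videokit {
        static {
            // Must match LOCAL_MODULE := videokit, which produces libvideokit.so.
            System.loadLibrary("videokit");
        }

        // Implemented in com_rfc_video_ffmpeg_Videokit.c; the exact signature is an
        // assumption (halfninja's wrapper forwards ffmpeg-style argv to native code).
        public native void run(String[] args);
    }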

    Everything compiles fine, but it doesn't link. Here are the first errors:

    [armeabi] SharedLibrary : libvideokit.so
    /Applications/android-ndk-r9b/toolchains/arm-linux-androideabi-4.6/prebuilt/darwin-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/videokit/videokit/cmdutils.o: in function print_all_libs_info.constprop.5:jni/videokit/cmdutils.c:1063: error: undefined reference to 'swresample_version'
    /Applications/android-ndk-r9b/toolchains/arm-linux-androideabi-4.6/prebuilt/darwin-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/videokit/videokit/cmdutils.o: in function print_all_libs_info.constprop.5:jni/videokit/cmdutils.c:1063: error: undefined reference to 'swresample_configuration'
    /Applications/android-ndk-r9b/toolchains/arm-linux-androideabi-4.6/prebuilt/darwin-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/videokit/videokit/cmdutils.o: in function opt_default:jni/videokit/cmdutils.c:558: error: undefined reference to 'swr_get_class'
    /Applications/android-ndk-r9b/toolchains/arm-linux-androideabi-4.6/prebuilt/darwin-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/videokit/videokit/cmdutils.o: in function opt_default:jni/videokit/cmdutils.c:561: error: undefined reference to 'swr_alloc'
    /Applications/android-ndk-r9b/toolchains/arm-linux-androideabi-4.6/prebuilt/darwin-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/videokit/videokit/cmdutils.o: in function opt_default:jni/videokit/cmdutils.c:563: error: undefined reference to 'swr_free'
    /Applications/android-ndk-r9b/toolchains/arm-linux-androideabi-4.6/prebuilt/darwin-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/videokit/videokit/ffmpeg_opt.o: in function show_help_default:jni/videokit/ffmpeg_opt.c:2464: error: undefined reference to 'swr_get_class'

    What I don't understand is that these functions are defined and available in libswresample.a, which I'm linking against:

    arm-linux-androideabi-nm /Applications/android-ndk-r9b/sources/ffmpeg-2.1.1/android/arm/lib/libswresample.a  | grep -i -e swr_get_class -e swresample_version
    000001d4 T swr_get_class
    00000178 T swresample_version

    What am I doing wrong? Is there another, maybe simpler, way to expose ffmpeg as a static library available via JNI? (I don't use halfninja's build because I need at least ffmpeg 1.1, while his is at 0.9.)