Other articles (103)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to find out.

  • Use, discuss, criticize

    13 April 2011

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • Libraries and binaries specific to video and audio processing

    31 January 2010

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFmpeg, the main encoder, which transcodes almost any type of video or audio file into web-playable formats (see this tutorial for its installation); Oggz-tools, inspection tools for Ogg files; MediaInfo, which retrieves information from most video and audio formats.
    Complementary, optional binaries: flvtool2: (...)
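
    For orientation, a minimal sketch of how these binaries are typically driven from a shell; the file names here are hypothetical:

    # Transcode a source video into a web-playable Ogg file with FFmpeg
    ffmpeg -i source.avi -c:v libtheora -c:a libvorbis output.ogv
    # Inspect the structure of the resulting Ogg container with oggz-tools
    oggz info output.ogv
    # Dump technical metadata (codecs, duration, bitrate) with MediaInfo
    mediainfo output.ogv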

On other sites (7464)

  • Merging input Streams with nodejs/ffmpeg

    14 September 2020, by jAndy

    I'm creating a very basic and rudimentary video web chat. On the client side, I'm going to use a simple getUserMedia API call to capture the webcam data and send the video data as blobs to my server.

    From there, I'm planning to either use the fluent-ffmpeg library or just spawn ffmpeg myself and pipe the raw data to it; ffmpeg in turn does some magic and pushes the result out as an HLS stream to a cloud service (Amazon AWS, for instance), which is what actually gets displayed in the web browser for everyone participating in the video chat.

    So far, I think all of this should be fairly easy to implement, but I keep going around in circles on one question: how can I create a "combined" or "merged" frame and stream, so that the HLS output from my server to the distributing cloud service is only one combined data stream?

    If there are 3 people in the video chat, my server receives 3 data streams from those clients and has to combine these streams (from the individual webcam sources) into one output stream.

    How could that be accomplished? Can I "create" a new frame with ffmpeg, so to speak? I would be very thankful if anybody could give me a heads-up here; maybe I'm thinking in a completely wrong direction.

    Another question that arises is whether I can really just "dump" any data I receive as binary blobs from getUserMedia or MultiStreamRecorder into ffmpeg, or whether I have to specify somewhere, and somehow, the exact codecs being used, etc.

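    As a hedged sketch of the merging step (assuming three WebM streams have already landed in files or pipes; the input names are hypothetical), ffmpeg's hstack/xstack filters can composite several inputs into one frame, and the same invocation can package the result as HLS:

    # Stack three inputs side by side (equal heights assumed; use xstack
    # or scale filters for mixed sizes) and emit an HLS playlist.
    ffmpeg -i in1.webm -i in2.webm -i in3.webm \
        -filter_complex "[0:v][1:v][2:v]hstack=inputs=3[v]" \
        -map "[v]" -c:v libx264 -preset veryfast \
        -f hls -hls_time 2 -hls_list_size 5 out.m3u8

    Audio would need a parallel amix filter, and browser-recorded blobs generally do need their container and codec known up front, so this is a sketch of the direction rather than a drop-in answer.
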
  • output of ffmpeg comes out like yamborghini high music video

    19 January, by chip

    I do this procedure when I edit a long video:

    • segment the video into 3-second clips, which leaves me with a lot of short videos
    • randomly pick some of those videos and put them in a list
    • then join these short videos together using concat
    • now I get a long video again; the next thing I do is segment the video into 4-minute videos

    After processing, the videos look messed up. I don't know how to describe it, but it looks like the "Yamborghini High" music video.

    For some reason, this only happens to videos I capture at night. I do the same process on daytime footage with no problem.

    Is there a problem with slicing, merging, and then slicing again?

    Or is it an issue that I run multiple ffmpeg scripts at the same time?

    Here's the script:

    for FILE in *.mp4; do
        ffmpeg -i "$FILE" -vcodec copy -f segment -segment_time 00:10 -reset_timestamps 1 "part_$( date '+%F%H%M%S' )_%02d.mp4"
        rm -f "$FILE"
    done; echo 'slicing completed.'
    for f in part_*[13579].mp4; do echo "file '$f'" >> mylist.txt; done
    ffmpeg -f concat -safe 0 -i mylist.txt -c copy output.mp4 && echo 'done merging.'
    ffmpeg -i output.mp4 -threads 7 -vcodec copy -f segment -segment_time 04:00 -reset_timestamps 1 "Video_Title_$( date '+%F%H%M%S' ).mp4" && echo 'individual videos created'

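    One hedged observation, an assumption rather than a confirmed diagnosis: with -vcodec copy, the segment muxer can only cut on keyframes, so shuffled clips may begin with frames that reference data from footage no longer in front of them, which produces exactly this smeared, datamoshed look after concat; night footage often gets longer GOPs, which would explain why only those videos break. A minimal sketch that re-encodes during the first slicing pass so every clip starts on a keyframe:

    # Re-encode while slicing so each clip is self-contained;
    # the keyframe expression forces a keyframe at every cut point.
    for FILE in *.mp4; do
        ffmpeg -i "$FILE" -c:v libx264 -preset fast -crf 18 -c:a copy \
            -force_key_frames "expr:gte(t,n_forced*3)" \
            -f segment -segment_time 3 -reset_timestamps 1 \
            "part_$( date '+%F%H%M%S' )_%02d.mp4"
    done
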
  • Building a shared library from static libraries for ffmpeg 2.5.2

    7 January 2015, by abijinx

    I am currently building a shared library for ffmpeg, myffmpeg.so, using ffmpeg 0.8.6. I achieve this by combining the static libs of the individual ffmpeg modules, along with a few additional static libraries. The main makefile used is as follows:

    LOCAL_PATH := $(call my-dir)

    include $(LOCAL_PATH)/Android_Files.mk

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavcodec
    LOCAL_SRC_FILES:= $(MY_AVCODEC_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavfilter
    LOCAL_SRC_FILES:= $(MY_AVFILTER_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavformat
    LOCAL_SRC_FILES:= $(MY_AVFORMAT_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavutil
    LOCAL_SRC_FILES:= $(MY_AVUTIL_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libswscale
    LOCAL_SRC_FILES:= $(MY_SWSCALE_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= ffmpeg_myc
    LOCAL_SRC_FILES:= ffmpeg.c cmdutils.c
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := myffmpeg
    LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_myc
    LOCAL_LDFLAGS += -lz -lm -llog
    ifeq ($(FF_ENABLE_AMR),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
    endif
    ifeq ($(FF_ENABLE_AAC),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
    endif

    include $(BUILD_SHARED_LIBRARY)

    The above makefile uses only a few selected source files, as listed in the MY_*_FILES variables from Android_Files.mk, to build the individual static libs, and then combines all of those with the additional libs to create the shared library myffmpeg.so.

    Now I am trying to get a similar output with ffmpeg 2.5.2 for the 64-bit ARM architecture. This time I have built the latest ffmpeg using the following configure script, which generates the static libraries of the different ffmpeg modules, such as libavcodec.a, libavfilter.a, etc.

    #!/bin/bash
    ABI=aarch64-linux-android
    NDK=
    SYSROOT=$NDK/platforms/android-21/arch-arm64/
    TOOLCHAIN=$NDK/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64
    CPU=arm64
    PREFIX=$(pwd)/android/$CPU
    ./configure \
    --prefix=$PREFIX \
    --cc=$TOOLCHAIN/bin/$ABI-gcc \
    --enable-static \
    --disable-doc \
    --disable-ffmpeg \
    --disable-ffplay \
    --disable-ffprobe \
    --disable-network \
    --disable-ffserver \
    --disable-devices \
    --disable-avdevice \
    --disable-swscale-alpha \
    --disable-symver \
    --disable-neon \
    --enable-optimizations \
    --cross-prefix=$TOOLCHAIN/bin/aarch64-linux-android- \
    --target-os=linux \
    --arch=arm64 \
    --enable-cross-compile \
    --sysroot=$SYSROOT \
    --enable-libopencore-amrnb \
    --enable-libopencore-amrwb \
    --enable-libvo-aacenc \
    $ADDITIONAL_CONFIGURE_FLAG
    make clean
    make
    make install

    Now I am trying to combine the generated static libraries, i.e.

    • libavcodec.a
    • libavfilter.a
    • libavformat.a
    • libavutil.a
    • libswresample.a
    • libswscale.a

    with

    • ffmpeg_myc
    • opencore-amrnb
    • opencore-amrwb

    and generate a myffmpeg.so shared library by running ndk-build.

    I tried pasting the generated ffmpeg static libraries from the ffmpeg/android folder into obj/local/arm64-v8a (the path where all the static libs were previously created for 0.8.6, according to the makefile). I also changed the makefile, expecting it to use the already generated ffmpeg libraries. But I found that the generated shared library was far too small and had not included the generated ffmpeg libs.

    LOCAL_PATH := $(call my-dir)
    include $(CLEAR_VARS)
    LOCAL_MODULE := myffmpeg
    LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_myc
    LOCAL_LDFLAGS += -lz -lm -llog
    ifeq ($(FF_ENABLE_AMR),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
    endif
    ifeq ($(FF_ENABLE_AAC),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
    endif

    include $(BUILD_SHARED_LIBRARY)
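
    A hedged sketch of one plausible fix, based on standard ndk-build behavior rather than a confirmed diagnosis: LOCAL_WHOLE_STATIC_LIBRARIES only pulls in modules that ndk-build knows how to build or locate, so .a files copied by hand into obj/local are silently ignored; each prebuilt archive would first need to be declared with PREBUILT_STATIC_LIBRARY. The lib path below is hypothetical:

    # Declare each prebuilt ffmpeg archive as a module ndk-build can link.
    include $(CLEAR_VARS)
    LOCAL_MODULE := libavcodec
    LOCAL_SRC_FILES := ffmpeg/android/arm64/lib/libavcodec.a  # hypothetical path
    include $(PREBUILT_STATIC_LIBRARY)

    # ...repeat for libavfilter, libavformat, libavutil, libswresample, libswscale...

    include $(CLEAR_VARS)
    LOCAL_MODULE := myffmpeg
    LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswresample libswscale
    LOCAL_LDLIBS += -lz -lm -llog
    include $(BUILD_SHARED_LIBRARY)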

    I would like to know whether I am on the right track and just need some minor changes, or whether I should start over with an entirely different approach. Any suggestions would be helpful.

    Thanks in advance