Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (28)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, as announced here.
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further changes (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP deemed "usable".
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    For a working installation, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further changes (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

On other sites (4126)

  • Merging input Streams with nodejs/ffmpeg

    14 September 2020, by jAndy

    I'm creating a very basic and rudimentary video web chat. On the client side, I'm going to use a simple getUserMedia API call to capture the webcam data and send it to my server as data blobs.

    From there, I'm planning to either use the fluent-ffmpeg library or just spawn ffmpeg myself and pipe that raw data to it; ffmpeg, in turn, does some magic and pushes the result out as an HLS stream to an Amazon AWS service (for instance), which is then displayed in a web browser for everyone participating in the video chat.
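
    As a rough sketch of that stage, ffmpeg can read a muxed stream from stdin and emit an HLS playlist plus segments; the encoder and HLS settings below are illustrative assumptions, not a vetted production setup:

    # read from stdin, transcode, and write an HLS playlist with 4-second segments
    ffmpeg -i pipe:0 \
        -c:v libx264 -preset veryfast -c:a aac \
        -f hls -hls_time 4 -hls_list_size 0 \
        stream.m3u8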

    So far, I think all of this should be fairly easy to implement, but one question keeps my head spinning: how can I create a "combined" or "merged" frame and stream, so that the HLS output from my server to the distributing cloud service is just one combined data stream to receive?

    If there are 3 people in that video chat, my server receives 3 data streams from those clients and combines these data streams (from the individual webcam sources) into one output stream.

    How could that be accomplished? Can I "create" a new frame with ffmpeg, so to speak? I would be very thankful if anybody could give me a heads-up here; maybe I'm thinking in a completely wrong direction.
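
    For what it's worth, ffmpeg can composite several inputs into a single frame with the hstack, vstack, or xstack filters, and mix the audio with amix. A minimal sketch, assuming three decodable input files (the cam*.webm names and the 640x480 size are placeholders):

    ffmpeg -i cam1.webm -i cam2.webm -i cam3.webm \
        -filter_complex "[0:v]scale=640:480[l];[1:v]scale=640:480[m];[2:v]scale=640:480[r];[l][m][r]hstack=inputs=3[v];[0:a][1:a][2:a]amix=inputs=3[aout]" \
        -map "[v]" -map "[aout]" -c:v libx264 -c:a aac \
        -f hls -hls_time 4 merged.m3u8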

    Another question that arises is whether I can really just "dump" any data I receive as a binary blob from getUserMedia or MultiStreamRecorder into ffmpeg, or whether I have to specify somewhere, and somehow, the exact codecs being used, etc.
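
    On that last point: ffmpeg normally probes the container of piped input by itself, and browser recorders typically emit WebM, so forcing the demuxer is usually only needed when probing fails; a hedged check might look like:

    # force the WebM demuxer instead of relying on probing
    ffmpeg -f webm -i pipe:0 -c copy probe_check.mkv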

  • output of ffmpeg comes out like yamborghini high music video

    19 January, by chip

    I follow this procedure when I edit a long video:

    • segment the video into 3-second clips, so I come up with a lot of short videos
    • I randomly pick videos and put them in a list
    • then I join these short videos together using concat
    • now I get a long video again; the next thing I do is segment it into 4-minute videos

    After processing, the videos look messed up. I don't know how to describe it, but it looks like the music video "Yamborghini High".

    For some reason, this only happens to videos I capture at night. I do the same process for daytime footage with no problem.

    Is there a problem with slicing, merging, and then slicing again?

    Or is it an issue that I run multiple ffmpeg scripts at the same time?

    Here's the script:

    for FILE in *.mp4; do ffmpeg -i "$FILE" -vcodec copy -f segment -segment_time 00:10 -reset_timestamps 1 "part_$( date '+%F%H%M%S' )_%02d.mp4"; rm -f "$FILE"; done; echo 'slicing completed.' && \
    for f in part_*[13579].mp4; do echo "file '$f'" >> mylist.txt; done
    ffmpeg -f concat -safe 0 -i mylist.txt -c copy output.mp4 && echo 'done merging.' && \
    ffmpeg -i output.mp4 -threads 7 -vcodec copy -f segment -segment_time 04:00 -reset_timestamps 1 "Video_Title_$( date '+%F%H%M%S' ).mp4" && echo 'individual videos created'
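
    If the glitches come from copy-mode cuts landing between keyframes (night footage often gets longer GOPs out of the encoder), one hedged variant is to re-encode during the first slicing pass and force a keyframe at every cut point, so every segment starts clean; the x264 settings below are illustrative assumptions, not the author's original flags:

    # re-encode while slicing so each 3-second segment starts on a keyframe
    for FILE in *.mp4; do
        ffmpeg -i "$FILE" -c:v libx264 -force_key_frames "expr:gte(t,n_forced*3)" \
            -c:a copy -f segment -segment_time 3 -reset_timestamps 1 \
            "part_$( date '+%F%H%M%S' )_%02d.mp4"
    done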

  • Building a shared library from static libraries for ffmpeg 2.5.2

    7 January 2015, by abijinx

    I am currently building a shared library for ffmpeg, myffmpeg.so, using ffmpeg 0.8.6. I achieve this by combining the static libs of ffmpeg's individual modules with a few additional static libraries. The main makefile used is as follows:

    LOCAL_PATH := $(call my-dir)

    include $(LOCAL_PATH)/Android_Files.mk

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavcodec
    LOCAL_SRC_FILES:= $(MY_AVCODEC_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavfilter
    LOCAL_SRC_FILES:= $(MY_AVFILTER_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavformat
    LOCAL_SRC_FILES:= $(MY_AVFORMAT_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libavutil
    LOCAL_SRC_FILES:= $(MY_AVUTIL_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= libswscale
    LOCAL_SRC_FILES:= $(MY_SWSCALE_FILES)
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE:= ffmpeg_myc
    LOCAL_SRC_FILES:= ffmpeg.c cmdutils.c
    include $(LOCAL_PATH)/Android_Common.mk
    include $(BUILD_STATIC_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := myffmpeg
    LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_myc
    LOCAL_LDFLAGS += -lz -lm -llog
    ifeq ($(FF_ENABLE_AMR),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
    endif
    ifeq ($(FF_ENABLE_AAC),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
    endif

    include $(BUILD_SHARED_LIBRARY)

    The above makefile uses only a few selected source files, as listed in the FF_SOURCE_FILES parameter, to build the individual static libs. It then combines all of those with the additional libs to create the shared library myffmpeg.so.

    Now I am trying to get a similar output with ffmpeg 2.5.2 for the 64-bit ARM architecture. This time I have built the latest ffmpeg using the following configure script and generated the static libraries of the different ffmpeg modules, such as libavcodec.a, libavfilter.a, etc.

    #!/bin/bash
    ABI=aarch64-linux-android
    NDK=
    SYSROOT=$NDK/platforms/android-21/arch-arm64/
    TOOLCHAIN=$NDK/toolchains/aarch64-linux-android-4.9/prebuilt/linux-x86_64
    CPU=arm64
    PREFIX=$(pwd)/android/$CPU
    ./configure \
    --prefix=$PREFIX \
    --cc=$TOOLCHAIN/bin/$ABI-gcc \
    --enable-static \
    --disable-doc \
    --disable-ffmpeg \
    --disable-ffplay \
    --disable-ffprobe \
    --disable-network \
    --disable-ffserver \
    --disable-devices \
    --disable-avdevice \
    --disable-swscale-alpha \
    --disable-symver \
    --disable-neon \
    --enable-optimizations \
    --cross-prefix=$TOOLCHAIN/bin/aarch64-linux-android- \
    --target-os=linux \
    --arch=arm64 \
    --enable-cross-compile \
    --sysroot=$SYSROOT \
    --enable-libopencore-amrnb \
    --enable-libopencore-amrwb \
    --enable-libvo-aacenc \
    $ADDITIONAL_CONFIGURE_FLAG
    make clean
    make
    make install

    Now I am trying to combine the generated static libraries, i.e.

    • libavcodec.a
    • libavfilter.a
    • libavformat.a
    • libavutil.a
    • libswresample.a
    • libswscale.a

    with

    • ffmpeg_myc, opencore-amrnb, opencore-amrwb

    and generate a myffmpeg.so shared library by running ndk-build.

    I tried copying the generated ffmpeg static libraries from the ffmpeg/android folder to obj/local/arm64-v8a (the path where all the static libs were previously created for 0.8.6, according to the makefile). I also made changes to the makefile, thinking that it would use the already generated ffmpeg libraries. But I found that the generated shared library was far too small and had not included the generated ffmpeg libs.

    LOCAL_PATH := $(call my-dir)
    include $(CLEAR_VARS)
    LOCAL_MODULE := myffmpeg
    LOCAL_WHOLE_STATIC_LIBRARIES := libavcodec libavfilter libavformat libavutil libswscale ffmpeg_myc
    LOCAL_LDFLAGS += -lz -lm -llog
    ifeq ($(FF_ENABLE_AMR),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += opencore-amrnb opencore-amrwb
    endif
    ifeq ($(FF_ENABLE_AAC),yes)
    LOCAL_WHOLE_STATIC_LIBRARIES += vo-aacenc
    endif

    include $(BUILD_SHARED_LIBRARY)
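
    For reference, one way to check that the prebuilt archives can be merged at all, independently of ndk-build, is to link them into a shared object directly with the cross toolchain, wrapping them in --whole-archive so every member object is pulled in. This is only a sketch under the PREFIX used above; the exact library paths are assumptions:

    $TOOLCHAIN/bin/aarch64-linux-android-gcc -shared --sysroot=$SYSROOT \
        -o libmyffmpeg.so \
        -Wl,--whole-archive \
        android/arm64/lib/libavcodec.a \
        android/arm64/lib/libavformat.a \
        android/arm64/lib/libavfilter.a \
        android/arm64/lib/libswresample.a \
        android/arm64/lib/libswscale.a \
        android/arm64/lib/libavutil.a \
        -Wl,--no-whole-archive \
        -lz -lm -llog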

    I would like to know whether I am on the right track and only need to make some minor changes, or whether I should start with an entirely different approach to achieve this. Any suggestions would be helpful.

    Thanks in advance