Advanced search

Media (91)

Other articles (48)

  • MediaSPIP: Modifying the rights for object creation and final publication

    11 November 2010, by

    By default, MediaSPIP allows the creation of 5 types of objects.
    Also by default, the rights to create these objects and to publish them definitively are reserved for administrators, but they can of course be configured by the webmasters.
    These rights are locked down for several reasons: because granting the right to publish must be the webmaster’s decision, not the platform’s as a whole, and therefore should not be a default choice; because having an account can also be used for other things, (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

On other sites (8991)

  • ffmpeg - Audio not playing on iOS with transcoded video

    23 June 2015, by Dev

    I am using Laravel 4.2.

    I am working on a project that provides video services. To transcode video (so that it plays on all devices) I’ve used the GitHub package php-ffmpeg.

    After transcoding, I found the video is NOT working on the iPhone 5s. I searched for a solution and found some changes to the parameters passed to the library.

    Finally, I came up with the command below, which the library uses to transcode the video.

    ffmpeg -y -i video.mp4 -async 1 -metadata:s:v:0 start_time=0 -s 560x320
       -vcodec libx264 -strict experimental -pix_fmt yuv420p -profile:v baseline
       -acodec aac -profile:a aac_low -b:v 466k -refs 6 -coder 1 -sc_threshold 40
       -flags +loop -me_range 16 -subq 7 -i_qfactor 0.71 -qcomp 0.6 -qdiff 4
       -trellis 1 -b:a 256k -ac 2 transcode_video.mp4

    The video plays well, but the audio is NOT working on iOS, so the video plays without sound. I am using an HTML5 player to play the video, and the autoplay option is off.

    Any suggestions?
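
    (For reference, a hedged sketch of a re-encode that often restores audio on iOS, assuming the issue is the AAC profile or sample rate; the flags below are suggestions, not taken from the original post.)

    # sketch: force AAC-LC at 44.1 kHz stereo and move the moov atom to the front
    # (44100 Hz, 128k and -movflags +faststart are assumed values, not from the post)
    ffmpeg -y -i video.mp4 -s 560x320 \
       -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p -b:v 466k \
       -c:a aac -strict experimental -profile:a aac_low -ar 44100 -ac 2 -b:a 128k \
       -movflags +faststart transcode_video.mp4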

  • gstreamer android error : engine.GetInterface(IODeviceCapabilities) failed(0x0000000c)

    19 June 2015, by omerjerk

    I’m trying to stream an MPEG-TS file on an Android device. This is the file.

    I’ve set everything up as in tutorial 5 of the GStreamer Android tutorials,
    but I’m getting the following error:

    06-18 23:56:49.023  27051-27132/kuchbhilabs.chestream D/GStreamer+tutorial-4﹕ 0:00:03.720735780 0xac093980 src/main/jni/main.c:104:set_ui_message Setting message to: Buffering 66%
    06-18 23:56:49.023  27051-27132/kuchbhilabs.chestream D/GStreamerPlayer﹕ Buffering 66%
    06-18 23:56:49.551  27051-27350/kuchbhilabs.chestream W/GStreamer+mpegtsbase﹕ 0:00:04.248340259 0xaec6d6f0 mpegtsbase.c:1379:query_upstream_latency:<tsdemux0> Failed to query upstream latency
    06-18 23:56:49.555  27051-27350/kuchbhilabs.chestream W/GStreamer+mpegtsbase﹕ 0:00:04.252435571 0xaec6d6f0 mpegtsbase.c:531:mpegts_base_program_add_stream Stream already present !
    06-18 23:56:49.762  27051-27351/kuchbhilabs.chestream W/libOpenSLES﹕ Leaving Object::GetInterface (SL_RESULT_FEATURE_UNSUPPORTED)
    06-18 23:56:49.762  27051-27351/kuchbhilabs.chestream E/GStreamer+opensles_sink﹕ 0:00:04.459898644 0xaec6e400 openslessink.c:152:_opensles_query_capabilities:<gstopenslessink@0xac3d8408> engine.GetInterface(IODeviceCapabilities) failed(0x0000000c)
    06-18 23:56:49.763  27051-27351/kuchbhilabs.chestream W/libOpenSLES﹕ Leaving OutputMix::GetDestinationOutputDeviceIDs (SL_RESULT_BUFFER_INSUFFICIENT)
    06-18 23:56:49.810  27051-27351/kuchbhilabs.chestream E/GStreamer+ffmpeg﹕ 0:00:04.507782029 0xaec6e400 :0: channel element 0.0 is not allocated
    06-18 23:56:49.810  27051-27351/kuchbhilabs.chestream W/GStreamer+ffmpeg﹕ 0:00:04.507919477 0xaec6e400 gstffmpegdec.c:2231:gst_ffmpegdec_audio_frame: error: Decoding of AAC stream by FFMPEG failed.
    06-18 23:56:49.811  27051-27351/kuchbhilabs.chestream W/GStreamer+ffmpeg﹕ 0:00:04.508238122 0xaec6e400 gstffmpegdec.c:2315:gst_ffmpegdec_frame: ffdec_aac: decoding error (len: -1, have_data: 0)
    06-18 23:56:49.811  27051-27132/kuchbhilabs.chestream D/GStreamer+tutorial-4﹕ 0:00:04.508278070 0xac093980 src/main/jni/main.c:104:set_ui_message Setting message to: Error received from element ffdec_aac0: Could not decode stream.
    06-18 23:56:49.811  27051-27132/kuchbhilabs.chestream D/GStreamerPlayer﹕ Error received from element ffdec_aac0: Could not decode stream.
    06-18 23:56:49.878  27051-27352/kuchbhilabs.chestream W/GStreamer+ffmpeg﹕ 0:00:04.575771665 0xaec6e460 gstffmpegdec.c:2315:gst_ffmpegdec_frame: ffdec_h264: decoding error (len: -1, have_data: 0)

    This is my Android.mk file:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)

    LOCAL_MODULE    := chestream
    LOCAL_SRC_FILES := main.c
    LOCAL_SHARED_LIBRARIES := gstreamer_android
    LOCAL_LDLIBS := -llog -landroid
    include $(BUILD_SHARED_LIBRARY)

    ifndef GSTREAMER_SDK_ROOT
    ifndef GSTREAMER_SDK_ROOT_ANDROID
    $(error GSTREAMER_SDK_ROOT_ANDROID is not defined!)
    endif
    GSTREAMER_SDK_ROOT        := $(GSTREAMER_SDK_ROOT_ANDROID)
    endif
    GSTREAMER_NDK_BUILD_PATH  := $(GSTREAMER_SDK_ROOT)/share/gst-android/ndk-build/
    include $(GSTREAMER_NDK_BUILD_PATH)/plugins.mk
    GSTREAMER_PLUGINS         := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_PLAYBACK) $(GSTREAMER_PLUGINS_CODECS) $(GSTREAMER_PLUGINS_NET) $(GSTREAMER_PLUGINS_SYS)\
    mpeg2dec mpegdemux2 mpegtsdemux mpegtsmux ffmpeg
    GSTREAMER_EXTRA_DEPS      := gstreamer-interfaces-0.10 gstreamer-video-0.10
    include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer.mk

    The error is probably that GStreamer is not able to decode the AAC stream.
    Any help would be appreciated.
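
    (A hedged note: the fatal messages come from ffdec_aac, so a common workaround, assuming the AAC track is a variant that the old GStreamer 0.10 ffmpeg decoder cannot handle, is to re-encode just the audio to plain AAC-LC and keep the video untouched. The file names below are placeholders.)

    # re-encode only the audio track to AAC-LC; copy the H.264 video as-is
    # input.ts / output.ts are placeholder names
    ffmpeg -i input.ts -c:v copy -c:a aac -strict experimental -ar 44100 -ac 2 -b:a 128k output.ts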

  • Mobile Camera live audio/video streaming and encoding

    7 June 2015, by Strikecounter2

    I know this question has been asked a couple of times, but I still haven’t found the right answer for my question.

    I would like to code an app that can live-stream audio and video while the content is being recorded, and then upload it to a server. I’d prefer to have my own back end using Parse, because I want high scalability. I know that the video has to be encoded with the H.264 codec and the audio with AAC, but I don’t know how to achieve this. I have heard of the FFmpeg framework, but I am not sure whether I would violate its license if I distributed my app or even sold it to somebody else.
    I would then like to receive the stream from the server and play it on an iPhone or Android phone.

    The key requirements would be:

    • Low Latency
    • About 24 fps
    • Audio/Video in sync
    • No buffering while watching

    I would like to use Swift as the programming language, but if there is no Swift wrapper for the required frameworks I would work in Objective-C too.

    I am willing to learn everything that is needed, but I don’t know where to start.
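
    (As a starting point, a hedged sketch of a capture-and-push pipeline built on the ffmpeg command-line tool: it grabs camera and microphone input on a Mac via avfoundation, encodes H.264 + AAC with low-latency settings, and pushes to an RTMP ingest server. The device indices, bitrates and the rtmp:// URL are placeholders, not a recommendation for the asker’s exact setup.)

    # capture camera 0 and microphone 0 (macOS avfoundation), encode H.264 + AAC,
    # and push to an RTMP ingest point; device indices and URL are placeholders
    ffmpeg -f avfoundation -framerate 24 -i "0:0" \
       -c:v libx264 -preset veryfast -tune zerolatency -g 48 -b:v 1500k -pix_fmt yuv420p \
       -c:a aac -ar 44100 -b:a 128k \
       -f flv rtmp://example.com/live/stream_key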