Other articles (90)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    Explanation of the notable changes involved in moving MediaSPIP from version 0.1 to version 0.3. What are the new features
    Software dependencies: use of the latest versions of FFmpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customising by adding your own logo, banner or background image

    5 September 2013, by

    Some themes take three customisation elements into account: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
    In spipeo, MediaSPIP's default theme, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the news creation form.
    News creation form: for a document of the "news" type, the fields offered by default are: publication date (customise the publication date) (...)

On other sites (10283)

  • libavformat/hls: add support for decryption of HLS media segments encrypted using...

    21 September 2021, by Nachiket Tarate
    libavformat/hls: add support for decryption of HLS media segments encrypted using SAMPLE-AES encryption method
    

    Apple HTTP Live Streaming Sample Encryption:

    https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/HLS_Sample_Encryption

    Signed-off-by: Nachiket Tarate <nachiket.programmer@gmail.com>
    Signed-off-by: Steven Liu <lq@chinaffmpeg.org>

    • [DH] libavformat/Makefile
    • [DH] libavformat/hls.c
    • [DH] libavformat/hls_sample_encryption.c
    • [DH] libavformat/hls_sample_encryption.h
    • [DH] libavformat/mpegts.c
  • Flash Media Server Recording Delay

    14 November 2011, by Corey

    I have an application where a user can record themselves singing along to a song. Once I receive the NetStream status event 'Record.Start', I start playing an audio file. Once the audio completes, I stop recording. Next, I have a script that runs FFMPEG to combine the recorded video/audio with the same music file. The problem I'm finding is that there is a noticeable delay between the recorded audio and the music. It also seems that this delay depends on network speed, as it varies from network to network. Can I determine this delay dynamically through FMS?

  • How to convert a video and audio file to be smoothly played via the Media Source Extensions API?

    4 October 2018, by Aman

    I have built a web video player using the Media Source Extensions API. I have been testing it with local video and audio files on my PC. Everything works perfectly except that the video keeps buffering. I'm playing a 4K 60fps video, which I downloaded from YouTube. My PC does not have a 4K screen, but the video plays smoothly in YouTube and VLC Media Player. I'm just surprised as to why my Media Source Extensions video player buffers even though the video and audio files are not being retrieved over the network. I'm assuming that the video and audio files I'm using are causing this problem, so I will explain how I created them first:

    1. I downloaded the video from https://www.youtube.com/watch?v=KaCQ8SQ6ZHQ&t=3s using the 4K Video Downloader https://www.4kdownload.com/products/product-videodownloader.

    2. Converting the mkv file (the 4K Video Downloader only lets me download the 4K 60fps video in mkv format) to mp4 using ffmpeg in CMD: ffmpeg -i test.mkv -codec copy test.mp4.

    3. Converting the test.mp4 file from 3840x1632 to my preferred 4K resolution of 3840x2160 using ffmpeg in CMD: ffmpeg -i test.mp4 -s 3840x2160 -c:a copy test_changed.mp4. (NOT SO IMPORTANT)

    4. Separating the video and audio of test_changed.mp4 into video.mp4 for the video and audio.mp4 for the audio, using MP4Box in CMD: Video - MP4Box -single 1 test_changed.mp4 -out video.mp4 and Audio - MP4Box -single 2 test_changed.mp4 -out audio.mp4.

    5. Splitting both video.mp4 and audio.mp4 into 30 parts, each containing 5 seconds of the video and audio. So I end up with (video_1.mp4, audio_1.mp4), (video_2.mp4, audio_2.mp4), (video_3.mp4, audio_3.mp4), ..... (video_29.mp4, audio_29.mp4), (video_30.mp4, audio_30.mp4), using ffmpeg and specifying the time range for each part one by one in CMD:

      [For Part 1: ffmpeg -ss 00:00:00 -to 00:00:05 -i video.mp4 video_1.mp4, ffmpeg -ss 00:00:00 -to 00:00:05 -i audio.mp4 audio_1.mp4],

      [For Part 2: ffmpeg -ss 00:00:05 -to 00:00:10 -i video.mp4 video_2.mp4, ffmpeg -ss 00:00:05 -to 00:00:10 -i audio.mp4 audio_2.mp4],

      [For Part 3: ffmpeg -ss 00:00:10 -to 00:00:15 -i video.mp4 video_3.mp4, ffmpeg -ss 00:00:10 -to 00:00:15 -i audio.mp4 audio_3.mp4],

      .....

      [For Part 29: ffmpeg -ss 00:02:20 -to 00:02:25 -i video.mp4 video_29.mp4, ffmpeg -ss 00:02:20 -to 00:02:25 -i audio.mp4 audio_29.mp4],

      [For Part 30: ffmpeg -ss 00:02:25 -to 00:02:30 -i video.mp4 video_30.mp4, ffmpeg -ss 00:02:25 -to 00:02:30 -i audio.mp4 audio_30.mp4].

    6. Fragmenting each of the video and audio parts using MP4Box in CMD (as far as I know, only fragmented mp4 files can be played via the Media Source Extensions API; a scripted version of steps 5 and 6 is sketched right after this list):

      [For Part 1: MP4Box -dash 1000 -rap -frag-rap video_1.mp4, MP4Box -dash 1000 -rap -frag-rap audio_1.mp4],

      [For Part 2: MP4Box -dash 1000 -rap -frag-rap video_2.mp4, MP4Box -dash 1000 -rap -frag-rap audio_2.mp4],

      [For Part 3: MP4Box -dash 1000 -rap -frag-rap video_3.mp4, MP4Box -dash 1000 -rap -frag-rap audio_3.mp4],

      .....

      [For Part 29: MP4Box -dash 1000 -rap -frag-rap video_29.mp4, MP4Box -dash 1000 -rap -frag-rap audio_29.mp4],

      [For Part 30: MP4Box -dash 1000 -rap -frag-rap video_30.mp4, MP4Box -dash 1000 -rap -frag-rap audio_30.mp4].

    7. I receive a fragmented file for each input, with "_dashinit" in its name, e.g. for Part 1: video_1_dashinit.mp4 and audio_1_dashinit.mp4. These are the files I'm playing through the Media Source Extensions API.
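
    For reference, steps 5 and 6 can also be scripted instead of typed out part by part. Below is only a sketch (Node.js + TypeScript, run with something like tsx); it assumes ffmpeg and MP4Box are on the PATH and that video.mp4 and audio.mp4 are in the working directory. The commands themselves are exactly the ones from steps 5 and 6; only the looping is new.

      // Hypothetical helper that reproduces steps 5 and 6 in a loop.
      // Assumes ffmpeg and MP4Box are on the PATH and video.mp4 / audio.mp4 exist.
      import { execSync } from "node:child_process";

      const SEGMENT_SECONDS = 5;
      const PARTS = 30;

      // Format a second count as HH:MM:SS for ffmpeg's -ss / -to options.
      function hhmmss(totalSeconds: number): string {
        const h = Math.floor(totalSeconds / 3600);
        const m = Math.floor((totalSeconds % 3600) / 60);
        const s = totalSeconds % 60;
        return [h, m, s].map(v => String(v).padStart(2, "0")).join(":");
      }

      for (let i = 1; i <= PARTS; i++) {
        const from = hhmmss((i - 1) * SEGMENT_SECONDS);
        const to = hhmmss(i * SEGMENT_SECONDS);
        for (const kind of ["video", "audio"]) {
          // Step 5: cut one 5-second piece out of the full track.
          execSync(`ffmpeg -ss ${from} -to ${to} -i ${kind}.mp4 ${kind}_${i}.mp4`);
          // Step 6: fragment that piece for MSE (produces e.g. video_1_dashinit.mp4).
          execSync(`MP4Box -dash 1000 -rap -frag-rap ${kind}_${i}.mp4`);
        }
      }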

    So I'm appending these files to my SourceBuffers and playing them in the video element. I have provided the test.zip file here (https://drive.google.com/file/d/1tyPBTxgpS601Xs5VEWznYhWw9PwhMHsB/view?usp=sharing) containing the test sample.
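
    For clarity, the appending logic looks roughly like the sketch below (browser TypeScript). The segment file names match the parts above, but the MIME type/codec strings are assumptions on my side and should be checked against the actual files (e.g. with MP4Box -info or ffprobe). Each append waits for the SourceBuffer's updateend event before the next appendBuffer call, because appending while the buffer is still updating throws an InvalidStateError.

      // Minimal sketch of the appending loop; codec strings below are assumed, not verified.
      const video = document.querySelector("video") as HTMLVideoElement;
      const mediaSource = new MediaSource();
      video.src = URL.createObjectURL(mediaSource);

      mediaSource.addEventListener("sourceopen", async () => {
        // Assumed codecs: H.264 High@L5.1 video and AAC-LC audio -- verify with MP4Box -info.
        const videoBuf = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.640033"');
        const audioBuf = mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');

        for (let i = 1; i <= 30; i++) {
          await appendSegment(videoBuf, `video_${i}_dashinit.mp4`);
          await appendSegment(audioBuf, `audio_${i}_dashinit.mp4`);
        }
        mediaSource.endOfStream();
      });

      // Fetch one fragmented file and append it, resolving once the buffer has finished updating.
      async function appendSegment(buf: SourceBuffer, url: string): Promise<void> {
        const data = await (await fetch(url)).arrayBuffer();
        return new Promise<void>(resolve => {
          buf.addEventListener("updateend", () => resolve(), { once: true });
          buf.appendBuffer(data);
        });
      }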

    I'm using this command in CMD to run Chrome and test my file: chrome.exe --allow-file-access-from-files

    Anyone can use this test sample to see if the video buffers for them too. Please comment on anything I'm doing wrong, or help me construct better 5-second video and audio files that play smoothly via MSE. Thanks