Advanced search

Media (91)

Other articles (87)

  • MediaSPIP Player: potential problems

    22 February 2011, by

    The player does not work on Internet Explorer
    On Internet Explorer (at least 8 and 7), the plugin uses the Flash player Flowplayer to play video and sound. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate module.
    If the configuration of that Apache module contains a line resembling the following, try removing or commenting it out to see whether the player then works correctly: (...)

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)

  • Final creation of the channel

    12 March 2010, by

    Once your request has been approved, you can proceed with the actual creation of the channel. Each channel is a full-fledged site placed under your responsibility. The platform administrators have no access to it.
    Upon approval, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point you are asked for a password; all you have to do is (...)

On other sites (11254)

  • Flutter : FFmpeg Not Executing Filter

    5 November 2022, by Dennis Ashford

    I am downloading a video from Firebase and then trying to apply a watermark to that video that will then be saved in a temporary directory in the cache. I am using the ffmpeg_kit_flutter package to do this. There is very little online about how this should work in Flutter.

    


    The video and image are loaded and stored in the cache properly. However, the FFmpegKit.execute call is not working and does not create a new output file in the cache. Any suggestions on how to make that function execute? Are the FFmpeg commands correct?

    


    Future<String> waterMarkVideo(String videoPath, String watermarkPath) async {
      //these calls are to load the video into temporary directory
      final response = await http.get(Uri.parse(videoPath));
      final originalVideo = File('${(await getTemporaryDirectory()).path}/video.mp4');
      await originalVideo.create(recursive: true);
      await originalVideo.writeAsBytes(response.bodyBytes);
      print('video path' + originalVideo.path);

      //this grabs the watermark image from assets and decodes it
      final byteData = await rootBundle.load(watermarkPath);
      final watermark = File('${(await getTemporaryDirectory()).path}/image.png');
      await watermark.create(recursive: true);
      await watermark.writeAsBytes(byteData.buffer.asUint8List(byteData.offsetInBytes, byteData.lengthInBytes));
      print('watermark path' + watermark.path);

      //this creates temporary directory for new watermarked video
      var tempDir = await getTemporaryDirectory();
      final newVideoPath = '${tempDir.path}/${DateTime.now().microsecondsSinceEpoch}result.mp4';

      //and now attempting to work with ffmpeg package to overlay watermark on video
      await FFmpegKit.execute("-i $originalVideo -i $watermark -filter_complex 'overlay[out]' -map '[out]' $newVideoPath")
          .then((rc) => print('FFmpeg process exited with rc $rc'));
      print('new video path' + newVideoPath);

      return newVideoPath;
    }


    The logs show the file paths and also the following:


    FFmpegKitFlutterPlugin 0x600000000fe0 started listening to events on 0x600001a4b280.
    flutter: Loaded ffmpeg-kit-flutter-ios-https-x86_64-4.5.1.
    flutter: FFmpeg process exited with rc Instance of 'FFmpegSession'
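    Two details in the call above stand out (both assumptions on my part, since the asker's project can't be run here): interpolating $originalVideo and $watermark embeds Dart File objects rather than plain paths, so .path is probably wanted; and FFmpegKit.execute resolves to an FFmpegSession object, which is why the log prints Instance of 'FFmpegSession' instead of a return code. At the command-line level, the invocation the Dart string is trying to build would look roughly like this sketch (paths are placeholders):

    ```shell
    # Rough shape of the intended ffmpeg command: overlay the second input
    # (the watermark image) on the first (the video), then map the labeled
    # output stream to the result file. Paths are placeholders.
    ffmpeg -i /tmp/video.mp4 -i /tmp/image.png \
      -filter_complex 'overlay[out]' -map '[out]' \
      /tmp/result.mp4
    ```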


  • JW Player fails with WMA files: Task Queue failed at step 5

    17 March 2018, by Sabeena

    I have a JW Player that plays MP3 files, but with WMA files it gives the error:

    Task Queue failed at step 5: Playlist could not be loaded: Playlist file did not contain a valid playlist

    I thought of two reasons:

    1. There is no support for WMA, but please confirm this.
    2. I need to set the file type somewhere in the player's configuration.

    If WMA is not supported in JW Player, how can I play both WMA and MP3 files on my website?

    Is ffmpeg needed to convert WMA to MP3 while uploading?
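    On the conversion question: browsers have no native WMA playback, so transcoding to MP3 at upload time is a common approach, and ffmpeg can do it. A minimal sketch (filenames are placeholders; assumes an ffmpeg build with the libmp3lame encoder):

    ```shell
    # Transcode WMA to MP3; -qscale:a 2 selects high-quality VBR output.
    ffmpeg -i input.wma -codec:a libmp3lame -qscale:a 2 output.mp3
    ```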

  • How can I correctly provide a mock webcam video to Chrome ?

    15 December 2022, par doppelgreener

    I'm trying to run end-to-end testing in Chrome for a product that requires a webcam feed halfway through to operate. From what I understand this means providing a fake webcam video to Chrome using the --use-file-for-fake-video-capture="/path/to/video.y4m" command line argument. It will then use that as a webcam video.


    However, no matter what y4m file I provide, I get the following error from Chrome running under these conditions:


    DOMException: Could not start video source
    {
      code: 0,
      message: "Could not start video source",
      name: "NotReadableError"
    }


    Notably I can provide an audio file just fine using --use-file-for-fake-audio-capture and Chrome will work with it well. The video has been my sticking point.


    This error comes out of the following straightforward mediaDevices request:


    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then(data => {
        // do stuff
      })
      .catch(err => {
        // oh no!
      });


    (This always hits the “oh no!” branch when a video file is provided.)


    What I've tried so far


    I've been running Chrome with the following command-line arguments (newlines added for readability); I'm on a Mac, hence the open command:


    open -a "Google Chrome" --args
      --disable-gpu
      --use-fake-device-for-media-stream
      --use-file-for-fake-video-capture="~/Documents/mock/webcam.y4m"
      --use-file-for-fake-audio-capture="~/Documents/mock/microphone.wav"


    webcam.y4m and microphone.wav were generated from a video file I recorded.


    I first recorded a twenty-second mp4 video using my browser's MediaRecorder, downloaded the result, and converted it with the following commands:


    ffmpeg -y -i original.mp4 -f wav -vn microphone.wav
    ffmpeg -y -i original.mp4 webcam.y4m


    When this didn't work, I tried the same with a twenty-second movie file recorded in QuickTime:


    ffmpeg -y -i original.mov -f wav -vn microphone.wav
    ffmpeg -y -i original.mov webcam.y4m


    When that also failed, I went straight to the Chromium file that explains fake video capture, went to the example y4m file list it provides, downloaded the grandma file, and passed that to Chrome instead:


    open -a "Google Chrome" --args
      --disable-gpu
      --use-fake-device-for-media-stream
      --use-file-for-fake-video-capture="~/Documents/mock/grandma_qcif.y4m"
      --use-file-for-fake-audio-capture="~/Documents/mock/microphone.wav"


    Chrome provides me with the exact same error in all of these situations.


    The only time Chrome doesn't error out on that mediaDevices request is when I omit the video completely:


    open -a "Google Chrome" --args
      --disable-gpu
      --use-fake-device-for-media-stream
      --use-file-for-fake-audio-capture="~/Documents/mock/microphone.wav"


    Accounting for C420mpeg2


    TestRTC suggests Chrome will “crash” if given a C420mpeg2 file, and recommends that simply replacing the metadata fixes the issue. Indeed, the video file I generate with ffmpeg has the following header:


    YUV4MPEG2 W1280 H720 F30:1 Ip A1:1 C420mpeg2 XYSCSS=420MPEG2


    Chrome doesn't actually crash when run with this file; I just get the error above. But if I edit the video file to use the following header, per TestRTC's recommendation, the situation is the same:


    YUV4MPEG2 W1280 H720 F30:1 Ip A1:1 C420 XYSCSS=420MPEG2


    The video file still gives me the above error in these conditions.
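    For reference, the header substitution described above can be scripted. This sketch fabricates a file containing only the header line ffmpeg emits, so it is self-contained; a real y4m file would carry frame data after the header. It assumes GNU sed (on macOS, use sed -i '' instead of sed -i):

    ```shell
    # Write the header ffmpeg produced (stand-in for the real webcam.y4m).
    printf 'YUV4MPEG2 W1280 H720 F30:1 Ip A1:1 C420mpeg2 XYSCSS=420MPEG2\n' > webcam.y4m
    # Replace the colorspace tag Chrome reportedly rejects.
    sed -i 's/C420mpeg2/C420/' webcam.y4m
    # The header now reads: YUV4MPEG2 W1280 H720 F30:1 Ip A1:1 C420 XYSCSS=420MPEG2
    head -1 webcam.y4m
    ```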


    What can/should I do?


    How should I be providing a video file to Chrome for this command-line argument?


    How should I be recording or creating the video file?


    How should I convert it to y4m?
