
Other articles (71)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects or individuals, rapid deployment of multiple unique sites, and creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Writing a news item

    21 June 2013

    Present changes to your MediaSPIP, or news about your projects, using the news section of your MediaSPIP.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form: for a document of the news type, the default fields are: publication date (customize the publication date) (...)

On other sites (15026)

  • Decoding RIMM streaming file format

    10 September 2011, by Thomas

    I want to decode the video (visual) frames within a Blackberry RIMM file. So far I have a parser, and some corresponding container documentation from RIM. 

    The video codec is H264 and is explicitly set on the device using one of the video.encodings properties. However, FFmpeg is not able to decode the frames, and this is driving me nuts.

    Edit 1: The issue seems to be a lack of SPS and PPS in the frames, and artificially inserting them has proven unsuccessful so far (all-grey image). The Blackberry 9700 sends

    0x00 0x00 0x?? 0x?? 0xType

    where Type is according to table 7-1 in the H264 spec (I and P frames). We believe the 0x?? 0x?? bytes represent the size of the frame; however, the size does not always correspond to the size found by the parser (the parser seems to be working correctly).
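Based on the frame layout described above, one thing worth trying is rewriting each length-prefixed frame into Annex B form before handing it to FFmpeg: replace the 4-byte prefix with a start code and inject SPS/PPS ahead of IDR frames. This is only a sketch under the assumptions stated in the question (16-bit big-endian size, NAL header in the fifth byte); the function name is mine, and the SPS/PPS bytes must come from the actual encoder.

```javascript
// Sketch: convert one length-prefixed frame (0x00 0x00 <size hi> <size lo>
// <NAL header> ...) into Annex B by swapping the prefix for a start code
// and prepending SPS/PPS before IDR frames (NAL type 5, per table 7-1).
const START_CODE = new Uint8Array([0x00, 0x00, 0x00, 0x01]);

function toAnnexB(frame, sps, pps) {
  const size = (frame[2] << 8) | frame[3];  // assumed 16-bit big-endian size
  const nal = frame.subarray(4, 4 + size);  // NAL header + payload
  const nalType = nal[0] & 0x1f;            // 5 = IDR, 1 = non-IDR slice
  const parts = [];
  if (nalType === 5) {
    // Parameter sets must precede the IDR frame for a raw Annex B decoder.
    parts.push(START_CODE, sps, START_CODE, pps);
  }
  parts.push(START_CODE, nal);
  const out = new Uint8Array(parts.reduce((n, p) => n + p.length, 0));
  let off = 0;
  for (const p of parts) { out.set(p, off); off += p.length; }
  return out;
}
```

If the sizes reported in the prefix disagree with the parser, trusting the parser's boundaries and using only the NAL type from the prefix may be the safer choice.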

    I have a Windows decoder codec from BlackBerry, called mc_demux_mp2_ds.ax, and can play some MPEG-4 files captured the same way, but it is a Windows-only binary, and the H264 files will not play either way. I am aware of previous attempts. The capture URL for javax.microedition.media.Manager is

    encoding=video-3gpp_width=176_height=144_video_codec=H264_audio_codec=AAC

    and I am writing to an output stream. Some example files here.
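For reference, the parameter string above can be split into key/value pairs, with the wrinkle that keys themselves contain underscores (video_codec, audio_codec), so a naive split on `_` is not enough. A small sketch (the helper name is mine; the underscore handling is inferred from the string above):

```javascript
// Parse an underscore-separated "key=value" parameter string where keys
// may themselves contain underscores (e.g. video_codec). A segment with
// no '=' is treated as the leading part of the next segment's key.
function parseLocatorParams(params) {
  const result = {};
  let prefix = '';
  for (const seg of params.split('_')) {
    const eq = seg.indexOf('=');
    if (eq >= 0) {
      result[prefix + seg.slice(0, eq)] = seg.slice(eq + 1);
      prefix = '';
    } else {
      prefix += seg + '_';  // key fragment; rejoin with the next segment
    }
  }
  return result;
}
```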

    Edit 2: It turns out that about 3-4 of the 12-15 available video capture modes fail outright and refuse to output data, even in the simplest test applications. So any working solution should implement MPEG-4, H264 and H263, with both AMR and AAC, to provide fallback alternatives when one sound codec and/or resolution fails. Reboots, hangs and the like litter the Blackberry video implementation and vary from firmware to firmware; total suckage.
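The fallback strategy described in Edit 2 boils down to trying capture encodings in order of preference and keeping the first one that actually produces data. A minimal sketch of that loop (the candidate strings follow the locator format shown earlier; the tryCapture callback is a hypothetical stand-in for the platform-specific capture call):

```javascript
// Candidate capture modes, in order of preference. These strings are
// illustrative; the real list should come from the device's supported
// video.encodings properties.
const CANDIDATE_ENCODINGS = [
  'encoding=video-3gpp_width=176_height=144_video_codec=H264_audio_codec=AAC',
  'encoding=video-3gpp_width=176_height=144_video_codec=H264_audio_codec=AMR',
  'encoding=video-3gpp_width=176_height=144_video_codec=MPEG-4_audio_codec=AMR',
  'encoding=video-3gpp_width=176_height=144_video_codec=H263_audio_codec=AMR',
];

// Try each encoding; tryCapture(enc) should return true if the mode
// produced data, and may throw if the mode hangs or errors out.
function firstWorkingEncoding(tryCapture, candidates = CANDIDATE_ENCODINGS) {
  for (const enc of candidates) {
    try {
      if (tryCapture(enc)) return enc;  // this mode produced data
    } catch (e) {
      // mode failed or hung -- fall through to the next candidate
    }
  }
  return null;  // no capture mode worked
}
```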

  • Nothing happens when I click the Create Video button, and it displays the error "failed to load ffmpeg"

    3 August 2024, by shashwat bajpai

    So here is my code, in React with ffmpeg.wasm, which I used to create a simple video editor that just applies an audio track on top of an image and converts it into an MP4.

    


    This is my code:

        import React, { useState, useEffect } from 'react';
        import { FFmpeg } from '@ffmpeg/ffmpeg';
        import { fetchFile, toBlobURL } from '@ffmpeg/util';

        function App() {
          const [videosrc, setvideosrc] = useState('');
          const [imgfile, setimgfile] = useState(null);
          const [audiofile, setaudiofile] = useState(null);
          const [ffmpeg, setFfmpeg] = useState(null);
          const [loading, setLoading] = useState(false);
          const [error, setError] = useState(null);

          useEffect(() => {
            const load = async () => {
              try {
                const ffmpegInstance = new FFmpeg();
                ffmpegInstance.on('log', ({ message }) => console.log(message));
                await ffmpegInstance.load({
                  coreURL: await toBlobURL(`http://localhost:5173/node_modules/@ffmpeg/core/dist/ffmpeg-core.js`, 'text/javascript'),
                  wasmURL: await toBlobURL(`http://localhost:5173/node_modules/@ffmpeg/core/dist/ffmpeg-core.wasm`, 'application/wasm'),
                });
                setFfmpeg(ffmpegInstance);
                console.log('FFmpeg loaded successfully');
              } catch (err) {
                console.error('Failed to load FFmpeg:', err);
                setError('Failed to load FFmpeg');
              }
            };
            load();
          }, []);

          const createVideo = async () => {
            console.log('Create Video button clicked');
            setLoading(true);
            setError(null);

            if (!ffmpeg) {
              console.error('FFmpeg not loaded');
              setError('FFmpeg not loaded');
              setLoading(false);
              return;
            }

            if (!imgfile || !audiofile) {
              console.error('Image or audio file not selected');
              setError('Please select both image and audio files');
              setLoading(false);
              return;
            }

            try {
              console.log('Writing files to MEMFS');
              await ffmpeg.writeFile('image.png', await fetchFile(imgfile));
              await ffmpeg.writeFile('sound.mp3', await fetchFile(audiofile));

              console.log('Running FFmpeg command');
              await ffmpeg.exec([
                "-framerate", "1/10",
                "-i", "image.png",
                "-i", "sound.mp3",
                "-c:v", "libx264",
                "-t", "10",
                "-pix_fmt", "yuv420p",
                "-vf", "scale=1920:1080",
                "test.mp4"
              ]);

              console.log('Reading output file');
              const data = await ffmpeg.readFile('test.mp4');

              console.log('Creating URL for video');
              const url = URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' }));
              setvideosrc(url);
              console.log('Video created successfully');
            } catch (err) {
              console.error('Error creating video:', err);
              setError(`Error creating video: ${err.message}`);
            } finally {
              setLoading(false);
            }
          };

          const handlechangeimage = (e) => {
            setimgfile(e.target.files[0]);
          };

          const handlechangesound = (e) => {
            setaudiofile(e.target.files[0]);
          };

          return (
            <div>
              <h1>Image</h1>
              <input type="file" accept="image/*" onChange={handlechangeimage} />
              <h1>Audio</h1>
              <input type="file" accept="audio/*" onChange={handlechangesound} />
              <button onClick={createVideo} disabled={loading}>
                {loading ? 'Creating Video...' : 'Create Video'}
              </button>
              {error && <p style={{ color: 'red' }}>{error}</p>}
              {videosrc && (
                <div>
                  <h2>Generated Video:</h2>
                  <video controls src={videosrc}></video>
                </div>
              )}
            </div>
          );
        }

        export default App;

    The versions I am using are:

        "@ffmpeg/ffmpeg": "^0.12.7",
        "@ffmpeg/util": "^0.12.1",
        "@ffmpeg/core": "^0.12.3"

    Is it an issue with the version or the Blob URL? How can I run FFmpeg in my system?
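One common cause of "Failed to load FFmpeg" in this setup is that coreURL/wasmURL point at paths the dev server does not actually serve (Vite does not expose node_modules files as plain HTTP URLs). A frequently used alternative is fetching the single-threaded core from a CDN and passing it through toBlobURL, which is the pattern the @ffmpeg/ffmpeg 0.12 examples use. The sketch below is an assumption-laden illustration: the unpkg base URL, the `umd` build path, and the helper name are mine, and the version should be pinned to the installed @ffmpeg/core.

```javascript
// Hypothetical helper: build the core/wasm asset URLs for a given base.
// The default base URL is an assumption -- pin it to the @ffmpeg/core
// version actually installed (check dist/umd vs dist/esm for your build).
function coreAssetUrls(baseURL = 'https://unpkg.com/@ffmpeg/core@0.12.3/dist/umd') {
  return {
    js: `${baseURL}/ffmpeg-core.js`,
    wasm: `${baseURL}/ffmpeg-core.wasm`,
  };
}

// Inside the existing load() effect (FFmpeg and toBlobURL come from the
// component's own imports; this replaces the node_modules URLs):
//
//   const { js, wasm } = coreAssetUrls();
//   await ffmpegInstance.load({
//     coreURL: await toBlobURL(js, 'text/javascript'),
//     wasmURL: await toBlobURL(wasm, 'application/wasm'),
//   });
```

Note also that the rendered markup above shows the inputs and button without onChange/onClick handlers wired up; if they are missing in the real JSX, the button will do nothing regardless of whether FFmpeg loads.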


  • arm: Load mb_y properly in mbtree_propagate_list_internal_neon

    26 December 2016, by Martin Storsjö

    The previous version, which attempted to load two stack parameters at once,
    would only have worked if they were interpreted and loaded as 32-bit
    elements, not when loading them as 16-bit elements.

    • [DH] common/arm/mc-a.S