
Media (91)

Other articles (69)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Possibility of deployment as a farm

    12 April 2011

    MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
    This makes it possible, for example: to share the implementation costs between several projects / individuals; to rapidly deploy a multitude of unique sites; to avoid having to put all creations into a digital catch-all, as is the case with the big general-public platforms scattered across the (...)

On other sites (13625)

  • Audio to Video Conversion using FFmpeg in React

    6 October 2024, by Yuri

    Upon running my code, the audio gets converted into a video and saved in the bucket; however, the resulting video is just 9 bytes.

    


    Here's my utility.

    


import { FFmpeg } from '@ffmpeg/ffmpeg';
import { fetchFile } from '@ffmpeg/util';
import { ref, uploadBytes, getDownloadURL } from 'firebase/storage';
import { storage } from '../firebase';

export async function generateAndStoreVideo(audioPath, imagePath, userId) {
  try {
    // In @ffmpeg/ffmpeg 0.12+ the constructor takes no options;
    // logging is attached via ffmpeg.on('log', ...) rather than { log: true }.
    const ffmpeg = new FFmpeg();
    await ffmpeg.load();

    console.log('Fetching audio and image...');
    // writeFile is async in the new API and must be awaited.
    await ffmpeg.writeFile('input.mp3', await fetchFile(audioPath));
    await ffmpeg.writeFile('input.jpg', await fetchFile(imagePath));

    console.log('Running FFmpeg...');
    // The input names must match the files written above; referencing
    // 'sound.mp3' / 'image.png' here instead is what produces the FS error
    // below and the empty output file.
    await ffmpeg.exec([
      '-loop', '1',        // repeat the still image for the whole duration
      '-i', 'input.jpg',
      '-i', 'input.mp3',
      '-c:v', 'libx264',
      '-pix_fmt', 'yuv420p',
      '-c:a', 'aac',
      '-shortest',         // end the video when the audio ends
      'output.mp4',
    ]);

    console.log('FFmpeg completed!');

    // readFile is async as well.
    const data = await ffmpeg.readFile('output.mp4');
    const videoBlob = new Blob([data], { type: 'video/mp4' });

    console.log('Uploading video to Firebase...');
    const videoRef = ref(storage, `videos/${userId}/${Date.now()}_generated_video.mp4`);
    const uploadSnapshot = await uploadBytes(videoRef, videoBlob);

    const videoURL = await getDownloadURL(uploadSnapshot.ref);

    console.log('Video uploaded successfully! URL:', videoURL);

    return videoURL;
  } catch (error) {
    console.error('Error generating or uploading video:', error);
    throw new Error('Video generation or upload failed.');
  }
}



    


    After uploading, I also got this error:

    


    ErrnoError: FS error
    at handleError (http://localhost:3000/static/js/bundle.js:98427:58)
    at http://localhost:3000/static/js/bundle.js:98450:7


    


    I've made sure I'm using the new FFmpeg version and syntax, but the issues persist.
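
    A minimal debugging sketch, assuming @ffmpeg/ffmpeg 0.12+ (where the constructor no longer accepts { log: true }): attaching a log listener surfaces FFmpeg's own messages, which should name the missing input file behind the FS error above.

    import { FFmpeg } from '@ffmpeg/ffmpeg';

    const ffmpeg = new FFmpeg();
    // Print FFmpeg's internal log lines to the browser console.
    ffmpeg.on('log', ({ type, message }) => {
      console.log(`[ffmpeg:${type}] ${message}`);
    });
    await ffmpeg.load();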

    


  • How to make MediaRecorder API individual chunks playable by themselves

    10 August 2024, by tedGuy

    I'm trying to send individual chunks to the server instead of sending all the chunks at once. This way, FFmpeg on my Rails server can convert these chunks to HLS and upload them to S3 to stream the video instantly. However, I've run into an issue: the MediaRecorder only produces a playable chunk for the first segment; after that, the chunks are not playable on their own and need to be concatenated to play.
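
    A minimal sketch of the behavior described above, using the MediaRecorder timeslice API (names are illustrative): only the first blob carries the container header, so later blobs play only when concatenated after it.

    // stream: a MediaStream such as the outputStream built in the code below.
    const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
    const chunks: Blob[] = [];
    recorder.ondataavailable = (event) => {
      if (event.data.size > 0) chunks.push(event.data);
    };
    recorder.start(3000); // a blob every ~3s, but not independently playable
    // Later: new Blob(chunks) plays end-to-end; chunks[1] alone does not.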

    


    To avoid this, I've taken a different approach: I start a new MediaRecorder every 3 seconds so that I get a playable chunk each time. However, this approach has its own issue: the video glitches a bit due to the delay when I stop and start a new MediaRecorder. Is there a cleaner way to achieve this? This is my current status. Please help!

    


  const startVideoRecording = async (
    screenStream: MediaStream,
    audioStream: MediaStream
  ) => {
    setStartingRecording(true);

    try {
      const res = await getVideoId();
      videoId.current = res;
    } catch (error) {
      console.log(error);
      return;
    }

    const outputStream = new MediaStream();
    outputStream.addTrack(screenStream.getVideoTracks()[0]);
    outputStream.addTrack(audioStream.getAudioTracks()[0]); // Add audio track

    const mimeTypes = [
      "video/webm;codecs=h264",
      "video/webm;codecs=vp9",
      "video/webm;codecs=vp8",
      "video/webm",
      "video/mp4",
    ];

    let selectedMimeType = "";
    for (const mimeType of mimeTypes) {
      if (MediaRecorder.isTypeSupported(mimeType)) {
        selectedMimeType = mimeType;
        break;
      }
    }

    if (!selectedMimeType) {
      console.error("No supported mime type found");
      return;
    }

    const videoRecorderOptions = {
      mimeType: selectedMimeType,
    };

    let chunkIndex = 0;

    const startNewRecording = () => {
      // Stop the current recorder if it's running
      if (
        videoRecorderRef.current &&
        videoRecorderRef.current.state === "recording"
      ) {
        videoRecorderRef.current.stop();
      }

      // Create a new MediaRecorder instance
      const newVideoRecorder = new MediaRecorder(
        outputStream,
        videoRecorderOptions
      );
      videoRecorderRef.current = newVideoRecorder;

      newVideoRecorder.ondataavailable = async (event) => {
        if (event.data.size > 0) {
          chunkIndex++;
          totalSegments.current++;
          const segmentIndex = totalSegments.current;
          console.log(event.data);
          handleUpload({ segmentIndex, chunk: event.data });
        }
      };

      // Start recording; the setInterval below restarts it every 3 seconds
      newVideoRecorder.start();
    };

    // Start the first recording
    startNewRecording();

    // Set up an interval to restart the recording every 3 seconds
    recordingIntervalIdRef.current = setInterval(() => {
      startNewRecording();
    }, 3000);

    setIsRecording(true);
    setStartingRecording(false);
  };
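
    A hedged alternative to the restart loop above (illustrative, not a guaranteed fix): keep one recorder running with a timeslice, cache the first chunk (it carries the WebM init segment), and prepend it to every later chunk before upload, so the server-side FFmpeg can attempt to decode each upload on its own. Caveat: later clusters are not guaranteed to start on a keyframe, so whether each piece decodes cleanly depends on the browser and codec.

    let initSegment: Blob | null = null;

    const recorder = new MediaRecorder(outputStream, videoRecorderOptions);
    recorder.ondataavailable = (event) => {
      if (event.data.size === 0) return;
      totalSegments.current++;
      // First chunk = header + first cluster; prepend it to every later one.
      const chunk = initSegment
        ? new Blob([initSegment, event.data], { type: event.data.type })
        : event.data;
      if (!initSegment) initSegment = event.data;
      handleUpload({ segmentIndex: totalSegments.current, chunk });
    };
    recorder.start(3000); // one dataavailable event roughly every 3 seconds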


    


  • Live stream doesn't seem to be passed to AWS correctly

    12 August 2024, by NoobAmI

    I'm trying to stream my live video into Amazon IVS and I don't see it on the live channels.

    


    Is it possible I have a mistake in my FFmpeg configuration?
    I'm expecting to see the stream at my playback URL or on the console's playback screen, but I see nothing at the moment.

    


    As I understand it, shouldn't I see some kind of playback in the live channels if a stream is being sent to that channel?

    


    async sendDataToIvs(channelArn: string, payload: any): Promise<void> {
      const injestServer = '***.global-contribute.live-video.net';
      const streamKey = 'sk_us-east-1_*****';

      const ffmpeg = spawn('ffmpeg', [
        '-re', // Read input at native frame rate
        '-i', '-', // Input from stdin (live stream data)
        '-r', '30', // Frame rate
        '-c:v', 'libx264', // Video codec - H.264
        '-pix_fmt', 'yuv420p', // Pixel format
        '-profile:v', 'main', // H.264 profile
        '-preset', 'veryfast', // Encoder quality setting
        '-x264opts', 'nal-hrd=cbr:no-scenecut', // Additional x264 options
        '-minrate', '3000', // Minimum bitrate
        '-maxrate', '3000', // Maximum bitrate
        '-g', '60', // GOP size
        '-c:a', 'aac', // Audio codec
        '-b:a', '160k', // Audio bitrate
        '-ac', '2', // Audio channels
        '-ar', '44100', // Audio sample rate
        '-f', 'flv', // Output format
        `rtmps://${injestServer}:443/app/${streamKey}` // Output destination
      ]);

      ffmpeg.stdin.write(payload, (err) => {
        console.log(payload)
        if (err) console.error('Error writing payload to FFmpeg stdin:', err);
      });

      ffmpeg.on('close', (code) => {
        console.log(`FFmpeg process exited with code ${code}`);
      });

      ffmpeg.stdin.on('error', (err) => {
        console.error('Error writing to FFmpeg stdin:', err);
      });

      ffmpeg.stderr.on('data', (data) => {
        console.error(`FFmpeg error: ${data}`);
      });
    }
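
    A hedged observation, with an illustrative sketch: as written, every call to sendDataToIvs spawns a fresh ffmpeg process that receives a single payload on stdin, so IVS never sees one continuous stream. One possible shape (the names and the reduced flag set below are assumptions, not the poster's API) is a single long-lived process that all chunks are written into:

    import { spawn, ChildProcessWithoutNullStreams } from 'child_process';

    let ffmpeg: ChildProcessWithoutNullStreams | null = null;

    // Lazily start one ffmpeg process for the lifetime of the stream.
    function getFfmpeg(ingestServer: string, streamKey: string) {
      if (!ffmpeg) {
        ffmpeg = spawn('ffmpeg', [
          '-i', '-', // keep reading stdin until the stream ends
          '-c:v', 'libx264', '-pix_fmt', 'yuv420p',
          '-c:a', 'aac', '-f', 'flv',
          `rtmps://${ingestServer}:443/app/${streamKey}`,
        ]);
        ffmpeg.on('close', () => { ffmpeg = null; });
      }
      return ffmpeg;
    }

    // Every incoming chunk goes to the same stdin instead of a new process.
    function sendChunk(chunk: Buffer, ingestServer: string, streamKey: string) {
      getFfmpeg(ingestServer, streamKey).stdin.write(chunk);
    }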
