
Media (1)

Keyword: Tags / iphone

Other articles (80)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct steps.
    Upload and retrieval of information about the source video
    First, a SPIP article has to be created and the "source" video document attached to it.
    When this document is attached to the article, two actions beyond the normal behaviour are executed: retrieval of the technical information about the file's audio and video streams; the generation of a thumbnail: extraction of a (...)
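    SPIPMotion's own code is not shown here, but the two extra actions described above are typically done with ffprobe and ffmpeg; below is a minimal Go sketch of both steps (the file names and the one-second seek offset are illustrative assumptions, not SPIPMotion's actual values):

package main

import (
    "fmt"
    "log"
    "os/exec"
)

func main() {
    src := "source.webm" // hypothetical freshly uploaded document

    // 1. Retrieve the technical information about the audio and video streams.
    probe, err := exec.Command("ffprobe",
        "-v", "error",
        "-show_format", "-show_streams",
        "-print_format", "json",
        src).Output()
    if err != nil {
        log.Fatalf("ffprobe failed: %v", err)
    }
    fmt.Printf("stream metadata: %s\n", probe)

    // 2. Generate a thumbnail by extracting a single frame, here one second in.
    if err := exec.Command("ffmpeg",
        "-ss", "1", "-i", src,
        "-frames:v", "1",
        "thumbnail.jpg").Run(); err != nil {
        log.Fatalf("thumbnail extraction failed: %v", err)
    }
}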

On other sites (4544)

  • Socket.io client in JS and server in Go doesn't send connected message and data

    24 March 2023, by OmriHalifa

    I am using ffmpeg and socket.io and I am running into some issues. I'm trying to send a connection request from React to a server written in Go, but I'm unable to connect to it. I tried registering the events in useEffect and it still doesn't work. What should I do? I'm attaching my code in JS and in Go:
main.go

    


package main

import (
    "log"

    "github.com/gin-gonic/gin"

    socketio "github.com/googollee/go-socket.io"
)

func main() {
    router := gin.New()

    server := socketio.NewServer(nil)

    server.OnConnect("/", func(s socketio.Conn) error {
        s.SetContext("")
        log.Println("connected:", s.ID())
        return nil
    })

    server.OnEvent("/", "notice", func(s socketio.Conn, msg string) {
        log.Println("notice:", msg)
        s.Emit("reply", "have "+msg)
    })

    server.OnEvent("/", "transcoded-video", func(s socketio.Conn, data string) {
        log.Println("transcoded-video:", data)
    })

    server.OnEvent("/", "bye", func(s socketio.Conn) string {
        last := s.Context().(string)
        s.Emit("bye", last)
        s.Close()
        return last
    })

    server.OnError("/", func(s socketio.Conn, e error) {
        log.Println("meet error:", e)
    })

    server.OnDisconnect("/", func(s socketio.Conn, reason string) {
        log.Println("closed", reason)
    })

    go func() {
        if err := server.Serve(); err != nil {
            log.Fatalf("socketio listen error: %s\n", err)
        }
    }()
    defer server.Close()

    if err := router.Run(":8000"); err != nil {
        log.Fatal("failed run app: ", err)
    }
}



    


    App.js

    


import './App.css';
import { useEffect } from 'react';
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';
import { io } from 'socket.io-client';

function App() {
  const socket = io("http://localhost:8000", function() {
    // Send a message to the server when the client is connected
    socket.emit('clientConnected', 'Client has connected to the server!');
  })

  const ffmpegWorker = createFFmpeg({
    log: true
  })

  // Initialize FFmpeg when the component is mounted
  async function initFFmpeg() {
    await ffmpegWorker.load();
  }

  async function transcode(webcamData) {
    const name = 'record.webm';
    await ffmpegWorker.FS('writeFile', name, await fetchFile(webcamData));
    await ffmpegWorker.run('-i', name, '-preset', 'ultrafast', '-threads', '4', 'output.mp4');
    const data = ffmpegWorker.FS('readFile', 'output.mp4');

    // Set the source of the output video element to the transcoded video data
    const video = document.getElementById('output-video');
    video.src = URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' }));

    // Remove the output.mp4 file from the FFmpeg virtual file system
    ffmpegWorker.FS('unlink', 'output.mp4');

    // Emit a "transcoded-video" event to the server with the transcoded video data
    socket.emit("transcoded-video", data.buffer)
  }

  let mediaRecorder;
  let chunks = [];

  // Request access to the user's camera and microphone and start recording
  function requestMedia() {
    const webcam = document.getElementById('webcam');
    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then(async (stream) => {
      webcam.srcObject = stream;
      await webcam.play();

      // Set up a MediaRecorder instance to record the video and audio
      mediaRecorder = new MediaRecorder(stream);

      // Add the recorded data to the chunks array
      mediaRecorder.ondataavailable = async (e) => {
        chunks.push(e.data);
      }

      // Transcode the recorded video data after the MediaRecorder stops
      mediaRecorder.onstop = async () => {
        await transcode(new Uint8Array(await (new Blob(chunks)).arrayBuffer()));

        // Clear the chunks array after transcoding
        chunks = [];

        // Start the MediaRecorder again after a 0 millisecond delay
        setTimeout(() => {
          mediaRecorder.start();

          // Stop the MediaRecorder again after 500 milliseconds
          setTimeout(() => {
            mediaRecorder.stop();
          }, 500);
        }, 0);
      }

      // Start the MediaRecorder
      mediaRecorder.start();

      // Stop the MediaRecorder after 700 milliseconds
      setTimeout(() => {
        mediaRecorder.stop();
      }, 700);
    })
  }

  useEffect(() => {
    // Set up event listeners for the socket connection
    socket.on('/', function(){
      // Log a message when the client is connected to the server
      console.log("Connected to server!");
    });

    socket.on('transcoded-video', function(data){
      // Log the received data for debugging purposes
      console.log("Received transcoded video data:", data);
    });

    socket.on('notice', function(data){
      // Emit a "notice" event back to the server to acknowledge the received data
      socket.emit("notice", "ping server!");
    });

    socket.on('bye', function(data){
      // Log the received data and disconnect from the server
      console.log("Server sent:", data);
      socket.disconnect();
    });

    socket.on('disconnect', function(){
      // Log a message when the client is disconnected from the server
      console.log("Disconnected from server!");
    });
  }, [])

  return (
    <div className="App">
      <div>
          <video id="webcam" muted={true}></video>
          <video id="output-video" autoPlay></video>
      </div>
      <button onClick={requestMedia}>start streaming</button>
    </div>
  );
}

export default App;


    What can I do to fix it? Thank you!
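    One likely culprit (an assumption based only on the snippets above): main.go creates the socket.io server but never mounts it on the gin router, so the browser's requests to /socket.io/ have nothing to reach. A minimal sketch of that wiring, reusing the same router and server setup:

package main

import (
    "log"

    "github.com/gin-gonic/gin"

    socketio "github.com/googollee/go-socket.io"
)

func main() {
    router := gin.New()
    server := socketio.NewServer(nil)

    server.OnConnect("/", func(s socketio.Conn) error {
        log.Println("connected:", s.ID())
        return nil
    })

    go server.Serve()
    defer server.Close()

    // The missing piece: *socketio.Server implements http.Handler, so it can be
    // exposed through gin with WrapH; socket.io clients talk to /socket.io/.
    router.GET("/socket.io/*any", gin.WrapH(server))
    router.POST("/socket.io/*any", gin.WrapH(server))

    if err := router.Run(":8000"); err != nil {
        log.Fatal("failed run app: ", err)
    }
}

    Two smaller things worth checking on the client side: socket.io fires a connect event rather than '/', so socket.on('/', ...) never runs, and if the React dev server is on a different origin than :8000, the browser will also need CORS to be allowed.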


  • Is there a way to send ffmpeg data directly to a TCP client?

    3 February 2021, by KosmosisDire

    I am using ffmpeg to send a screen capture directly to a C# TCP server. However, due to some bug or technical limitation, the Quest 2 (my build device) cannot bind ports. (There is very little information about this bug online, but as far as I can tell I can't get around it.) So I cannot have a C# server on the Quest; I must have a client on the Quest to receive data. However, ffmpeg sends data as a TCP client, so I need ffmpeg to act as a server.


    I tried sending the data through a server on my computer that then sends the data to the C# client. But I get less than 1 fps with huge lag. Normally, I get a good 30 fps with low latency when sending directly to a server.


    So my question:


    Is there a way to get ffmpeg data directly to a client, or indirectly without increasing the latency?


    Here is the ffmpeg command I am using:


    ffmpeg -f gdigrab -i desktop -pixel_format rgb8 -video_size 896x504 -vf scale=896:504 -framerate 5 -r 30 -f rawvideo tcp://127.0.0.1:846


    I can include the code for my C# client and server attempts, as well as the middleman server, if needed.


    Thanks for any help!

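    One avenue worth noting (an assumption, not tested on the Quest setup described): ffmpeg's tcp protocol can itself listen instead of connect, by appending ?listen=1 to the output URL (for example tcp://0.0.0.0:846?listen=1), which lets the headset side stay a plain TCP client. A rough Go stand-in for such a client, reading the raw frames ffmpeg writes once something connects:

package main

import (
    "io"
    "log"
    "net"
)

func main() {
    // Dial the machine running ffmpeg in listen mode; the address is an assumption.
    conn, err := net.Dial("tcp", "127.0.0.1:846")
    if err != nil {
        log.Fatal(err)
    }
    defer conn.Close()

    // rgb8 rawvideo at 896x504 is one byte per pixel, so one frame is 896*504 bytes.
    frame := make([]byte, 896*504)
    for {
        if _, err := io.ReadFull(conn, frame); err != nil {
            log.Fatal(err)
        }
        log.Printf("received one %d-byte frame", len(frame))
    }
}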

  • Do I need a client to stream data?

    28 November 2017, by Adalcar

    I am writing a web interface for a set of cameras using ffmpeg.

    The issue comes when trying to stream this data:
    after shamelessly ripping this code and replacing the IP with localhost for testing purposes, I tried to read the stream with VLC, to no avail: the server side runs without problems, but the packets seem to disappear into thin air.

    That’s when I open the server with

    avio_open2(&server, "udp://localhost:554", AVIO_FLAG_WRITE, NULL, &options);

    If I use rtsp instead, it has the decency of failing with "unknown protocol" (?)

    and for http, it pops an unknown error.

    Do I need a client request to start streaming?
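    For the UDP case at least, no client has to be connected: avio_open2 on udp://localhost:554 simply fires datagrams at that address whether or not anything is listening, which matches the server side running fine while the packets seem to vanish. A throwaway Go listener (port taken from the question; ports below 1024 may need elevated privileges) can confirm whether the packets actually leave the process:

package main

import (
    "log"
    "net"
)

func main() {
    addr, err := net.ResolveUDPAddr("udp", "127.0.0.1:554")
    if err != nil {
        log.Fatal(err)
    }
    conn, err := net.ListenUDP("udp", addr)
    if err != nil {
        log.Fatal(err)
    }
    defer conn.Close()

    buf := make([]byte, 65536)
    for {
        // Print the size and source of every datagram that arrives.
        n, from, err := conn.ReadFromUDP(buf)
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("got %d bytes from %s", n, from)
    }
}

    If bytes do show up here, the sending side is working and the problem is on the playback side: a player such as VLC generally needs to be told what the stream contains (for example via an SDP file or an rtp:// URL) before it can interpret raw UDP packets.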