Other articles (56)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to check.

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media": a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only a single document can be linked to a given "media" article;

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

On other sites (9289)

  • Real Time Audio and Video Streaming in C#

    16 November 2014, by Nuwan

    I am developing an application that streams audio and video in real time.
    It can stream in two different ways: I use a capture card to capture a live
    HD stream and resend it, and I also need to stream a local video file in real time.

    Currently I capture video using OpenCV and store the frames as bitmaps in a BlockingCollection queue.
    I then encode the video frames using ffmpeg (via the C# library NReco) and store the encoded data in another queue. Then I send that encoded data over plain UDP (I did not use RTP/RTSP) to omxplayer on a Raspberry Pi, and it works very well.
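
    For reference, a rough sketch of that UDP send step is below; the queue name encodeQueue, the multicast address and the port are illustrative placeholders rather than my exact code:

        using System.Net;
        using System.Net.Sockets;
        using System.Collections.Concurrent;

        class UdpVideoSender
        {
            // Sketch: drain encoded chunks from the queue and push each one
            // as a single UDP datagram to the multicast group.
            public static void Send(BlockingCollection<byte[]> encodeQueue)
            {
                using (var client = new UdpClient())
                {
                    var endpoint = new IPEndPoint(IPAddress.Parse("224.1.1.1"), 1250);
                    foreach (var packet in encodeQueue.GetConsumingEnumerable())
                    {
                        // Each chunk must fit into one UDP datagram (< 64 KB).
                        client.Send(packet, packet.Length, endpoint);
                    }
                }
            }
        }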

    Then I capture audio data using ffmpeg.
    I use the following code to capture and encode the audio data:

                    ffMpegTask = ffmpegConverter.ConvertLiveMedia(
                           fileName,        // input: source file
                           null,            // input format: auto-detect
                           ms,              // output: in-memory stream
                           Format.avi,
                           new ConvertSettings()
                           {
                               // audio only (-vn): seek to `second`, take `endTime`, encode as AAC
                               CustomOutputArgs = " -tune zerolatency -ss " + second + " -t " + endTime + " -strict experimental -acodec aac -ab 160k -ac 2 -ar 44100 -vn ",
                           });
                    ffMpegTask.Start();
                    ffMpegTask.Stop();            // end the task so the output stream is complete
                    byte[] data = ms.ToArray();   // grab the encoded audio bytes

    After that I save each encoded audio packet to a queue.

    I then streamed the separate audio and video data to the Raspberry Pi on two
    different ports and received the streams with two omxplayer instances. That also works fine.

    But what I need to do is multiplex the audio and video and send them as one stream.
    What I do first is publish the two streams as udp://224.1.1.1:1250 (video) and udp://224.1.1.1:1260 (audio),
    and then I use NReco's Invoke method, which executes raw ffmpeg commands:

    " -re -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260 -c copy -f avi udp://224.1.1.1:1270"

    This works for both the audio and the video stream, but they are completely out of sync.
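
    In code, that mux step looks roughly like the sketch below; I am assuming a default FFMpegConverter here, and Invoke simply passes the argument string through to ffmpeg:

        using NReco.VideoConverter;

        var muxer = new FFMpegConverter();
        // Read both multicast streams, copy the codecs unchanged, and remux
        // them into a single AVI stream on port 1270. Invoke blocks until
        // ffmpeg exits, so it is typically run on its own thread.
        muxer.Invoke("-re -i udp://224.1.1.1:1250 -i udp://224.1.1.1:1260" +
                     " -c copy -f avi udp://224.1.1.1:1270");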

    The next thing I tried was creating another ffmpeg ConvertLiveMedia task and writing the audio
    and video data to that task using its Write method. I stream the muxed data and receive it with ffplay.
    It plays the stream and the sync problem is solved, but sometimes audio and video frames are dropped,
    after which playback drifts out of sync again.

                   combine = new MemoryStream();
                   ffMpegTaskcom = ffmpegConvertercom.ConvertLiveMedia(
                           Format.mpeg,     // format of the data written via Write()
                           combine,         // output: muxed stream
                           Format.avi,
                           new ConvertSettings()
                           {
                               CustomInputArgs = " ",
                               // x264 video tuned for low latency, plus AAC audio
                               CustomOutputArgs = " -threads 7 -c:v libx264 -preset ultrafast -tune zerolatency -strict experimental -profile:v baseline -level 3.0 -movflags +faststart -pix_fmt yuv420p -g 250 -crf 22 -b:v 4000k -minrate 3000k -maxrate 5000k -acodec aac -ab 160k -ac 2 -ar 44100",
                           });
                   ffMpegTaskcom.Start();

                   // pull one encoded video packet and one encoded audio packet
                   byte[] streamBytesvi = null;
                   byte[] streamBytesau = null;
                   encodeQueue.TryDequeue(out streamBytesvi);
                   encodeQueueau.TryDequeue(out streamBytesau);
                   ffMpegTaskcom.Write(streamBytesvi, 0, streamBytesvi.Length);
                   ffMpegTaskcom.Write(streamBytesau, 0, streamBytesau.Length);

                   //ffMpegTaskcom.Wait();
                   ffMpegTaskcom.Stop();

    Now I need to know a good method to deliver the audio and video data in sync.
    Please tell me what I have done wrong, or suggest a better way to do this.

    Thank you!

  • How can I specify a range of frames in the ffmpeg shuffleframes filter? [closed]

    26 July 2022, by liphone

    I need to duplicate a stretch of several seconds in a video, and according to the official docs you can only pass explicit frame indices to the shuffleframes filter, like so:

    ffmpeg -i INPUT -vf "shuffleframes=0 1 2 3 4 5 6 7 ... 592 593 594 595 596 597 598 599 600" OUTPUT
    Is there any way to pass the indices as a range or an array? Or is there another method of shuffling frames in ffmpeg?
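
    One workaround is to generate the index list programmatically and splice it into the command; a minimal sketch (the frame range and the file names are placeholders):

        using System;
        using System.Linq;

        class ShuffleFramesArg
        {
            static void Main()
            {
                // Build the mapping "0 1 2 ... 600" without typing every index.
                string mapping = string.Join(" ", Enumerable.Range(0, 601));
                Console.WriteLine("ffmpeg -i INPUT -vf \"shuffleframes=" + mapping + "\" OUTPUT");
            }
        }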

  • How to load a custom Java module into Wowza Streaming Engine?

    27 October 2018, by kw3rti

    I’ve followed the tutorial below step by step; however, the module I’ve created does not appear to load or execute, as I’m not seeing any log entries from the getLogger calls in Wowza Streaming Engine. More specifically, I created a new Wowza project containing a new module (see the code below). Eclipse then created a jar file in the lib folder of the install directory. I added the module to a live application on the streaming server, and I also edited the Application.xml file to include the new module.

    To exercise the module I’ve written, I am streaming an mp4 file with ffmpeg (following the documentation here) to the streaming engine via the live application, and I can see the stream in the test players. My understanding was that this would trigger at least one of the event listeners in the module. However, nothing appears in the logs; the only entries related to the stream that I can see are shown in the screenshots below.
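
    For context, the publish command looks roughly like this (a sketch; the file, host, application and stream names are placeholders rather than my exact values):

        ffmpeg -re -i sample.mp4 -c copy -f flv rtmp://localhost:1935/live/myStream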

    I’ve been trying to debug what’s going wrong for quite a while now, so I’d appreciate any suggestions of what might fix the issue.

    https://www.wowza.com/docs/How-to-extend-Wowza-Streaming-Engine-using-Java

    import com.wowza.wms.amf.AMFDataList;
    import com.wowza.wms.application.IApplicationInstance;
    import com.wowza.wms.client.IClient;
    import com.wowza.wms.module.ModuleBase;
    import com.wowza.wms.request.RequestFunction;
    import com.wowza.wms.stream.IMediaStream;

    public class GCStreamModule extends ModuleBase {

        public void onAppStart(IApplicationInstance appInstance) {
            String fullname = appInstance.getApplication().getName() + "/" + appInstance.getName();
            getLogger().info("onAppStart: " + fullname);
        }

        public void onAppStop(IApplicationInstance appInstance) {
            String fullname = appInstance.getApplication().getName() + "/" + appInstance.getName();
            getLogger().info("onAppStop: " + fullname);
        }

        public void onConnect(IClient client, RequestFunction function, AMFDataList params) {
            getLogger().info("onConnect: " + client.getClientId());
        }

        public void onConnectAccept(IClient client) {
            getLogger().info("onConnectAccept: " + client.getClientId());
        }

        public void onConnectReject(IClient client) {
            getLogger().info("onConnectReject: " + client.getClientId());
        }

        public void onDisconnect(IClient client) {
            getLogger().info("onDisconnect: " + client.getClientId());
        }

        public void onStreamCreate(IMediaStream stream) {
            getLogger().info("onStreamCreate");
        }

        public void onMediaStreamCreate(IMediaStream stream) {
            getLogger().info("onMediaStreamCreate: " + stream.getSrc());
        }

    }
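
    For completeness, the module entry in Application.xml should look something like the following (a sketch; the package name com.mycompany.wowza is a placeholder and must match the fully qualified name of the compiled class):

        <Modules>
            ...
            <Module>
                <Name>GCStreamModule</Name>
                <Description>Logs application and stream lifecycle events</Description>
                <!-- placeholder: use the fully qualified class name of your module -->
                <Class>com.mycompany.wowza.GCStreamModule</Class>
            </Module>
        </Modules>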

    Screenshot1
    Screenshot2