
Media (91)

Other articles (24)

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether of type: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Videos

    21 April 2011, by

    Like "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 tag.
    One drawback of this tag is that it is not correctly recognised by some browsers (Internet Explorer, to name one) and that each browser natively handles only certain video formats.
    Its main advantage is that it benefits from native video support in browsers, making it possible to do without Flash and (...)

On other sites (4979)

  • How to get a file converted by ffmpeg in Android: permission denied

    2 June 2021, by Omid.N

    I am trying to use FFmpeg in my Android app, so I want to test whether it works before moving on. I use an external library: github link

    The code looks like this:

package net.omidn.aslanmediaconverter;

import android.net.Uri;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;

import androidx.appcompat.app.AppCompatActivity;

import com.arthenica.ffmpegkit.ExecuteCallback;
import com.arthenica.ffmpegkit.FFmpegKit;
import com.arthenica.ffmpegkit.FFmpegSession;
import com.arthenica.ffmpegkit.Session;

import net.bramp.ffmpeg.job.FFmpegJob;

import java.io.BufferedInputStream;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;


public class MainActivity extends AppCompatActivity {
    
    private static final String TAG = "MainActivity";
    FFmpegJob myjob;

    @Override
    protected void onCreate(Bundle savedInstanceState) {


        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        TextView textView = (TextView) findViewById(R.id.text_view);


        FFmpegJob job = null;

        File inFile = new File("/storage/emulated/0/video_2021-05-29_17-50-20.mp4");
        String inputName = Uri.fromFile(inFile).toString();
        Log.d(TAG, inputName);
        Log.d(TAG,"file exists : " + String.valueOf(inFile.exists()));
        Log.d(TAG,"file canRead : " + String.valueOf(inFile.canRead()));

        FFmpegSession fFmpegSession = FFmpegKit.executeAsync("-i file:///storage/emulated/0/video_2021-05-29_17-50-20.mp4 -c:v mpeg4 file:///storage/emulated/0/out.mp4",
                new ExecuteCallback() {
                    @Override
                    public void apply(Session session) {

                    }
                });
        try {
            Thread.sleep(5000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        textView.setText("" + fFmpegSession.getState().name() + "    " + fFmpegSession.getOutput());
    }

}

    As you can see, I give the files with the file:/// protocol. If I don't use it, the result is the same. The three lines of Log.d(...) print:

    2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file:///storage/emulated/0/video_2021-05-29_17-50-20.mp4
2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file exists : true
2021-06-03 00:58:08.869 8376-8376/net.omidn.aslanmediaconverter D/MainActivity: file canRead : false

    The video file has read access on the storage:
    [screenshot: The video file has read access]

  • Live555 Multicast RTSP Server - Display in FFMPEG

    31 May 2021, by BanB

    I made a multicast RTSP server using the Live555 library.

    It is displayed by VLC, but not by VMS (NX Witness) or FFmpeg.

    If I use unicast, I can display it, but I do not know how to change the file that is being streamed in unicast.

    Is there a way to change the streaming file in unicast, or to make it viewable in FFmpeg with multicast?


    This is my code.

    Header

#include <iostream>
#include <string>
#include <filesystem>
#include <fstream>
#include <ctime>
#include 
#include 
#include 
#include <opencv2/videoio.hpp>
#include <opencv2/imgcodecs.hpp>

void Start(const char* streamPath, int port);
void AddDataL(BYTE* data, int length);
void AddDataHW(BYTE* data, int height, int width);

void Play();
void AfterPlaying(void*);

void EncodeToH264(cv::Mat image);
long long GetTimeMs();
int GetFileCount();
const char* GetFirstFilePath();

const int fourcc = cv::VideoWriter::fourcc('x', '2', '6', '4');

UsageEnvironment* env;
H264ServerMediaSubsession* h264;
H264VideoStreamFramer* videoSource;
RTPSink* videoSink;

const char* dataPath = "data/";
const char* extension = ".h264";

    CPP

void Start()
{
    TaskScheduler *scheduler = BasicTaskScheduler::createNew();
    env = BasicUsageEnvironment::createNew(*scheduler);

    RTSPServer *rtspServer = RTSPServer::createNew(*env, 8554);

    if (rtspServer == NULL)
        throw;

    struct sockaddr_storage destinationAddress;
    destinationAddress.ss_family = AF_INET;
    ((struct sockaddr_in &)destinationAddress).sin_addr.s_addr = chooseRandomIPv4SSMAddress(*env);

    const unsigned short rtpPortNum = 18888;
    const unsigned short rtcpPortNum = rtpPortNum + 1;
    const unsigned char ttl = 255;

    const Port rtpPort(rtpPortNum);
    const Port rtcpPort(rtcpPortNum);

    Groupsock rtpGroupsock(*env, destinationAddress, rtpPort, ttl);
    rtpGroupsock.multicastSendOnly();
    Groupsock rtcpGroupsock(*env, destinationAddress, rtcpPort, ttl);
    rtcpGroupsock.multicastSendOnly();

    OutPacketBuffer::maxSize = 300000;
    videoSink = H264VideoRTPSink::createNew(*env, &rtpGroupsock, 96);

    const unsigned estimatedSessionBandwidth = 5000;
    const unsigned maxCNAMElen = 100;
    unsigned char CNAME[maxCNAMElen + 1];
    gethostname((char *)CNAME, maxCNAMElen);
    CNAME[maxCNAMElen] = '\0';

    RTCPInstance *rtcp = RTCPInstance::createNew(*env, &rtcpGroupsock, estimatedSessionBandwidth, CNAME, videoSink, NULL, True);

    ServerMediaSession *sms = ServerMediaSession::createNew(*env, "live", "live", "Session streamed by \"Test Server\"", True);
    sms->addSubsession(PassiveServerMediaSubsession::createNew(*videoSink, rtcp));
    rtspServer->addServerMediaSession(sms);

    *env << "Start > " << rtspServer->rtspURL(sms) << "\n";

    Play();

    env->taskScheduler().doEventLoop();
}

void AddData(BYTE* data, int length)
{
    EncodeToH264(cv::imdecode(cv::Mat(1, length, CV_8UC3, data), 1));
}

void EncodeToH264(cv::Mat image)
{
    std::string fileName = dataPath + std::to_string(GetTimeMs()) + extension;
    cv::VideoWriter writer(fileName.c_str(), fourcc, 1, image.size());
    writer.write(image);
    writer.write(image);
    writer.release();
}

void Play()
{
    while (GetFileCount() == 0);

    const char* fileName = GetFirstFilePath();

    ByteStreamFileSource* fileSource;
    while ((fileSource = ByteStreamFileSource::createNew(*env, fileName)) == NULL);

    FramedSource* videoES = fileSource;

    videoSource = H264VideoStreamFramer::createNew(*env, videoES);

    videoSink->startPlaying(*videoSource, AfterPlaying, videoSink);
}

void AfterPlaying(void*)
{
    videoSink->stopPlaying();
    Medium::close(videoSource);

    if (GetFileCount() > 1)
        std::filesystem::remove(GetFirstFilePath());

    Play();
}

long long GetTimeMs()
{
    return std::chrono::duration_cast<std::chrono::milliseconds>(std::chrono::system_clock::now().time_since_epoch()).count();
}

int GetFileCount()
{
    return std::distance(std::filesystem::directory_iterator(dataPath), std::filesystem::directory_iterator());
}

const char* GetFirstFilePath()
{
    for (const auto& entry : std::filesystem::directory_iterator(dataPath))
        return entry.path().string().c_str();
}

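One detail worth checking on the FFmpeg side (an assumption, not something stated in the question) is the RTSP transport: FFmpeg's RTSP client does not join a multicast group with its default transports, and its `-rtsp_transport` option accepts a `udp_multicast` value. A hedged sketch of such an invocation, with a hypothetical URL in place of the one the server prints via `rtspServer->rtspURL(sms)`:

```python
import subprocess

# Sketch: force FFmpeg's RTSP client to request UDP multicast transport
# when pulling from the Live555 server. The URL is a placeholder.
cmd = [
    "ffmpeg",
    "-rtsp_transport", "udp_multicast",  # join the multicast group instead of unicast UDP/TCP
    "-i", "rtsp://192.168.1.10:8554/live",
    "-c", "copy",   # remux without re-encoding
    "-t", "10",     # record 10 seconds
    "out.mp4",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # not executed here: needs a reachable live server
```

The same flag can be passed to ffplay for a quick visual check against VLC's behaviour.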

  • How to use ffmpeg to record video from SDP?

    17 August 2024, by Thomas Carlton

    I'm running Ubuntu 18.04 and I have the following SDP offer from Kurento Media Server. The offer is saved to the file a.sdp:

    v=0
    o=- 3831476180 3831476180 IN IP4 172.31.46.122
    s=Kurento Media Server
    c=IN IP4 172.31.46.122
    t=0 0
    m=audio 28460 RTP/AVPF 96 0 97
    a=setup:actpass
    a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
    a=rtpmap:96 opus/48000/2
    a=rtpmap:97 AMR/8000
    a=rtcp:28461
    a=sendrecv
    a=mid:audio0
    a=ssrc:1797155263 cname:user1913428254@host-e7ab0454
    m=video 18122 RTP/AVPF 102 103
    a=setup:actpass
    a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
    a=rtpmap:102 VP8/90000
    a=rtpmap:103 H264/90000
    a=fmtp:103 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f
    a=rtcp:18123
    a=sendrecv
    a=mid:video0
    a=rtcp-fb:102 nack
    a=rtcp-fb:102 nack pli
    a=rtcp-fb:102 goog-remb
    a=rtcp-fb:102 ccm fir
    a=rtcp-fb:103 nack
    a=rtcp-fb:103 nack pli
    a=rtcp-fb:103 ccm fir
    a=ssrc:2822275190 cname:user1913428254@host-e7ab0454

    Then, I'm calling ffmpeg to record this stream to a file:

    ffmpeg -max_delay 5000 -reorder_queue_size 16384 -protocol_whitelist file,crypto,udp,rtp -re -i a.sdp -vcodec copy -acodec aac -y output.mp4


    172.31.46.122 is the local IP address, and I'm running ffmpeg on the same machine as the SDP offer, so ffmpeg has access to this IP address.
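The ports ffmpeg tries to bind locally come straight from the m= and a=rtcp: lines of the SDP. A minimal standard-library sketch that pulls them out, which makes it easy to check each one against ss/netstat; the embedded SDP fragment is abbreviated from the offer above:

```python
import re

# Extract the RTP ports (m= lines) and RTCP ports (a=rtcp: lines) that
# an SDP-driven receiver such as ffmpeg would try to bind locally.
sdp = """\
m=audio 28460 RTP/AVPF 96 0 97
a=rtcp:28461
m=video 18122 RTP/AVPF 102 103
a=rtcp:18123
"""

rtp_ports = [int(m.group(1)) for m in re.finditer(r"^m=\w+ (\d+)", sdp, re.M)]
rtcp_ports = [int(m.group(1)) for m in re.finditer(r"^a=rtcp:(\d+)", sdp, re.M)]
print(rtp_ports, rtcp_ports)  # [28460, 18122] [28461, 18123]
```

If any of these four ports is already held by another process, ffmpeg's bind will fail before it reads a single packet.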


    I'm getting the error:

    bind failed: Address already in use.


    a.sdp: Invalid data found when processing input


    How can I solve that?
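"Address already in use" means something on the machine, quite possibly the media server process that produced the SDP, already holds one of the RTP/RTCP ports listed in the offer. A small self-contained sketch (a diagnostic assumption, not part of the question) showing how that collision looks at the socket level; port 0 lets the OS pick a free port so the demo does not depend on the actual SDP ports:

```python
import socket

# One socket stands in for the process already holding the port
# (e.g. the media server); the second bind is what ffmpeg attempts
# with the ports taken from a.sdp.
holder = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
holder.bind(("127.0.0.1", 0))        # OS picks a free UDP port
port = holder.getsockname()[1]

probe = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
    probe.bind(("127.0.0.1", port))  # second bind on the same port
    in_use = False
except OSError:                      # EADDRINUSE
    in_use = True
finally:
    probe.close()

print(f"port {port} already in use: {in_use}")
holder.close()
```

Running the same check against 28460/28461 and 18122/18123 while the server is up shows which side should give way; receiving into different local ports then requires an SDP whose port numbers match what ffmpeg is allowed to bind.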
