
Cannot watch RTSP stream from Live555 using FFMPEG
30 October 2023, by bobku123

I am trying to create an RTSP server using Live555, with the stream source being the raw H264 video of my USB webcam, encoded with FFMPEG and sent over UDP.


I have used BasicUDPSource from the Live555 sources as my FramedSource class, and I created my own MediaSubsession class as per the Live555 FAQ. Here is the source code I have created so far:


#include "liveMedia.hh"

#include "BasicUsageEnvironment.hh"
#include "announceURL.hh"
#include "FFMPEGH264StreamMediaSubsession.hh"
#include "BasicUDPSource.hh"
#include "H264VideoStreamFramer.hh"
#include "H265VideoRTPSink.hh"

UsageEnvironment* env;

// To make the second and subsequent client for each stream reuse the same
// input stream as the first client (rather than playing the file from the
// start for each client), change the following "False" to "True":
Boolean reuseFirstSource = False;

// To stream *only* MPEG-1 or 2 video "I" frames
// (e.g., to reduce network bandwidth),
// change the following "False" to "True":
Boolean iFramesOnly = False;

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                           char const* streamName, char const* inputFileName); // forward

int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  UserAuthenticationDatabase* authDB = NULL;

  // Serve regular RTSP (over a TCP connection):
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB);

  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
  }

  char const* descriptionString = "Session streamed by \"testFFMPEGRTSPServer\"";

  {
    char const* streamName = "FFMPEGRTSPStream";
    ServerMediaSession* sms
      = ServerMediaSession::createNew(*env, streamName, streamName,
                                      descriptionString);
    sms->addSubsession(FFMPEGH264StreamMediaSubsession
                       ::createNew(*env, reuseFirstSource));
    rtspServer->addServerMediaSession(sms);

    announceStream(rtspServer, sms, streamName, "ffmpeg");
  }

  env->taskScheduler().doEventLoop(); // does not return
}

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                           char const* streamName, char const* inputFileName) {
  UsageEnvironment& env = rtspServer->envir();

  env << "\n\"" << streamName << "\" stream, from the file \""
      << inputFileName << "\"\n";
  announceURL(rtspServer, sms);
}

FFMPEGH264StreamMediaSubsession*
FFMPEGH264StreamMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource)
{
  return new FFMPEGH264StreamMediaSubsession(env, reuseFirstSource);
}

FFMPEGH264StreamMediaSubsession::FFMPEGH264StreamMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
  : OnDemandServerMediaSubsession(env, reuseFirstSource)
{
}

FFMPEGH264StreamMediaSubsession::~FFMPEGH264StreamMediaSubsession()
{
}

FramedSource* FFMPEGH264StreamMediaSubsession::createNewStreamSource(unsigned clientSessionId,
                                                                     unsigned& estBitrate)
{
  estBitrate = 500; // kbps, estimate

  // Create the video source:
  // Create a 'groupsock' for the input multicast group,port:
  char const* inputAddressStr = "192.168.1.100";

  NetAddressList inputAddresses(inputAddressStr);
  struct sockaddr_storage inputAddress;
  copyAddress(inputAddress, inputAddresses.firstAddress());

  Port const inputPort(8888);
  unsigned char const inputTTL = 0; // we're only reading from this mcast group

  Groupsock inputGroupsock(envir(), inputAddress, inputPort, inputTTL);

  // Then create a liveMedia 'source' object, encapsulating this groupsock:
  FramedSource* source = BasicUDPSource::createNew(envir(), &inputGroupsock);

  // Create a framer for the Video Elementary Stream:
  return H264VideoStreamFramer::createNew(envir(), source);
}

RTPSink* FFMPEGH264StreamMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                                           unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource)
{
  return H265VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}
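

For context, a program like this would typically be compiled and linked against the four Live555 libraries roughly as follows. The source file name and the install paths are assumptions that depend on the local setup, and announceURL.cpp is assumed to come from the Live555 testProgs directory:


# Hypothetical file name and install paths; adjust to the local Live555 build.
# The OpenSSL libraries are only needed if Live555 was built with TLS support.
g++ -o testFFMPEGRTSPServer testFFMPEGRTSPServer.cpp announceURL.cpp \
    -I/usr/local/include/liveMedia -I/usr/local/include/groupsock \
    -I/usr/local/include/UsageEnvironment -I/usr/local/include/BasicUsageEnvironment \
    -L/usr/local/lib -lliveMedia -lgroupsock -lBasicUsageEnvironment -lUsageEnvironment \
    -lssl -lcrypto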



I converted my USB webcam output to an H264 stream using the following FFMPEG command:


ffmpeg -video_size 1280x720 -framerate 30 -input_format mjpeg -f v4l2 -i /dev/video0 -c:v h264_nvenc -bf 0 -g 30 -bsf:v 'filter_units=remove_types=35|38-40' -f h264 udp://192.168.1.100:8888
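

As a sanity check, the raw H264 elementary stream sent by this command can be played back directly over UDP, bypassing Live555 entirely; this assumes it is run on the host that owns 192.168.1.100, while nothing else is bound to port 8888:


ffplay -f h264 udp://192.168.1.100:8888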



However, when I try to play it with FFPLAY using the following command, I do not get any output:


ffplay rtsp://192.168.1.100:8554/FFMPEGRTSPStream
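

A more verbose invocation can at least show whether the RTSP DESCRIBE/SETUP exchange completes; both flags below are standard FFmpeg/ffplay options, and forcing interleaved TCP rules out UDP reachability problems between client and server:


ffplay -loglevel debug -rtsp_transport tcp rtsp://192.168.1.100:8554/FFMPEGRTSPStream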



What am I doing wrong?