Media (2)
-
Valkaama DVD Label
4 October 2011
Updated: February 2013
Language: English
Type: Image
-
Podcasting Legal Guide
16 May 2011
Updated: May 2011
Language: English
Type: Text
Other articles (32)
-
Changing the publication date
21 June 2013
How do you change the publication date of a media item?
You first need to add a "Publication date" field to the appropriate form mask:
Administer > Form mask configuration > Select "A media"
In the "Fields to add" section, check "Publication date"
Click Save at the bottom of the page.
-
Authorizations overridden by plugins
27 April 2010
Mediaspip core
autoriser_auteur_modifier(), so that visitors are able to modify their information on the authors page
-
Automated installation script of MediaSPIP
25 April 2011
To overcome the difficulties, mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server running a compatible Linux distribution.
To use it you must have SSH access to your server and a root account, since the script installs the dependencies. Contact your hosting provider if you do not have these.
The documentation for using this installation script is available here.
The code of this (...)
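
The excerpt above describes the workflow only in prose; as a purely illustrative sketch (the host name and script file name below are placeholders, not taken from the MediaSPIP documentation), the process amounts to logging in to the server as root over SSH and then running the bash script:

# log in to the target server as root (placeholder host name)
ssh root@your-server.example.org
# once logged in, run the all-in-one installer (placeholder file name)
bash mediaspip_install.sh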
On other sites (7157)
-
Cannot watch RTSP stream from Live555 using FFMPEG
30 October 2023, by bobku123
I am trying to create an RTSP server using Live555, with the stream source being raw H264 video from my USB webcam, encoded with FFMPEG and sent over UDP.


I have used BasicUDPSource from the Live555 sources as my FramedSource class, and I created my own MediaSubsession class as per the Live555 FAQ. Here is the source code I have created so far:


#include "liveMedia.hh"

#include "BasicUsageEnvironment.hh"
#include "announceURL.hh"
#include "FFMPEGH264StreamMediaSubsession.hh"
#include "BasicUDPSource.hh"
#include "H264VideoStreamFramer.hh"
#include "H265VideoRTPSink.hh"

UsageEnvironment* env;

// To make the second and subsequent client for each stream reuse the same
// input stream as the first client (rather than playing the file from the
// start for each client), change the following "False" to "True":
Boolean reuseFirstSource = False;

// To stream *only* MPEG-1 or 2 video "I" frames
// (e.g., to reduce network bandwidth),
// change the following "False" to "True":
Boolean iFramesOnly = False;

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                           char const* streamName, char const* inputFileName); // forward


int main(int argc, char** argv) {
  // Begin by setting up our usage environment:
  TaskScheduler* scheduler = BasicTaskScheduler::createNew();
  env = BasicUsageEnvironment::createNew(*scheduler);

  UserAuthenticationDatabase* authDB = NULL;

  // Serve regular RTSP (over a TCP connection):
  RTSPServer* rtspServer = RTSPServer::createNew(*env, 8554, authDB);

  if (rtspServer == NULL) {
    *env << "Failed to create RTSP server: " << env->getResultMsg() << "\n";
    exit(1);
  }

  char const* descriptionString = "Session streamed by \"testFFMPEGRTSPServer\"";

  {
    char const* streamName = "FFMPEGRTSPStream";
    ServerMediaSession* sms
      = ServerMediaSession::createNew(*env, streamName, streamName,
                                      descriptionString);
    sms->addSubsession(FFMPEGH264StreamMediaSubsession
                       ::createNew(*env, reuseFirstSource));
    rtspServer->addServerMediaSession(sms);

    announceStream(rtspServer, sms, streamName, "ffmpeg");
  }

  env->taskScheduler().doEventLoop(); // does not return
}

static void announceStream(RTSPServer* rtspServer, ServerMediaSession* sms,
                           char const* streamName, char const* inputFileName) {
  UsageEnvironment& env = rtspServer->envir();

  env << "\n\"" << streamName << "\" stream, from the file \""
      << inputFileName << "\"\n";
  announceURL(rtspServer, sms);
}

FFMPEGH264StreamMediaSubsession*
FFMPEGH264StreamMediaSubsession::createNew(UsageEnvironment& env, Boolean reuseFirstSource)
{
  return new FFMPEGH264StreamMediaSubsession(env, reuseFirstSource);
}

FFMPEGH264StreamMediaSubsession::FFMPEGH264StreamMediaSubsession(UsageEnvironment& env, Boolean reuseFirstSource)
  : OnDemandServerMediaSubsession(env, reuseFirstSource)
{
}

FFMPEGH264StreamMediaSubsession::~FFMPEGH264StreamMediaSubsession()
{
}

FramedSource* FFMPEGH264StreamMediaSubsession::createNewStreamSource(unsigned clientSessionId,
                                                                     unsigned& estBitrate)
{
  estBitrate = 500; // kbps, estimate

  // Create the video source:
  // Create a 'groupsock' for the input multicast group,port:
  char const* inputAddressStr = "192.168.1.100";

  NetAddressList inputAddresses(inputAddressStr);
  struct sockaddr_storage inputAddress;
  copyAddress(inputAddress, inputAddresses.firstAddress());

  Port const inputPort(8888);
  unsigned char const inputTTL = 0; // we're only reading from this mcast group

  Groupsock inputGroupsock(envir(), inputAddress, inputPort, inputTTL);

  // Then create a liveMedia 'source' object, encapsulating this groupsock:
  FramedSource* source = BasicUDPSource::createNew(envir(), &inputGroupsock);

  // Create a framer for the Video Elementary Stream:
  return H264VideoStreamFramer::createNew(envir(), source);
}

RTPSink* FFMPEGH264StreamMediaSubsession::createNewRTPSink(Groupsock* rtpGroupsock,
                                                           unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource)
{
  return H265VideoRTPSink::createNew(envir(), rtpGroupsock, rtpPayloadTypeIfDynamic);
}



I converted my USB webcam output to an H264 stream using the following FFMPEG command:


ffmpeg -video_size 1280x720 -framerate 30 -input_format mjpeg -f v4l2 -i /dev/video0 -c:v h264_nvenc -bf 0 -g 30 -bsf:v 'filter_units=remove_types=35|38-40' -f h264 udp://192.168.1.100:8888



However, when I try to play the stream with FFPLAY using the following command, I do not get any output:


ffplay rtsp://192.168.1.100:8554/FFMPEGRTSPStream



What am I doing wrong?
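
A quick sanity check here is to point ffplay directly at the raw UDP feed, bypassing the RTSP server entirely; this is only a suggested sketch, assuming it is run on the receiving host (192.168.1.100) with the same port used in the ffmpeg command above:

ffplay -f h264 udp://192.168.1.100:8888

If this plays but the RTSP URL does not, the problem lies on the Live555 server side rather than in the capture and encode step.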


-
doc/filters: clarify requirements for subtitles wrap_unicode option
25 May 2023, by Gyan Doshi
-
avcodec/nvenc: make aware of SDK 11.1 driver requirements
8 August 2021, by Timo Rothenpieler