
Media (91)
-
999,999
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
-
Demon seed (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
The four of us are dying (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Corona radiata (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
-
Lights in the sky (wav version)
26 September 2011
Updated: April 2013
Language: English
Type: Audio
Other articles (63)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
Support audio et vidéo HTML5
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash fallback is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance can be fully customised to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
-
MediaSPIP Init and Diogène: MediaSPIP publication types
11 November 2010
When a MediaSPIP site is installed, the MediaSPIP Init plugin carries out a number of operations, the main one being to create four main sections in the site and five form templates for Diogène.
These four main sections (also called sectors) are: Medias; Sites; Editos; Actualités.
For each of these sections, a form template of the same name is created. For the "Medias" section, a second "category" template is also created, which makes it possible to add (...)
On other sites (6058)
-
Receive RTMP stream with OpenCV (python)
12 February 2024, by Overnout
I'm trying to process an RTMP stream in Python using OpenCV (cv2), but I'm not able to get OpenCV to capture it.


I can run FFmpeg/FFplay from the command line and receive the stream successfully.
What could cause OpenCV to fail to open the stream in listening mode?


Here is my code:


import cv2

cap = cv2.VideoCapture("rtmp://0.0.0.0/live/stream", cv2.CAP_FFMPEG)

if not cap.isOpened():
 print("Cannot open video source")
 exit()



And the output:


[tcp @ 00000192c490d640] Connection to tcp://0.0.0.0:1935 failed: Error number -138 occurred
[rtmp @ 00000192c490d580] Cannot open connection tcp://0.0.0.0:1935
Cannot open video source
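The log above shows FFmpeg dialling out to tcp://0.0.0.0:1935 as a client, which is what its RTMP protocol does by default; nothing is listening there, hence error -138. Below is a minimal sketch of one way to make the FFmpeg backend wait for the incoming publisher instead. It assumes this OpenCV build honours the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable and forwards the RTMP protocol's listen/timeout options; it is a sketch, not a confirmed fix.

import os

# Assumption: "key;value|key;value" pairs set here are forwarded to FFmpeg when
# the capture is opened, so listen=1 turns the RTMP client into a listener and
# timeout bounds the wait for a publisher (in seconds).
os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "listen;1|timeout;30"

import cv2

cap = cv2.VideoCapture("rtmp://0.0.0.0:1935/live/stream", cv2.CAP_FFMPEG)
if not cap.isOpened():
    print("Cannot open video source (the listen option may not be forwarded by this build)")
    exit()

If the option is not forwarded, the more common arrangement is to run a standalone RTMP server (for example MediaMTX or nginx-rtmp) and let both the publisher and OpenCV connect to it as clients.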



Here is the output of cv2.getBuildInformation()


General configuration for OpenCV 4.9.0 =====================================
 Version control: 4.9.0

 Platform:
 Timestamp: 2023-12-31T11:21:12Z
 Host: Windows 10.0.17763 AMD64
 CMake: 3.24.2
 CMake generator: Visual Studio 14 2015
 CMake build tool: MSBuild.exe
 MSVC: 1900
 Configuration: Debug Release

 CPU/HW features:
 Baseline: SSE SSE2 SSE3
 requested: SSE3
 Dispatched code generation: SSE4_1 SSE4_2 FP16 AVX AVX2
 requested: SSE4_1 SSE4_2 AVX FP16 AVX2 AVX512_SKX
 SSE4_1 (16 files): + SSSE3 SSE4_1
 SSE4_2 (1 files): + SSSE3 SSE4_1 POPCNT SSE4_2
 FP16 (0 files): + SSSE3 SSE4_1 POPCNT SSE4_2 FP16 AVX
 AVX (8 files): + SSSE3 SSE4_1 POPCNT SSE4_2 AVX
 AVX2 (36 files): + SSSE3 SSE4_1 POPCNT SSE4_2 FP16 FMA3 AVX AVX2

 C/C++:
 Built as dynamic libs?: NO
 C++ standard: 11
 C++ Compiler: C:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/bin/x86_amd64/cl.exe (ver 19.0.24247.2)
 C++ flags (Release): /DWIN32 /D_WINDOWS /W4 /GR /D _CRT_SECURE_NO_DEPRECATE /D _CRT_NONSTDC_NO_DEPRECATE /D _SCL_SECURE_NO_WARNINGS /Gy /bigobj /Oi /fp:precise /EHa /wd4127 /wd4251 /wd4324 /wd4275 /wd4512 /wd4589 /wd4819 /MP /O2 /Ob2 /DNDEBUG 
 C++ flags (Debug): /DWIN32 /D_WINDOWS /W4 /GR /D _CRT_SECURE_NO_DEPRECATE /D _CRT_NONSTDC_NO_DEPRECATE /D _SCL_SECURE_NO_WARNINGS /Gy /bigobj /Oi /fp:precise /EHa /wd4127 /wd4251 /wd4324 /wd4275 /wd4512 /wd4589 /wd4819 /MP /Zi /Ob0 /Od /RTC1 
 C Compiler: C:/Program Files (x86)/Microsoft Visual Studio 14.0/VC/bin/x86_amd64/cl.exe
 C flags (Release): /DWIN32 /D_WINDOWS /W3 /D _CRT_SECURE_NO_DEPRECATE /D _CRT_NONSTDC_NO_DEPRECATE /D _SCL_SECURE_NO_WARNINGS /Gy /bigobj /Oi /fp:precise /MP /O2 /Ob2 /DNDEBUG 
 C flags (Debug): /DWIN32 /D_WINDOWS /W3 /D _CRT_SECURE_NO_DEPRECATE /D _CRT_NONSTDC_NO_DEPRECATE /D _SCL_SECURE_NO_WARNINGS /Gy /bigobj /Oi /fp:precise /MP /Zi /Ob0 /Od /RTC1 
 Linker flags (Release): /machine:x64 /NODEFAULTLIB:atlthunk.lib /INCREMENTAL:NO /NODEFAULTLIB:libcmtd.lib /NODEFAULTLIB:libcpmtd.lib /NODEFAULTLIB:msvcrtd.lib
 Linker flags (Debug): /machine:x64 /NODEFAULTLIB:atlthunk.lib /debug /INCREMENTAL /NODEFAULTLIB:libcmt.lib /NODEFAULTLIB:libcpmt.lib /NODEFAULTLIB:msvcrt.lib
 ccache: NO
 Precompiled headers: YES
 Extra dependencies: wsock32 comctl32 gdi32 ole32 setupapi ws2_32
 3rdparty dependencies: libprotobuf ade ittnotify libjpeg-turbo libwebp libpng libtiff libopenjp2 IlmImf zlib ippiw ippicv

 OpenCV modules:
 To be built: calib3d core dnn features2d flann gapi highgui imgcodecs imgproc ml objdetect photo python3 stitching video videoio
 Disabled: java world
 Disabled by dependency: -
 Unavailable: python2 ts
 Applications: -
 Documentation: NO
 Non-free algorithms: NO

 Windows RT support: NO

 GUI: WIN32UI
 Win32 UI: YES
 VTK support: NO

 Media I/O: 
 ZLib: build (ver 1.3)
 JPEG: build-libjpeg-turbo (ver 2.1.3-62)
 SIMD Support Request: YES
 SIMD Support: NO
 WEBP: build (ver encoder: 0x020f)
 PNG: build (ver 1.6.37)
 TIFF: build (ver 42 - 4.2.0)
 JPEG 2000: build (ver 2.5.0)
 OpenEXR: build (ver 2.3.0)
 HDR: YES
 SUNRASTER: YES
 PXM: YES
 PFM: YES

 Video I/O:
 DC1394: NO
 FFMPEG: YES (prebuilt binaries)
 avcodec: YES (58.134.100)
 avformat: YES (58.76.100)
 avutil: YES (56.70.100)
 swscale: YES (5.9.100)
 avresample: YES (4.0.0)
 GStreamer: NO
 DirectShow: YES
 Media Foundation: YES
 DXVA: YES

 Parallel framework: Concurrency

 Trace: YES (with Intel ITT)

 Other third-party libraries:
 Intel IPP: 2021.11.0 [2021.11.0]
 at: D:/a/opencv-python/opencv-python/_skbuild/win-amd64-3.7/cmake-build/3rdparty/ippicv/ippicv_win/icv
 Intel IPP IW: sources (2021.11.0)
 at: D:/a/opencv-python/opencv-python/_skbuild/win-amd64-3.7/cmake-build/3rdparty/ippicv/ippicv_win/iw
 Lapack: NO
 Eigen: NO
 Custom HAL: NO
 Protobuf: build (3.19.1)
 Flatbuffers: builtin/3rdparty (23.5.9)

 OpenCL: YES (NVD3D11)
 Include path: D:/a/opencv-python/opencv-python/opencv/3rdparty/include/opencl/1.2
 Link libraries: Dynamic load

 Python 3:
 Interpreter: C:/hostedtoolcache/windows/Python/3.7.9/x64/python.exe (ver 3.7.9)
 Libraries: C:/hostedtoolcache/windows/Python/3.7.9/x64/libs/python37.lib (ver 3.7.9)
 numpy: C:/hostedtoolcache/windows/Python/3.7.9/x64/lib/site-packages/numpy/core/include (ver 1.17.0)
 install path: python/cv2/python-3

 Python (for build): C:\hostedtoolcache\windows\Python\3.7.9\x64\python.exe

 Java: 
 ant: NO
 Java: YES (ver 1.8.0.392)
 JNI: C:/hostedtoolcache/windows/Java_Temurin-Hotspot_jdk/8.0.392-8/x64/include C:/hostedtoolcache/windows/Java_Temurin-Hotspot_jdk/8.0.392-8/x64/include/win32 C:/hostedtoolcache/windows/Java_Temurin-Hotspot_jdk/8.0.392-8/x64/include
 Java wrappers: NO
 Java tests: NO

 Install to: D:/a/opencv-python/opencv-python/_skbuild/win-amd64-3.7/cmake-install
-----------------------------------------------------------------
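The build information confirms the FFmpeg backend is present (FFMPEG: YES, prebuilt binaries), so the capture API itself supports RTMP. For completeness, once isOpened() succeeds the frames still have to be pulled in a loop; a minimal sketch using only standard OpenCV calls, independent of the listening issue:

import cv2

cap = cv2.VideoCapture("rtmp://0.0.0.0/live/stream", cv2.CAP_FFMPEG)
while cap.isOpened():
    ok, frame = cap.read()        # grab and decode the next frame
    if not ok:                    # read error or end of stream
        break
    cv2.imshow("rtmp", frame)     # per-frame processing would go here
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()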



-
Cutting a live stream into separate mp4 files
9 June 2017, by Fearhunter
I am doing research on cutting a live stream into pieces and saving them as mp4 files. I am using this source for the proof of concept:
And this is the example code I use:
using System;
using System.Collections.Generic;
using System.Configuration;
using System.IO;
using System.Linq;
using System.Net;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MediaServices.Client;
using Newtonsoft.Json.Linq;

namespace AMSLiveTest
{
    class Program
    {
        private const string StreamingEndpointName = "streamingendpoint001";
        private const string ChannelName = "channel001";
        private const string AssetlName = "asset001";
        private const string ProgramlName = "program001";

        // Read values from the App.config file.
        private static readonly string _mediaServicesAccountName =
            ConfigurationManager.AppSettings["MediaServicesAccountName"];
        private static readonly string _mediaServicesAccountKey =
            ConfigurationManager.AppSettings["MediaServicesAccountKey"];

        // Field for service context.
        private static CloudMediaContext _context = null;
        private static MediaServicesCredentials _cachedCredentials = null;

        static void Main(string[] args)
        {
            // Create and cache the Media Services credentials in a static class variable.
            _cachedCredentials = new MediaServicesCredentials(
                _mediaServicesAccountName,
                _mediaServicesAccountKey);

            // Used the cached credentials to create CloudMediaContext.
            _context = new CloudMediaContext(_cachedCredentials);

            IChannel channel = CreateAndStartChannel();

            // Set the Live Encoder to point to the channel's input endpoint:
            string ingestUrl = channel.Input.Endpoints.FirstOrDefault().Url.ToString();

            // Use the previewEndpoint to preview and verify
            // that the input from the encoder is actually reaching the Channel.
            string previewEndpoint = channel.Preview.Endpoints.FirstOrDefault().Url.ToString();

            IProgram program = CreateAndStartProgram(channel);
            ILocator locator = CreateLocatorForAsset(program.Asset, program.ArchiveWindowLength);
            IStreamingEndpoint streamingEndpoint = CreateAndStartStreamingEndpoint();
            GetLocatorsInAllStreamingEndpoints(program.Asset);

            // Once you are done streaming, clean up your resources.
            Cleanup(streamingEndpoint, channel);
        }
        public static IChannel CreateAndStartChannel()
        {
            // If you want to change the Smooth fragments to HLS segment ratio, you would set the ChannelCreationOptions's Output property.
            IChannel channel = _context.Channels.Create(
                new ChannelCreationOptions
                {
                    Name = ChannelName,
                    Input = CreateChannelInput(),
                    Preview = CreateChannelPreview()
                });

            // Starting and stopping Channels can take some time to execute.
            // To determine the state of operations after calling Start or Stop, query IChannel.State.
            channel.Start();

            return channel;
        }

        private static ChannelInput CreateChannelInput()
        {
            return new ChannelInput
            {
                StreamingProtocol = StreamingProtocol.RTMP,
                AccessControl = new ChannelAccessControl
                {
                    IPAllowList = new List<IPRange>
                    {
                        new IPRange
                        {
                            Name = "TestChannelInput001",
                            // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
                            // will allow access from all IP addresses.
                            Address = IPAddress.Parse("0.0.0.0"),
                            SubnetPrefixLength = 0
                        }
                    }
                }
            };
        }

        private static ChannelPreview CreateChannelPreview()
        {
            return new ChannelPreview
            {
                AccessControl = new ChannelAccessControl
                {
                    IPAllowList = new List<IPRange>
                    {
                        new IPRange
                        {
                            Name = "TestChannelPreview001",
                            // Setting 0.0.0.0 for Address and 0 for SubnetPrefixLength
                            // will allow access from all IP addresses.
                            Address = IPAddress.Parse("0.0.0.0"),
                            SubnetPrefixLength = 0
                        }
                    }
                }
            };
        }
        public static void UpdateCrossSiteAccessPoliciesForChannel(IChannel channel)
        {
            // Standard permissive Silverlight/Flash cross-site access policies
            // (the usual form from the Azure Media Services samples).
            var clientPolicy =
                @"<?xml version=""1.0"" encoding=""utf-8""?>
                <access-policy>
                  <cross-domain-access>
                    <policy>
                      <allow-from http-request-headers=""*"">
                        <domain uri=""*""/>
                      </allow-from>
                      <grant-to>
                        <resource path=""/"" include-subpaths=""true""/>
                      </grant-to>
                    </policy>
                  </cross-domain-access>
                </access-policy>";

            var xdomainPolicy =
                @"<?xml version=""1.0"" ?>
                <cross-domain-policy>
                  <allow-access-from domain=""*"" />
                </cross-domain-policy>";

            channel.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
            channel.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;
            channel.Update();
        }
        public static IProgram CreateAndStartProgram(IChannel channel)
        {
            IAsset asset = _context.Assets.Create(AssetlName, AssetCreationOptions.None);

            // Create a Program on the Channel. You can have multiple Programs that overlap or are sequential;
            // however, each Program must have a unique name within your Media Services account.
            IProgram program = channel.Programs.Create(ProgramlName, TimeSpan.FromHours(3), asset.Id);
            program.Start();

            return program;
        }

        public static ILocator CreateLocatorForAsset(IAsset asset, TimeSpan ArchiveWindowLength)
        {
            // You cannot create a streaming locator using an AccessPolicy that includes write or delete permissions.
            var locator = _context.Locators.CreateLocator
                (
                    LocatorType.OnDemandOrigin,
                    asset,
                    _context.AccessPolicies.Create
                    (
                        "Live Stream Policy",
                        ArchiveWindowLength,
                        AccessPermissions.Read
                    )
                );

            return locator;
        }

        public static IStreamingEndpoint CreateAndStartStreamingEndpoint()
        {
            var options = new StreamingEndpointCreationOptions
            {
                Name = StreamingEndpointName,
                ScaleUnits = 1,
                AccessControl = GetAccessControl(),
                CacheControl = GetCacheControl()
            };

            IStreamingEndpoint streamingEndpoint = _context.StreamingEndpoints.Create(options);
            streamingEndpoint.Start();

            return streamingEndpoint;
        }

        private static StreamingEndpointAccessControl GetAccessControl()
        {
            return new StreamingEndpointAccessControl
            {
                IPAllowList = new List<IPRange>
                {
                    new IPRange
                    {
                        Name = "Allow all",
                        Address = IPAddress.Parse("0.0.0.0"),
                        SubnetPrefixLength = 0
                    }
                },

                AkamaiSignatureHeaderAuthenticationKeyList = new List<AkamaiSignatureHeaderAuthenticationKey>
                {
                    new AkamaiSignatureHeaderAuthenticationKey
                    {
                        Identifier = "My key",
                        Expiration = DateTime.UtcNow + TimeSpan.FromDays(365),
                        Base64Key = Convert.ToBase64String(GenerateRandomBytes(16))
                    }
                }
            };
        }

        private static byte[] GenerateRandomBytes(int length)
        {
            var bytes = new byte[length];
            using (var rng = new RNGCryptoServiceProvider())
            {
                rng.GetBytes(bytes);
            }

            return bytes;
        }

        private static StreamingEndpointCacheControl GetCacheControl()
        {
            return new StreamingEndpointCacheControl
            {
                MaxAge = TimeSpan.FromSeconds(1000)
            };
        }
        public static void UpdateCrossSiteAccessPoliciesForStreamingEndpoint(IStreamingEndpoint streamingEndpoint)
        {
            // Same permissive cross-site access policies as for the channel.
            var clientPolicy =
                @"<?xml version=""1.0"" encoding=""utf-8""?>
                <access-policy>
                  <cross-domain-access>
                    <policy>
                      <allow-from http-request-headers=""*"">
                        <domain uri=""*""/>
                      </allow-from>
                      <grant-to>
                        <resource path=""/"" include-subpaths=""true""/>
                      </grant-to>
                    </policy>
                  </cross-domain-access>
                </access-policy>";

            var xdomainPolicy =
                @"<?xml version=""1.0"" ?>
                <cross-domain-policy>
                  <allow-access-from domain=""*"" />
                </cross-domain-policy>";

            streamingEndpoint.CrossSiteAccessPolicies.ClientAccessPolicy = clientPolicy;
            streamingEndpoint.CrossSiteAccessPolicies.CrossDomainPolicy = xdomainPolicy;
            streamingEndpoint.Update();
        }
        public static void GetLocatorsInAllStreamingEndpoints(IAsset asset)
        {
            var locators = asset.Locators.Where(l => l.Type == LocatorType.OnDemandOrigin);
            var ismFile = asset.AssetFiles.AsEnumerable().FirstOrDefault(a => a.Name.EndsWith(".ism"));
            var template = new UriTemplate("{contentAccessComponent}/{ismFileName}/manifest");
            var urls = locators.SelectMany(l =>
                    _context
                        .StreamingEndpoints
                        .AsEnumerable()
                        .Where(se => se.State == StreamingEndpointState.Running)
                        .Select(
                            se =>
                                template.BindByPosition(new Uri("http://" + se.HostName),
                                    l.ContentAccessComponent,
                                    ismFile.Name)))
                .ToArray();
        }

        public static void Cleanup(IStreamingEndpoint streamingEndpoint,
            IChannel channel)
        {
            if (streamingEndpoint != null)
            {
                streamingEndpoint.Stop();
                streamingEndpoint.Delete();
            }

            IAsset asset;
            if (channel != null)
            {
                foreach (var program in channel.Programs)
                {
                    asset = _context.Assets.Where(se => se.Id == program.AssetId)
                        .FirstOrDefault();

                    program.Stop();
                    program.Delete();

                    if (asset != null)
                    {
                        foreach (var l in asset.Locators)
                            l.Delete();

                        asset.Delete();
                    }
                }

                channel.Stop();
                channel.Delete();
            }
        }
    }
}

Now I want to make a method to cut the live stream, for example every 15 minutes, and save each piece as an mp4 file, but I don't know where to start.
Can someone point me in the right direction?
Kind regards
UPDATE:
I want to save the mp4 files on my hard disk.
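One direction worth exploring, outside the Media Services SDK code above: pull a stream URL with ffmpeg and let its segment muxer cut fixed-length mp4 files onto the local disk. Below is a rough sketch, assuming ffmpeg is on the PATH, that stream_url is something ffmpeg can read (an HLS locator URL from the program, or an RTMP endpoint you also publish to; the URL shown is a hypothetical placeholder), and that the source codecs (typically H.264/AAC) can be copied into mp4 without re-encoding:

import subprocess

# Hypothetical placeholder; substitute a locator or ingest URL ffmpeg can read.
stream_url = "https://example-endpoint.streaming.mediaservices.windows.net/locator/manifest(format=m3u8-aapl)"

subprocess.run([
    "ffmpeg",
    "-i", stream_url,
    "-c", "copy",              # no re-encoding; requires mp4-compatible codecs
    "-f", "segment",           # segment muxer: one output file per time slice
    "-segment_time", "900",    # 15 minutes per file
    "-segment_format", "mp4",
    "-reset_timestamps", "1",  # each file starts at t=0
    "chunk_%03d.mp4",
], check=True)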
-
Need help on handling MPEG4V1 data
31 January 2021, by Gediminas
I'm in a situation where I need to get a chunk of MPEG4V1 (Microsoft MPEG-4 Video Codec V1) data located at the beginning of a packet sent by some DVR unit.



The packet structure looks something like this:



- Compressed MPEG4 data.
- Long integer - Number of events and tripwires.
- Long integer - Number of events.
- Event - Event sequence.
- Long integer - Number of tripwires.
- Tripwire - Tripwire sequence.
- Long integer - Cyclic redundancy check (CRC).

So there is no obvious indication of where the MPEG4 data ends (or is there?),
nor of where I should start reading the additional data such as "Number of events and tripwires", etc.



I uploaded two packets so you can see what the actual data looks like:
recvData1.txt,
recvData2.txt.



I've tried to decode those packets using the FFmpeg library with the avcodec_decode_video function, removing one byte at a time from the end of my recvData buffer in the hope of getting any result,

but FFmpeg always returned error messages like these:

[msmpeg4v1 @ 038865a0] invalid startcode
[msmpeg4v1 @ 038865a0] header damaged





I'm not much of a specialist in how MPEG4 works internally,
but judging by the error messages it seems clear that some data needed for decoding is missing at the start of the buffer.



So I'm not sure what part or kind of MPEG data I'm getting here.

Maybe it's some kind of MPEG "frame" data with its own "end" indication, or something like that?


I've even compared the start of my recvData buffer to some MPEG4V1-encoded video files I found on the net ("http://www.trekmania.net/clips/video_clips4.htm") to check whether the start of my buffer really contains MPEG data and not some kind of DVR vendor-specific stuff.



And I noticed that there are about 20 bytes of data
(at the start of my packet data, and in the .avi files after roughly the first 180 bytes)
that look like some kind of header.



Please check this image: "http://ggodis.gamedev.lt/stackOverflow/recvData.jpg"



Maybe someone knows what this part of the MPEG4V1 data represents?



P.S. I've checked the CRC values for my received packets and they were correct.
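In case it helps anyone probing this format: msmpeg4v1 is an AVI-era codec, so a decoder normally gets the frame dimensions from the container rather than from the bitstream, and handing it a buffer that does not begin exactly at a coded frame, or with the wrong dimensions, produces errors much like the ones above. Below is a rough sketch of one way to probe such a buffer with PyAV (Python bindings for FFmpeg); the 352x288 dimensions and the assumption that the packet begins with exactly one coded frame are guesses that would have to match the DVR's actual settings.

import av  # PyAV wraps FFmpeg's libavcodec

def try_decode_msmpeg4v1(buf: bytes, width: int = 352, height: int = 288):
    """Attempt to decode one Microsoft MPEG-4 v1 frame from the start of buf.

    width/height are assumptions: this codec does not carry its dimensions
    in-band, so wrong values typically surface as 'header damaged' errors.
    """
    ctx = av.CodecContext.create("msmpeg4v1", "r")
    ctx.width = width
    ctx.height = height
    return ctx.decode(av.Packet(buf))  # empty list means the decoder wanted more data

# Usage sketch (assuming the dump in recvData1.txt has been converted back to raw bytes):
# with open("recvData1.bin", "rb") as f:
#     frames = try_decode_msmpeg4v1(f.read())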