
Media (1)
-
The Pirate Bay from Belgium
1 April 2013, by
Updated: April 2013
Language: French
Type: Image
Other articles (101)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed with other manual (...)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011, by
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (9606)
-
What role does bit rate play in the accuracy of Google Speech To Text transcription?
11 November 2020, by Jash Shah
I am helping a client convert a video file using ffmpeg. They originally used -b:a 64k while transcoding their video to audio at a sampling rate of 44100 (the -ar 44100 argument in ffmpeg). Their objective is to generate the most accurate transcriptions using the Google Cloud Speech-to-Text API.

While combing through the documentation I did not find anything on how bit rate impacts the accuracy of the transcription. So my question is: would using a higher bit rate such as 128k help me get better transcriptions, or does it not matter?
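For reference, a minimal sketch of the kind of audio-extraction command described above (the input and output file names are assumptions):
ffmpeg -i input.mp4 -vn -ac 1 -ar 44100 -b:a 64k output.mp3
It is also worth noting that Google's Speech-to-Text documentation generally recommends lossless encodings such as FLAC or LINEAR16 rather than tuning the bit rate of a lossy codec, so a variant worth testing might be:
ffmpeg -i input.mp4 -vn -ac 1 -ar 44100 -c:a flac output.flac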

-
H264 to MP4 with B-frames plays back and forth on Google Chrome [ffmpeg]
2 May 2017, by Ravi Agola
I want to generate an MP4 container from an H264-encoded file.
The H264 file contains [I P B B][P B B][P B B][P B B] frames.
When I generate an MP4 file with FFmpeg, it works well in FFplay as well as VLC, but in Google Chrome the MP4 file plays frames back and forth.
ffmpeg -i input.h264 -vcodec copy output.mp4
When I use the internal codec library (libx264) it works well everywhere (FFplay, VLC and Google Chrome):
ffmpeg -i input.h264 -vcodec h264 output.mp4
As the above command transcodes h264 (native) to h264 (x264), I don't want to transcode the file, since I will be using it with the ffmpeg library.
When I use H264 without B-frames it works well in both cases.
I have tried some experiments with the sample test file available here.
Direct conversion (MKV to MP4) works well with Chrome, as below:
ffmpeg -i jellyfish-3-mbps-hd-h264.mkv -vcodec copy output.mp4
Converting in two steps (MKV to H264, then H264 to MP4) plays back and forth on Chrome:
ffmpeg -i jellyfish-3-mbps-hd-h264.mkv -vcodec copy output.h264
ffmpeg -i output.h264 -vcodec copy output.mp4
In this case I get these messages:
[mp4 @ 0xb6f8b20] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
[mp4 @ 0xb6f8b20] pts has no value
What can be the reason behind this behavior?
Thanks.
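A note on the warnings above: a raw .h264 elementary stream carries no timestamps, so when it is stream-copied into MP4 the muxer cannot reconstruct presentation timestamps for the B-frames, and Chrome appears stricter about this than FFplay or VLC. One thing to try, assuming a 25 fps source (the rate here is a guess), is to let ffmpeg generate timestamps while reading the raw stream:
ffmpeg -fflags +genpts -r 25 -i input.h264 -vcodec copy output.mp4
With B-frames, though, the most reliable route is to remux straight from the original timestamped container, as in the MKV-to-MP4 command above.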
-
Play video using MSE (Media Source Extensions) in Google Chrome
23 August 2019, by liyuqihxc
I'm working on a project that converts an RTSP stream (with ffmpeg) and plays it on a web page (SignalR + MSE).
So far it works pretty much as I expected on the latest versions of Edge and Firefox, but not Chrome.
Here's the code:
public class WebmMediaStreamContext
{
private Process _ffProcess;
private readonly string _cmd;
private byte[] _initSegment;
private Task _readMediaStreamTask;
private CancellationTokenSource _cancellationTokenSource;
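// ffmpeg arguments: read the input URL, encode video to VP8 (libvpx) with a 90-frame keyframe interval, and write a DASH-style WebM stream to stdout (pipe:)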
private const string _CmdTemplate = "-i {0} -c:v libvpx -tile-columns 4 -frame-parallel 1 -keyint_min 90 -g 90 -f webm -dash 1 pipe:";
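// EBML Cluster element ID (0x1F43B675) followed by the first four bytes of its 8-byte size field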
public static readonly byte[] ClusterStart = { 0x1F, 0x43, 0xB6, 0x75, 0x01, 0x00, 0x00, 0x00 };
public event EventHandler<ClusterReadyEventArgs> ClusterReadyEvent;
public WebmMediaStreamContext(string rtspFeed)
{
_cmd = string.Format(_CmdTemplate, rtspFeed);
}
public async Task StartConverting()
{
if (_ffProcess != null)
throw new InvalidOperationException();
_ffProcess = new Process();
_ffProcess.StartInfo = new ProcessStartInfo
{
FileName = "ffmpeg/ffmpeg.exe",
Arguments = _cmd,
UseShellExecute = false,
CreateNoWindow = true,
RedirectStandardOutput = true
};
_ffProcess.Start();
_initSegment = await ParseInitSegmentAndStartReadMediaStream();
}
public byte[] GetInitSegment()
{
return _initSegment;
}
// Find the first cluster, and everything before it is the InitSegment
private async Task ParseInitSegmentAndStartReadMediaStream()
{
Memory<byte> buffer = new byte[10 * 1024];
int length = 0;
while (length != buffer.Length)
{
length += await _ffProcess.StandardOutput.BaseStream.ReadAsync(buffer.Slice(length));
int cluster = buffer.Span.IndexOf(ClusterStart);
if (cluster >= 0)
{
_cancellationTokenSource = new CancellationTokenSource();
_readMediaStreamTask = new Task(() => ReadMediaStreamProc(buffer.Slice(cluster, length - cluster).ToArray(), _cancellationTokenSource.Token), _cancellationTokenSource.Token, TaskCreationOptions.LongRunning);
_readMediaStreamTask.Start();
return buffer.Slice(0, cluster).ToArray();
}
}
throw new InvalidOperationException();
}
private void ReadMoreBytes(Span<byte> buffer)
{
int size = buffer.Length;
while (size > 0)
{
int len = _ffProcess.StandardOutput.BaseStream.Read(buffer.Slice(buffer.Length - size));
size -= len;
}
}
// Parse every single cluster and fire ClusterReadyEvent
private void ReadMediaStreamProc(byte[] bytesRead, CancellationToken cancel)
{
Span<byte> buffer = new byte[5 * 1024 * 1024];
bytesRead.CopyTo(buffer);
int bufferEmptyIndex = bytesRead.Length;
do
{
if (bufferEmptyIndex < ClusterStart.Length + 4)
{
ReadMoreBytes(buffer.Slice(bufferEmptyIndex, 1024));
bufferEmptyIndex += 1024;
}
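// The 4 bytes following ClusterStart are read as the low 32 bits of the cluster's EBML size (stored big-endian, hence the Reverse)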
int clusterDataSize = BitConverter.ToInt32(
buffer.Slice(ClusterStart.Length, 4)
.ToArray()
.Reverse()
.ToArray()
);
int clusterSize = ClusterStart.Length + 4 + clusterDataSize;
if (clusterSize > buffer.Length)
{
byte[] newBuffer = new byte[clusterSize];
buffer.Slice(0, bufferEmptyIndex).CopyTo(newBuffer);
buffer = newBuffer;
}
if (bufferEmptyIndex < clusterSize)
{
ReadMoreBytes(buffer.Slice(bufferEmptyIndex, clusterSize - bufferEmptyIndex));
bufferEmptyIndex = clusterSize;
}
ClusterReadyEvent?.Invoke(this, new ClusterReadyEventArgs(buffer.Slice(0, bufferEmptyIndex).ToArray()));
bufferEmptyIndex = 0;
} while (!cancel.IsCancellationRequested);
}
}
I use ffmpeg to convert the RTSP stream to a VP8 WebM byte stream and parse it into an "Init Segment" (EBML head, info, tracks...) and "Media Segments" (clusters), then send them to the browser via SignalR:
$(function () {
var mediaSource = new MediaSource();
var mimeCodec = 'video/webm; codecs="vp8"';
var video = document.getElementById('video');
mediaSource.addEventListener('sourceopen', callback, false);
function callback(e) {
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
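// Base64 chunks that arrive while the SourceBuffer is still updating are queued; an empty string marks end of stream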
var queue = [];
sourceBuffer.addEventListener('updateend', function () {
if (queue.length === 0) {
return;
}
var base64 = queue[0];
if (base64.length === 0) {
mediaSource.endOfStream();
queue.shift();
return;
} else {
var buffer = new Uint8Array(atob(base64).split("").map(function (c) {
return c.charCodeAt(0);
}));
sourceBuffer.appendBuffer(buffer);
queue.shift();
}
}, false);
var blockindex = 0; // counter for the console.log diagnostics below
var connection = new signalR.HubConnectionBuilder()
.withUrl("/signalr-video")
.configureLogging(signalR.LogLevel.Information)
.build();
connection.start().then(function () {
connection.stream("InitVideoReceive")
.subscribe({
next: function(item) {
if (queue.length === 0 && !!!sourceBuffer.updating) {
var buffer = new Uint8Array(atob(item).split("").map(function (c) {
return c.charCodeAt(0);
}));
sourceBuffer.appendBuffer(buffer);
console.log(blockindex++ + " : " + buffer.byteLength);
} else {
queue.push(item);
}
},
complete: function () {
queue.push('');
},
error: function (err) {
console.error(err);
}
});
});
}
video.src = window.URL.createObjectURL(mediaSource);
})
Chrome plays the video for only 3-5 seconds and then stalls to buffer, even though plenty of clusters have been transferred and appended to the SourceBuffer.
Here's the information from chrome://media-internals/:
Player Properties:
render_id: 217
player_id: 1
origin_url: http://localhost:52531/
frame_url: http://localhost:52531/
frame_title: Home Page
url: blob:http://localhost:52531/dcb25d89-9830-40a5-ba88-33c13b5c03eb
info: Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
pipeline_state: kSuspended
found_video_stream: true
video_codec_name: vp8
video_dds: false
video_decoder: FFmpegVideoDecoder
duration: unknown
height: 720
width: 1280
video_buffering_state: BUFFERING_HAVE_NOTHING
for_suspended_start: false
pipeline_buffering_state: BUFFERING_HAVE_NOTHING
event: PAUSE
Log
Timestamp Property Value
00:00:00 00 origin_url http://localhost:52531/
00:00:00 00 frame_url http://localhost:52531/
00:00:00 00 frame_title Home Page
00:00:00 00 url blob:http://localhost:52531/dcb25d89-9830-40a5-ba88-33c13b5c03eb
00:00:00 00 info ChunkDemuxer: buffering by DTS
00:00:00 35 pipeline_state kStarting
00:00:15 213 found_video_stream true
00:00:15 213 video_codec_name vp8
00:00:15 216 video_dds false
00:00:15 216 video_decoder FFmpegVideoDecoder
00:00:15 216 info Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
00:00:15 216 pipeline_state kPlaying
00:00:15 213 duration unknown
00:00:16 661 height 720
00:00:16 661 width 1280
00:00:16 665 video_buffering_state BUFFERING_HAVE_ENOUGH
00:00:16 665 for_suspended_start false
00:00:16 665 pipeline_buffering_state BUFFERING_HAVE_ENOUGH
00:00:16 667 pipeline_state kSuspending
00:00:16 670 pipeline_state kSuspended
00:00:52 759 info Effective playback rate changed from 0 to 1
00:00:52 759 event PLAY
00:00:52 759 pipeline_state kResuming
00:00:52 760 video_dds false
00:00:52 760 video_decoder FFmpegVideoDecoder
00:00:52 760 info Selected FFmpegVideoDecoder for video decoding, config: codec: vp8 format: 1 profile: vp8 coded size: [1280,720] visible rect: [0,0,1280,720] natural size: [1280,720] has extra data? false encryption scheme: Unencrypted rotation: 0°
00:00:52 760 pipeline_state kPlaying
00:00:52 793 height 720
00:00:52 793 width 1280
00:00:52 798 video_buffering_state BUFFERING_HAVE_ENOUGH
00:00:52 798 for_suspended_start false
00:00:52 798 pipeline_buffering_state BUFFERING_HAVE_ENOUGH
00:00:56 278 video_buffering_state BUFFERING_HAVE_NOTHING
00:00:56 295 for_suspended_start false
00:00:56 295 pipeline_buffering_state BUFFERING_HAVE_NOTHING
00:01:20 717 event PAUSE
00:01:33 538 event PLAY
00:01:35 94 event PAUSE
00:01:55 561 pipeline_state kSuspending
00:01:55 563 pipeline_state kSuspended
Can someone tell me what's wrong with my code, or does Chrome require some magic configuration to work?
Thanks.
Please excuse my English :)
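A minimal debugging sketch, assuming it runs where the video element above is in scope, that can help confirm whether the stall comes from a gap between video.currentTime and the buffered ranges (a common cause of BUFFERING_HAVE_NOTHING while data keeps arriving):
// Log the buffered ranges once a second and, if playback has fallen into a gap,
// jump currentTime to the start of the next buffered range.
setInterval(function () {
    var ranges = [];
    for (var i = 0; i < video.buffered.length; i++) {
        ranges.push(video.buffered.start(i).toFixed(2) + '-' + video.buffered.end(i).toFixed(2));
    }
    console.log('currentTime=' + video.currentTime.toFixed(2) + ' buffered=[' + ranges.join(', ') + ']');
    for (var j = 0; j < video.buffered.length; j++) {
        if (video.currentTime < video.buffered.start(j)) {
            video.currentTime = video.buffered.start(j);
            break;
        }
    }
}, 1000);
If the gaps line up with cluster boundaries, the cluster timecodes produced by ffmpeg are worth inspecting as well.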