
Media (1)
-
The conservation of net art in the museum. The strategies at work
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (65)
-
MediaSPIP version 0.1 Beta
16 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, all of the software dependencies must be installed manually on the server.
If you want to use this archive for an installation in "farm" mode, you will also have to make other modifications (...) -
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm" mode, you will also need to make other manual (...) -
Improvement of the base version
13 September 2013. A nicer multiple select
The Chosen plugin improves the usability of multiple-select form fields. See the following two images for a comparison.
To use it, simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)
On other sites (8499)
-
Fragmented MP4 - problem playing in browser
12 June 2019, by PookyFan. I am trying to create fragmented MP4 from raw H264 video data so I can play it in an internet browser's player. My goal is to build a live streaming system, where a media server would send fragmented MP4 pieces to the browser. The server would buffer input data from a Raspberry Pi camera, which sends video as H264 frames. It would then mux that video data and make it available to the client. The browser would play the media data (muxed by the server and sent e.g. through a websocket) using Media Source Extensions.
For test purposes I wrote the following pieces of code (using many examples I found on the internet):
A C++ application using avcodec which muxes raw H264 video into fragmented MP4 and saves it to a file:
#define READBUFSIZE 4096
#define IOBUFSIZE 4096
#define ERRMSGSIZE 128
#include <cstdint>
#include <iostream>
#include <fstream>
#include <string>
#include <vector>
extern "C"
{
#include <libavformat/avformat.h>
#include <libavutil/error.h>
#include <libavutil/opt.h>
}
enum NalType : uint8_t
{
//NALs containing stream metadata
SEQ_PARAM_SET = 0x7,
PIC_PARAM_SET = 0x8
};
std::vector<uint8_t> outputData;
int mediaMuxCallback(void *opaque, uint8_t *buf, int bufSize)
{
outputData.insert(outputData.end(), buf, buf + bufSize);
return bufSize;
}
std::string getAvErrorString(int errNr)
{
char errMsg[ERRMSGSIZE];
av_strerror(errNr, errMsg, ERRMSGSIZE);
return std::string(errMsg);
}
int main(int argc, char **argv)
{
if(argc < 2)
{
std::cout << "Missing file name" << std::endl;
return 1;
}
std::fstream file(argv[1], std::ios::in | std::ios::binary);
if(!file.is_open())
{
std::cout << "Couldn't open file " << argv[1] << std::endl;
return 2;
}
std::vector<uint8_t> inputMediaData;
do
{
char buf[READBUFSIZE];
file.read(buf, READBUFSIZE);
int size = file.gcount();
if(size > 0)
inputMediaData.insert(inputMediaData.end(), buf, buf + size);
} while(!file.eof());
file.close();
//Initialize avcodec
av_register_all();
uint8_t *ioBuffer;
AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
AVCodecContext *codecCtxt = avcodec_alloc_context3(codec);
AVCodecParserContext *parserCtxt = av_parser_init(AV_CODEC_ID_H264);
AVOutputFormat *outputFormat = av_guess_format("mp4", nullptr, nullptr);
AVFormatContext *formatCtxt;
AVIOContext *ioCtxt;
AVStream *videoStream;
int res = avformat_alloc_output_context2(&formatCtxt, outputFormat, nullptr, nullptr);
if(res < 0)
{
std::cout << "Couldn't initialize format context; the error was: " << getAvErrorString(res) << std::endl;
return 3;
}
if((videoStream = avformat_new_stream( formatCtxt, avcodec_find_encoder(formatCtxt->oformat->video_codec) )) == nullptr)
{
std::cout << "Couldn't initialize video stream" << std::endl;
return 4;
}
else if(!codec)
{
std::cout << "Couldn't initialize codec" << std::endl;
return 5;
}
else if(codecCtxt == nullptr)
{
std::cout << "Couldn't initialize codec context" << std::endl;
return 6;
}
else if(parserCtxt == nullptr)
{
std::cout << "Couldn't initialize parser context" << std::endl;
return 7;
}
else if((ioBuffer = (uint8_t*)av_malloc(IOBUFSIZE)) == nullptr)
{
std::cout << "Couldn't allocate I/O buffer" << std::endl;
return 8;
}
else if((ioCtxt = avio_alloc_context(ioBuffer, IOBUFSIZE, 1, nullptr, nullptr, mediaMuxCallback, nullptr)) == nullptr)
{
std::cout << "Couldn't initialize I/O context" << std::endl;
return 9;
}
//Set video stream data
videoStream->id = formatCtxt->nb_streams - 1;
videoStream->codec->width = 1280;
videoStream->codec->height = 720;
videoStream->time_base.den = 60; //FPS
videoStream->time_base.num = 1;
videoStream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
formatCtxt->pb = ioCtxt;
//Retrieve SPS and PPS for codec extdata
const uint32_t synchMarker = 0x01000000;
unsigned int i = 0;
int spsStart = -1, ppsStart = -1;
uint16_t spsSize = 0, ppsSize = 0;
while(spsSize == 0 || ppsSize == 0)
{
uint32_t *curr = (uint32_t*)(inputMediaData.data() + i);
if(*curr == synchMarker)
{
unsigned int currentNalStart = i;
i += sizeof(uint32_t);
uint8_t nalType = inputMediaData.data()[i] & 0x1F;
if(nalType == SEQ_PARAM_SET)
spsStart = currentNalStart;
else if(nalType == PIC_PARAM_SET)
ppsStart = currentNalStart;
if(spsStart >= 0 && spsSize == 0 && spsStart != i)
spsSize = currentNalStart - spsStart;
else if(ppsStart >= 0 && ppsSize == 0 && ppsStart != i)
ppsSize = currentNalStart - ppsStart;
}
++i;
}
videoStream->codec->extradata = inputMediaData.data() + spsStart;
videoStream->codec->extradata_size = ppsStart + ppsSize;
//Write main header
AVDictionary *options = nullptr;
av_dict_set(&options, "movflags", "frag_custom+empty_moov", 0);
res = avformat_write_header(formatCtxt, &options);
if(res < 0)
{
std::cout << "Couldn't write container main header; the error was: " << getAvErrorString(res) << std::endl;
return 10;
}
//Retrieve frames from input video and wrap them in container
int currentInputIndex = 0;
int framesInSecond = 0;
while(currentInputIndex < inputMediaData.size())
{
uint8_t *frameBuffer;
int frameSize;
res = av_parser_parse2(parserCtxt, codecCtxt, &frameBuffer, &frameSize, inputMediaData.data() + currentInputIndex,
inputMediaData.size() - currentInputIndex, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
if(frameSize == 0) //No more frames while some data still remains (is that even possible?)
{
std::cout << "Some data left unparsed: " << std::to_string(inputMediaData.size() - currentInputIndex) << std::endl;
break;
}
//Prepare packet with video frame to be dumped into container
AVPacket packet;
av_init_packet(&packet);
packet.data = frameBuffer;
packet.size = frameSize;
packet.stream_index = videoStream->index;
currentInputIndex += frameSize;
//Write packet to the video stream
res = av_write_frame(formatCtxt, &packet);
if(res < 0)
{
std::cout << "Couldn't write packet with video frame; the error was: " << getAvErrorString(res) << std::endl;
return 11;
}
if(++framesInSecond == 60) //We want 1 segment per second
{
framesInSecond = 0;
res = av_write_frame(formatCtxt, nullptr); //Flush segment
}
}
res = av_write_frame(formatCtxt, nullptr); //Flush if something has been left
//Write media data in container to file
file.open("my_mp4.mp4", std::ios::out | std::ios::binary);
if(!file.is_open())
{
std::cout << "Couldn't open output file " << std::endl;
return 12;
}
file.write((char*)outputData.data(), outputData.size());
if(file.fail())
{
std::cout << "Couldn't write to file" << std::endl;
return 13;
}
std::cout << "Media file muxed successfully" << std::endl;
return 0;
}
(I hardcoded a few values, such as the video dimensions and framerate, but as I said, this is just test code.)
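For reference, one typical way to build a test program like this against the FFmpeg libraries (assuming pkg-config is available; the source and binary names here are arbitrary):
g++ -o muxer muxer.cpp $(pkg-config --cflags --libs libavformat libavcodec libavutil)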
Simple HTML webpage using MSE to play my fragmented MP4
<video width="1280" height="720" controls="controls">
</video>
<script>
var vidElement = document.querySelector('video');

if (window.MediaSource) {
  var mediaSource = new MediaSource();
  vidElement.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
  console.log("The Media Source Extensions API is not supported.");
}

function sourceOpen(e) {
  URL.revokeObjectURL(vidElement.src);
  var mime = 'video/mp4; codecs="avc1.640028"';
  var mediaSource = e.target;
  var sourceBuffer = mediaSource.addSourceBuffer(mime);
  var videoUrl = 'my_mp4.mp4';
  fetch(videoUrl)
    .then(function(response) {
      return response.arrayBuffer();
    })
    .then(function(arrayBuffer) {
      sourceBuffer.addEventListener('updateend', function(e) {
        if (!sourceBuffer.updating && mediaSource.readyState === 'open') {
          mediaSource.endOfStream();
        }
      });
      sourceBuffer.appendBuffer(arrayBuffer);
    });
}
</script>
The output MP4 file generated by my C++ application can be played e.g. in MPC, but it doesn't play in any web browser I tested it with. It also doesn't have any duration (MPC keeps showing 00:00).
To have something to compare my application's output MP4 against, I also used FFmpeg to create a fragmented MP4 file from the same source file with the raw H264 stream. I used the following command:
ffmpeg -r 60 -i input.h264 -c:v copy -f mp4 -movflags empty_moov+default_base_moof+frag_keyframe test.mp4
The file generated by FFmpeg plays correctly in every web browser I used for the tests. It also has a correct duration (though it also has a trailing atom, which wouldn't be present in my live stream anyway - and since I need a live stream, it won't have any fixed duration in the first place).
The MP4 atoms of both files look very similar (they have an identical avcc section for sure). What's interesting (though I'm not sure if it's of any importance) is that both files use a different NAL format than the input file (the RPi camera produces the video stream in Annex-B format, while the output MP4 files contain NALs in AVCC format... or at least that looks like the case when I compare the mdat atoms with the input H264 data).
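For context, the two framings differ only in how NAL units are delimited: Annex-B separates NALs with 0x000001 or 0x00000001 start codes, while AVCC prefixes every NAL with its size as a big-endian integer (typically 4 bytes), which is the rewrite the muxer appears to be performing. A minimal, self-contained sketch of that conversion (assuming 4-byte start codes only):
#include <cstddef>
#include <cstdint>
#include <vector>
//Replace each 4-byte Annex-B start code with a 4-byte big-endian NAL length
std::vector<uint8_t> annexbToAvcc(const std::vector<uint8_t> &in)
{
    std::vector<uint8_t> out;
    std::size_t nalStart = 0;
    auto flush = [&](std::size_t nalEnd) {
        if(nalEnd <= nalStart)
            return;
        uint32_t len = uint32_t(nalEnd - nalStart);
        out.push_back(uint8_t(len >> 24));
        out.push_back(uint8_t(len >> 16));
        out.push_back(uint8_t(len >> 8));
        out.push_back(uint8_t(len));
        out.insert(out.end(), in.begin() + nalStart, in.begin() + nalEnd);
    };
    for(std::size_t i = 0; i + 4 <= in.size(); ++i)
    {
        if(in[i] == 0 && in[i + 1] == 0 && in[i + 2] == 0 && in[i + 3] == 1)
        {
            flush(i); //Emit the NAL unit that ended at this start code
            nalStart = i + 4; //Skip past the start code
            i += 3;
        }
    }
    flush(in.size()); //Emit the final NAL unit
    return out;
}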
I assume there is some field (or a few fields) I need to set for avcodec to make it produce a video stream that browsers' players can properly decode and play. But which field(s) do I need to set? Or maybe the problem lies somewhere else? I've run out of ideas.
EDIT 1:
As suggested, I investigated the binary content of both MP4 files (the one generated by my app and the one from the FFmpeg tool) with a hex editor. What I can confirm:
- both files have an identical avcc section (they match perfectly and are in AVCC format; I analyzed it byte by byte and there's no mistake about it)
- both files have NALs in AVCC format (I looked closely at the mdat atoms and they don't differ between the two MP4 files)
So I guess there's nothing wrong with the extradata creation in my code - avcodec takes care of it properly, even if I just feed it the SPS and PPS NALs. It converts them by itself, so there's no need for me to do it by hand. Still, my original problem remains.
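As a side note on that handoff: libavcodec expects extradata to live in an av_malloc()ed buffer owned by the codec context (it is freed together with the context), so pointing it straight into inputMediaData as the code above does is risky. A minimal sketch of the owned-copy variant, reusing the spsStart/ppsStart/spsSize/ppsSize values found by the search loop (requires <cstring> for memcpy/memset):
//Give the codec context an owned, padded copy of the SPS and PPS bytes
int extradataSize = spsSize + ppsSize;
uint8_t *extradata = (uint8_t*)av_malloc(extradataSize + AV_INPUT_BUFFER_PADDING_SIZE);
memcpy(extradata, inputMediaData.data() + spsStart, spsSize);
memcpy(extradata + spsSize, inputMediaData.data() + ppsStart, ppsSize);
memset(extradata + extradataSize, 0, AV_INPUT_BUFFER_PADDING_SIZE);
videoStream->codec->extradata = extradata;
videoStream->codec->extradata_size = extradataSize;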
EDIT 2: I achieved partial success - the MP4 generated by my app now plays in Firefox. I added this line to the code (along with the rest of the stream initialization):
videoStream->codec->time_base = videoStream->time_base;
So now this section of my code looks like this:
//Set video stream data
videoStream->id = formatCtxt->nb_streams - 1;
videoStream->codec->width = 1280;
videoStream->codec->height = 720;
videoStream->time_base.den = 60; //FPS
videoStream->time_base.num = 1;
videoStream->codec->time_base = videoStream->time_base;
videoStream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
formatCtxt->pb = ioCtxt;
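One more note on the muxing loop above: the packets are written with no timing information at all (pts, dts and duration are never set on the AVPacket), while fragmented MP4 stores per-sample timings that players rely on for duration display and smooth playback. A hedged sketch of stamping each packet before the av_write_frame() call, assuming the 1/60 time base set above and a hypothetical running frame counter frameIdx:
//With time_base = 1/60, one frame lasts exactly one tick
packet.pts = frameIdx;
packet.dts = frameIdx;
packet.duration = 1;
++frameIdx;
-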
ffmpeg not returning duration, can't play video until complete. Stream images 2 video via PHP
17 February 2014, by John J. I am really struggling with ffmpeg. I am trying to convert images to video; I have an IP camera which I am recording from. The recordings are MJPEGs, one frame per image.
I am trying to create a PHP script so I can recreate a video from date to date; this requires feeding the images in via image2pipe and then creating the video.
The trouble is, ffmpeg does not return the duration and start stats, so I have no way of working out when the video is done or what percentage is done. The video won't play until it's finished, and that's not a very good user experience.
Any ideas how I can resolve this? The video format can be anything; I am open to suggestions.
PHP:
//Shell command
exec('cat /image/dir/*.jpg | ffmpeg -y -c:v mjpeg -f image2pipe -r 10 -i - -c:v libx264 -pix_fmt yuv420p -movflags +faststart myvids/vidname.mp4 1>vidname.txt 2>&1')
//This is loaded via javascript when the video is loaded (which is failing due to the stats being wrong)
$video_play = "<video width=\"320\" height=\"240\" src=\"myvids/vidname.mp4\" type=\"video/mp4\" controls=\"controls\" preload=\"none\"></video>";
Javascript:
//Javascript to create the loop until video is loaded
<script>
$(document).ready(function() {
  var loader = $("#clip_load").percentageLoader();
  $.ajaxSetup({ cache: false }); // This part addresses an IE bug. Without it, IE will only load the first number and will never refresh
  var interval = setInterval(updateProgress, 1000);
  function updateProgress(){ $.get( "'.base_url().'video/getVideoCompile_Process?l='.$vid_name.'-output.txt&t=per", function( data ) { if(data=>\'100\'){ $("#clip_load").html(\''.$video_play.'\'); clearInterval(interval); }else{ loader.setProgress(data); } }); }
});
</script>
PHP (this page is called via javascript):
//This is the script which returns the current percentage
$logloc = $this->input->get('l');
$content = @file_get_contents($logloc);
if($content){
//get duration of source
preg_match("/Duration: (.*?), start:/", $content, $matches);
$rawDuration = $matches[1];
//rawDuration is in 00:00:00.00 format. This converts it to seconds.
$ar = array_reverse(explode(":", $rawDuration));
$duration = floatval($ar[0]);
if (!empty($ar[1])) $duration += intval($ar[1]) * 60;
if (!empty($ar[2])) $duration += intval($ar[2]) * 60 * 60;
//get the time in the file that is already encoded
preg_match_all("/time=(.*?) bitrate/", $content, $matches);
$rawTime = array_pop($matches);
//this is needed if there is more than one match
if (is_array($rawTime)){$rawTime = array_pop($rawTime);}
//rawTime is in 00:00:00.00 format. This converts it to seconds.
$ar = array_reverse(explode(":", $rawTime));
$time = floatval($ar[0]);
if (!empty($ar[1])) $time += intval($ar[1]) * 60;
if (!empty($ar[2])) $time += intval($ar[2]) * 60 * 60;
//calculate the progress
$progress = round(($time/$duration) * 100);
if ($this->input->get('t')=='per'){
echo $progress;
}else{
echo "Duration: " . $duration . "<br />";
echo "Current Time: " . $time . "<br />";
echo "Progress: " . $progress . "%";}
}else{ echo "cannot locate";}
Thanks
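A note on the parsing approach above: with an image2pipe input, ffmpeg cannot know the total duration up front, which is why the "Duration:" line the script greps for may never appear. ffmpeg does offer a machine-readable alternative, the -progress option, which writes key=value blocks (including frame=, out_time= and progress=continue/end) to a file or pipe. A hedged variant of the command (progress.txt is an arbitrary name); since the total frame count equals the number of input images, the percentage can be computed from the frame= key instead of time/duration:
cat /image/dir/*.jpg | ffmpeg -y -c:v mjpeg -f image2pipe -r 10 -i - -c:v libx264 -pix_fmt yuv420p -movflags +faststart -progress progress.txt myvids/vidname.mp4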
-
Android Third Party Media Player SDK to replace Default MediaPlayer under Apache Licence [on hold]
22 April 2015, by Amanda Fernandez. I googled a lot and found that there is no way to alter the buffer size of the default media player, so I need to use a custom player. I found two libraries:
1. ffmpeg
2. GStreamer
But both are under a GNU licence.
I am confused about the licensing: if I use these two libraries, do I have to make my app's source code open source?
I also came across the Brightcove library. It's a native video player NDK. Can it be used for streaming audio? Are there any other third-party libraries available for media players?