
Media (3)
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
-
GetID3 - Additional buttons
9 April 2013, by
Updated: April 2013
Language: French
Type: Image
-
Collections - Quick creation form
19 February 2013, by
Updated: February 2013
Language: French
Type: Image
Other articles (54)
-
MediaSPIP v0.2
21 June 2013, by
MediaSPIP 0.2 is the first stable release of MediaSPIP.
Its official release date is 21 June 2013, as announced here.
The zip file provided here contains only the MediaSPIP sources in the standalone version.
As with the previous release, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, further changes are also required (...)
-
Making files available
14 April 2011, by
By default, when first set up, MediaSPIP does not let visitors download files, whether originals or the results of their transformation or encoding. It only lets them be viewed.
However, it is possible and easy to give visitors access to these documents, in various forms.
All of this is done on the skeleton configuration page. Go to the channel's administration area and choose, in the navigation, (...)
-
MediaSPIP version 0.1 Beta
16 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources in the standalone version.
For a working installation, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, further changes are also required (...)
On other sites (10428)
-
Output RTSP stream with ffmpeg
28 July 2015, by sipwiz
I'm attempting to use the ffmpeg libraries to send a video stream from my application to a media server (in this case Wowza). I have been able to do the reverse and consume an RTSP stream, but I'm having a few issues writing an RTSP stream.
I have found a few examples and attempted to utilise the relevant bits. The code is below, simplified as much as I can. I only want to send a single H264 bit stream to the Wowza server, which it can handle.
I get an "Integer division by zero" exception in the av_interleaved_write_frame function whenever I try to send a packet. The exception looks like it's related to the packet timestamps not being set correctly. I've tried different values and can get past the exception by setting some contrived values, but then the write call fails.
#include <iostream>
#include <fstream>
#include <sstream>
#include <cstring>
#include "stdafx.h"
#include "windows.h"
extern "C"
{
#include <libavutil/opt.h>
#include <libavutil/mathematics.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
}
using namespace std;
static int video_is_eof;
#define STREAM_DURATION 50.0
#define STREAM_FRAME_RATE 25 /* 25 images/s */
#define STREAM_PIX_FMT AV_PIX_FMT_YUV420P /* default pix_fmt */
#define VIDEO_CODEC_ID CODEC_ID_H264
static int sws_flags = SWS_BICUBIC;
/* video output */
static AVFrame *frame;
static AVPicture src_picture, dst_picture;
static int frame_count;
static int write_frame(AVFormatContext *fmt_ctx, const AVRational *time_base, AVStream *st, AVPacket *pkt)
{
/* rescale output packet timestamp values from codec to stream timebase */
pkt->pts = av_rescale_q_rnd(pkt->pts, *time_base, st->time_base, AVRounding(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
pkt->dts = av_rescale_q_rnd(pkt->dts, *time_base, st->time_base, AVRounding(AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
pkt->duration = av_rescale_q(pkt->duration, *time_base, st->time_base);
pkt->stream_index = st->index;
// Exception occurs here.
return av_interleaved_write_frame(fmt_ctx, pkt);
}
/* Add an output stream. */
static AVStream *add_stream(AVFormatContext *oc, AVCodec **codec, enum AVCodecID codec_id)
{
AVCodecContext *c;
AVStream *st;
/* find the encoder */
*codec = avcodec_find_encoder(codec_id);
if (!(*codec)) {
fprintf(stderr, "Could not find encoder for '%s'\n", avcodec_get_name(codec_id));
exit(1);
}
st = avformat_new_stream(oc, *codec);
if (!st) {
fprintf(stderr, "Could not allocate stream\n");
exit(1);
}
st->id = oc->nb_streams - 1;
c = st->codec;
c->codec_id = codec_id;
c->bit_rate = 400000;
c->width = 352;
c->height = 288;
c->time_base.den = STREAM_FRAME_RATE;
c->time_base.num = 1;
c->gop_size = 12; /* emit one intra frame every twelve frames at most */
c->pix_fmt = STREAM_PIX_FMT;
return st;
}
static void open_video(AVFormatContext *oc, AVCodec *codec, AVStream *st)
{
int ret;
AVCodecContext *c = st->codec;
/* open the codec */
ret = avcodec_open2(c, codec, NULL);
if (ret < 0) {
fprintf(stderr, "Could not open video codec: ");
exit(1);
}
/* allocate and init a re-usable frame */
frame = av_frame_alloc();
if (!frame) {
fprintf(stderr, "Could not allocate video frame\n");
exit(1);
}
frame->format = c->pix_fmt;
frame->width = c->width;
frame->height = c->height;
/* Allocate the encoded raw picture. */
ret = avpicture_alloc(&dst_picture, c->pix_fmt, c->width, c->height);
if (ret < 0) {
fprintf(stderr, "Could not allocate picture: ");
exit(1);
}
/* copy data and linesize picture pointers to frame */
*((AVPicture *)frame) = dst_picture;
}
/* Prepare a dummy image. */
static void fill_yuv_image(AVPicture *pict, int frame_index, int width, int height)
{
int x, y, i;
i = frame_index;
/* Y */
for (y = 0; y < height; y++)
for (x = 0; x < width; x++)
pict->data[0][y * pict->linesize[0] + x] = x + y + i * 3;
/* Cb and Cr */
for (y = 0; y < height / 2; y++) {
for (x = 0; x < width / 2; x++) {
pict->data[1][y * pict->linesize[1] + x] = 128 + y + i * 2;
pict->data[2][y * pict->linesize[2] + x] = 64 + x + i * 5;
}
}
}
static void write_video_frame(AVFormatContext *oc, AVStream *st, int flush)
{
int ret;
AVCodecContext *c = st->codec;
if (!flush) {
fill_yuv_image(&dst_picture, frame_count, c->width, c->height);
}
AVPacket pkt = { 0 };
int got_packet;
av_init_packet(&pkt);
/* encode the image */
frame->pts = frame_count;
ret = avcodec_encode_video2(c, &pkt, flush ? NULL : frame, &got_packet);
if (ret < 0) {
fprintf(stderr, "Error encoding video frame:");
exit(1);
}
/* If size is zero, it means the image was buffered. */
if (got_packet) {
ret = write_frame(oc, &c->time_base, st, &pkt);
}
else {
if (flush) {
video_is_eof = 1;
}
ret = 0;
}
if (ret < 0) {
fprintf(stderr, "Error while writing video frame: ");
exit(1);
}
frame_count++;
}
static void close_video(AVFormatContext *oc, AVStream *st)
{
avcodec_close(st->codec);
av_free(src_picture.data[0]);
av_free(dst_picture.data[0]);
av_frame_free(&frame);
}
int _tmain(int argc, _TCHAR* argv[])
{
printf("starting...\n");
const char *filename = "rtsp://test:password@192.168.33.19:1935/ffmpeg/0";
AVOutputFormat *fmt;
AVFormatContext *oc;
AVStream *video_st;
AVCodec *video_codec;
double video_time;
int flush, ret;
/* Initialize libavcodec, and register all codecs and formats. */
av_register_all();
avformat_network_init();
AVOutputFormat* oFmt = av_oformat_next(NULL);
while (oFmt) {
if (oFmt->video_codec == VIDEO_CODEC_ID) {
break;
}
oFmt = av_oformat_next(oFmt);
}
if (!oFmt) {
printf("Could not find the required output format.\n");
exit(1);
}
/* allocate the output media context */
avformat_alloc_output_context2(&oc, oFmt, "rtsp", filename);
if (!oc) {
printf("Could not set the output media context.\n");
exit(1);
}
fmt = oc->oformat;
if (!fmt) {
printf("Could not create the output format.\n");
exit(1);
}
video_st = NULL;
cout << "Codec = " << avcodec_get_name(fmt->video_codec) << endl;
if (fmt->video_codec != AV_CODEC_ID_NONE)
{
video_st = add_stream(oc, &video_codec, fmt->video_codec);
}
/* Now that all the parameters are set, we can open the video codec and allocate the necessary encode buffers. */
if (video_st) {
open_video(oc, video_codec, video_st);
}
av_dump_format(oc, 0, filename, 1);
char errorBuff[80];
if (!(fmt->flags & AVFMT_NOFILE)) {
ret = avio_open(&oc->pb, filename, AVIO_FLAG_WRITE);
if (ret < 0) {
fprintf(stderr, "Could not open outfile '%s': %s", filename, av_make_error_string(errorBuff, 80, ret));
return 1;
}
}
flush = 0;
while (video_st && !video_is_eof) {
/* Compute current video time. */
video_time = (video_st && !video_is_eof) ? video_st->pts.val * av_q2d(video_st->time_base) : INFINITY;
if (!flush && (!video_st || video_time >= STREAM_DURATION)) {
flush = 1;
}
if (video_st && !video_is_eof) {
write_video_frame(oc, video_st, flush);
}
}
if (video_st) {
close_video(oc, video_st);
}
if ((fmt->flags & AVFMT_NOFILE)) {
avio_close(oc->pb);
}
avformat_free_context(oc);
printf("finished.\n");
getchar();
return 0;
}
Does anyone have any insights about how the packet timestamps can be successfully set?
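One detail worth noting in the program above: avformat_write_header() is never called before the write loop, and the muxer needs the header to have been written so that the stream's time base is valid; an unset or zero time base is one plausible way to hit a division by zero inside av_interleaved_write_frame(). The following is a hypothetical sketch, not the poster's code and not guaranteed to be the fix: the helper name send_encoded_packet is invented for illustration, and it uses the same pre-3.x API as the question. It stamps the packet in the encoder's time base and rescales it into the stream's time base, which is equivalent to the three av_rescale_q_rnd()/av_rescale_q() calls in write_frame() above.

extern "C" {
#include <libavformat/avformat.h>
}

/* Hypothetical helper (name is not from the post). Assumes
   avformat_write_header(oc, NULL) has already been called, so st->time_base
   holds the muxer's real time base. */
static int send_encoded_packet(AVFormatContext *oc, AVCodecContext *enc,
                               AVStream *st, AVPacket *pkt, int64_t frame_index)
{
    /* In the encoder's time base (1/25 s here), frame N simply has pts = N. */
    if (pkt->pts == AV_NOPTS_VALUE)
        pkt->pts = frame_index;
    if (pkt->dts == AV_NOPTS_VALUE)
        pkt->dts = pkt->pts;
    if (pkt->duration <= 0)
        pkt->duration = 1;   /* one frame, still in enc->time_base units */

    /* Convert pts, dts and duration from enc->time_base to st->time_base. */
    av_packet_rescale_ts(pkt, enc->time_base, st->time_base);
    pkt->stream_index = st->index;

    return av_interleaved_write_frame(oc, pkt);
}

If the division by zero persists, calling avformat_write_header(oc, NULL) once before the first frame is written would be the first thing to try, since nothing in the posted main() does so.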
-
Mux raw H264 into MP4 using ffmpeg API
2 December 2016, by user1886318
I am trying to mux raw H264 frames into an MP4 container using ffmpeg in order to play the video with Media Source Extensions in a web browser.
The following command works like a charm:
ffmpeg -i video.h264 -movflags frag_keyframe+empty_moov fragmented.mp4
Now, I would like to do the same thing using the FFmpeg C++ API.
I wrote the following source code:
#ifdef _WIN32
#define snprintf _snprintf
#endif
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <math.h>
#include <stdint.h>
extern "C" {
#include <libavutil/avassert.h>
#include <libavutil/channel_layout.h>
#include <libavutil/opt.h>
#include <libavutil/mathematics.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>
#include <libswresample/swresample.h>
}
#include <time.h>
#if defined(_MSC_VER) || defined(_MSC_EXTENSIONS)
#define DELTA_EPOCH_IN_MICROSECS 11644473600000000Ui64
#else
#define DELTA_EPOCH_IN_MICROSECS 11644473600000000ULL
#endif
struct timezone
{
int tz_minuteswest; /* minutes W of Greenwich */
int tz_dsttime; /* type of dst correction */
};
#ifdef WIN32
#include <windows.h>
#include <sys/timeb.h>
int gettimeofday(struct timeval *tp, void *tz)
{
struct _timeb timebuffer;
_ftime(&timebuffer);
tp->tv_sec = timebuffer.time;
tp->tv_usec = timebuffer.millitm * 1000;
return 0;
}
#endif
// a wrapper around a single output AVStream
typedef struct OutputStream {
AVStream *st;
} OutputStream;
static int write_frame(AVFormatContext *fmt_ctx, const AVRational *time_base, AVStream *st, AVPacket *pkt)
{
return av_interleaved_write_frame(fmt_ctx, pkt);
}
/* Add an output stream. */
static void add_stream(OutputStream *ost, AVFormatContext *oc,
AVCodec **codec,
enum AVCodecID codec_id)
{
AVCodecContext *c;
*codec = avcodec_find_encoder(codec_id);
printf("avformat_new_stream\n");
ost->st = avformat_new_stream(oc, *codec);
if (!ost->st) {
fprintf(stderr, "Could not allocate stream\n");
exit(1);
}
ost->st->id = oc->nb_streams - 1;
c = ost->st->codec;
switch ((*codec)->type) {
case AVMEDIA_TYPE_VIDEO:
c->codec_id = codec_id;
c->width = 352;
c->height = 288;
ost->st->time_base = { 1, 15 };
c->time_base = ost->st->time_base;
c->pix_fmt = AV_PIX_FMT_YUV420P;
c->profile = FF_PROFILE_H264_BASELINE;
break;
default:
break;
}
/* Some formats want stream headers to be separate. */
if (oc->oformat->flags & AVFMT_GLOBALHEADER)
c->flags |= CODEC_FLAG_GLOBAL_HEADER;
}
uint64_t begin_timestamp_us = 0;
static int write_video_frame(AVFormatContext *oc, OutputStream *ost)
{
int ret = 0;
AVCodecContext *c;
c = ost->st->codec;
struct timeval now;
gettimeofday(&now, NULL);
uint64_t timestamp_us = (now.tv_sec * 1000000LL + now.tv_usec - begin_timestamp_us);
{
{
AVFormatContext *fmt_ctx = oc;
const AVRational *time_base = &c->time_base;
AVStream *st = ost->st;
char* buf;
AVPacket pkt = { 0 };
av_init_packet(&pkt);
char filename[256];
static int pts = 0;
snprintf(filename, 256, "avc_raw\\avc_raw_%03d.h264", pts);
FILE *f = fopen(filename, "rb");
int len = 0;
if (f){
fseek(f, 0, SEEK_END); // seek to end of file
len = ftell(f); // get current file pointer
//fseek(f, 0, SEEK_SET); // seek back to beginning of file
rewind(f);
buf = new char[len + 1]; // to delete
fread(buf, 1, len, f);
fclose(f);
}
else{
return 1;
}
int nal_type = buf[4] & 0x0f;
int is_IDR_nal = 0;
if (nal_type == 0x7){
//7 sps; 8 pps; IDR frame follow with sps
is_IDR_nal = 1;
pkt.flags |= AV_PKT_FLAG_KEY;
printf("frame %d is IDR\n", pts);
}
pkt.stream_index = ost->st->index;
pkt.data = (uint8_t*)buf;
pkt.size = len;
pkt.dts = pkt.pts = timestamp_us;
pts++;
write_frame(fmt_ctx, &c->time_base, st, &pkt);
}
}
if (ret < 0) {
fprintf(stderr, "Error while writing video frame: %s\n", ret);
exit(1);
}
return 0;
}
/**************************************************************/
/* media file output */
int main(int argc, char **argv)
{
OutputStream video_st = { 0 };
const char *filename;
AVOutputFormat *fmt;
AVFormatContext *oc;
AVCodec *video_codec;
int ret;
int encode_video = 0;
AVDictionary *opt = NULL;
/* Initialize libavcodec, and register all codecs and formats. */
av_register_all();
if (argc < 2) {
return 1;
}
filename = argv[1];
oc = avformat_alloc_context();
if (!oc) {
fprintf(stderr, "Memory error\n");
exit(1);
}
// auto detect the output format from the name. default is mpeg.
AVOutputFormat * avOutputFormat = av_guess_format(NULL, filename, NULL);
if (!avOutputFormat) {
printf("Could not deduce output format from file extension: using MPEG.\n");
avOutputFormat = av_guess_format("mpeg", NULL, NULL);
}
if (!avOutputFormat) {
fprintf(stderr, "Could not find suitable output format\n");
exit(1);
}
oc->oformat = avOutputFormat;
avOutputFormat->video_codec = AV_CODEC_ID_H264;
avOutputFormat->audio_codec = AV_CODEC_ID_NONE;
fmt = avOutputFormat;
/* Add the audio and video streams using the default format codecs
* and initialize the codecs. */
if (fmt->video_codec != AV_CODEC_ID_NONE) {
add_stream(&video_st, oc, &video_codec, fmt->video_codec);
encode_video = 1;
}
av_dump_format(oc, 0, filename, 1);
/* open the output file, if needed */
if (!(fmt->flags & AVFMT_NOFILE)) {
ret = avio_open(&oc->pb, filename, AVIO_FLAG_WRITE);
if (ret < 0) {
fprintf(stderr, "Could not open '%s': %s\n", filename,
ret);
return 1;
}
}
// Fragmented MP4
av_dict_set(&opt, "movflags", "empty_moov+frag_keyframe", 0);
/* Write the stream header, if any. */
ret = avformat_write_header(oc, &opt);
if (ret < 0) {
fprintf(stderr, "Error occurred when opening output file: %s\n",
ret);
return 1;
}
struct timeval now;
gettimeofday(&now, NULL);
begin_timestamp_us = (now.tv_sec * 1000000LL + now.tv_usec);
while (encode_video) {
encode_video = !write_video_frame(oc, &video_st);
_sleep(1);
}
av_write_trailer(oc);
if (!(fmt->flags & AVFMT_NOFILE))
/* Close the output file. */
avio_close(oc->pb);
/* free the stream */
avformat_free_context(oc);
_sleep(1000);
return 0;
}
The H264 files can be downloaded from this link: https://github.com/gjwang/ffmpeg_mp4_mux/tree/master/avc_raw
This generates an MP4 video that I can play using Chrome and VLC, but Firefox and some tools (like Windows Media Player) say the video is corrupted.
Here is a link to the generated video: https://ufile.io/5b3241
Thanks for your help.
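One thing that stands out in the code above is that pkt.pts and pkt.dts are raw wall-clock values in microseconds, while the output stream was created with time_base = {1, 15}, so the timestamps the mov muxer receives are not expressed in the stream's time base. Below is a hedged sketch of converting such a timestamp before muxing; the helper name stamp_packet_from_us is invented for illustration and is not part of any FFmpeg API.

extern "C" {
#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>
}

/* Hypothetical helper (not from the post): convert a wall-clock timestamp in
   microseconds into the output stream's time base before writing the packet.
   Assumes the stream was created with time_base = {1, 15} as in the question. */
static void stamp_packet_from_us(AVPacket *pkt, AVStream *st, int64_t timestamp_us)
{
    const AVRational microseconds = { 1, 1000000 };

    /* e.g. 2,000,000 us with a 1/15 time base becomes pts = 30 */
    pkt->pts = av_rescale_q(timestamp_us, microseconds, st->time_base);
    pkt->dts = pkt->pts;   /* no B-frames expected in a baseline-profile stream */
    pkt->stream_index = st->index;
}

Whether that alone explains the Firefox behaviour is uncertain; Firefox and Windows Media Player tend to be stricter than Chrome about fragmented-MP4 timing and about the SPS/PPS ending up in the track's avcC configuration record, so both are worth checking on the generated file with a tool such as ffprobe.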
-
Death of A Micro Center
21 September 2012, by Multimedia Mike — History
The Micro Center computer store located in Santa Clara, CA, USA closed recently:
I liked Micro Center. I have liked Micro Center ever since I first visited their Denver, CO location 10 years ago. I would sometimes drive an hour in each direction just to visit that shop. I was excited to see that they had a location in the Bay Area when I moved here a few years ago (despite the preponderance of Fry’s stores).
Now this location is gone. I wonder how much of the “we couldn’t come to favorable terms on a lease” was true (vs. an excuse to close a retail store at a time when more business is moving online, particularly in the heart of Silicon Valley). But that’s not what I wanted to discuss. I came here to discuss…
The Micro Center Window Logos
The craziest part about shopping the Santa Clara Micro Center location was the logos they displayed on the window outside. Every time I saw it, it made me sentimental for a time when some of these logos were current, or when some of these companies were still in business. Some of the logos on their front window were for companies I’ve never heard of. It reminds me of the nearby 7-11 convenience stores when I was growing up– their walls were decorated with people sporting embarrassingly 1970s styles long after the 1970s had transpired.
I thought I would record what those front window logos were and try to pinpoint when the store launched exactly (assuming the logos have been there since the initial opening and never changed).
Here we have Lotus, Hewlett Packard/HP, Corel, Fuji, Power Macintosh, NEC, and Fujitsu. Lotus was purchased by IBM in 1995 and still seems to be maintained as a separate brand. The Power Macintosh was introduced as a brand in 1994. Corel’s logo has seen a few mutations over the years but I don’t know when this one fell out of favor.
Fuji (vs. Fujitsu) appears to refer to Fujifilm, though this logo is also obsolete.
Hayes– I specifically remember reading the Slashdot post announcing that Hayes is dead (followed by many comments reminiscing about the Hayes command set). Here is the post, from early 1999.
From Googling, it doesn’t appear IBM still has a presence in the consumer computing space (though they do have something pertaining to software for consumer products). Then there’s the good old rainbow Apple logo, something that went away in 1997. I suspect 1997 was also the last hurrah of the name ‘Macintosh’ (though I remember mistakenly referring to Apple computer products as Macintoshes well into the mid-2000s and inadvertently angering some Apple enthusiasts).
As for the next segment, obviously, both Sony and Toshiba are still very much alive. Iomega was acquired by EMC in 2008 but is still maintained as a separate brand. USRobotics is still around and making — what else? — 56K modems (and their current logo is slightly different than the one seen here).
Targus seems to be a case maker (“Leading Provider of Cases, Bags and Accessories for Laptops and Tablets”). I wonder if that’s just their current business or if they had more areas long ago? It seems strange that they would get brand billing like this.
Finally, searching for information about Practical Peripherals only produces sites about how they’re long dead (like this history lesson). It’s unclear when they died.
The interior of this store was also decorated with more technology company logos near the ceiling (I didn’t really register that fact until I had visited many times). Regrettably, I now won’t be able to see how up to date those logos were.
Based on the data points above, it’s safe to conclude that the store opened in 1995 or 1996 (again, assuming the logos were placed at opening and never changed).
Epilogue
Here’s one more curious item still visible from the outside:
“See the world’s fastest PC!” Featuring an Intel Core 2 Extreme? That CPU dates back to 2007 and was succeeded by Nehalem in late 2008. So even that sign, which is presumably easier and cleaner to replace than the window logos, was absurdly out of date.