
Other articles (53)
-
Multilang: improving the interface for multilingual blocks
18 February 2011
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
Once it has been activated, MediaSPIP init automatically sets up a preconfiguration so that the new feature is immediately operational. No dedicated configuration step is therefore required.
-
Authorisations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or higher. If in doubt, contact the administrator of your MédiaSpip to find out.
On other sites (9999)
-
FFmpeg streaming to Akamai like FMLE does
21 June 2013, by xdev.developer
I am trying to stream video from my webcam to the AkamaiHD service using ffmpeg (the way it is done in Flash Media Live Encoder):
ffmpeg -f dshow -i video="Webcam C110" -s 640x360 -aspect 16:9 -profile:v baseline -pix_fmt yuv420p -vcodec libx264 -f flv "rtmp://..."
...
Input #0, dshow, from 'video=Webcam C110':
Duration: N/A, start: 31296.194000, bitrate: N/A
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 30 tbr, 10000k tbn, 30 tbc
Output #0
Metadata:
encoder : Lavf55.8.102
Stream #0:0: Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 640x360 [SAR 1:1 DAR 16:9], q=-1--1, 1k tbn, 30 tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo -> libx264)
...
The video is streamed, but when I try to view it at http://mediapm.edgesuite.net/edgeflash/public/zeri/debug/Main.html?url=myplayback_url/manifest.f4m it is not displayed. I've found out that if the video is recorded using FMLE and then restreamed to Akamai, the HLS stream does play:
ffmpeg -re -i sample.f4v -c copy -f flv "rtmp://..."
....
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'sample.f4v':
Metadata:
major_brand : f4v
minor_version : 0
compatible_brands: isommp42m4v
creation_time : 2018-10-06 09:23:33
Duration: 00:01:01.77, start: 0.460000, bitrate: 718 kb/s
Stream #0:0(eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p, 640x360 [SAR 1:1 DAR 16:9], 624 kb/s, 30 tbr, 1k tbn, 48 tbc
Metadata:
creation_time : 2018-10-06 09:23:33
handler_name : MainConcept
Output #0
Metadata:
major_brand : f4v
minor_version : 0
compatible_brands: isommp42m4v
encoder : Lavf55.8.102
Stream #0:0(eng): Video: h264 ([7][0][0][0] / 0x0007), yuv420p, 640x360 [SAR 1:1 DAR 16:9], q=2-31, 624 kb/s, 1k tbn, 1k tbc
Metadata:
creation_time : 2018-10-06 09:23:33
handler_name : MainConcept
Stream mapping:
Stream #0:0 -> #0:0 (copy)
...
It seems that the problem lies in the H.264 codec configuration, but I have not found a solution yet.
Could you please advise how I can implement streaming like FMLE does, using ffmpeg?
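A command along the following lines may come closer to FMLE's defaults; the fixed key-frame interval, frame rate and bitrate below are assumptions, not values confirmed by Akamai or Adobe:

ffmpeg -f dshow -r 30 -i video="Webcam C110" -s 640x360 -aspect 16:9 -vcodec libx264 -profile:v baseline -pix_fmt yuv420p -g 60 -keyint_min 60 -b:v 700k -an -f flv "rtmp://..."

FMLE encodes with a constant frame rate and a regular key-frame interval, and HDS/HLS packagers generally rely on key frames to cut fragments, which is why those options are pinned here.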
-
Streaming a programmatically created video to YouTube using node and ffmpeg
23 September 2020, by Caltrop
I've been trying to livestream a programmatically created image to YouTube using node, and I've had very limited success using FFmpeg. While I have managed to create and save an image thanks to this insightful thread, I have yet to make the code work for streaming to an RTMP server.


const cp = require('child_process'),
 destination = 'rtmp://a.rtmp.youtube.com/live2/[redacted]', //stream token redacted
 proc = cp.spawn('./ffmpeg/bin/ffmpeg.exe', [
 '-f', 'rawvideo',
 '-pix_fmt', 'rgb24',
 '-s', '426x240',
 '-i', '-', //allow us to insert a buffer through stdin
 '-f', 'flv',
 destination
 ]);

proc.stderr.pipe(process.stdout);

(function loop() {
 setTimeout(loop, 1000 / 30); //run loop at 30 fps
 const data = Array.from({length: 426 * 240 * 4}, () => ~~(Math.random() * 0xff)); //create array with random data
 proc.stdin.write(Buffer.from(data)); //convert array to buffer and send it to ffmpeg
})();



When running this code no errors appear and everything seems to be working; however, YouTube reports that no data is being received. Does anybody know what is going wrong here?


Update: This is really counter-intuitive, but adding a slash to the destination like this
'rtmp://a.rtmp.youtube.com/live2/[redacted]/'
causes ffmpeg to throw a generic I/O error. This is really weird to me. Apologies if the answer to this is obvious; I'm really inexperienced with ffmpeg.
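
For what it's worth, a few details in the snippet above look suspect: rgb24 is 3 bytes per pixel while the buffer is built with 4, no input frame rate is declared, and with no codecs specified the FLV output will not use the H.264/AAC combination that YouTube's RTMP ingest expects (YouTube also wants an audio track, faked here with a silent anullsrc input). The following sketch of the spawn arguments is an untested guess along those lines, not a confirmed fix:

const cp = require('child_process');
const destination = 'rtmp://a.rtmp.youtube.com/live2/[redacted]'; // stream key redacted, as above
const proc = cp.spawn('./ffmpeg/bin/ffmpeg.exe', [
 '-f', 'rawvideo',
 '-pix_fmt', 'rgb24',
 '-s', '426x240',
 '-r', '30', // declare the frame rate of the raw frames arriving on stdin
 '-i', '-',
 '-f', 'lavfi',
 '-i', 'anullsrc', // silent audio track, since YouTube expects audio
 '-c:v', 'libx264',
 '-pix_fmt', 'yuv420p',
 '-c:a', 'aac',
 '-f', 'flv',
 destination
]);
// Each frame written to proc.stdin must then be 426 * 240 * 3 bytes
// (3 bytes per rgb24 pixel), not 426 * 240 * 4 as in the loop above.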

-
FFmpeg conversion RGB to YUV420 to RGB wrong result
29 August 2020, by sipwiz
I'm attempting to sort out a video encoding issue and, along the way, making sure I can convert between pixel formats with FFmpeg. I'm having a problem converting a dummy RGB24 bitmap to YUV420 and back again.


Below is my test program:


#include "Windows.h"

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavformat/avio.h>
#include <libavutil/imgutils.h>
#include <libswscale/swscale.h>
#include <libavutil/time.h>

#define WIDTH 32
#define HEIGHT 32
#define ERROR_LEN 128
#define SWS_FLAGS SWS_BICUBIC

char _errorLog[ERROR_LEN];
void CreateBitmapFile(LPCWSTR fileName, long width, long height, WORD bitsPerPixel, BYTE* bitmapData, DWORD bitmapDataLength);

int main()
{
 printf("FFmpeg Pixel Conversion Test\n");

 av_log_set_level(AV_LOG_DEBUG);

 int w = WIDTH;
 int h = HEIGHT;

 struct SwsContext* rgbToI420Context;
 struct SwsContext* i420ToRgbContext;

 rgbToI420Context = sws_getContext(w, h, AV_PIX_FMT_RGB24, w, h, AV_PIX_FMT_YUV420P, SWS_FLAGS, NULL, NULL, NULL);
 if (rgbToI420Context == NULL) {
 fprintf(stderr, "Failed to allocate RGB to I420 conversion context.\n");
 }

 i420ToRgbContext = sws_getContext(w, h, AV_PIX_FMT_YUV420P, w, h, AV_PIX_FMT_RGB24, SWS_FLAGS, NULL, NULL, NULL);
 if (i420ToRgbContext == NULL) {
 fprintf(stderr, "Failed to allocate I420 to RGB conversion context.\n");
 }

 // Create dummy bitmap.
 uint8_t rgbRaw[WIDTH * HEIGHT * 3];
 for (int row = 0; row < 32; row++)
 {
 for (int col = 0; col < 32; col++)
 {
 int index = row * WIDTH * 3 + col * 3;

 int red = (row < 16 && col < 16) ? 255 : 0;
 int green = (row < 16 && col > 16) ? 255 : 0;
 int blue = (row > 16 && col < 16) ? 255 : 0;

 rgbRaw[index] = (byte)red;
 rgbRaw[index + 1] = (byte)green;
 rgbRaw[index + 2] = (byte)blue;
 }
 }

 CreateBitmapFile(L"test-reference.bmp", WIDTH, HEIGHT, 24, rgbRaw, WIDTH * HEIGHT * 3);

 printf("Converting RGB to I420.\n");

 uint8_t* rgb[3];
 uint8_t* i420[3];
 int rgbStride[3], i420Stride[3];

 rgbStride[0] = w * 3;
 i420Stride[0] = w * h;
 i420Stride[1] = w * h / 4;
 i420Stride[2] = w * h / 4;

 rgb[0] = rgbRaw;
 i420[0] = (uint8_t*)malloc((size_t)i420Stride[0] * h);
 i420[1] = (uint8_t*)malloc((size_t)i420Stride[1] * h);
 i420[2] = (uint8_t*)malloc((size_t)i420Stride[2] * h);

 int toI420Res = sws_scale(rgbToI420Context, rgb, rgbStride, 0, h, i420, i420Stride);
 if (toI420Res < 0) {
 fprintf(stderr, "Conversion from RGB to I420 failed, %s.\n", av_make_error_string(_errorLog, ERROR_LEN, toI420Res));
 }

 printf("Converting I420 to RGB.\n");

 uint8_t* rgbOut[3];
 int rgbOutStride[3];

 rgbOutStride[0] = w * 3;
 rgbOut[0] = (uint8_t*)malloc((size_t)rgbOutStride[0] * h);

 int toRgbRes = sws_scale(i420ToRgbContext, i420, i420Stride, 0, h, rgbOut, rgbOutStride);
 if (toRgbRes < 0) {
 fprintf(stderr, "Conversion from RGB to I420 failed, %s.\n", av_make_error_string(_errorLog, ERROR_LEN, toRgbRes));
 }

 CreateBitmapFile(L"test-output.bmp", WIDTH, HEIGHT, 24, rgbOut, WIDTH * HEIGHT * 3);

 free(rgbOut[0]);

 for (int i = 0; i < 3; i++) {
 free(i420[i]);
 }

 sws_freeContext(rgbToI420Context);
 sws_freeContext(i420ToRgbContext);

 return 0;
}

/**
* Creates a bitmap file and writes to disk.
* @param[in] fileName: the path to save the file at.
* @param[in] width: the width of the bitmap.
* @param[in] height: the height of the bitmap.
* @param[in] bitsPerPixel: colour depth of the bitmap pixels (typically 24 or 32).
* @param[in] bitmapData: a pointer to the bytes containing the bitmap data.
* @param[in] bitmapDataLength: the number of bytes of bitmap data.
*/
void CreateBitmapFile(LPCWSTR fileName, long width, long height, WORD bitsPerPixel, BYTE* bitmapData, DWORD bitmapDataLength)
{
 HANDLE file;
 BITMAPFILEHEADER fileHeader;
 BITMAPINFOHEADER fileInfo;
 DWORD writePosn = 0;

 file = CreateFile(fileName, GENERIC_WRITE, 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL); //Sets up the new bmp to be written to

 fileHeader.bfType = 19778; //Sets our type to BM or bmp
 fileHeader.bfSize = sizeof(fileHeader.bfOffBits) + sizeof(RGBTRIPLE); //Sets the size equal to the size of the header struct
 fileHeader.bfReserved1 = 0; //sets the reserves to 0
 fileHeader.bfReserved2 = 0;
 fileHeader.bfOffBits = sizeof(BITMAPFILEHEADER) + sizeof(BITMAPINFOHEADER); //Sets offbits equal to the size of file and info header
 fileInfo.biSize = sizeof(BITMAPINFOHEADER);
 fileInfo.biWidth = width;
 fileInfo.biHeight = height;
 fileInfo.biPlanes = 1;
 fileInfo.biBitCount = bitsPerPixel;
 fileInfo.biCompression = BI_RGB;
 fileInfo.biSizeImage = width * height * (bitsPerPixel / 8);
 fileInfo.biXPelsPerMeter = 2400;
 fileInfo.biYPelsPerMeter = 2400;
 fileInfo.biClrImportant = 0;
 fileInfo.biClrUsed = 0;

 WriteFile(file, &fileHeader, sizeof(fileHeader), &writePosn, NULL);

 WriteFile(file, &fileInfo, sizeof(fileInfo), &writePosn, NULL);

 WriteFile(file, bitmapData, bitmapDataLength, &writePosn, NULL);

 CloseHandle(file);
}



The source bitmap and the output bitmap after the two conversions are shown as images in the original post (not reproduced here); the output does not match the source.
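
One thing that stands out, though it is not verified as the cause of the wrong result: sws_scale() expects per-line strides, whereas the code above passes plane sizes (w * h and w * h / 4) in i420Stride. For a 32x32 YUV420P frame the strides would be 32, 16 and 16, and the two chroma planes only have h / 2 lines each. A minimal sketch of the allocation under that assumption:

 // Strides are bytes per line, not bytes per plane.
 int i420Stride[3] = { w, w / 2, w / 2 };
 uint8_t* i420[3];
 i420[0] = (uint8_t*)malloc((size_t)i420Stride[0] * h); // Y plane: w x h
 i420[1] = (uint8_t*)malloc((size_t)i420Stride[1] * (h / 2)); // U plane: (w/2) x (h/2)
 i420[2] = (uint8_t*)malloc((size_t)i420Stride[2] * (h / 2)); // V plane: (w/2) x (h/2)

 int toI420Res = sws_scale(rgbToI420Context, rgb, rgbStride, 0, h, i420, i420Stride);

An alternative is to let av_image_alloc() from libavutil/imgutils.h compute the plane pointers and strides for AV_PIX_FMT_YUV420P instead of allocating the planes by hand.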