
Media (2)
-
SPIP - plugins - embed code - Example
2 September 2013
Updated: September 2013
Language: French
Type: Image
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (99)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Videos
21 April 2011
Like "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 video tag.
One drawback of this tag is that it is not correctly recognized by some browsers (Internet Explorer, to name one), and each browser only handles certain video formats natively.
Its main advantage is that video playback is supported natively by the browser, which avoids relying on Flash and (...)
-
(De)Activating features (plugins)
18 February 2011
To manage adding and removing extra features (plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To get there, go to the configuration area and open the "Gestion des plugins" (plugin management) page.
MediaSPIP ships by default with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...)
On other sites (10134)
-
Overlay system time with milliseconds in ffmpeg
8 September 2017, by userDtrm
I need to overlay the current system time with milliseconds in ffmpeg. The solutions I've come across so far simply display the pts or gmtime (which doesn't show milliseconds). Please find below the script that I'm currently using; it simply shows the pts timestamp.
ffmpeg -f v4l2 -input_format yuyv422 -framerate 30 -s 640x480 -i /dev/video0 -filter_complex "drawtext=fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf: text='frame %{n}\\: %{pict_type}\\: pts=%{pts\\:hms}': x=100: y=50: fontsize=24: fontcolor=yellow@0.8: box=1: boxcolor=blue@0.9" -analyzeduration 5 -c:v libx264 -profile:v baseline -trellis 0 -subq 1 -level 32 -preset ultrafast -tune zerolatency -me_method epzs -r 50 -crf 20 -threads 0 -bufsize 1 -coder 0 -b_strategy 0 -bf 0 -sc_threshold 0 -x264-params vbv-maxrate=2000:vbv-bufsize=100:slice-max-size=200:keyint=10:min-keyint=10:partitions=-parti8x8-partp8x8:me=dia:qpmin=10:qpmax=51:qpstep=4:ref=1: -pix_fmt yuv420p -an -f mpegts - | nc -u 131.227.87.152 7001
Can someone please let me know how to get the current system time with milliseconds into the ffmpeg output?
Thanks.
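One direction worth trying, sketched here rather than taken from the post: drawtext can expand %{localtime} with a strftime format for the wall-clock time, and an %{eif\:...} expression can append a zero-padded value derived from the frame timestamp t, which approximates the millisecond part (it tracks stream time rather than the system clock, and the availability of these text expansions depends on the ffmpeg version). The drawtext filter above would then look something like:

 drawtext=fontfile=/usr/share/fonts/truetype/freefont/FreeSerif.ttf: text='%{localtime\:%T}.%{eif\:1000*(t-trunc(t))\:d\:3}': x=100: y=50: fontsize=24: fontcolor=yellow@0.8: box=1: boxcolor=blue@0.9

The rest of the command line stays as it is; only the text option changes.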
-
Is there an effective and cheap/free way to host video for a mobile app that must be approved by an admin before going live? [on hold]
9 August 2013, by user2658775
We are building a mobile app for the iOS and Android operating systems. The app is to be a communication platform for members within an organization. Content is generated by users and submitted to the admin; once approved by the admin, the content is pushed to the app. One feature of the app is the ability to upload video.
We are having a tough time attempting to figure out the best way to do this. Because the app will be representing the organization, the organization must have control over the approval process.
So far we have come up with the following options:
Option 1: purchase a dedicated server from a hosting service provider. The basic package with Blue host is $150/month, which is fairly expensive.
Option 2: have the users post to YouTube using their personal accounts. Upon posting to YouTube (via the app), the app would send a notification to the admin that a new video has been posted. The admin would review the video and, if acceptable, would use the URL to post the video to the app. This option, while free, requires many steps that will bog down the submission process.
Does anyone know of an effective way to post video to an app that requires approval by an admin?
-
RGB to YUV conversion with libav (ffmpeg) triplicates image
17 April 2021, by José Tomás Tocino
I'm building a small program to capture the screen (using the X11 MIT-SHM extension) into a video. It works well if I create individual PNG files of the captured frames, but now that I'm trying to integrate libav (ffmpeg) to create the video I'm getting... funny results.


The furthest I've been able to reach is this. The expected result (which is a PNG created directly from the RGB data of the XImage) is this:

However, the result I'm getting is this:

As you can see the colors are funky and the image appears cropped three times. I have a loop where I capture the screen; first I generate the individual PNG files (currently commented in the code below), and then I try to use libswscale to convert from RGB24 to YUV420:


while (gRunning) {
    printf("Processing frame framecnt=%i \n", framecnt);

    if (!XShmGetImage(display, RootWindow(display, DefaultScreen(display)), img, 0, 0, AllPlanes)) {
        printf("\n Ooops.. Something is wrong.");
        break;
    }

    // PNG generation
    // snprintf(imageName, sizeof(imageName), "salida_%i.png", framecnt);
    // writePngForImage(img, width, height, imageName);

    unsigned long red_mask = img->red_mask;
    unsigned long green_mask = img->green_mask;
    unsigned long blue_mask = img->blue_mask;

    // Write image data
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            unsigned long pixel = XGetPixel(img, x, y);

            unsigned char blue = pixel & blue_mask;
            unsigned char green = (pixel & green_mask) >> 8;
            unsigned char red = (pixel & red_mask) >> 16;

            pixel_rgb_data[y * width + x * 3] = red;
            pixel_rgb_data[y * width + x * 3 + 1] = green;
            pixel_rgb_data[y * width + x * 3 + 2] = blue;
        }
    }

    uint8_t* inData[1] = { pixel_rgb_data };
    int inLinesize[1] = { in_w };

    printf("Scaling frame... \n");
    int sliceHeight = sws_scale(sws_context, inData, inLinesize, 0, height, pFrame->data, pFrame->linesize);

    printf("Obtained slice height: %i \n", sliceHeight);
    pFrame->pts = framecnt * (pVideoStream->time_base.den) / ((pVideoStream->time_base.num) * 25);

    printf("Frame pts: %li \n", pFrame->pts);
    int got_picture = 0;

    printf("Encoding frame... \n");
    int ret = avcodec_encode_video2(pCodecCtx, &pkt, pFrame, &got_picture);

    // int ret = avcodec_send_frame(pCodecCtx, pFrame);

    if (ret != 0) {
        printf("Failed to encode! Error: %i\n", ret);
        return -1;
    }

    printf("Succeed to encode frame: %5d - size: %5d\n", framecnt, pkt.size);

    framecnt++;

    pkt.stream_index = pVideoStream->index;
    ret = av_write_frame(pFormatCtx, &pkt);

    if (ret != 0) {
        printf("Error writing frame! Error: %i\n", ret);
        return -1;
    }

    av_packet_unref(&pkt);
}



I've placed the entire code at this gist. This question right here looks pretty similar to mine, but not quite, and the solution did not work for me, although I think this has something to do with the way the line stride is calculated.
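For what it's worth, a likely culprit — offered as a sketch based on the snippet above, not a confirmed fix — is that both the pixel index and the input line stride ignore the fact that each RGB24 pixel occupies three bytes. Rewriting the inner loop and the sws_scale input stride along these lines (reusing the variable names from the code above) would make every row 3 * width bytes wide:

// Each RGB24 pixel is 3 bytes, so scale the whole (y * width + x) offset by 3.
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        unsigned long pixel = XGetPixel(img, x, y);

        unsigned char blue  = pixel & blue_mask;
        unsigned char green = (pixel & green_mask) >> 8;
        unsigned char red   = (pixel & red_mask) >> 16;

        pixel_rgb_data[(y * width + x) * 3]     = red;
        pixel_rgb_data[(y * width + x) * 3 + 1] = green;
        pixel_rgb_data[(y * width + x) * 3 + 2] = blue;
    }
}

// The input stride passed to sws_scale must also be in bytes, not pixels.
int inLinesize[1] = { 3 * in_w };

With the row offset and the stride both expressed in bytes, sws_scale would read one full RGB24 row per line instead of a third of it, which would account for the tripled, discoloured output.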