
Media (2)
-
SPIP - plugins - embed code - Example
2 September 2013
Updated: September 2013
Language: French
Type: Image
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (72)
-
Emballe Médias: putting documents online simply
29 October 2010
The emballe médias plugin was developed primarily for the MediaSPIP distribution, but it is also used in other related projects, such as géodiversité. Required and compatible plugins
To work, this plugin requires other plugins to be installed: CFG Saisies SPIP Bonux Diogène swfupload jqueryui
Other plugins can be used alongside it to improve its capabilities: Ancres douces Légendes photo_infos spipmotion (...) -
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media-sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
User profiles
12 April 2011
Each user has a profile page on which they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised; it is visible only when the visitor is logged in to the site.
Users can edit their profile from their author page; a "Modifier votre profil" link in the navigation is (...)
On other sites (10955)
-
WebRTC Detect Orientation from Peer Video Stream
30 March 2018, by Daryl
Using RecordRTC and WebRTC, I am able to save a peer MediaStream as a webm video file. The orientation of the file differs based on the peer device, meaning a 640x480 video may be rotated clockwise or counter-clockwise.
To re-orient the recorded videos, some need to be rotated clockwise and others counter-clockwise, so I need a way to determine which direction to rotate each video. Otherwise, some videos will end up right side up and others upside down.
I attempted to look up the orientation using ffprobe. However, the recorded videos do not have the "rotate" tag in their metadata.
I have also been unable to find the orientation in the peer video stream of the WebRTC object.
Since the video element displays the WebRTC peer stream correctly, I expected to be able to read the orientation from it.
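As an aside for later readers: recent FFmpeg builds tend to expose rotation as display-matrix side data on the video stream rather than as a "rotate" metadata tag, so a tag-only check can come back empty even when the information exists. The sketch below is not from the question; it only illustrates, with plain libavformat/libavutil C, how that side data would be read when a file carries it. A WebM recorded by MediaRecorder without rotation metadata will simply report nothing. The input path is a placeholder, and the snippet uses the 4.x-era av_stream_get_side_data() API.

/* Sketch: read rotation from display-matrix side data with libavformat.
 * Build roughly as: gcc probe_rotation.c -lavformat -lavutil */
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavutil/display.h>

int main(int argc, char **argv)
{
    const char *path = argc > 1 ? argv[1] : "input.webm";  /* placeholder path */
    AVFormatContext *fmt = NULL;

    if (avformat_open_input(&fmt, path, NULL, NULL) < 0 ||
        avformat_find_stream_info(fmt, NULL) < 0) {
        fprintf(stderr, "Could not open %s\n", path);
        return 1;
    }

    int v = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (v >= 0) {
        /* Rotation, when present, usually lives in display-matrix side data. */
        uint8_t *sd = av_stream_get_side_data(fmt->streams[v],
                                              AV_PKT_DATA_DISPLAYMATRIX, NULL);
        if (sd)
            printf("rotation: %f degrees\n",
                   av_display_rotation_get((const int32_t *)sd));
        else
            printf("no display-matrix side data (typical for MediaRecorder webm)\n");
    }

    avformat_close_input(&fmt);
    return 0;
}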
-
Display ffmpeg frames on OpenGL texture
21 March 2018, by naki
I am using Dranger tutorial01 (ffmpeg, http://dranger.com/ffmpeg/tutorial01.html) to decode the video and get the AVI frames. I want to use OpenGL to display the video.
The main function is as follows:
int main (int argc, char** argv) {
// opengl stuff
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_RGBA);
glutInitWindowSize(800, 600);
glutCreateWindow("Hello GL");
glutReshapeFunc(changeViewport);
glutDisplayFunc(render);
GLenum err = glewInit();
if(GLEW_OK !=err){
fprintf(stderr, "GLEW error");
return 1;
}
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
GLuint texture;
glGenTextures(1, &texture); //Make room for our texture
glBindTexture(GL_TEXTURE_2D, texture);
//ffmpeg stuff
AVFormatContext *pFormatCtx = NULL;
int i, videoStream;
AVCodecContext *pCodecCtx = NULL;
AVCodec *pCodec = NULL;
AVFrame *pFrame = NULL;
AVFrame *pFrameRGB = NULL;
AVPacket packet;
int frameFinished;
int numBytes;
uint8_t *buffer = NULL;
AVDictionary *optionsDict = NULL;
if(argc < 2) {
printf("Please provide a movie file\n");
return -1;
}
// Register all formats and codecs
av_register_all();
// Open video file
if(avformat_open_input(&pFormatCtx, argv[1], NULL, NULL)!=0)
return -1; // Couldn't open file
// Retrieve stream information
if(avformat_find_stream_info(pFormatCtx, NULL)<0)
return -1; // Couldn't find stream information
// Dump information about file onto standard error
av_dump_format(pFormatCtx, 0, argv[1], 0);
// Find the first video stream
videoStream=-1;
for(i=0; i<pFormatCtx->nb_streams; i++)
if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
videoStream=i;
break;
}
if(videoStream==-1)
return -1; // Didn't find a video stream
// Get a pointer to the codec context for the video stream
pCodecCtx=pFormatCtx->streams[videoStream]->codec;
// Find the decoder for the video stream
pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
if(pCodec==NULL) {
fprintf(stderr, "Unsupported codec!\n");
return -1; // Codec not found
}
// Open codec
if(avcodec_open2(pCodecCtx, pCodec, &optionsDict)<0)
return -1; // Could not open codec
// Allocate video frame
pFrame=av_frame_alloc();
// Allocate an AVFrame structure
pFrameRGB=av_frame_alloc();
if(pFrameRGB==NULL)
return -1;
// Determine required buffer size and allocate buffer
numBytes=avpicture_get_size(PIX_FMT_RGB24, pCodecCtx->width,
pCodecCtx->height);
buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
struct SwsContext *sws_ctx = sws_getContext(pCodecCtx->width,
pCodecCtx->height, pCodecCtx->pix_fmt, 800,
600, PIX_FMT_RGB24, SWS_BICUBIC, NULL,
NULL, NULL);
// Assign appropriate parts of buffer to image planes in pFrameRGB
// Note that pFrameRGB is an AVFrame, but AVFrame is a superset
// of AVPicture
avpicture_fill((AVPicture *)pFrameRGB, buffer, PIX_FMT_RGB24,
pCodecCtx->width, pCodecCtx->height);
// Read frames and save first five frames to disk
i=0;
while(av_read_frame(pFormatCtx, &packet)>=0) {
// Is this a packet from the video stream?
if(packet.stream_index==videoStream) {
// Decode video frame
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished,
&packet);
// Did we get a video frame?
if(frameFinished) {
// Convert the image from its native format to RGB
/* sws_scale
(
sws_ctx,
(uint8_t const * const *)pFrame->data,
pFrame->linesize,
0,
pCodecCtx->height,
pFrameRGB->data,
pFrameRGB->linesize
);
*/
sws_scale(sws_ctx, pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pFrameRGB->data, pFrameRGB->linesize);
// additional opengl
glBindTexture(GL_TEXTURE_2D, texture);
//gluBuild2DMipmaps(GL_TEXTURE_2D, 3, pCodecCtx->width, pCodecCtx->height, GL_RGB, GL_UNSIGNED_INT, pFrameRGB->data[0]);
// glTexSubImage2D(GL_TEXTURE_2D, 0, 0,0, 840, 460, GL_RGB, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
glTexImage2D(GL_TEXTURE_2D, //Always GL_TEXTURE_2D
0, //0 for now
GL_RGB, //Format OpenGL uses for image
pCodecCtx->width, pCodecCtx->height, //Width and height
0, //The border of the image
GL_RGB, //GL_RGB, because pixels are stored in RGB format
GL_UNSIGNED_BYTE, //GL_UNSIGNED_BYTE, because pixels are stored
//as unsigned numbers
pFrameRGB->data[0]); //The actual pixel data
// additional opengl end
// Save the frame to disk
if(++i<=5)
SaveFrame(pFrameRGB, pCodecCtx->width, pCodecCtx->height,
i);
}
}
glColor3f(1,1,1);
glBindTexture(GL_TEXTURE_2D, texture);
glBegin(GL_QUADS);
glTexCoord2f(0,1);
glVertex3f(0,0,0);
glTexCoord2f(1,1);
glVertex3f(pCodecCtx->width,0,0);
glTexCoord2f(1,0);
glVertex3f(pCodecCtx->width, pCodecCtx->height,0);
glTexCoord2f(0,0);
glVertex3f(0,pCodecCtx->height,0);
glEnd();
// Free the packet that was allocated by av_read_frame
av_free_packet(&packet);
}
// Free the RGB image
av_free(buffer);
av_free(pFrameRGB);
// Free the YUV frame
av_free(pFrame);
// Close the codec
avcodec_close(pCodecCtx);
// Close the video file
avformat_close_input(&pFormatCtx);
return 0;
}
Unfortunately, I could not find my solution here:
ffmpeg video to opengl texture
The program compiles but does not show any video on the texture. Just an empty OpenGL window is created.
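For anyone hitting the same wall, a few things in the code above would keep the window blank even when decoding succeeds: the quad is drawn with pixel coordinates but no matching projection, the texture's default minification filter expects mipmaps that are never generated, and main() returns without ever calling glutMainLoop(), so the registered display callback never runs and nothing is presented. The outline below is only a sketch of that structure, not a drop-in fix; upload_next_frame() is a hypothetical helper standing in for the question's av_read_frame + sws_scale + glTexSubImage2D steps.

/* Sketch only: the GLUT/texture scaffolding the code above appears to be missing. */
#include <GL/glew.h>
#include <GL/glut.h>

static GLuint texture;
extern int upload_next_frame(GLuint tex);   /* hypothetical decode-and-upload step */

static void render(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBindTexture(GL_TEXTURE_2D, texture);
    glBegin(GL_QUADS);                       /* quad in the default -1..1 clip space */
    glTexCoord2f(0, 1); glVertex2f(-1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1,  1);
    glTexCoord2f(0, 0); glVertex2f(-1,  1);
    glEnd();
    glutSwapBuffers();
}

static void idle(void)
{
    if (upload_next_frame(texture))          /* glTexSubImage2D into the bound texture */
        glutPostRedisplay();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
    glutInitWindowSize(800, 600);
    glutCreateWindow("Hello GL");
    glewInit();

    glEnable(GL_TEXTURE_2D);
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    /* Without these, the default GL_NEAREST_MIPMAP_LINEAR min filter leaves the
     * texture incomplete (no mipmaps are ever built), so texturing never shows. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* ... open the file and codec with libavformat/libavcodec as in the question,
     * and allocate the texture once with glTexImage2D(..., NULL) ... */

    glutDisplayFunc(render);
    glutIdleFunc(idle);
    glutMainLoop();   /* the question's main() returns before this ever happens */
    return 0;
}
-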
Rotating videos with FFmpeg [closed]
14 June 2023, by jocull
I have been trying to figure out how to rotate videos with FFmpeg. I am working with iPhone videos taken in portrait mode. I know how to determine the current degrees of rotation using MediaInfo (excellent library, btw), but I'm stuck on FFmpeg now.



From what I've read, what you need to use is a vfilter option. According to what I see, it should look like this:



ffmpeg -vfilters "rotate=90" -i input.mp4 output.mp4




However, I can't get this to work. First, -vfilters doesn't exist anymore; it's now just -vf. Second, I get this error:



No such filter: 'rotate'
Error opening filters!




As far as I know, I have an all-options-on build of FFmpeg. Running ffmpeg -filters shows this:



Filters:
anull        Pass the source unchanged to the output.
aspect       Set the frame aspect ratio.
crop         Crop the input video to x:y:width:height.
fifo         Buffer input images and send them when they are requested.
format       Convert the input video to one of the specified pixel formats.
hflip        Horizontally flip the input video.
noformat     Force libavfilter not to use any of the specified pixel formats for the input to the next filter.
null         Pass the source unchanged to the output.
pad          Pad input image to width:height[:x:y[:color]] (default x and y: 0, default color: black).
pixdesctest  Test pixel format definitions.
pixelaspect  Set the pixel aspect ratio.
scale        Scale the input video to width:height size and/or convert the image format.
slicify      Pass the images of input video on to next video filter as multiple slices.
unsharp      Sharpen or blur the input video.
vflip        Flip the input video vertically.
buffer       Buffer video frames, and make them accessible to the filterchain.
color        Provide an uniformly colored input, syntax is: [color[:size[:rate]]]
nullsrc      Null video source, never return images.
nullsink     Do absolutely nothing with the input video.



Having the options for vflip and hflip is great and all, but they just won't get me where I need to go. I need the ability to rotate videos 90 degrees at the very least; 270 degrees would be an excellent option to have as well. Where have the rotate options gone?
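A note for anyone landing here later: in the FFmpeg builds I'm familiar with, exact quarter-turn rotations are handled by the transpose filter rather than a rotate filter, and -vf is an output option, so it belongs after -i. transpose=1 rotates 90 degrees clockwise and transpose=2 rotates 90 degrees counter-clockwise (which covers the 270-degree case). A typical invocation, with placeholder file names, would be:

ffmpeg -i input.mp4 -vf "transpose=1" output.mp4

Newer builds also ship a rotate filter that takes an arbitrary angle in radians, but for plain 90/270-degree turns transpose is the usual choice. If ffmpeg -filters does not list transpose either, the libavfilter in that build is simply too old, and upgrading is the easier fix.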