
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (45)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used as a fallback.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...)
On other sites (6272)
-
Encode Frames to Video with C Library
31 July 2018, by NetherGranite
For the sake of continuity, let us assume "RGB values" are the following:
typedef struct RGB {
    uint8_t r, g, b;   /* uint8_t requires <stdint.h> */
} rgb;
However, if you feel that a different color space is more appropriate for this question, please use that instead.
How might I go about writing 2D arrays of RGB values to a video in C, given an output format and framerate?
Before I continue, I should specify that I wish to be able to do this all within one program. I am trying to add functionality to an application that would allow it to compile videos frame by frame without having to leave it.
Additionally, my needs for this functionality are extremely basic; I simply need to be able to set individual pixels to certain colors.
The closest I have come to a solution so far is the C library FFmpeg. Allow me to describe what I was able to learn on my own:
After looking through its documentation, I came across the function avcodec_send_frame(avctx, frame), whose parameters are of the types AVCodecContext* and const AVFrame* respectively. If these are not the right tools for what I am trying to do, please ignore the rest of the question and instead point me towards what I should be using.
However, I do not know which fields of avctx and frame must be set manually and which do not. The reason I assume some do not is that both are extremely large structures, but correct me if I am wrong.
Question 1: What values of an AVCodecContext and AVFrame must be set? Of these, what is/are the recommended value(s) for each of them?
Additionally, I was only able to find instructions on how to initialize an AVFrame (using av_frame_alloc() and av_frame_get_buffer()) but not for an AVCodecContext.
Question 2: Is there a proper way to initialize an AVCodecContext? And just in case, is the method of initializing an AVFrame described above correct? Do any of the fields of either have a proper method of initialization?
Also, I was not able to find official documentation on how to take this AVCodecContext (which I assume contains the video information) and turn it into a video. I apologize if the documentation for this is easy to find and I just missed it.
Question 3: How do I turn an AVCodecContext into a file of a given format?
And, given my limited knowledge:
Question 4: Are there any other parts to this process that I am missing, and do I have any of the above parts wrong?
Please keep in mind that I found out about FFmpeg for the first time very recently, and as a result, I am a complete beginner to this. Additionally, my experience with C is very limited, so I would greatly appreciate it if you could note which files need to be included with #include.
Feel free to go as far as recommending something other than FFmpeg, as long as it is written in C. I do not need power-user options, but I would greatly prefer flexibility in what audio and video file types the library can handle.
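As an editorial illustration only (none of this comes from the question or any accepted answer), a minimal sketch of the avcodec_send_frame()/avcodec_receive_packet() loop might look like the following, assuming FFmpeg 4.x or newer built with an H.264 encoder; the 640x480 size, 25 fps, test-pattern fill, and out.h264 filename are placeholders. Writing into a real container such as MP4 or WebM would additionally go through libavformat (avformat_alloc_output_context2(), avformat_write_header(), av_interleaved_write_frame(), av_write_trailer()), which is left out here.

/* sketch.c -- hypothetical example; build with: gcc sketch.c -lavcodec -lavutil */
#include <libavcodec/avcodec.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    const int width = 640, height = 480, fps = 25, nframes = 100;

    /* Pick an encoder and create its context; these are the fields that must be
       set by hand before avcodec_open2() for a basic video encode. */
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVCodecContext *avctx = avcodec_alloc_context3(codec);
    avctx->width     = width;
    avctx->height    = height;
    avctx->time_base = (AVRational){1, fps};
    avctx->framerate = (AVRational){fps, 1};
    avctx->pix_fmt   = AV_PIX_FMT_YUV420P;  /* most encoders want YUV, not RGB */
    avctx->bit_rate  = 400000;
    if (avcodec_open2(avctx, codec, NULL) < 0)
        return 1;

    /* Allocate a reusable frame whose buffers match the context. */
    AVFrame *frame = av_frame_alloc();
    frame->format = avctx->pix_fmt;
    frame->width  = width;
    frame->height = height;
    av_frame_get_buffer(frame, 0);

    AVPacket *pkt = av_packet_alloc();
    FILE *out = fopen("out.h264", "wb");   /* raw H.264 stream, no container */

    for (int i = 0; i < nframes; i++) {
        av_frame_make_writable(frame);
        /* Real code would convert the rgb struct array to YUV420P here,
           e.g. with sws_getContext()/sws_scale() from libswscale.
           This just paints a moving luma ramp with fixed chroma. */
        for (int y = 0; y < height; y++)
            for (int x = 0; x < width; x++)
                frame->data[0][y * frame->linesize[0] + x] = (uint8_t)(x + y + i * 3);
        for (int y = 0; y < height / 2; y++)
            for (int x = 0; x < width / 2; x++) {
                frame->data[1][y * frame->linesize[1] + x] = 128;
                frame->data[2][y * frame->linesize[2] + x] = 64;
            }
        frame->pts = i;                    /* in time_base units (1/fps) */

        /* Send the raw frame in, then drain any packets the encoder has ready. */
        avcodec_send_frame(avctx, frame);
        while (avcodec_receive_packet(avctx, pkt) == 0) {
            fwrite(pkt->data, 1, pkt->size, out);
            av_packet_unref(pkt);
        }
    }

    /* Flush: a NULL frame tells the encoder no more input is coming. */
    avcodec_send_frame(avctx, NULL);
    while (avcodec_receive_packet(avctx, pkt) == 0) {
        fwrite(pkt->data, 1, pkt->size, out);
        av_packet_unref(pkt);
    }

    fclose(out);
    av_packet_free(&pkt);
    av_frame_free(&frame);
    avcodec_free_context(&avctx);
    return 0;
}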
Addressing Potential Duplicates
I apologize for how long this section is; I just want to have my bases covered. I heavily apologize, however, if this is in fact a duplicate of a question that I was just unable to find.
- ffmpeg C API documentation/tutorial [closed] — This question was too open-ended and received answers pointing the asker towards a tutorial at dranger.com, a tutorial that confusingly muddied the waters by focusing heavily on a graphics library of choice. Please do not take this as me saying it is bad; I am just enough of a beginner that I could not wade through it all.
- Encoding frames to video with ffmpeg — Although this question seems to have been asking the same thing, it is geared towards Unreal Engine 4, and the asker provided sample code, making it difficult for me to understand which parts of the accepted answer were necessary for me and which were not.
- How to write frames to a video file? — While this also asked the same thing, the accepted answer simply provides a command instead of an explanation of code.
- YUV Raw frames to video stream — While the accepted answer for this question is a command, the question states that it is looking for a way to encode frames generated by C++ code. Is there some way to run commands in code that I haven't been able to find?
- Converting sequenced frames to video — Not only is the asker's code written in Python, but it also seems to use already-existing image files as frames.
- How to write bitmaps as frames to H.264 with x264 in C\C++? — The accepted answer seems to describe a process that would take multiple applications, but I could be wrong, as I am enough of a beginner that I am not sure exactly what it means other than Step 3.
- How to write bitmaps as frames to Ogg Theora in C\C++? — Although it isn't a problem that the question specifies the Ogg format, it is a problem that the accepted answer suggests libtheora, which appears to only work with Ogg files.
-
Bug #3315 (Closed): Mandatory fields not filled in: display a message at the top of the page
28 October 2014, by cedric
Applied by commit r21729.
-
installing yasm / nasm on heroku with vulcan
21 November 2013, by scientiffic
I'm trying to do a build of ffmpeg on Heroku, and I need to use libvpx. In order to install libvpx, I need to have nasm or yasm. I tried installing both using vulcan, but I keep getting the error "Neither yasm nor nasm have been found".
Here is what I did:
Installing Nasm
- get nasm source from http://www.linuxfromscratch.org/blfs/view/svn/general/NASM.html
- extract the tar file using: tar -xJf nasm-2.10.09.tar.xz
- build using: vulcan build -v -s . -c "./configure --prefix=/app/vendor/nasm && make && make install"
- add the output of the tar file from vulcan to vendor/nasm
- push to heroku
Installing Yasm
- get yasm source from http://www.linuxfromscratch.org/blfs/view/svn/general/yasm.html
- extract the tar file and build using: vulcan build -v -s . -c "./configure --prefix=/app/vendor/yasm && make && make install"
- add the output of the tar file from vulcan to vendor/yasm
- push to heroku
Installing Libvpx
- get libvpx source from http://www.linuxfromscratch.org/blfs/view/svn/multimedia/libvpx.html
- build libvpx using: vulcan build -v -s . -c "./configure --enable-shared --disable-static --prefix=/app/vendor/llibvpx && make && make install"
Attempting to build libvpx yields this error:
Packaging local directory... /.rvm/gems/ruby-1.9.2-p320/gems/vulcan-0.8.2/lib/vulcan/cli.rb:49: warning: Insecure world writable dir /usr/local in PATH, mode 040777
done
Uploading source package... done
Building with: ./configure --enable-shared --disable-static --prefix=/app/vendor/llibvpx && make && make install
Configuring selected codecs
enabling vp8_encoder
enabling vp8_decoder
Configuring for target 'x86_64-linux-gcc'
enabling x86_64
enabling pic
enabling runtime_cpu_detect
enabling mmx
enabling sse
enabling sse2
enabling sse3
enabling ssse3
enabling sse4_1
**Neither yasm nor nasm have been found**
Configuration failed. This could reflect a misconfiguration of your
toolchains, improper options selected, or another problem. If you
don't see any useful error messages above, the next step is to look
at the configure error log file (config.err) to determine what
configure was trying to do when it died.
How can I successfully build libvpx on Heroku using vulcan?
The instructions I've been (loosely) following are from here:
https://gist.github.com/czivko/4392472
And the reason I need to use libvpx is that I'm using the carrierwave-video gem in my Rails app to convert videos, and it needs libvpx to convert to webm to support video playback in Firefox.
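Purely as a hedged sketch of one possible direction (an assumption, not something verified on this setup): libvpx's configure only looks for yasm or nasm on the PATH of the environment where the build actually runs, so if the assemblers built in the earlier steps really are present under /app/vendor/yasm and /app/vendor/nasm at build time, prepending their bin directories inside the build command might look like:
vulcan build -v -s . -c "export PATH=/app/vendor/yasm/bin:/app/vendor/nasm/bin:\$PATH && ./configure --enable-shared --disable-static --prefix=/app/vendor/libvpx && make && make install"
(the backslash keeps $PATH from being expanded on the local machine). If the assemblers are not actually present on the build machine, they would need to be uploaded alongside the libvpx source, or built within the same vulcan run, before configure can find them.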