
Media (91)
-
GetID3 - Additional buttons
9 April 2013
Updated: April 2013
Language: French
Type: Image
-
Core Media Video
4 April 2013
Updated: June 2013
Language: French
Type: Video
-
The Pirate Bay from Belgium
1 April 2013
Updated: April 2013
Language: French
Type: Image
-
Ogg detection bug
22 March 2013
Updated: April 2013
Language: French
Type: Video
-
Example of action buttons for a collaborative collection
27 February 2013
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013
Updated: February 2013
Language: English
Type: Image
Other articles (55)
-
Contributing to its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do this, we use the SPIP translation interface, where all of MediaSPIP's language modules are available. You just need to sign up to the translators' mailing list to ask for more information.
At the moment, MediaSPIP is only available in French and (...) -
Using it, talking about it, critiquing it
10 April 2011
The first thing to do is to talk about it, either directly with the people involved in its development or with those around you, to convince new people to use it.
The larger the community, the faster the software will evolve...
A mailing list is available for any discussion between users. -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The player used by MediaSPIP was created specifically for it and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (11282)
-
converting a "gif" to video using swift
3 December 2019, by James Woodrow
I've looked around and found a few things here and there, mainly that I should be using AVAssetWriter to do this, but I have no experience with it or with video editing/creation, so it doesn't help me much since I can't seem to find anything that does something I can modify easily (or not at my level of knowledge, at least) so that it works as I intend it to.
I have an app which takes n photos every cft seconds (cft being the capture frame time, which I get from a backend server; it's a double for obvious reasons). I then display these frames using a UIImageView, and the frames change every dft seconds (dft being the display frame time, which I also get from a backend server and which can be different from cft). Up until this point, nothing complicated.
Now, the current workflow is that these frames are sent back to a server with any relevant information I want, and the server then uses ImageMagick to create a real gif file and ffmpeg to create a 15-second video using said gif.
The issue is that this makes my Heroku server bills higher than I would like, because of the limited memory on the dynos, and the time it takes to generate these videos is about 5-10 seconds I believe (not sure, but it's longer than I'd like).
So the idea I had was to have the app create the video, since it already has all the information it needs, and then simply upload it with the rest of the frames and relevant data. Using bandwidth nowadays is much cheaper than buying extra processing power on a server.
- it has n frames to loop over
- it has a float value dft representing how long each frame should last
- it has a GPU, or at least a much better CPU than the dynos Heroku has to offer
I've also looked around to see if anyone has made an extensive tutorial on how to use ffmpeg in Swift, but I still didn't find anything at my level; I didn't even find a tutorial per se, only some GitHub projects which were partially completed and/or missing the original tutorial that would explain the thought process.
I would appreciate any tips/code samples/tutorials on the subject.
I'm adding the ffmpeg command-line equivalent of what I would love to be able to do (if I could use ffmpeg directly with iOS, that would be nice too):
ffmpeg -framerate 100/13 -loop 1 -i frame%02d.png -c:v libx264 -r 100/13 -pix_fmt yuv420p -t 0:15 instagram.mp4
where basically I did 100 / (dft * 100) for the input frame rate (presumably with dft = 0.13 here, which gives the 100/13 above) and just output at the same fps for 15 seconds. By the way, if there are any ways to optimise this command to make it run faster without losing quality, I might be able to keep the current way of working with Heroku, although I would still prefer an iOS solution.
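A rough sketch of the AVAssetWriter approach mentioned above, assuming the captured frames are available as same-sized UIImage values and using frameDuration to stand in for dft (the function and helper names are illustrative, not from the original post):

import AVFoundation
import UIKit

// Illustrative only: writes an array of same-sized UIImages to an H.264 .mp4,
// holding each frame for `frameDuration` seconds (the role dft plays above).
func writeVideo(from frames: [UIImage],
                frameDuration: Double,
                to outputURL: URL,
                completion: @escaping (Error?) -> Void) throws {
    guard let firstFrame = frames.first else { return }
    let size = firstFrame.size

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
            kCVPixelBufferWidthKey as String: Int(size.width),
            kCVPixelBufferHeightKey as String: Int(size.height)
        ])
    writer.add(input)

    guard writer.startWriting() else {
        completion(writer.error)
        return
    }
    writer.startSession(atSourceTime: .zero)

    var frameIndex = 0
    input.requestMediaDataWhenReady(on: DispatchQueue(label: "frame.writer")) {
        while input.isReadyForMoreMediaData && frameIndex < frames.count {
            // Each frame is shown for frameDuration seconds; a 600 timescale
            // keeps fractional durations reasonably accurate.
            let time = CMTime(seconds: frameDuration * Double(frameIndex),
                              preferredTimescale: 600)
            if let buffer = makePixelBuffer(from: frames[frameIndex], size: size) {
                _ = adaptor.append(buffer, withPresentationTime: time)
            }
            frameIndex += 1
        }
        if frameIndex >= frames.count {
            input.markAsFinished()
            writer.finishWriting { completion(writer.error) }
        }
    }
}

// Draws a UIImage into a freshly allocated 32ARGB pixel buffer.
private func makePixelBuffer(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(size.width), Int(size.height),
                                     kCVPixelFormatType_32ARGB, attrs, &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer,
          let cgImage = image.cgImage else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: Int(size.width),
                            height: Int(size.height),
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(cgImage, in: CGRect(origin: .zero, size: size))
    return buffer
}

The 15-second cap that the ffmpeg command applies with -t 0:15 could be approximated by stopping once frameDuration * Double(frameIndex) reaches 15.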
-
Libavcodec "the procedure entry point for av_frame_alloc could not be located" error in Visual Studio 2017 C++ project
25 November 2019, by Aves
I am trying to use libavcodec from the ffmpeg library in C++ with Visual Studio 2017 Community. I downloaded the latest x64 dev and shared builds from Zeranoe (version 20171217), set up the include directories and additional libraries in Visual Studio for an x64 build, and added the DLL files from the shared package to my PATH.
This is my sample test code:

extern "C" {
// The header name was stripped in the original post; libavcodec's main header is assumed here.
#include <libavcodec/avcodec.h>
}

int main() {
    avcodec_register_all();               // registers all codecs provided by libavcodec
    AVFrame *pAvFrame = av_frame_alloc(); // av_frame_alloc is exported by libavutil (avutil-56.dll)
    av_frame_free(&pAvFrame);
    return 0;
}

The code compiles without problems, but when I run the application I see a dialogue window with the error message "the procedure entry point for av_frame_alloc could not be located in DLL" (the actual message is not in English; this is the translated version).
I tried setting Linker -> Optimization -> References to /OPT:NOREF, as advised in similar questions, but it did not help.
Dependency Walker shows that av_frame_alloc is exported but its "Entry Point" is not bound. What is a little strange is that av_frame_alloc is displayed in both avcodec-58.dll (in red) and avutil-56.dll (in green). Maybe the reason is that the application is trying to get this function from avcodec instead of avutil, but I'm not sure, since I did not check the source code of these libraries.
So the question is: how do I set up such a simple FFmpeg-based C++ project in VS2017, and where am I going wrong?
UPD. 1.
Linker flags: /OUT:"C:\work\code\TestFfmpeg\x64\Release\TestFfmpeg.exe" /MANIFEST /NXCOMPAT /PDB:"C:\work\code\TestFfmpeg\x64\Release\TestFfmpeg.pdb" /DYNAMICBASE "c:\work\dev\ffmpeg-20171217-387ee1d-win64-dev\lib*.lib" "kernel32.lib" "user32.lib" "gdi32.lib" "winspool.lib" "comdlg32.lib" "advapi32.lib" "shell32.lib" "ole32.lib" "oleaut32.lib" "uuid.lib" "odbc32.lib" "odbccp32.lib" /DEBUG:FULL /MACHINE:X64 /OPT:NOREF /PGD:"C:\work\code\TestFfmpeg\x64\Release\TestFfmpeg.pgd" /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /ManifestFile:"x64\Release\TestFfmpeg.exe.intermediate.manifest" /OPT:ICF /ERRORREPORT:PROMPT /NOLOGO /TLBID:1
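Since av_frame_alloc is exported by avutil (which matches the green entry Dependency Walker shows for avutil-56.dll), one possibility worth ruling out is that an older set of FFmpeg DLLs earlier in PATH is being loaded at run time instead of the Zeranoe 20171217 build. A quick check from a Developer Command Prompt could look like this (standard Windows/Visual Studio tools, not commands taken from the original post):

rem List every copy of the runtime DLLs found on PATH; the first hit is the one that gets loaded.
where avcodec-58.dll avutil-56.dll

rem Confirm that this avutil build really exports av_frame_alloc.
dumpbin /exports avutil-56.dll | findstr av_frame_alloc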
-
ffmpeg ... "Impossible to convert between the formats supported by the filter"
27 November 2017, by hydra3333
Using the latest ffmpeg master built as at 2017.11.26, I'm having trouble deciphering what the error messages mean and, more importantly, what to do about them.
Changing from -vf to -filter_complex did nothing (I had to try).
The main error message seems to be "Impossible to convert between the formats supported by the filter".
I have tried to insert "format=" and "scale" before/between/after yadif and unsharp_opencl but to no avail.
I wonder, could it be something to do with needing hwupload/hwdownload/hwmap, or is that a red herring?
What am I doing wrong?
".\ffmpeg_3.latest_master.exe" -hide_banner -v verbose -init_hw_device opencl=ocl:1.0 -filter_hw_device ocl -i ".\test_01.mpg" -an -map_metadata -1 -sws_flags lanczos+accurate_rnd+full_chroma_int+full_chroma_inp -filter_complex "[0:v]yadif=0:0:0,scale=flags=lanczos+accurate_rnd+full_chroma_int+full_chroma_inp,unsharp_opencl=lx=3:ly=3:la=0.5:cx=3:cy=3:ca=0.5,setdar=dar=16/9" -r 25 -c:v h264_nvenc -preset slow -bf 2 -g 50 -refs 3 -rc:v vbr_hq -rc-lookahead:v 32 -cq 22 -qmin 16 -qmax 25 -coder cabac -movflags +faststart -profile:v high -level 4.1 -pixel_format yuv420p -y ".\test_01.newest.MP4"
[AVHWDeviceContext @ 0000020e229aad40] 1.0: NVIDIA CUDA / GeForce GTX 750 Ti
[AVHWDeviceContext @ 0000020e229aad40] DXVA2 to OpenCL mapping function found (clCreateFromDX9MediaSurfaceKHR).
[AVHWDeviceContext @ 0000020e229aad40] DXVA2 in OpenCL acquire function found (clEnqueueAcquireDX9MediaSurfacesKHR).
[AVHWDeviceContext @ 0000020e229aad40] DXVA2 in OpenCL release function found (clEnqueueReleaseDX9MediaSurfacesKHR).
[AVHWDeviceContext @ 0000020e229aad40] The cl_khr_d3d11_sharing extension is required for D3D11 to OpenCL mapping.
[AVHWDeviceContext @ 0000020e229aad40] D3D11 to OpenCL mapping not usable.
[mpeg @ 0000020e229ae240] max_analyze_duration 5000000 reached at 5000000 microseconds st:0
Input #0, mpeg, from '.\test_01.mpg':
Duration: 00:06:29.96, start: 0.240000, bitrate: 2799 kb/s
Stream #0:0[0x1e0]: Video: mpeg2video (Main), 1 reference frame, yuv420p(tv, top first, left), 720x576 [SAR 64:45 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
Stream #0:1[0x1c0]: Audio: mp2, 48000 Hz, stereo, s16p, 256 kb/s
[Parsed_scale_1 @ 0000020e29951a20] w:iw h:ih flags:'lanczos+accurate_rnd+full_chroma_int+full_chroma_inp' interl:0
Stream mapping:
Stream #0:0 (mpeg2video) -> yadif
setdar -> Stream #0:0 (h264_nvenc)
Press [q] to stop, [?] for help
[Parsed_scale_1 @ 0000020e29cccda0] w:iw h:ih flags:'lanczos+accurate_rnd+full_chroma_int+full_chroma_inp' interl:0
[graph 0 input from stream 0:0 @ 0000020e29ccc980] w:720 h:576 pixfmt:yuv420p tb:1/90000 fr:25/1 sar:64/45 sws_param:flags=2
[auto_scaler_0 @ 0000020e29ccc580] w:iw h:ih flags:'bilinear' interl:0
[Parsed_unsharp_opencl_2 @ 0000020e29ccc4a0] auto-inserting filter 'auto_scaler_0' between the filter 'Parsed_scale_1' and the filter 'Parsed_unsharp_opencl_2'
Impossible to convert between the formats supported by the filter 'Parsed_scale_1' and the filter 'auto_scaler_0'
Error reinitializing filters!
Failed to inject frame into filter network: Function not implemented
Error while processing the decoded data for stream #0:0
Conversion failed!
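For what it's worth, the hwupload/hwdownload hunch above is the direction OpenCL filters normally require: unsharp_opencl expects frames that are already in OpenCL device memory, while yadif and scale produce plain software frames, which appears to be what the auto-inserted scaler fails to convert. A sketch of the filter chain along those lines, with every other option left unchanged (an assumption about the fix, not a tested command):

-filter_complex "[0:v]yadif=0:0:0,scale=flags=lanczos+accurate_rnd+full_chroma_int+full_chroma_inp,format=yuv420p,hwupload,unsharp_opencl=lx=3:ly=3:la=0.5:cx=3:cy=3:ca=0.5,hwdownload,format=yuv420p,setdar=dar=16/9"

Here format=yuv420p,hwupload moves the frames into OpenCL memory for unsharp_opencl, and hwdownload,format=yuv420p brings them back into system memory for setdar and h264_nvenc.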