
Other articles (43)
-
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page
-
Adding notes and captions to images
7 February 2011
To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing, and deleting notes. By default, only site administrators can add notes to images.
Editing when adding a media item
When adding a media item of type "image", a new button appears above the preview (...)
-
Submit bugs and patches
13 April 2011
Unfortunately, software is never perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including its exact version; as precise an explanation of the problem as possible; if possible, the steps taken that resulted in the problem; and a link to the site/page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...)
On other sites (8800)
-
target_link_libraries in CMAKE using android studio 2.2.2
22 November 2016, by fadi
I am facing a weird issue, and it's difficult to know why because the compiler doesn't give any errors.
I created a new project in Android Studio 2.2.2 with C++ support.
I edited the .cpp file inside src/main/cpp and compiled the project to obtain a .so file that I can use as a shared library. Up to this point everything works perfectly. Here is where the problem occurs:
I am trying to link prebuilt shared libraries from ffmpeg. I have already built the libraries in .so format, and all I need to do is link them to my .cpp file.
To link the libraries, I opened the CMakeLists.txt inside Android Studio and told CMake to link those prebuilt shared libraries using the following code:
add_library(libavformat SHARED IMPORTED)
set_target_properties(libavformat PROPERTIES IMPORTED_LOCATION
    C:/Android/SDK/MyProjects/ffmpeg_to_jpg/P3/app/src/main/jniLibs/libavformat-55.so)
include_directories(src/main/cpp/include/)
target_link_libraries(native-lib libavformat)
This code basically links libavformat into native-lib (which is built from my .cpp file).
The linking step appears to work; the compiler doesn't complain about any missing dependencies.
However, my original shared library (native-lib) stops working, and by that I mean I cannot call any functions from within it.
If I remove the linking line

target_link_libraries(native-lib libavformat)

the native-lib.so works fine and I can call any function from within it that does not depend on libavformat.
I am not sure what is going on. Like I said, the compiler doesn't issue any warnings or errors. It is almost as if, after linking, the content of native-lib had been overwritten by libavformat.
Any ideas?
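One possibility worth checking (an assumption, since no runtime log is shown): the build-time link can succeed while the runtime load fails. libavformat-55.so has its own dependencies (libavcodec, libavutil, ...), and if Android cannot resolve them when native-lib.so is loaded, every call into native-lib fails, which looks exactly like the library having "stopped working". A hypothetical, fuller CMakeLists.txt along those lines; the paths and version suffixes below are illustrative, not taken from the project:

    # Hypothetical sketch; adjust paths and sonames to the actual build.
    cmake_minimum_required(VERSION 3.4.1)

    add_library(native-lib SHARED src/main/cpp/native-lib.cpp)

    # Import every prebuilt ffmpeg library in the dependency chain,
    # not only libavformat.
    add_library(libavformat SHARED IMPORTED)
    set_target_properties(libavformat PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/src/main/jniLibs/libavformat-55.so)

    add_library(libavcodec SHARED IMPORTED)
    set_target_properties(libavcodec PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/src/main/jniLibs/libavcodec-55.so)

    add_library(libavutil SHARED IMPORTED)
    set_target_properties(libavutil PROPERTIES IMPORTED_LOCATION
        ${CMAKE_SOURCE_DIR}/src/main/jniLibs/libavutil-52.so)

    include_directories(src/main/cpp/include/)

    target_link_libraries(native-lib libavformat libavcodec libavutil)

On the Java side, older Android versions also require loading the dependencies explicitly and in reverse dependency order, e.g. System.loadLibrary("avutil-52"), then "avcodec-55", then "avformat-55", before System.loadLibrary("native-lib"); and all of the .so files must be packaged under jniLibs/<abi>/ for every ABI the app targets.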
-
ffmpeg's segment_atclocktime cuts at inaccurate times for audio
3 May 2023, by Ross Richardson
I am using ffmpeg's segment format to save files of an AAC stream to disk in hourly segments.
The segmenting works well, but even with segment_atclocktime the files are cut at varying clock times each hour.


I would like each cut to be exactly on the hour, e.g. 12:00:00, 13:00:00, etc., or at least to begin after the hour and not before, e.g. 12:00:00, 13:00:01, 14:00:00, etc.


I am using ffmpeg-python to process the AAC stream and send it to two outputs: stdout and these segments.
Here's the code:

out1 = ffmpeg.input(stream, loglevel="panic").output("pipe:",
    format="s16le",
    acodec="pcm_s16le",
    ac="1",
    ar="16000")

out2 = ffmpeg.input(stream, loglevel="info").output("rec/%Y-%m-%d-%H%M%S.aac",
    acodec="copy",
    format="segment",
    segment_time="3600",
    segment_atclocktime="1",
    reset_timestamps="1",
    strftime="1")

ffmpeg.merge_outputs(out1, out2).run_async(pipe_stdout=True, overwrite_output=True)



Most files are produced at the desired time: 05:00:00, 06:00:00, 07:00:00, but one or two each day start at 08:59:59 (where 09:00:00 would be desired), or even 16:00:24.


I understand each segment needs to begin on an audio frame, so the cut can't be perfectly on the hour, but I am wondering how I can make it more consistent. Ideally, each hour's recording would begin at 00:00 or later, and never before the hour.
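(For a sense of scale, assuming a typical 44.1 kHz AAC source: each AAC packet carries 1024 samples, i.e. 1024/44100 ≈ 23 ms, so packet granularity alone should only shift a cut by a few hundredths of a second. Offsets of a full second or more, like 16:00:24, point at timestamp or wall-clock effects rather than sample alignment.)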


I have tried using min_seg_duration 3600 and reset_timestamps 1.

I am not sure exactly how to use segment_clocktime_wrap_duration for audio, or whether segment_time_delta applies to audio.
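For what it's worth, here is a minimal sketch of how those segment-muxer options could be passed through ffmpeg-python, whose keyword arguments map directly onto ffmpeg output options; the values are illustrative guesses, not tested settings:

out2 = ffmpeg.input(stream, loglevel="info").output(
    "rec/%Y-%m-%d-%H%M%S.aac",
    acodec="copy",
    format="segment",
    segment_time="3600",
    segment_atclocktime="1",
    # per the segment muxer docs: only start a new segment if a packet
    # reaches the muxer within this many seconds after the clock tick
    segment_clocktime_wrap_duration="2",
    # tolerance when selecting a segment's start time
    segment_time_delta="0.05",
    reset_timestamps="1",
    strftime="1")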

I'd appreciate any advice on, or explanation of, how segment_atclocktime works with audio, as most material on the internet seems video-focused.

-
Render SharpDX Texture2D in UWP application
10 December 2019, by Alex
I'm implementing hardware-accelerated H264 decoding and rendering in a UWP application. I want to avoid copying from GPU to CPU.
The solution consists of two parts:
- a C library that decodes the H264 stream using ffmpeg
- a UWP/C#/SharpDX application that receives the encoded data, P/Invokes the library, and renders the decoded frames
I receive the encoded data in the C# application and send it to the C library to decode, getting the pointer to the frame back via P/Invoke.
The C part looks good so far. I managed to obtain a pointer to the decoded frame on the GPU in the C library:

// ffmpeg decoding logic
ID3D11Texture2D* texturePointer = (ID3D11Texture2D*)context->frame->data[0];

I managed to receive this pointer in the C# code and create a SharpDX texture from it:
var texturePointer = decoder.Decode(...data...); // pinvoke
if (texturePointer != IntPtr.Zero)
{
    var texture = new Texture2D(texturePointer); // works just perfect
}

Now I need to render it on the screen. My understanding is that I can create a class that extends SurfaceImageSource, so I can assign it as the Source of a XAML Image object. It could be something like this:

public class RemoteMediaImageSource : SurfaceImageSource
{
    public void BeginDraw(IntPtr texturePointer)
    {
        var texture = new Texture2D(texturePointer);
        // What to do to render texture in GPU to the screen?
    }
}

Is my assumption correct?
If yes, how exactly do I do the rendering part (a code example would be highly appreciated)?
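For the rendering part, here is a minimal sketch of the usual SurfaceImageSource interop pattern with SharpDX. Treat it as an outline under assumptions, not a verified implementation: it assumes the decoded texture and the SurfaceImageSource share the same D3D11 device and a compatible pixel format, and the Draw method and device plumbing are illustrative additions around the RemoteMediaImageSource class from the question.

using System;
using SharpDX;
using SharpDX.Direct3D11;
using SharpDX.DXGI;
using SharpDX.Mathematics.Interop;
using Windows.UI.Xaml.Media.Imaging;

public class RemoteMediaImageSource : SurfaceImageSource
{
    private readonly SharpDX.Direct3D11.Device _device; // must be the device the decoder writes with
    private readonly ISurfaceImageSourceNative _sisNative;
    private readonly int _width;
    private readonly int _height;

    public RemoteMediaImageSource(SharpDX.Direct3D11.Device device, int width, int height)
        : base(width, height)
    {
        _device = device;
        _width = width;
        _height = height;
        // Hand the DXGI device behind our D3D11 device to the SurfaceImageSource.
        _sisNative = ComObject.As<ISurfaceImageSourceNative>(this);
        _sisNative.Device = _device.QueryInterface<SharpDX.DXGI.Device>();
    }

    public void Draw(IntPtr texturePointer)
    {
        // Wrap the decoder-owned texture; do not dispose it here.
        var frame = new Texture2D(texturePointer);

        RawPoint offset;
        // BeginDraw returns the DXGI surface backing the XAML image.
        using (var surface = _sisNative.BeginDraw(
            new RawRectangle(0, 0, _width, _height), out offset))
        using (var backBuffer = surface.QueryInterface<Texture2D>())
        {
            // GPU-to-GPU copy of the decoded frame into the XAML-backed surface.
            _device.ImmediateContext.CopySubresourceRegion(
                frame, 0, null, backBuffer, 0, offset.X, offset.Y, 0);
        }
        _sisNative.EndDraw();
    }
}

Two caveats: ffmpeg's D3D11 decoder usually outputs an NV12 texture array (the slice index arrives in frame->data[1]), so a format conversion (video processor or shader) is typically needed before the copy, and CopySubresourceRegion requires compatible formats on both sides.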