
Other articles (69)
-
MediaSPIP v0.2
21 June 2013 — MediaSPIP 0.2 is the first stable release of MediaSPIP.
Its official release date is 21 June 2013 and it is announced here.
The zip file provided here contains only the MediaSPIP sources, in standalone form.
As with the previous version, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further changes (...)
-
MediaSPIP version 0.1 Beta
16 April 2011 — MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources, in standalone form.
To get a working installation, all software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further changes (...)
-
Publishing on MediaSPIP
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.
On other sites (10470)
-
Need to get level of audio signal in dbfs/lufs during playback with libmpv
11 April 2023, by NullUser — While exploring a move of our video player from libvlc to libmpv, I ran into a problem: I cannot find a way to get the audio signal level for each video frame (in dBFS / LUFS). This is needed to display the volume level during video playback. Both libvlc and ffme provide mechanisms for this.


For example, ffme lets you access the raw audio data for further processing before it is sent to the audio device. You can subscribe to the Unosquare.FFME.MediaElement.RenderingAudio event (http://unosquare.github.io/ffmediaelement/api/Unosquare.FFME.MediaElement.html#Unosquare_FFME_MediaElement_RenderingAudio), which carries RenderingAudioEventArgs; its GetBufferData() method (http://unosquare.github.io/ffmediaelement/api/Unosquare.FFME.Common.RenderingAudioEventArgs.html#Unosquare_FFME_Common_RenderingAudioEventArgs_GetBufferData) exposes the buffer so we can read and process it. libvlc can do something similar.
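Once the raw samples are available, the level computation itself is small. Here is a minimal sketch (not part of the question) of an RMS level in dBFS, assuming interleaved signed 16-bit PCM; a true LUFS figure would additionally need EBU R128 K-weighting and gating, for example via FFmpeg's ebur128 filter.

/* Sketch: RMS level in dBFS from a buffer of interleaved 16-bit PCM samples. */
#include <math.h>
#include <stddef.h>
#include <stdint.h>

double rms_dbfs(const int16_t *samples, size_t nb_samples)
{
    if (nb_samples == 0)
        return -INFINITY;                 /* empty buffer: treat as silence */
    double sum = 0.0;
    for (size_t i = 0; i < nb_samples; i++) {
        double s = samples[i] / 32768.0;  /* normalize to [-1, 1) */
        sum += s * s;
    }
    double rms = sqrt(sum / nb_samples);
    return 20.0 * log10(rms + 1e-12);     /* level relative to full scale */
}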


But I cannot find such functionality in libmpv. Does anyone have an idea how to solve this?


P.S. We plan to use mpv.net (https://github.com/hudec117/Mpv.NET-lib-/blob/master/src/Mpv.NET) for the move. This SDK can call ffmpeg for video and audio processing; perhaps someone has an idea how ffmpeg could be used to solve this problem?


-
Rotating an MP4 Video at the Packet Level Using FFmpeg's Java CPP Presets and Outputting as M3U8
16 March 2023, by Yunus Emre Guney Student — I'm attempting to rotate an MP4 video at the packet level and output it as an M3U8 using FFmpeg's JavaCPP presets. Here's what I've tried:


First, I created a rotationMatrixPointer as an IntPointer and allocated memory for it with avutil.av_malloc(size). Then, I set the display rotation to 90 degrees with avutil.av_display_rotation_set(rotationMatrixPointer, 90).

Next, I created a bytePointer from the rotationMatrixPointer and set its limit to rotationMatrixPointer.sizeof() * rotationMatrixPointer.limit(). I then added side data to the videoPkt with av_packet_add_side_data(videoPkt, AV_PKT_DATA_DISPLAYMATRIX, bytePointer, bytePointer.limit()).

If ret < 0, I print "cannot add side data". Finally, I write the packet with writePacket(videoPkt, (AVCodecContext) null) and release the videoPkt with av_packet_unref(videoPkt).

Packet-level rotation matrix side data, HLS stream output:


// size is 9 * Pointer.sizeof(IntPointer.class), as in the stream-level example below
IntPointer rotationMatrixPointer = new IntPointer(avutil.av_malloc(size)).capacity(size);
avutil.av_display_rotation_set(rotationMatrixPointer, 90);

// Wrap the matrix as bytes and attach it to the packet as display-matrix side data
BytePointer bytePointer = new BytePointer(rotationMatrixPointer);
bytePointer.limit(rotationMatrixPointer.sizeof() * rotationMatrixPointer.limit());
int ret = av_packet_add_side_data(videoPkt, AV_PKT_DATA_DISPLAYMATRIX, bytePointer, bytePointer.limit());
if (ret < 0) {
    System.out.println("cannot add side data");
}

// Write the packet and release it
writePacket(videoPkt, (AVCodecContext) null);
av_packet_unref(videoPkt);



Although I'm not getting any errors, the video isn't being rotated. However, if I add the rotation matrix at the stream level and output it as an MP4, it works properly.


Stream-level MP4 to MP4 is working:


int size = 9 * Pointer.sizeof(IntPointer.class);
IntPointer rotationMatrixPointer = new IntPointer(avutil.av_malloc(size)).capacity(size);

avutil.av_display_rotation_set(rotationMatrixPointer, rotation);

// Attach the display matrix to the stream (not to individual packets)
BytePointer bytePointer = new BytePointer(rotationMatrixPointer);
bytePointer.limit(rotationMatrixPointer.sizeof() * rotationMatrixPointer.limit());

ret = avformat.av_stream_add_side_data(stream, avcodec.AV_PKT_DATA_DISPLAYMATRIX, bytePointer, bytePointer.limit());
if (ret < 0) {
    // handle the error
}



What is the proper way to rotate an MP4 input and output it as an HLS stream (M3U8) at the packet level using FFmpeg? C++ examples are also acceptable.
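Since the question allows C/C++ examples, here is how that packet-level attachment is usually written directly against the C API. This is only a sketch of the mechanism the question already uses (av_packet_new_side_data allocates the 36-byte matrix itself) and does not by itself settle whether the hls muxer propagates packet-level display matrices.

#include <libavcodec/avcodec.h>
#include <libavutil/display.h>
#include <libavutil/error.h>

/* Attach a display-matrix side data entry (9 * int32_t = 36 bytes) to a packet. */
static int attach_rotation(AVPacket *pkt, double angle)
{
    uint8_t *side = av_packet_new_side_data(pkt, AV_PKT_DATA_DISPLAYMATRIX,
                                            9 * sizeof(int32_t));
    if (!side)
        return AVERROR(ENOMEM);
    av_display_rotation_set((int32_t *)side, angle);
    return 0;
}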


-
avcodec/mediacodec_wrapper: get api level via NDK method
10 January 2023, by Zhao Zhili — avcodec/mediacodec_wrapper: get api level via NDK method
android_get_device_api_level() is a static inline before API level 29. It was implemented via __system_property_get(). We can do the same thing, but I don't want to mess up with __system_property_get.
Signed-off-by: Zhao Zhili <zhilizhao@tencent.com>
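For reference, the pre-29 static inline the commit refers to boils down to reading a system property. A rough sketch of that fallback (the property name ro.build.version.sdk is the conventional one and is not quoted from the commit):

#include <stdlib.h>
#include <sys/system_properties.h>

/* Roughly what android_get_device_api_level() does before API level 29:
 * read ro.build.version.sdk with __system_property_get() and parse it. */
static int device_api_level_fallback(void)
{
    char value[PROP_VALUE_MAX] = { 0 };
    if (__system_property_get("ro.build.version.sdk", value) > 0)
        return atoi(value);
    return -1;
}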