
Other articles (44)
-
Enabling/disabling features (plugins)
18 February 2011 — To manage adding and removing extra features (or plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To access it, go to the configuration area and then open the "Gestion des plugins" (plugin management) page.
By default, MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...) -
Enabling visitor registration
12 April 2011 — It is also possible to enable visitor registration, which lets anyone open their own account on the channel in question, for open projects for example.
To do so, go to the site's configuration area and choose the "Gestion des utilisateurs" (user management) submenu. The first form shown corresponds to this feature.
By default, MediaSPIP created a menu item in the menu at the top of the page during its initialization, leading to (...) -
Publishing on MédiaSpip
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.
On other sites (6529)
-
pts and dts problems while encoding multiple streams to AVFormatContext with libavcodec and libavformat
20 November 2022, by WalleyM — I am trying to encode an mpeg2video stream and a signed 32-bit PCM audio stream to a .mov file using FFmpeg's avcodec and avformat libraries.


My video stream is set up in almost exactly the same way as described here, and my audio stream is set up in a very similar way.


My time_base for both audio and video is set to 1/fps.
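
In code, that setup would presumably look roughly like the following (a sketch only; videoContext, audioContext and fps are names assumed here, not taken from the question):

// Assumed time_base setup matching the description: 1/fps for both encoders
videoContext->time_base = AVRational{1, fps};
audioContext->time_base = AVRational{1, fps};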


Here is the overview output from setting up the encoder:




Output #0, mov, to ' /Recordings/SDI_Video.mov':
  Metadata:
    encoder         : Lavf59.27.100
  Stream #0:0: Video: mpeg2video (m2v1 / 0x3176326D), yuv420p, 1920x1080, q=2-31, 207360 kb/s, 90k tbn
  Stream #0:1: Audio: pcm_s32be (in32 / 0x32336E69), 48000 Hz, stereo, s32, 3072 kb/s



As I understand it, pts should be when the frame is presented, while dts should be when the frame is decoded. This means that audio and video frame pts should match, whereas dts should increase incrementally across them.


Essentially, interleaved audio and video frames should end up in the following pts and dts order:


pts: 1 1 2 2 3 3
dts: 1 2 3 4 5 6


I am using this format to set my pts and dts:


videoFrame->pts = frameCounter;

// Send the raw frame to the encoder
if(avcodec_send_frame(videoContext, videoFrame) < 0)
{
    std::cout << "Failed to send video frame " << frameCounter << std::endl;
    return;
}

// Prepare a packet to receive the encoded data
AVPacket videoPkt;
av_init_packet(&videoPkt);
videoPkt.data = nullptr;
videoPkt.size = 0;
videoPkt.flags |= AV_PKT_FLAG_KEY;
videoPkt.stream_index = 0;
videoPkt.dts = frameCounter * 2;

// Pull the encoded packet and hand it to the muxer
if(avcodec_receive_packet(videoContext, &videoPkt) == 0)
{
    av_interleaved_write_frame(outputFormatContext, &videoPkt);
    av_packet_unref(&videoPkt);
}



The audio path is the same, except:


audioPkt.stream_index = 1;
audioPkt.dts = frameCounter * 2 + 1;



However, I still get problems with my dts values, as shown in this output:




[mov @ 0x7fc1b3667480] Application provided invalid, non monotonically increasing dts to muxer in stream 0: 1 >= 0
[mov @ 0x7fc1b3667480] Application provided invalid, non monotonically increasing dts to muxer in stream 0: 2 >= 1
[mov @ 0x7fc1b3667480] Application provided invalid, non monotonically increasing dts to muxer in stream 0: 3 >= 2



I would like to fix this issue.
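
For reference, here is a minimal sketch of how timestamp handling is often done with these APIs, under the assumption that videoStream is the AVStream created for stream #0:0 (that name is not from the question): set pts on the frame in the encoder's time_base, let the encoder fill in the packet's pts and dts, and rescale both to the stream time base before muxing instead of writing dts by hand.

// Sketch only: encode one video frame and mux it, letting the encoder supply dts.
// Assumes videoContext (AVCodecContext*), videoStream (AVStream*),
// outputFormatContext (AVFormatContext*) and frameCounter as in the question.
videoFrame->pts = frameCounter;   // expressed in videoContext->time_base units (1/fps)

if (avcodec_send_frame(videoContext, videoFrame) < 0)
    return;

AVPacket* pkt = av_packet_alloc();
while (avcodec_receive_packet(videoContext, pkt) == 0)
{
    // Convert pts/dts/duration from the codec time base to the stream time base
    av_packet_rescale_ts(pkt, videoContext->time_base, videoStream->time_base);
    pkt->stream_index = videoStream->index;

    // The muxer interleaves packets by dts; per-stream dts stays monotonic
    av_interleaved_write_frame(outputFormatContext, pkt);
    av_packet_unref(pkt);
}
av_packet_free(&pkt);

The audio packets would go through the same rescaling against the audio stream's time_base; with that approach there is no need to derive dts from frameCounter at all.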


-
Discord.js voice stops playing audio after 10 consecutive files
29 April 2021, by Spiralio — I am trying to do the simple task of playing a single MP3 file when a command is run. The file is stored locally, and I have FFmpeg installed on my computer. The code below is part of my command's file:


const Discord = require("discord.js");
const fs = require('fs');
const { Client, RichEmbed } = require('discord.js');
const config = require("../config.json");

// Dispatcher and voice connection are kept between command invocations
let playing = undefined;
let connection = undefined;

module.exports.run = async (client, message, args, config) => {

    // Stop whatever is currently playing
    if (playing) playing.end()

    // Join the caller's voice channel only once and reuse the connection
    if (connection == undefined) await message.member.voice.channel.join().then((c) => {
        connection = c;
    })

    // Play the local MP3 and keep a reference to the dispatcher
    playing = connection.play('./sounds/sound.mp3')

}



(note that this code is heavily narrowed down to single out the issue)


When I run the command the first 9 times, it works perfectly: the file plays, and it is cut off if it is already playing. I also want to note that the file is 2 minutes long. However, once I play the file for exactly the 10th time, the bot stops playing audio entirely, as long as all 10 plays overlap (meaning I don't let the audio finish).


What's more confusing is that if an error is thrown after the bot stops playing audio, it appears in an entirely different format from the standard Discord.js errors. For example, this code does not check whether the user is in a voice channel, so if I purposefully crash the bot by running the command without being in a voice channel (after running the command 10 times), the error looks like this:


abort(RangeError: offset is out of bounds). Build with -s ASSERTIONS=1 for more info.
(Use `electron --trace-uncaught ...` to show where the exception was thrown)



(Preceded by a bunch of unformatted code.) This, however, is not consistent; it seems to only appear after letting the files play through entirely.


The issue only fixes itself when the entire bot restarts. Any help would be appreciated.


-
How Can I Configure Storybook to Use React-App-Rewired?
8 August 2022, by joseph — I'm working on a project that uses react-app-rewired to send headers to the server in order to bypass ReferenceError: SharedArrayBuffer is not defined (I'm getting this error from using the @ffmpeg/ffmpeg library).

// config-overrides.js
const {
  override,
  // disableEsLint,
  // addBabelPlugins,
  // overrideDevServer
} = require('customize-cra')

module.exports = {
  devServer(configFunction) {
    // eslint-disable-next-line func-names
    return function (proxy, allowedHost) {
      const config = configFunction(proxy, allowedHost)

      // Set loose allow origin header to prevent CORS issues
      config.headers = {
        'Access-Control-Allow-Origin': '*',
        'Cross-Origin-Opener-Policy': 'same-origin',
        'Cross-Origin-Embedder-Policy': 'require-corp',
        'Cross-Origin-Resource-Policy': 'cross-origin'
      }

      return config
    }
  }
}



// package.json
"scripts": {
 "start": "react-app-rewired start",
 "build": "react-app-rewired build",
 "test": "react-app-rewired test --transformIgnorePatterns \"node_modules/(?!siriwave)/\"",
 "eject": "react-scripts eject",
 "storybook": "start-storybook -p 6006 -s public",
 "build-storybook": "build-storybook -s public"
}



Though this works when I run npm start, meaning the headers get sent to the server, it doesn't work when I run npm run storybook, and I still get the SharedArrayBuffer is not defined error. I'm assuming it's because npm run storybook still uses react-scripts as opposed to react-app-rewired under the hood, but I'm not sure where I can change the configuration for this. Any ideas?