Advanced search

Media (0)

Keyword: - Tags -/configuration

No media matching your criteria is available on the site.

Other articles (99)

  • Customising by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present the changes in your MédiaSPIP, or news about your projects on your MédiaSPIP, using the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the news creation form.
    News creation form: for a document of the news type, the fields offered by default are: Publication date (customise the publication date) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

On other sites (16239)

  • Interesting behavior in Media Source Extensions

    28 May 2020, by newtonian_fig

    I'm trying to build a fairly standard video player using Media Source Extensions; however, I want the user to be able to control when the player moves on to a new video segment. For example, we might see the following behavior:

    1. Video player plays 1st segment
    2. Source Buffer runs out of data causing the video to appear paused
    3. When the user is ready, they click a button that adds the 2nd segment to the Source Buffer
    4. The video continues by playing the 2nd segment

    This works well, except that when the video appears paused during step 2 it doesn't stop at the last frame of the 1st segment. Instead, it stops two frames before the end of the 1st segment. Those last two frames aren't being dropped; they just get played after the user clicks the button to advance the video. This is an issue for my application, and I'm trying to figure out a way to make sure all of the frames from the 1st segment get played before the end of step 2.

    I suspect that these last two frames are getting held up in the video decoder buffer, especially since calling endOfStream() on my Media Source after adding the 1st segment to the Source Buffer causes the 1st segment to play all the way through with no frames left behind.

    Additional Info

    • I created each video segment file from a series of PNGs using the following ffmpeg command:

      ffmpeg -i %04d.png -movflags frag_keyframe+empty_moov+default_base_moof video_segment.mp4

    • Maybe this is a clue? End of stream situations not handled correctly (last frames are dropped)
    • Another interesting thing to note is that if the video has only 2 frames or fewer, MSE doesn't play it at all.
    • The browser I'm using is Chrome. The code for my MSE player is just taken from the Google Developers example, but I'll post it here for completeness. This code only covers up to step 2, since that's where the issue is.

    <script>
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);
    mediaSource.addEventListener('sourceopen', sourceOpen, { once: true });

    function sourceOpen() {
      URL.revokeObjectURL(video.src);
      const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
      sourceBuffer.mode = 'sequence';

      // Fetch the video and add it to the Source Buffer
      fetch('https://s3.amazonaws.com/bucket_name/video_file.mp4')
      .then(response => response.arrayBuffer())
      .then(data => sourceBuffer.appendBuffer(data));
    }
    </script>
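    One idea that follows from the endOfStream() observation in the question: end the stream after each segment finishes appending so the decoder flushes the segment's final frames, and rely on the fact that, per the MSE spec, calling appendBuffer() while the MediaSource is in the 'ended' state transitions it back to 'open'. A minimal sketch of that idea, reusing mediaSource from the snippet above; the nextButton element and fetchNextSegment() helper are hypothetical placeholders, and it assumes sourceBuffer has been kept in an outer scope:

    // Append a segment, then end the stream once the append completes so the
    // decoder is flushed and the segment's last frames can be displayed.
    function appendAndFlush(sourceBuffer, data) {
      sourceBuffer.addEventListener('updateend', () => {
        if (mediaSource.readyState === 'open') {
          mediaSource.endOfStream();
        }
      }, { once: true });
      sourceBuffer.appendBuffer(data);   // appending while 'ended' reopens the MediaSource
    }

    // Hypothetical wiring (not part of the question): when the user clicks,
    // fetch the 2nd segment and append it, which reopens the ended stream.
    nextButton.addEventListener('click', async () => {
      const data = await fetchNextSegment();
      appendAndFlush(sourceBuffer, data);
    });

    This only applies the endOfStream() behaviour noted above at each segment boundary; whether it actually removes the two-frame lag would need to be verified in Chrome.
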
  • FFMPEG: Converting from raw audio to audio/mp4 (audio is being converted with slow speed)

    29 December 2017, by Valdir

    If I convert from MP3 to MP4 directly, everything works perfectly. But if I try to convert from raw PCM, the audio is slowed down.

    I’ve tried the following (this works):

    ffmpeg -i mp3/1.mp3 -strict -2 final.mp4

    This doesn’t work as expected:

    ffmpeg -f s16le -i final.raw -strict -2 -r 26 final.mp4

    With the following output:

    Input #0, s16le, from 'final.raw':
     Duration: 00:08:37.38, bitrate: 705 kb/s
       Stream #0:0: Audio: pcm_s16le, 44100 Hz, 1 channels, s16, 705 kb/s
    File 'final.mp4' already exists. Overwrite ? [y/N] y
    Output #0, mp4, to 'final.mp4':
     Metadata:
       encoder         : Lavf56.40.101
       Stream #0:0: Audio: aac ([64][0][0][0] / 0x0040), 44100 Hz, mono, fltp, 128 kb/s
       Metadata:
         encoder         : Lavc56.60.100 aac
    Stream mapping:
     Stream #0:0 -> #0:0 (pcm_s16le (native) -> aac (native))
    Press [q] to stop, [?] for help
    size=    8273kB time=00:08:37.38 bitrate= 131.0kbits/s
    video:0kB audio:8185kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.073808%

    I’ve tried to set parameters like:

    ffmpeg -ar 44100 -f s16le -i final.raw -strict -2 -r 26 final.mp4

    With no luck.

    In order to get the PCM from the MP3 I’m using the Node.js lame decoder:

    var decoder = new lame.Decoder({
        channels: 2,
        bitDepth: 16,
        sampleRate: 44100,
        bitRate: 128,
        outSampleRate: 44100, // 22050
        mode: lame.STEREO
    });
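
    A plausible cause, given the probe output above: with -f s16le and no -ac flag, ffmpeg assumes 1 channel, while the lame decoder is configured with channels: 2, so interleaved stereo samples are read as twice as many mono samples and the audio plays back slowed down. A sketch of the command with the input layout stated explicitly (assuming the raw data really is 16-bit stereo at 44.1 kHz; -r sets a video frame rate and can be dropped for an audio-only output):

    ffmpeg -f s16le -ar 44100 -ac 2 -i final.raw -strict -2 final.mp4
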
  • Unable to split audio using easy_audio_trimmer

    27 July 2023, by Sana Wasim

    Can we use the easy_audio_trimmer package to split an audio file? I tried using ffmpeg, but it conflicts with the above package and does not work.

    I tried splitting by using the functions below, and it gives an error at the FlutterFFmpeg() call; I can't find an alternative. The duration lookup, final durationResult = await flutterSound.duration(filePath);, also shows an error. A possible alternative using ffmpeg_kit_flutter is sketched after the dependencies below.

    Future<void> _splitAudio() async {
      setState(() {
        _progressVisibility = true;
      });

      // Get the application documents directory
      final appDocumentsDirectory = await getApplicationDocumentsDirectory();

      // Get the input audio file path
      final inputAudioPath = widget.file.path;

      // Get the output file names for the two parts
      final outputFileName1 = 'split_audio_part1.mp3';
      final outputFileName2 = 'split_audio_part2.mp3';

      // Get the output file paths for the two parts
      final outputPath1 = '${appDocumentsDirectory.path}/$outputFileName1';
      final outputPath2 = '${appDocumentsDirectory.path}/$outputFileName2';

      // Calculate the duration of the original audio
      final originalDuration = await _getAudioDuration(inputAudioPath);

      // Calculate the durations of the two parts
      final part1Duration = _startValue;
      final part2Duration = originalDuration - _endValue;

      // Construct the FFmpeg command to split the audio
      final ffmpeg = FlutterFFmpeg();
      final splitCommand = '-i $inputAudioPath -ss 0 -t $part1Duration -c copy $outputPath1 -ss $_endValue -t $part2Duration -c copy $outputPath2';

      try {
        // Execute the FFmpeg command to split the audio
        final int result = await ffmpeg.execute(splitCommand);

        if (result == 0) {
          setState(() {
            _progressVisibility = false;
          });
          debugPrint('Audio split successfully.');
        } else {
          setState(() {
            _progressVisibility = false;
          });
          debugPrint('Failed to split audio.');
        }
      } catch (error) {
        setState(() {
          _progressVisibility = false;
        });
        debugPrint('Error while splitting audio: $error');
      }
    }

    Future<int> _getAudioDuration(String filePath) async {
      final flutterSound = FlutterSound();
      final durationResult = await flutterSound.duration(filePath);
      return durationResult.inMilliseconds;
    }

    Dependencies

      path_provider: ^2.0.15
      ffmpeg_kit_flutter: ^5.1.0
      audioplayers: ^4.1.0
      flutter_sound: ^9.2.13

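    Since the pubspec pulls in ffmpeg_kit_flutter rather than flutter_ffmpeg, the FlutterFFmpeg class is simply not available, which would explain the error at FlutterFFmpeg(). A rough sketch of the split and duration steps rewritten against ffmpeg_kit_flutter 5.x (FFmpegKit / FFprobeKit); the function names runSplit and probeDurationSeconds are illustrative, not a tested drop-in:

    import 'package:flutter/foundation.dart' show debugPrint;
    import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
    import 'package:ffmpeg_kit_flutter/ffprobe_kit.dart';
    import 'package:ffmpeg_kit_flutter/return_code.dart';

    /// Runs the same split command as above through FFmpegKit instead of FlutterFFmpeg.
    Future<bool> runSplit(String splitCommand) async {
      final session = await FFmpegKit.execute(splitCommand);
      final returnCode = await session.getReturnCode();

      if (ReturnCode.isSuccess(returnCode)) {
        debugPrint('Audio split successfully.');
        return true;
      }
      debugPrint('Failed to split audio.');
      return false;
    }

    /// A possible replacement for the failing flutter_sound duration lookup,
    /// using FFprobeKit from the same package (duration is reported in seconds).
    Future<double?> probeDurationSeconds(String filePath) async {
      final session = await FFprobeKit.getMediaInformation(filePath);
      final durationText = session.getMediaInformation()?.getDuration();
      return durationText == null ? null : double.tryParse(durationText);
    }

    The split command string built in _splitAudio() can stay as it is; only the FlutterFFmpeg() call and the flutter_sound duration lookup need to change.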