
Media (91)

Other articles (52)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the mutualisation on a regular basis. Coupled with a system Cron on the central site of the mutualisation, this makes it possible to generate regular visits to the various sites and to prevent the tasks of rarely visited sites from being too (...)
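    As an illustration of the system-Cron half of this setup, a hypothetical crontab entry on the central server could look like the following (the URL is a placeholder, not MediaSPIP's actual endpoint); each minute-by-minute hit then lets the super Cron fan out to every instance:

    # hypothetical crontab entry on the central server of the mutualisation
    * * * * * wget -q -O /dev/null "http://central.example.org/"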

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)
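    As a generic sketch of this pattern (not MediaSPIP’s actual markup; file paths and player location are placeholders), an HTML5 tag with a Flash object as fallback looks like this:

    <video controls width="640" height="360">
      <source src="IMG/video/example.mp4" type="video/mp4" />
      <!-- browsers without HTML5 video fall through to the Flash player -->
      <object type="application/x-shockwave-flash" data="flowplayer.swf" width="640" height="360">
        <param name="movie" value="flowplayer.swf" />
        <param name="flashvars" value='config={"clip":"IMG/video/example.mp4"}' />
      </object>
    </video>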

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (9028)

  • I'm not able to play my video on my phone after merging it with ffmpeg in Flutter?

    16 June 2021, by Muhammad Arbaz Zafar

    This is my code:

    import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';
    import 'package:path_provider/path_provider.dart' as syspaths;

    // _storedVideoOne, _storedVideoTwo and MediaStore are defined elsewhere
    // in the project; only the merge routine is shown here.
    void _videoMerger() async {
      final appDir = await syspaths.getApplicationDocumentsDirectory();
      String rawDocumentPath = appDir.path;
      final outputPath = '$rawDocumentPath/outputPath.mp4';
      final FlutterFFmpeg _flutterFFmpeg = new FlutterFFmpeg();
      String commandToExecute = ' -i ${_storedVideoOne.path} -i ${_storedVideoTwo.path} -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[out]" -map "[out]" $outputPath.mp4';
      _flutterFFmpeg.execute(commandToExecute).then((rc) => print("FFmpeg process exited with rc $rc"));

      await MediaStore.saveFile(outputPath);
    }

    The merge completes and I get the file on my phone, but it won't play. I then built the code in an emulator and merged the videos there successfully, but the merged video still won't play. Does anyone have a solution?


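    Two details of the snippet above stand out (an observation, not a confirmed diagnosis): the command string ends in $outputPath.mp4, so FFmpeg writes to outputPath.mp4.mp4 while MediaStore.saveFile() is handed outputPath.mp4, and execute() returns a Future that is never awaited, so the save can run before the merge has finished. A hedged sketch of a corrected version, reusing the question’s own fields:

    // Sketch only: _storedVideoOne, _storedVideoTwo and MediaStore are
    // assumed to exist as in the question.
    Future<void> _videoMerger() async {
      final appDir = await syspaths.getApplicationDocumentsDirectory();
      final outputPath = '${appDir.path}/outputPath.mp4';
      final flutterFFmpeg = FlutterFFmpeg();
      // Single ".mp4": FFmpeg now writes to the exact path saveFile() receives.
      final rc = await flutterFFmpeg.execute(
          '-i ${_storedVideoOne.path} -i ${_storedVideoTwo.path} '
          '-filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[out]" '
          '-map "[out]" $outputPath');
      print('FFmpeg process exited with rc $rc');
      if (rc == 0) {
        // Save only after the merge has actually finished.
        await MediaStore.saveFile(outputPath);
      }
    }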

  • mp4 generated using MediaMuxer doesn't play in ffmpeg

    7 February 2019, by APD

    I need to generate an mp4 video from a raw H.264 bitstream only (so there is no audio channel). To do this, I used Android’s MediaMuxer APIs. The generated mp4 files seem fine and play well in VLC media player, but they don’t play in the Chrome browser (on an Ubuntu machine), and the same is true of ffmpeg.
    Another thing I noticed: the mp4 files generated on Android versions after Nougat (i.e., Oreo and later) are fine (they play in Chrome and ffmpeg handles them well), whereas those generated on devices before Oreo have this problem.
    Has anybody observed such behavior, and what can be done to handle it on Android devices before Oreo?
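    One way to narrow such a problem down (a suggestion, assuming ffprobe is available) is to dump the container and stream metadata of one playable and one non-playable file and compare the output, paying particular attention to the codec profile and the H.264 parameter sets:

    ffprobe -v error -show_format -show_streams broken.mp4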

  • Google Speech - Streaming Request Returns EOF Error

    16 October 2017, by Josh

    Using Go, I’m taking an RTMP stream, transcoding it to FLAC (using ffmpeg) and attempting to stream the audio to Google’s Speech API for transcription. However, I keep getting EOF errors when sending the data. I can’t find any information on this error in the docs, so I’m not exactly sure what’s causing it.

    I’m chunking the received data into 3s clips (length isn’t relevant as long as it’s less than the maximum length of a streaming recognition request).

    Here is the core of my code:

     package main

     import (
        "bufio"
        "bytes"
        "context"
        "log"
        "os"
        "os/exec"
        "os/signal"
        "syscall"
        "time"

        speech "cloud.google.com/go/speech/apiv1"
        speechpb "google.golang.org/genproto/googleapis/cloud/speech/v1"
     )

     func main() {

       done := make(chan os.Signal)
       received := make(chan []byte)

       go receive(received)
       go transcribe(received)

       signal.Notify(done, os.Interrupt, syscall.SIGTERM)

       select {
       case <-done:
           os.Exit(0)
       }
    }

    func receive(received chan<- []byte) {
       var b bytes.Buffer
       stdout := bufio.NewWriter(&b)

       cmd := exec.Command("ffmpeg", "-i", "rtmp://127.0.0.1:1935/live/key", "-f", "flac", "-ar", "16000", "-")
       cmd.Stdout = stdout

       if err := cmd.Start(); err != nil {
           log.Fatal(err)
       }

       duration, _ := time.ParseDuration("3s")
       ticker := time.NewTicker(duration)

       for {
           select {
           case <-ticker.C:
               stdout.Flush()
               log.Printf("Received %d bytes", b.Len())
               received <- b.Bytes()
               b.Reset()
           }
       }
    }

    func transcribe(received <-chan []byte) {
       ctx := context.TODO()

       client, err := speech.NewClient(ctx)
       if err != nil {
           log.Fatal(err)
       }

       stream, err := client.StreamingRecognize(ctx)
       if err != nil {
           log.Fatal(err)
       }

       // Send the initial configuration message.
       if err = stream.Send(&speechpb.StreamingRecognizeRequest{
           StreamingRequest: &speechpb.StreamingRecognizeRequest_StreamingConfig{
               StreamingConfig: &speechpb.StreamingRecognitionConfig{
                   Config: &speechpb.RecognitionConfig{
                       Encoding:        speechpb.RecognitionConfig_FLAC,
                       LanguageCode:    "en-GB",
                       SampleRateHertz: 16000,
                   },
               },
           },
       }); err != nil {
           log.Fatal(err)
       }

       for {
           select {
           case data := <-received:
               if len(data) > 0 {
                   log.Printf("Sending %d bytes", len(data))
                   if err := stream.Send(&speechpb.StreamingRecognizeRequest{
                       StreamingRequest: &speechpb.StreamingRecognizeRequest_AudioContent{
                           AudioContent: data,
                       },
                   }); err != nil {
                       log.Printf("Could not send audio: %v", err)
                   }
               }
           }
       }
    }

    Running this code gives this output:

    2017/10/09 16:05:00 Received 191704 bytes
    2017/10/09 16:05:00 Saving 191704 bytes
    2017/10/09 16:05:00 Sending 191704 bytes
    2017/10/09 16:05:00 Could not send audio: EOF

    2017/10/09 16:05:03 Received 193192 bytes
    2017/10/09 16:05:03 Saving 193192 bytes
    2017/10/09 16:05:03 Sending 193192 bytes
    2017/10/09 16:05:03 Could not send audio: EOF

    2017/10/09 16:05:06 Received 193188 bytes
    2017/10/09 16:05:06 Saving 193188 bytes
    2017/10/09 16:05:06 Sending 193188 bytes // Notice that this doesn't error

    2017/10/09 16:05:09 Received 191704 bytes
    2017/10/09 16:05:09 Saving 191704 bytes
    2017/10/09 16:05:09 Sending 191704 bytes
    2017/10/09 16:05:09 Could not send audio: EOF

    Notice that not all of the Sends fail.

    Could anyone point me in the right direction here? Is it something to do with the FLAC headers? I also wonder whether resetting the buffer causes some of the data to be dropped (i.e., it’s a non-trivial operation that takes some time to complete) and the API doesn’t like the missing information.

    Any help would be really appreciated.
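    Two things in the snippet are worth checking (a sketch of a pitfall, not a confirmed diagnosis). First, b.Bytes() returns a slice that aliases the buffer’s internal array, and that buffer is written by os/exec’s copying goroutine while the ticker goroutine flushes and resets it, with no synchronization; after b.Reset(), later writes can overwrite bytes that transcribe has not yet sent. Copying the chunk before resetting avoids at least the aliasing, as this self-contained demo shows:

    package main

    import (
       "bytes"
       "fmt"
    )

    func main() {
       var b bytes.Buffer
       b.WriteString("chunk-1")

       alias := b.Bytes() // aliases b's internal array
       safe := make([]byte, b.Len())
       copy(safe, b.Bytes()) // independent copy

       b.Reset()                // keeps the same backing array
       b.WriteString("chunk-2") // overwrites the aliased bytes

       fmt.Printf("aliased slice: %q\n", alias) // prints "chunk-2"
       fmt.Printf("copied slice:  %q\n", safe)  // prints "chunk-1"
    }

    Second, with gRPC client streams an io.EOF returned by Send only signals that the stream has already been torn down; the real status comes from a subsequent stream.Recv() call, which is why the EOF itself carries no detail.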