
Other articles (106)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The HTML5 player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First of all, you need to create a SPIP article and attach the "source" video document to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)

On other sites (11091)

  • Converting mp4 Stream's Audio to MP3 Instead of AAC [on hold]

    15 January 2019, by user3763099

    I've written 2 programs: one records streams that I'm getting via video encoders, and the other plays back all of the streams synchronized to each other. I'm having an issue with audio, however. The recorded streams are encoded into AAC by default, and the playback tool is unable to find a codec to decode this format, so the audio is not ideal.

    I don't really have any control over what the stream looks like, but when I go to record the frames, is there any way to tell it to record in a different format? If not, it looks like I'll need to compile FFmpeg myself with AAC support, but I was hoping to avoid that.

    This was flagged as being "too broad," but I'm not sure how; it's pretty darn specific. I am asking if there is a way, via C++, to record a stream that is coming across with AAC audio as MP3 audio instead. Does that clarify? I literally cannot get any more specific than that.
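
    For reference only, and not the poster's code: if the recorder uses the FFmpeg libraries directly, storing MP3 instead of AAC means re-encoding the audio with an MP3 encoder rather than copying the AAC stream. Below is a minimal sketch of just selecting and opening that encoder with libavcodec, assuming FFmpeg 5.1+ headers and a build that includes libmp3lame; open_mp3_encoder is a hypothetical helper name.

    // Sketch only: pick an MP3 encoder instead of AAC when setting up the recorded output.
    // Assumes FFmpeg 5.1+ headers and a build with libmp3lame enabled; open_mp3_encoder is
    // a hypothetical helper, not part of the poster's programs.
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavutil/channel_layout.h>
    }

    AVCodecContext *open_mp3_encoder(int sample_rate) {
        // AV_CODEC_ID_MP3 resolves to the libmp3lame wrapper when it is compiled in
        const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MP3);
        if (!codec) return nullptr;                      // no MP3 encoder in this build

        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        if (!ctx) return nullptr;

        ctx->sample_rate = sample_rate;                  // e.g. 44100 or 48000
        ctx->bit_rate    = 192000;                       // 192 kb/s
        ctx->sample_fmt  = AV_SAMPLE_FMT_S16P;           // libmp3lame accepts s16p/fltp
        av_channel_layout_default(&ctx->ch_layout, 2);   // stereo (FFmpeg 5.1+ channel API)

        if (avcodec_open2(ctx, codec, nullptr) < 0) {
            avcodec_free_context(&ctx);
            return nullptr;
        }
        return ctx;  // feed decoded audio frames here instead of writing the AAC packets
    }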

  • Flutter: Failed assertion: 'file.absolute.existsSync()': is not true

    11 August 2022, by whatwhatwhat

    In my app, a user can send a file to others in a group chat. First, the user records some audio using their mic. The file is then touched up using FFMPEG. Then, the file is uploaded to Firebase Cloud Storage and, if this is successful, a record is written to Firebase Realtime Database.

    I'm getting the error below when the user records a long audio file and then presses submit. It almost seems as though FFMPEG hasn't finished processing the file... but I thought I used my async/await correctly to make sure that this processing is finished before moving on?

    ##MyAppFile## saveMyAppFileToCloudStorage Error: 'package:firebase_storage/src/reference.dart': Failed assertion: line 127 pos 12: 'file.absolute.existsSync()': is not true.

    Pseudo-code:

    1. User records audio
    2. Audio file is processed using FFMPEG and the new processed file is created on the user's phone
    3. User hits submit, uploading the file to Cloud Storage and, if successful, writing a record to Realtime Database

    Order of Functions After User Hits Submit:

    1. msgInput.dart -> sendMyAppFile()
    2. msgInput.dart -> prepareMyAppFileForSending()
    3. msgInput.dart -> runFFMPEGHighLow()
    4. message_dao.dart -> sendMyAppFile()
    5. message_dao.dart -> saveMyAppFileToCloudStorage() //ERROR COMES FROM THIS FUNCTION

    The Code:

    //msgInput.dart
    Future<void> sendMyAppFile() async {
      if (sendableMyAppFileExists == 1) {
        final MyAppFileReadyToBeSent = await prepareMyAppFileForSending();

        if (MyAppFileReadyToBeSent == '1') {
          messageDao.sendMyAppFile(MyAppFile, filepath, filename);
        } else {

        }
      }

      setState(() {
        sendableMyAppFileExists = 0;
      });
    }

    Future<String> prepareMyAppFileForSending() async {
      if (sendableMyAppFileExists == 1) {
        if (recordedMyAppFileFilterID == '1') {

          await runFFMPEGHighLow('1');

          return '1';
        }

        if (recordedMyAppFileFilterID == '2') {

          await runFFMPEGHighLow('2');

          return '1';
        }
      }

      return '0';
    }

    Future<void> runFFMPEGHighLow(String filterID) async {
      if (filterID != '1' && filterID != '2') {
        return;
      }

      if (sendableMyAppFileExists == 1) {
        if (filterID == '1') {

          await FFmpegKit.executeAsync(/*...parms...*/);
          setState(() {
            currentMyAppFileFilename = currentMyAppFileFilename + '1.mp3';
          });

        }

        if (filterID == '2') {

          await FFmpegKit.executeAsync(/*...parms...*/);
          setState(() {
            currentMyAppFileFilename = currentMyAppFileFilename + '2.mp3';
          });

        }
      }
    }

    //message_dao.dart
    void sendMyAppFile(ChatData MyAppFile, String filepath, String filename) {
      saveMyAppFileToCloudStorage(filepath, filename).then((value) {
        if (value == true) {
          saveMyAppFileToRTDB(MyAppFile);
        }
      });
    }

    Future<bool> saveMyAppFileToCloudStorage(String filepath, String filename) async {
      //filepath: /data/user/0/com.example.MyApp/app_flutter/MyApp/MyAppAudioFiles/MyAppFiles/2d7af6ae-6361-4be5-8209-8498dd17d77d1.mp3
      //filename: 2d7af6ae-6361-4be5-8209-8498dd17d77d1.mp3

      _firebaseStoragePath = MyAppFileStorageDir + filename;

      File file = File(filepath);

      try {
        await _firebaseStorage
            .ref(_firebaseStoragePath)
            .putFile(file);
        return true;
      } catch (e) {
        print('##MyAppFile## saveMyAppFileToCloudStorage Error: ' + e.toString()); //ERROR COMES FROM THIS LINE
        return false;
      }
    }

  • Record video of a specific window with ffmpeg

    6 March 2012, by Esteban Angee

    I'm currently working with Ubuntu 10.04 and ffmpeg. Here is my situation:

    I have this command, which creates a window and plays a video in it:

    video_handle/static/simpleVRML media/generated/video1330515739317/chunk0.avi

    I need to record the video that is being displayed in that video container and save it to a video file; webm is preferred. The video length is exactly 1 second and the fps is 29.97.

    I have already tried this command:

    ffmpeg -loglevel panic -f x11grab -s 640x480 -r 25 -i :0.0+0,50 -vframes 30 -sameq -y out.mpg >/dev/null 2>&1

    It actually records the screen as the container emerges, but I need the output to be really accurate.

    Any ideas???
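
    For reference only, and not part of the question: the same x11grab capture that the command above performs can also be driven from code through libavdevice. The sketch below is an illustration under stated assumptions (FFmpeg 5+ development headers, an x11grab-enabled build); it opens the region at :0.0+0,50 and reads 30 frames, mirroring the -s, -r and -vframes options from the command above.

    // Sketch only: open the x11grab device from C++ and pull the same ~30 frames.
    // Assumes FFmpeg 5+ development headers and a build with libavdevice/x11grab enabled.
    extern "C" {
    #include <libavdevice/avdevice.h>
    #include <libavformat/avformat.h>
    #include <libavutil/dict.h>
    }

    int main() {
        avdevice_register_all();                         // registers "x11grab" as an input format

        const AVInputFormat *x11 = av_find_input_format("x11grab");
        if (!x11) return 1;                              // this build has no x11grab support

        AVDictionary *opts = nullptr;
        av_dict_set(&opts, "video_size", "640x480", 0);  // same area as -s 640x480
        av_dict_set(&opts, "framerate", "25", 0);        // same rate as -r 25

        AVFormatContext *fmt = nullptr;
        // ":0.0+0,50" = display :0.0 with an x=0, y=50 offset, as in -i :0.0+0,50
        if (avformat_open_input(&fmt, ":0.0+0,50", x11, &opts) < 0) return 1;

        AVPacket *pkt = av_packet_alloc();
        for (int i = 0; i < 30 && av_read_frame(fmt, pkt) >= 0; ++i) {
            // 30 raw frames, mirroring -vframes 30; encode/mux them as needed
            av_packet_unref(pkt);
        }

        av_packet_free(&pkt);
        avformat_close_input(&fmt);
        av_dict_free(&opts);
        return 0;
    }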