
Other articles (96)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents a few of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Mediabox: opening images in the maximum space available to the user

    8 February 2011, by

    Image display is constrained by the width allowed by the site's design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of all the space available on the user's screen, it is possible to add a feature that displays the image in a multimedia box appearing above the rest of the content.
    To do this, the "Mediabox" plugin must be installed.
    Configuring the multimedia box
    As soon as (...)

  • Support for all types of media

    10 April 2011

    Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

On other sites (18400)

  • IframeExtractor doesn’t output sound with rtsp

    9 January 2013, by Kamax

    I use IframeExtractor from the mooncatventure git repository; it plays the .mov file nicely.
    But when I try to read an rtsp stream, I hear no sound.

    This is the FFmpeg dump from the rtsp stream:

    Metadata:
    title           : unknown
    comment         : unknown
    Duration: N/A, start: 49435.000589, bitrate: 258 kb/s
    Program 3223
    No Program
    Stream #0:0: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p, 720x576 [SAR 64:45 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
    Stream #0:1(fra): Audio: aac ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 142 kb/s
    Stream #0:2(fra): Subtitle: dvb_teletext ([6][0][0][0] / 0x0006)
    Stream #0:3(qad): Audio: aac ([15][0][0][0] / 0x000F), 48000 Hz, mono, fltp, 47 kb/s
    Stream #0:4(qaa): Audio: aac ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 68 kb/s

    And this is the dump from the local .mov file that works:

    Metadata:
    major_brand     : qt  
    minor_version   : 0
    compatible_brands: qt  
    creation_time   : 2010-01-17 21:52:33
    model           : iPhone 3GS
    model-eng       : iPhone 3GS
    date            : 2010-01-17T16:52:33-0500
    date-eng        : 2010-01-17T16:52:33-0500
    encoder         : 3.1.2
    encoder-eng     : 3.1.2
    make            : Apple
    make-eng        : Apple
    Duration: 00:00:03.25, start: 0.000000, bitrate: 3836 kb/s
    Stream #0:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p, 640x480, 3695 kb/s, 30.02 fps, 30 tbr, 600 tbn, 1200 tbc
    Metadata:
     rotate          : 90
     creation_time   : 2010-01-17 21:52:33
     handler_name    : Core Media Data Handler
    Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 63 kb/s
    Metadata:
     creation_time   : 2010-01-17 21:52:33
     handler_name    : Core Media Data Handler

    The audio class that manages sounds contains a codec detector, which reports that the codec CODEC_ID_AAC is found for both inputs:

    audioStreamBasicDesc_.mFormatFlags = 0;
    switch (_audioCodecContext->codec_id) {
        case CODEC_ID_MP3:
            audioStreamBasicDesc_.mFormatID = kAudioFormatMPEGLayer3;
            break;
        case CODEC_ID_AAC:
            audioStreamBasicDesc_.mFormatID = kAudioFormatMPEG4AAC;
            audioStreamBasicDesc_.mFormatFlags = kMPEG4Object_AAC_Main;
            NSLog(@"audio format aac %s (%d) is supported", _audioCodecContext->codec_name, _audioCodecContext->codec_id);
            break;
    }

    I see data going into the buffer, but I hear nothing. Maybe audioStreamBasicDesc_ has the wrong settings, but I can’t find what.

    Is it possible that it’s not the same AAC codec?

    Has anyone experienced the same issue?

    Any help is welcome; I’ve been stuck on this problem for a few days now.

    Edit:
    I have found an error that I did not have before, and I don’t know how to resolve it. If I change audioStreamBasicDesc.mFramesPerPacket to 0, or divide it by 2, the error message disappears.

    AudioConverterNew returned 'fmt?'
    Prime failed ('fmt?'); will stop (72000/0 frames)
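
    A possible reading of that error, offered as an assumption rather than a confirmed fix: 'fmt?' is kAudioFormatUnsupportedDataFormatError, which AudioConverterNew returns when it does not accept the description it is given, and the rtsp audio streams in the dump above look like plain AAC-LC rather than AAC Main. Below is a minimal C sketch of how the description might be filled from the FFmpeg codec context; fillAACDescription is a hypothetical helper, and the field names follow the 2013-era FFmpeg API used in the question.

    /* Illustrative sketch, not the asker's code: fill the converter's input
     * description for a compressed AAC stream from the FFmpeg codec context. */
    #include <string.h>
    #include <AudioToolbox/AudioToolbox.h>
    #include <libavcodec/avcodec.h>

    static void fillAACDescription(AudioStreamBasicDescription *asbd,
                                   const AVCodecContext *ctx)
    {
        memset(asbd, 0, sizeof(*asbd));
        asbd->mFormatID         = kAudioFormatMPEG4AAC;
        asbd->mFormatFlags      = kMPEG4Object_AAC_LC;  /* broadcast AAC is usually LC, not Main */
        asbd->mSampleRate       = ctx->sample_rate;     /* 48000 Hz in the rtsp dump above */
        asbd->mChannelsPerFrame = ctx->channels;        /* stereo or mono, per stream */
        asbd->mFramesPerPacket  = 1024;                 /* one AAC access unit carries 1024 samples */
        /* Compressed formats leave the byte-size fields at 0. */
        asbd->mBytesPerPacket   = 0;
        asbd->mBytesPerFrame    = 0;
        asbd->mBitsPerChannel   = 0;
    }
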
  • Cannot play local rtsp URL using FFmpeg in iOS

    12 June 2015, by Priyanka

    I am using FFmpeg to stream an rtsp URL in iOS.
    I am trying to stream a local URL, but my app fails to open it:
    the avformat_open_input method always returns -5.

    I have played the same URL, rtsp://172.16.1.226:5544/1, in the VLC media player on my iPhone and MacBook, and it works on both.

    After some research, I found that there is a problem with rtsp_transport.

    I was using av_dict_set(&serverOpt, "rtsp_transport", "tcp", 0); for the server configuration when opening the URL, and the result was that the feed could not be opened.

    When I changed it to av_dict_set(&serverOpt, "rtsp_transport", "udp", 1);
    I was able to open the URL successfully, but I continuously get errors like rtsp 1 missing packet, and so on.

    Can anybody help with what the right configuration should be when opening a local rtsp URL using ffmpeg?
    Do I need to change av_dict_set(&serverOpt, "rtsp_transport", "udp", 1)?

    Thanks in advance
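
    For reference, a minimal C sketch (not the asker’s code) of the usual libavformat pattern for opening an rtsp URL with an explicit transport option and printing a readable error instead of the raw return code; on iOS, -5 is AVERROR(EIO), a generic I/O error. The URL in the usage comment is the one from the question; open_rtsp is a hypothetical helper.

    #include <stdio.h>
    #include <libavformat/avformat.h>

    static int open_rtsp(const char *url, const char *transport)
    {
        AVFormatContext *fmt = NULL;
        AVDictionary *opts = NULL;
        char errbuf[AV_ERROR_MAX_STRING_SIZE];
        int ret;

        /* Ask the rtsp demuxer for a specific lower transport: "tcp" or "udp". */
        av_dict_set(&opts, "rtsp_transport", transport, 0);

        ret = avformat_open_input(&fmt, url, NULL, &opts);
        av_dict_free(&opts);
        if (ret < 0) {
            av_strerror(ret, errbuf, sizeof(errbuf));
            fprintf(stderr, "open %s over %s failed: %s\n", url, transport, errbuf);
            return ret;
        }

        ret = avformat_find_stream_info(fmt, NULL);
        if (ret >= 0)
            av_dump_format(fmt, 0, url, 0);  /* same stream listing the ffmpeg tool prints */

        avformat_close_input(&fmt);
        return ret;
    }

    /* Usage, after av_register_all() and avformat_network_init():
     *     open_rtsp("rtsp://172.16.1.226:5544/1", "tcp");
     */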

  • C - FFmpeg streaming from a C program?

    16 August 2017, by golmschenk

    I’m looking to replicate an FFmpeg command-line command in my C code. Specifically, I would like to be able to run:

    ffmpeg -re -i video.mp4 -f mpegts udp://localhost:7777

    One thing I’ve noticed when looking at code from people who have used the FFmpeg libraries is that they often have a few hundred lines of code for a single command similar to an FFmpeg command-line command. I’m guessing this is just because they are doing something very specific; if I can run that short command on my command line and get what I want, it should probably only take about ten lines of code to do the same thing in my C code. It should only take about that much work, right? Why would it take much more?

    I’m having a bit of difficulty finding explanations of how to use the streaming capabilities of the FFmpeg libraries that aren’t overly complex because they target a very specific purpose. Can anyone explain how I might go about writing the code for the above command? Or at the very least point me to some documentation explaining how to write such a script/program? Thank you very much!

    EDIT: I do hope to run this from an iPhone app eventually, so I won’t just be able to call the FFmpeg executable from my program. I’ll need to use the libraries FFmpeg itself uses.
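
    For what it’s worth, here is a hedged sketch of what that command does through the libavformat API, following the pattern of FFmpeg’s own doc/examples/remuxing.c (3.x-era API): open the input, create an mpegts muxer writing to the UDP URL, copy the stream parameters, and loop over packets while rescaling their timestamps. Error handling and the pacing that -re provides (sleeping so packets leave at the source rate) are reduced to comments, so it is longer than ten lines but nowhere near a few hundred.

    #include <libavformat/avformat.h>

    int main(void)
    {
        const char *in = "video.mp4", *out = "udp://localhost:7777";
        AVFormatContext *ic = NULL, *oc = NULL;
        AVPacket pkt;

        av_register_all();
        avformat_network_init();

        if (avformat_open_input(&ic, in, NULL, NULL) < 0) return 1;
        if (avformat_find_stream_info(ic, NULL) < 0) return 1;

        /* "-f mpegts udp://localhost:7777": an MPEG-TS muxer writing to a UDP sink. */
        avformat_alloc_output_context2(&oc, NULL, "mpegts", out);
        for (unsigned i = 0; i < ic->nb_streams; i++) {
            AVStream *os = avformat_new_stream(oc, NULL);
            avcodec_parameters_copy(os->codecpar, ic->streams[i]->codecpar);
            os->codecpar->codec_tag = 0;  /* let the mpegts muxer pick its own tags */
        }
        if (!(oc->oformat->flags & AVFMT_NOFILE))
            avio_open(&oc->pb, out, AVIO_FLAG_WRITE);
        avformat_write_header(oc, NULL);

        while (av_read_frame(ic, &pkt) >= 0) {
            AVStream *is = ic->streams[pkt.stream_index];
            AVStream *os = oc->streams[pkt.stream_index];
            /* Convert timestamps from the input stream's time base to the output's. */
            av_packet_rescale_ts(&pkt, is->time_base, os->time_base);
            pkt.pos = -1;
            /* "-re" would sleep here so packets are sent at the source frame rate. */
            av_interleaved_write_frame(oc, &pkt);
            av_packet_unref(&pkt);
        }

        av_write_trailer(oc);
        avformat_close_input(&ic);
        if (!(oc->oformat->flags & AVFMT_NOFILE)) avio_closep(&oc->pb);
        avformat_free_context(oc);
        return 0;
    }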