Other articles (2)

  • Custom menus

    14 November 2010

    MediaSPIP uses the Menus plugin to manage several configurable menus for navigation.
    This gives channel administrators the ability to configure these menus in detail.
    Menus created when the site is initialised
    By default, three menus are created automatically when the site is initialised: The main menu; Identifier: barrenav; This menu is generally inserted at the top of the page after the header block, and its identifier makes it compatible with skeletons based on Zpip; (...)

  • Making files available

    14 April 2011

    By default, when it is initialised, MediaSPIP does not allow visitors to download the files, whether they are originals or the result of their transformation or encoding. It only allows them to be viewed.
    However, it is possible, and easy, to give visitors access to these documents, in various forms.
    All of this happens on the skeleton configuration page. You need to go to the channel's administration area and choose, in the navigation, (...)

On other sites (2206)

  • ffmpeg caps audio bitrate to 128kb/s [closed]

    1 May 2013, by Andrew Burns

    I want to upconvert a file to 256 kb/s. (I understand the quality implications and all
    that; I really do want to upconvert 64 kb/s => 256 kb/s.)

    The command I am using:

    ffmpeg -i "Same Love.m4a" -acodec libfaac "Same Love.m4a" -b:a 256kb \
     -loglevel debug

    I have used every combination that I can think of and ffmpeg will only upconvert
    to 128 kb/s!

    Here is a log of my output:

    $ ffmpeg -i "Same Love.m4a" -acodec libfaac "ready_to_import_to_itunes/Same Love.m4a" -b:a 256kb -loglevel debug
    ffmpeg version 1.0 Copyright (c) 2000-2012 the FFmpeg developers
      built on Oct 12 2012 12:31:30 with Apple clang version 4.1 (tags/Apple/clang-421.11.66) (based on LLVM 3.1svn)
      configuration: --prefix=/usr/local/Cellar/ffmpeg/1.0 --enable-shared --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --cc=cc --host-cflags= --host-ldflags= --enable-libx264 --enable-libfaac --enable-libmp3lame --enable-libxvid
      libavutil      51. 73.101 / 51. 73.101
      libavcodec     54. 59.100 / 54. 59.100
      libavformat    54. 29.104 / 54. 29.104
      libavdevice    54.  2.101 / 54.  2.101
      libavfilter     3. 17.100 /  3. 17.100
      libswscale      2.  1.101 /  2.  1.101
      libswresample   0. 15.100 /  0. 15.100
      libpostproc    52.  0.100 / 52.  0.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc6ab006600] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc6ab006600] ISO: File Type Major Brand: mp42
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc6ab006600] File position before avformat_find_stream_info() is 30844
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc6ab006600] All info found
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc6ab006600] File position after avformat_find_stream_info() is 31216
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Same Love.m4a':
      Metadata:
        major_brand : mp42
        minor_version : 0
        compatible_brands : mp42isom3gp63g2a3gp4
        creation_time : 2012-08-11 09:56:26
      Duration: 00:05:18.71, start: 0.000000, bitrate: 64 kb/s
        Stream #0:0(und), 1, 1/44100: Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, s16, 64 kb/s
        Metadata:
          creation_time : 2012-08-11 09:56:26
          handler_name : soun
    [abuffer @ 0x7fc6aac0e560] Setting entry with key 'time_base' to value '1/44100'
    [abuffer @ 0x7fc6aac0e560] Setting entry with key 'sample_rate' to value '44100'
    [abuffer @ 0x7fc6aac0e560] Setting entry with key 'sample_fmt' to value 's16'
    [abuffer @ 0x7fc6aac0e560] Setting entry with key 'channel_layout' to value '0x3'
    [graph 0 input from stream 0:0 @ 0x7fc6aac0e660] tb:1/44100 samplefmt:s16 samplerate:44100 chlayout:0x3
    [aformat @ 0x7fc6aac0eba0] Setting entry with key 'sample_fmts' to value 's16'
    [aformat @ 0x7fc6aac0eba0] Setting entry with key 'channel_layouts' to value '0x4,0x3,0x7,0x107,0x37,0x3f'
    Output #0, ipod, to 'ready_to_import_to_itunes/Same Love.m4a':
      Metadata:
        major_brand : mp42
        minor_version : 0
        compatible_brands : mp42isom3gp63g2a3gp4
        encoder : Lavf54.29.104
        Stream #0:0(und), 0, 1/44100: Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, s16, 128 kb/s
        Metadata:
          creation_time : 2012-08-11 09:56:26
          handler_name : soun
    Stream mapping:
      Stream #0:0 -> #0:0 (aac -> libfaac)
    Press [q] to stop, [?] for help
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fc6ab006600] demuxer injecting skip 1024
    [aac @ 0x7fc6ab02f600] skip 2048 samples due to side data
    [aac @ 0x7fc6ab02f600] skip whole frame, skip left: 0
    EOF on sink link output stream 0:0:default.28.0kbits/s
    No more output streams to write to, finishing.
    size=    5028kB time=00:05:18.71 bitrate= 129.2kbits/s
    video:0kB audio:4974kB subtitle:0 global headers:0kB muxing overhead 1.093817%
    [AVIOContext @ 0x7fc6aac0ca20] Statistics: 2580581 bytes read, 0 seeks
    

    What I think is happening is that ffmpeg determines that there is really no
    benefit from going to 256k, so it caps it at 128.

    How do I force it to output the file at the specified bitrate? (I have tried
    with mp3 too and the same thing happens.)
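
    For what it's worth, a plausible cause (an assumption; the log does not state it) is option placement: ffmpeg applies output options to the output file that follows them on the command line, so a -b:a given after the output filename is ignored and libfaac falls back to the encoder default, which is the 128 kb/s seen above. A sketch of the reordered command, using the usual "k" suffix for the rate:

    ffmpeg -i "Same Love.m4a" -acodec libfaac -b:a 256k -loglevel debug \
     "ready_to_import_to_itunes/Same Love.m4a"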

  • How to write a Live555 FramedSource to allow me to stream H.264 live

    22 July 2015, by Garviel

    I’ve been trying to write a class that derives from FramedSource in Live555 that will allow me to stream live data from my D3D9 application to an MP4 or similar.

    What I do each frame is grab the backbuffer into system memory as a texture, then convert it from RGB -> YUV420P, then encode it using x264, then ideally pass the NAL packets on to Live555. I made a class called H264FramedSource that derived from FramedSource basically by copying the DeviceSource file. Instead of the input being an input file, I’ve made it a NAL packet which I update each frame.

    I’m quite new to codecs and streaming, so I could be doing everything completely wrong. In each doGetNextFrame() should I be grabbing the NAL packet and doing something like

    memcpy(fTo, nal->p_payload, nal->i_payload)

    I assume that the payload is my frame data in bytes? If anybody has an example of a class they derived from FramedSource that might at least be close to what I'm trying to do, I would love to see it; this is all new to me and a little tricky to figure out. Live555's documentation is pretty much the code itself, which doesn't exactly make it easy.
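
    For what it's worth, here is a minimal sketch of such a class, modeled on Live555's DeviceSource.cpp rather than on any official example. The names fNalPayload, fNalSize and pushNal() are hypothetical stand-ins for however the application hands nal->p_payload / nal->i_payload to the source, and a production version would signal new frames across threads with TaskScheduler::triggerEvent() rather than calling deliverFrame() directly:

    // Minimal sketch modeled on Live555's DeviceSource.cpp; fNalPayload,
    // fNalSize and pushNal() are hypothetical names, not part of Live555.
    #include "FramedSource.hh"
    #include "GroupsockHelper.hh"   // gettimeofday()
    #include <string.h>

    class H264FramedSource : public FramedSource {
    public:
        static H264FramedSource* createNew(UsageEnvironment& env) {
            return new H264FramedSource(env);
        }

        // Called by the application whenever x264 produces a new NAL unit.
        void pushNal(unsigned char* payload, unsigned size) {
            fNalPayload = payload;
            fNalSize    = size;
            deliverFrame();
        }

    protected:
        H264FramedSource(UsageEnvironment& env)
            : FramedSource(env), fNalPayload(NULL), fNalSize(0) {}

    private:
        virtual void doGetNextFrame() {
            // If a NAL unit is already pending, deliver it straight away;
            // otherwise deliverFrame() will be invoked later by pushNal().
            if (fNalSize > 0) deliverFrame();
        }

        void deliverFrame() {
            if (!isCurrentlyAwaitingData()) return;   // the sink isn't ready yet

            // Truncate if the NAL unit is larger than the buffer the sink gave us.
            if (fNalSize > fMaxSize) {
                fFrameSize = fMaxSize;
                fNumTruncatedBytes = fNalSize - fMaxSize;
            } else {
                fFrameSize = fNalSize;
                fNumTruncatedBytes = 0;
            }
            gettimeofday(&fPresentationTime, NULL);

            // fTo points into the sink's buffer: copy the raw NAL payload there,
            // exactly as the memcpy in the question suggests.
            memcpy(fTo, fNalPayload, fFrameSize);
            fNalSize = 0;

            // Tell the downstream object that a frame is now available.
            FramedSource::afterGetting(this);
        }

        unsigned char* fNalPayload;   // hypothetical: nal->p_payload
        unsigned       fNalSize;      // hypothetical: nal->i_payload
    };

    The x264 payload is indeed the encoded NAL unit's bytes, so the memcpy in the question is essentially right; the extra work is honouring fMaxSize (setting fNumTruncatedBytes when the NAL is too big) and stamping fPresentationTime before calling afterGetting().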

  • ffmpeg: How to stream to my own server?

    10 February 2018, by Ricky

    I have an app that can take in an HLS source and output to an RTMP endpoint, but what if I want to stream to my own server? How would I set up my own RTMP endpoint? I would love to be able to take a stream in and stream it to my own app so the user can see it. For example:

    If I stream to my own server like this:

    ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" -f flv "160.222.22.22"

    I have been googling all around and haven't found any tutorials or guides on doing this. Any pointers would be greatly appreciated.
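
    For what it's worth, -f flv expects a full RTMP URL (rtmp://host/app/streamkey) rather than a bare IP, and something has to be listening at that address, for example nginx built with the nginx-rtmp module or any other RTMP server. A sketch under those assumptions, where the application name "live" and the stream key "mystream" are placeholders defined by the server's own configuration:

    ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" \
     -c copy -f flv "rtmp://160.222.22.22/live/mystream"

    Here -c copy assumes the HLS source is already H.264/AAC, which FLV can carry; otherwise re-encode with -c:v libx264 -c:a aac. The app can then play the stream back from the same rtmp:// URL.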