
Other articles (50)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable release of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    As with the previous release, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications are also required (...)

  • Making files available

    14 April 2011, by

    By default, when it is initialized, MediaSPIP does not let visitors download files, whether they are originals or the result of their transformation or encoding; it only lets them be viewed.
    However, it is possible and easy to give visitors access to these documents in various forms.
    All of this happens in the skeleton's configuration page: go to the channel's administration area and choose in the navigation (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    For a working installation, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications are also required (...)

On other sites (7891)

  • avcodec/libvpxenc: fix setting amount of threads used for encoding

    19 May 2018, by James Almer
    avcodec/libvpxenc: fix setting amount of threads used for encoding
    

    The libvpx doxy says that a value of 0 for the g_threads field is
    equivalent to a value of 1, whereas for avctx->thread_count it means
    the maximum amount of threads possible for the host system.

    Use av_cpu_count() to get the correct thread count when auto threads
    is requested.

    Reviewed-by: James Zern <jzern@google.com>
    Signed-off-by: James Almer <jamrial@gmail.com>

    • [DH] libavcodec/libvpxenc.c
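
    As a rough illustration of the behaviour the commit describes (a minimal sketch only, not the actual libvpxenc.c change; resolve_vpx_threads is a hypothetical helper name):

    #include <libavutil/cpu.h>

    /* Minimal sketch: resolve FFmpeg's "0 = auto" thread count into an
     * explicit value before handing it to libvpx, where g_threads == 0
     * would mean a single thread instead of "all available cores". */
    static unsigned resolve_vpx_threads(int thread_count)
    {
        return thread_count > 0 ? (unsigned)thread_count
                                : (unsigned)av_cpu_count();
    }

    /* ... later, when filling the libvpx encoder configuration: */
    /* enccfg.g_threads = resolve_vpx_threads(avctx->thread_count); */
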
  • Getting raw h264 packets from USB camera on Raspberry Pi

    13 June 2018, by Aninano

    I am trying to receive H264 frames from a USB webcam connected to my Raspberry Pi.

    Using the RPi Camera Module I can run the following command to get H264 data written to stdout with close to zero latency: raspivid -t 0 -w 640 -h 320 -fps 15 -o -

    Is there an equivalent way to do this with a USB camera? I have two USB cameras I would like to do this with.

    Using ffprobe /dev/videoX I get the following output (shortened to the important details):

    $ ffprobe /dev/video0
    ...
    Input #0, video4linux2,v4l2, from '/dev/video0':
    Duration: N/A, start: 18876.273861, bitrate: 147456 kb/s
    Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1280x720, 147456 kb/s, 10 fps, 10 tbr, 1000k tbn, 1000k tbc

    $ ffprobe /dev/video1
    ...
    Input #0, video4linux2,v4l2, from '/dev/video1':
    Duration: N/A, start: 18980.783228, bitrate: 115200 kb/s
    Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 800x600, 115200 kb/s, 15 fps, 15 tbr, 1000k tbn, 1000k tbc


    $ ffprobe /dev/video2
    ...
    Input #0, video4linux2,v4l2, from '/dev/video2':
    Duration: N/A, start: 18998.984143, bitrate: N/A
    Stream #0:0: Video: h264 (Main), yuv420p(progressive), 1920x1080, -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 2000k tbc

    As far as I can tell, two of them are not H264, so they would need to be encoded to H264, which I understand adds some latency. But the third one (video2) is H264, so I should be able to get data from it? I've tried to just pipe it out with cat, but it says I got invalid arguments.

    I've come to the conclusion that using FFmpeg might be the only option here. I would like to use software that is easily available on every RPi (apt install).
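
    For reference, a command along the lines of ffmpeg -f v4l2 -input_format h264 -video_size 1920x1080 -i /dev/video2 -c:v copy -f h264 - should ask the v4l2 device for its native H264 stream and copy it to stdout without re-encoding (untested here; the device path and resolution are taken from the ffprobe output above).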

    Bonus question regarding H264 packets: When I stream the data from the raspivid command to my decoder it works perfectly. But if I drop the first 10 packets, the decoding process never initializes and I just get a black background. Does anyone know what might be in those first packets that I could recreate in my software, so I don't have to restart the stream for every newly connected user?

    EDIT: Bonus question answer: After googling around I see that the first two "frames" raspivid sends are not picture frames at all but the parameter sets listed below, so by ignoring the first two frames my decoder won't decode properly. If I save those frames and send them first to every new user, it works perfectly. They seem to be used in some kind of initialization process.

    0x27 = 0 01 00111 = type 7    Sequence parameter set (SPS)
    0x28 = 0 01 01000 = type 8    Picture parameter set (PPS)
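
    To make that answer concrete, here is a minimal sketch (not tested against the stream above; the function name and the 4-byte start-code assumption are mine) of how SPS/PPS NAL units could be detected and cached so they can be replayed to late-joining clients:

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical helper: scan an Annex-B H264 buffer for SPS (type 7) and
     * PPS (type 8) NAL units and copy them, start codes included, into a
     * cache that can be sent first to every newly connected client.
     * Assumes 4-byte 00 00 00 01 start codes, as raspivid emits. */
    static size_t cache_parameter_sets(const uint8_t *buf, size_t len,
                                       uint8_t *cache, size_t cache_cap)
    {
        size_t cached = 0;
        for (size_t i = 0; i + 4 < len; i++) {
            if (buf[i] || buf[i+1] || buf[i+2] || buf[i+3] != 1)
                continue;                        /* not a start code */
            int type = buf[i+4] & 0x1F;          /* low 5 bits = nal_unit_type */
            if (type != 7 && type != 8)
                continue;                        /* not SPS or PPS */
            size_t end = len;                    /* find start of the next NAL */
            for (size_t j = i + 4; j + 4 <= len; j++) {
                if (!buf[j] && !buf[j+1] && !buf[j+2] && buf[j+3] == 1) {
                    end = j;
                    break;
                }
            }
            if (cached + (end - i) <= cache_cap) {
                memcpy(cache + cached, buf + i, end - i);
                cached += end - i;
            }
        }
        return cached;   /* bytes of SPS/PPS data now in the cache */
    }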