
Media (91)

Other articles (34)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)

  • Customising by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present changes to your MédiaSPIP, or news about your projects on your MédiaSPIP, using the news section.
    In MédiaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form For a document of the news type, the fields offered by default are: Publication date (customise the publication date) (...)

On other sites (4115)

  • FFMPEG FFPROBE Get Frame Count On PowerShell [duplicate]

    27 April 2020, by ilham zacky

    I am new to FFmpeg. I have some audio and video files and I need to get their duration including frames, in the format H:M:S:F ("00:00:00:00"). I am using PowerShell.

    Currently my duration works, but instead of a frame count it prints a decimal value (see the conversion sketch after the output below).
    Note: in my output the maximum number of frames should be 30.

    This is my code:

    # File names derived from the current $id
    $audioId = "$id.m4a"
    $videoId = "$id.mp4"

    # Grab the "Duration: HH:MM:SS.ff" value from the ffprobe/ffmpeg banner output
    $duration1 = if ((ffprobe -i $audioId 2>&1 | Out-String) -match 'Duration:\s+([\d:.]+)') { $matches[1] }
    $duration  = if ((ffmpeg -i $videoId 2>&1 | Out-String) -match 'Duration:\s+([\d:.]+)') { $matches[1] }

    # Swap the decimal separator for a colon (this still leaves hundredths of a
    # second, not a frame count)
    $newduration1 = ("$duration1").Replace(".", ":")
    $newduration  = ("$duration").Replace(".", ":")

    echo $duration1
    echo $duration

    My output looks like this:

    00:00:03.48
    00:00:03.46
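
    As a rough sketch of the arithmetic this question is after (written in C++ purely to illustrate the calculation, not as PowerShell advice; the 30 fps figure is taken from the note above rather than probed from the file): the fractional part of the ffprobe duration is seconds, so it has to be multiplied by the frame rate instead of just swapping "." for ":".

    // Hypothetical sketch: turn an ffprobe-style "HH:MM:SS.ff" duration into an
    // "HH:MM:SS:FF" timecode, assuming a constant frame rate of 30 fps.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double fps = 30.0;               // assumed constant frame rate
        const char *duration = "00:00:03.48";  // value reported by ffprobe above

        int h = 0, m = 0;
        double s = 0.0;
        if (std::sscanf(duration, "%d:%d:%lf", &h, &m, &s) == 3) {
            int whole = static_cast<int>(s);   // whole seconds
            // 0.48 s * 30 fps ~= 14 frames (a carry when this rounds up to a
            // full second is ignored in this sketch).
            int frames = static_cast<int>(std::lround((s - whole) * fps));
            std::printf("%02d:%02d:%02d:%02d\n", h, m, whole, frames);  // 00:00:03:14
        }
        return 0;
    }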

  • Frame-by-frame video decoding and processing on Android

    22 October 2016, by Jason M

    I am working on a project that decodes a recorded video on Android into yuv420sp frames for processing, targeting API 17. I have already implemented this on Windows and iOS, using ffmpeg and the native API for decoding respectively. Now I want to reuse the same C++ processing code on Android API 17, but I have run into difficulty. There is a straightforward API for this in API 21, but sadly I have to keep the broader compatibility.

    At first, I tried to make the best use of the hardware decoding APIs (following MediaCodec get all frames from video and some others), but found that the data in the output buffer does not match what the format describes. For example, I have 1920x1080 frames decoded into yuv420sp, but found that the Y plane actually spans 1928x1080 (a stride-copy sketch for this case follows the question below). Since I had similar headaches with still image capturing that were never worked out (YUV image taken from takePicture not match defined format), I am not sure I will ever solve this.

    Later, I tried to go back to ffmpeg, but found most build guides either out of date (for ffmpeg 2.x) or a little hard to follow (such as http://writingminds.github.io/ffmpeg-android/, which does provide nice pre-built binaries but no instructions on using them; maybe I should download the sources and use the headers there?).

    Any suggestions?
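
    A rough sketch of one way to handle the padded rows described in this question, assuming the decoder reports a row stride of 1928 bytes for 1920x1080 yuv420sp (NV12) output; in practice the stride and plane offsets should be read from the decoder's output format rather than hard-coded, and slice-height padding may apply as well:

    // Hypothetical sketch: copy a padded yuv420sp/NV12 decoder output buffer
    // into the tightly packed layout the existing C++ processing code expects.
    #include <cstddef>
    #include <cstdint>
    #include <cstring>
    #include <vector>

    std::vector<uint8_t> unpad_nv12(const uint8_t *src,
                                    int width, int height, int stride) {
        std::vector<uint8_t> dst(static_cast<std::size_t>(width) * height * 3 / 2);
        uint8_t *out = dst.data();

        // Y plane: 'height' rows, each padded out to 'stride' bytes.
        for (int y = 0; y < height; ++y) {
            std::memcpy(out, src + static_cast<std::size_t>(y) * stride, width);
            out += width;
        }
        // Interleaved UV plane: height/2 rows with the same row stride
        // (assumed here to start right after the Y plane).
        const uint8_t *uv = src + static_cast<std::size_t>(stride) * height;
        for (int y = 0; y < height / 2; ++y) {
            std::memcpy(out, uv + static_cast<std::size_t>(y) * stride, width);
            out += width;
        }
        return dst;
    }

    // e.g. unpad_nv12(outputBuffer, 1920, 1080, 1928) for the case described above.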

  • ffmpeg udp/tcp stream receive frame not same as sent

    21 November 2013, by vivienlwt

    I am streaming a video from a Raspberry Pi using the command:

    ffmpeg -re -threads 2 -i sample_video.m2v -f mpegts - | \
    ffmpeg -f mpegts -i - -c copy -f mpegts udp://192.168.1.100:12345

    The remote PC at 192.168.1.100 uses the ffmpeg library to listen to the input stream. For example:

    informat = ffmpeg::av_find_input_format("mpegts");
    avformat_open_input(&pFormatCtx, "udp://192.168.1.100:12345", informat, options);

    However, when I compute the hash value of each decoded frame on the two sides (i.e. the Raspberry Pi and the PC), they DON'T MATCH at all. A weird thing is that among 2000 frames there are in total 10 frames whose hash values are the same on the sender and the receiver side. The match result looks like this:

    00000....00011000...00011110000...000

    where 0 indicates a non-match and 1 indicates a match. The matched frames appeared 2-6 in sequence and appeared rarely, while most of the other frames have different hash values.

    The hash is computed on the frame data buffer extracted using avpicture_layout() (a sketch of this step appears at the end of this question). On the Pi side, I just stream the video to a local port, and there is a local process using the same code to decode and hash the frames:

    ffmpeg -re -threads 2 -i sample_video.m2v -f mpegts - | \
    ffmpeg -f mpegts -i - -c copy -f mpegts udp://localhost:12345
    ...

    The streaming source, the Raspberry Pi, is connected directly to the PC with a cable. I don't think it is a packet-loss problem. First, I reran the same process several times and the hash values of the received frames were the same each time (otherwise the results should differ, because packet loss is probabilistic). Secondly, I even tried streaming to tcp://192.168.1.100:12345 (with "tcp://192.168.1.100:12345?listen" on the PC), and the received frame hashes are still the same, and still different from the hash results on the Pi.

    So, does anyone know why streaming to a remote address yields different decoded frames? Maybe I am missing some details.

    Thanks in advance!!
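
    For reference, a minimal sketch of the per-frame hashing step described in this question, assuming AV_PIX_FMT_YUV420P frames and the 2013-era avpicture_*() API; the decode loop and error handling are omitted, and FNV-1a is only a stand-in for whatever hash function was actually used:

    // Hypothetical sketch: pack a decoded YUV420P AVFrame into a contiguous
    // buffer with avpicture_layout() and hash the resulting bytes.
    extern "C" {
    #include <libavcodec/avcodec.h>
    }
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    static uint64_t fnv1a(const uint8_t *data, std::size_t len) {
        uint64_t h = 1469598103934665603ULL;   // FNV-1a offset basis
        for (std::size_t i = 0; i < len; ++i) {
            h ^= data[i];
            h *= 1099511628211ULL;             // FNV-1a prime
        }
        return h;
    }

    // Hash one decoded frame (assumed to be AV_PIX_FMT_YUV420P).
    uint64_t hash_frame(const AVFrame *frame, int width, int height) {
        int size = avpicture_get_size(AV_PIX_FMT_YUV420P, width, height);
        std::vector<uint8_t> buf(static_cast<std::size_t>(size));
        // In these FFmpeg versions an AVFrame starts with the same data/linesize
        // arrays as an AVPicture, so the cast below was the usual idiom.
        avpicture_layout(reinterpret_cast<const AVPicture *>(frame),
                         AV_PIX_FMT_YUV420P, width, height, buf.data(), size);
        return fnv1a(buf.data(), buf.size());
    }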