Other articles (67)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • MediaSPIP Core: Configuration

    9 November 2010

    By default, MediaSPIP Core provides three separate configuration pages (these pages rely on the CFG configuration plugin): a page for the general configuration of the skeleton; a page for configuring the site's home page; and a page for configuring the sections.
    It also provides an extra page, shown only when certain plugins are enabled, for controlling their display and their specific features (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects or individuals; rapid deployment of multiple unique sites; and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (9993)

  • GoPro (MP4) video timestamp sync with precision of milliseconds

    3 February 2021, by Raphael Ottoni

    I need your help with a data-sync problem... I am currently trying to sync my GoPro video with real-world time (i.e. my notebook's clock). I managed to sync the date and time of my notebook and my GoPro 3+ Black perfectly. The problem is that when the GoPro saves the files to disk, it rounds the milliseconds in creation_time (the milliseconds are always 000000), which makes a perfect sync impossible. Attached is a picture of the metadata (extracted by ffprobe) of the MP4 video.

    My question is: what do I have to do so that the GoPro actually saves creation_time with millisecond precision?

    Another small question: looking at the attached figure, we see the "timecode", which is time-synchronization data in the format hours:minutes:seconds:frames. I was thinking that I could use the "frames" value to compute the missing milliseconds. In this attachment, for example, the frame value is 36, meaning that recording started on the 36th frame of that second (in this video: 60 fps). Something like 1000 / 60 * 36, which is 600 milliseconds, so the actual creation_time of this video would be 2017-07-19T18:10:34.600.

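    The frame-to-milliseconds arithmetic described above can be sketched in Python (a hedged example: the function names are mine, while the frame and fps values come from the question):

```python
def frames_to_ms(frame: int, fps: float) -> int:
    """Convert the frames field of a timecode (hours:minutes:seconds:frames)
    into the millisecond offset within the current second."""
    return round(1000.0 / fps * frame)

def creation_time_with_ms(base: str, frame: int, fps: float) -> str:
    # Append the reconstructed millisecond component to the whole-second
    # creation_time reported by ffprobe.
    return f"{base}.{frames_to_ms(frame, fps):03d}"

# The example from the question: frame 36 of a 60 fps video.
print(frames_to_ms(36, 60))                                  # 600
print(creation_time_with_ms("2017-07-19T18:10:34", 36, 60))  # 2017-07-19T18:10:34.600
```

    Note that this only recovers sub-second precision if the timecode's frame counter really refers to the same instant as creation_time, which is an assumption worth verifying against a clock visible in the footage.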
    Is this logic right? It didn't work, and I don't know what else to do.

    P.S.: I need this kind of time precision because I will sync the video frames with external sensor data recorded at 11 Hz.

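    For the 11 Hz sensor mentioned in the P.S., once a frame's absolute time is known, picking the matching sensor sample is a small rounding step. A sketch that assumes both clocks share the same start instant (the function name is hypothetical):

```python
def nearest_sample_index(frame_time_s: float, sensor_rate_hz: float = 11.0) -> int:
    """Return the index of the sensor sample recorded closest to a frame time
    (both measured in seconds from the same start instant)."""
    return round(frame_time_s * sensor_rate_hz)

# Frame 120 of a 60 fps video sits at t = 2.0 s; at 11 Hz that is sample 22.
print(nearest_sample_index(120 / 60))  # 22
```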
    Please help.

    [screenshot: the ffprobe metadata of the MP4 video]

    Update:

    I forgot to mention: even if you check the original raw file on the GoPro's SD card, using "stats" to read the creation time (see attachment), it still has the same timestamp without milliseconds.

    [screenshot: the raw file's creation time on the card, also without milliseconds]

  • How to add an ffmpeg complex filter and concat multiple sources

    6 December 2022, by danunafig

    Trying to get all the parts together, with no luck. I have an MP4 and would like to stitch together N chunks with xfade transitions: seconds 0-5, transition, seconds 10-15, transition, seconds 20-25, and so on.

    It works perfectly with 2 videos, but with 3 or more it doesn't.

    #1 didn't work: the output video is only 10 seconds long.

// Attempt #1 as posted, made self-contained: the require and the source
// path were not in the original snippet; the path is a placeholder.
const ffmpeg = require('fluent-ffmpeg');
const sourceAssetPath = './data/source.mp4';

ffmpeg()
  .input(sourceAssetPath).inputOptions(['-ss 0', '-t 5'])
  .input(sourceAssetPath).inputOptions(['-ss 10', '-t 5'])
  .input(sourceAssetPath).inputOptions(['-ss 20', '-t 5'])
  .complexFilter([
    {
      filter: 'xfade',
      options: { transition: 'hblur', duration: 0.5, offset: 4.75 },
      inputs: ['0:v', '1:v'],
      outputs: ['outv1']
    },
    {
      filter: 'xfade',
      options: { transition: 'hblur', duration: 0.5, offset: 9.75 },
      inputs: ['outv1', '2:v'],
      outputs: '[video]'
    }
  ])
  .map('video')
  .output('./data/out_transition.mp4')
  .videoCodec('libx264')
  .run();

    #2 didn't work either (the length is 15 seconds, and the 3rd chunk plays as the 2nd, not the 3rd).

// Attempt #2 as posted, made self-contained (same added require and path).
const ffmpeg = require('fluent-ffmpeg');
const sourceAssetPath = './data/source.mp4';

ffmpeg()
  .input(sourceAssetPath).inputOptions(['-ss 0', '-t 5'])
  .input(sourceAssetPath).inputOptions(['-ss 10', '-t 5'])
  .input(sourceAssetPath).inputOptions(['-ss 20', '-t 5'])
  .complexFilter([
    {
      filter: 'xfade',
      options: { transition: 'hblur', duration: 0.5, offset: 4.75 },
      inputs: ['0:v', '1:v'],
      outputs: ['outv1']
    },
    {
      filter: 'xfade',
      options: { transition: 'hblur', duration: 0.5, offset: 9.75 },
      inputs: ['1:v', '2:v'],
      outputs: '[outv2]'
    },
    '[outv1][outv2] concat=n=2:v=1:a=0 [video]'
  ])
  .map('video')
  .output('./data/out_transition.mp4')
  .videoCodec('libx264')
  .run();

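    A likely cause, offered as an assumption rather than a verified fix: when xfade filters are chained, each offset is measured on the timeline of that filter's first input, which for the second transition is the output of the previous xfade, and every transition shortens the running total by its duration. With three 5-second chunks and 0.5-second transitions, the offsets would therefore be 4.5 and 9.0 rather than 4.75 and 9.75. The arithmetic can be sketched as:

```python
def xfade_offsets(clip_durations, transition=0.5):
    """Compute the `offset` option for each chained xfade transition.

    The k-th offset is measured on the timeline of the previous xfade's
    output, and each transition overlaps the clips by `transition` seconds.
    """
    offsets = []
    elapsed = clip_durations[0]               # duration of the chain so far
    for clip in clip_durations[1:]:
        offsets.append(elapsed - transition)  # start fading just before the end
        elapsed += clip - transition          # the overlap shortens the total
    return offsets

print(xfade_offsets([5, 5, 5]))  # [4.5, 9.0]
```

    With corrected offsets, attempt #1 (feeding outv1 into the second xfade) is the structurally sound variant; the concat in attempt #2 would include the middle chunk twice, which may explain the symptom described.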
  • How do I use ffmpeg to deliver the video ?

    19 June 2014, by user3750991

    Our goal:

    We want to install a camera on our ARM-based device, capture real-time video, and send it to ffserver, so that we can watch the video at http://172.16.51.29:8090/stat.html from another computer at the same time.

    Device IP address: 172.16.51.29
    Computer IP address: 172.16.51.26

    1. I start the server on the device:

    ffserver -f /etc/ffserver.conf

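    Step 1 presupposes that /etc/ffserver.conf already defines a feed and matching streams. A minimal hypothetical sketch of such a file (every name, size, and bitrate here is an assumption, not taken from the question):

```
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 100

<Feed cam1.ffm>
File /tmp/cam1.ffm
FileMaxSize 5M
</Feed>

<Stream cam1.asf>
Feed cam1.ffm
Format asf
VideoFrameRate 15
VideoSize 320x240
VideoBitRate 256
NoAudio
</Stream>

<Stream stat.html>
Format status
</Stream>
```

    The "Format status" stream is what serves the /stat.html page mentioned above; the encoder error in step 3 typically means the parameters ffmpeg sends into cam1.ffm cannot satisfy what the matching <Stream> section demands.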
    2. The camera starts capturing video on the device:

    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 http://127.0.0.1:8090/cam1.ffm

    3. I got this message:

    Error while opening encoder for output stream #0.0 - maybe incorrect
    parameters such as bit_rate, rate, width or height

    4. Test:

    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 cam1.asf

    The command above works perfectly and produces a video named cam1.asf.

    5. I try to fix the incorrect parameters (bit_rate, rate, width, height):

    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 -b 100k http://127.0.0.1:8090/cam1.ffm

    But the command above produces the messages shown below:

    Input #0, video4linux2, from '/dev/video0':
    Duration: N/A, start: 24300.069612, bitrate: 36864 kb/s
    Stream #0.0: Video: rawvideo, yuyv422, 320x240, 36864 kb/s, 30 tbr, 1000k tbn, 30 tbc
    Sat Jan 1 07:09:07 2000 127.0.0.1 - - [GET] "/cam1.ffm HTTP/1.1" 200 4149
    [buffer @ 0x2f1a0] w:320 h:240 pixfmt:yuyv422
    [avsink @ 0x2f7e0] auto-inserting filter 'auto-inserted scaler 0' between the filter 'src' and the filter 'out'
    [scale @ 0x2ea60] w:320 h:240 fmt:yuyv422 -> w:320 h:240 fmt:yuv420p flags:0x4
    Sat Jan 1 07:09:07 2000 127.0.0.1 - - [POST] "/cam1.ffm HTTP/1.1" 200 0
    Segmentation fault

    What should I do to fix this issue? Thanks.