Advanced search

Media (1)

Keyword: Tags / ogg

Other articles (111)

  • Permissions overridden by plugins

    27 April 2010, by Mediaspip core

    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Submitting bugs and patches

    10 April 2011

    Unfortunately, no software is ever perfect...
    If you think you have found a bug, report it in our ticket system, taking care to include the relevant information: the exact type and version of the browser in which the anomaly occurs; as precise an explanation as possible of the problem encountered; if possible, the steps to reproduce the problem; a link to the site / page in question.
    If you think you have fixed the bug yourself (...)

On other sites (11560)

  • ffmpeg command line for capturing (and recording) audio and video in 720p from decklink card using Windows 7

    6 January 2018, by Huron

    I am trying to capture audio and video from a Blackmagic DeckLink capture card at 720p using Windows 7, but I can't seem to get the ffmpeg command-line settings right.

    ffmpeg -list_devices true -f dshow -i dummy

    [dshow @ 02457a60] DirectShow video devices
    [dshow @ 02457a60]  "Blackmagic WDM Capture"
    [dshow @ 02457a60]  "Decklink Video Capture"
    [dshow @ 02457a60] DirectShow audio devices
    [dshow @ 02457a60]  "Decklink Audio Capture"

    ffmpeg -list_options true -f dshow -i video="Decklink Video Capture"

    [dshow @ 03c2ea20] DirectShow video device options
    [dshow @ 03c2ea20]  Pin "Capture"
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x486 fps=29.97 max s=720x486 fps=29.97
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x486 fps=23.976 max s=720x486 fps=23.976
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x576 fps=25 max s=720x576 fps=25
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x486 fps=59.9402 max s=720x486 fps=59.9402
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=720x576 fps=50 max s=720x576 fps=50
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=23.976 max s=1920x1080 fps=23.976
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=24 max s=1920x1080 fps=24
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=25 max s=1920x1080 fps=25
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=29.97 max s=1920x1080 fps=29.97
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1920x1080 fps=30 max s=1920x1080 fps=30
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1280x720 fps=50 max s=1280x720 fps=50
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1280x720 fps=59.9402 max s=1280x720 fps=59.9402
    [dshow @ 03c2ea20]   pixel_format=uyvy422  min s=1280x720 fps=60.0002 max s=1280x720 fps=60.0002

    ffmpeg -list_options true -f dshow -i audio="Decklink Audio Capture"

    [dshow @ 047fea20] DirectShow audio device options
    [dshow @ 047fea20]  Pin "Capture"
    [dshow @ 047fea20]   min ch=1 bits=16 rate= 48000 max ch=1 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=2 bits=16 rate= 48000 max ch=2 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=4 bits=16 rate= 48000 max ch=4 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=6 bits=16 rate= 48000 max ch=6 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=8 bits=16 rate= 48000 max ch=8 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=10 bits=16 rate= 48000 max ch=10 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=12 bits=16 rate= 48000 max ch=12 bits=16 rate= 48000
    [dshow @ 047fea20]   min ch=16 bits=16 rate= 48000 max ch=16 bits=16 rate= 48000

    This is the stream information for my current video/audio source, connected to the DeckLink card's HDMI port:

    Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422(tv), 1280x720, 59.94 tbr, 10000k tbn, 59.94 tbc
    Stream #0:1: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s

    Ultimately I will need to display this video on screen, and simultaneously be able to start and stop recording while the preview keeps playing.

    My idea was to use ffmpeg to capture the video and audio signal and transmit it to a network stream (e.g. 127.0.0.1:6666), then use VLC to display the stream (the preview), and ultimately start and/or stop another ffmpeg to save that same stream to disk.

    In my mind this works, but I am no audio/video expert, so if anyone with more experience could help out, I would appreciate it.
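    That plan's start/stop part can be driven from Python with subprocess. The sketch below is an assumption, not a tested setup: the stream URL mirrors the command further down, and -c copy simply remuxes the incoming MPEG-TS to disk. One caveat: with plain UDP unicast, typically only one listener can receive the stream, so previewing in VLC while a second ffmpeg records may require a multicast address or sending the stream twice.

    ```python
    import subprocess

    def build_record_cmd(stream_url, outfile):
        """argv for an ffmpeg process that saves the incoming MPEG-TS stream."""
        return ["ffmpeg", "-y", "-i", stream_url, "-c", "copy", outfile]

    def start_recording(stream_url, outfile):
        """Launch the recorder in the background and return the process handle."""
        return subprocess.Popen(build_record_cmd(stream_url, outfile))

    def stop_recording(proc):
        proc.terminate()   # ffmpeg finalizes the output file on termination
        proc.wait()

    # Example (requires ffmpeg and a live stream on the URL):
    # rec = start_recording("udp://127.0.0.1:6666", "capture.ts")
    # ...
    # stop_recording(rec)
    ```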

    Update:

    I have been able to display the video using ffplay, with the following command:

    ffplay -f dshow -video_size 1280x720 -rtbufsize 702000k -framerate 60 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -threads 2

    Next step is streaming it so I can view the stream (preview) with VLC.

    Tried to use this command:

    ffmpeg -f dshow -video_size 1280x720 -rtbufsize 702000k -framerate 60 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -threads 2 -f mpegts rtp://127.0.0.1:6666?pkt_size=188?buffer_size=65535

    Which does not give any errors, so it appears to work. But when I try to open the stream in VLC I get the following error:

    SDP required:
    A description in SDP format is required to receive the RTP stream. Note that rtp:// URIs cannot work with dynamic RTP payload format (65).

    After a bit of reading, it seems I should not be streaming to rtp:// but rather to udp://.

    The command became:

    ffmpeg -f dshow -video_size 1280x720 -rtbufsize 702000k -framerate 60 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -threads 2 -f mpegts udp://127.0.0.1:6666?pkt_size=188?buffer_size=65535

    And when I try to open it now in VLC I get no error, no warning, but also no video.

    Time for some more reading.
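    When VLC shows neither an error nor video, one quick check is whether any data is arriving on the port at all. The sketch below (a hypothetical diagnostic, not from the question) binds the UDP port while VLC is not holding it and looks for the MPEG-TS sync byte: every 188-byte TS packet starts with 0x47, so a datagram sent with pkt_size=188 should too.

    ```python
    import socket

    def is_ts_packet(data):
        """MPEG-TS packets are 188 bytes long and start with the 0x47 sync byte."""
        return len(data) % 188 == 0 and data[:1] == b"\x47"

    def probe_ts(host="127.0.0.1", port=6666, timeout=5.0):
        """Bind the port ffmpeg sends to and inspect one datagram."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(timeout)
        sock.bind((host, port))
        try:
            data, _ = sock.recvfrom(65536)
        except socket.timeout:
            return "no data"            # nothing is arriving on the port
        finally:
            sock.close()
        return "ts" if is_ts_packet(data) else "not ts"
    ```

    If this reports "no data", the sending ffmpeg is the problem; if it reports "ts", the stream is fine and the issue is on the VLC side.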

  • FFMPEG segment records a minute worth of data but metadata shows 0.1s length video

    1 February 2018, by Soft Strategy

    I'm recording a stream from a Live555 restream of an RTSP camera. I record in chunks of one-minute videos. This is the setup:

    RTSP Camera -> Live555 Server -> Recorder

    Without the Live555 restreamer server, I have always recorded the RTSP camera stream with ffmpeg with no issues. This is the command:

    ffmpeg -rtsp_transport tcp -y -i rtsp://IPcamera/url/to/video -c copy -f segment -segment_time 60 -reset_timestamps 1 -segment_atclocktime 1 -strftime 1 /video/%Y-%m-%d_%H-%M-%Stest.ts

    I record in a ts container to mitigate framerate inaccuracies.

    When I capture the proxy stream, after a few hours with no errors something happens: the stream starts recording files that take the usual space on disk (around 5-10 MB each), but when I check the metadata, the duration is 0.16 seconds. If you watch the video it is of course far too fast, but it shows the clock hard-coded into the image running through the whole minute, so the data is there.

    The command I use with the restream is the same, with just the input changed:

    ffmpeg -rtsp_transport tcp -y -i rtsp://IPstreamServer/proxyStream-1 -c copy -f segment -segment_time 60 -reset_timestamps 1 -segment_atclocktime 1 -strftime 1 /video/%Y-%m-%d_%H-%M-%Stest.ts

    It's like ffmpeg is messing up the metadata. With the TCP setting, no packets are lost.

    What could be the problem?

    Recording on Debian 8.8, ffmpeg version 3.2.5-1 bpo8+1.
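    The damaged segments could at least be flagged automatically. The sketch below is an assumption on my part (it relies on ffprobe from the same ffmpeg install): it asks ffprobe for each segment's container duration and treats anything far below the expected 60 seconds, like the 0.16-second files described above, as broken.

    ```python
    import subprocess

    def segment_duration(path):
        """Ask ffprobe for the container duration (in seconds) of one segment."""
        out = subprocess.check_output([
            "ffprobe", "-v", "error",
            "-show_entries", "format=duration",
            "-of", "default=noprint_wrappers=1:nokey=1",
            path,
        ])
        return float(out)

    def looks_broken(duration, expected=60.0, tolerance=0.5):
        # A one-minute segment whose metadata claims ~0.16 s is clearly damaged.
        return duration < expected * tolerance

    # Example sweep over the recorded segments:
    # import glob
    # for f in sorted(glob.glob("/video/*.ts")):
    #     if looks_broken(segment_duration(f)):
    #         print("bad segment:", f)
    ```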

    EDIT: I found that someone had a problem receiving metadata from the server; could that be the problem? http://lists.live555.com/pipermail/live-devel/2013-May/017033.html

  • Live video stream on server (PC) from images sent by robot through UDP

    3 February 2018, by Richard Knop

    Hmm. I found this, which seems promising:

    http://sourceforge.net/projects/mjpg-streamer/


    OK. I will try to explain what I am trying to do clearly and in detail.

    I have a small humanoid robot with a camera and wifi stick (this is the robot). The robot's wifi stick has an average transfer rate of 1769 KB/s. The robot has a 500 MHz CPU and 256 MB RAM, so it is not enough for any serious computation (moreover, there are already a couple of modules running on the robot for motion, vision, sonar, speech, etc.).

    I have a PC from which I control the robot. I am trying to have the robot walk around the room while I watch, on the PC, a live video stream of what the robot sees.

    What I already have working: the robot walks as I want it to and takes images with the camera. The images are sent over UDP to the PC, where I receive them (I have verified this by saving the incoming images to disk).

    The camera returns 640 x 480 px images in the YUV442 colorspace. I am sending the images with lossy compression (JPEG) because I am trying to get the best possible FPS on the PC. I do the JPEG compression on the robot with the PIL library.

    My questions:

    1. Could somebody please give me some ideas on how to convert the incoming JPEG images into a live video stream? I understand I will need a video encoder for that. Which video encoder do you recommend? FFMPEG or something else? I am very new to video streaming, so I want to know what is best for this task. I'd prefer to use Python to write this, so I would prefer a video encoder or library with a Python API. But I guess if the library has a good command-line API, it doesn't have to be in Python.

    2. What is the best FPS I could get out of this, given the 1769 KB/s average wifi transfer rate and the dimensions of the images? Should I use a different compression than JPEG?

    3. I will be happy to see any code examples. Links to articles explaining how to do this would be fine, too.
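    For question 2, a back-of-the-envelope estimate is possible: the wifi link caps throughput, so the ceiling on FPS is roughly bandwidth divided by average frame size. The per-frame figures below are assumptions on my part (a 640x480 JPEG at moderate quality often lands in the 30-60 KB range, but real sizes depend heavily on scene content and quality setting).

    ```python
    def max_fps(bandwidth_kb_s, frame_kb):
        """Upper bound on frames per second given link bandwidth and frame size."""
        return bandwidth_kb_s / frame_kb

    print(max_fps(1769, 60))  # ~29 fps if frames average 60 KB
    print(max_fps(1769, 30))  # ~59 fps if frames average 30 KB
    ```

    So JPEG over this link is not the bottleneck for a usable frame rate, although a video codec with inter-frame compression (e.g. what ffmpeg's encoders provide) would spend the bandwidth far more efficiently.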

    Some code samples. Here is how I send JPEG images from the robot to the PC (a shortened, simplified snippet). This runs on the robot:

    # lots of code here

    from socket import socket, AF_INET, SOCK_DGRAM
    import StringIO
    import Image  # PIL; camProxy, nameId and addr come from the elided code

    UDPSock = socket(AF_INET, SOCK_DGRAM)

    while 1:
        image = camProxy.getImageLocal(nameId)
        size = (image[0], image[1])    # width, height
        data = image[6]                # raw pixel buffer
        im = Image.fromstring("YCbCr", size, data)
        s = StringIO.StringIO()
        im.save(s, "JPEG")             # compress in memory

        UDPSock.sendto(s.getvalue(), addr)

        camProxy.releaseImage(nameId)

    UDPSock.close()

    # lots of code here

    Here is how I receive the images on the PC. This runs on the PC:

    # lots of code here

    from socket import socket, AF_INET, SOCK_DGRAM

    UDPSock = socket(AF_INET, SOCK_DGRAM)
    UDPSock.bind(addr)    # addr and buf come from the elided code

    while 1:
        data, addr = UDPSock.recvfrom(buf)
        # here I need to create a stream from the data
        # which contains a JPEG image

    UDPSock.close()

    # lots of code here
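    One way to fill the gap marked in the receive loop above: if each datagram holds one complete JPEG (true here since the frames are small, though IP fragmentation of large frames would break this assumption), it can be decoded straight from memory with PIL and then displayed or handed to an encoder. This sketch uses Python 3's io.BytesIO in place of the Python 2 StringIO from the question.

    ```python
    import io
    from PIL import Image

    def decode_frame(datagram):
        """Turn the raw bytes of one UDP datagram back into a PIL image."""
        return Image.open(io.BytesIO(datagram))

    # inside the receive loop:
    #   data, addr = UDPSock.recvfrom(buf)
    #   frame = decode_frame(data)
    #   # display frame, or pipe its bytes into a video encoder
    ```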