Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (29)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

  • Contributing to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to sign up to the translators' mailing list to ask for more information.
    Currently MediaSPIP is only available in French and (...)

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats:
    images: png, gif, jpg, bmp and more
    audio: MP3, Ogg, Wav and more
    video: AVI, MP4, OGV, mpg, mov, wmv and more
    text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (4400)

  • Could not find com.arthenica:ffmpeg-kit-full:6.0-2

    23 June, by gabocalero

    I started receiving this message when building my project:

    


    > Could not find com.arthenica:ffmpeg-kit-full:6.0-2.
  Searched in the following locations:
    - https://dl.google.com/dl/android/maven2/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
    - https://repo.maven.apache.org/maven2/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
    - https://jcenter.bintray.com/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
    - https://jitpack.io/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
    - https://pkgs.dev.azure.com/MicrosoftDeviceSDK/DuoSDK-Public/_packaging/Duo-SDK-Feed/maven/v1/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
    - https://oss.sonatype.org/content/repositories/snapshots/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
  Required by:
      project :presentation > project :domain


    


    These are the project's repositories

    


    allprojects {
       repositories {
           google()
           mavenCentral()
           jcenter()
           maven { url 'https://jitpack.io' }
           maven { url "https://oss.sonatype.org/content/repositories/snapshots" }
       }
    }


    


    And this is the dependency I'm adding

    


        implementation(libs.arthenica.ffmpeg.full)


    


    This is my libs.version.toml

    


    ffmpeg = "6.0-2"
arthenica-ffmpeg-full = {  group = "com.arthenica", name = "ffmpeg-kit-full", version.ref = "ffmpeg" }


    


    As far as I can see on the project's GitHub page, it will not be maintained anymore; ffmpeg-kit was archived on April 21, 2025.
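
    A possible stop-gap, assuming the prebuilt ffmpeg-kit-full AAR can still be downloaded from the project's GitHub releases, is to vendor it locally instead of resolving it from a remote repository. A minimal sketch (the libs/ path, the AAR filename and the smart-exception-java version are assumptions):

    dependencies {
        // Local AAR dropped into app/libs/ (assumed path and filename)
        implementation files('libs/ffmpeg-kit-full-6.0-2.aar')
        // ffmpeg-kit normally pulls in com.arthenica:smart-exception-java
        // transitively; with a local AAR it must be declared explicitly
        implementation 'com.arthenica:smart-exception-java:0.2.1'
    }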

    


    In the short term, do you know of any other repository that still serves this dependency?

    


    Alternatively, do you know of another project that could replace the Arthenica ffmpeg-kit-full dependency?

    


    Thank you very much

    


  • v4l2loopback+ffmpeg input for uvc gadget

    13 May, by Mosi

    I'm trying to use an MP4 video file as the input for a UVC Gadget setup on my Raspberry Pi 4 Model B, but I'm running into an issue when streaming through V4L2.

    


    Goal:

    


    To emulate a webcam that streams a looping MP4 video file to a Windows 11 host.

    


    My setup:

    


      

    • Hardware: Raspberry Pi 4 Model B
    • OS: Raspberry Pi OS Lite 64-bit (2025-05-06-raspios-bookworm-arm64-lite)
    • Kernel: 6.12.25+rpt-rpi-v8
    • Host System: Windows 11
    • UVC Gadget version: v0.3.0


    Workflow:

    


    [MP4 Video] → [FFmpeg] → [V4L2 Loopback] → [UVC Gadget] → Windows sees virtual webcam
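
    For completeness, the loopback device is assumed to be created with v4l2loopback along these lines (the module options below are an illustrative sketch, not the exact setup):

    # Create /dev/video3 with v4l2loopback; exclusive_caps=1 makes the node
    # advertise itself as a plain capture device once a producer is writing to it
    sudo modprobe v4l2loopback video_nr=3 card_label="VirtualCam" exclusive_caps=1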


    


    What works:

    


    The UVC Gadget works perfectly when I use a real webcam as the source (e.g., /dev/video0). Windows detects the virtual webcam and displays a smooth video stream.
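
    For reference, that working baseline is assumed to be the same invocation as further below, only pointed at the real camera node:

    sudo uvc-gadget -d /dev/video0 uvc.0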

    


    The problem:

    


    When I try to use an MP4 video file through FFmpeg and send it to the loopback device (/dev/video3), the UVC Gadget fails with the error shown in the output below:

    



    


    Command I'm using:

    


    ffmpeg -re -stream_loop -1 -i input.mp4 -vf scale=640:480 \
  -c:v rawvideo -pix_fmt yuyv422 -r 30 -f v4l2 /dev/video3


    


    Then I run:

    


    sudo uvc-gadget -d /dev/video3 uvc.0


    


    Output:

    


    bRequestType 21 bRequest 01 wValue 0200 wIndex 0001 wLength 0022
streaming request (req SET_CUR cs 02)
setting commit control, length = 34
Setting format to 0x56595559 640x480
=== Setting frame rate to 30 fps
Starting video stream.
--> [At this point I open the camera on the Windows host]
/dev/video3: 2 buffers requested.
Failed to export buffer 0.
Failed to export buffers on source: Inappropriate ioctl for device (25)
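
    One way to narrow this down is to inspect what the loopback device actually advertises while the FFmpeg command above is feeding it, e.g. with v4l2-ctl (a sketch, assuming v4l-utils is installed):

    v4l2-ctl -d /dev/video3 --all               # driver, capabilities, current format
    v4l2-ctl -d /dev/video3 --list-formats-ext  # pixel formats and resolutions advertised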


    



    


    Things I've tried:

    


      

    • Multiple FFmpeg formats, resolutions, and pixel formats
    • Various FFmpeg buffer and framerate tweaks
    • Different UVC Gadget versions
    • Related GitHub projects (showcamera, etc.)
    • Older Raspberry Pi OS versions


    Most guides and GitHub projects I found are outdated (5+ years old), and newer methods seem undocumented or incompatible with current kernel/UVC gadget tools.

    



    


    My question:

    


    How can I stream an MP4 file as a virtual webcam using UVC Gadget without getting ioctl errors?
Is there a proper way to set up FFmpeg and loopback devices so that UVC Gadget can read the stream correctly?

    


    Any modern working example or tips would be much appreciated. Thanks in advance!

    


  • FFMPEG send RTP audio at 8k bytes/sec [closed]

    10 May, by Muzza

    I'm trying to use FFmpeg to mimic a device that transmits G711U audio over UDP/RTP at 8k bytes per second.
The device I'm mimicking sends RTP packets every 20 ms with a 160-byte payload.

    


    I've had limited success using the following command

    


    ffmpeg -f dshow -i audio="Microphone (Realtek(R) Audio)" -ac 1 -ar 8000 -ab 8 -acodec pcm_mulaw -f rtp rtp://127.0.0.1:12345?pkt_size=160


    


    This sends G711U-encoded audio in 160-byte chunks, but it streams at 64 kB/s rather than the 8 kB/s my device expects, so the device errors out.
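
    For reference, the target rate follows directly from the packet timing quoted above: 160 bytes every 20 ms is 160 / 0.020 = 8000 bytes/s (8 kB/s), which is the same rate expressed as 64 kbit/s.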

    


    Any ideas would be massively appreciated!

    


    Thank you

    


    Log from FFmpeg:

    


    >ffmpeg -f dshow -i audio="Microphone (Realtek(R) Audio)" -ac 1 -ar 8000 -ab 8 -acodec pcm_mulaw -f rtp rtp://127.0.0.1:12345?pkt_size=160
ffmpeg version 2025-04-23-git-25b0a8e295-essentials_build-www.gyan.dev Copyright (c) 2000-2025 the FFmpeg developers
  built with gcc 14.2.0 (Rev3, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
  libavutil      60.  2.100 / 60.  2.100
  libavcodec     62.  0.101 / 62.  0.101
  libavformat    62.  0.100 / 62.  0.100
  libavdevice    62.  0.100 / 62.  0.100
  libavfilter    11.  0.100 / 11.  0.100
  libswscale      9.  0.100 /  9.  0.100
  libswresample   6.  0.100 /  6.  0.100
  libpostproc    59.  1.100 / 59.  1.100
[aist#0:0/pcm_s16le @ 00000198256b73c0] Guessed Channel Layout: stereo
Input #0, dshow, from 'audio=Microphone (Realtek(R) Audio)':
  Duration: N/A, start: 135470.702000, bitrate: 1411 kb/s
  Stream #0:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s, Start-Time 135470.702s
Stream mapping:
  Stream #0:0 -> #0:0 (pcm_s16le (native) -> pcm_mulaw (native))
Press [q] to stop, [?] for help
[pcm_mulaw @ 00000198256cf240] Bitrate 8 is extremely low, maybe you mean 8k
Output #0, rtp, to 'rtp://127.0.0.1:12345?pkt_size=160':
  Metadata:
    encoder         : Lavf62.0.100
  Stream #0:0: Audio: pcm_mulaw, 8000 Hz, mono, s16 (8 bit), 64 kb/s
    Metadata:
      encoder         : Lavc62.0.101 pcm_mulaw
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 62.0.100
m=audio 12345 RTP/AVP 0
b=AS:64

[out#0/rtp @ 00000198256cdd00] video:0KiB audio:973KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: 8.467470%
size=    1055KiB time=00:02:04.51 bitrate=  69.4kbits/s speed=   1x
Exiting normally, received signal 2.


    


    Wireshark:
Wireshark Log

    


    Shows packets being sent every 0.20ms