Other articles (105)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to Ogv and WebM (supported by HTML5) and MP4 (supported by HTML5 and Flash).
    Audio files are encoded to Ogg (supported by HTML5) and MP3 (supported by HTML5 and Flash); a stand-alone sketch of these conversions is given after this list.
    Where possible, text is analyzed in order to retrieve the data needed by search engines, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • Possible deployments

    31 January 2010

    Two types of deployment are possible, depending on two aspects: the planned installation method (standalone or as a farm); and the expected number of daily encodings and the expected traffic.
    Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), and all of this must be taken into account. The system is therefore only feasible on one or more dedicated servers.
    Single-server version
    The single-server version consists of using only one (...)
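
    As a concrete illustration of the conversions described in the encoding article above, here is a minimal stand-alone sketch using ffmpeg directly; the input/output names are placeholders and the codec choices (libx264, libvpx, libtheora, libvorbis, libmp3lame) are assumptions, not MediaSPIP's actual encoding profiles:

    # Hypothetical equivalents of the web-friendly renditions listed above.
    ffmpeg -i upload.mov -c:v libx264 -c:a aac upload.mp4          # MP4 (HTML5/Flash)
    ffmpeg -i upload.mov -c:v libvpx -c:a libvorbis upload.webm    # WebM (HTML5)
    ffmpeg -i upload.mov -c:v libtheora -c:a libvorbis upload.ogv  # Ogv (HTML5)
    ffmpeg -i voice.wav -c:a libmp3lame voice.mp3                  # MP3 audio
    ffmpeg -i voice.wav -c:a libvorbis voice.ogg                   # Ogg audio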

On other sites (7180)

  • Video Overlay not returning to black once video stopped

    24 September 2020, by Matt Nelson

    I am combining multiple videos from a WebRTC call and aligning the "user-terminal" videos to the left and the "user-visitor" videos to the right. As the visitor feed can start and stop, there are multiple visitor videos, so I'm offsetting them by their timestamps.

    This is working, however crazy it looks!

    The one last issue I have is that when the first visitor video stops on the right, it shows the last frame of that video until the next video starts on the left. Can I have it return to the black background?

    Here is the command passed to ffmpeg:

    ffmpeg -y \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-terminal-1600953586531366-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-visitor-1600953592694430-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-visitor-1600953609873223-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-visitor-1600953628668227-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-visitor-1600953663905342-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-MasterTerminal-52350116-1600953681107272-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-visitor-1600953697832165-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-MasterTerminal-52350116-1600953723320364-audio.mjr.opus \
      -i /recordings/process/5f6c9c3/videoroom-5f6c9c3-user-visitor-1600953725307043-audio.mjr.opus \
      -filter_complex "[1:a]adelay=6163|6163[1adelay];[2:a]adelay=23341|23341[2adelay];[3:a]adelay=42136|42136[3adelay];[4:a]adelay=77373|77373[4adelay];[5:a]adelay=94575|94575[5adelay];[6:a]adelay=111300|111300[6adelay];[7:a]adelay=136788|136788[7adelay];[8:a]adelay=138775|138775[8adelay];[0:a][1adelay][2adelay][3adelay][4adelay][5adelay][6adelay][7adelay][8adelay]amix=inputs=9:duration=longest[a]" \
      -map "[a]" -ac 2 /recordings/process/5f6c9c3/5f6c9c3.mp3
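
    For the video side (not shown in the command above), ffmpeg's overlay filter has an eof_action option: with eof_action=pass the overlay stops being drawn once its input ends, instead of freezing on its last frame, so the black base shows through again; an enable='between(t,start,end)' expression gives the same effect for an explicit time window. A minimal sketch, assuming a black base canvas and one visitor clip (visitor.mp4, the 1280x720 size and the 6.163 s offset are placeholders):

    # Black base canvas plus one delayed overlay; the overlay disappears again
    # once visitor.mp4 ends because of eof_action=pass.
    ffmpeg -y \
      -f lavfi -i "color=c=black:s=1280x720:r=25" \
      -i visitor.mp4 \
      -filter_complex "[1:v]setpts=PTS-STARTPTS+6.163/TB[v1];[0:v][v1]overlay=x=640:y=0:eof_action=pass[v]" \
      -map "[v]" -t 60 -c:v libx264 out.mp4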

  • ffmpeg configure always returns "not found" when using sysroot and cross-prefix options - how to use them correctly?

    5 August 2014, by user2212461

    I am trying to build ffmpeg with a cross-compiler. When setting the configure flags, I get the following errors:

    WARNING: Compiler does not indicate floating-point ABI, guessing soft.
    build_it.sh: 29: build_it.sh: --sysroot=/home/android/android-ndk-r9/platforms/android-9/arch-arm/: not found
    build_it.sh: 32: build_it.sh.sh: --cross-prefix=/home/android/android-ndk-r9/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86/bin/arm-linux-androideabi-: not found

    The error message "not found" sounds like the paths don't exist, but they are all correct and do exist. How can I resolve this error?

    build_it.sh:

    NDK=/home/android/android-ndk-r9
    SYSROOT=$NDK/platforms/android-9/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86
    CPU=arm
    PREFIX=test
    ADDI_CFLAGS="-marm"
    platform="android-9"

    ./configure \
       --disable-asm \
       --enable-shared \
       --target-os=linux \
       --arch=arm \
       --enable-cross-compile \
       --disable-static \
       --sysroot=$SYSROOT \
       --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
       --extra-ldflags="$ADDI_LDFLAGS"
       --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
       --prefix=$PREFIX \

    ##configure fails here
    ...
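
    One thing worth noting about this symptom: "--sysroot=...: not found" means the shell executed that option line as a command of its own, i.e. the line continuation before it was broken (a missing trailing backslash, or a backslash followed by whitespace, as after the --extra-ldflags line above). A minimal corrected sketch of the same invocation, assuming the same NDK paths, with every continued line ending in a bare backslash and nothing after the last option:

    #!/bin/sh
    # Hypothetical corrected build_it.sh fragment: each continuation line must
    # end with a backslash and nothing else, otherwise the shell starts a new
    # command and reports "--sysroot=...: not found".
    NDK=/home/android/android-ndk-r9
    SYSROOT=$NDK/platforms/android-9/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86
    PREFIX=test
    ADDI_CFLAGS="-marm"

    ./configure \
       --disable-asm \
       --enable-shared \
       --disable-static \
       --target-os=linux \
       --arch=arm \
       --enable-cross-compile \
       --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
       --sysroot=$SYSROOT \
       --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
       --extra-ldflags="$ADDI_LDFLAGS" \
       --prefix=$PREFIX
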
  • Opencv VideoCapture not streaming RTSP link and returns "no frame!"

    6 September 2023, by Asadullah Naeem

    I am trying to stream my HikVision IP camera through Python. I am using cv2.VideoCapture("rtsp_link"), which works fine on my laptop, but when I try to run the same Python script (same OpenCV and FFmpeg versions) on other devices, it gives me the following error:

    Error:

    [h264 @ 000002124c7f9a40] missing picture in access unit with size 47
[h264 @ 000002124c7f9a40] no frame!

    I have so far tried to run this script on 5 computer devices, but it gives the same error. I am using the following Python script; my OpenCV version is 4.6.0.66 and my FFmpeg version is 2022-06-20-git-56419428a8-essentials_build-www.gyan.dev:

    Python Script:

    import cv2

# RTSP stream URL
rtsp_url = "rtsp://username:password@ip_address:port/Streaming/Channels/501"

# Open the RTSP stream
cap = cv2.VideoCapture(rtsp_url)

# Check if the stream was successfully opened
if not cap.isOpened():
    print("Failed to open RTSP stream.")
    exit()

# Read and display frames from the stream
while True:
    # Read a frame from the stream
    ret, frame = cap.read()

    # Check if the frame was successfully read
    if not ret:
        print("Failed to read frame from RTSP stream.")
        break

    # Display the frame
    cv2.imshow("RTSP Stream", frame)

    # Exit if 'q' is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

# Release the resources
cap.release()
cv2.destroyAllWindows()

    Update:

    The code runs on a laptop on both Wi-Fi and mobile internet (4G), but on the other devices the RTSP link is accessible only over mobile internet (4G).
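
    A sketch of one thing to try, given that the failures track the network rather than the code: forcing RTSP over TCP in OpenCV's FFmpeg backend through the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable (the assumption being that UDP RTP packets are lost or blocked on those networks). The variable takes "key;value" pairs separated by "|", and rtsp_stream.py is a placeholder name for the script above:

    # Ask OpenCV's FFmpeg backend to use TCP for the RTSP transport, then run
    # the unchanged Python script.
    export OPENCV_FFMPEG_CAPTURE_OPTIONS="rtsp_transport;tcp"
    python3 rtsp_stream.py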