
Other articles (55)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administer" section of the site.
    From there, in the navigation menu, you can reach a "Language management" section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language. Once one has been, the language is greyed out in the configuration and (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your Médiaspip installation is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • Supporting all media types

13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats:
    images: png, gif, jpg, bmp and more
    audio: MP3, Ogg, Wav and more
    video: AVI, MP4, OGV, mpg, mov, wmv and more
    text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

On other sites (5494)

  • FFmpeg encoding slow for 4K HDR content

1 March 2023, by Geno Diaz

    When processing 4K input with the following configuration, it takes upwards of 2 minutes to process a 35s, 60fps, 4K HDR clip recorded on an iPhone. Is this the expected performance, or is there an inefficiency in the configuration that is causing it?

    In comparison, running this configuration on a 35s, 30fps, 4K non-HDR clip takes only about 20 seconds.

    ffmpeg \
      -i "input path" \
      -y \
      -filter:v scale=w=2160:h=3840 \
      -threads 4 \
      -r 59.94 \
      -c:v libx264 \
      -preset veryfast \
      -vsync 1 \
      -tune film \
      -maxrate 6000k \
      -bufsize 5400k \
      -g 60 \
      -x264opts no-scenecut \
      -c:a aac \
      -af aresample=async=1:min_hard_comp=0.100000:first_pts=0 \
      -ac 2 \
      -b:a 128k \
      -ar 44100 \
      -vf "zscale=transfer=linear:npl=100,format=gbrpf32le,zscale=primaries=bt709,tonemap=tonemap=hable:desat=0,zscale=transfer=bt709:matrix=bt709:range=tv,format=yuv420p" \
      -sws_flags full_chroma_int+full_chroma_inp \
      -pix_fmt yuv420p \
      "outputfile.mp4"
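
    One thing worth checking in the command itself: -filter:v and -vf are two spellings of the same option, so only the last filter graph given (here the zscale/tonemap chain) takes effect and the scale step is silently dropped. The tonemap chain also runs on the CPU in 32-bit float (format=gbrpf32le), which plausibly accounts for much of the HDR slowdown. A minimal sketch with a single merged graph follows; placing scale before the linearization step is my assumption, not something stated in the question.

    # Same command with one merged -vf graph, so the scale step is not
    # discarded. Scale-before-linearization is an assumption on my part;
    # all other flags are unchanged from the question.
    ffmpeg -y -i "input path" \
      -vf "scale=w=2160:h=3840,zscale=transfer=linear:npl=100,format=gbrpf32le,zscale=primaries=bt709,tonemap=tonemap=hable:desat=0,zscale=transfer=bt709:matrix=bt709:range=tv,format=yuv420p" \
      -threads 4 -r 59.94 -vsync 1 \
      -c:v libx264 -preset veryfast -tune film \
      -maxrate 6000k -bufsize 5400k -g 60 -x264opts no-scenecut \
      -c:a aac -af aresample=async=1:min_hard_comp=0.100000:first_pts=0 \
      -ac 2 -b:a 128k -ar 44100 \
      -sws_flags full_chroma_int+full_chroma_inp -pix_fmt yuv420p \
      "outputfile.mp4"
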
  • How can I stream video in Swift using FFmpeg in real time?

    13 November 2019, by rIn

    I want to make a real-time video streaming app, but I’m new to Swift and live streaming.

    The video input captured by AVCaptureSession comes from an iPhone, and I want to encode it from MPEG-4 to MPEG-2.
    I plan to use the FFmpeg library to encode the video input in Swift.

    Here are my questions.

    1. I don’t know how to deliver video data to FFmpeg functions. Actually, I’m not certain that I have to use the FFmpeg library to encode video data into an MPEG-2 transport stream. Is there any Apple API that can encode video data in Swift?

    2. How can I deal with video data from AVCaptureSession? Does a CMSampleBuffer contain raw frames or H.264 video? I want to know what type of data is in a CMSampleBuffer.

    I’m struggling to solve this problem. Please let me know if you have experience with this kind of project. Any ideas or advice on a better approach are welcome.
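
    For what it’s worth, the MPEG-4-to-MPEG-2 transport-stream step can be prototyped with FFmpeg’s command-line tool before wiring libavcodec/libavformat into Swift. A minimal sketch, where capture.mp4 and the receiver address 192.0.2.1:1234 are hypothetical placeholders:

    # Re-encode an H.264/MPEG-4 recording as an MPEG-2 transport stream
    # and push it over UDP. Input file and address are placeholders.
    ffmpeg -re -i capture.mp4 \
      -c:v mpeg2video -b:v 4000k \
      -c:a mp2 -b:a 192k \
      -f mpegts "udp://192.0.2.1:1234"

    On the second question: AVCaptureVideoDataOutput delivers CMSampleBuffers backed by uncompressed pixel buffers, not H.264; compressed samples only appear after an explicit encode step such as VideoToolbox.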

  • iOS - How can I stream encoded video frames (from AVFoundation and VideoToolbox) from a device to a server via RTP?

    2 October 2015, by ASP Peek

    I am trying to stream live video from my iPhone to a server using RTP.
    Using AVFoundation’s AVCaptureVideoDataOutput, I was able to get a CMSampleBuffer for video. I then feed these frames as they arrive into VideoToolbox’s VTCompressionSessionEncodeFrame() and am able to get an encoded CMSampleBuffer.

    To send these encoded frames via RTP, I came across FFmpeg and found a prebuilt library for iOS devices. (https://github.com/kewlbear/FFmpeg-iOS-build-script)

    However, I am not able to find any iOS example, sample code, or documentation that explains how to send the encoded frames via RTP from an iOS app.

    Is there any existing example or documentation that explains how I can send the encoded CMSampleBuffers to a server via RTP using FFmpeg?

    Thanks in advance :)
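
    For reference, FFmpeg’s command-line front end can already packetize an H.264 elementary stream over RTP, which is a useful baseline before driving libavformat directly from iOS. A minimal sketch, where frames.h264 and the server address 192.0.2.1:5004 are hypothetical placeholders:

    # Send an Annex-B H.264 elementary stream over RTP without re-encoding,
    # and write the SDP description the receiver needs to play the stream.
    # Input file and destination address are placeholders.
    ffmpeg -re -i frames.h264 \
      -c:v copy -an \
      -f rtp "rtp://192.0.2.1:5004" \
      -sdp_file stream.sdp

    Note that VideoToolbox emits length-prefixed (AVCC) NAL units, so the encoded samples generally need converting to Annex-B start-code format, or at least splitting into individual NAL units, before RTP packetization.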