
Other articles (78)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable through the form-template management; adding notes to articles; adding captions and annotations to images;

  • Uploading media and themes via FTP

    31 May 2013

    MédiaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MédiaSPIP site and use your favourite FTP client.
    From the start you will find the following directories in your FTP space: config/: the site's configuration directory; IMG/: media already processed and online on the site; local/: the web site's cache directory; themes/: custom themes and stylesheets; tmp/: working directory (...)

On other sites (10863)

  • Shaka Player: ignore empty AdaptationSet

    24 June 2019, by Mitya

    I'm trying to play a DASH stream in Shaka Player.
    Sometimes the stream has no audio source and its manifest file contains an empty AdaptationSet entry.
    In that case, Shaka Player returns the following manifest parsing error:

    | DASH_EMPTY_ADAPTATION_SET | 4003 | number |  The DASH Manifest contained an AdaptationSet with no Representations. |

    Is it possible to ignore this error somehow and play the video without an audio source?

    Example manifest file:

    <?xml version="1.0" encoding="utf-8"?>
    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="dynamic" minimumUpdatePeriod="PT4S" suggestedPresentationDelay="PT4S" availabilityStartTime="2019-06-24T13:38:04Z" publishTime="2019-06-24T13:38:34Z" timeShiftBufferDepth="PT14.9S" minBufferTime="PT9.9S">
        <ProgramInformation>
        </ProgramInformation>
        <Period start="PT0.0S">
            <AdaptationSet contentType="video" segmentAlignment="true" bitstreamSwitching="true">
                <Representation mimeType="video/mp4" codecs="avc1.640028" bandwidth="2000000" width="1920" height="1080" frameRate="20/1">
                    <SegmentTemplate timescale="10240" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="5">
                        <SegmentTimeline>
                            <S t="201697" d="51190" r="2"/>
                        </SegmentTimeline>
                    </SegmentTemplate>
                </Representation>
            </AdaptationSet>
            <AdaptationSet contentType="audio" segmentAlignment="true" bitstreamSwitching="true">
            </AdaptationSet>
        </Period>
    </MPD>
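
    One direction that may work (a sketch only, not verified against this exact stream) is to register a response filter on Shaka Player's networking engine and strip empty AdaptationSet elements from the manifest before the parser sees them. The video-element lookup and the filter body below are assumptions, not code from the question:

    // Sketch: remove empty AdaptationSet elements before Shaka parses the MPD.
    // Assumes shaka-player is already loaded on the page.
    declare const shaka: any;

    const video = document.querySelector('video') as HTMLVideoElement;
    const player = new shaka.Player(video);

    player.getNetworkingEngine().registerResponseFilter((type: any, response: any) => {
      if (type !== shaka.net.NetworkingEngine.RequestType.MANIFEST) {
        return;
      }
      const text = shaka.util.StringUtils.fromUTF8(response.data);
      const doc = new DOMParser().parseFromString(text, 'text/xml');

      // Drop every AdaptationSet that has no Representation children.
      for (const set of Array.from(doc.getElementsByTagName('AdaptationSet'))) {
        if (set.getElementsByTagName('Representation').length === 0) {
          set.parentNode?.removeChild(set);
        }
      }
      const patched = new XMLSerializer().serializeToString(doc);
      response.data = shaka.util.StringUtils.toUTF8(patched);
    });
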
  • ffmpeg dash Segment offset

    18 March 2019, by inkubux

    I'm trying to integrate live transcoding, like Plex or Emby do, into my application.

    I am able to serve DASH content to shaka-player or dash.js, but only in "live mode". However, I want to enable seeking through the player.

    I looked at Plex; to enable this, they generate their own MPD file with a duration, so the player shows a full seek bar.

    However, when seeking, the player asks for a segment number, e.g. 449. I need to stop ffmpeg and restart it with an offset (-ss <segment>), but ffmpeg just restarts the transcode session from segment 0 with a new initial segment.

    What I want is to tell ffmpeg to start at a seek point but output segments numbered from that segment onward.

    When playing with HLS and MPEG-TS, I can tell ffmpeg to start output at a certain segment with the option -segment_start_number, but this is not available for DASH. Plex uses its own transcoder based on ffmpeg, with the option -skip_to_segment.

    I tried to hack around this by keeping a manual offset on my web server, but even if I serve the supposedly right segment after the seek point, dash.js and shaka-player can't recover the stream. VLC, on the other hand, can (it is probably more tolerant of errors in segments).

    Should the segment served after a seek in DASH contain the initialization segment, or only the media segment?

    Is ffmpeg able to start DASH segmentation at a given segment number (for seek and resume)?

    The same technique works in HLS with forced key frames and a custom m3u8 (with all the "predicted" segments), but calculating the right segment length and bandwidth is much harder and more hackish, and DASH is more tolerant of variation.

    I would really like to be able to seek through my live-transcoded video.

    For reference, here is the custom MPD file I serve to enable "seeking":

    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" suggestedPresentationDelay="PT1S" mediaPresentationDuration="PT49M2.920S" maxSegmentDuration="PT2S" minBufferTime="PT10S">
        <Period start="PT0S" duration="PT49M2.920S">
            <AdaptationSet segmentAlignment="true">
                <SegmentTemplate timescale="1" duration="1" initialization="$RepresentationID$/initial.mp4" media="$RepresentationID$/$Number$.m4s" startNumber="1">
                </SegmentTemplate>
                <Representation mimeType="video/mp4" codecs="avc1.640029" bandwidth="3766000" width="1920" height="1080">
                </Representation>
            </AdaptationSet>
            <AdaptationSet segmentAlignment="true">
                <SegmentTemplate timescale="1" duration="1" initialization="$RepresentationID$/initial.mp4" media="$RepresentationID$/$Number$.m4s" startNumber="1">
                </SegmentTemplate>
                <Representation mimeType="audio/mp4" codecs="mp4a.40.2" bandwidth="188000" audioSamplingRate="48000">
                    <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="6"/>
                </Representation>
            </AdaptationSet>
        </Period>
    </MPD>

    And here is the ffmpeg command to pull it off:

    ffmpeg -ss 0 -i movie.mkv -y -acodec aac -vcodec libx264 -f dash -min_seg_duration 1000000 -individual_header_trailer 0 -pix_fmt yuv420p -vf scale=trunc(min(max(iw\,ih*dar)\,1920)/2)*2:trunc(ow/dar/2)*2 -bsf:v h264_mp4toannexb -profile:v high -level 4.1 -map_chapters -1 -map_metadata -1 -preset veryfast -movflags frag_keyframe+empty_moov -use_template 1 -use_timeline 0 -remove_at_exit 1 -crf 23 -bufsize 7532k -maxrate 3766k -start_at_zero -threads 0 -force_key_frames expr:if(isnan(prev_forced_t),eq(t,t),gte(t,prev_forced_t+1)) -init_seg_name $RepresentationID$/0_initial.mp4 -media_seg_name $RepresentationID$/0_$Number$.m4s /transcoding_temp/Z1GVWEc/index.mpd

    The media_seg_name is where I prepend the custom seek point; let's say I want to seek to segment 1233, the template would be:

    -media_seg_name $RepresentationID$/1233_$Number$.m4s

    and the segments would be 1233_1, 1233_2, 1233_*, so I can serve the right segment after a seek. But the player does not recover and keeps downloading subsequent segments. I guess this is because a new initial segment is generated and I am somehow missing the headers needed for continuous playback after the seek, but I'm probably wrong.

    Thanks for your help
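
    A rough sketch of the restart logic described above (not the author's code): the one-second segment length comes from the MPD's timescale=1 and duration=1, and the ffmpeg options are a trimmed-down subset of the command already shown in the question.

    // Sketch: map the requested segment number to a seek offset and restart
    // ffmpeg, prefixing segment names with the seek point as described above.
    import { spawn, ChildProcess } from 'node:child_process';

    const SEGMENT_DURATION = 1; // seconds; must match the MPD's SegmentTemplate
    let ffmpeg: ChildProcess | null = null;

    function restartAtSegment(segment: number, input: string, outDir: string): void {
      // With startNumber=1, segment N covers [(N - 1) * d, N * d) seconds.
      const seekSeconds = (segment - 1) * SEGMENT_DURATION;

      ffmpeg?.kill('SIGTERM');
      ffmpeg = spawn('ffmpeg', [
        '-ss', String(seekSeconds), '-i', input,
        '-acodec', 'aac', '-vcodec', 'libx264', '-f', 'dash',
        '-use_template', '1', '-use_timeline', '0',
        // Prefix the output names with the seek point so the web server can
        // serve "segment + n" from the freshly produced files.
        '-init_seg_name', `$RepresentationID$/${segment}_initial.mp4`,
        '-media_seg_name', `$RepresentationID$/${segment}_$Number$.m4s`,
        `${outDir}/index.mpd`,
      ]);
    }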

  • add localtime to each frame as metadata or subtitle while streaming using ffmpeg

    15 November 2019, by Mohammad M

    I want to encode some raw frames with nvenc_h264 and then stream them over UDP. What I need now is to add some data, such as the capture time, to each frame as metadata or as a subtitle (attached to just that frame, or with the metadata specifying which frame it relates to).

    Right now I use this command in C# to start FFmpeg; it receives raw frames through a pipe and streams them over UDP.

             "-f rawvideo -vcodec rawvideo -pixel_format rgba"
             + " -colorspace bt709"
             + " -s " + width + "x" + height
             + " -framerate " + frameRate + " -vsync 0 "
             + " -loglevel warning -i - "// +// preset.GetOptions()
             + " -c:v h264_nvenc  -preset ll -zerolatency 1 -cq 10 -bf 2 -g 150 -f mpeg pipe:10" //udp://172.20.82.106:2000?"  //outputPath

    The base code that I used is here:
    https://github.com/keijiro/FFmpegOut

    and I use this command to receive and save the file:
    ffmpeg -i udp://127.0.0.1:2001 -c:v h264_nvenc video.mp4

    As I said, I want the times as data that I can process later, not just a watermark on the frame. How can I add it? Is there any way?
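
    One possible direction (a sketch under assumptions, not a tested pipeline): record the capture time of every frame on the sending side, write one SRT cue per frame, and mux it into the received file as a text subtitle stream, for example with ffmpeg -i video.mp4 -i times.srt -c:v copy -c:a copy -c:s mov_text tagged.mp4. The frame rate and file names below are assumptions.

    // Sketch: build an SRT file with one cue per frame carrying its capture time.
    import { writeFileSync } from 'node:fs';

    function toSrtTime(seconds: number): string {
      const ms = Math.round(seconds * 1000);
      const pad = (n: number, w = 2) => String(n).padStart(w, '0');
      const h = Math.floor(ms / 3_600_000);
      const m = Math.floor((ms % 3_600_000) / 60_000);
      const s = Math.floor((ms % 60_000) / 1000);
      return `${pad(h)}:${pad(m)}:${pad(s)},${pad(ms % 1000, 3)}`;
    }

    // captureTimes[i] is the wall-clock time recorded when frame i was grabbed.
    function writeSrt(captureTimes: Date[], frameRate: number, path: string): void {
      const cues = captureTimes.map((t, i) => {
        const start = i / frameRate;
        const end = (i + 1) / frameRate;
        return `${i + 1}\n${toSrtTime(start)} --> ${toSrtTime(end)}\n${t.toISOString()}\n`;
      });
      writeFileSync(path, cues.join('\n'));
    }

    // Example: two frames captured at 25 fps while piping raw video to FFmpeg.
    writeSrt([new Date(), new Date(Date.now() + 40)], 25, 'times.srt');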