
Other articles (62)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, go to the "Administer" area of the site.
    From there, the navigation menu gives access to a "Language management" section where support for new languages can be enabled.
    Each newly added language can still be deactivated as long as no object has been created in that language; once one has been, the language is greyed out in the configuration and (...)

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and more (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media": a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a so-called "media" article;

On other sites (8190)

  • How to configure an ffmpeg pipeline for streaming

    30 June 2021, by Changhoon Lee

    I want to get real-time streaming data (using mediasoup) and send it to an nginx-rtmp server via ffmpeg, but when I run it through the pipeline, the error below occurs. I checked that the RTMP server works fine through the OBS Studio program, but the error occurs when I run the pipeline I prepared. Do you know where this problem comes from?

    Problems with the pipeline? Problems with the SDP? Or something else?

    My pipeline:
    ffmpeg \
      -loglevel debug \
      -nostdin \
      -protocol_whitelist file,pipe,udp,rtp,rtmp \
      -analyzeduration 30M \
      -probesize 30M \
      -f sdp \
      -i pipe:0 \
      -map 0:v:0 -c:v copy \
      -map 0:a:0 -strict -2 -c:a aac \
      -flags +global_headers \
      {filePath}.flv \
      -pix_fmt yuv420p \
      -preset ultrafast \
      -use_wallclock_as_timestamps 1 \
      -tune zerolatency \
      -qmin 2 -qmax 51 \
      -muxrate 1300k \
      -sdp_file {file}.sdp \
      -r 30 \
      -b:v 1000k -bufsize 3000k -minrate 500k -maxrate 2000k \
      -qscale 3 \
      -threads 4 \
      -b:a 128k \
      -framerate 30 \
      -g 50 \
      -crf 30 \
      -ar 44100 \
      -s 400x700 \
      -f flv \
      rtmp://{rtmpServerUri}
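
    Not part of the original post, but for context: a minimal, hypothetical sketch of how such a pipeline is typically driven from Python. The helper only assembles the argument list (trimmed to the essentials; the RTMP URL is a caller-supplied placeholder, not the poster's server); one common pitfall when feeding the SDP over pipe:0 is forgetting to close stdin, since ffmpeg generally waits for EOF before it finishes reading the SDP.

```python
import subprocess

def build_ffmpeg_cmd(rtmp_url):
    # Trimmed to the essential flags; rtmp_url is a placeholder supplied by the caller.
    return [
        "ffmpeg",
        "-protocol_whitelist", "file,pipe,udp,rtp,rtmp",
        "-f", "sdp",
        "-i", "pipe:0",   # the SDP text itself arrives on stdin
        "-c:v", "copy",
        "-c:a", "aac",
        "-f", "flv",
        rtmp_url,
    ]

def launch(sdp_text, rtmp_url):
    # Spawn ffmpeg, hand it the SDP on stdin, then close the pipe.
    proc = subprocess.Popen(build_ffmpeg_cmd(rtmp_url), stdin=subprocess.PIPE)
    proc.stdin.write(sdp_text.encode())
    proc.stdin.close()  # signal EOF so ffmpeg stops waiting for more SDP data
    return proc
```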


    


    And my SDP file:
    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=FFmpeg
    c=IN IP4 127.0.0.1
    t=0 0
    m=video 23490 RTP/AVP 101
    a=rtpmap:101 VP8/90000
    a=sendonly
    m=audio 29773 RTP/AVP 100
    a=rtpmap:100 opus/48000/2
    a=sendonly


    


    And the error log:
    ffmpeg version 4.4 Copyright (c) 2000-2021 the FFmpeg developers
      built with Apple clang version 12.0.5 (clang-1205.0.22.9)
      libavutil      56. 70.100 / 56. 70.100
      libavcodec     58.134.100 / 58.134.100
      libavformat    58. 76.100 / 58. 76.100
      libavdevice    58. 13.100 / 58. 13.100
      libavfilter     7.110.100 /  7.110.100
      libavresample   4.  0.  0 /  4.  0.  0
      libswscale      5.  9.100 /  5.  9.100
      libswresample   3.  9.100 /  3.  9.100
      libpostproc    55.  9.100 / 55.  9.100
    Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
    Reading option '-nostdin' ... matched as option 'stdin' (enable or disable interaction on standard input) with argument 0.
    Reading option '-protocol_whitelist' ... matched as AVOption 'protocol_whitelist' with argument 'file,pipe,udp,rtp,rtmp'.
    Reading option '-analyzeduration' ... matched as AVOption 'analyzeduration' with argument '30M'.
    Reading option '-probesize' ... matched as AVOption 'probesize' with argument '30M'.
    Reading option '-f' ... matched as option 'f' (force format) with argument 'sdp'.
    Reading option '-i' ... matched as input url with argument 'pipe:0'.
    Reading option '-map' ... matched as option 'map' (set input stream mapping) with argument '0:v:0'.
    Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'copy'.
    Reading option '-map' ... matched as option 'map' (set input stream mapping) with argument '0:a:0'.
    Reading option '-strict' ...
    Routing option strict to both codec and muxer layer
    matched as AVOption 'strict' with argument '-2'.
    Reading option '-c:a' ... matched as option 'c' (codec name) with argument 'aac'.
    Reading option '-flags' ... matched as AVOption 'flags' with argument '+global_header'.
    Reading option '{filePath}.flv' ... matched as output url.
    Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv420p'.
    Reading option '-preset' ... matched as AVOption 'preset' with argument 'ultrafast'.
    Reading option '-use_wallclock_as_timestamps' ... matched as AVOption 'use_wallclock_as_timestamps' with argument '1'.
    Reading option '-tune' ... matched as AVOption 'tune' with argument 'zerolatency'.
    Reading option '-qmin' ... matched as AVOption 'qmin' with argument '2'.
    Reading option '-qmax' ... matched as AVOption 'qmax' with argument '51'.
    Reading option '-muxrate' ... matched as AVOption 'muxrate' with argument '1300k'.
    Reading option '-sdp_file' ... matched as option 'sdp_file' (specify a file in which to print sdp information) with argument '(unknown).sdp'.
    Reading option '-r' ... matched as option 'r' (set frame rate (Hz value, fraction or abbreviation)) with argument '30'.
    Reading option '-b:v' ... matched as option 'b' (video bitrate (please use -b:v)) with argument '1000k'.
    Reading option '-bufsize' ... matched as AVOption 'bufsize' with argument '3000k'.
    Reading option '-minrate' ... matched as AVOption 'minrate' with argument '500k'.
    Reading option '-maxrate' ... matched as AVOption 'maxrate' with argument '2000k'.
    Reading option '-qscale' ... matched as option 'qscale' (use fixed quality scale (VBR)) with argument '3'.
    Reading option '-threads' ... matched as AVOption 'threads' with argument '4'.
    Reading option '-b:a' ... matched as option 'b' (video bitrate (please use -b:v)) with argument '128k'.
    Reading option '-framerate' ... matched as AVOption 'framerate' with argument '30'.
    Reading option '-g' ... matched as AVOption 'g' with argument '50'.
    Reading option '-crf' ... matched as AVOption 'crf' with argument '30'.
    Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '44100'.
    Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '400x700'.
    Reading option '-f' ... matched as option 'f' (force format) with argument 'flv'.
    Reading option 'rtmp:{rtmpserver}' ... matched as output url.
    Finished splitting the commandline.
    Parsing a group of options: global.
    Applying option loglevel (set logging level) with argument debug.
    Applying option nostdin (enable or disable interaction on standard input) with argument 0.
    Applying option sdp_file (specify a file in which to print sdp information) with argument (unknown).sdp.
    Successfully parsed a group of options.
    Parsing a group of options: input url pipe:0.
    Applying option f (force format) with argument sdp.
    Successfully parsed a group of options.
    Opening an input file: pipe:0.
    [sdp @ 0x12e008e00] Opening 'pipe:0' for reading
    [sdp @ 0x12e008e00] video codec set to: vp8
    [sdp @ 0x12e008e00] audio codec set to: opus
    [sdp @ 0x12e008e00] audio samplerate set to: 48000
    [sdp @ 0x12e008e00] audio channels set to: 2
    [udp @ 0x12c610b70] end receive buffer size reported is 393216
    [udp @ 0x12c610c30] end receive buffer size reported is 393216
    [sdp @ 0x12e008e00] setting jitter buffer size to 500
    [udp @ 0x12c611290] end receive buffer size reported is 393216
    [udp @ 0x12c611330] end receive buffer size reported is 393216
    [sdp @ 0x12e008e00] setting jitter buffer size to 500
    [sdp @ 0x12e008e00] Before avformat_find_stream_info() pos: 310 bytes read:310 seeks:0 nb_streams:2
    [sdp @ 0x12e008e00] Could not find codec parameters for stream 0 (Video: vp8, 1 reference frame, yuv420p): unspecified size
    Consider increasing the value for the 'analyzeduration' (30000000) and 'probesize' (30000000) options
    Codec AVOption qmin (minimum video quantizer scale (VBR)) specified for output file #1 (rtmp:{rtmpserver}) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
    [rtmp @ 0x11c604310] No default whitelist set
    [rtmp @ 0x11c604310] No default whitelist set
    Starting connection attempt to {server} port 1935
    [tcp @ 0x10c6054a0] Successfully connected to {server} port 1935
    [rtmp @ 0x11c604310] Handshaking...
    [rtmp @ 0x11c604310] Proto = rtmp, path = {serverpath}, app = live, fname = (unknown)
    [rtmp @ 0x11c604310] Window acknowledgement size = 5000000
    [rtmp @ 0x11c604310] Max sent, unacked = 5000000
    [rtmp @ 0x11c604310] New incoming chunk size = 4096
    [rtmp @ 0x11c604310] Releasing stream
    [rtmp @ 0x11c604310] FCPublish stream
    [rtmp @ 0x11c604310] Creating stream
    [rtmp @ 0x11c604310] Sending publish command for (unknown)
    Successfully opened the file
    Stream mapping:
      Stream #0:0 -> #0:0 (copy)
      Stream #0:1 -> #0:1 (opus (native) -> aac (native))
      Stream #0:1 -> #1:0 (opus (native) -> mp3 (libmp3lame))
    cur_dts is invalid st:0 (0) [init:1 i_done:0 finish:0] (this is harmless if it occurs once at the start per stream)
    cur_dts is invalid st:1 (0) [init:0 i_done:0 finish:0] (this is harmless if it occurs once at the start per stream)
    pipe:0: Operation timed out
    cur_dts is invalid st:0 (0) [init:1 i_done:0 finish:3] (this is harmless if it occurs once at the start per stream)
    cur_dts is invalid st:1 (0) [init:0 i_done:0 finish:0] (this is harmless if it occurs once at the start per stream)
    detected 8 logical cores
    [graph_0_in_0_1 @ 0x10c605b50] Setting 'time_base' to value '1/48000'
    [graph_0_in_0_1 @ 0x10c605b50] Setting 'sample_rate' to value '48000'
    [graph_0_in_0_1 @ 0x10c605b50] Setting 'sample_fmt' to value 'fltp'
    [graph_0_in_0_1 @ 0x10c605b50] Setting 'channel_layout' to value '0x3'
    [graph_0_in_0_1 @ 0x10c605b50] tb:1/48000 samplefmt:fltp samplerate:48000 chlayout:0x3
    [format_out_0_1 @ 0x10c605ea0] Setting 'sample_fmts' to value 'fltp'
    [format_out_0_1 @ 0x10c605ea0] Setting 'sample_rates' to value '96000|88200|64000|48000|44100|32000|24000|22050|16000|12000|11025|8000|7350'
    [AVFilterGraph @ 0x12d204de0] query_formats: 4 queried, 9 merged, 0 already done, 0 delayed

    //// error ////

    [flv @ 0x10c80aa00] dimensions not set
    Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
    Error initializing output stream 0:1
    [AVIOContext @ 0x10c6040a0] Statistics: 0 seeks, 0 writeouts
    [AVIOContext @ 0x10c6040a0] Statistics: 0 seeks, 0 writeouts
    [rtmp @ 0x11c604310] UnPublishing stream
    [rtmp @ 0x11c604310] Deleting stream
    [aac @ 0x10c80ce00] Qavg: nan
    [AVIOContext @ 0x12d005120] Statistics: 310 bytes read, 0 seeks
    Conversion failed!


    


    I'd appreciate it if you could tell me what's going wrong.

  • How can I get matplotlib to show full subplots in an animation?

    12 March 2015, by Matt Stone

    I’m trying to write a simple immune system simulator. I’m modeling infected tissue as a simple grid of cells and various intracellular signals, and I’d like to animate movement of cells in one plot and the intensity of viral presence in another as the infection progresses. I’m doing so with the matshow function provided by matplotlib. However, when I plot the two next to each other, the full grid gets clipped unless I stretch out the window myself. I can’t address the problem at all when saving to an mp4.

    Here’s the default view, which is identical to what I observe when saving to mp4:

    And here’s what it looks like after stretching out the viewer window:

    I’m running Python 2.7.9 with matplotlib 1.4.2 on OS X 10.10.2, using ffmpeg 2.5.2 (installed via Homebrew). Below is the code I’m using to generate the animation. I tried using plt.tight_layout(), but it didn’t affect the problem. If anyone has any advice on how to solve this, I’d really appreciate it! I’d especially like to be able to save the animation without viewing it with plt.show(). Thanks!

    def animate(self, fname=None, frames=100):
       fig, (agent_ax, signal_ax) = plt.subplots(1, 2, sharey=True)

       agent_ax.set_ylim(0, self.grid.shape[0])
       agent_ax.set_xlim(0, self.grid.shape[1])
       signal_ax.set_ylim(0, self.grid.shape[0])
       signal_ax.set_xlim(0, self.grid.shape[1])

       agent_mat = agent_ax.matshow(self.display_grid(),
                                    vmin=0, vmax=10)
       signal_mat = signal_ax.matshow(self.signal_display(virus),
                                      vmin=0, vmax=20)
       fig.colorbar(signal_mat)

       def anim_update(tick):
           self.update()
           self.diffuse()
           agent_mat.set_data(self.display_grid())
           signal_mat.set_data(self.signal_display(virus))
           return agent_mat, signal_mat

       anim = animation.FuncAnimation(fig, anim_update, frames=frames,
                                      interval=3000, blit=False)

       if fname:
           anim.save(fname, fps=5, extra_args=['-vcodec', 'libx264'])
       else:
           plt.show()
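
    For what it's worth, a minimal stand-alone sketch of the two knobs that most often fix this kind of clipping: an explicit figsize on plt.subplots, and bbox_inches="tight" when saving. The 20x20 random grid here is a hypothetical stand-in for the simulator's display_grid()/signal_display() output, not code from the post.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen backend, so saving works without plt.show()
import matplotlib.pyplot as plt

# Hypothetical stand-in for the simulator's grids.
grid = np.random.rand(20, 20) * 10

# An explicit figsize wide enough for two panels plus a colorbar
# keeps the matshow panels from being clipped at the default window size.
fig, (agent_ax, signal_ax) = plt.subplots(1, 2, sharey=True, figsize=(12, 5))

agent_ax.matshow(grid, vmin=0, vmax=10)
signal_mat = signal_ax.matshow(grid * 2, vmin=0, vmax=20)
fig.colorbar(signal_mat)

# bbox_inches="tight" recomputes the saved bounding box so nothing is cut off.
fig.savefig("frame.png", bbox_inches="tight")
```

    The same figsize argument could be passed to plt.subplots inside animate() above; anim.save() renders at the figure's own size, so the saved mp4 should then come out unclipped as well.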

  • ffmpeg can't read png data on iPhone

    10 October 2014, by user2789801

    I’m using ffmpeg to decode a png picture and use the resulting AVFrame as an OpenGL texture.
    The strangest thing is that the png converts to an OpenGL texture nicely on the iPhone simulator, but I get a blank texture on a real iPhone.

    On both the simulator and the iPhone, I get a null pointer for the AVFrame’s data:

    avcodec_decode_video2(codecContext/* AVCodecContext* */,frame /* AVFrame */,&finished,&tempPacket);

    Then I convert the color space to AV_PIX_FMT_RGBA:

    void convertToRGBColor()
    {
       int numBytes = avpicture_get_size(
                                     AV_PIX_FMT_RGBA,
                                     codecContext->width,
                                     codecContext->height);
       uint8_t *buffer = (uint8_t *)av_malloc(numBytes);
       avpicture_fill(rgbFrame/* AVFrame* */, buffer, AV_PIX_FMT_RGBA, codecContext->width, codecContext->height);

       struct SwsContext *img_convert_ctx = NULL;
       img_convert_ctx = sws_getCachedContext(
                            img_convert_ctx,
                            codecContext->width,
                            codecContext->height,
                            codecContext->pix_fmt,
                            codecContext->width,
                            codecContext->height,
                            COLOR_SPACE,
                            SWS_BILINEAR,
                            NULL,
                            NULL,
                            NULL);

       if( !img_convert_ctx )
       {
           fprintf(stderr, "Cannot initialize sws conversion context\n");
       }

       sws_scale(img_convert_ctx,
                 frame->data,
                 frame->linesize,
                 0,
                 codecContext->height,
                 rgbFrame->data,
                 rgbFrame->linesize);
       sws_freeContext(img_convert_ctx);
    }

    On the simulator, rgbFrame’s data[0] is a valid pointer, but on an iPhone it’s null.

    So, has anyone run into the same problem before?