
Media (0)


No media matching your criteria is available on this site.

Other articles (40)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is at version 0.2 or higher. If necessary, contact your MediaSPIP administrator to find out.

  • Other interesting software

    13 April 2011

    We don’t claim to be the only ones doing what we do, and certainly not the best; we simply try to do it well and keep getting better.
    The following list presents software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to emulate.
    We don’t know these projects and haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (8701)

  • Animated line chart with pandas, matplotlib and ffmpeg

    10 April 2020, by Mark K

    I am producing an animated line chart with the data and code below. But when the chart is produced, it shows no line. What did I do wrong?

    Thank you.

import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.animation as animation

title = 'Heroin Overdoses'

data = {'Year' : ["1999","2000","2001","2002","2003","2004","2005","2006","2007","2008","2009","2010","2011","2012","2013","2014","2015","2016"], 
'Heroin Overdoses' : [280,443,413,486,475,148,197,170,448,103,137,160,483,356,352,300,466,278]}
overdose = pd.DataFrame(data)

Writer = animation.writers['ffmpeg']
writer = Writer(fps=20, metadata=dict(artist='Me'), bitrate=1800)

fig = plt.figure(figsize=(10,6))
plt.xlim(1999, 2016)
plt.ylim(np.min(overdose)[0], np.max(overdose)[0])
plt.xlabel('Year',fontsize=20)
plt.ylabel(title,fontsize=20)
plt.title('Heroin Overdoses per Year',fontsize=20)

def animate(i):
    data = overdose.iloc[:int(i+1)] #select data range
    p = sns.lineplot(x=data.index, y=data[title], data=data, color="r")
    p.tick_params(labelsize=17)
    plt.setp(p.lines,linewidth=7)

ani = matplotlib.animation.FuncAnimation(fig, animate, frames=17, repeat=True)

ani.save('C:\\folder\\line chart.mp4', writer=writer)
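
    A likely cause, judging only from the posted code (an assumption, since the screenshots are not reproduced here): plt.xlim(1999, 2016) sets the x-axis to year values, while sns.lineplot(x=data.index, ...) plots against the DataFrame's integer index 0–17, so the drawn line falls entirely outside the visible x range; np.min(overdose)[0] also most likely reads the Year column rather than the counts. A minimal sketch of one way to fix it, plotting against integer years so the limits match (output path is illustrative, and ffmpeg must be on the PATH):

import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import matplotlib.animation as animation

title = 'Heroin Overdoses'

# Years as integers so they can be used directly as x values
data = {'Year': list(range(1999, 2017)),
        title: [280, 443, 413, 486, 475, 148, 197, 170, 448,
                103, 137, 160, 483, 356, 352, 300, 466, 278]}
overdose = pd.DataFrame(data)

Writer = animation.writers['ffmpeg']
writer = Writer(fps=20, metadata=dict(artist='Me'), bitrate=1800)

fig = plt.figure(figsize=(10, 6))
plt.xlim(1999, 2016)                       # now matches the plotted x values
plt.ylim(0, overdose[title].max() * 1.1)   # limits taken from the counts column
plt.xlabel('Year', fontsize=20)
plt.ylabel(title, fontsize=20)
plt.title('Heroin Overdoses per Year', fontsize=20)

def animate(i):
    frame = overdose.iloc[:i + 1]          # grow the plotted range one year per frame
    p = sns.lineplot(x=frame['Year'], y=frame[title], color='r')
    p.tick_params(labelsize=17)
    plt.setp(p.lines, linewidth=7)

ani = animation.FuncAnimation(fig, animate, frames=len(overdose), repeat=True)
ani.save('line_chart.mp4', writer=writer)  # hypothetical output path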


    


  • Stream Live Video and relay audio only to icecast2 server

    28 April 2019, by BadAddy

    I have a working nginx server which allows me to stream live video from our mobile production system. We also have a radio station on a separate server, and I would like to stream to both. But I cannot make it work, nor can I get any logs or error information to explain why. I have tried both the nginx config and FFmpeg to resolve this.

    I have made various attempts using what I think I understand from other pages online:

    exec_push FFREPORT=file=ffreport.log:level=48 ffmpeg -i $basename.flv -vn -acodec mp3 rtmp://source:********!!@xxx.xxx.xxx.180:8000/live;

    I also tried using a simple restream in the nginx conf:

    application restream {
                       live on;
                       exec_push ffmpeg -i $basename.flv -vn -acodec mp3 rtmp://source:***********@xxx.xxx.xxx.180:8000/live;
                       # push server2:1935
               }

    I have used the same details as for a Mixxx Live Broadcast Connection, thinking I am asking the same thing of the icecast2 server; the only difference is that the source is the nginx server.

    This is the full conf on nginx:

    rtmp {

       server {
               listen 1935;
               chunk_size 4000;

               application live {
                       live on;
                       allow publish 127.0.0.1;
                       allow publish all;
                       allow play all;
                       record all;
                       record_path /usr/local/nginx/flv-streams;
                       record_unique on;
                       exec_record_done ffmpeg -i $basename.flv /usr/local/nginx/html/streams/$basename.mp4;
                       hls on;
                       hls_nested on;
                       hls_path /mnt/hls;
                       hls_fragment 1s;
                       hls_sync 1ms;
                       #exec_push FFREPORT=file=ffreport.log:level=48 ffmpeg -i $basename.flv -vn -acodec mp3 rtmp://source:*************@xxx.xxx.xxx.xxx:8000/live;
               }
               # Video on Demand
               application streams {
                       play /usr/local/nginx/html/streams/;
               }

               # Restream
               application restream {
                       live on;
                       exec_push ffmpeg -i $basename.flv -vn -acodec mp3 rtmp://source***************@xxx.xxx.xxx:8000/live;
                       # push server2:1935
               }

       }
    }
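
    Two details in the config above are worth flagging (assumptions on my part, based only on reading it): exec_push runs while the stream is live, but -i $basename.flv points at a recording that is only complete once publishing ends, and icecast2 on port 8000 does not speak RTMP, so pushing an rtmp:// URL at it will not work. A sketch of a restream block that pulls the live stream back from the local RTMP application and sends audio only to icecast via ffmpeg's icecast:// protocol (host and credential placeholders kept as in the original):

               application restream {
                       live on;
                       # read the live stream from nginx itself rather than a .flv file,
                       # drop video, encode MP3 and send it over ffmpeg's icecast:// protocol
                       exec_push ffmpeg -i rtmp://localhost:1935/$app/$name -vn
                                 -c:a libmp3lame -b:a 128k -content_type audio/mpeg
                                 -f mp3 icecast://source:***********@xxx.xxx.xxx.180:8000/live;
               }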

    I would like viewers who can watch a broadcast to get the video, but if they can only listen, as with a radio, I want them to listen via our radio player. These are two different streams, on different servers.

    At the moment I am using separate software to stream to both, and I would like to stop doing that.

    Perhaps because of my own wording, I have not found any idea of how to do this.

    UPDATE

    With help from TBR I have managed to get the stream from the nginx server going to a new server hosting icecast2. However, not in the way expected: it runs 32x faster than real time, so it is not really a stream as such.

    ffmpeg -i fcpr-1554651146.flv -vn -c:a mp3 icecast://source:password@10.0.0.0:8000/fcprlive.mp3
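
    A note on the speed issue (an assumption based on the command above, not something tested here): without -re, ffmpeg reads a file input as fast as it can, so the relay finishes far quicker than real time. Reading the recording at its native frame rate should give a properly paced stream, for example:

    ffmpeg -re -i fcpr-1554651146.flv -vn -c:a libmp3lame -b:a 128k -f mp3 icecast://source:password@10.0.0.0:8000/fcprlive.mp3

    When the input is the live RTMP stream itself, as in the restream sketch above, -re is not needed, because the input already arrives in real time.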

    However, I wonder if I have been thinking of this the wrong way. In my liquidsoap file I have this code:

    #!/usr/bin/liquidsoap
    # Log dir
    set("log.file.path","/tmp/basic-radio.log")

    # Music
    myplaylist = mksafe(playlist("/home/offlineftp/playlist"))

    # Live source
    set("harbor.bind_addr","0.0.0.0")
    live = input.http("http://localhost:8000/fcprlive")
    # fall back to the playlist whenever the live input is unavailable
    radio = fallback(track_sensitive=false, [live, myplaylist])

    # Stream it out
    output.icecast(%mp3, host = "localhost", port = 8000,
        password = "pass", mount = "/fcpr", radio)

    Should I look at using LiquidSoap to pull the stream from nginx when live, and if there is no signal then go to the fallback?

  • MPEG Dash output generated by FFMPEG not working

    8 January 2019, by Anto Joy

    I created a DASH output, sample_video.mpd, using ffmpeg, but when I tried to play it with dash.js the video just kept loading and nothing else happened.
    Below is the generated mpd file:

     <?xml version="1.0" encoding="utf-8"?>
     <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" mediaPresentationDuration="PT1M2.3S" minBufferTime="PT20.0S">
     <ProgramInformation>
     </ProgramInformation>
     <Period start="PT0.0S">
        <AdaptationSet contentType="video" segmentAlignment="true" bitstreamSwitching="true" lang="und">
            <Representation mimeType="video/mp4" codecs="avc1.640015" bandwidth="255520" width="426" height="240" frameRate="24/1">
                <SegmentTemplate timescale="12288" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="3">
                    <SegmentTimeline>
                        <S t="198656" d="69120" />
                        <S d="75264" />
                        <S d="174592" />
                        <S d="122880" />
                        <S d="125440" />
                    </SegmentTimeline>
                </SegmentTemplate>
            </Representation>
            <Representation mimeType="video/mp4" codecs="avc1.64001e" bandwidth="726596" width="854" height="480" frameRate="24/1">
                <SegmentTemplate timescale="12288" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="3">
                    <SegmentTimeline>
                        <S t="198656" d="69120" />
                        <S d="75264" />
                        <S d="174592" />
                        <S d="122880" />
                        <S d="125440" />
                    </SegmentTimeline>
                </SegmentTemplate>
            </Representation>
            <Representation mimeType="video/mp4" codecs="avc1.64001f" bandwidth="1433314" width="1280" height="720" frameRate="24/1">
                <SegmentTemplate timescale="12288" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="3">
                    <SegmentTimeline>
                        <S t="198656" d="69120" />
                        <S d="75264" />
                        <S d="174592" />
                        <S d="122880" />
                        <S d="125440" />
                    </SegmentTimeline>
                </SegmentTemplate>
            </Representation>
        </AdaptationSet>
        <AdaptationSet contentType="audio" segmentAlignment="true" bitstreamSwitching="true" lang="und">
            <Representation mimeType="audio/mp4" codecs="mp4a.40.2" bandwidth="341000" audioSamplingRate="48000">
                <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="6" />
                <SegmentTemplate timescale="48000" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="3">
                    <SegmentTimeline>
                        <S t="772096" d="270336" />
                        <S d="293888" />
                        <S d="681984" />
                        <S d="480256" />
                        <S d="492544" />
                    </SegmentTimeline>
                </SegmentTemplate>
            </Representation>
            <Representation mimeType="audio/mp4" codecs="mp4a.40.2" bandwidth="341000" audioSamplingRate="48000">
                <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="6" />
                <SegmentTemplate timescale="48000" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="3">
                    <SegmentTimeline>
                        <S t="772096" d="270336" />
                        <S d="293888" />
                        <S d="681984" />
                        <S d="480256" />
                        <S d="492544" />
                    </SegmentTimeline>
                </SegmentTemplate>
            </Representation>
            <Representation mimeType="audio/mp4" codecs="mp4a.40.2" bandwidth="341000" audioSamplingRate="48000">
                <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="6" />
                <SegmentTemplate timescale="48000" initialization="init-stream$RepresentationID$.m4s" media="chunk-stream$RepresentationID$-$Number%05d$.m4s" startNumber="3">
                    <SegmentTimeline>
                        <S t="772096" d="270336" />
                        <S d="293888" />
                        <S d="681984" />
                        <S d="480256" />
                        <S d="492544" />
                    </SegmentTimeline>
                </SegmentTemplate>
            </Representation>
        </AdaptationSet>
     </Period>
     </MPD>

    The following is the ffmpeg command that I used to generate the multi-bitrate video:

    ffmpeg -y -i sample_video.mp4 ^
    -c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" -r 24 ^
    -c:a aac -b:a 128k ^
    -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p ^
    -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 ^
    -b:v:0 250k  -filter:v:0 "scale=-2:240" -profile:v:0 baseline ^
    -b:v:1 750k  -filter:v:1 "scale=-2:480" -profile:v:1 main ^
    -b:v:2 1500k -filter:v:2 "scale=-2:720" -profile:v:2 high ^
    sample_dash.mp4

    and to generate the mpd file:

     ffmpeg -y -re -i sample_dash.mp4 ^
     -map 0 ^
     -use_timeline 1 -use_template 1 -window_size 5 ^
     -adaptation_sets "id=0,streams=v id=1,streams=a" ^
     -f dash sample_video.mpd

    Also, I saw in the Chrome network tab that only init-stream4.m4s and init-stream5.m4s were requested.
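
    A couple of things worth checking (assumptions based on the manifest above, not verified against the actual files): -window_size 5 keeps only the five most recent segments in the manifest, which is why startNumber is "3" and the first segments of each stream are missing from it, and -re only slows the muxing down to real time, which is unnecessary for an on-demand file. For a static/VOD manifest, a sketch of the second command without those options would be:

     ffmpeg -y -i sample_dash.mp4 ^
     -map 0 -c copy ^
     -use_timeline 1 -use_template 1 ^
     -adaptation_sets "id=0,streams=v id=1,streams=a" ^
     -f dash sample_video.mpd

    If dash.js still stalls, it is also worth confirming that the .mpd and the init-/chunk-*.m4s files are served from the same directory, and that the web server sends CORS headers and sensible MIME types for .mpd and .m4s.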