Advanced search

Media (0)

Keyword: - Tags -/performance

No media matching your criteria is available on the site.

Other articles (53)

  • Authorizations overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, as long as your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that led to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (9851)

  • Streaming an RTSP stream to a website using FFmpeg and FFserver

    23 August 2016, by Pallav Gupta

    I am working on a website for a client, and one of the requirements is to embed the video from a HikVision DVR DS7116. I have the RTSP URL for the DVR and want help with FFmpeg and FFserver. I have already written my ffserver config file.

    /etc/ffserver.config

    Port 9500
    # bind to all IPs aliased or not
    BindAddress 0.0.0.0
    # max number of simultaneous clients
    MaxClients 1000
    # max bandwidth per-client (kb/s)
    MaxBandwidth 10000
    # Suppress that if you want to launch ffserver as a daemon.
    NoDaemon

    <feed>
    File /tmp/feed1.ffm
    FileMaxSize 5M
    </feed>

    <stream>
    Feed feed1.ffm
    Format swf
    VideoCodec flv
    VideoFrameRate 15
    VideoBufferSize 80000
    VideoBitRate 100
    VideoQMin 1
    VideoQMax 5
    VideoSize 352x288
    PreRoll 0
    Noaudio
    </stream>
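
    As an aside (hedged, since the page rendering seems to have stripped the section names): in a stock ffserver configuration the Feed and Stream headers carry a name, and the Stream name is the path clients request. A sketch of how those sections are normally written, with live.swf as a purely illustrative placeholder:

    # Hypothetical illustration, not the asker's actual file:
    # named Feed/Stream sections as in the stock ffserver sample config.
    <Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 5M
    </Feed>

    # Clients would then request http://192.168.1.105:9500/live.swf
    <Stream live.swf>
    Feed feed1.ffm
    Format swf
    NoAudio
    </Stream>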

    I then run ffserver and my ffmpeg command, which is:

    ffserver & ffmpeg -re -i rtsp://admin:12345@192.168.1.3/MPEG-4/ch1/main/av_stream  http://192.168.1.105:9500/feed1.ffm
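
    For clarity, that one-liner just backgrounds the server and then starts the encoder; the two steps can also be written out separately. And since the log below shows large runs of missed RTP packets, one hedged guess is UDP packet loss between the camera and the encoder, in which case forcing TCP transport for the RTSP input is a common thing to try:

    # start ffserver in the background with the config file above
    ffserver -f /etc/ffserver.config &

    # -rtsp_transport tcp asks for RTP over TCP, which often avoids the
    # "RTP: missed N packets" warnings seen in the log below
    ffmpeg -rtsp_transport tcp -re -i rtsp://admin:12345@192.168.1.3/MPEG-4/ch1/main/av_stream http://192.168.1.105:9500/feed1.ffm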

    The output I receive is as follows:

    ffmpeg version N-80901-gfebc862 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
     configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab

     libavutil      55. 28.100 / 55. 28.100
     libavcodec     57. 48.101 / 57. 48.101
     libavformat    57. 41.100 / 57. 41.100
     libavdevice    57.  0.102 / 57.  0.102
     libavfilter     6. 47.100 /  6. 47.100
     libavresample   3.  0.  0 /  3.  0.  0
     libswscale      4.  1.100 /  4.  1.100
     libswresample   2.  1.100 /  2.  1.100
     libpostproc    54.  0.100 / 54.  0.100

    [rtsp @ 0x20c4720] Missing PPS in sprop-parameter-sets, ignoring
    [h264 @ 0x20c7f60] non-existing PPS 0 referenced
       Last message repeated 1 times
    [h264 @ 0x20c7f60] decode_slice_header error
    [h264 @ 0x20c7f60] no frame!
    [rtsp @ 0x20c4720] RTP: missed 1137 packets
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1125 packets
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1126 packets

    Guessed Channel Layout for Input Stream #0.1 : mono
    Input #0, rtsp, from 'rtsp://admin:12345@192.168.1.3/MPEG-4/ch1/main/av_stream':
     Metadata:
       title           : HIK Media Server
       comment         : HIK Media Server Session Description : standard
     Duration: N/A, start: 0.000000, bitrate: N/A
       Stream #0:0: Video: h264 (Baseline), yuv420p, 352x288, 10 fps, 25 tbr, 90k tbn, 20 tbc
       Stream #0:1: Audio: pcm_mulaw, 8000 Hz, 1 channels, s16, 64 kb/s
    [ffm @ 0x21c0e80] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.

    Output #0, ffm, to 'http://192.168.1.105:9500/feed1.ffm':
     Metadata:
       title           : HIK Media Server
       comment         : HIK Media Server Session Description : standard
       creation_time   : now
       encoder         : Lavf57.41.100
       Stream #0:0: Video: flv1 (flv), yuv420p, 352x288, q=1-5, 100 kb/s, 10 fps, 1000k tbn, 15 tbc

       Metadata:
         encoder         : Lavc57.48.101 flv
       Side data:
         cpb: bitrate max/min/avg: 200000/0/100000 buffer size: 655360000 vbv_delay: -1

    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> flv1 (flv))
    Press [q] to stop, [?] for help
    frame=    0 fps=0.0 q=0.0 size=       4kB time=00:00:00.00 bitrate=N/A speed=   frame=    0 fps=0.0 q=0.0 size=       4kB time=00:00:00.00 bitrate=N/A speed=   frame=    0 fps=0.0 q=0.0 size=       4kB time=00:00:00.00 bitrate=N/A speed=   Past duration 1.003319 too large
       Last message repeated 1 times
    Past duration 1.005333 too large
    frame=   31 fps= 15 q=31.0 size=      56kB time=00:00:02.00 bitrate= 229.4kbits/Past duration 1.005653 too large
    Past duration 1.005989 too large
    frame=   37 fps= 15 q=24.8 size=      76kB time=00:00:02.40 bitrate= 259.4kbits/Past duration 1.006660 too large
    Past duration 1.006996 too large
    frame=   46 fps= 15 q=31.0 size=      80kB time=00:00:03.00 bitrate= 218.5kbits/Past duration 1.007988 too large
    Past duration 1.008659 too large
    frame=   53 fps= 15 q=31.0 size=      96kB time=00:00:03.46 bitrate= 226.9kbits/Past duration 1.009987 too large
    Past duration 1.010323 too large
    frame=   61 fps= 15 q=24.8 size=     116kB time=00:00:04.00 bitrate= 237.6kbits/Past duration 1.010994 too large
    Past duration 1.011330 too large
    Past duration 1.011986 too large
    frame=   68 fps= 15 q=31.0 size=     120kB time=00:00:04.46 bitrate= 220.1kbits/Past duration 1.012657 too large

    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1114 packets
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1115 packets
    Past duration 1.012993 too large
    frame=   76 fps= 15 q=31.0 size=     140kB time=00:00:05.00 bitrate= 229.4kbits/Past duration 1.013664 too large
    Past duration 1.014320 too large
    Past duration 1.014656 too large
    frame=   83 fps= 15 q=31.0 size=     144kB time=00:00:05.46 bitrate= 215.8kbits/Past duration 1.015327 too large
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1103 packets
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1104 packets

    Past duration 1.015999 too large
    frame=   91 fps= 15 q=31.0 size=     160kB time=00:00:06.00 bitrate= 218.5kbits/Past duration 1.016655 too large
    Past duration 1.017326 too large
    Past duration 1.017998 too large
    frame=   98 fps= 15 q=31.0 size=     180kB time=00:00:06.46 bitrate= 228.0kbits/Past duration 1.018654 too large
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1092 packets
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 1093 packets
    Past duration 1.019325 too large
    frame=  106 fps= 15 q=31.0 size=     184kB time=00:00:07.00 bitrate= 215.3kbits/Past duration 1.019997 too large
    [rtsp @ 0x20c4720] max delay reached. need to consume packet
    [rtsp @ 0x20c4720] RTP: missed 35 packets
    Past duration 1.020653 too large
    frame=  161 fps= 15 q=31.0 size=     264kB time=00:00:10.66 bitrate= 202.8kbits/Past duration 1.032661 too large
    Past duration 1.033333 too large
    frame=  167 fps= 15 q=31.0 Lsize=     276kB time=00:00:11.06 bitrate= 204.3kbits/s dup=94 drop=0 speed=0.964x
    video:265kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 4.280833%

    I can see that, with some errors, the stream starts.
    When I put the output URL in my HTML code, there is no stream. I also tried playing the network stream in VLC and did not get anything. Can anyone please help me with this? Any leads are appreciated. Thank you.
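
    One hedged guess, following on from the configuration note above: with ffserver, the feed URL (feed1.ffm) is only the ingest point, and the playable URL is the name of a Stream section, so the HTML page and VLC would need to point at something like http://192.168.1.105:9500/live.swf (placeholder stream name), not at the feed itself.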

    My HTML code is:

  • How to port signal() to sigaction()?

    28 September 2016, by Shark

    Due to recent problems discovered with NDK12 and NDK13b2, I'm thinking of 'porting' libx264's use of signal() (and the bsd_signal() missing in NDK12) to use sigaction() instead.

    The problem is, I'm not quite sure what the simplest and fastest way to replace the signal() calls with sigaction() ones is.

    As far as I can see, it's mainly used in x264-snapshot/common/cpu.c in the following manner:

    using the following signal handler:

    static void sigill_handler( int sig )
    {
       if( !canjump )
       {
           signal( sig, SIG_DFL );
           raise( sig );
       }

       canjump = 0;
       siglongjmp( jmpbuf, 1 );
    }

    This is the problematic x264_cpu_detect function... currently, I'm guessing I only need to tackle the ARM version, but I'll still have to replace all occurrences of signal() with sigaction(), so I might just cover both of them to get the thing building...

    FYI: the NDK13 beta2 still has an "unstable" libc, and the build doesn't fail on this part but rather on the first invocation of the rand() function somewhere else... So I'm out of luck, and replacing the signal() calls might be better than just waiting for the official NDK13 release. I'm doing this to get rid of text relocations so I can run the library (and doubango) on API 24 (Android N).

    The problematic part of the function that invokes signal():

    #elif SYS_LINUX

    uint32_t x264_cpu_detect( void )
    {
       static void (*oldsig)( int );

       oldsig = signal( SIGILL, sigill_handler );
       if( sigsetjmp( jmpbuf, 1 ) )
       {
           signal( SIGILL, oldsig );
           return 0;
       }

       canjump = 1;
       asm volatile( "mtspr 256, %0\n\t"
                     "vand 0, 0, 0\n\t"
                     :
                     : "r"(-1) );
       canjump = 0;

       signal( SIGILL, oldsig );

       return X264_CPU_ALTIVEC;
    }
    #endif

    #elif ARCH_ARM

    void x264_cpu_neon_test( void );
    int x264_cpu_fast_neon_mrc_test( void );

    uint32_t x264_cpu_detect( void )
    {
       int flags = 0;
    #if HAVE_ARMV6
       flags |= X264_CPU_ARMV6;

       // don't do this hack if compiled with -mfpu=neon
    #if !HAVE_NEON
       static void (* oldsig)( int );
       oldsig = signal( SIGILL, sigill_handler );
       if( sigsetjmp( jmpbuf, 1 ) )
       {
           signal( SIGILL, oldsig );
           return flags;
       }

       canjump = 1;
       x264_cpu_neon_test();
       canjump = 0;
       signal( SIGILL, oldsig );
    #endif

       flags |= X264_CPU_NEON;

       // fast neon -> arm (Cortex-A9) detection relies on user access to the
       // cycle counter; this assumes ARMv7 performance counters.
       // NEON requires at least ARMv7, ARMv8 may require changes here, but
       // hopefully this hacky detection method will have been replaced by then.
       // Note that there is potential for a race condition if another program or
       // x264 instance disables or reinits the counters while x264 is using them,
       // which may result in incorrect detection and the counters stuck enabled.
       // right now Apple does not seem to support performance counters for this test
    #ifndef __MACH__
       flags |= x264_cpu_fast_neon_mrc_test() ? X264_CPU_FAST_NEON_MRC : 0;
    #endif
    // TODO: write dual issue test? currently it's A8 (dual issue) vs. A9 (fast mrc)
    #endif
       return flags;
    }

    #else

    uint32_t x264_cpu_detect( void )
    {
       return 0;
    }

    So the question is really this: what would be the quickest/easiest/fastest way to replace the signal() calls with sigaction() ones while preserving the current functionality? (One possible approach is sketched below, after the edits.)

    EDIT:
    The reason I'm trying to get rid of signal() is these build errors:

    /home/devshark/SCRATCH/doubango/thirdparties/android/armv5te/lib/dist/libx264.a(cpu.o):cpu.c:function sigill_handler: error: undefined reference to 'bsd_signal'

    /home/devshark/SCRATCH/doubango/thirdparties/android/armv5te/lib/dist/libx264.a(cpu.o):cpu.c:function x264_cpu_detect: error: undefined reference to 'bsd_signal'
    /home/devshark/SCRATCH/doubango/thirdparties/android/armv5te/lib/dist/libx264.a(cpu.o):cpu.c:function x264_cpu_detect: error: undefined reference to 'bsd_signal'

    /home/devshark/SCRATCH/doubango/thirdparties/android/armv5te/lib/dist/libx264.a(cpu.o):cpu.c:function x264_cpu_detect: error: undefined reference to 'bsd_signal'

    I already know that this is a known NDK12 problem that might be solved by bringing bsd_signal back to the libc in NDK13. However, in its beta state, with its unstable libc, it's currently missing the rand() function, and simply waiting for it might not do the trick. But in the worst-case scenario, I guess I'll just have to wait for it and retry after its release.

    But as it currently stands, the prebuilt version of the library I want to use has text relocations and is being rejected by phones running a newer API level / version of the Android OS.

    EDIT2:
    I also know that signal() usually works by using sigaction() under the hood, but maybe I won't get bsd_signal-related build errors... since I suspect that this one isn't using it. It's obviously using bsd_signal, which may or may not be the same underlying thing :/
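
    For what it's worth, here is a minimal sketch of one way to do the replacement: a small wrapper around sigaction() that keeps the "install handler, get the old one back" shape the x264 code above relies on. This is an illustration only, not x264's code; install_handler is a hypothetical helper name, and its persistent-handler behaviour matches bsd_signal() rather than old SysV signal().

    #include <signal.h>
    #include <string.h>

    /* Hypothetical helper: install `handler` for `sig` via sigaction() and
     * return the previously installed handler, mimicking the return value
     * of signal() that x264_cpu_detect() depends on. */
    static void (*install_handler( int sig, void (*handler)(int) ))(int)
    {
        struct sigaction sa, old;
        memset( &sa, 0, sizeof(sa) );
        sa.sa_handler = handler;
        sigemptyset( &sa.sa_mask );
        sa.sa_flags = 0;   /* handler stays installed after delivery, like bsd_signal() */
        if( sigaction( sig, &sa, &old ) < 0 )
            return SIG_ERR;
        return old.sa_handler;
    }

    /* The call sites above would then read, e.g.:
     *     oldsig = install_handler( SIGILL, sigill_handler );
     *     ...
     *     install_handler( SIGILL, oldsig );
     * and inside sigill_handler():
     *     install_handler( sig, SIG_DFL );
     *     raise( sig );
     */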

  • How do I run two indefinite loops simultaneously, while also changing variables within them?

    11 October 2016, by Nyalmo

    I'm trying to write a script in Python that records a stream from an IP camera in real time. It only keeps about a minute's worth of recording, constantly overwriting the same files. Whenever an external sensor is triggered, I want a variable (the event variable) to be set, which merges the recording with an extra 30 seconds recorded after the sensor is triggered. The combined 90 seconds are then saved under the date and time for later review.

    The idea was to have two indefinite while loops, the first containing both the real-time recording and the event. The second one would constantly read input and activate the 'event' function. Initially I thought I could just have a software version of the hardware interrupts I've worked with before, though after some research it seems that's only for exceptions. I'm currently using Tkinter, simulating the external input with keypresses.

    When I tried it out, the input wouldn't cause the event to be triggered. So my question is: how do I run the two indefinite loops simultaneously, while also having the input loop change variables in the main loop so that the 'event' can actually occur in real time?

    Since I'm using ffmpeg to record the stream, once the command is called to record it can't be stopped, but I want the event variable to be changed as soon as possible.

    I've looked at several similar questions regarding multiple loops, and have tried multiprocessing (though this only seems to be used for performance, which is not that important here), making two separate files (not sure how to have them work together) and, lastly, threads. None of these seem to work in my situation, as I can't get them running in the way that I want.

    Here is my latest attempt, using threads:

    i = 0
    event = False
    aboutToQuit = False
    someVar = 'Event Deactivated'
    lastVar = False

    def process(frame):
       print "Thread"
       i = 0    
       frame = frame
       frame.bind("<space>", switch)
       frame.bind("<escape>", exit)
       frame.pack()
       frame.focus_set()

    def switch(eventl):
       print(['Activating','Deactivating'][event])
       event = eventl
       event = not(event)

    def exit(eventl):
       print'Exiting Application'
       global aboutToQuit
       #aboutToQuit = True
       root.destroy()

    print("the imported file is", tkinter.__file__)
    def GetTime(): #function opens a script which saves the final merged file as date and time.
       time = datetime.datetime.now()
       subprocess.call("./GetTime.sh", shell = True)
       return (time)

    def main(root):
       global event, aboutToQuit, someVar,lastVar      
       while (not aboutToQuit):
           root.update() # always process new events

           if event == False:
               someVar = 'Event Deactivated'
               subprocess.call(Last30S_Command) #records last 30 seconds overwriting itself.
               print "Merge now"
               subprocess.call(Merge_Command) #merges last 30 seconds with the old 30 seconds      
               print "Shift now"
               subprocess.call(Shift_Command) #copies the last30s recording to the old 30 seconds, overwriting it.
               if lastVar == True: #Triggers only when lastVar state changes
                   print someVar
                   lastVar = False
               time.sleep(.1)

       if event == True:
           someVar = 'Event Activated'
           print "Record Event"
           subprocess.call(EventRecord_Command) #records 30 seconds after event is triggered.
           print "EventMerge Now"
           subprocess.call(EventMerge_Command) # merges the 1 minute recording of Merge_Command with 30 seconds of EventRecord_Command
           if lastVar == False:
               print someVar
               lastVar = True
           time.sleep(.1)
           GetTime() #Saves 90 seconds of EventMerge_Command as date and time.
           subprocess.call(EventShift_Command) #Copies EventRecord file to the old 30 second recording, overwriting it

           if aboutToQuit:
              break


    if __name__ == "__main__":
       root = Tk()
       frame = Frame(root, width=100, height=100)  
    #   maintthread = threading.Thread(target=main(root))
    #   inputthread = threading.Thread(target=process(frame))
    #   inputthread.daemon = True
    #   inputthread.start()
    #   maintthread.daemon = True
    #   maintthread.start()
    #   maintthread.join()
       Process(target=process(frame)).start()
       Process(target=main(root)).start()
       print "MainLoop"
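
    As a hedged aside on the question itself: one common pattern is to keep Tkinter's mainloop on the main thread and move the blocking record/merge loop into a worker thread, sharing state through a threading.Event instead of a bare global (rebinding a global inside the handler without declaring it global is one reason the switch above never takes effect). A minimal Python 3 sketch, with the subprocess calls replaced by print placeholders standing in for Last30S_Command, EventRecord_Command and friends:

    import threading
    import time
    import tkinter as tk

    stop_flag = threading.Event()    # set when the application is closing
    event_flag = threading.Event()   # set while the sensor "event" is active

    def recorder():
        """Worker thread: loops forever, picking a branch from event_flag."""
        while not stop_flag.is_set():
            if event_flag.is_set():
                print("event recording pass")           # placeholder for EventRecord/EventMerge/GetTime
            else:
                print("rolling-buffer recording pass")  # placeholder for Last30S/Merge/Shift
            time.sleep(0.1)

    def toggle(_tk_event):
        # flip the shared flag; an Event object is safe to read from the worker thread
        if event_flag.is_set():
            event_flag.clear()
            print("Event Deactivated")
        else:
            event_flag.set()
            print("Event Activated")

    root = tk.Tk()
    root.bind("<space>", toggle)
    root.bind("<Escape>", lambda e: (stop_flag.set(), root.destroy()))

    # pass the function itself; calling it here (target=recorder()) would run it
    # immediately and hand Thread a None target, as in the Process(...) lines above
    worker = threading.Thread(target=recorder, daemon=True)
    worker.start()
    root.mainloop()   # Tk stays on the main thread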