Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (62)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)
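
    A minimal sketch of what that queue table could look like, reconstructed only from the four fields named above; the database name, credentials, column types and sizes are assumptions, not SPIPmotion's actual schema:

       # hypothetical reconstruction of the SPIPmotion queue table
       # (database "spip", user "spip" and all column types are guesses)
       mysql -u spip -p spip -e "
       CREATE TABLE spip_spipmotion_attentes (
         id_spipmotion_attente BIGINT NOT NULL AUTO_INCREMENT, -- unique id of the encoding task
         id_document           BIGINT NOT NULL,                -- original document to encode
         id_objet              BIGINT NOT NULL,                -- object the encoded file is attached to
         objet                 VARCHAR(25) NOT NULL,           -- type of that object
         PRIMARY KEY (id_spipmotion_attente)
       );"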

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation from users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; and translations of existing documentation into other languages.
    To contribute, register for the project users’ mailing (...)

On other sites (6443)

  • dashenc : Write out DASH manifest immediately in streaming mode

    8 June 2021, by Kevin LaFlamme
    dashenc : Write out DASH manifest immediately in streaming mode
    

    When streaming mode is enabled with fMP4/CMAF for DASH output, the
    segment files are available to read by players as soon as the first byte
    is written instead of only after the file is fully written. The DASH
    manifest currently only gets written when the final write to the segment
    file occurs. This means that players cannot stream the first segment
    while it is being written.

    When -lhls is enabled with MP4 segments the HLS manifest is written
    immediately to advertise the in-flight segments. This change adds the
    same behavior for the DASH manifest so players can stream it
    immediately.

    • [DH] libavformat/dashenc.c
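
    As a hedged illustration of the behaviour this change affects (all input, codec and duration settings below are placeholders, not taken from the commit): with the dash muxer's streaming mode enabled, segment files are written incrementally, and the .mpd manifest is now published as soon as a segment is opened rather than only when it is finalized.

       # sketch of a streaming-mode DASH invocation; players can fetch out.mpd
       # and the in-flight segments while they are still being written
       ffmpeg -re -i input.mp4 \
         -c:v libx264 -preset veryfast -tune zerolatency -c:a aac \
         -f dash -streaming 1 \
         -seg_duration 2 -use_template 1 -use_timeline 0 -window_size 5 \
         out.mpd
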
  • Linux ffmpeg live streaming cannot connect to another PC

    9 June 2019, by Tae-Yeong

    I set up the ffserver and ffmpeg streaming commands in the Linux console:

    ffserver => ffserver -f ./ffserver.conf (terminal 1)

    ffmpeg => ffmpeg -r 15 -s 640x480 -f video4linux2 -i /dev/video0 http://mylocalhostip:8090/feed1.ffm (terminal 2)

    It works on localhost, but I cannot connect from another PC or a mobile device. The firewall is disabled.

    My desktop and mobile have a normal Internet connection.

    I want to live-stream to another PC or a mobile device. How can I do this?

    Hardware = LattePanda
    OS = Ubuntu 16.04

    ffserver.conf:

       HTTPPort 8090

       # bind to all IPs, aliased or not
       HTTPBindAddress 0.0.0.0

       # max number of simultaneous clients
       MaxClients 1000

       # max bandwidth per client (kb/s)
       MaxBandwidth 10000

       # Suppress that if you want to launch ffserver as a daemon.
       NoDaemon

       <Feed feed1.ffm>
       File /tmp/feed1.ffm
       FileMaxSize 100M
       </Feed>

       <Stream>
       Feed feed1.ffm
       Format swf
       VideoCodec flv
       VideoFrameRate 15
       VideoBufferSize 80000
       VideoBitRate 100
       VideoQMin 1
       VideoQMax 5
       VideoSize 640x480
       PreRoll 0
       Noaudio
       </Stream>

       <Stream>
       Format status
       ACL allow localhost
       # my desktop IP range
       ACL allow 121.172.0.0 121.172.0.255
       </Stream>

    I expect to see my webcam's output, but instead the connection times out.

    My desktop OS is Windows 10 and the mobile OS is Android.
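
    A rough way to check this from the second PC, assuming SERVER_IP stands for the LattePanda's LAN address and test.swf is only a placeholder for whatever name the <Stream> block above actually declares:

       # from the other PC: is ffserver's HTTP port reachable at all?
       nc -vz SERVER_IP 8090
       # if it is, open the SWF/FLV stream by its published name
       ffplay http://SERVER_IP:8090/test.swf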

  • How to implement an HTTP Live Streaming server on Unix?

    5 September 2012, by alex

    I just realized that Apple requires HTTP Live Streaming in order to view videos in iPhone apps. I was not aware of this before... I am now trying to understand what this involves so I can decide whether I want to do the work and make the videos available over 3G, or limit video playback to users who are connected to Wi-Fi.

    I read the overview provided by Apple, and now understand that my server needs to segment and index my media files. I also understand that I don't have to host the content to be able to stream it (I can point to a video hosted somewhere else, right?).

    What's not clear to me at this point is what to implement on my server (Ubuntu Hardy) to do the actual segmenting and indexing on the fly (once again, I do not host the videos I want to serve).

    I found a link explaining how to install FFmpeg and x264, but I don't know if this is the best solution (since I have an Ubuntu server, I can't use the Apple Live Streaming tools, is that correct?). Also, I do not understand at which point my server knows that a video needs to be converted and starts the job...

    Any feedback that could help me understand exactly what to do on the server side to be able to stream videos to my iPhone app over 3G would be greatly appreciated! (Oh, and in case it makes any difference, my app back-end is in Rails.)
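
    One possible approach on an Ubuntu server, assuming an ffmpeg build with libx264 and AAC support: let ffmpeg do the segmenting and playlist generation, then serve the resulting files as static content from Apache or nginx. File names and segment durations below are placeholders:

       # split source.mp4 into ~10-second MPEG-TS segments plus an M3U8 playlist
       ffmpeg -i source.mp4 \
         -c:v libx264 -profile:v baseline -level 3.0 -c:a aac -b:a 128k \
         -f hls -hls_time 10 -hls_list_size 0 \
         -hls_segment_filename 'segment_%03d.ts' \
         playlist.m3u8
       # upload playlist.m3u8 and the segment_*.ts files to any web server;
       # the iOS player only needs the URL of the playlist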