Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (26)

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; and translations of existing documentation into other languages.
    To contribute, register to the project users’ mailing (...)

On other sites (4599)

  • Using ffmpeg on RTOS

    7 April 2015, by Dhiraj

    I am trying to capture video and audio from a webcam and stream it wirelessly through a software-defined radio. Essentially, I need to packetize the video stream so that it is suitable for the transport layer implemented in INTEGRITY OS running on an ARM processor. While I am able to capture the video and transmit it wirelessly, on the receiving end, when I try to view the video using ffplay, the quality is very poor: ugly green patches and video tearing. Do pardon my ignorance, but ffmpeg is not my forte. This is how I am sending the video:

    ffmpeg -rtbufsize 1500M -f dshow -i video="Vimicro USB Camera (Altair)":audio="Microphone (Realtek High Definition Audio)" -r 10 -vcodec libx264 -threads 0 -crf 23 -preset ultrafast -tune zerolatency -acodec libmp3lame -b 600k -flush_packets 0 -f mpegts udp://192.9.200.254:8000?pkt_size=1128

    On the receiving end, I run ffplay with the following command:

    ffplay udp://192.9.200.69:8000

    Importantly, the video from the USB camera is sent over Ethernet to an ARM processor running INTEGRITY RTOS. The transport layer of the software-defined radio is implemented in the RTOS. This is where the video data is multiplexed with other application data transmitted through the SDR, hence the stringent requirement on the packet size (1128 bytes). From the ARM processor, the data packet is sent on to a DSP where the network and DLL layers are implemented, and finally on to an FPGA where the PHY layer is implemented.

    Apart from ffplay, I have also tried mplayer; however, the video output is equally bad.
    Any help would be greatly appreciated.
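
    For reference, 1128 bytes is exactly six 188-byte MPEG-TS packets, so the pkt_size=1128 in the sender command already aligns each UDP datagram with TS packet boundaries. Below is a minimal receiver-side sketch; the address and port are taken from the question, while the low-latency flags are an assumption and may or may not help on this particular SDR link:

    # Sketch: force the MPEG-TS demuxer and reduce input buffering on the player side.
    # -fflags nobuffer and -flags low_delay are standard ffplay/libavformat options;
    # whether they improve the green patches and tearing here is an assumption.
    ffplay -fflags nobuffer -flags low_delay -f mpegts udp://192.9.200.69:8000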

  • avformat/movenc: allow writing out channel count in MP4 and 3GP

    7 February 2023, by Jan Ekström

    ISOBMFF (14496-12) made this field ('channelcount') in the
    AudioSampleEntry structure non-template¹ somewhere before the
    release of the 2022 edition. As for ETSI TS 126 244 AKA 3GPP
    file format (V16.1.0, 2020-10), it does not seem to contain any
    references limiting the channelcount entry in AudioSampleEntry
    or in its own definition of EVSSampleEntry.

    The fate-mov-mp4-chapters test had to be adjusted, as it outputs a mono
    Vorbis stream, which will now be properly marked as such in the
    container.

    1: As per 14496-12:
    Fields shown as “template” in the box descriptions are fields
    which are coded with a default value unless a derived
    specification defines their use and permits writers to use
    other values than the default.

    • [DH] libavformat/movenc.c
    • [DH] tests/ref/fate/mov-mp4-chapters
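
    One indirect way to check the behaviour this patch touches is to inspect a file written by the muxer; a minimal sketch, assuming a file named output.mp4 produced by the patched build (ffprobe reports the demuxed stream parameters rather than dumping the raw AudioSampleEntry box, so this is only an indirect check):

    # Print the channel count ffprobe sees for the first audio stream.
    # "output.mp4" is a placeholder file name, not part of the patch or its tests.
    ffprobe -v error -select_streams a:0 -show_entries stream=channels -of csv=p=0 output.mp4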
  • Playing fragmented mp4 doesn’t continue playing

    18 May 2017, by adviner

    Using ffmpeg I create an mp4 using my video camera as the source.

    ffmpeg -f dshow -i video="Integrated Webcam":audio="Microphone (Realtek High Definition Audio)"^
    -g 52^
    -vcodec libx264 -pix_fmt yuv420p -profile:v baseline -level 3^
    -f mp4 -movflags empty_moov+default_base_moof+frag_keyframe^
    %OUTPUT%\video.mp4

    Works with IE11, Chrome and Firefox

    And my HTML video tag:

    <video controls="controls" autoplay="autoplay" style="width:640px;height:360px;">
     <source src="http://localhost/video.mp4" type='video/mp4;codecs="avc1.42E01E, mp4a.40.2"'>
    </video>

    The node web server version just has src="http://localhost/" for the source element it is using.

    I’ve tried nginx and a node version that I got from this site.
    The nginx setup doesn’t do anything special; I just point the root to the folder so it sees the mp4.

    The problem is that it only plays as much of the file as the web server had seen when the page was loaded. The file is still continuously growing, and if I refresh the page I can see that the duration is now longer than it was the last time.

    My question is: how can I make the video tag continuously play the fragmented video data without stopping and refreshing the page (which starts playback from the beginning again)?
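
    For reference, here is the capture command from the question restated with Unix-style line continuations and comments on what each fragmentation flag does; the device names are copied from the question, the %OUTPUT% variable is replaced by a plain video.mp4, and only the layout and comments are added:

    ffmpeg -f dshow -i video="Integrated Webcam":audio="Microphone (Realtek High Definition Audio)" \
      -g 52 \
      -vcodec libx264 -pix_fmt yuv420p -profile:v baseline -level 3 \
      -f mp4 -movflags empty_moov+default_base_moof+frag_keyframe \
      video.mp4
    # empty_moov        : write an initial moov box that describes no samples, so the
    #                     file can be consumed before the capture has finished
    # default_base_moof : write self-contained fragments (default-base-is-moof), the
    #                     form the Media Source Extensions byte-stream format expects
    # frag_keyframe     : start a new fragment at every video keyframe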