Media (0)

No media matching your criteria is available on the site.

Other articles (54)

  • Customize by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Write a news item

    21 June 2013

    Present the changes to your MediaSPIP, or news about your projects, using the news section of your MediaSPIP.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form
    For a document of the "news item" type, the fields offered by default are: Publication date (customize the publication date) (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

On other sites (8437)

  • TV audio extracted using ffmpeg does not work on iOS (but it works in the simulator)

    26 January 2014, by Genar

    I can process a TV signal (well, I have a .ts video that comes from a TV channel) using ffmpeg, but the audio cannot be understood on an iPhone/iPad. The strangest part is that the audio (and video) works properly in the simulator (and also on a real Android device, but that is another matter), whereas on a real iPhone/iPad device the video is OK but the audio sounds as if it came from a metallic box and nothing can be understood.

    I have created the ffmpeg libraries for iOS (I have tried version 2.0.2 and also version 2.1.3) using the information provided in the following link:

    Installing ffmpeg ios libraries armv7, armv7s, i386 and universal on Mac with 10.8

    The aforementioned link explains how to create the ffmpeg include folders and the universal libraries, which I have included in my project (the libraries created are: libavcodec.a, libavdevice.a, libavfilter.a, libavformat.a, libavresample.a, libavutil.a, libswresample.a and libswscale.a).
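
    As a rough illustration of the "universal libraries" step (the file and directory names below are placeholders only; the real ones depend on how the build script lays things out), the per-architecture static libraries are merged into one fat library with lipo, e.g. for libavcodec:

      # Merge the per-architecture builds into a single universal (fat) library.
      lipo -create libavcodec-armv7.a libavcodec-armv7s.a libavcodec-i386.a \
           -output universal/libavcodec.a
      # Check which architectures the fat library contains.
      lipo -info universal/libavcodec.a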

    The sampling frequency used is 48000 Hz.

    The audio obtained from ffmpeg is then stored in a buffer and "inserted" using OpenAL; but, from the same TV content (the same .ts video file), the audio data generated by ffmpeg on an iPhone/iPad is totally different from the audio data generated by ffmpeg in the simulator (which reproduces both the audio and the video perfectly).
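
    One way to sanity-check this (sketched here with placeholder file names) is to decode the same .ts file to raw, interleaved signed 16-bit PCM with the ffmpeg command-line tool on a desktop machine and compare that reference against the buffers the app hands to OpenAL on the device; a planar/interleaved or sample-format mismatch at this stage often produces exactly this kind of distorted, unintelligible audio:

      # Extract a known-good PCM reference at the 48000 Hz rate mentioned above.
      # channel.ts and reference.pcm are placeholder names.
      ffmpeg -i channel.ts -vn -acodec pcm_s16le -ar 48000 -ac 2 -f s16le reference.pcm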

    Thanks in advance,

  • Concatenate mp4 videos with the ffmpeg concat demuxer on Android

    6 May 2016, by Ara Badalyan

    I want to concatenate mp4 videos with ffmpeg. The problem is that when I try to merge videos taken with an iPhone and an Android device, ffmpeg reports:

    " Non-monotonous DTS in output stream 0:1 ; previous : 150528, current : 139268 ; changing to 150529. This may result in incorrect timestamps in the output file."

    This is my code

    merge.txt

    file 'iphone.mp4'
    file 'android.mp4'

    ffmpeg command

    ffmpeg -f concat -i merge.txt -c copy -y merge.mp4

    If I can't merge these videos as they are, how can I re-encode them with the same parameters (frame rate, bitrate, ...) and then merge them?

    P.S. I use ffmpeg version 2.4.2, because I can't find an Android ffmpeg library higher than 2.4.2.
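
    One possible approach (sketched below with example parameters and placeholder output names, none of which come from the question) is to first re-encode both clips to identical video and audio settings and only then run the concat demuxer with stream copy. Note that with ffmpeg 2.4.x the built-in AAC encoder still has to be enabled with -strict experimental:

      # Re-encode both inputs to the same resolution, frame rate, codecs,
      # sample rate and channel count (the values are examples; adjust as needed).
      ffmpeg -i iphone.mp4 -vf scale=1280:720 -r 30 -c:v libx264 -b:v 2M \
             -c:a aac -strict experimental -ar 44100 -ac 2 -y norm_iphone.mp4
      ffmpeg -i android.mp4 -vf scale=1280:720 -r 30 -c:v libx264 -b:v 2M \
             -c:a aac -strict experimental -ar 44100 -ac 2 -y norm_android.mp4

      # merge.txt now lists the normalised files:
      # file 'norm_iphone.mp4'
      # file 'norm_android.mp4'
      ffmpeg -f concat -i merge.txt -c copy -y merged.mp4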

  • Use deck.js as a remote presentation tool

    8 January 2014, by silvia

    deck.js is one of the new HTML5-based presentation tools. It's simple to use, in particular for your basic, everyday presentation needs. You can also create more complex slides with animations etc. if you know your HTML and CSS.

    Yesterday at linux.conf.au (LCA), I gave a presentation using deck.js. But I didn’t give it from the lectern in the room in Perth where LCA is being held – instead I gave it from the comfort of my home office at the other end of the country.

    I used my laptop with in-built webcam and my Chrome browser to give this presentation. Beforehand, I had uploaded the presentation to a Web server and shared the link with the organiser of my speaker track, who was on site in Perth and had set up his laptop in the same fashion as myself. His screen was projecting the Chrome tab in which my slides were loaded and he had hooked up the audio output of his laptop to the room speaker system. His camera was pointed at the audience so I could see their reaction.

    I loaded a slide master URL:
    http://html5videoguide.net/presentations/lca_2014_webrtc/?master
    and the room loaded the URL without the query string:
    http://html5videoguide.net/presentations/lca_2014_webrtc/.

    Then I gave my talk exactly as I would if I was in the same room. Yes, it felt exactly as though I was there, including nervousness and audience feedback.

    How did we do that? WebRTC (Web Real-time Communication) to the rescue, of course!

    We used one of the modules of the rtc.io project called rtc-glue to add the video conferencing functionality and the slide navigation to deck.js. It was actually really, really simple!

    Here are the few things we added to deck.js to make it work:

    • Code added to index.html to make the video connection work:
      <meta name="rtc-signalhost" content="http://rtc.io/switchboard/">
      <meta name="rtc-room" content="lca2014">
      ...
      <video id="localV" rtc-capture="camera" muted></video>
      <video id="peerV" rtc-peer rtc-stream="localV"></video>
      ...
      <script src="glue.js"></script>
      <script>
      glue.config.iceServers = [{ url: 'stun:stun.l.google.com:19302' }];
      </script>

      The iceServers config is required to punch through firewalls – you may also need a TURN server. Note that you need a signalling server – in our case we used http://rtc.io/switchboard/, which runs the code from rtc-switchboard.

    • Added glue.js library to deck.js:

      Downloaded from https://raw.github.com/rtc-io/rtc-glue/master/dist/glue.js into the source directory of deck.js.

    • Code added to index.html to synchronize slide navigation:
      glue.events.once('connected', function(signaller) {
       if (location.search.slice(1) !== '') {
         $(document).bind('deck.change', function(evt, from, to) {
           signaller.send('/slide', {
             idx: to,
             sender: signaller.id
           });
         });
       }
       signaller.on('slide', function(data) {
         console.log('received notification to change to slide: ', data.idx);
         $.deck('go', data.idx);
       });
      });

      This simply registers a callback on the slide master end to send a slide position message to the room end, and a callback on the room end that initiates the slide navigation.

    And that's it!

    You can find my slide deck on GitHub.

    Feel free to write your own slides in this manner – I would love to have more users of this approach. It should also be fairly simple to extend this to share pointer positions, so you can actually use the mouse pointer to point to things on your slides remotely. Would love to hear your experiences!

    Note that the slides are actually a talk about the rtc.io project, so if you want to find out more about these modules and what other things you can do, read the slide deck or watch the talk when it has been published by LCA.

    Many thanks to Damon Oehlman for his help in getting this working.

    BTW: somebody should really fix that print style sheet for deck.js – I'm only ever getting the one slide that is currently showing. ;-)