Other articles (111)

  • Automatic installation script for MediaSPIP

    25 April 2011

    To work around installation difficulties caused mainly by server-side software dependencies, an "all-in-one" installation script written in bash was created to make this step easier on a server running a compatible Linux distribution.
    You need SSH access to your server and a "root" account in order to use it, which makes it possible to install the dependencies. Contact your hosting provider if you do not have these.
    The documentation for using the installation script (...)
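    As a very rough sketch of that prerequisite, the steps below show what logging in and launching such a script could look like; the host name and script file name are placeholders, not the actual MediaSPIP installer.

    # Sketch only: connect to the server as root over SSH, then run the installer.
    # "example.org" and "mediaspip_install.sh" are hypothetical placeholders.
    ssh root@example.org
    bash mediaspip_install.sh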

  • What exactly does this script do?

    18 January 2011

    This script is written in bash, so it is easy to use on any server.
    It is only compatible with a specific list of distributions (see the list of compatible distributions).
    Installation of MediaSPIP dependencies
    Its main role is to install all of the software dependencies required on the server side, namely:
    The basic tools needed to install the rest of the dependencies; the development tools: build-essential (via APT from the official repositories); (...)
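    As a minimal illustration of that dependency step (a sketch, not the script's actual package list), installing the base development tools via APT from the official repositories could look like this:

    # Run as root on a Debian/Ubuntu-style distribution (assumption);
    # build-essential is the only package named in the excerpt above.
    apt-get update
    apt-get install -y build-essential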

  • Automated installation script of MediaSPIP

    25 April 2011

    To overcome difficulties caused mainly by server-side software dependencies, an "all-in-one" installation script written in bash was created to make this step easier on a server running a compatible Linux distribution.
    You must have SSH access to your server and a root account in order to use it, which will allow the dependencies to be installed. Contact your hosting provider if you do not have these.
    The documentation on how to use this installation script is available here.
    The code of this (...)

On other sites (10783)

  • Is it possible to play two videos as one like this? [on hold]

    25 February 2015, by Marko

    Is it possible to show/play a video online that’s made of two or more video files? Here’s an explanation.

    My site is hosted on a Linux/Apache/PHP server. I have video files in FLV/F4V format.

    What I want is an online video player that plays a video composed of multiple video files concatenated together in real time, i.e. when the user clicks to watch a video.

    The resulting video should look like one video, with no visual cues, lags or delays between the parts. Basically, some form of on-the-fly or pre-editing is done, and the user sees the result.

    Is this possible somehow with Flash, ActionScript, ffmpeg, PHP, HTML or some other web technology? I don’t need an explanation of how it’s possible, just a nod that it is and some links to investigate further.
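    One way such server-side concatenation can be approached (a sketch under assumptions, not necessarily the asker's eventual solution) is ffmpeg's concat demuxer, triggered for example from a PHP script when the user clicks play; the file names below are placeholders, and lossless stream copy only works if every part shares the same codec parameters.

    # Build a list of the parts, then join them without re-encoding.
    printf "file '%s'\n" part1.flv part2.flv part3.flv > playlist.txt
    ffmpeg -f concat -safe 0 -i playlist.txt -c copy combined.flv

    The resulting file plays as a single clip, which is the "no visible seam" effect described above; it could also be generated ahead of time instead of on demand.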

  • Expanding media capabilities of Win Embedded CE 6.0

    1 December 2014, by Simo Erkinheimo

    I have an embedded device with WinCE 6.0 as its OS. The manufacturer provides an IDE for third-party development on it. The IDE pretty much allows nothing other than:

    • .NET 3.5 Compact Framework scripting that’s invoked from various events from the main application
    • Adding files to the device.

    The included media player seems to use DirectShow, and the OS has a media codec only for MPEG-1 encoded video playback. My goal is to be able to play media encoded with some other codecs as well inside that main application.

    I’ve already managed to use DirectShowNETCF (a DirectShow wrapper for the .NET Compact Framework) and successfully play back MPEG-1 encoded video.

    I’m totally new to this stuff and I have tons of (stupid) questions. I’ll try to narrow them down:

    • The OS is based on WinCE, but as far as I’ve understood, it’s actually always some customized version of it (built with Platform Builder). The only "correct" way of developing anything for it afterwards is to use the SDK the manufacturer usually provides. Right? In my case, the SDK is extremely limited and tightly integrated into the IDE as noted above. However, .NET CF 3.5 is capable of interop, so it’s possible to call native libraries, as long as they are compiled for the correct platform.

    • Compiled code is pretty much just instructions for the processor (assembly code), and the compiler chooses the correct instructions based on the target processor setting. There’s also the PE header, which defines which platform the program is meant to run on. If I target my "helloworld.exe" (which does nothing but return a specific exit code) at x86 and compile it with VC, should it work? (Checking what a binary actually targets is sketched after this list.)

    • If the PE header is in fact the problem, is it possible to set things up for WinCE without the SDK? Do I REALLY need the whole SDK to create a simple executable that uses only base types? I’m using VS2010, which no longer supports smart device development, and I’d hate to downgrade just for testing purposes.

    • The above questions are a prequel to my actual idea: porting ffmpeg/ffdshow to WinCE. This already exists, but is neither targeted at nor built for Intel Atom. Comments?

    • If a native implementation is not possible and I end up implementing some specific codec in C#... well, that would probably be quite a massive task. But if I had to choose C# over native code, could I run into problems with codec performance? I mean, is C# THAT much slower?
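    As a side note on the PE-header bullet above, here is a minimal sketch of how to check which machine type a compiled executable actually targets by reading its headers with common tools; the file name is a placeholder, and PE support in objdump depends on how binutils was built.

    # Read the PE header of the test binary (placeholder name).
    file helloworld.exe        # e.g. "PE32 executable (console) Intel 80386, for MS Windows"
    objdump -f helloworld.exe  # e.g. "file format pei-i386"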

    Thank you.

  • SegmentIndexBox (SIDX) not generated when using WebM over DASH

    11 July 2014, by Flock Dawson

    I’m trying to get the Industry Format DASH player to work with WebM audio/video files. However, I keep running into the same error again and again, and Google doesn’t seem to be much help.

    To start with, I created different streams of the same file (different resolutions and bitrates) using this tutorial: https://developer.mozilla.org/en-US/docs/Web/HTML/DASH_Adaptive_Streaming_for_HTML_5_Video

    Then, I downloaded the Industry Format DASH player (http://dashif.org/software/) and pointed it at the DASH manifest I created. When I try to play the video in Chrome, I get the following log:

    Parsing complete: ( xml2json: 3ms, objectiron: 2ms, total: 0.005s) dash.all.js:3
    Manifest has loaded. dash.all.js:3
    MediaSource is open! dash.all.js:3
    Event {clipboardData: undefined, path: NodeList[0], cancelBubble: false, returnValue: true, srcElement: MediaSource…} dash.all.js:3
    Video codec: video/webm;codecs="vp8" dash.all.js:3
    No text tracks. dash.all.js:3
    Audio codec: audio/webm;codecs="vorbis" dash.all.js:3
    Duration successfully set to: 27.2 dash.all.js:3
    Perform SIDX load: https://*****/street_orig_125k_final.webm dash.all.js:3
    Perform SIDX load: https://*****/street_audio_final.webm dash.all.js:3
    Uncaught RangeError: Offset is outside the bounds of the DataView

    From this log, I gather that the manifest is fetched and processed correctly, but something goes wrong when trying to process the SIDX (SegmentIndexBox). I tried another (third-party) source, which works perfectly. I analyzed the response returned by the server when fetching its SIDX, and when converted to a readable representation, the text ’Dsidx’ can be found in that response. So I analyzed the WebM file I provide (hexdump and grep), but I cannot find such a SIDX. My conclusion is that the SIDX is never added to the WebM file.

    From the tutorial I used, I guess the generation of the SIDX is handled by the samplemuxer command, which does not offer any additional parameters. Does anyone have more experience with generating this SIDX?
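    In case it helps others hitting the same wall, ffmpeg offers an alternative route to samplemuxer for producing DASH-ready WebM files plus a manifest; this is only a sketch using ffmpeg's documented webm and webm_dash_manifest muxer options, with placeholder file names and bitrates.

    # 1. Write each stream as a DASH-conformant WebM file (VP8 video, Vorbis audio).
    ffmpeg -i street_orig.webm -c:v libvpx -b:v 125k -an -f webm -dash 1 video_125k_dash.webm
    ffmpeg -i street_orig.webm -vn -c:a libvorbis -f webm -dash 1 audio_dash.webm

    # 2. Generate the DASH manifest that points the player at each file's index.
    ffmpeg -f webm_dash_manifest -i video_125k_dash.webm \
           -f webm_dash_manifest -i audio_dash.webm \
           -c copy -map 0 -map 1 \
           -f webm_dash_manifest \
           -adaptation_sets "id=0,streams=0 id=1,streams=1" \
           manifest.mpd

    A WebM file does not contain an ISO-BMFF 'sidx' box; the index the player fetches in the "Perform SIDX load" step should instead resolve, via the manifest's index range, to the file's Cues element, which the commands above are meant to produce and reference.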