Advanced search

Media (0)

Keyword: - Tags - / logo

No media matching your criteria is available on this site.

Other articles (64)

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that led to the problem; and a link to the site / page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Depositing media and themes via FTP

    31 May 2013, by

    MediaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
    From the start, you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the site's cache directory; themes/: custom themes and stylesheets; tmp/: working folder (...)

On other sites (12000)

  • lavf/http: handle case where the server returns a redirect during a seek

    13 April 2015, by Rodger Combs
    lavf/http: handle case where the server returns a redirect during a seek
    

    txoffer (e.g. http://tori.aoi-chan.com/) redirects to the same URI on your
    first request, and serves the actual file on the second. It's stupid, but AFAIK
    technically compliant. We'd previously see the server not handing back a Range
    header and return an error; now, instead, we see that there's a redirect and
    keep track of the offset we want while trying again at the new URL.

    Signed-off-by: Michael Niedermayer <michaelni@gmx.at>

    • [DH] libavformat/http.c
  • Revert "avformat/mp3dec : offset seek index to end of id3v2 tag"

    14 avril 2015, par wm4
    Revert "avformat/mp3dec : offset seek index to end of id3v2 tag"
    

    This reverts commit 8b76c0eb561b0313e2a27950fe9d2bc5e4780dd8.

    It was slightly incorrect; the next commit fixes it.

    Signed-off-by: Michael Niedermayer <michaelni@gmx.at>

    • [DH] libavformat/mp3dec.c
  • How to play and seek fragmented MP4 audio using MSE SourceBuffer?

    29 June 2024, by Stefan Falk

    Note:

    If you end up here, you might want to take a look at shaka-player and the accompanying shaka-streamer. Use it. Don't implement this yourself unless you really have to.

    I have been trying for quite some time now to play an audio track on Chrome, Firefox, Safari, etc., but I keep hitting brick walls. My current problem is that I am just not able to seek within a fragmented MP4 (or MP3).

    At the moment I am converting audio files such as MP3 to fragmented MP4 (fMP4) and sending them chunk-wise to the client. What I do is define a CHUNK_DURATION_SEC (chunk duration in seconds) and compute a chunk size like this:

    chunksTotal = Math.ceil(this.track.duration / CHUNK_DURATION_SEC);
    chunkSize = Math.ceil(this.track.fileSize / this.chunksTotal);

    With this I partition the audio file and can fetch it in its entirety, jumping chunkSize bytes ahead for each chunk:

    -----------------------------------------
    | chunk 1 | chunk 2 |   ...   | chunk n |
    -----------------------------------------
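    To make the partitioning concrete, here is a small stand-alone sketch with made-up numbers (a 300-second track of 9.6 MB, purely illustrative); it prints the Range header each chunk request would carry. The clamp on the last chunk is only there so the final range does not run past the end of the file.

// Illustration only: hypothetical track metadata, not values from the real application.
const CHUNK_DURATION_SEC = 20;

const duration = 300;       // track length in seconds (assumed)
const fileSize = 9_600_000; // encoded file size in bytes (assumed)

const chunksTotal = Math.ceil(duration / CHUNK_DURATION_SEC); // 15
const chunkSize = Math.ceil(fileSize / chunksTotal);          // 640000

for (let i = 0; i < chunksTotal; i++) {
  const start = i * chunkSize;
  const end = Math.min(start + chunkSize - 1, fileSize - 1);
  console.log(`chunk ${i + 1}/${chunksTotal}: Range: bytes=${start}-${end}`);
}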

    How audio files are converted to fMP4

    ffmpeg -i input.mp3 -acodec aac -b:a 256k -f mp4 \
        -movflags faststart+frag_every_frame+empty_moov+default_base_moof \
        output.mp4

    This seems to work with Chrome and Firefox (so far).

    How chunks are appended

    After following this example, and realizing that it simply doesn't work as explained there, I threw it away and started over from scratch. Unfortunately, without success: it's still not working.

    The following code is supposed to play a track from the very beginning to the very end. However, I also need to be able to seek. So far, this is simply not working: seeking just stops the audio after the seeking event has been triggered.

    The code

/* Desired chunk duration in seconds. */
const CHUNK_DURATION_SEC = 20;

const AUDIO_EVENTS = [
  'ended',
  'error',
  'play',
  'playing',
  'seeking',
  'seeked',
  'pause',
  'timeupdate',
  'canplay',
  'loadedmetadata',
  'loadstart',
  'updateend',
];


class ChunksLoader {

  /** The total number of chunks for the track. */
  public readonly chunksTotal: number;

  /** The length of one chunk in bytes */
  public readonly chunkSize: number;

  /** Keeps track of requested chunks. */
  private readonly requested: boolean[];

  /** URL of endpoint for fetching audio chunks. */
  private readonly url: string;

  constructor(
    private track: Track,
    private sourceBuffer: SourceBuffer,
    private logger: NGXLogger,
  ) {

    this.chunksTotal = Math.ceil(this.track.duration / CHUNK_DURATION_SEC);
    this.chunkSize = Math.ceil(this.track.fileSize / this.chunksTotal);

    this.requested = [];
    for (let i = 0; i < this.chunksTotal; i++) {
      this.requested[i] = false;
    }

    this.url = `${environment.apiBaseUrl}/api/tracks/${this.track.id}/play`;
  }

  /**
   * Fetch the first chunk.
   */
  public begin() {
    this.maybeFetchChunk(0);
  }

  /**
   * Handler for the "timeupdate" event. Checks if the next chunk should be fetched.
   *
   * @param currentTime
   *  The current time of the track which is currently played.
   */
  public handleOnTimeUpdate(currentTime: number) {

    const nextChunkIndex = Math.floor(currentTime / CHUNK_DURATION_SEC) + 1;
    const hasAllChunks = this.requested.every(val => !!val);

    if (nextChunkIndex === (this.chunksTotal - 1) && hasAllChunks) {
      this.logger.debug('Last chunk. Calling mediaSource.endOfStream();');
      return;
    }

    if (this.requested[nextChunkIndex] === true) {
      return;
    }

    if (currentTime < CHUNK_DURATION_SEC * (nextChunkIndex - 1 + 0.25)) {
      return;
    }

    this.maybeFetchChunk(nextChunkIndex);
  }

  /**
   * Fetches the chunk if it hasn't been requested yet. After the request finished, the returned
   * chunk gets appended to the SourceBuffer-instance.
   *
   * @param chunkIndex
   *  The chunk to fetch.
   */
  private maybeFetchChunk(chunkIndex: number) {

    const start = chunkIndex * this.chunkSize;
    const end = start + this.chunkSize - 1;

    if (this.requested[chunkIndex] == true) {
      return;
    }

    this.requested[chunkIndex] = true;

    if ((end - start) == 0) {
      this.logger.warn('Nothing to fetch.');
      return;
    }

    const totalKb = ((end - start) / 1000).toFixed(2);
    this.logger.debug(`Starting to fetch bytes ${start} to ${end} (total ${totalKb} kB). Chunk ${chunkIndex + 1} of ${this.chunksTotal}`);

    const xhr = new XMLHttpRequest();
    xhr.open('get', this.url);
    xhr.setRequestHeader('Authorization', `Bearer ${AuthenticationService.getJwtToken()}`);
    xhr.setRequestHeader('Range', 'bytes=' + start + '-' + end);
    xhr.responseType = 'arraybuffer';
    xhr.onload = () => {
      this.logger.debug(`Range ${start} to ${end} fetched`);
      this.logger.debug(`Requested size:        ${end - start + 1}`);
      this.logger.debug(`Fetched size:          ${xhr.response.byteLength}`);
      this.logger.debug('Appending chunk to SourceBuffer.');
      this.sourceBuffer.appendBuffer(xhr.response);
    };
    xhr.send();
  };

}

export enum StreamStatus {
  NOT_INITIALIZED,
  INITIALIZING,
  PLAYING,
  SEEKING,
  PAUSED,
  STOPPED,
  ERROR
}

export class PlayerState {
  status: StreamStatus = StreamStatus.NOT_INITIALIZED;
}


/**
 *
 */
@Injectable({
  providedIn: 'root'
})
export class MediaSourcePlayerService {

  public track: Track;

  private mediaSource: MediaSource;

  private sourceBuffer: SourceBuffer;

  private audioObj: HTMLAudioElement;

  private chunksLoader: ChunksLoader;

  private state: PlayerState = new PlayerState();

  private state$ = new BehaviorSubject<PlayerState>(this.state);

  public stateChange = this.state$.asObservable();

  private currentTime$ = new BehaviorSubject<number>(null);

  public currentTimeChange = this.currentTime$.asObservable();

  constructor(
    private httpClient: HttpClient,
    private logger: NGXLogger
  ) {
  }

  get canPlay() {
    const state = this.state$.getValue();
    const status = state.status;
    return status == StreamStatus.PAUSED;
  }

  get canPause() {
    const state = this.state$.getValue();
    const status = state.status;
    return status == StreamStatus.PLAYING || status == StreamStatus.SEEKING;
  }

  public playTrack(track: Track) {
    this.logger.debug('playTrack');
    this.track = track;
    this.startPlayingFrom(0);
  }

  public play() {
    this.logger.debug('play()');
    this.audioObj.play().then();
  }

  public pause() {
    this.logger.debug('pause()');
    this.audioObj.pause();
  }

  public stop() {
    this.logger.debug('stop()');
    this.audioObj.pause();
  }

  public seek(seconds: number) {
    this.logger.debug('seek()');
    this.audioObj.currentTime = seconds;
  }

  private startPlayingFrom(seconds: number) {
    this.logger.info(`Start playing from ${seconds.toFixed(2)} seconds`);
    this.mediaSource = new MediaSource();
    this.mediaSource.addEventListener('sourceopen', this.onSourceOpen);

    this.audioObj = document.createElement('audio');
    this.addEvents(this.audioObj, AUDIO_EVENTS, this.handleEvent);
    this.audioObj.src = URL.createObjectURL(this.mediaSource);

    this.audioObj.play().then();
  }

  private onSourceOpen = () => {

    this.logger.debug('onSourceOpen');

    this.mediaSource.removeEventListener('sourceopen', this.onSourceOpen);
    this.mediaSource.duration = this.track.duration;

    this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
    // this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mpeg');

    this.chunksLoader = new ChunksLoader(
      this.track,
      this.sourceBuffer,
      this.logger
    );

    this.chunksLoader.begin();
  };

  private handleEvent = (e) => {

    const currentTime = this.audioObj.currentTime.toFixed(2);
    const totalDuration = this.track.duration.toFixed(2);
    this.logger.warn(`MediaSource event: ${e.type} (${currentTime} of ${totalDuration} sec)`);

    this.currentTime$.next(this.audioObj.currentTime);

    const currentStatus = this.state$.getValue();

    switch (e.type) {
      case 'playing':
        currentStatus.status = StreamStatus.PLAYING;
        this.state$.next(currentStatus);
        break;
      case 'pause':
        currentStatus.status = StreamStatus.PAUSED;
        this.state$.next(currentStatus);
        break;
      case 'timeupdate':
        this.chunksLoader.handleOnTimeUpdate(this.audioObj.currentTime);
        break;
      case 'seeking':
        currentStatus.status = StreamStatus.SEEKING;
        this.state$.next(currentStatus);
        if (this.mediaSource.readyState == 'open') {
          this.sourceBuffer.abort();
        }
        this.chunksLoader.handleOnTimeUpdate(this.audioObj.currentTime);
        break;
    }
  };

  private addEvents(obj, events, handler) {
    events.forEach(event => obj.addEventListener(event, handler));
  }

}
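    For orientation, this is roughly how the service above is meant to be driven from a component. The component itself is a hypothetical sketch (not part of the listing above); it only assumes a Track object carrying id, duration and fileSize, as the code above does.

import { Component } from '@angular/core';
// MediaSourcePlayerService and Track come from the application code shown above.

@Component({
  selector: 'app-player',
  template: '',
})
export class PlayerComponent {

  constructor(private player: MediaSourcePlayerService) {}

  start(track: Track) {
    // Creates the MediaSource and <audio> element, wires up the event handlers
    // and lets the ChunksLoader fetch chunk 0.
    this.player.playTrack(track);
  }

  jumpTo(seconds: number) {
    // Sets audio.currentTime; this triggers the 'seeking' branch above,
    // which is exactly where playback currently stops.
    this.player.seek(seconds);
  }
}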

    Running it will give me the following output:

    [screenshot of the console output]

    Apologies for the screenshot, but it's not possible to just copy the output without all the stack traces in Chrome.

    What I also tried was following this example and calling sourceBuffer.abort(), but that didn't work. It looks more like a hack that used to work years ago, but it's still referenced in the docs (see "Example" -> "You can see something similar in action in Nick Desaulnier's bufferWhenNeeded demo ..").

    case 'seeking':
      currentStatus.status = StreamStatus.SEEKING;
      this.state$.next(currentStatus);
      if (this.mediaSource.readyState === 'open') {
        this.sourceBuffer.abort();
      }
      break;
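    For comparison, this is roughly what the 'seeking' branch would have to do under my byte-range chunking scheme: abort any pending append, map the new position back to a chunk index, and request that chunk directly instead of waiting for timeupdate. The fetchChunkFor() helper is hypothetical (a thin public wrapper around maybeFetchChunk() above), and this sketch still inherits the underlying problem that byte offsets in an fMP4 do not line up exactly with time offsets.

case 'seeking': {
  currentStatus.status = StreamStatus.SEEKING;
  this.state$.next(currentStatus);

  // Drop whatever is currently being appended before feeding new data.
  if (this.mediaSource.readyState === 'open') {
    this.sourceBuffer.abort();
  }

  // Map the seek target to a chunk index and request it (plus the next one),
  // instead of relying on 'timeupdate' to advance naturally.
  const targetChunk = Math.floor(this.audioObj.currentTime / CHUNK_DURATION_SEC);
  this.chunksLoader.fetchChunkFor(targetChunk);     // hypothetical wrapper around maybeFetchChunk
  this.chunksLoader.fetchChunkFor(targetChunk + 1); // prefetch the following chunk
  break;
}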

    Trying with MP3

    I have tested the above code under Chrome by converting tracks to MP3:

    ffmpeg -i input.mp3 -acodec aac -b:a 256k -f mp3 output.mp3

    and creating a SourceBuffer using audio/mpeg as the type:

    this.mediaSource.addSourceBuffer('audio/mpeg')
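    Before going further it is also worth checking that the browser accepts these MIME strings for MSE at all; this is a tiny check using the standard MediaSource.isTypeSupported() API (not something from my application code):

// Quick capability check for the two container/codec strings used above.
const candidates = [
  'audio/mp4; codecs="mp4a.40.2"', // fragmented MP4 with AAC-LC
  'audio/mpeg',                    // MP3; MSE support for this varies by browser
];

for (const type of candidates) {
  console.log(type, '->', MediaSource.isTypeSupported(type));
}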

    I have the same problem when seeking.

    The issue without seeking

    The above code has another issue:

    After two minutes of playing, the audio playback starts to stutter and comes to a halt prematurely. So, the audio plays up to a point and then it stops without any obvious reason.

    For whatever reason, there is another canplay and playing event. A few seconds later, the audio simply stops.

    [second screenshot of the console output]
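    One way to narrow this down (a diagnostic sketch, not part of the player code above) is to log the SourceBuffer's buffered time ranges next to currentTime on every timeupdate; if currentTime reaches the end of the last buffered range while later chunks were never appended, the stall is on the fetching/appending side rather than in decoding:

// Diagnostic only: compare the playback position with what is actually buffered.
// Could be called from the 'timeupdate' branch of handleEvent above.
private logBufferState() {
  const ranges = this.sourceBuffer.buffered; // a TimeRanges object
  const parts: string[] = [];
  for (let i = 0; i < ranges.length; i++) {
    parts.push(`[${ranges.start(i).toFixed(2)} - ${ranges.end(i).toFixed(2)}]`);
  }
  this.logger.debug(
    `currentTime=${this.audioObj.currentTime.toFixed(2)} buffered=${parts.join(' ')}`
  );
}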
