
Media (91)

Other articles (68)

  • Use, discuss, criticize

    13 April 2011, by

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • Customizable form

    21 June 2013, by

    This page presents the fields available in the media publication form and indicates the fields that can be added. Media creation form
    For a media-type document, the default fields are: Text, Enable/Disable the forum (the comment prompt can be disabled for each article), Licence, Add/remove authors, Tags
    This form can be modified under:
    Administration > Configuration des masques de formulaire. (...)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources, in standalone version.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

On other sites (10698)

  • Video from Android Camera muxed with libavformat not playable in all players and audio not synced

    19 January 2021, by Debrugger

    I'm using avformat to mux encoded video and audio received from Android into an mp4 file. The resulting file is playable through ffplay, though it sometimes outputs "No Frame!" during playback. VLC plays it back after a fashion, but with glitches that look like motion data from one video combined with color data from another. The video player on my phone does not play it at all.

    On top of that, the audio is not properly synced, even though MediaCodec manages to produce a proper file from nothing more than what the code below has available (i.e. the presentation timestamp in microseconds).

    This is my code (error checking omitted for clarity):

    // Initializing muxer
AVStream *videoStream = avformat_new_stream(outputContext, nullptr);
videoStreamIndex = videoStream->index;

videoStream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
videoStream->codecpar->codec_id = AV_CODEC_ID_H264;
videoStream->codecpar->bit_rate = bitrate;
videoStream->codecpar->width = width;
videoStream->codecpar->height = height;
videoStream->time_base.num = 1;
videoStream->time_base.den = 90000;

AVStream* audioStream = avformat_new_stream(outputContext, nullptr);
audioStreamIndex = audioStream->index;
audioStream->codecpar->codec_type = AVMEDIA_TYPE_AUDIO;
audioStream->codecpar->codec_id = AV_CODEC_ID_MP4ALS;
audioStream->codecpar->bit_rate = audiobitrate;
audioStream->codecpar->sample_rate = audiosampleRate;
audioStream->codecpar->channels = audioChannelCount;
audioStream->time_base.num = 1;
audioStream->time_base.den = 90000;

avformat_write_header(outputContext, &opts);

writtenAudio = writtenVideo = false;


// presentationTimeUs is the absolute timestamp when the encoded frame was received in Android code. 
// This is what is usually fed into MediaCodec
int writeVideoFrame(uint8_t *data, int size, int64_t presentationTimeUs) {
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.flags |= AV_PKT_FLAG_KEY; // I know setting this on every frame is wrong. When do I set it?
    pkt.data = data;
    pkt.size = size;
    pkt.dts = AV_NOPTS_VALUE;
    pkt.pts = presentationTimeUs;
    if (writtenVideo) { // since the timestamp is absolute we have to subtract the initial offset
        pkt.pts -= firstVideoPts;
    }
    // rescale from microseconds to the stream timebase
    av_packet_rescale_ts(&pkt, AVRational { 1, 1000000 }, outputContext->streams[videoStreamIndex]->time_base);
    pkt.dts = AV_NOPTS_VALUE;
    pkt.stream_index = videoStreamIndex;
    if (!writtenVideo) {
        AVStream* videoStream = outputContext->streams[videoStreamIndex];
        videoStream->start_time = pkt.pts;
        firstVideoPts = presentationTimeUs;
    }
    if (av_interleaved_write_frame(outputContext, &pkt) < 0) {
        return 1;
    }
    writtenVideo = true;
    return 0;
}

int writeAudioFrame(uint8_t *data, int size, int64_t presentationTimeUs) {
    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.data = data;
    pkt.size = size;
    pkt.stream_index = audioStreamIndex;
    pkt.pts = presentationTimeUs;
    av_packet_rescale_ts(&pkt, AVRational { 1, 1000000}, outputContext->streams[audioStreamIndex]->time_base);
    pkt.flags |= AV_PKT_FLAG_KEY;
    pkt.dts = AV_NOPTS_VALUE;
    if (!writtenAudio) {
        outputContext->streams[audioStreamIndex]->start_time = pkt.pts;
    }
    if (av_interleaved_write_frame(outputContext, &pkt) < 0) {
        return 1;
    }
    writtenAudio = true;
    return 0;
}

void close() {
    av_write_trailer(outputContext);
    running = false;

    // cleanup AVFormatContexts etc
}


    


    I think I'm doing the same as shown in the avformat docs and examples, and the produced video is somewhat usable (re-encoding it with ffmpeg yields a working video). But something must still be wrong.

  • How can I improve the up-time of my coffee pot live stream?

    26 April 2017, by tww0003

    Some Background on the Project:

    Like most software developers, I depend on coffee to keep me running, and so do my coworkers. I had an old iPhone sitting around, so I decided to pay homage to the first webcam and live stream my office coffee pot.

    The stream has become popular within my company, so I want to make sure it stays online with as little effort as possible on my part. As of right now, it occasionally goes down and I have to get it up and running again manually.

    My Setup:

    I have nginx set up on a DigitalOcean server (my nginx.conf is shown below), and I downloaded an rtmp streaming app for my iPhone.

    The phone is set to stream to example.com/live/stream; I then use an ffmpeg command to take that stream, strip the audio (the live stream is public, and I don’t want coworkers to feel like they have to be careful about what they say), and make it accessible at rtmp://example.com/live/coffee and example.com/hls/coffee.m3u8.

    Since I’m not too familiar with ffmpeg, I had to google around to find an appropriate command to strip the coffee stream of its audio, and I found this:

    ffmpeg -i rtmp://localhost/live/stream -vcodec libx264 -vprofile baseline -acodec aac -strict -2 -f flv -an rtmp://localhost/live/coffee

    Essentially all I know about this command is that the input stream comes from localhost/live/stream, it strips the audio with -an, and it outputs to rtmp://localhost/live/coffee.

    I would assume that ffmpeg -i rtmp://localhost/live/stream -an rtmp://localhost/live/coffee would have the same effect, but the page I found the command on was dealing with ffmpeg and nginx, so I figured the extra parameters were useful.

    What I’ve noticed with this command is that it will error out, taking the live stream down. I wrote a small bash script to rerun the command when it stops, but I don’t think this is the best solution.

    Here is the bash script:

    while true;
    do
           ffmpeg -i rtmp://localhost/live/stream -vcodec libx264 -vprofile baseline -acodec aac -strict -2 -f flv -an rtmp://localhost/live/coffee
           echo 'Something went wrong. Retrying...'
           sleep 1
    done

    I’m curious about two things:

    1. What is the best way to strip audio from an rtmp stream?
    2. What is the proper configuration for nginx to ensure that my rtmp stream stays up for as long as possible?

    Since I have close to zero experience with nginx, ffmpeg, and rtmp streaming, any help or tips would be appreciated.

    Here is my nginx.conf file:

    worker_processes  1;

    events {
       worker_connections  1024;
    }


    http {
       include       mime.types;
       default_type  application/octet-stream;

       sendfile        on;

       keepalive_timeout  65;

       server {
           listen       80;
           server_name  localhost;

           location / {
               root   html;
               index  index.html index.htm;
           }

           error_page   500 502 503 504  /50x.html;
           location = /50x.html {
               root   html;
           }

           location /stat {
                   rtmp_stat all;
                   rtmp_stat_stylesheet stat.xsl;
                   allow 127.0.0.1;
           }
           location /stat.xsl {
                   root html;
           }
           location /hls {
                   root /tmp;
                   add_header Cache-Control no-cache;
           }
           location /dash {
                   root /tmp;
                   add_header Cache-Control no-cache;
                   add_header Access-Control-Allow-Origin *;
           }
       }
    }

    rtmp {

       server {

           listen 1935;
           chunk_size 4000;

           application live {
               live on;

               hls on;
               hls_path /tmp/hls;

               dash on;
               dash_path /tmp/dash;
           }
       }
    }

    Edit:
    I’m also running into this same issue: https://trac.ffmpeg.org/ticket/4401

  • How to fix the problem I'm having with FFmpeg?

    23 February 2023, by John

    I'm working with the ffmpeg library to convert mp4 video files to mp3 audio files.
Here is my code:

    package com.exer;


import android.app.Activity;
import android.app.ProgressDialog;
import android.os.Bundle;
import android.os.Environment;
import android.widget.Toast;
import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
import com.github.hiteshsondhi88.libffmpeg.FFmpegLoadBinaryResponseHandler;

public class MainActivity extends Activity {
    
    FFmpeg ffmpeg;
    private ProgressDialog progressDialog;
    
    
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    
        
        
        try {
            setUp();
            String[] command = {
                "-i", getPaths()+"/dir/input.mp4", "-vn", getPaths()+"/dir/output.mp3"
            };
            //convert("ffmpeg -i input.mp4 -vn output.mp3");
            convert(command);
            
        } catch (Exception e) {
            Toast.makeText(getApplicationContext(), e.getCause().toString(), Toast.LENGTH_SHORT).show();
        }
    }
    
    
    public void setUp() throws Exception {
        
        if(ffmpeg == null) {
            
            ffmpeg = FFmpeg.getInstance(this);
            ffmpeg.loadBinary(new FFmpegLoadBinaryResponseHandler(){
                    
            @Override
            public void onFailure() {
                Toast.makeText(getApplicationContext(), "failed to load library", Toast.LENGTH_SHORT).show();   
            }
                    
            @Override
            public void onSuccess() {
                Toast.makeText(getApplicationContext(), "loaded!", Toast.LENGTH_SHORT).show();
            }
                    
            @Override
            public void onStart() {
                        
            }
                    
            @Override
            public void onFinish() {
                        
            }
                    
                    
            });
            
        }
        
    }
    
    
    private void convert(String[] cmd) throws Exception {
        
        ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler(){
            
            @Override
            public void onFailure(String message){
                super.onFailure(message);
            }
            
            @Override
            public void onFinish(){
                super.onFinish();
                Toast.makeText(getApplicationContext(), "finished!", Toast.LENGTH_SHORT).show();
            }
            
            @Override
            public void onStart(){
                super.onStart();
                Toast.makeText(getApplicationContext(), "start conversion...", Toast.LENGTH_SHORT).show();
            }
            
            @Override
            public void onProgress(String message){
                super.onProgress(message);
            }
        });
        
    
    }
    
    private String getPaths() {
        return Environment.getExternalStorageDirectory().getPath();
    }
    
}


    


    When I run the app, the following Toast messages are shown:

    loaded!
    start conversion...
    finished!

    just as I wrote them in the callbacks. But apart from that, nothing else happens; the file is not converted. What's wrong?

    Here is my manifest file:

    <?xml version="1.0" encoding="utf-8"?>
    <manifest package="com.exer">
        <!-- the attributes and most of the enclosing elements were
             stripped when this post was archived -->
        <action></action>

        <category></category>
    </manifest>


    I've tried deleting the specified file on the phone to see what errors I might get, but I still only see those three Toasts.