
Other articles (55)

  • Writing a news item

    21 June 2013

    Present the changes in your MediaSPIP, or news about your projects, using the news section of your MediaSPIP.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the news creation form.
    News creation form: for a document of the news type, the fields offered by default are: Publication date (customize the publication date) (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information:
    • the browser you are using, including the exact version
    • as precise an explanation of the problem as possible
    • if possible, the steps taken that led to the problem
    • a link to the site / page in question
    If you think you have solved the bug, open a ticket and attach your corrective patch to it.
    You may also (...)

On other sites (5700)

  • Can't we use mp4 as the format of FFmpegFrameRecorder while RTMP streaming?

    13 April 2016, by Saty

    I am trying to stream video with FFmpegFrameRecorder using javacv. Everything is working well; however, I want to use "mp4" as the recording format instead of FLV. Is that possible? When I set it, the recorder crashed.

    I am using the code below, which half of the internet community uses to live stream:

    public class MainActivity extends Activity implements OnClickListener {

    private final static String LOG_TAG = "MainActivity";

    private PowerManager.WakeLock mWakeLock;

    private String ffmpeg_link = "rtmp://username:password@xxx.xxx.xxx.xxx:1935/live/test.flv";
    //private String ffmpeg_link = "/mnt/sdcard/new_stream.flv";

    private volatile FFmpegFrameRecorder recorder;
    boolean recording = false;
    long startTime = 0;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 30;

    private Thread audioThread;
    volatile boolean runAudioThread = true;
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;

    private CameraView cameraView;
    private IplImage yuvIplimage = null;

    private Button recordButton;
    private LinearLayout mainLayout;

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);

       setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
       setContentView(R.layout.activity_main);

       initLayout();
       initRecorder();
    }

    @Override
    protected void onResume() {
       super.onResume();

       if (mWakeLock == null) {
           PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
           mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
           mWakeLock.acquire();
       }
    }

    @Override
    protected void onPause() {
       super.onPause();

       if (mWakeLock != null) {
           mWakeLock.release();
           mWakeLock = null;
       }
    }

    @Override
    protected void onDestroy() {
       super.onDestroy();

       recording = false;
    }


    private void initLayout() {

       mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);

       recordButton = (Button) findViewById(R.id.recorder_control);
       recordButton.setText("Start");
       recordButton.setOnClickListener(this);

       cameraView = new CameraView(this);

       LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);        
       mainLayout.addView(cameraView, layoutParam);
       Log.v(LOG_TAG, "added cameraView to mainLayout");
    }

    private void initRecorder() {
       Log.w(LOG_TAG,"initRecorder");

       if (yuvIplimage == null) {
           // Recreated after frame size is set in surface change method
           yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
           //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);

           Log.v(LOG_TAG, "IplImage.create");
       }

       recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
       Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);

       recorder.setFormat("flv");
       Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");

       recorder.setSampleRate(sampleAudioRateInHz);
       Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

       // re-set in the surface changed method as well
       recorder.setFrameRate(frameRate);
       Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

       // Create audio recording thread
       audioRecordRunnable = new AudioRecordRunnable();
       audioThread = new Thread(audioRecordRunnable);
    }

    // Start the capture
    public void startRecording() {
       try {
           recorder.start();
           startTime = System.currentTimeMillis();
           recording = true;
           audioThread.start();
       } catch (FFmpegFrameRecorder.Exception e) {
           e.printStackTrace();
       }
    }

    public void stopRecording() {
       // This should stop the audio thread from running
       runAudioThread = false;

       if (recorder != null && recording) {
           recording = false;
           Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
           try {
               recorder.stop();
               recorder.release();
           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
           recorder = null;
       }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
       // Quit when back button is pushed
       if (keyCode == KeyEvent.KEYCODE_BACK) {
           if (recording) {
               stopRecording();
           }
           finish();
           return true;
       }
       return super.onKeyDown(keyCode, event);
    }

    @Override
    public void onClick(View v) {
       if (!recording) {
           startRecording();
           Log.w(LOG_TAG, "Start Button Pushed");
           recordButton.setText("Stop");
       } else {
           stopRecording();
           Log.w(LOG_TAG, "Stop Button Pushed");
           recordButton.setText("Start");
       }
    }

    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordRunnable implements Runnable {

       @Override
       public void run() {
           // Set the thread priority
           android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

           // Audio
           int bufferSize;
           short[] audioData;
           int bufferReadResult;

           bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                   AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
           audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                   AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

           audioData = new short[bufferSize];

           Log.d(LOG_TAG, "audioRecord.startRecording()");
           audioRecord.startRecording();

           // Audio Capture/Encoding Loop
           while (runAudioThread) {
               // Read from audioRecord
               bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
               if (bufferReadResult > 0) {
                   //Log.v(LOG_TAG,"audioRecord bufferReadResult: " + bufferReadResult);

                   // Changes in this variable may not be picked up despite it being "volatile"
                   if (recording) {
                       try {
                           // Write to FFmpegFrameRecorder
                           Buffer[] buffer = {ShortBuffer.wrap(audioData, 0, bufferReadResult)};                        
                           recorder.record(buffer);
                       } catch (FFmpegFrameRecorder.Exception e) {
                           Log.v(LOG_TAG,e.getMessage());
                           e.printStackTrace();
                       }
                   }
               }
           }
           Log.v(LOG_TAG,"AudioThread Finished");

           /* Capture/Encoding finished, release recorder */
           if (audioRecord != null) {
               audioRecord.stop();
               audioRecord.release();
               audioRecord = null;
               Log.v(LOG_TAG,"audioRecord released");
           }
       }
    }

    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

       private boolean previewRunning = false;

       private SurfaceHolder holder;
       private Camera camera;

       private byte[] previewBuffer;

       long videoTimestamp = 0;

       Bitmap bitmap;
       Canvas canvas;

       public CameraView(Context _context) {
           super(_context);

           holder = this.getHolder();
           holder.addCallback(this);
           holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
       }

       @Override
       public void surfaceCreated(SurfaceHolder holder) {
           camera = Camera.open();

           try {
               camera.setPreviewDisplay(holder);
               camera.setPreviewCallback(this);

               Camera.Parameters currentParams = camera.getParameters();
               Log.v(LOG_TAG,"Preview Framerate: " + currentParams.getPreviewFrameRate());
               Log.v(LOG_TAG,"Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

               // Use these values
               imageWidth = currentParams.getPreviewSize().width;
               imageHeight = currentParams.getPreviewSize().height;
               frameRate = currentParams.getPreviewFrameRate();                

               bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);


               /*
               Log.v(LOG_TAG,"Creating previewBuffer size: " + imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8);
               previewBuffer = new byte[imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8];
               camera.addCallbackBuffer(previewBuffer);
               camera.setPreviewCallbackWithBuffer(this);
               */              

               camera.startPreview();
               previewRunning = true;
           }
           catch (IOException e) {
               Log.v(LOG_TAG,e.getMessage());
               e.printStackTrace();
           }  
       }

       public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
           Log.v(LOG_TAG,"Surface Changed: width " + width + " height: " + height);

           // We would do this if we want to reset the camera parameters
           /*
           if (!recording) {
               if (previewRunning){
                   camera.stopPreview();
               }
               try {
                   //Camera.Parameters cameraParameters = camera.getParameters();
                   //p.setPreviewSize(imageWidth, imageHeight);
                   //p.setPreviewFrameRate(frameRate);
                   //camera.setParameters(cameraParameters);

                   camera.setPreviewDisplay(holder);
                   camera.startPreview();
                   previewRunning = true;
               }
               catch (IOException e) {
                   Log.e(LOG_TAG,e.getMessage());
                   e.printStackTrace();
               }  
           }            
           */

           // Get the current parameters
           Camera.Parameters currentParams = camera.getParameters();
           Log.v(LOG_TAG,"Preview Framerate: " + currentParams.getPreviewFrameRate());
           Log.v(LOG_TAG,"Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

           // Use these values
           imageWidth = currentParams.getPreviewSize().width;
           imageHeight = currentParams.getPreviewSize().height;
           frameRate = currentParams.getPreviewFrameRate();

           // Create the yuvIplimage if needed
           yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
           //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
       }

       @Override
       public void surfaceDestroyed(SurfaceHolder holder) {
           try {
               camera.setPreviewCallback(null);

               previewRunning = false;
               camera.release();

           } catch (RuntimeException e) {
               Log.v(LOG_TAG,e.getMessage());
               e.printStackTrace();
           }
       }

       @Override
       public void onPreviewFrame(byte[] data, Camera camera) {

           if (yuvIplimage != null && recording) {
               videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

               // Put the camera preview frame right into the yuvIplimage object
               yuvIplimage.getByteBuffer().put(data);

               // FAQ about IplImage:
               // - For custom raw processing of data, getByteBuffer() returns an NIO direct
               //   buffer wrapped around the memory pointed by imageData, and under Android we can
               //   also use that Buffer with Bitmap.copyPixelsFromBuffer() and copyPixelsToBuffer().
               // - To get a BufferedImage from an IplImage, we may call getBufferedImage().
               // - The createFrom() factory method can construct an IplImage from a BufferedImage.
               // - There are also a few copy*() methods for BufferedImage<->IplImage data transfers.

               // Let's try it..
               // This works but only on transparency
               // Need to find the right Bitmap and IplImage matching types

               /*
               bitmap.copyPixelsFromBuffer(yuvIplimage.getByteBuffer());
               //bitmap.setPixel(10,10,Color.MAGENTA);

               canvas = new Canvas(bitmap);
               Paint paint = new Paint();
               paint.setColor(Color.GREEN);
               float leftx = 20;
               float topy = 20;
               float rightx = 50;
               float bottomy = 100;
               RectF rectangle = new RectF(leftx,topy,rightx,bottomy);
               canvas.drawRect(rectangle, paint);

               bitmap.copyPixelsToBuffer(yuvIplimage.getByteBuffer());
               */
               //Log.v(LOG_TAG,"Writing Frame");

               try {

                   // Get the correct time
                   recorder.setTimestamp(videoTimestamp);

                   // Record the image into FFmpegFrameRecorder
                   recorder.record(yuvIplimage);

               } catch (FFmpegFrameRecorder.Exception e) {
                   Log.v(LOG_TAG,e.getMessage());
                   e.printStackTrace();
               }
           }
       }
    }

    }

  • ISO-9660 Compromise, Part 2 : Finding Root

    25 October 2021, by Multimedia Mike — General

    A long time ago, I dashed off a quick blog post with a curious finding after studying the ISO-9660 spec : The format stores multi-byte numbers in a format I termed “omni-endian”– the committee developing the format apparently couldn’t come to an agreement on this basic point regarding big- vs. little-endian encoding (I’m envisioning something along the lines of “tastes great ! … less filling !” in the committee meetings).

    I recently discovered another bit of compromise in the ISO-9660 spec : It seems that there are 2 different methods for processing the directory structure. That means it’s incumbent upon ISO-9660 creation software to fill in the data structures to support both methods, because some ISO-reading programs out there rely on one set of data structures while the rest prefer to read the other set.

    Background

    As a refresher, the "ISO" extension of an ISO file refers to the ISO-9660 specification. This is a type of read-only filesystem (i.e., the filesystem is created once and never updated after initial creation) intended for storage on a read-only medium, often an optical disc (CD-ROM, DVD-ROM). The level of nostalgic interest I display for the ISO-9660 filesystem reminds me of my computer science professors from the mid-90s reminiscing about ye olden days of punchcard programming, but such is my lot. I'm probably also alone in my frustration at seeing rips of, e.g., GameCube or Xbox or 3DO games being tagged with the extension .ISO, since those systems use different read-only filesystems.

    I recently fell in with an odd bunch called the eXoDOS project and was trying to help fill in a few gaps. One request was a 1994 game called Power Drive for DOS.


    Power Drive CD-ROM


    My usual CD-ROM ripping method (for the data track) is a simple 'dd' command from a Linux command line to copy the string of raw sectors. However, it turned out to be unusually difficult to open the resulting ISO. A few of the options I know of worked, but most didn't. What's the difference ?

    Methods that work :

    • Mounting the file with the Linux iso9660 kernel module, i.e.,
      mount -t iso9660 /dev/optical-drive /mnt

      or

      mount -t iso9660 -o loop /path/to/Power-Drive.iso /mnt
    • Directory Opus
    • Windows 10 can read the filesystem when reading the physical disc
    • Windows 10 can burn the ISO image to a new CD (“right click” -> “Burn disc image”) ; this method does not modify any of the existing sectors but did append 149 additional empty sectors

    Methods that don’t work :

    Understanding The Difference

    I think I might have a handle on why some tools are able to process this disc while most can't. There appear to be 2 sets of data structures describing the base of the filesystem : a root directory, and a path table. Both occur in the first substantive sector of the ISO-9660 filesystem, usually sector 16.

    A compact disc can be abstractly visualized as a long string of sectors, each one 2,352 bytes long. (See my Grand Unified Theory of Compact Disc post for deeper discussion.) A CD-ROM data track will contain 2048 bytes of data. Thus, sector 16 appears at 0x8000 of an ISO filesystem. I like the clarity of this description of the ISO-9660 spec. It shows that the path table is defined at byte 140 (little-endian ; big comes later) and location of the root directory is at byte 158. Thus, these locations generally occur at 0x808c and 0x809e.


    Primary Volume Descriptor

    The path table is highlighted in green and the root directory record is highlighted in red. These absolute locations are specified in sectors. So the path table is located at sector 0x12 = offset 0x9000 in the image, while the root directory record is supposed to be at sector 0x62 = 0x31000. Checking into those sectors, it turns out that the path table is valid while the root directory record is invalid. Thus, any tool that relies on the path table will be successful in interpreting the disc, while tools that attempt to recursively traverse starting from the root directory record are gonna have a bad time.
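Those two lookups are easy to reproduce. Here is a minimal sketch that reads the little-endian path table LBA (PVD byte 140) and the root directory's extent LBA (PVD byte 158); it runs against a synthetic image carrying the same values as the disc discussed above (0x12 and 0x62), since the actual rip isn't reproduced here:

```python
import struct

SECTOR = 2048             # logical sector size of a CD-ROM data track
PVD_OFFSET = 16 * SECTOR  # the Primary Volume Descriptor lives at sector 16 (0x8000)

def read_pvd_locations(image_bytes):
    """Pull the little-endian path table LBA (PVD byte 140) and the root
    directory record's extent LBA (PVD byte 158) out of an ISO-9660 image."""
    pvd = image_bytes[PVD_OFFSET:PVD_OFFSET + SECTOR]
    path_table_lba = struct.unpack_from('<I', pvd, 140)[0]  # file offset 0x808c
    root_dir_lba = struct.unpack_from('<I', pvd, 158)[0]    # file offset 0x809e
    return path_table_lba, root_dir_lba

# Synthetic 17-sector stand-in image with the values from the disc above:
image = bytearray(17 * SECTOR)
struct.pack_into('<I', image, PVD_OFFSET + 140, 0x12)  # path table at sector 0x12
struct.pack_into('<I', image, PVD_OFFSET + 158, 0x62)  # root directory at sector 0x62

pt, root = read_pvd_locations(bytes(image))
print(hex(pt * SECTOR), hex(root * SECTOR))  # 0x9000 0x31000
```

(Only the little-endian copies are read here; a robust tool would also cross-check the big-endian "omni-endian" copies mentioned in the earlier post.)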

    Since I was able to view the filesystem with a few different tools, I know what the root directory contains. Searching for those filenames reveals that the root directory was supposed to point to the next sector, number 0x63. So this was a bizarre off-by-1 error on the part of the ISO creation tool. Maybe. I manually corrected 0x62 -> 0x63 and that fixed the interaction with fuseiso, but not with other tools. So there may have been some other errors. Note that a quick spot-check of another, functional ISO revealed that this root directory sector is supposed to be exact, not 1-indexed.

    Upon further inspection, I noticed that, while fuseiso appeared to work with that one patch, none of the files returned correct data, and none of the directories contained anything. That’s when I noticed that ALL of the sector locations described in the various directory and file records are off by 1 !

    Further Investigation

    I have occasionally run across ISO images on the Internet Archive that return the error about not being able to read the contents when trying to “View contents” (error text : “failed to obtain file list from xyz.iso”, as seen with this ISO). Too bad I didn’t make a record of them because I would be interested to see if they have the same corruption.

    Eventually, I’ll probably be able to compile an archive of deviant ISO-9660 images. A few months ago, I was processing a large collection from IA and found a corrupted ISO which had a cycle, i.e., the subdirectory pointed to a parent directory, which caused various ISO tools to loop forever. Just one of those things that is “never supposed to happen”, so why write code to deal with it gracefully ?
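Guarding against that kind of cycle is cheap: track which directory extents (sector numbers) have already been visited and refuse to descend into one twice. A generic sketch, with a made-up directory table standing in for real ISO directory records:

```python
def walk_directories(dir_table, root):
    """Breadth-first walk over a directory tree keyed by extent/sector number.
    The visited set makes a corrupted image containing a cycle terminate
    instead of looping forever."""
    visited = set()
    order = []
    queue = [root]
    while queue:
        extent = queue.pop(0)
        if extent in visited:  # cycle or duplicate reference: skip it
            continue
        visited.add(extent)
        order.append(extent)
        queue.extend(dir_table.get(extent, []))
    return order

# A corrupted layout where subdirectory 0x65 points back at the root (0x63):
dirs = {0x63: [0x64, 0x65], 0x64: [], 0x65: [0x63]}
print([hex(x) for x in walk_directories(dirs, 0x63)])  # ['0x63', '0x64', '0x65']
```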

    See Also

    The post ISO-9660 Compromise, Part 2 : Finding Root first appeared on Breaking Eggs And Making Omelettes.

  • Crossdevice encoding static file to stream in browser using FFMPEG (segmented h264 ?)

    20 March 2014, by Vprnl

    I'm building a mediacenter application in NodeJS which is going pretty ok.
    (you can check it out on Github : https://github.com/jansmolders86/mediacenterjs )

    I'm using FFMPEG to transcode local (static) movies to a stream which I then send to the browser.

    At first I used h264 with Flash, which worked in browsers, but I really need it to work on Android and iOS (so no Flash), and preferably on a Raspberry Pi.

    But getting it to play on all devices is driving me absolutely insane!

    I have gathered all these bits of the puzzle from countless hours of reading articles, tutorials and Stack Overflow posts, which led me to the conclusion that I need to produce the following:

    • Use video codec H264 to transcode to MP4
    • Move the moov atom ('-movflags') to make an MP4 streamable
    • Segment the stream so Apple devices can play it as well
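Taken together, the first two bullets usually reduce to a single ffmpeg invocation. A sketch of that option set, assembled as a plain argument list (the filenames are placeholders, not from the original setup):

```python
import shlex

# Hypothetical input/output names; the flags themselves are standard ffmpeg options.
args = [
    "ffmpeg", "-i", "input.avi",
    "-c:v", "libx264",          # H264 video, as required for broad HTML5 support
    "-c:a", "aac",              # AAC audio tends to be safer than mp3 on iOS
    "-movflags", "+faststart",  # rewrite the file so the moov atom sits up front
    "output.mp4",
]
print(" ".join(shlex.quote(a) for a in args))
```

Note that '+faststart' makes ffmpeg rewrite the finished file, so it needs a seekable output and cannot be combined with piping straight to an HTTP response; the third bullet (segmenting for Apple) is a separate '-f hls' run, which writes a playlist plus segment files to disk rather than a single pipeable stream.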

    But I am getting nowhere with this. Every attempt produces a set of FFMPEG settings that either doesn't work at all, or works on some devices but not all.

    Some of my failed attempts were:

    My Flash attempt -> Main problem (not running on iOS):

       '-y','-ss 0','-b 800k','-vcodec libx264','-acodec mp3'\
       '-ab 128','-ar 44100','-bufsize 62000', '-maxrate 620k'\
       metaDuration,tDuration,'-f flv

    My HLS attempt -> Main problem (not running in the browser):

           '-r 15','-b:v 128k','-c:v libx264','-x264opts level=41'\
           '-threads 4','-s 640x480','-map 0:v','-map 0:a:0','-c:a mp3'\
           '-b:a 160000','-ac 2','-f hls','-hls_time 10','-hls_list_size 6'\
           '-hls_wrap 18','-start_number 1'

    My MP4 attempt -> Main problem (duration is shortened and the later part of the video speeds by):

          '-y','-ss 0','-b 800k','-vcodec libx264','-acodec mp3'\
          '-ab 128','-ar 44100','-bufsize 62000', '-maxrate 620k'\
          metaDuration,tDuration,'-f mp4','-movflags','frag_keyframe+empty_moov'

    Second MP4 attempt -> Main problem (duration is shortened and the later part of the video speeds by):

       '-y','-vcodec libx264','-pix_fmt yuv420p','-b 1200k','-flags +loop+mv4'\
       '-cmp 256','-partitions +parti4x4+parti8x8+partp4x4+partp8x8+partb8x8'\
       '-me_method hex','-subq 7','-trellis 1','-refs 5','-bf 3','-coder 1'\
       '-me_range 16','-g 150','-keyint_min 25','-sc_threshold 40'\
       '-i_qfactor 0.71','-acodec mp3','-qmin 10','-qdiff 4','-qmax 51'\
       '-ab 128k','-ar 44100','-threads 2','-f mp4','-movflags','frag_keyframe+empty_moov'])

    Here is an example of the FFMPEG log when running with these settings:

           file conversion error ffmpeg version N-52458-gaa96439 Copyright (c) 2000-2013 the FFmpeg developers
             built on Apr 24 2013 22:19:32 with gcc 4.8.0 (GCC)
             configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --e
           nable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable
           -libgsm --enable-libilbc --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --ena
           ble-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwola
           me --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enabl
           e-libxvid --enable-zlib
             libavutil      52. 27.101 / 52. 27.101
             libavcodec     55.  6.100 / 55.  6.100
             libavformat    55.  3.100 / 55.  3.100
             libavdevice    55.  0.100 / 55.  0.100
             libavfilter     3. 60.101 /  3. 60.101
             libswscale      2.  2.100 /  2.  2.100
             libswresample   0. 17.102 /  0. 17.102
             libpostproc    52.  3.100 / 52.  3.100
           [avi @ 02427900] non-interleaved AVI
           Guessed Channel Layout for  Input Stream #0.1 : mono
           Input #0, avi, from 'C:/temp/the avengers.avi':
             Duration: 00:00:34.00, start: 0.000000, bitrate: 1433 kb/s
               Stream #0:0: Video: cinepak (cvid / 0x64697663), rgb24, 320x240, 15 tbr, 15 tbn, 15 tbc
               Stream #0:1: Audio: pcm_u8 ([1][0][0][0] / 0x0001), 22050 Hz, mono, u8, 176 kb/s
           Please use -b:a or -b:v, -b is ambiguous
           [libx264 @ 02527c60] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
           [libx264 @ 02527c60] profile High, level 2.0
           [libx264 @ 02527c60] 264 - core 130 r2274 c832fe9 - H.264/MPEG-4 AVC codec - Copyleft 2003-2013 - http://www.videolan.org/x26
           4.html - options: cabac=1 ref=5 deblock=1:0:0 analyse=0x3:0x133 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16
            chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=2 lookahead_threads=1 sliced_th
           reads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 we
           ightb=1 open_gop=0 weightp=2 keyint=150 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=abr mbtree=1 bitrate=120
           0 ratetol=1.0 qcomp=0.60 qpmin=10 qpmax=51 qpstep=4 ip_ratio=1.41 aq=1:1.00
           Output #0, mp4, to 'pipe:1':
             Metadata:
               encoder         : Lavf55.3.100
               Stream #0:0: Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 320x240, q=10-51, 1200 kb/s, 15360 tbn, 15 tbc
               Stream #0:1: Audio: mp3 (i[0][0][0] / 0x0069), 44100 Hz, mono, s16p, 128 kb/s
           Stream mapping:
             Stream #0:0 -> #0:0 (cinepak -> libx264)
             Stream #0:1 -> #0:1 (pcm_u8 -> libmp3lame)
           Press [q] to stop, [?] for help
           frame=  106 fps=0.0 q=10.0 size=       1kB time=00:00:06.94 bitrate=   1.4kbits/s
           frame=  150 fps=149 q=14.0 size=       1kB time=00:00:09.87 bitrate=   1.0kbits/s
           frame=  191 fps=126 q=16.0 size=       1kB time=00:00:12.61 bitrate=   0.8kbits/s
           frame=  244 fps=121 q=16.0 size=    2262kB time=00:00:16.14 bitrate=1147.6kbits/s
           frame=  303 fps=120 q=14.0 size=    2262kB time=00:00:20.08 bitrate= 922.2kbits/s
           frame=  354 fps=117 q=15.0 size=    3035kB time=00:00:23.48 bitrate=1058.6kbits/s
           frame=  402 fps=113 q=15.0 size=    3035kB time=00:00:26.67 bitrate= 932.1kbits/s
           frame=  459 fps=113 q=16.0 size=    4041kB time=00:00:30.43 bitrate=1087.7kbits/s
           frame=  510 fps=103 q=2686559.0 Lsize=    5755kB time=00:00:33.93 bitrate=1389.3kbits/s

           video:5211kB audio:531kB subtitle:0 global headers:0kB muxing overhead 0.235111%
           [libx264 @ 02527c60] frame I:6     Avg QP:10.55  size: 25921
           [libx264 @ 02527c60] frame P:245   Avg QP:12.15  size: 14543
           [libx264 @ 02527c60] frame B:259   Avg QP:15.55  size:  6242
           [libx264 @ 02527c60] consecutive B-frames:  6.1% 73.7% 14.7%  5.5%
           [libx264 @ 02527c60] mb I  I16..4: 19.9%  6.2% 73.9%
           [libx264 @ 02527c60] mb P  I16..4:  6.0%  0.2% 12.0%  P16..4: 35.4%  9.6% 16.3%  7.0%  5.6%    skip: 7.8%
           [libx264 @ 02527c60] mb B  I16..4:  0.7%  0.0%  4.3%  B16..8: 27.6% 17.2% 17.0%  direct:17.3%  skip:15.9%  L0:39.4% L1:43.2%
           BI:17.4%
           [libx264 @ 02527c60] final ratefactor: 11.41
           [libx264 @ 02527c60] 8x8 transform intra:1.6% inter:4.0%
           [libx264 @ 02527c60] coded y,uvDC,uvAC intra: 93.0% 97.0% 94.9% inter: 58.4% 58.7% 50.6%
           [libx264 @ 02527c60] i16 v,h,dc,p: 15% 26% 54%  5%
           [libx264 @ 02527c60] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 16% 17% 39%  4%  4%  3%  1%  6%  9%
           [libx264 @ 02527c60] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28% 34% 21%  4%  2%  2%  2%  2%  5%
           [libx264 @ 02527c60] i8c dc,h,v,p: 51% 24% 19%  6%
           [libx264 @ 02527c60] Weighted P-Frames: Y:4.1% UV:1.2%
           [libx264 @ 02527c60] ref P L0: 68.2%  9.8% 11.0%  5.6%  4.6%  0.8%  0.0%
           [libx264 @ 02527c60] ref B L0: 87.7%  8.0%  3.9%  0.4%
           [libx264 @ 02527c60] ref B L1: 97.8%  2.2%
           [libx264 @ 02527c60] kb/s:1255.36

    Lastly, this is my Node code firing up FFMPEG. (I use the Fluent-ffmpeg module: https://github.com/schaermu/node-fluent-ffmpeg )

       var proc = new ffmpeg({ source: movie, nolog: true, timeout:15000})                        
           .addOptions(['-r 15','-b:v 128k','-c:v libx264','-x264opts level=41','-threads 4','-s 640x480','-map 0:v','-map 0:a:0','-c:a mp3','-b:a 160000','-ac 2','-f hls','-hls_time 10','-hls_list_size 6','-hls_wrap 18','-start_number 1 stream.m3u8'])
           .writeToStream(res, function(retcode, error){
               if (!error){
                   console.log('file has been converted succesfully',retcode .green);
               }else{
                   console.log('file conversion error',error .red);
               }
           });

    So, to conclude this very long and code-heavy question:

    I hope this does not come off as a lazy request, but could someone show or explain to me which FFMPEG settings could/should work on all platforms (modern browsers, Android and iOS), producing a stream of a static file which I can send to an HTML5 player?

    [EDIT] What I need if a generic option isn't available

    And if this is not possible, as some posts might suggest, I would love to see a set of FFMPEG settings that gets the job done properly as far as mp4 streaming is concerned (e.g. encoding a streamable mp4).

    The streaming mp4 needs the following:

    • A shifted moov atom
    • H264 video

    Thanks very much for your help!