Advanced search

Media (0)

Keyword: - Tags -/signalement

No media matching your criteria is available on this site.

Other articles (34)

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analysed to retrieve the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

On other sites (6133)

  • What Is Data Ethics & Why Is It Important in Business?

    9 May 2024, by Erin

    Data is powerful — every business on earth uses data. But some are leveraging it more than others.

    The problem?

    Not all businesses are using data ethically.

    You need to collect, store and analyse data to grow your business. But if you aren’t careful, your data usage could cross the line into unethical territory.

    In a society where data is more valuable than ever, it’s crucial that you follow ethical practices.

    In this article, we break down what data ethics is, why it’s important in business, and how you can implement proper data ethics to stay compliant while growing your business.

    What is data ethics?

    Data ethics is how a business collects, protects and uses data.

    It’s one field of ethics focused on organisations’ moral obligation to collect, track, analyse and interpret data correctly.

    Data ethics analyses multiple ways we use data:

    • Collecting data
    • Generating data
    • Tracking data
    • Analysing data
    • Interpreting data
    • Implementing activities based on data

    Data ethics is a field that asks, “Is this right or wrong?”

    And it also asks, “Can we use data for good?”

    If businesses use data unethically, they could get into serious hot water with their customers and even with the law.

    You need to use data to ensure you grow your business to the best of your ability. But, to maintain a clean slate in the eyes of your customers and authorities, you need to ensure you have strong data ethics.

    Why you need to follow data ethics principles

    In 2018, hackers broke into British Airways’ website by inserting harmful code, leading website visitors to a fraudulent site. 

    The result?

    British Airways customers gave their information to the hackers without realising it: credit cards, personal information, login information, addresses and more.

    While this was a malicious attack, the reality is that data is an integral part of everyday life. Businesses need to do everything they can to protect their customers’ data and use it ethically.

    Data ethics is crucial to understand as it sets the standard for what’s right and wrong for businesses. Without a clear grasp of data ethics, companies will willingly or neglectfully misuse data.

    With a firm foundation of data ethics, businesses worldwide can make a collective effort to function smoothly, protect their customers, and, of course, protect their own reputation. 

    3 benefits of leaning into data ethics

    We’re currently transitioning to a new world led by artificial intelligence.

    While AI presents endless opportunities for innovation in the business world, there are also countless risks at play, and it’s never been more important to develop trust with your customers and stakeholders.

    With an influx of data being created and tracked daily, you need to ensure your business prioritises data ethics so you can maintain trust with your customers moving forward.

    Diagram displaying the 3 benefits of data ethics - compliance, increased trust, maintain a good reputation.

    Here are three benefits of data ethics that will help you develop trust, maintain a solid reputation and stay compliant to continue growing your business:

    1. Compliance with data privacy

    Privacy is everything. 

    In a world where our data is collected nonstop and we live more public lives than ever, with social media, AI and an influx of recording and tracking in everyday life, you need to protect the privacy of your customers.

    One crucial way to protect that privacy is by complying with major data privacy regulations.

    Some of the most common regulations you need to remain compliant with include:

    • General Data Protection Regulation (GDPR)
    • California Consumer Privacy Act (CCPA)
    • Health Insurance Portability and Accountability Act (HIPAA)
    • General Personal Data Protection Law (LGPD)
    • Privacy and Electronic Communications (EC Directive) Regulations (PECR)

    While these regulations don’t directly address ethics, there’s a core overlap between privacy requirements like accountability, lawfulness and AI ethics.

    Matomo ensures you protect the privacy of your web and app users so you can track and improve your website performance with peace of mind.

    2. Maintain a good reputation

    While data ethics can help you maintain data privacy compliance, it can also help you maintain a good reputation online and offline.

    All it takes is one bad event like the British Airways breach for your company’s reputation to be ruined.

    If you want to keep a solid reputation and maintain trust with your stakeholders, customers and lawmakers, then you need to focus on developing strong data ethics.

    Businesses that invest time in establishing proper data ethics set the right foundation to protect their reputation, develop trust with stakeholders and create goodwill and loyalty.

    3. Increased trust means greater revenue

    What happens when you establish proper data ethics?

    You’ll gain the trust of your customers, maintain a solid reputation and increase your brand image.

    Customers who trust you to protect their privacy and data want to keep doing business with you.

    So, what’s the end result for a business that values data ethics?

    You’ll generate more revenue in the long run. Trust is one thing you should never put on the back burner if you have plans to keep growing your business. By leaning more into data ethics, you’ll be able to build that brand reputation that helps people feel comfortable buying your products and services on repeat.

    While spending time and money on data ethics may seem like an annoyance, the reality is that it’s a business investment that will pay dividends for years to come.

    5 core data ethics principles

    So, what exactly is involved in data ethics?

    For most people, data ethics is a pretty broad and vague term. If you’re curious about the core pillars of data ethics, then keep reading.

    Here are five core data ethics principles you need to follow to ensure you’re protecting your customers’ data and maintaining trust:

    Image displaying the 5 core data ethics principles - ownership, transparency, privacy, intention, outcomes.

    1. Data ownership

    The individual owns the data, not you. This is the first principle of data ethics. You don’t have control over someone else’s data. It’s theirs, and they have full ownership over it.

    Just as stealing a TV from an electronics store is a crime, stealing (or collecting) someone’s personal data without their consent is considered unlawful and unethical.

    Consent is the only way to ethically “own” someone’s data.

    How can you collect someone’s data ethically?

    • Digital privacy policies
    • Signed, written agreements
    • Popups with checkboxes that allow you to track users’ behaviour

    Essentially, anytime you’re collecting data from your website or app users, you need to ensure you’re asking permission for that data.

    You should never assume a website visitor or customer is okay with you collecting their data automatically. Instead, ask permission to collect, track and use their data to avoid legal and ethical issues.
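
    The principle above can be reduced to a minimal sketch. The class and method names below are hypothetical, not part of any real consent library: data collection is gated on an explicit, recorded opt-in, and a missing decision is treated as a refusal.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch (hypothetical names, not a real consent library):
// tracking is only allowed after an explicit, recorded opt-in.
public class ConsentGate {
    private final Map<String, Boolean> decisions = new HashMap<>();

    // Record the user's choice, e.g. from a popup with a checkbox.
    public void record(String userId, boolean granted) {
        decisions.put(userId, granted);
    }

    // No stored decision means no consent: never assume an opt-in.
    public boolean mayTrack(String userId) {
        return decisions.getOrDefault(userId, false);
    }
}
```

    In practice the recorded decision would also be timestamped and tied to the exact privacy policy version the user actually saw.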

    2. Transparency

    The second core principle of data ethics within business is transparency. This means you need to be fully transparent about when, where and how you:

    • Collect data
    • Store data
    • Use data

    In other words, you need to allow your customers and website visitors to have a window inside your data activities.

    They need to be able to see exactly how you plan on using the data you’re collecting from them.

    For example, imagine you implemented a new initiative to personalise the website experience for each user based on individual behaviour. To do this, you’ll need to track cookies. In this case, you’d need to write up a new policy stating how this behavioural data is going to be collected, tracked and used.

    It’s within your website visitors’ rights to access this information so they can choose whether to accept or decline your website’s cookies.

    With any new data collection or tracking, you need to be 100% clear about how you’re going to use the data. You can’t be deceptive or misleading, or withhold any information about how you will use the data, as this is unethical and, in many cases, unlawful.

    3. Privacy

    Another important branch of ethics is privacy. The ethical implications of this should be obvious.

    When your users, visitors, or customers enter your sphere of influence and you begin collecting data on them, you are responsible for keeping that data private.

    When someone accepts the terms of your data usage, they’re not agreeing to have their data released to the public. They’re agreeing to let you leverage that data as their trusted business provider to better serve them. They expect you to maintain privacy.

    You can’t spread private information to third parties. You can’t blast this data to the public. 

    This is especially important if someone allows you to collect and use their personally identifiable information (PII), such as:

    • First and last name
    • Email address
    • Date of birth
    • Home address
    • Phone number

    To protect your audience’s data, you should only store it in a secure database. 

    Screenshot example of the Matomo dashboard

    For example, Matomo’s web analytics solution guarantees the privacy of both your users and analytics data.

    With Matomo, you have complete ownership of your data. Unlike other web analytics solutions that exploit your data for advertising purposes, Matomo users can use analytics with confidence, knowing that their data won’t be sold to advertisers.

    Learn more about data privacy with Matomo here.


    4. Intention

    When you collect and store data, you need to tell your users why you’re collecting their data. But there’s another principle of data ethics that goes beyond the reason you give your customers.

    Intention is the reason you give yourself for collecting and using the data.

    Before you start collecting and storing data, you should ask yourself the following:

    • Why you need it
    • What you’ll gain from it
    • What changes you’ll be able to make after you analyse the data

    If your intention is wrong in any way, it’s unethical to collect the data:

    • You’re collecting data to hurt others
    • You’re collecting data to profit from your users’ weaknesses
    • You’re collecting data for any other malicious reason

    When you collect data, you need to have the right intentions to maintain proper data ethics; otherwise, you could harm your brand, break trust and ruin your reputation.

    5. Outcomes

    You may have the best intentions, but sometimes, there are negative outcomes from data use.

    For example, British Airways’ intention was not to allow hackers to gain access and harm their users. But the reality is that their customers’ data was stolen and used for malicious purposes. While this isn’t technically unlawful, the outcome of collecting data ended badly.

    To ensure proper data ethics, you must have good standing with your data. This means protecting your users at all costs, maintaining a good reputation and ensuring proper privacy measures are set up.

    How to implement data ethics as a business leader

    As a business leader, CTO or CEO, it’s your responsibility to implement data ethics within your organisation. Here are some tips for implementing data ethics based on the size and stage of your organisation:

    Startups

    If you’re a startup, you need to be mindful of which technology and tools you use to collect, store and use data to help you grow your business.

    It can be a real challenge to juggle all the moving parts of a startup since things can change so quickly. However, it’s crucial to establish a leader and allow easy access to ethical analysis resources to maintain proper data ethics early on.

    Small and medium-sized businesses

    As you begin scaling, you’ll likely be using even more technology. With each new business technique you implement, there will be new ways you’ll be collecting user data. 

    One of the key processes involved in managing data as you grow is to hire engineers who build out different technologies. You must have protocols, best practices and management overseeing the new technologies being built to ensure proper data ethics.

    Global businesses

    Have you scaled internationally?

    There will be even more rules, laws, regulations and organisations to answer to if you start managing data unethically.

    You should have established teams or departments to ensure you follow proper privacy and data protocols worldwide. When you have a large organisation, you have more money and vast amounts of data. This makes you a bigger target for leaks, ransomware and hackers.

    You should ensure you have cross-departmental groups working to establish ongoing protocols and training to keep your data management in good standing.

    Leverage data ethically with Matomo

    Data is powerful.

    It’s a crucial point of leverage that’s required to stay competitive.

    However, improper use and management of data can give you a bad reputation, break trust and even cause you legal trouble.

    That’s why you must maintain good data ethics within your organisation.

    One of the most important places to set up proper data ethics and privacy measures is with your website analytics.

    Matomo is the leading, privacy-friendly web analytics solution in the world. It automatically collects, stores, and tracks data across your website ethically.

    With over 1 million websites using Matomo, you get to take full control over your website performance with:

    • Accurate data (no data sampling)
    • Privacy-friendly and GDPR-compliant analytics
    • Open-source for transparency and to create a custom solution for you

    Try Matomo free for 21 days. No credit card required.

  • Feeding MediaCodec with byte data from AVPacket : problems with output buffers

    2 March 2016, by serg66

    Description of my task:
    I’m developing a video player on Android (API >= 17). It has to work with both HLS and multicast video. In addition, it has to support multiple audio tracks.

    Why I decided to use ffmpeg:

    • On some devices MediaPlayer doesn’t support multicast-video
    • MediaExtractor doesn’t work with HLS (getTrackCount() returns 0)
    • ffmpeg works both with HLS and multicast

    My idea:
    I demux the stream with ffmpeg in a loop. I read the CSD from videoStream->codec->extradata and use it to configure the MediaFormat. On each iteration, when a new video AVPacket is available, I run its buffer through the h264_mp4toannexb bitstream filter (set up with av_bitstream_filter_init). Then I call the Java method onNewVideoData, in which I fetch the AVPacket byte array, clear the available input buffer and fill it with the new data. I also get the pts. Since the stream has no defined beginning, I additionally rebase the timestamps: the pts of the first AVPacket is assigned 0 and is subtracted from all the following pts values. Then I call queueInputBuffer to send the buffer to the decoder.

    I use two threads: one that reads packets and submits them to the input buffers, and another that dequeues the output buffers and posts them to the Surface.
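
    The pts rebasing described above (the first packet’s pts becomes zero and is subtracted from every later pts) can be sketched in isolation. PtsRebaser is a hypothetical helper written for illustration, not part of the player code below:

```java
// Sketch of the pts rebasing described in the question: the first
// packet's pts becomes the new zero point. Assumes non-negative pts.
public class PtsRebaser {
    private long firstPts = -1; // -1 means no packet seen yet

    public long rebase(long pts) {
        if (firstPts < 0) {
            firstPts = pts; // remember the stream's starting offset
            return 0;       // the first frame is presented at t = 0
        }
        return pts - firstPts;
    }
}
```

    Note that with B-frames a later packet can carry a smaller pts than the first one seen, so rebased values may be negative, as the decoder output log below also shows.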

    The full player C code:

    #include <jni.h>
    #include <android/log.h>

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/buffer.h>

    #define TAG "ffmpegPlayer"

    struct
    {
       const char* url;
       jint width;
       jint height;
       jfloat aspectRatio;
       jint streamsCount;
       AVFormatContext* formatContext;
       AVStream* videoStream;
    } context;

    AVPacket packet;
    AVBitStreamFilterContext* avBitStreamFilterContext;

    JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getCsdNative(JNIEnv* env, jobject x)
    {
       jbyteArray arr = (*env)->NewByteArray(env, context.videoStream->codec->extradata_size);
       (*env)->SetByteArrayRegion(env, arr, 0, context.videoStream->codec->extradata_size, (jbyte*)context.videoStream->codec->extradata);

       return arr;
    }

    JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getWidthNative(JNIEnv* env, jobject x)
    {
       return context.width;
    }

    JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getHeightNative(JNIEnv* env, jobject x)
    {
       return context.height;
    }

    JNIEXPORT jfloat JNICALL Java_com_example_app_FfmpegPlayer_getAspectRatioNative(JNIEnv* env, jobject x)
    {
       return context.aspectRatio;
    }

    JNIEXPORT jint JNICALL Java_com_example_app_FfmpegPlayer_getStreamsCountNative(JNIEnv* env, jobject x)
    {
       return context.streamsCount;
    }

    JNIEXPORT jlong JNICALL Java_com_example_app_FfmpegPlayer_getPtsNative(JNIEnv* env, jobject obj)
    {
       return packet.pts * av_q2d(context.videoStream->time_base) * 1000000;
    }

    JNIEXPORT jboolean JNICALL Java_com_example_app_FfmpegPlayer_initNative(JNIEnv* env, jobject obj, const jstring u)
    {
       av_register_all();
       avBitStreamFilterContext = av_bitstream_filter_init("h264_mp4toannexb");

       const char* url = (*env)->GetStringUTFChars(env, u , NULL);
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "Init: %s", url);

       AVFormatContext* formatContext = NULL;
       if (avformat_open_input(&formatContext, url, NULL, NULL) < 0) {
           __android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to open input");
           return JNI_FALSE;
       }

       if (avformat_find_stream_info(formatContext, NULL) < 0) {
           __android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find stream info");
           return JNI_FALSE;
       }

       AVInputFormat * iformat = formatContext->iformat;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "format: %s", iformat->name);

       context.streamsCount = formatContext->nb_streams;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "Streams count: %d", formatContext->nb_streams);

       int i = 0;
       AVStream* videoStream = NULL;
       AVDictionaryEntry* lang;
       for (i = 0; i < formatContext->nb_streams; i++) {
           int codecType = formatContext->streams[i]->codec->codec_type;
           if (videoStream == NULL && codecType == AVMEDIA_TYPE_VIDEO) {
               videoStream = formatContext->streams[i];
           }
           else if (codecType == AVMEDIA_TYPE_AUDIO) {
               lang = av_dict_get(formatContext->streams[i]->metadata, "language", NULL, 0);
               if (lang != NULL) {
                   __android_log_print(ANDROID_LOG_DEBUG, TAG, "Audio stream %d: %s", i, lang->value);
               }
           }
       }
       if (videoStream == NULL) {
           __android_log_print(ANDROID_LOG_ERROR, TAG, "Unable to find video stream");
           return JNI_FALSE;
       }
       context.videoStream = videoStream;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "Video stream:  %d", videoStream->index);

       AVCodecContext *codecContext = formatContext->streams[videoStream->index]->codec;

       __android_log_print(ANDROID_LOG_DEBUG, TAG, "width: %d, height: %d", codecContext->width, codecContext->height);
       context.width = codecContext->width;
       context.height = codecContext->height;

       AVRational aspectRatio = codecContext->sample_aspect_ratio;
       __android_log_print(ANDROID_LOG_DEBUG, TAG, "aspect ratio: %d/%d", aspectRatio.num, aspectRatio.den);
       context.aspectRatio = (float) aspectRatio.num / aspectRatio.den; // cast avoids integer division

       context.formatContext = formatContext;

       return JNI_TRUE;
    }

    void filterPacket()
    {
       av_bitstream_filter_filter(avBitStreamFilterContext, context.videoStream->codec, NULL, &packet.data, &packet.size, packet.data, packet.size, packet.flags);
    }

    JNIEXPORT void JNICALL Java_com_example_app_FfmpegPlayer_startNative(JNIEnv* env, jobject obj)
    {
       jclass cl = (*env)->GetObjectClass(env, obj);
       jmethodID updateMethodId = (*env)->GetMethodID(env, cl, "onNewVideoData", "()V");

       while (av_read_frame(context.formatContext, &packet) >= 0) {
           if (context.formatContext == NULL) {
               return;
           }
           if (packet.stream_index == context.videoStream->index) {
               filterPacket();
               (*env)->CallVoidMethod(env, obj, updateMethodId);
           }
       }
    }

    JNIEXPORT jbyteArray JNICALL Java_com_example_app_FfmpegPlayer_getVideoDataNative(JNIEnv* env, jobject obj)
    {
       AVBufferRef *buf = packet.buf;

       jbyteArray arr = (*env)->NewByteArray(env, buf->size);
       (*env)->SetByteArrayRegion(env, arr, 0, buf->size, (jbyte*)buf->data);

       return arr;
    }

    The full Java code:

    package com.example.app;


    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;

    import java.nio.ByteBuffer;

    public class FfmpegPlayer {

       static {
           System.loadLibrary("avutil-54");
           System.loadLibrary("swscale-3");
           System.loadLibrary("swresample-1");
           System.loadLibrary("avcodec-56");
           System.loadLibrary("avformat-56");
           System.loadLibrary("avfilter-5");
           System.loadLibrary("ffmpeg-player");
       }

       private native boolean initNative(String url);
       private native void startNative();
       private native int getWidthNative();
       private native int getHeightNative();
       private native float getAspectRatioNative();
       private native byte[] getVideoDataNative();
       private native long getPtsNative();
       private native byte[] getCsdNative();

       private String source;
       private PlayerThread playerThread;
       private int width;
       private int height;
       private MediaCodec decoder;
       private ByteBuffer[] inputBuffers;
       private Surface surface;
       private long firstPtsTime;

       public FfmpegPlayer(Surface surface) {
           this.surface = surface;
       }

       public void setDataSource(String source) {
           if (!initNative(source)) {
               return;
           }
           width = getWidthNative();
           height = getHeightNative();
           MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
           format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, width * height);
           format.setByteBuffer("csd-0", ByteBuffer.wrap(getCsdNative()));
           LogUtils.log("CSD: ");
           outputAsHex(getCsdNative());
           try {
               decoder = MediaCodec.createDecoderByType("video/avc");
               decoder.configure(format, surface, null, 0);
               decoder.start();

               playerThread = new PlayerThread();
               playerThread.start();

               new OutputThread().run();
           }
           catch (Exception e) {
               e.printStackTrace();
           }
       }

       public void onNewVideoData() {
           int index = decoder.dequeueInputBuffer(0);
           if (index >= 0) {
               byte[] data = getVideoDataNative();
               ByteBuffer byteBuffer = decoder.getInputBuffers()[index];
               byteBuffer.clear();
               byteBuffer.put(data);
               long pts = getPtsNative();

               LogUtils.log("Input AVPacket pts: " + pts);
               LogUtils.log("Input AVPacket data length: " + data.length);
               LogUtils.log("Input AVPacket data: ");
               outputAsHex(data);

               if (firstPtsTime == 0) {
                   firstPtsTime = pts;
                   pts = 0;
               }
               else {
                   pts -= firstPtsTime;
               }
               decoder.queueInputBuffer(index, 0, data.length, pts, 0);
           }
       }

       private void outputAsHex(byte[] data) {
           String[] test = new String[data.length];
           for (int i = 0; i < data.length; i++) {
               test[i] = String.format("%02x", data[i]);
           }
           LogUtils.log(test);
       }

       private class PlayerThread extends Thread {
           @Override
           public void run() {
               super.run();

               startNative();
           }
       }

       private class OutputThread extends Thread {

           @Override
           public void run() {
               super.run();
               MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
               while (true) {
                   int index = decoder.dequeueOutputBuffer(info, 0);
                   if (index >= 0) {
                       ByteBuffer buffer = decoder.getOutputBuffers()[index];
                       buffer.position(info.offset);
                       buffer.limit(info.offset + info.size);
                       byte[] test = new byte[info.size];
                       for (int i = 0; i < info.size; i++) {
                           test[i] = buffer.get(i);
                       }
                       LogUtils.log("Output info: size=" + info.size + ", presentationTimeUs=" + info.presentationTimeUs + ",offset=" + info.offset + ",flags=" + info.flags);
                       LogUtils.log("Output data: ");
                       outputAsHex(test);
                       decoder.releaseOutputBuffer(index, true);
                   }
               }
           }
       }
    }

    The problem:
    For the tests I used a TS file with the following video stream:

    Codec: H264 - MPEG-4 AVC (part 10) (h264)
    Resolution: 720x578
    Frame rate: 25
    Decoded format: Planar 4:2:0 YUV

    The CSD is the following:

    [00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01, 28, ee, 3c, 80]

    On different devices I get different results, but I could not get the video to show on the Surface.

    Input:

    Input AVPacket pts: 351519222
    Input AVPacket data length: 54941
    Input AVPacket data: [00, 00, 00, 01, 09, 10, 00, 00, 00, 01, 27, 4d, 40, 1e, 9a, 62, 01, 68, 48, b0, 44, 20, a0, a0, a8, 00, 00, 03, 00, 08, 00, 00, 03, 01, 94, a0, 00, 00, 00, 01,...]
    ------------------------------------
    Input AVPacket pts: 351539222
    Input AVPacket data length: 9605
    Input AVPacket data: [00, 00, 00, 01, 09, 30, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, e3, bd, da, e4, 46, c5, 8b, 6b, 7d, 07, 59, 23, 6f, 92, e9, fb, 3b, b9, 4d, f9,...]
    ------------------------------------
    Input AVPacket pts: 351439222
    Input AVPacket data length: 1985
    Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 14, 80, 00, 00, 00, 01, 21, a8, f2, 74, 69, 14, 54, 4d, c5, 8b, e8, 42, 52, ac, 80, 53, b4, 4d, 24, 1f, 6c,...]
    ------------------------------------
    Input AVPacket pts: 351459222
    Input AVPacket data length: 2121
    Input AVPacket data: [00, 00, 00, 01, 09, 50, 00, 00, 00, 01, 06, 01, 01, 24, 80, 00, 00, 00, 01, 21, a8, f3, 74, e9, 0b, 8b, 17, e8, 43, f8, 10, 88, ca, 2b, 11, 53, c8, 31, f0, 0b,...]
    ... on and on

    Asus Zenfone (Android 5.0.2) output thread (after decoding: strange results, 25 buffers with only 8 bytes of data each):

    Output info: size=8, presentationTimeUs=-80001,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 90, c5, 99, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=0,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 78, ea, 86, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=720000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, 86, b6, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=780000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, c0, cb, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=840000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 80, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=960000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 3f, 8b, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1040000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, f8, 76, 85, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1180000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1260000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, b5, d2, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1800000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 90, c5, 99, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=1860000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, c0, 84, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=2080000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, c0, cb, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=3440000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 80, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=3520000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 78, ea, 86, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4160000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, 86, b6, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4300000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 3f, 8b, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4400000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 90, c5, 99, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4480000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, f8, 76, 85, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4680000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, c0, cb, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4720000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, c0, 84, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4760000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e0, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=4800000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 58, 54, 83, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5040000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, e8, b5, d2, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5100000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, 80, 87, 93, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5320000,offset=0,flags=0
    Output data:
    [01, 00, 00, 00, 78, ea, 86, ac]
    ---------------------------
    Output info: size=8, presentationTimeUs=5380000,offset=0,flags=1
    Output data:
    [01, 00, 00, 00, e8, 86, b6, ac]
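
    A note on the payloads above: every 8-byte output starts with 01 00 00 00 followed by what looks like a pointer-sized value (all in the 0xac… range). One way to inspect them is to read each payload as two little-endian 32-bit words; the interpretation that this is a metadata-mode (type, native-buffer-handle) pair rather than decoded sample data is my assumption, not something the logs confirm:

```python
import struct

# Hypothetical decoding: treat each 8-byte output payload as two
# little-endian 32-bit words. If the decoder is emitting metadata/native
# buffers, this would be a (type, buffer-handle) pair rather than pixel or
# PCM data -- an assumption, not confirmed by the logs.
payload = bytes([0x01, 0x00, 0x00, 0x00, 0xC0, 0xCB, 0x93, 0xAC])
kind, handle = struct.unpack("<II", payload)
print(kind, hex(handle))  # 1 0xac93cbc0
```

    Under that reading, every buffer in the log has kind = 1 and a handle-like second word, which would explain why the "data" never looks like decoded frames.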

    Other Asus Zenfone logs :

    01-25 17:11:36.859 4851-4934/com.example.app I/OMXClient: Using client-side OMX mux.
    01-25 17:11:36.865 317-1075/? I/OMX-VDEC-1080P: component_init: OMX.qcom.video.decoder.avc : fd=43
    01-25 17:11:36.867 317-1075/? I/OMX-VDEC-1080P: Capabilities: driver_name = msm_vidc_driver, card = msm_vdec_8974, bus_info = , version = 1, capabilities = 4003000
    01-25 17:11:36.881 317-1075/? I/OMX-VDEC-1080P: omx_vdec::component_init() success : fd=43
    01-25 17:11:36.885 4851-4934/com.example.app I/ACodec: [OMX.qcom.video.decoder.avc] DRC Mode: Dynamic Buffer Mode
    01-25 17:11:36.893 317-20612/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.933 317-12269/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.935 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.957 317-5559/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.957 4851-4934/com.example.app I/ExtendedCodec: Decoder will be in frame by frame mode
    01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.963 317-1075/? E/C2DColorConvert: unknown format passed for luma alignment number
    01-25 17:11:36.964 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
    01-25 17:11:37.072 317-20612/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.describeColorFormat not implemented
    01-25 17:11:37.072 4851-4934/com.example.app W/ACodec: do not know color format 0x7fa30c04 = 2141391876

    Asus Nexus 7 (Android 6.0.1) crashes :

    01-25 17:23:06.921 11602-11695/com.example.app I/OMXClient: Using client-side OMX mux.
    01-25 17:23:06.952 11602-11694/com.example.app I/MediaCodec: [OMX.qcom.video.decoder.avc] setting surface generation to 11880449
    01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeANWBufferInMetadata not implemented
    01-25 17:23:06.954 194-194/? E/OMX-VDEC-1080P: Extension: OMX.google.android.index.storeMetaDataInBuffers not implemented
    01-25 17:23:06.954 194-194/? E/OMXNodeInstance: getExtensionIndex(45:qcom.decoder.avc, OMX.google.android.index.storeMetaDataInBuffers) ERROR: NotImplemented(0x80001006)
    01-25 17:23:06.954 11602-11695/com.example.app E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
    01-25 17:23:06.963 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
    01-25 17:23:06.967 194-604/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
    01-25 17:23:07.203 11602-11695/com.example.app W/AHierarchicalStateMachine: Warning message AMessage(what = 'omxI') = {
                                                                            int32_t type = 0
                                                                            int32_t event = 2130706432
                                                                            int32_t data1 = 1
                                                                            int32_t data2 = 0
                                                                          } unhandled in root state.
    01-25 17:23:07.232 11602-11695/com.example.app D/SurfaceUtils: set up nativeWindow 0xa0b7a108 for 720x576, color 0x7fa30c03, rotation 0, usage 0x42002900
    01-25 17:23:07.241 194-194/? E/OMX-VDEC-1080P: GET_MV_BUFFER_SIZE returned: Size: 122880 and alignment: 8192
    01-25 17:23:07.242 194-194/? E/OMX-VDEC-1080P: Insufficient sized buffer given for playback, expected 671744, got 663552
    01-25 17:23:07.242 194-194/? E/OMXNodeInstance: useBuffer(45:qcom.decoder.avc, Output:1 671744@0xb60a0860) ERROR: BadParameter(0x80001005)
    01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: registering GraphicBuffer 0 with OMX IL component failed: -2147483648
    01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: Failed to allocate output port buffers after port reconfiguration: (-2147483648)
    01-25 17:23:07.243 11602-11695/com.example.app E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
    01-25 17:23:07.243 11602-11694/com.example.app E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err: java.lang.IllegalStateException
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.media.MediaCodec.native_dequeueOutputBuffer(Native Method)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.media.MediaCodec.dequeueOutputBuffer(MediaCodec.java:2379)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at com.example.app.FfmpegPlayer$OutputThread.run(FfmpegPlayer.java:122)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at com.example.app.FfmpegPlayer.setDataSource(FfmpegPlayer.java:66)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at com.example.app.activities.TestActivity$2.surfaceCreated(TestActivity.java:151)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.view.SurfaceView.updateWindow(SurfaceView.java:583)
    01-25 17:23:07.245 11602-11602/com.example.app W/System.err:     at android.view.SurfaceView$3.onPreDraw(SurfaceView.java:177)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:944)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2055)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1107)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6013)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer$CallbackRecord.run(Choreographer.java:858)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer.doCallbacks(Choreographer.java:670)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer.doFrame(Choreographer.java:606)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:844)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.os.Handler.handleCallback(Handler.java:739)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:95)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.os.Looper.loop(Looper.java:148)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at android.app.ActivityThread.main(ActivityThread.java:5417)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at java.lang.reflect.Method.invoke(Native Method)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
    01-25 17:23:07.246 11602-11602/com.example.app W/System.err:     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
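
    For reference, the numbers in the "Insufficient sized buffer" line are self-consistent with a padded YUV 4:2:0 buffer geometry. A minimal sketch of the arithmetic (the 128-pixel stride alignment is my assumption about this Qualcomm decoder, not something the log states):

```python
# Sanity-check the numbers from "expected 671744, got 663552" above.
# 663552 matches a YUV 4:2:0 frame with the 720-pixel width padded to a
# 768-byte stride (alignment to 128 is an assumption); the expected size is
# exactly the reported 8192-byte alignment larger than what was given.
width, height = 720, 576
stride = -(-width // 128) * 128       # round width up to a multiple of 128
given = stride * height * 3 // 2      # YUV420: 1.5 bytes per pixel
expected = given + 8192               # plus the reported 8192 alignment
print(stride, given, expected)        # 768 663552 671744
```

    So the Surface-allocated graphic buffers appear to be one alignment unit smaller than what the decoder demands after port reconfiguration, which is consistent with the useBuffer BadParameter error that follows.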

    Another device always returns empty output buffers, even though the buffer indexes are >= 0.

    What am I doing wrong ?

  • HTTP Livestreaming with ffmpeg

    12 décembre 2020, par Hugo

    Some context : I have an MKV file, and I am attempting to stream it to http://localhost:8090/test.flv as an FLV file.

    The stream begins and then immediately ends.

    The command I am using is :

    sudo ffmpeg -re -i input.mkv -c:v libx264 -maxrate 1000k -bufsize 2000k -an -bsf:v h264_mp4toannexb -g 50 http://localhost:8090/test.flv

    A breakdown of what these options do, in case this post becomes useful for someone else :

    sudo

    Run as root.

    ffmpeg

    The FFmpeg executable.

    -re

    Read the input at its native frame rate, i.e. stream in real time rather than as fast as possible.

    -i input.mkv

    Input option and path to the input file.

    -c:v libx264

    Encode the video stream with the libx264 H.264 encoder.

    -maxrate 1000k -bufsize 2000k

    Rate-control options : cap the video bitrate at 1000 kbit/s, smoothed over a 2000 kbit VBV buffer.

    -an -bsf:v h264_mp4toannexb

    -an disables audio ; -bsf:v h264_mp4toannexb applies a bitstream filter that converts the H.264 stream to Annex B byte-stream format.

    -g 50

    Sets the GOP size : one keyframe every 50 frames (this is the keyframe interval, not the frame rate).

    http://localhost:8090/test.flv

    Output using the http protocol to localhost on port 8090, as a file called test.flv.
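
    As an aside on the output URL : with a stock ffserver setup like the config quoted further down, ffserver normally expects ffmpeg to POST to the declared feed (feed1.ffm here) and then serves clients the streams it derives from that feed ; pushing straight to test.flv will 404 unless a matching stream is declared. A sketch of how the command line would change under that assumption (illustrative only) :

```python
# Illustrative sketch (assumption): with the quoted ffserver config, the
# encoder should push to the declared feed URL (File /tmp/feed1.ffm =>
# http://localhost:8090/feed1.ffm); viewers then pull the streams that
# ffserver builds on top of that feed.
args = [
    "ffmpeg", "-re", "-i", "input.mkv",
    "-c:v", "libx264", "-maxrate", "1000k", "-bufsize", "2000k",
    "-an",
    "http://localhost:8090/feed1.ffm",  # push target: the feed, not test.flv
]
print(" ".join(args))
```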

    Anyway, the actual issue I have is that it begins to stream for about a second and then immediately ends.

    The ffmpeg command's output :

    ffmpeg version N-80901-gfebc862 Copyright (c) 2000-2016 the FFmpeg developers
      built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
      configuration: --extra-libs=-ldl --prefix=/opt/ffmpeg --mandir=/usr/share/man --enable-avresample --disable-debug --enable-nonfree --enable-gpl --enable-version3 --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-decoder=amrnb --disable-decoder=amrwb --enable-libpulse --enable-libfreetype --enable-gnutls --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libvorbis --enable-libmp3lame --enable-libopus --enable-libvpx --enable-libspeex --enable-libass --enable-avisynth --enable-libsoxr --enable-libxvid --enable-libvidstab
      libavutil      55. 28.100 / 55. 28.100
      libavcodec     57. 48.101 / 57. 48.101
      libavformat    57. 41.100 / 57. 41.100
      libavdevice    57.  0.102 / 57.  0.102
      libavfilter     6. 47.100 /  6. 47.100
      libavresample   3.  0.  0 /  3.  0.  0
      libswscale      4.  1.100 /  4.  1.100
      libswresample   2.  1.100 /  2.  1.100
      libpostproc    54.  0.100 / 54.  0.100
    Input #0, matroska,webm, from 'input.mkv':
      Metadata:
        encoder         : libebml v1.3.0 + libmatroska v1.4.0
        creation_time   : 1970-01-01 00:00:02
      Duration: 00:01:32.26, start: 0.000000, bitrate: 4432 kb/s
        Stream #0:0(eng): Video: h264 (High 10), yuv420p10le, 1920x1080 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 1k tbn, 47.95 tbc (default)
        Stream #0:1(nor): Audio: flac, 48000 Hz, stereo, s16 (default)
    [libx264 @ 0x2e1c380] using SAR=1/1
    [libx264 @ 0x2e1c380] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
    [libx264 @ 0x2e1c380] profile High, level 4.0
    [libx264 @ 0x2e1c380] 264 - core 148 r2643 5c65704 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=1 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=50 keyint_min=5 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=1000 vbv_bufsize=2000 crf_max=0.0 nal_hrd=none filler=0 ip_ratio=1.40 aq=1:1.00
    [flv @ 0x2e3f0a0] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
    Output #0, flv, to 'http://localhost:8090/test.flv':
      Metadata:
        encoder         : Lavf57.41.100
        Stream #0:0(eng): Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], q=-1--1, 23.98 fps, 1k tbn, 23.98 tbc (default)
        Metadata:
          encoder         : Lavc57.48.101 libx264
        Side data:
          cpb: bitrate max/min/avg: 1000000/0/0 buffer size: 2000000 vbv_delay: -1
    Stream mapping:
      Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    Killed   26 fps= 26 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A speed=   0x

    The ffserver output :

    Sat Aug 20 12:40:11 2016 File '/test.flv' not found
    Sat Aug 20 12:40:11 2016 [SERVER IP] - - [POST] "/test.flv HTTP/1.1" 404 189

    The config file is :

    #Sample ffserver configuration file

    # Port on which the server is listening. You must select a different
    # port from your standard HTTP web server if it is running on the same
    # computer.
    Port 8090

    # Address on which the server is bound. Only useful if you have
    # several network interfaces.
    BindAddress 0.0.0.0

    # Number of simultaneous HTTP connections that can be handled. It has
    # to be defined *before* the MaxClients parameter, since it defines the
    # MaxClients maximum limit.
    MaxHTTPConnections 2000

    # Number of simultaneous requests that can be handled. Since FFServer
    # is very fast, it is more likely that you will want to leave this high
    # and use MaxBandwidth, below.
    MaxClients 1000

    # This the maximum amount of kbit/sec that you are prepared to
    # consume when streaming to clients.
    MaxBandwidth 1000

    # Access log file (uses standard Apache log file format)
    # '-' is the standard output.
    CustomLog -

    # Suppress that if you want to launch ffserver as a daemon.
    #NoDaemon


    ##################################################################
    # Definition of the live feeds. Each live feed contains one video
    # and/or audio sequence coming from an ffmpeg encoder or another
    # ffserver. This sequence may be encoded simultaneously with several
    # codecs at several resolutions.

    <feed>

    ACL allow 192.168.0.0 192.168.255.255

    # You must use 'ffmpeg' to send a live feed to ffserver. In this
    # example, you can type:
    #
    #ffmpeg http://localhost:8090/test.ffm

    # ffserver can also do time shifting. It means that it can stream any
    # previously recorded live stream. The request should contain:
    # "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]".You must specify
    # a path where the feed is stored on disk. You also specify the
    # maximum size of the feed, where zero means unlimited. Default:
    # File=/tmp/feed_name.ffm FileMaxSize=5M
    File /tmp/feed1.ffm
    FileMaxSize 200m

    # You could specify
    # ReadOnlyFile /saved/specialvideo.ffm
    # This marks the file as readonly and it will not be deleted or updated.

    # Specify launch in order to start ffmpeg automatically.
    # First ffmpeg must be defined with an appropriate path if needed,
    # after that options can follow, but avoid adding the http:// field
    #Launch ffmpeg

    # Only allow connections from localhost to the feed.
        ACL allow 127.0.0.1

    </feed>


    ##################################################################
    # Now you can define each stream which will be generated from the
    # original audio and video stream. Each format has a filename (here
    # 'test1.mpg'). FFServer will send this stream when answering a
    # request containing this filename.

    <stream>

    # coming from live feed 'feed1'
    Feed feed1.ffm

    # Format of the stream : you can choose among:
    # mpeg       : MPEG-1 multiplexed video and audio
    # mpegvideo  : only MPEG-1 video
    # mp2        : MPEG-2 audio (use AudioCodec to select layer 2 and 3 codec)
    # ogg        : Ogg format (Vorbis audio codec)
    # rm         : RealNetworks-compatible stream. Multiplexed audio and video.
    # ra         : RealNetworks-compatible stream. Audio only.
    # mpjpeg     : Multipart JPEG (works with Netscape without any plugin)
    # jpeg       : Generate a single JPEG image.
    # asf        : ASF compatible streaming (Windows Media Player format).
    # swf        : Macromedia Flash compatible stream
    # avi        : AVI format (MPEG-4 video, MPEG audio sound)
    Format mpeg

    # Bitrate for the audio stream. Codecs usually support only a few
    # different bitrates.
    AudioBitRate 32

    # Number of audio channels: 1 = mono, 2 = stereo
    AudioChannels 2

    # Sampling frequency for audio. When using low bitrates, you should
    # lower this frequency to 22050 or 11025. The supported frequencies
    # depend on the selected audio codec.
    AudioSampleRate 44100

    # Bitrate for the video stream
    VideoBitRate 64

    # Ratecontrol buffer size
    VideoBufferSize 40

    # Number of frames per second
    VideoFrameRate 3

    # Size of the video frame: WxH (default: 160x128)
    # The following abbreviations are defined: sqcif, qcif, cif, 4cif, qqvga,
    # qvga, vga, svga, xga, uxga, qxga, sxga, qsxga, hsxga, wvga, wxga, wsxga,
    # wuxga, woxga, wqsxga, wquxga, whsxga, whuxga, cga, ega, hd480, hd720,
    # hd1080
    VideoSize hd1080

    # Transmit only intra frames (useful for low bitrates, but kills frame rate).
    #VideoIntraOnly

    # If non-intra only, an intra frame is transmitted every VideoGopSize
    # frames. Video synchronization can only begin at an intra frame.
    VideoGopSize 12

    # More MPEG-4 parameters
    # VideoHighQuality
    # Video4MotionVector

    # Choose your codecs:
    #AudioCodec mp2
    #VideoCodec mpeg1video

    # Suppress audio
    #NoAudio

    # Suppress video
    #NoVideo

    #VideoQMin 3
    #VideoQMax 31

    # Set this to the number of seconds backwards in time to start. Note that
    # most players will buffer 5-10 seconds of video, and also you need to allow
    # for a keyframe to appear in the data stream.
    #Preroll 15

    # ACL:

    # You can allow ranges of addresses (or single addresses)
    ACL ALLOW localhost

    # You can deny ranges of addresses (or single addresses)
    #ACL DENY <first address>

    # You can repeat the ACL allow/deny as often as you like. It is on a per
    # stream basis. The first match defines the action. If there are no matches,
    # then the default is the inverse of the last ACL statement.
    #
    # Thus 'ACL allow localhost' only allows access from localhost.
    # 'ACL deny 1.0.0.0 1.255.255.255' would deny the whole of network 1 and
    # allow everybody else.

    </stream>


    ##################################################################
    # Example streams


    # Multipart JPEG

    #<stream>
    #Feed feed1.ffm
    #Format mpjpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #Strict -1
    #</stream>


    # Single JPEG

    #<stream>
    #Feed feed1.ffm
    #Format jpeg
    #VideoFrameRate 2
    #VideoIntraOnly
    ##VideoSize 352x240
    #NoAudio
    #Strict -1
    #</stream>


    # Flash

    #<stream>
    #Feed feed1.ffm
    #Format swf
    #VideoFrameRate 2
    #VideoIntraOnly
    #NoAudio
    #</stream>


    # ASF compatible

    <stream>
    Feed feed1.ffm
    Format asf
    VideoFrameRate 15
    VideoSize 352x240
    VideoBitRate 256
    VideoBufferSize 40
    VideoGopSize 30
    AudioBitRate 64
    StartSendOnKey
    </stream>


    # MP3 audio

    #<stream>
    #Feed feed1.ffm
    #Format mp2
    #AudioCodec mp3
    #AudioBitRate 64
    #AudioChannels 1
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Ogg Vorbis audio

    #<stream>
    #Feed feed1.ffm
    #Title "Stream title"
    #AudioBitRate 64
    #AudioChannels 2
    #AudioSampleRate 44100
    #NoVideo
    #</stream>


    # Real with audio only at 32 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #NoVideo
    #NoAudio
    #</stream>


    # Real with audio and video at 64 kbits

    #<stream>
    #Feed feed1.ffm
    #Format rm
    #AudioBitRate 32
    #VideoBitRate 128
    #VideoFrameRate 25
    #VideoGopSize 25
    #NoAudio
    #</stream>


    ##################################################################
    # A stream coming from a file: you only need to set the input
    # filename and optionally a new format. Supported conversions:
    #    AVI -> ASF

    #<stream>
    #File "/usr/local/httpd/htdocs/tlive.rm"
    #NoAudio
    #</stream>

    #<stream>
    #File "/usr/local/httpd/htdocs/test.asf"
    #NoAudio
    #Author "Me"
    #Copyright "Super MegaCorp"
    #Title "Test stream from disk"
    #Comment "Test comment"
    #</stream>


    ##################################################################
    # RTSP examples
    #
    # You can access this stream with the RTSP URL:
    #   rtsp://localhost:5454/test1-rtsp.mpg
    #
    # A non-standard RTSP redirector is also created. Its URL is:
    #   http://localhost:8090/test1-rtsp.rtsp

    #<stream>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #</stream>


    # Transcode an incoming live feed to another live feed,
    # using libx264 and video presets

    #<stream>
    #Format rtp
    #Feed feed1.ffm
    #VideoCodec libx264
    #VideoFrameRate 24
    #VideoBitRate 100
    #VideoSize 480x272
    #AVPresetVideo default
    #AVPresetVideo baseline
    #AVOptionVideo flags +global_header
    #
    #AudioCodec libfaac
    #AudioBitRate 32
    #AudioChannels 2
    #AudioSampleRate 22050
    #AVOptionAudio flags +global_header
    #</stream>

    ##################################################################
    # SDP/multicast examples
    #
    # If you want to send your stream in multicast, you must set the
    # multicast address with MulticastAddress. The port and the TTL can
    # also be set.
    #
    # An SDP file is automatically generated by ffserver by adding the
    # 'sdp' extension to the stream name (here
    # http://localhost:8090/test1-sdp.sdp). You should usually give this
    # file to your player to play the stream.
    #
    # The 'NoLoop' option can be used to avoid looping when the stream is
    # terminated.

    #<stream>
    #Format rtp
    #File "/usr/local/httpd/htdocs/test1.mpg"
    #MulticastAddress 224.124.0.1
    #MulticastPort 5000
    #MulticastTTL 16
    #NoLoop
    #</stream>


    ##################################################################
    # Special streams

    # Server status

    <stream>
    Format status

    # Only allow local people to get the status
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255

    #FaviconURL http://pond1.gladstonefamily.net:8080/favicon.ico
    </stream>


    # Redirect index.html to the appropriate site

    <redirect>
    URL http://www.ffmpeg.org/
    </redirect>


    #http://www.ffmpeg.org/

    Any help is greatly appreciated ; I will do my best to draw a picture of the best answer, based on their username.