
Other articles (93)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields. See the two images that follow for a comparison.
    To use it, activate the Chosen plugin (site general configuration > plugin management), then configure it (templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • The plugin: managing shared hosting (mutualisation)

    2 March 2010

    The mutualisation management plugin makes it possible to manage the various MediaSPIP channels from a master site. Its goal is to provide a pure SPIP solution to replace the previous one.
    Basic installation
    Install the SPIP files on the server.
    Then add the "mutualisation" plugin at the root of the site, as described here.
    Customise the central mes_options.php file as you wish. As an example, here is the one from the mediaspip.net platform:
    < ?php (...)

  • Managing the farm

    2 March 2010

    The farm as a whole is managed by "super admins".
    Some settings can be adjusted to regulate the needs of the different channels.
    To begin with, it uses the "Gestion de mutualisation" plugin

On other sites (11297)

  • Safari on Mac and iOS 14 Won't Play HTML5 MP4 Video

    10 March 2021, by Glen Elkins

    So I have developed a chat application that uses Node for the back-end. When a user selects a video on their iPhone it is usually in .mov format, so when it reaches the Node server it is converted to MP4 with FFmpeg. All that works fine; if I then load my chat in Chrome on my Mac, the video plays just fine as MP4.

    [screenshot: the video embed in the chat]

    The screenshot shows the video embed is there, set to MP4, yet it won't play in Safari on my Mac or my phone; in fact Safari shows the video as 0 seconds long, yet I can play it in Chrome and also download the MP4 file by accessing the embed URL directly.

    Any ideas? I converted to MP4 precisely to avoid things like this, but Safari doesn't seem to like even MP4 files.
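    One thing worth checking is how the MP4 is produced in the first place: Safari is pickier than Chrome about MP4s whose moov atom sits at the end of the file. As a sketch (Python here, with placeholder file names; this is not the asker's actual conversion code), a Safari-friendly FFmpeg invocation might look like:

```python
def safari_friendly_mp4_args(src, dst):
    # Build an ffmpeg argument list that re-encodes a .mov to H.264/AAC
    # and relocates the moov atom to the front of the file (+faststart),
    # so Safari can begin playback before the whole file has arrived.
    return [
        "ffmpeg",
        "-i", src,                  # input .mov from the phone
        "-c:v", "libx264",          # H.264 video
        "-c:a", "aac",              # AAC audio
        "-movflags", "+faststart",  # write the moov atom first
        dst,
    ]

args = safari_friendly_mp4_args("upload.mov", "converted.mp4")
print(" ".join(args))
```

    The list form (rather than one interpolated string) also sidesteps shell-quoting problems when paths contain spaces.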

    The back-end part that serves the private file is in Symfony 4 (PHP):

    /**
     * @Route("/private/files/download/{base64Path}", name="downloadFile")
     * @param string $base64Path
     * @param Request $request
     * @return Response
     */
    public function downloadFile(string $base64Path, Request $request) : Response
    {
        // get token
        if(!$token = $request->query->get('token')){
            return new Response('Access Denied', 403);
        }

        /** @var UserRepository $userRepo */
        $userRepo = $this->getDoctrine()->getRepository(User::class);

        /** @var User $user */
        if(!$user = $userRepo->findOneBy(['deleted'=>false,'active'=>true,'systemUser'=>false,'apiKey'=>$token])){
            return new Response('Access Denied', 403);
        }

        // get path
        if($path = base64_decode($base64Path)){

            // make sure the folder we need exists
            $fullPath = $this->getParameter('private_upload_folder') . '/' . $path;

            if(!file_exists($fullPath)){
                return new Response('File Not Found', 404);
            }

            $response = new Response();
            $response->headers->set('Content-Type', mime_content_type($fullPath));
            $response->headers->set('Content-Disposition', 'inline; filename="' . basename($fullPath) . '"');
            $response->headers->set('Content-Length', filesize($fullPath));
            $response->headers->set('Pragma', "no-cache");
            $response->headers->set('Expires', "0");
            $response->headers->set('Content-Transfer-Encoding', "binary");

            $response->sendHeaders();

            $response->setContent(readfile($fullPath));

            return $response;
        }

        return new Response('Invalid Path', 404);
    }


    This works fine everywhere except Safari when trying to embed the video. It's done this way because the videos are not public and need an access token.
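    The URL scheme itself can be sketched outside Symfony (Python, purely illustrative; the path below is made up): the relative storage path is base64-encoded into the URL and decoded server-side. A real implementation should also reject ".." segments, since base64-encoding alone does not prevent path traversal:

```python
import base64
import posixpath

def encode_media_path(rel_path):
    # Encode the relative storage path for use in the download URL,
    # mirroring PHP's base64_encode/base64_decode pair.
    return base64.b64encode(rel_path.encode()).decode()

def decode_media_path(token):
    # Decode and normalise; refuse anything escaping the upload folder.
    rel = base64.b64decode(token.encode()).decode()
    norm = posixpath.normpath(rel)
    if norm.startswith("..") or norm.startswith("/"):
        raise ValueError("path escapes the private upload folder")
    return norm

token = encode_media_path("chat/example.mp4")
print(decode_media_path(token))  # chat/example.mp4
```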

    UPDATE: Here is a test link to an MP4; you'll have to allow the insecure certificate as it's on a quick test sub-domain. If you open it in Chrome you'll see a 3-second video of my 3D printer curing station; if you load the same link in Safari, you'll see it doesn't work:

    https://tester.nibbrstaging.com/private/files/download/Y2hhdC83Nzk1Y2U2MC04MDFmLTExZWItYjkzYy1lZjI4ZGYwMDhkOTMubXA0?token=6ab1720bfe922d44208c25f655d61032


    The server runs on cPanel with Apache, and I think it might be something to do with the video needing to be streamed (served with byte-range support)?

    UPDATED CODE THAT WORKS IN SAFARI BUT IS NOW BROKEN IN CHROME:

    Chrome is now getting Content-Length: 0, but it's working fine in Safari.

    public function downloadFile(string $base64Path, Request $request) : ?Response
    {
        ob_clean();

        // get token
        if(!$token = $request->query->get('token')){
            return new Response('Access Denied', 403);
        }

        /** @var UserRepository $userRepo */
        $userRepo = $this->getDoctrine()->getRepository(User::class);

        /** @var User $user */
        if(!$user = $userRepo->findOneBy(['deleted'=>false,'active'=>true,'systemUser'=>false,'apiKey'=>$token])){
            return new Response('Access Denied', 403);
        }

        // get path
        if($path = base64_decode($base64Path)){

            // make sure the folder we need exists
            $fullPath = $this->getParameter('private_upload_folder') . '/' . $path;

            if(!file_exists($fullPath)){
                return new Response('File Not Found', 404);
            }

            $filesize = filesize($fullPath);
            $mime = mime_content_type($fullPath);

            header('Content-Type: ' . $mime);

            if(isset($_SERVER['HTTP_RANGE'])){

                // Parse the range header to get the byte offset
                $ranges = array_map(
                    'intval', // Parse the parts into integer
                    explode(
                        '-', // The range separator
                        substr($_SERVER['HTTP_RANGE'], 6) // Skip the `bytes=` part of the header
                    )
                );

                // If the last range param is empty, it means the EOF (End of File)
                if(!$ranges[1]){
                    $ranges[1] = $filesize - 1;
                }

                header('HTTP/1.1 206 Partial Content');
                header('Accept-Ranges: bytes');
                header('Content-Length: ' . ($ranges[1] - $ranges[0])); // The size of the range

                // Send the ranges we offered
                header(
                    sprintf(
                        'Content-Range: bytes %d-%d/%d', // The header format
                        $ranges[0], // The start range
                        $ranges[1], // The end range
                        $filesize // Total size of the file
                    )
                );

                // It's time to output the file
                $f = fopen($fullPath, 'rb'); // Open the file in binary mode
                $chunkSize = 8192; // The size of each chunk to output

                // Seek to the requested start range
                fseek($f, $ranges[0]);

                // Start outputting the data
                while(true){
                    // Check if we have outputted all the data requested
                    if(ftell($f) >= $ranges[1]){
                        break;
                    }

                    // Output the data
                    echo fread($f, $chunkSize);

                    // Flush the buffer immediately
                    @ob_flush();
                    flush();
                }
            }else{

                // It's not a range request, output the file anyway
                header('Content-Length: ' . $filesize);

                // Read the file
                @readfile($filesize);

                // and flush the buffer
                @ob_flush();
                flush();

            }

        }else {

            return new Response('Invalid Path', 404);
        }
    }


    I have noticed that Chrome is sending the range header like this:

    Range: bytes=611609-

    whereas Safari sends:

    Range: bytes=611609-61160

    So for some reason Chrome leaves out the second range value, which obviously means my code can't find an end number for the range.

    No matter what I do, I can't get it working in both Chrome and Safari. Safari wants the byte-range handling; Chrome seems to request a range, then sends a new request for the full file, but even the full-file branch of the code gives a 500 error. If I take out the byte-range bit, it works fine in Chrome but not Safari.
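    Both header forms are legal: RFC 7233 allows an open-ended range ("bytes=611609-"), which means "from that offset through the last byte". A sketch (Python, illustrative rather than a drop-in Symfony fix) of parsing that handles both forms; note that byte ranges are inclusive, so the body length is end - start + 1:

```python
def parse_range(header, filesize):
    # Parse a single-range "Range: bytes=start-end" header (RFC 7233).
    # An empty end ("bytes=500-") means "through the last byte".
    units, _, spec = header.partition("=")
    if units.strip() != "bytes":
        raise ValueError("unsupported range unit")
    start_s, _, end_s = spec.partition("-")
    start = int(start_s)
    end = int(end_s) if end_s else filesize - 1
    if start > end or end >= filesize:
        raise ValueError("unsatisfiable range")
    # Byte ranges are inclusive, so the body length is end - start + 1.
    return start, end, end - start + 1

print(parse_range("bytes=611609-", 611610))  # (611609, 611609, 1)
print(parse_range("bytes=0-", 611610))       # (0, 611609, 611610)
```

    On a 611610-byte file, "bytes=611609-" is a valid request for exactly the last byte, so a correct server answers 206 with Content-Length: 1 rather than 0.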


    UPDATE:

    Here are some strange things going on in Chrome:

    For the video I am testing with, it makes 3 range requests:

    REQUEST 1 HEADERS - asking for bytes 0- (to the end of the file):

    GET /private/files/download/Y2hhdC83Nzk1Y2U2MC04MDFmLTExZWItYjkzYy1lZjI4ZGYwMDhkOTMubXA0?token=6ab1720bfe922d44208c25f655d61032 HTTP/1.1
    Connection: keep-alive
    User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.192 Safari/537.36
    Accept-Encoding: identity;q=1, *;q=0
    Accept: */*
    Sec-Fetch-Site: same-site
    Sec-Fetch-Mode: no-cors
    Sec-Fetch-Dest: video
    Referer: https://gofollow.vip/
    Accept-Language: en-US,en;q=0.9
    Range: bytes=0-

    RESPONSE GIVES BACK ALL THE BYTES IN THE FILE, AS THAT'S WHAT CHROME ASKED FOR:

    HTTP/1.1 206 Partial Content
    Date: Wed, 10 Mar 2021 12:35:54 GMT
    Server: Apache
    Accept-Ranges: bytes
    Content-Length: 611609
    Content-Range: bytes 0-611609/611610
    Vary: User-Agent
    Keep-Alive: timeout=5, max=100
    Connection: Keep-Alive
    Content-Type: video/mp4

    SECOND REQUEST HEADERS - NOW IT'S ASKING FOR BYTES 589824 TO THE END OF THE FILE:

    Connection: keep-alive
    User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.192 Safari/537.36
    Accept-Encoding: identity;q=1, *;q=0
    Accept: */*
    Sec-Fetch-Site: same-site
    Sec-Fetch-Mode: no-cors
    Sec-Fetch-Dest: video
    Referer: https://gofollow.vip/
    Accept-Language: en-US,en;q=0.9
    Range: bytes=589824-

    RESPONSE OBLIGES:

    HTTP/1.1 206 Partial Content
    Date: Wed, 10 Mar 2021 12:35:55 GMT
    Server: Apache
    Accept-Ranges: bytes
    Content-Length: 21785
    Content-Range: bytes 589824-611609/611610
    Vary: User-Agent
    Keep-Alive: timeout=5, max=99
    Connection: Keep-Alive
    Content-Type: video/mp4

    THEN IT MAKES THIS 3RD REQUEST, WHICH GIVES AN INTERNAL SERVER ERROR; THIS TIME IT'S LITERALLY ASKING FOR THE LAST BYTE:

    GET /private/files/download/Y2hhdC83Nzk1Y2U2MC04MDFmLTExZWItYjkzYy1lZjI4ZGYwMDhkOTMubXA0?token=6ab1720bfe922d44208c25f655d61032 HTTP/1.1
    Connection: keep-alive
    User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.192 Safari/537.36
    Accept-Encoding: identity;q=1, *;q=0
    Accept: */*
    Sec-Fetch-Site: same-site
    Sec-Fetch-Mode: no-cors
    Sec-Fetch-Dest: video
    Referer: https://gofollow.vip/
    Accept-Language: en-US,en;q=0.9
    Range: bytes=611609-

    RESPONSE - THE CONTENT LENGTH IS 0 BECAUSE THE CODE COMPUTES NO DIFFERENCE BETWEEN THE START AND END OF THE REQUESTED RANGE:

    HTTP/1.1 500 Internal Server Error
    Date: Wed, 10 Mar 2021 12:35:56 GMT
    Server: Apache
    Accept-Ranges: bytes
    Cache-Control: max-age=0, must-revalidate, private
    X-Frame-Options: DENY
    X-XSS-Protection: 1
    X-Content-Type-Options: nosniff
    Referrer-Policy: origin
    Strict-Transport-Security: max-age=31536000; includeSubDomains
    Expires: Wed, 10 Mar 2021 12:35:56 GMT
    Content-Length: 0
    Content-Range: bytes 611609-611609/611610
    Vary: User-Agent
    Connection: close
    Content-Type: text/html; charset=UTF-8


  • Video recording with audio in WPF

    5 August 2022, by Kostas Kontaras

    I am developing a chat application in WPF, .NET Framework 4.7.2, and I want to implement video-recording functionality using the PC's web camera. Up to now I have done this: I use AForge.Video and AForge.Video.DirectShow to access the webcam and get the frames. AForge raises a new thread for every frame received, where I save the image and pass it to the UI thread to show it.


    private void Cam_NewFrame(object sender, NewFrameEventArgs eventArgs)
    {
        //handle frames from camera
        try
        {
            //New task to save the bitmap (new frame) into an image
            Task.Run(() =>
            {
                if (_recording)
                {
                    currentreceivedframebitmap = (Bitmap)eventArgs.Frame.Clone();
                    currentreceivedframebitmap.Save($@"{CurrentRecordingFolderForImages}/{imgNumber}-{guidName}.png", ImageFormat.Png);
                    imgNumber++;
                }
            });
            //convert bitmap to bitmapImage to show it on the ui
            BitmapImage bi;
            CurrentFrame = new Bitmap(eventArgs.Frame);
            using (var bitmap = (Bitmap)eventArgs.Frame.Clone())
            {
                bi = new BitmapImage();
                bi.BeginInit();
                MemoryStream ms = new MemoryStream();
                bitmap.Save(ms, ImageFormat.Bmp);
                bi.StreamSource = ms;
                bi.CacheOption = BitmapCacheOption.OnLoad;
                bi.EndInit();
            }
            bi.Freeze();
            Dispatcher.BeginInvoke(new ThreadStart(delegate
            {
                imageFrames.Source = bi;
            }));
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
        }
    }

    When the recording finishes, I take the images and make the video using FFmpeg.

    public static void ImagesToVideo(string ffmpegpath, string guid, string CurrentRecordingFolderForImages, string outputPath, int frameRate, int quality, int avgFrameRate)
    {
        Process process;
        process = new Process
        {
            StartInfo = new ProcessStartInfo
            {
                FileName = $@"{ffmpegpath}",
                //-r framerate, -vcodec video codec, -crf video quality 0-51
                Arguments = $@" -r {frameRate} -i {CurrentRecordingFolderForImages}\%d-{guid}.png -r {avgFrameRate} -vcodec libx264 -crf {quality} -pix_fmt yuv420p  {outputPath}",
                UseShellExecute = false,
                RedirectStandardOutput = true,
                CreateNoWindow = true,
                RedirectStandardError = true
            },
            EnableRaisingEvents = true,
        };
        process.Exited += ExeProcess_Exited;
        process.Start();

        string processOutput = null;
        while ((processOutput = process.StandardError.ReadLine()) != null)
        {
            //TO-DO handle errors
            Debug.WriteLine(processOutput);
        }
    }

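    The same invocation can be expressed as an argument list rather than one interpolated string, which avoids quoting problems when paths contain spaces. A Python sketch mirroring the flags above (illustrative; the pattern and paths are placeholders):

```python
def images_to_video_args(frame_rate, images_pattern, avg_frame_rate, quality, output_path):
    # Mirrors the C# Arguments string: read the PNG sequence at the input
    # frame rate, encode H.264 at the given CRF quality (0-51), and emit
    # yuv420p, which most players require.
    return [
        "ffmpeg",
        "-r", str(frame_rate),
        "-i", images_pattern,        # e.g. "frames/%d-GUID.png"
        "-r", str(avg_frame_rate),
        "-vcodec", "libx264",
        "-crf", str(quality),
        "-pix_fmt", "yuv420p",
        output_path,
    ]

args = images_to_video_args(16, "frames/%d-abc.png", 16, 23, "out.mp4")
print(args)
```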

    For the sound, I use NAudio to record it and save it:

    waveSource = new WaveIn();
    waveSource.StartRecording();
    waveFile = new WaveFileWriter(AudioFilePath, waveSource.WaveFormat);

    waveSource.WaveFormat = new WaveFormat(8000, 1);
    waveSource.DataAvailable += new EventHandler<WaveInEventArgs>(waveSource_DataAvailable);
    waveSource.RecordingStopped += new EventHandler<StoppedEventArgs>(waveSource_RecordingStopped);

    private void waveSource_DataAvailable(object sender, WaveInEventArgs e)
    {
        if (waveFile != null)
        {
            waveFile.Write(e.Buffer, 0, e.BytesRecorded);
            waveFile.Flush();
        }
    }

    and then FFmpeg again to merge the video with the sound:

    public static void AddAudioToVideo(string ffmpegpath, string VideoPath, string AudioPath, string outputPath)
    {
        _videoPath = VideoPath;
        _audioPath = AudioPath;
        Process process;

        process = new Process
        {
            StartInfo = new ProcessStartInfo
            {
                FileName = $@"{ffmpegpath}",
                Arguments = $" -i {VideoPath} -i {AudioPath} -map 0:v -map 1:a -c:v copy -shortest {outputPath} -y",
                UseShellExecute = false,
                RedirectStandardOutput = true,
                CreateNoWindow = true,
                RedirectStandardError = true
            },
            EnableRaisingEvents = true,
        };
        process.Exited += ExeProcess_Exited;
        process.Start();

        string processOutput = null;
        while ((processOutput = process.StandardError.ReadLine()) != null)
        {
            // do something with processOutput
            Debug.WriteLine(processOutput);
        }
    }

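    The merge step, too, can be sketched as an argument list (Python, illustrative; file names are placeholders): video from input 0, audio from input 1, video stream copied without re-encoding, output trimmed to the shorter input:

```python
def merge_av_args(video_path, audio_path, output_path):
    # Mirrors the C# Arguments string for muxing the recorded video and
    # the NAudio WAV file into one output. "-y" is placed before the
    # output here, since a trailing option after the output file may be
    # ignored by ffmpeg.
    return [
        "ffmpeg",
        "-i", video_path,
        "-i", audio_path,
        "-map", "0:v",    # video stream from the first input
        "-map", "1:a",    # audio stream from the second input
        "-c:v", "copy",   # no video re-encode
        "-shortest",      # stop at the shorter of the two inputs
        "-y",             # overwrite the output file
        output_path,
    ]

args = merge_av_args("video.mp4", "audio.wav", "merged.mp4")
print(args)
```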

    Questions:

    1. Is there a better approach to achieve what I'm trying to do?
    2. My camera has 30 fps capability but I receive only 16 fps; how could this happen?
    3. Sometimes the video and sound are not synchronized.

    I created a sample project: github.com/dinos19/WPFVideoRecorder


  • How to build and link FFmpeg for iOS?

    30 June 2015, by Alexander Tkachenko

    Hi all!

    I know there are a lot of questions here about FFmpeg on iOS, but no answer fits my case :(
    Something strange happens every time I try to link FFmpeg into my project, so please help me!

    My task is to write a video-chat application for iOS that uses the RTMP protocol to publish and read a video stream to/from a custom Flash Media Server.

    I decided to use rtmplib, a free open-source library for streaming FLV video over RTMP, as it is the only appropriate library.

    Many problems appeared when I began researching it, but later I understood how it should work.

    Now I can read a live stream of FLV video (from a URL) and send it back to the channel with the help of my application.

    My trouble now is sending video FROM the camera.
    The basic sequence of operations, as I understand it, should be the following:

    1. Using AVFoundation, with the sequence Device -> AVCaptureSession -> AVVideoDataOutput -> AVAssetWriter, I write this to a file (if needed I can describe this flow in more detail, but in the context of the question it is not important). This flow is necessary for hardware-accelerated conversion of live camera video into the H.264 codec. But it ends up in the MOV container format. (This step is completed.)

    2. I read this temporary file as each sample is written, and obtain the stream of bytes of video data (H.264-encoded, in a QuickTime container). (This step is already completed.)

    3. I need to convert the video data from the QuickTime container format to FLV, all in real time (packet by packet).

    4. Once I have packets of video data in the FLV container format, I will be able to send them over RTMP using rtmplib.

    Now, the most complicated part for me is step 3.

    I think I need to use the FFmpeg libraries for this conversion (libavformat). I even found source code showing how to decode H.264 data packets from a MOV file (looking in libavformat, I found that it is possible to extract these packets even from a byte stream, which is more appropriate for me). Having completed that, I will need to encode the packets into FLV (using FFmpeg, or manually by adding FLV headers to the H.264 packets; that is not a problem and is easy, if I am correct).
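    For the "adding FLV-headers" part of step 3: per the FLV file format spec, each packet is prefixed with an 11-byte tag header (tag type, 24-bit big-endian payload size, 24-bit timestamp plus an extended upper byte, and a stream ID that is always 0). A sketch in Python (illustrative, not code from rtmplib):

```python
def flv_tag_header(tag_type, payload_size, timestamp_ms):
    # FLV tag header (11 bytes): TagType (8=audio, 9=video, 18=script),
    # DataSize as a 24-bit big-endian int, Timestamp lower 24 bits plus
    # an extended upper byte, then a 24-bit StreamID that is always 0.
    return bytes([
        tag_type,
        (payload_size >> 16) & 0xFF, (payload_size >> 8) & 0xFF, payload_size & 0xFF,
        (timestamp_ms >> 16) & 0xFF, (timestamp_ms >> 8) & 0xFF, timestamp_ms & 0xFF,
        (timestamp_ms >> 24) & 0xFF,  # extended timestamp byte
        0, 0, 0,                      # StreamID, always 0
    ])

hdr = flv_tag_header(9, 4096, 40)  # video tag, 4096-byte payload, t=40ms
print(len(hdr), hdr[0])  # 11 9
```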

    FFmpeg has great documentation and is a very powerful library, and I don't think there will be a problem using it. BUT the problem is that I cannot get it working in an iOS project.

    I have spent 3 days reading documentation, Stack Overflow and Google for an answer to "How to build FFmpeg for iOS", and I think my PM is going to fire me if I spend one more week trying to compile this library :))

    I tried many different build scripts and configure files, but when I build FFmpeg I get libavformat, libavcodec, etc. for the x86 architecture (even when I specify the armv6 arch in the build script). (I use "lipo -info libavcodec.a" to show the architectures.)

    So I could not build these sources, and decided to find a prebuilt FFmpeg, built for the armv7, armv6 and i386 architectures.

    I downloaded the iOS Comm Lib from MidnightCoders on GitHub; it contains an example of FFmpeg usage, with prebuilt .a files of avcodec, avformat and other FFmpeg libraries.

    I checked their architecture:

    iMac-2:MediaLibiOS root# lipo -info libavformat.a
    Architectures in the fat file: libavformat.a are: armv6 armv7 i386

    And I found that it is appropriate for me!
    When I add these libraries and headers to the Xcode project, it compiles fine (I don't even get warnings like "library is compiled for another architecture") and I can use structures from the headers, but when I try to call a C function from libavformat (av_register_all()), the compiler shows the error "Symbol(s) not found for architecture armv7: av_register_all".

    I thought that maybe there were no symbols in the lib, and tried to list them:

    root# nm -arch armv6 libavformat.a | grep av_register_all
    00000000 T _av_register_all

    Now I am stuck; I don't understand why Xcode cannot see these symbols, and I cannot move forward.

    Please correct me if my understanding of the flow for publishing an RTMP stream from iOS is wrong, and help me build and link FFmpeg for iOS.

    I have the iPhone 5.1 SDK and Xcode 4.2.