Advanced search

Media (1)

Word: - Tags -/book

Other articles (80)

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, notably: writing content on the site, configurable through the form template management; adding notes to articles; adding captions and annotations to images;

  • Other interesting software

    13 April 2011

    We don’t claim to be the only ones doing what we do, and certainly not the best at it. We simply try to do it well and to keep improving.
    The following list covers software that is more or less similar to MediaSPIP, or whose feature set MediaSPIP more or less tries to match.
    We don’t know these projects and haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

On other sites (12290)

  • Unity : Converting Texture2D to YUV420P and sending with UDP using FFmpeg

    22 June 2018, by potu1304

    In my Unity game, each frame is rendered into a texture and then assembled into a video using FFmpeg. My question is whether I am doing this correctly, because avcodec_send_frame throws an exception every time.
    I am fairly sure I am doing something wrong, doing things in the wrong order, or simply missing something.

    Here is the code for capturing the texture:

    void Update() {
           //StartCoroutine(CaptureFrame());

           if (rt == null)
           {
               rect = new Rect(0, 0, captureWidth, captureHeight);
               rt = new RenderTexture(captureWidth, captureHeight, 24);
               frame = new Texture2D(captureWidth, captureHeight, TextureFormat.RGB24, false);
           }

           Camera camera = this.GetComponent<Camera>(); // NOTE: added because there was no reference to camera in original script; must add this script to Camera
           camera.targetTexture = rt;
           camera.Render();

           RenderTexture.active = rt;
           frame.ReadPixels(rect, 0, 0);
           frame.Apply();

           camera.targetTexture = null;
           RenderTexture.active = null;

           byte[] fileData = null;
           fileData = frame.GetRawTextureData();
           encoding(fileData, fileData.Length);

       }

    And here is the code for encoding and sending the byte data:

    private unsafe void encoding(byte[] bytes, int size)
       {
           Debug.Log("Encoding...");
           AVCodec* codec;
           codec = ffmpeg.avcodec_find_encoder(AVCodecID.AV_CODEC_ID_H264);
           int ret, got_output = 0;

           AVCodecContext* codecContext = null;
           codecContext = ffmpeg.avcodec_alloc_context3(codec);
           codecContext->bit_rate = 400000;
           codecContext->width = captureWidth;
           codecContext->height = captureHeight;
           //codecContext->time_base.den = 25;
           //codecContext->time_base.num = 1;

           AVRational timeBase = new AVRational();
           timeBase.num = 1;
           timeBase.den = 25;
           codecContext->time_base = timeBase;
           //AVStream* videoAVStream = null;
           //videoAVStream->time_base = timeBase;



           AVRational frameRate = new AVRational();
           frameRate.num = 25;
           frameRate.den = 1;
           codecContext->framerate = frameRate;

           codecContext->gop_size = 10;
           codecContext->max_b_frames = 1;
           codecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;

           AVFrame* inputFrame;
           inputFrame = ffmpeg.av_frame_alloc();
           inputFrame->format = (int)codecContext->pix_fmt;
           inputFrame->width = captureWidth;
           inputFrame->height = captureHeight;
           inputFrame->linesize[0] = inputFrame->width;

           AVPixelFormat dst_pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P, src_pix_fmt = AVPixelFormat.AV_PIX_FMT_RGBA;
           int src_w = 1920, src_h = 1080, dst_w = 1920, dst_h = 1080;
           SwsContext* sws_ctx;

           GCHandle pinned = GCHandle.Alloc(bytes, GCHandleType.Pinned);
           IntPtr address = pinned.AddrOfPinnedObject();

           sbyte** inputData = (sbyte**)address;
           sws_ctx = ffmpeg.sws_getContext(src_w, src_h, src_pix_fmt,
                                dst_w, dst_h, dst_pix_fmt,
                                0, null, null, null);

           fixed (int* lineSize = new int[1])
           {
               lineSize[0] = 4 * captureHeight;
               // Convert RGBA to YUV420P
               ffmpeg.sws_scale(sws_ctx, inputData, lineSize, 0, codecContext->width, inputFrame->extended_data, inputFrame->linesize);
           }

           inputFrame->pts = counter++;

           if (ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
               throw new ApplicationException("Error sending a frame for encoding!");

           AVPacket pkt;
           pkt = new AVPacket();
           //pkt.data = inData;
           AVPacket* packet = &pkt;
           ffmpeg.av_init_packet(packet);

           Debug.Log("pkt.size " + pkt.size);
           pinned.Free();
           AVDictionary* options = null;
           ffmpeg.av_dict_set(&options, "pkt_size", "1300", 0);
           ffmpeg.av_dict_set(&options, "buffer_size", "65535", 0);
           AVIOContext* server = null;
           ffmpeg.avio_open2(&server, "udp://192.168.0.1:1111", ffmpeg.AVIO_FLAG_WRITE, null, &options);
           Debug.Log("encoded");
           ret = ffmpeg.avcodec_encode_video2(codecContext, &pkt, inputFrame, &got_output);
           ffmpeg.avio_write(server, pkt.data, pkt.size);
           ffmpeg.av_free_packet(&pkt);
           pkt.data = null;
           pkt.size = 0;
       }

    And every time I start the game

     if (ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
               throw new ApplicationException("Error sending a frame for encoding!");

    throws the exception.
    Any help in fixing the issue would be greatly appreciated :)
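
    A hedged aside on likely culprits, judging only from the code above and assuming the same FFmpeg.AutoGen-style bindings (a sketch, not a verified fix): the codec context is never opened with avcodec_open2, and inputFrame never gets pixel buffers via av_frame_get_buffer, so avcodec_send_frame has nothing valid to accept. The RGBA stride also looks wrong (it should be 4 bytes per pixel times the width, not the height), and mixing avcodec_send_frame with the older avcodec_encode_video2 is suspect.

     // Sketch only: setup steps that appear to be missing, placed before the
     // sws_scale / avcodec_send_frame calls above.

     // Open the codec context; sending frames to an unopened context fails:
     if (ffmpeg.avcodec_open2(codecContext, codec, null) < 0)
         throw new ApplicationException("Could not open codec!");

     // Allocate the YUV420P buffers that sws_scale will write into:
     if (ffmpeg.av_frame_get_buffer(inputFrame, 32) < 0)
         throw new ApplicationException("Could not allocate frame buffers!");

     // RGBA stride is 4 bytes per pixel times the image width, and
     // srcSliceH (the 5th argument) is the source height:
     fixed (int* srcLinesize = new int[1])
     {
         srcLinesize[0] = 4 * captureWidth;
         ffmpeg.sws_scale(sws_ctx, inputData, srcLinesize, 0, captureHeight,
                          inputFrame->extended_data, inputFrame->linesize);
     }

     // With the send/receive API, output comes from avcodec_receive_packet
     // rather than avcodec_encode_video2:
     while (ffmpeg.avcodec_receive_packet(codecContext, packet) == 0)
     {
         ffmpeg.avio_write(server, packet->data, packet->size);
         ffmpeg.av_packet_unref(packet);
     }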

  • Cannot install ffmpeg via brew on macOS Mojave

    12 July 2018, by Savinggrace
    • Xcode 10 beta installed
    • Command line tool for macOS 10.14 installed
    • macOS_SDK_headers_for_macOS_10.14.pkg installed
    • brew update done

    When trying to install ffmpeg with brew install ffmpeg, I got the error below:

    tar: Error exit delayed from previous errors.
    Error: Failure while executing: tar xf /Users/myname/Library/Caches/Homebrew/texi2html-5.0.tar.gz -C /private/tmp/texi2html-20180712-39689-hmsh90

    This looks like a problem with Homebrew’s texi2html-5.0.tar.gz.

    Then, using brew info ffmpeg:

    ffmpeg: stable 4.0.1, HEAD
    Play, record, convert, and stream audio and video
    https://ffmpeg.org/
    Not installed
    From: https://github.com/Homebrew/homebrew-core/blob/master/Formula/ffmpeg.rb
    ==> Dependencies
    Build: nasm ✔, pkg-config ✔, texi2html ✘
    Recommended: lame ✘, x264 ✘, xvid ✘
    Optional: chromaprint ✘, fdk-aac ✘, fontconfig ✘, freetype ✘, frei0r ✘, game-music-emu ✘, libass ✘, libbluray ✘, libbs2b ✘, libcaca ✘, libgsm ✘, libmodplug ✘, librsvg ✘, libsoxr ✘, libssh ✘, libvidstab ✘, libvorbis ✘, libvpx ✘, opencore-amr ✘, openh264 ✘, openjpeg ✘, openssl ✘, opus ✘, rtmpdump ✘, rubberband ✘, sdl2 ✘, snappy ✘, speex ✘, tesseract ✘, theora ✘, two-lame ✘, wavpack ✘, webp ✘, x265 ✘, xz ✔, zeromq ✘, zimg ✘, srt ✘

    Since that was not working, I just downloaded the static FFmpeg binaries for macOS 64-bit from https://evermeet.cx/ffmpeg/ and moved the executable to /usr/local/bin/, and now ffmpeg works perfectly.
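
    (As a minimal sketch of that workaround, assuming the downloaded archive unpacks to a binary simply named ffmpeg in ~/Downloads:)

    mv ~/Downloads/ffmpeg /usr/local/bin/ffmpeg
    chmod +x /usr/local/bin/ffmpeg
    ffmpeg -version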

    But I’m still curious to know how to resolve this brew error.

    Thanks in advance if anyone has the solution.
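
    (One hedged guess: the tarball that fails to extract lives in Homebrew’s download cache, and a corrupted cached download would produce exactly this tar error. Deleting the cached file and retrying is a common remedy; the path is the one from the error message above.)

    rm /Users/myname/Library/Caches/Homebrew/texi2html-5.0.tar.gz
    brew install ffmpeg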

  • Intercept and divert graphic output before it hits the linux framebuffer, not using X11

    13 October 2018, by Apollo

    I have been researching this issue for about a week, and it may be that I don’t know the right questions to ask.

    I am running a Debian distro (Raspbian Stretch on the Raspberry Pi). I am not using X11. Instead, I boot straight to the command line, and will probably eventually run headless, starting a program at boot or sshing in to interact.

    What I need to do is to start an application (a game engine specifically) that draws graphics to the framebuffer. Then, I need to intercept that stream before it makes it into the framebuffer, so I can work with it in real time.

    Ultimately, I am using ffmpeg to compress and stream video of the output to another Pi. So, I want to be able to start an application and stream its output over a LAN, while still being able to interact with the command line in a separate thread.

    I have the ffmpeg command to pull from /dev/fb0, and have successfully started the graphics application and streamed its content. But is there any way to intercept, capture, or redirect the application’s output so it never actually hits the framebuffer? In my searching I have found many examples of writing to or reading from the framebuffer, but nothing about stopping content before it reaches the buffer.
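
    (For reference, a minimal sketch of that kind of framebuffer grab; the frame rate, encoder settings, and destination address are assumptions:)

    ffmpeg -f fbdev -framerate 25 -i /dev/fb0 \
        -c:v libx264 -preset ultrafast -tune zerolatency \
        -f mpegts udp://192.168.0.2:1234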

    I am happy with any solution that uses an existing package or application, or with C or Rust code that accomplishes what I need.

    Thank you