
Other articles (46)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, as long as your MediaSPIP installation is version 0.2 or later. If needed, contact your MediaSPIP administrator to find out.
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in MP4, Ogv and WebM (supported by HTML5), with MP4 also playable through Flash.
Audio files are encoded in MP3 and Ogg (supported by HTML5), with MP3 also playable through Flash.
Where possible, text is analyzed to extract the data needed for search-engine indexing, and is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
-
Libraries and binaries specific to video and audio processing
31 January 2010
The following programs and libraries are used by SPIPmotion in one way or another.
Required binaries: FFmpeg: the main encoder, able to transcode almost any type of video or audio file into formats playable on the Internet (see this tutorial for its installation); Oggz-tools: tools for inspecting Ogg files; MediaInfo: retrieves information from most video and audio formats;
Complementary, optional binaries: flvtool2: (...)
On other sites (8920)
-
Unity: Converting Texture2D to YUV420P using FFmpeg
23 July 2021, by strong_kobayashi
I'm trying to create a game in Unity where each frame is rendered into a texture and then put together into a video using FFmpeg. The output created by FFmpeg should eventually be sent over the network to a client UI. However, I'm struggling mainly with the part where a frame is captured and passed as a byte array to an unsafe method, where it should be processed further by FFmpeg. The wrapper I'm using is FFmpeg.AutoGen.



The render-to-texture method:



private IEnumerator CaptureFrame()
{
    yield return new WaitForEndOfFrame();

    RenderTexture.active = rt;
    frame.ReadPixels(rect, 0, 0);
    frame.Apply();

    bytes = frame.GetRawTextureData();

    EncodeAndWrite(bytes, bytes.Length);
}
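Note: CaptureFrame relies on fields (rt, frame, rect, bytes) that are initialized elsewhere and not shown in the question. A minimal sketch of such a setup (names and values here are assumptions, not part of the original code; the RGBA32 format is what makes the 4-bytes-per-pixel stride used later correct):

// Hypothetical setup for the fields used by CaptureFrame (names follow the question).
// RGBA32 gives 4 bytes per pixel, matching the "4 * textureWidth" stride used later.
RenderTexture rt;
Texture2D frame;
Rect rect;
byte[] bytes;

void InitCapture(Camera cam, int textureWidth, int textureHeight)
{
    rt = new RenderTexture(textureWidth, textureHeight, 24);
    cam.targetTexture = rt;    // the camera renders into rt every frame
    frame = new Texture2D(textureWidth, textureHeight, TextureFormat.RGBA32, false);
    rect = new Rect(0, 0, textureWidth, textureHeight);
}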




The unsafe encoding method so far:



private unsafe void EncodeAndWrite(byte[] bytes, int size)
{
    GCHandle pinned = GCHandle.Alloc(bytes, GCHandleType.Pinned);
    IntPtr address = pinned.AddrOfPinnedObject();

    sbyte** inData = (sbyte**)address;
    fixed(int* lineSize = new int[1])
    {
        lineSize[0] = 4 * textureWidth;
        // Convert RGBA to YUV420P
        ffmpeg.sws_scale(sws, inData, lineSize, 0, codecContext->width, inputFrame->extended_data, inputFrame->linesize);
    }

    inputFrame->pts = frameCounter++;

    if(ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
        throw new ApplicationException("Error sending a frame for encoding!");

    pkt = new AVPacket();
    fixed(AVPacket* packet = &pkt)
        ffmpeg.av_init_packet(packet);
    pkt.data = null;
    pkt.size = 0;

    pinned.Free();
    ...
}




sws_scale takes a sbyte** as its second parameter, so I'm trying to convert the input byte array to sbyte** by first pinning it with GCHandle and then doing an explicit cast. I don't know if that's the correct way, though.
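Casting the pinned address directly to sbyte** is the likely problem: sws_scale expects a pointer to an array of per-plane data pointers, so the pinned address should become element 0 of such an array rather than being reinterpreted as the array itself. A minimal sketch of that idea (untested; it assumes the same FFmpeg.AutoGen binding with sbyte**/int* parameters as above, and that sws was created for an RGBA-to-YUV420P conversion of textureWidth x textureHeight):

// Sketch: build a per-plane pointer array; packed RGBA data has a single plane.
sbyte* pixelData = (sbyte*)address;

sbyte** srcData = stackalloc sbyte*[4];
srcData[0] = pixelData;
srcData[1] = srcData[2] = srcData[3] = null;

int* srcLinesize = stackalloc int[4];
srcLinesize[0] = 4 * textureWidth;   // 4 bytes per RGBA pixel
srcLinesize[1] = srcLinesize[2] = srcLinesize[3] = 0;

// srcSliceH is the number of input rows, i.e. the height, not the width
ffmpeg.sws_scale(sws, srcData, srcLinesize, 0, textureHeight,
                 inputFrame->extended_data, inputFrame->linesize);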


Moreover, the condition if(ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0) always throws an ApplicationException, and I really don't know why this happens. codecContext and inputFrame are my AVCodecContext and AVFrame objects, respectively, and their fields are defined as follows:


codecContext



codecContext = ffmpeg.avcodec_alloc_context3(codec);
codecContext->bit_rate = 400000;
codecContext->width = textureWidth;
codecContext->height = textureHeight;

AVRational timeBase = new AVRational();
timeBase.num = 1;
timeBase.den = (int)fps;
codecContext->time_base = timeBase;
videoAVStream->time_base = timeBase;

AVRational frameRate = new AVRational();
frameRate.num = (int)fps;
frameRate.den = 1;
codecContext->framerate = frameRate;

codecContext->gop_size = 10;
codecContext->max_b_frames = 1;
codecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;




inputFrame



inputFrame = ffmpeg.av_frame_alloc();
inputFrame->format = (int)codecContext->pix_fmt;
inputFrame->width = textureWidth;
inputFrame->height = textureHeight;
inputFrame->linesize[0] = inputFrame->width;
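One detail that often causes avcodec_send_frame to reject a frame is that the frame's data buffers are never allocated; in the snippet above, inputFrame gets a format and dimensions but no buffers, and linesize[0] is set by hand. A hedged sketch of the usual allocation with FFmpeg.AutoGen (untested against this exact setup; av_frame_get_buffer also fills in linesize, and the codec context has to be opened before any frame is sent):

inputFrame = ffmpeg.av_frame_alloc();
inputFrame->format = (int)codecContext->pix_fmt;
inputFrame->width = textureWidth;
inputFrame->height = textureHeight;

// Allocate the YUV420P data planes; this also sets linesize[] correctly.
if (ffmpeg.av_frame_get_buffer(inputFrame, 32) < 0)
    throw new ApplicationException("Could not allocate the video frame data!");

// The encoder itself must be open before avcodec_send_frame is called:
// ffmpeg.avcodec_open2(codecContext, codec, null);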




Any help in fixing the issue would be greatly appreciated :)


-
FFmpeg: Decoding AVPackets received from UDP socket
8 August 2022, by strong_kobayashi
I am trying to stream video frames encoded with FFmpeg from a Unity game to a client UI via UDP. To be specific, I am sending AVPackets (which contain the compressed frames, as far as I understand) from the server; on the client side they are extracted as follows:



inline UDPpacket* SDLGameClient::receiveData()
{
    if(SDLNet_UDP_Recv(socket, packet))
        return packet;
    else
        return NULL;
}

int main()
{
    ...
    // Initialize UDP
    ...
    UDPpacket *udpPacket;

    int i = 0;

    for(;;)
    {
        udpPacket = client->receiveData();

        if(udpPacket != nullptr)
        {
            i++;
            std::cout << "Packet " << i << " received!" << std::endl;

            AVPacket packet;
            av_init_packet(&packet);

            packet.data = new uint8_t[udpPacket->len];
            memcpy(packet.data, udpPacket->data, udpPacket->len);
            packet.size = udpPacket->len;

            ...




For networking I use the SDL_net library. Fragmenting, sending and receiving the packets seems to work without any problem. My question is: how do I decode the incoming AVPackets and stream the frames recorded in Unity to the client?
I am aware that I need to use avcodec_send_packet and avcodec_receive_frame for decoding (since avcodec_decode_video2, which is used in many decoding examples, is deprecated), but when I do something like this:


int ret = avcodec_send_packet(codecContext, &packet);
if(ret < 0 || ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
    std::cout << "avcodec_send_packet: " << ret << std::endl;
else
{
    while(ret >= 0)
    {
        ret = avcodec_receive_frame(codecContext, frame);
        if(ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            std::cout << "avcodec_receive_frame: " << ret << std::endl;

        std::cout << "Frame: " << codecContext->frame_number << std::endl;
    }
}

av_packet_unref(&packet);




ret always returns a negative value (-22), so perhaps something is wrong with the AVPackets, or I'm sending the frames way too fast; I really have no clue :/
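For what it's worth, -22 is AVERROR(EINVAL), which usually points to the decoder context rather than the packet data: avcodec_send_packet returns it when the context was never opened with avcodec_open2 or does not match the encoder. A rough sketch of a decoder setup and drain loop on the client side (the H.264 codec and the variable names are assumptions; untested):

extern "C" {
#include <libavcodec/avcodec.h>
}
#include <iostream>

// Open a decoder matching what the Unity side encodes (H.264 is assumed here).
AVCodecContext* openDecoder()
{
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if(avcodec_open2(ctx, codec, nullptr) < 0)   // skipping this step yields EINVAL later
        return nullptr;
    return ctx;
}

// Decode one received packet and drain every frame it produces.
void decodePacket(AVCodecContext *ctx, AVPacket *packet, AVFrame *frame)
{
    int ret = avcodec_send_packet(ctx, packet);
    if(ret < 0)
    {
        char err[AV_ERROR_MAX_STRING_SIZE] = {0};
        av_strerror(ret, err, sizeof(err));
        std::cout << "avcodec_send_packet: " << err << std::endl;
    }
    else
    {
        // EAGAIN from avcodec_receive_frame just means "feed the next packet".
        while((ret = avcodec_receive_frame(ctx, frame)) >= 0)
            std::cout << "Frame: " << ctx->frame_number << std::endl;
    }

    av_packet_unref(packet);
}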



Thanks in advance for any help!


-
FFmpeg: how can I set the UDP streaming speed?
19 June 2018, by potu1304
I wanted to live-stream my Unity game to a simple client with FFmpeg, with as little delay as possible. In the Unity game, each frame is saved as a JPG image. These images are wrapped by FFmpeg and sent over UDP to a simple C# client, where I use ffplay to play the stream. The problem is that FFmpeg wraps the images much faster than the Unity app can write them, so FFmpeg quits while Unity is still writing frames. Is there a way to make FFmpeg loop and wait for the next image, or can I somehow loop without calling FFmpeg every time? (A possible workaround is sketched after the code below.)
Here is the function from my capturing script in Unity:
Process process;
//BinaryWriter _stdin;

public void encodeFrame()
{
    ProcessStartInfo info = new ProcessStartInfo();
    var basePath = Application.streamingAssetsPath + "/FFmpegOut/Windows/ffmpeg.exe";
    info.FileName = basePath;
    info.Arguments = "-re -i screen_%d.jpg -vcodec libx264 -r 24 -f mpegts udp://127.0.0.1:1100";
    info.RedirectStandardOutput = true;
    info.RedirectStandardInput = true;
    info.RedirectStandardError = true;
    info.CreateNoWindow = true;
    info.UseShellExecute = false;

    UnityEngine.Debug.Log(string.Format(
        "Executing \"{0}\" with arguments \"{1}\".\r\n",
        info.FileName,
        info.Arguments));

    process = Process.Start(info);
    //_stdin = new BinaryWriter(process.StandardInput.BaseStream);
    process.WaitForExit();

    var outputReader = process.StandardError;
    string Error = outputReader.ReadToEnd();
    UnityEngine.Debug.Log(Error);
}

And here is the function from the cs file of my simple WinForms application:
private void xxxFFplay()
{
    text = "start";
    byte[] send_buffer = Encoding.ASCII.GetBytes(text);
    sock.SendTo(send_buffer, endPoint);

    ffplay.StartInfo.FileName = "ffplay.exe";
    ffplay.StartInfo.Arguments = "udp://127.0.0.1:1100";
    ffplay.StartInfo.CreateNoWindow = true;
    ffplay.StartInfo.RedirectStandardOutput = true;
    ffplay.StartInfo.UseShellExecute = false;
    ffplay.EnableRaisingEvents = true;
    ffplay.OutputDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
    ffplay.ErrorDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
    ffplay.Exited += (o, e) => Debug.WriteLine("Exited", "ffplay");
    ffplay.Start();

    Thread.Sleep(500); // you need to wait/check that the process has started, then...

    // child, new parent
    // make 'this' the parent of ffplay (presuming you are in scope of a Form or Control)
    //SetParent(ffplay.MainWindowHandle, this.panel1.Handle);

    // window, x, y, width, height, repaint
    // move the ffplay window to the top-left corner and set the size to 320x280
    //MoveWindow(ffplay.MainWindowHandle, -5, -300, 320, 280, true);
}

Does somebody have some ideas? I am really stuck trying to create something like a "live" stream.
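One way around the race between writing JPGs and FFmpeg reading them is to keep a single FFmpeg process running and pipe raw frames to its standard input, so FFmpeg simply blocks until the next frame arrives (this is what the commented-out _stdin field hints at). A rough sketch, assuming the _stdin field is uncommented and the frame dimensions are known (untested):

// Sketch: start ffmpeg once, reading raw RGBA frames from stdin instead of JPGs on disk,
// so it blocks until Unity writes the next frame and never runs ahead.
void StartEncoder(int width, int height)
{
    ProcessStartInfo info = new ProcessStartInfo();
    info.FileName = Application.streamingAssetsPath + "/FFmpegOut/Windows/ffmpeg.exe";
    info.Arguments = string.Format(
        "-f rawvideo -pixel_format rgba -video_size {0}x{1} -framerate 24 -i - " +
        "-vcodec libx264 -pix_fmt yuv420p -f mpegts udp://127.0.0.1:1100",
        width, height);
    info.UseShellExecute = false;
    info.RedirectStandardInput = true;
    info.CreateNoWindow = true;

    process = Process.Start(info);
    _stdin = new BinaryWriter(process.StandardInput.BaseStream);
}

// Call once per captured frame (e.g. with Texture2D.GetRawTextureData()).
void WriteFrame(byte[] rgbaBytes)
{
    _stdin.Write(rgbaBytes);   // ffmpeg waits here until the next frame is written
    _stdin.Flush();
}

// When the game stops, closing stdin lets ffmpeg finish the stream cleanly:
// _stdin.Close(); process.WaitForExit();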
Best regards