
Other articles (43)
-
Selection of projects using MediaSPIP
2 May 2011
The examples below are representative of specific uses of MediaSPIP for particular projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...) -
Selection of projects using MediaSPIP
29 April 2011
The examples cited below are representative of specific uses of MediaSPIP for certain projects.
Do you think you have a "remarkable" site built with MediaSPIP? Let us know here.
MediaSPIP farm @ Infini
The Infini association develops hospitality activities, an internet access point, training, innovative projects in the field of Information and Communication Technologies, and website hosting. In this area it plays a unique role (...) -
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (5375)
-
FFmpeg somehow set the UDP speed?
19 June 2018, by potu1304
I wanted to nearly live-stream my Unity game with FFmpeg to a simple client. I have one Unity game in which each frame is saved as a jpg image. These images are wrapped by ffmpeg and sent over UDP to a simple C# client where I use ffplay to play the stream. The problem is that FFmpeg wraps the images much faster than the Unity app can write them, so ffmpeg quits while Unity is still writing frames. Is there a way to set ffmpeg in a loop to wait for the next image, or can I somehow make a for loop without calling ffmpeg every time?
Here is my function from my capturing script in Unity:
Process process;
//BinaryWriter _stdin;

public void encodeFrame()
{
    // Launch ffmpeg to wrap the saved jpg frames and stream them over UDP
    var info = new ProcessStartInfo();
    info.FileName = Application.streamingAssetsPath + "/FFmpegOut/Windows/ffmpeg.exe";
    info.Arguments = "-re -i screen_%d.jpg -vcodec libx264 -r 24 -f mpegts udp://127.0.0.1:1100";
    info.RedirectStandardOutput = true;
    info.RedirectStandardInput = true;
    info.RedirectStandardError = true;
    info.CreateNoWindow = true;
    info.UseShellExecute = false;

    UnityEngine.Debug.Log(string.Format(
        "Executing \"{0}\" with arguments \"{1}\".\r\n",
        info.FileName,
        info.Arguments));

    process = Process.Start(info);
    //_stdin = new BinaryWriter(process.StandardInput.BaseStream);
    process.WaitForExit();

    var outputReader = process.StandardError;
    string error = outputReader.ReadToEnd();
    UnityEngine.Debug.Log(error);
}

And here is the function from the .cs file of my simple Windows Forms application:
private void xxxFFplay()
{
    text = "start";
    byte[] send_buffer = Encoding.ASCII.GetBytes(text);
    sock.SendTo(send_buffer, endPoint);

    ffplay.StartInfo.FileName = "ffplay.exe";
    ffplay.StartInfo.Arguments = "udp://127.0.0.1:1100";
    ffplay.StartInfo.CreateNoWindow = true;
    ffplay.StartInfo.RedirectStandardOutput = true;
    ffplay.StartInfo.UseShellExecute = false;

    ffplay.EnableRaisingEvents = true;
    ffplay.OutputDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
    ffplay.ErrorDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
    ffplay.Exited += (o, e) => Debug.WriteLine("Exited", "ffplay");
    ffplay.Start();

    Thread.Sleep(500); // you need to wait/check the process started, then...

    // child, new parent
    // make 'this' the parent of ffmpeg (presuming you are in scope of a Form or Control)
    //SetParent(ffplay.MainWindowHandle, this.panel1.Handle);

    // window, x, y, width, height, repaint
    // move the ffplayer window to the top-left corner and set the size to 320x280
    //MoveWindow(ffplay.MainWindowHandle, -5, -300, 320, 280, true);
}

Does somebody have some ideas? I am really stuck on this, trying to create a somewhat "live" stream.
Best regards
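One direction worth sketching, since the code above already redirects standard input and has a commented-out BinaryWriter for it: instead of letting ffmpeg read screen_%d.jpg from disk (and exit when the sequence runs out), a single ffmpeg process can read jpg frames from stdin via the image2pipe demuxer, so it simply blocks until the game delivers the next frame. This is not the asker's code, only a minimal sketch under those assumptions; the ffmpeg path, frame rate and UDP target are taken from the question, and WriteFrame is a hypothetical method called once per captured frame (for example with Texture2D.EncodeToJPG output).

using System.Diagnostics;
using System.IO;

public class FfmpegStdinPipe
{
    Process process;
    BinaryWriter stdin;

    public void Start(string ffmpegPath)
    {
        var info = new ProcessStartInfo();
        info.FileName = ffmpegPath;
        // image2pipe + mjpeg: read a stream of jpg images from stdin at 24 fps
        info.Arguments = "-f image2pipe -framerate 24 -c:v mjpeg -i - -vcodec libx264 -f mpegts udp://127.0.0.1:1100";
        info.RedirectStandardInput = true;
        info.RedirectStandardError = true;
        info.UseShellExecute = false;
        info.CreateNoWindow = true;

        process = Process.Start(info);
        stdin = new BinaryWriter(process.StandardInput.BaseStream);
    }

    // Call once per captured frame with the jpg-encoded bytes;
    // ffmpeg waits on stdin, so it can never outrun the game.
    public void WriteFrame(byte[] jpgBytes)
    {
        stdin.Write(jpgBytes);
        stdin.Flush();
    }

    // Closing stdin is the end-of-stream signal; ffmpeg then flushes and exits.
    public void Stop()
    {
        stdin.Close();
        process.WaitForExit();
    }
}

With this shape, encodeFrame would start the process once at startup and the capture loop would only call WriteFrame.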
-
FFmpeg: Decoding AVPackets received from UDP socket
8 August 2022, by strong_kobayashi
I am trying to stream video frames encoded with FFmpeg from a Unity game to a client UI via UDP. To be specific, I am sending AVPackets (which contain the compressed frames, as far as I understand) from the server, which are then, on the client side, extracted as follows:



inline UDPpacket* SDLGameClient::receiveData()
{
    if(SDLNet_UDP_Recv(socket, packet))
        return packet;
    else
        return NULL;
}

int main()
{
    ...
    // Initialize UDP
    ...
    UDPpacket *udpPacket;

    int i = 0;

    for(;;)
    {
        udpPacket = client->receiveData();

        if(udpPacket != nullptr)
        {
            i++;
            std::cout << "Packet " << i << " received!" << std::endl;

            AVPacket packet;
            av_init_packet(&packet);

            packet.data = new uint8_t[udpPacket->len];
            memcpy(packet.data, udpPacket->data, udpPacket->len);
            packet.size = udpPacket->len;

 ...




For networking I use the SDL_net library. Fragmenting, sending and receiving the packets seem to work without any problem. My question is: how do I decode the incoming AVPackets and stream the frames recorded in Unity to the client?
I am aware that I need to use avcodec_send_packet and avcodec_receive_frame for decoding (since avcodec_decode_video2, which is used in much of the example decoding code, is deprecated), but when I do something like this:


int ret = avcodec_send_packet(codecContext, &packet);
if(ret < 0 || ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
    std::cout << "avcodec_send_packet: " << ret << std::endl;
else
{
    while(ret >= 0)
    {
        ret = avcodec_receive_frame(codecContext, frame);
        if(ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            std::cout << "avcodec_receive_frame: " << ret << std::endl;

        std::cout << "Frame: " << codecContext->frame_number << std::endl;
    }
}

av_packet_unref(&packet);




ret always ends up as a negative value (-22), so perhaps something is wrong with the AVPackets, or I'm sending the frames way too fast; I really have no clue :/



Thanks in advance for any help!
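For context on that return value: -22 is AVERROR(EINVAL), which avcodec_send_packet documents as meaning the codec context was not opened correctly (or is an encoder), so the decoder setup is worth checking before the packets themselves. The following is not the asker's code, just a minimal decoder sketch written with FFmpeg.AutoGen (the C# binding used in the next question on this page; the C API calls have the same names and order). It assumes an H.264 stream and that each received packet is complete.

using System;
using FFmpeg.AutoGen;

public unsafe class StreamDecoder
{
    AVCodecContext* ctx;
    AVFrame* frame;

    public void Open()
    {
        // Find and open the decoder; without avcodec_open2 the send/receive
        // calls fail with AVERROR(EINVAL) (-22).
        AVCodec* codec = ffmpeg.avcodec_find_decoder(AVCodecID.AV_CODEC_ID_H264);
        ctx = ffmpeg.avcodec_alloc_context3(codec);
        if (ffmpeg.avcodec_open2(ctx, codec, null) < 0)
            throw new ApplicationException("Could not open decoder");
        frame = ffmpeg.av_frame_alloc();
    }

    public void Decode(AVPacket* pkt)
    {
        if (ffmpeg.avcodec_send_packet(ctx, pkt) < 0)
            return; // packet rejected (bad data, or frames must be drained first)

        // One packet can yield zero or more frames.
        while (ffmpeg.avcodec_receive_frame(ctx, frame) == 0)
        {
            // frame->data / frame->linesize now hold a decoded picture
            Console.WriteLine($"Decoded frame {ctx->frame_number}");
        }
    }
}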


-
Unity: Converting Texture2D to YUV420P using FFmpeg
23 July 2021, by strong_kobayashi
I'm trying to create a game in Unity where each frame is rendered into a texture and then put together into a video using FFmpeg. The output created by FFmpeg should eventually be sent over the network to a client UI. However, I'm struggling mainly with the part where a frame is captured and passed to an unsafe method as a byte array, where it should be processed further by FFmpeg. The wrapper I'm using is FFmpeg.AutoGen.



The render-to-texture method:



private IEnumerator CaptureFrame()
{
    yield return new WaitForEndOfFrame();

    RenderTexture.active = rt;
    frame.ReadPixels(rect, 0, 0);
    frame.Apply();

    bytes = frame.GetRawTextureData();

    EncodeAndWrite(bytes, bytes.Length);
}




The unsafe encoding method so far:



private unsafe void EncodeAndWrite(byte[] bytes, int size)
{
    GCHandle pinned = GCHandle.Alloc(bytes, GCHandleType.Pinned);
    IntPtr address = pinned.AddrOfPinnedObject();

    sbyte** inData = (sbyte**)address;
    fixed(int* lineSize = new int[1])
    {
        lineSize[0] = 4 * textureWidth;
        // Convert RGBA to YUV420P
        ffmpeg.sws_scale(sws, inData, lineSize, 0, codecContext->width, inputFrame->extended_data, inputFrame->linesize);
    }

    inputFrame->pts = frameCounter++;

    if(ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0)
        throw new ApplicationException("Error sending a frame for encoding!");

    pkt = new AVPacket();
    fixed(AVPacket* packet = &pkt)
        ffmpeg.av_init_packet(packet);
    pkt.data = null;
    pkt.size = 0;

    pinned.Free();
    ...
}




sws_scale takes an sbyte** as the second parameter, therefore I'm trying to convert the input byte array to sbyte** by first pinning it with GCHandle and doing an explicit type conversion afterwards. I don't know if that's the correct way, though.
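For what it's worth, in the underlying C API the second argument of sws_scale is an array of per-plane pointers, not the pixel bytes reinterpreted as a pointer table; packed RGBA has a single plane, so only element 0 of that array is used. Below is a small sketch of that idea, not the asker's code: it keeps the sws, inputFrame, textureWidth and textureHeight names from the question and the sbyte** shape its binding exposes, and passes the source height as the slice height.

private unsafe void ConvertRgbaToYuv(byte[] rgba)
{
    fixed (byte* pixels = rgba)
    {
        // One packed RGBA source plane and its stride (4 bytes per pixel);
        // unused plane slots are cleared so sws_scale never sees garbage.
        byte** srcData = stackalloc byte*[4];
        int* srcLinesize = stackalloc int[4];
        for (int i = 0; i < 4; i++) { srcData[i] = null; srcLinesize[i] = 0; }
        srcData[0] = pixels;
        srcLinesize[0] = 4 * textureWidth;

        // srcSliceY = 0, srcSliceH = full source height
        ffmpeg.sws_scale(sws, (sbyte**)srcData, srcLinesize, 0, textureHeight,
                         inputFrame->extended_data, inputFrame->linesize);
    }
}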


Moreover, the condition if(ffmpeg.avcodec_send_frame(codecContext, inputFrame) < 0) always throws an ApplicationException, and I really don't know why this happens. codecContext and inputFrame are my AVCodecContext and AVFrame objects, respectively, and the fields are defined as follows:


codecContext



codecContext = ffmpeg.avcodec_alloc_context3(codec);
codecContext->bit_rate = 400000;
codecContext->width = textureWidth;
codecContext->height = textureHeight;

AVRational timeBase = new AVRational();
timeBase.num = 1;
timeBase.den = (int)fps;
codecContext->time_base = timeBase;
videoAVStream->time_base = timeBase;

AVRational frameRate = new AVRational();
frameRate.num = (int)fps;
frameRate.den = 1;
codecContext->framerate = frameRate;

codecContext->gop_size = 10;
codecContext->max_b_frames = 1;
codecContext->pix_fmt = AVPixelFormat.AV_PIX_FMT_YUV420P;




inputFrame



inputFrame = ffmpeg.av_frame_alloc();
inputFrame->format = (int)codecContext->pix_fmt;
inputFrame->width = textureWidth;
inputFrame->height = textureHeight;
inputFrame->linesize[0] = inputFrame->width;




Any help in fixing the issue would be greatly appreciated :)
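One more note: in the C API, av_frame_alloc only allocates the AVFrame struct itself, not its pixel buffers, so a frame built as in the inputFrame block above has no data planes for sws_scale or avcodec_send_frame to work with. Below is a minimal sketch, not from the question, of allocating real YUV420P buffers with av_frame_get_buffer, which also fills in the per-plane linesize values so they do not need to be set by hand; textureWidth and textureHeight are the fields from the question.

inputFrame = ffmpeg.av_frame_alloc();
inputFrame->format = (int)AVPixelFormat.AV_PIX_FMT_YUV420P;
inputFrame->width = textureWidth;
inputFrame->height = textureHeight;

// Allocate the Y, U and V planes (32-byte alignment is a common choice);
// afterwards inputFrame->data and inputFrame->linesize are valid.
if (ffmpeg.av_frame_get_buffer(inputFrame, 32) < 0)
    throw new ApplicationException("Could not allocate frame buffers");

// Make sure the frame is safely writable before sws_scale fills it.
if (ffmpeg.av_frame_make_writable(inputFrame) < 0)
    throw new ApplicationException("Could not make the frame writable");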