
Other articles (51)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance in the mutualised installation on a regular basis. Combined with a system Cron on the central site of the mutualisation, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
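    For illustration, the system Cron on the central site could be a crontab entry that simply requests the central site once a minute so the super Cron task fires. This is only a sketch; the URL is hypothetical and the actual trigger depends on the installation:

    # hypothetical crontab entry on the central server
    # fetch the central site every minute so its super Cron can fan out to the instances
    * * * * * wget -q -O /dev/null http://central-site.example/spip.php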

  • Submit bugs and patches

    13 avril 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information:
    • the browser you are using, including the exact version
    • as precise an explanation of the problem as possible
    • if possible, the steps taken that led to the problem
    • a link to the site/page in question
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Accepted formats

    28 January 2010

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
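    For instance, to check whether a given codec or container is available in the local build, the output of those commands can simply be filtered (a sketch):

    # does this ffmpeg build know the h264 codec?
    ffmpeg -codecs | grep -i h264
    # is the mp4 container supported?
    ffmpeg -formats | grep -i mp4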

On other sites (7369)

  • SteamVR creating loading screen flicker when recording using FFMPEG in unity

    28 October 2018, by Mohamed Muzammil

    I have a heatmap plugin integrated into my Unity VR game (with SteamVR). The plugin uses eye-tracking information to generate live heatmaps as the user gazes at different elements in the game. As the heatmaps are generated, the whole camera view (the user's) is overlaid with the heatmap info and written to an MP4 file using FFMPEG.


    The whole process works fine, but I have an annoying problem: during the recording, the user's camera view is not stable and keeps flickering, stopping only when the recording is stopped. This interrupts the smooth flow of the game.
    For now, I've narrowed the trouble down to this code:

       public void Write(byte[] data)
       {
           // nothing to do if the ffmpeg subprocess has not been started
           if (_subprocess == null) return;

           // push the raw frame bytes into ffmpeg's standard input
           _stdin.Write(data);
           _stdin.Flush();
       }

    From my understanding, this is the part of the code where standard input is used to write to the file system. So I surmise that the problem must be accessing the file system, which in turn introduces some delay each time a frame is written in the update method. Correct me if I am wrong here.

    The loading screen that appears during every frame write interrupts the smooth flow of the game and also makes the recording useless, as the user keeps focusing on the flicker rather than the actual objects of interest. I would really be grateful if someone could shed light on this issue.

  • Controlling ffmpeg at runtime with zmq

    3 April 2023, by Gavin

    I want to dynamically change the rectilinear view (e.g. its yaw) of a 360 video as it plays.

    basic command

    To take a 360 video and show a flat/normal view at a yaw perspective of 60 degrees:

    ffmpeg -i input360.mp4 -vf "v360=input=e:rectilinear:yaw=60,scale=iw/4:-1" out60.mp4

    This works fine.

    Changing the yaw during playback

    I understand ffmpeg has two methods to change filter params at runtime: sendcmd (file based) and zmq (message based).

    I got the sendcmd method working, but I am struggling to understand the zmq syntax and usage; ffmpeg's zmq docs are pretty sparse. I am using a local Windows 10 PC.

    sendcmd

    ffplay -i input360.mp4 -vf "sendcmd=f=cmd.txt,v360=input=e:rectilinear:reset_rot=1,scale=iw/4:-1"

    with a cmd.txt file:

    0-5 [expr] v360 yaw 'lerp(0,90,TI)';
    5-10 [expr] v360 yaw 'lerp(90,0,TI)';

    result: the yaw changes from 0 to 90 degrees over t=0-5s and then from 90 back to 0 over t=5-10s. Perfect.

    zmq

    ffplay -i input360.mp4 -vf "v360=input=e:rectilinear:reset_rot=1,zmq,scale=iw/4:-1"

    I got zmqsend from ffmpeg-tools.zip and added it to my ffmpeg bin directory.

    1. Execute the above ffplay command from terminal window #1, and see the video playing.

    2. Open terminal window #2 and execute: echo v360 yaw 90 > zmqsend
    result: no change to the video yaw. No errors either.
    What am I doing wrong? I'm not sure whether my ffmpeg command is wrong or my zmq message never reaches ffmpeg (or both). I checked that my ffmpeg v2022-10-10 build is configured with --enable-libzmq.
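    For reference, ffmpeg's zmq documentation sends commands by piping them into the zmqsend tool's standard input (a redirect such as "> zmqsend" would only create a file of that name), and the target can be a fully qualified filter instance name of the form Parsed_<filter>_<index>. A sketch of what that could look like here, assuming zmqsend is on the PATH:

    REM pipe the command into zmqsend instead of redirecting into a file
    echo v360 yaw 90 | zmqsend
    REM or address the filter instance explicitly (v360 is the first filter in the graph)
    echo Parsed_v360_0 yaw 90 | zmqsend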

    


  • Webcam stream with FFMpeg on iPhone

    6 December 2011, by Saphrosit

    I'm trying to send and show a webcam stream from a Linux server to an iPhone app. I don't know if it's the best solution, but I downloaded and installed FFMpeg on the Linux server (following, for those who want to know, this tutorial).
    FFMpeg is working fine. After a lot of wandering, I managed to send a stream to the client by launching

    ffmpeg  -s 320x240 -f video4linux2 -i /dev/video0 -f mpegts -vcodec libx264 udp://192.168.1.34:1234

    where 192.168.1.34 is the address of the client. Actually the client is a Mac, but it is supposed to be an iPhone. I know the stream is sent and received correctly (tested in different ways).
    However, I haven't managed to watch the stream directly on the iPhone.
    I thought of several possible solutions:

    • First solution: store the incoming data in an NSMutableData object. Then, when the stream ends, save it to disk and play it using an MPMoviePlayerController. Here's the code:

      // "video" is the NSMutableData accumulated from the UDP stream
      [video writeToFile:@"videoStream.m4v" atomically:YES];
      NSURL *url = [NSURL fileURLWithPath:@"videoStream.m4v"];

      // play the saved file with the standard movie player
      MPMoviePlayerController *videoController = [[MPMoviePlayerController alloc] initWithContentURL:url];
      [videoController.view setFrame:CGRectMake(100, 100, 150, 150)];
      [self.view addSubview:videoController.view];
      [videoController play];

      The problem with this solution is that nothing is played (I only see a black square), even though the video is saved correctly (I can play it directly from my disk using VLC). Besides, it's not such a great idea; it's just to make things work.

    • Second solution: use CMSampleBufferRef to store the incoming video. Many more problems come with this solution: first of all, there's no CoreMedia.framework in my system. Besides, I don't fully understand what this class represents and what I should do to make it work: I mean, if I start (somehow) filling this "SampleBuffer" with the bytes I receive from the UDP connection, will it automatically call the CMSampleBufferMakeDataReadyCallback function I set during creation? If yes, when? When a single frame is completed, or when the whole stream has been received?

    • Third solution: use the AVFoundation framework (this isn't actually available on my Mac either). I could not work out whether it's actually possible to start recording from a remote source, or even from an NSMutableData, a char*, or something like that. I didn't find any reference in the AVFoundation Programming Guide saying whether it's possible or not.

    I don't know which of these solutions is best for my purpose. ANY suggestion would be appreciated.

    Besides, there's also another problem: I didn't use any segmenter program to send the video. Now, if I'm not mistaken, a segmenter is needed to split the source video into smaller/shorter videos that are easier to send. If that's right, then maybe it's not strictly necessary to make things work (it could be added later). However, since the server is running Linux, I cannot use Apple's mediastreamsegmenter. Can someone suggest an open-source segmenter to use together with FFMpeg?
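    For what it's worth, more recent ffmpeg builds can do the segmenting themselves via the built-in hls muxer, which writes an M3U8 playlist plus MPEG-TS segments, so a separate segmenter may not be needed. A minimal sketch (file names are placeholders):

    # transcode and cut into ~10 s HLS segments, keeping every segment in the playlist
    ffmpeg -i input.mp4 -c:v libx264 -f hls -hls_time 10 -hls_list_size 0 playlist.m3u8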


    UPDATE: I edited my question, adding more information on what I have done so far and what my doubts are.