Advanced search

Media (0)

Keyword: - Tags -/flash

No media matching your criteria is available on the site.

Other articles (62)

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use:
    h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
    m4v: raw MPEG-4 video format
    flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
    Theora
    wmv:
    Possible output video formats
    As a first step, we (...)
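    As a quick check, the list printed by these commands can be filtered for a specific entry; for instance (h264 is used here only as an example codec name):

    ffmpeg -codecs | grep h264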

  • Adding notes and captions to images

    7 February 2011, by

    To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, modifying, and deleting notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • Requesting the creation of a channel

    12 March 2010, by

    Depending on the platform configuration, the user may have two different ways to request the creation of a channel: the first at registration time, the second after registration, by filling in a request form.
    Both approaches ask for the same information and work in much the same way: the prospective user must fill in a series of form fields that first of all give the administrators information about (...)

On other sites (9426)

  • Surface Texture object is not getting the frames from a Surface Class

    22 April 2016, by Juan Manuel González Otero

    On the one hand, I have a Surface subclass which, when instantiated, automatically initializes a new thread and starts grabbing frames from a streaming source via native code based on FFmpeg. Here are the main parts of the code for the aforementioned Surface class:

    public class StreamingSurface extends Surface implements Runnable {

       ...

       public StreamingSurface(SurfaceTexture surfaceTexture, int width, int height) {
           super(surfaceTexture);

           screenWidth  = width;
           screenHeight = height;      
           init();

       }

       public void init() {
           mDrawTop = 0;
           mDrawLeft = 0;
           mVideoCurrentFrame = 0;
           this.setVideoFile();
           this.startPlay();
       }

       public void setVideoFile() {        
           // Initialise FFMPEG
           naInit("");

           // Get stream video res
           int[] res = naGetVideoRes();
           mDisplayWidth = (int)(res[0]);
           mDisplayHeight = (int)(res[1]);

           // Prepare Display
           mBitmap = Bitmap.createBitmap(mDisplayWidth, mDisplayHeight, Bitmap.Config.ARGB_8888);
           naPrepareDisplay(mBitmap, mDisplayWidth, mDisplayHeight);
       }

       public void startPlay() {
           thread = new Thread(this);
           thread.start();
       }

       @Override
       public void run() {
           while (true) {
               while (2 == mStatus) {
                   //pause
                   SystemClock.sleep(100);
               }
               mVideoCurrentFrame = naGetVideoFrame();
               if (0 < mVideoCurrentFrame) {
                   //success, redraw
                   if(isValid()){
                        Canvas canvas = lockCanvas(null);
                        if (null != mBitmap) {
                            canvas.drawBitmap(mBitmap, mDrawLeft, mDrawTop, prFramePaint);
                        }
                        unlockCanvasAndPost(canvas);
                   }
               } else {
                   //failure, probably end of video, break
                   naFinish(mBitmap);
                   mStatus = 0;
                   break;
               }
           }  
       }

    }

    In my MainActivity class, I instantiate this class in the following way:

    public void startCamera(int texture)
    {
       mSurface = new SurfaceTexture(texture);
       mSurface.setOnFrameAvailableListener(this);
       Surface surface = new StreamingSurface(mSurface, 640, 360);
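       // note: the Surface is released here, immediately after it is created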
       surface.release();        
    }

    I read the following line on the Android developer page, regarding the Surface class constructor:

    "Images drawn to the Surface will be made available to the SurfaceTexture, which can attach them to an OpenGL ES texture via updateTexImage()."

    That is exactly what I want to do, and I have everything ready for the subsequent rendering. But with the above code, the frames captured in the Surface class are definitely never made available to the corresponding SurfaceTexture. I know this because the debugger, for instance, shows that the onFrameAvailable method of the listener associated with that SurfaceTexture is never called.

    Any ideas? Could the fact that I am using a thread to call the drawing functions be messing everything up? In that case, what alternatives do I have to grab the frames?
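
    For reference, this is my understanding of how the consumer side is usually wired, assuming rendering happens on a thread that owns the GL context; the mLock, mFrameAvailable, and mSTMatrix names are hypothetical sketch names, not from my code:

    mSurface.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
        @Override
        public void onFrameAvailable(SurfaceTexture st) {
            // Called on an arbitrary thread: only flag the new frame here.
            synchronized (mLock) {
                mFrameAvailable = true;
            }
        }
    });

    // Later, once per render pass on the GL thread:
    synchronized (mLock) {
        if (mFrameAvailable) {
            mFrameAvailable = false;
            mSurface.updateTexImage();              // latch the newest frame into the GL texture
            mSurface.getTransformMatrix(mSTMatrix); // hypothetical float[16] for texture coordinates
        }
    }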

    Thanks in advance

  • Subprocess call stopping asynchronously-executed Python parent process

    6 May 2016, by Suriname0

    The following shell session demonstrates the behavior I am seeing:

    [user@compname python-test]$ cat test.py
    #!/usr/bin/env python
    import subprocess
    from time import sleep
    proc = subprocess.Popen("ffmpeg", stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
    print "Starting process: " + str(proc)
    status = proc.poll()
    while status is None:
       print "Process still running."
       sleep(0.01)
       status = proc.poll()
    print "Status: " + str(status)
    [user@compname python-test]$ python test.py
    Starting process:
    Process still running.
    Process still running.
    Status: 1
    [user@compname python-test]$ python test.py &
    [4] 6976
    [user@compname python-test]$ Starting process:
    Process still running.
    Process still running.
    [4]+  Stopped                 python test.py
    [user@compname python-test]$ ps
     PID TTY          TIME CMD
    4684 pts/101  00:00:00 python
    4685 pts/101  00:00:00 ffmpeg
    7183 pts/101  00:00:00 ps
    14385 pts/101  00:00:00 bash

    As you can see, when the simple test Python program is run normally, it completes successfully. When it is run in the background (using &), the Python process is stopped as soon as the subprocess call completes (when poll() would return a non-None value).

    1. The same behavior occurs when using Popen.wait().
    2. The behavior is unique to ffmpeg.
    3. Both the Python process and ffmpeg end up stopped, as seen in the call to ps.

    Can someone help me untangle this behavior? I don't see anything in the documentation for the subprocess module, bash's & operator, or ffmpeg that would explain this.

    The Python version is 2.6.6, bash is GNU bash version 4.1.2(1)-release (x86_64-redhat-linux-gnu), and ffmpeg is version 3.0.1-static.
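
    For reference, here is a variant of the test that detaches ffmpeg from the terminal entirely; this is an untested sketch, and the -nostdin flag plus the os.devnull redirection are my guess at what might matter here (Python 2.6, so subprocess.DEVNULL is unavailable):

    import os
    import subprocess
    from time import sleep

    # Run ffmpeg with stdin detached so it can never try to read the terminal.
    devnull = open(os.devnull, "rb")
    proc = subprocess.Popen(["ffmpeg", "-nostdin"],
                            stdin=devnull,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    while proc.poll() is None:
        print "Process still running."
        sleep(0.01)
    print "Status: " + str(proc.poll())
    devnull.close()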

    Thank you for any help!

  • ffmpeg command running standalone but not from a Linux script

    31 May 2016, by user1490563

    I made a simple script that breaks an FLV file into multiple parts, converts them all to .mp4 individually, and then merges all of them to form a final MP4 file. I did this to save time and convert large files in parallel.

    However, I am stuck because the ffmpeg commands that normally run on the command line don't run via the script.

    I am kind of stuck here and would like some assistance.

    #!/bin/bash

    filenametmp=$1

    # Strip the extension to build the output prefix
    filename=`echo "$filenametmp" | awk '{split($0,a,"."); print a[1]}'`
    echo $filename

    output="$filename-output"

    # Keep only the file-name component (assumes an /opt/storage/... path depth)
    filenamewithoutpath=`echo "$output" | awk '{split($0,a,"/"); print a[4]}'`
    echo $output $filenamewithoutpath

    # Split the input into segments of $2 seconds each
    /usr/bin/ffmpeg -i $filenametmp -c copy -map 0 -segment_time $2 -f segment $output%01d.flv

    # Count the segments that were produced
    filecounttmp=`/bin/ls -lrt /opt/storage/ | /bin/grep $filenamewithoutpath | /usr/bin/wc -l`
    filecount=`expr $filecounttmp - 1`
    echo $filecount

    # Convert each segment to MP4 in parallel
    for i in `seq 0 $filecount`
    do
        # the numeric add drops leading zeros, matching the %01d segment names
        suffix=`expr 0000 + $i`
        filenametoconvert="$output$suffix.flv"
        convertedfilename="$output$suffix.mp4"
        echo $filenametoconvert

        /usr/bin/ffmpeg -i $filenametoconvert -c:v libx264 -crf 23 -preset medium -vsync 1 -r 25 -c:a aac -strict -2 -b:a 64k -ar 44100 -ac 1 $convertedfilename > /dev/null 2>&1 &
    done

    # Give the background conversions time to finish (a fixed sleep can race; `wait` would be safer)
    sleep 5

    # Build the concat list and merge the converted segments
    for j in `seq 0 $filecount`
    do
        suffix=`expr 0000 + $j`
        convertedfilenamemp4="$output$suffix.mp4"
        echo "file" $convertedfilenamemp4 >> $filename.txt
    done

    ffmpeg -f concat -i $filename.txt -c copy $filename.mp4

    rm $output*
    rm $filename.txt

    The following is how to run this on any FLV file:

    ./ff.sh /opt/storage/tttttssssssssss_573f5b1cd473202daf2bf694.flv 20

    Any help will be appreciated, and I will provide any further information required.

    I am on Ubuntu 14.04 LTS, with a standard installation of ffmpeg.
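
    For reference, one pattern worth testing: ffmpeg reads from stdin by default, and the backgrounded instances in the conversion loop all share the script's stdin. A hedged variant of the conversion line with stdin disabled, assuming that is the issue here:

    /usr/bin/ffmpeg -nostdin -i "$filenametoconvert" -c:v libx264 -crf 23 -preset medium -vsync 1 -r 25 -c:a aac -strict -2 -b:a 64k -ar 44100 -ac 1 "$convertedfilename" > /dev/null 2>&1 &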