
Media (1)

Keyword: - Tags -/pirate bay

Other articles (69)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Accepted formats

    28 January 2010, by

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats used:
    h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
    m4v: raw MPEG-4 video format
    flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
    Theora
    wmv:
    Possible output video formats
    As a first step, we (...)
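    For example, a local build can also be asked about one specific encoder (whether libx264 is present depends on how that ffmpeg was compiled):
    ffmpeg -h encoder=libx264
    ffmpeg -codecs | grep -i 264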

On other sites (6951)

  • How to stop ffmpeg when recording the desktop to save the file to the hard disk?

    27 June 2022, by Eliot Shein

    I'm trying to record the desktop with ffmpeg and save a video file to the hard disk.

    


    using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace Testings
{
    internal class FFmpeg_Capture
    {
        Process process;

        public FFmpeg_Capture()
        {
            process = new Process();
        }

        public void Start(string FileName, int Framerate)
        {
            process.StartInfo.FileName = @"D:\Captured Videos\ffmpeg.exe"; // Change the directory where ffmpeg.exe is.  
            process.EnableRaisingEvents = false;
            process.StartInfo.WorkingDirectory = @"D:\Captured Videos"; // The output directory  
            process.StartInfo.Arguments = @"-f gdigrab -framerate " + Framerate +
                " -i desktop -preset ultrafast - pix_fmt yuv420p " + FileName;
            process.Start();
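            // Note: the two StartInfo assignments below run after Start(), so they do not
            // affect the already-started process, and Stop() is then called immediately.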
            process.StartInfo.UseShellExecute = false;
            process.StartInfo.CreateNoWindow = false;
            Stop();
        }

        public void Stop()
        {
            process.Close();
        }
    }
}


    


    And using it in Form1:

    


private void btnRecord_Click(object sender, EventArgs e)
{
    recordToggle = !recordToggle;

    if (recordToggle)
    {
        btnRecord.Text = "Stop";
        record.Start("Testing", 60);
    }
    else
    {
        btnRecord.Text = "Record";
        record.Stop();
    }
}


    


    but the file Testing is never saved to the hard disk. My guess is that

    


    process.Close();


    


    is not like Ctrl+C, and Ctrl+C is what stops ffmpeg and saves the file.

    


    This is working, but how do I remove ffmpeg's black console window? (A possible fix is sketched after the code below.)

    


    [screenshot: ffmpeg black window]

    


    using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Drawing;
using System.IO;
using System.IO.Pipes;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;

namespace Testings
{
    internal class FFmpeg_Capture
    {
        Process process;

        public FFmpeg_Capture()
        {
            process = new Process();
        }

        public void Start(string FileName, int Framerate)
        {
            process.StartInfo.FileName = @"D:\Captured Videos\ffmpeg.exe"; // Change the directory where ffmpeg.exe is.  
            process.EnableRaisingEvents = false;
            process.StartInfo.WorkingDirectory = @"D:\Captured Videos\"; // The output directory  
            process.StartInfo.Arguments = @"-y -f gdigrab -framerate " + Framerate +
                " -i desktop -preset ultrafast -pix_fmt yuv420p " + FileName;
            process.StartInfo.UseShellExecute = false;
            process.StartInfo.CreateNoWindow = false;
            process.StartInfo.RedirectStandardInput = true; //Redirect stdin
            process.Start();
        }

        public void Stop()
        {
            byte[] qKey = Encoding.GetEncoding("gbk").GetBytes("q"); //Get encoding of 'q' key
            process.StandardInput.BaseStream.Write(qKey, 0, 1); //Write 'q' key to stdin of FFmpeg sub-processs
            process.StandardInput.BaseStream.Flush(); //Flush stdin (just in case).
            process.Close();
        }
    }
}
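
    A minimal sketch of one way to get rid of that console window, assuming the same paths and arguments as above: since UseShellExecute is already false, setting CreateNoWindow to true before Start() launches ffmpeg without its own window, while the redirected stdin (and the 'q' trick in Stop()) keeps working.

public void Start(string FileName, int Framerate)
{
    process.StartInfo.FileName = @"D:\Captured Videos\ffmpeg.exe";
    process.StartInfo.WorkingDirectory = @"D:\Captured Videos\";
    process.StartInfo.Arguments = @"-y -f gdigrab -framerate " + Framerate +
        " -i desktop -preset ultrafast -pix_fmt yuv420p " + FileName;
    process.StartInfo.UseShellExecute = false;        // required for redirection and CreateNoWindow
    process.StartInfo.CreateNoWindow = true;          // do not open the black console window
    process.StartInfo.RedirectStandardInput = true;   // so Stop() can still send 'q'
    process.Start();
}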


    


  • Save video from rtsp and play in exoplayer simultaneously [closed]

    25 January 2024, by Julian Peña Gallego

    I am trying to receive a live stream via RTSP from an IP camera in Android Kotlin. I am recording the video to a local file with ffmpegkit, but I must additionally view the live stream.

    


    When I record the live stream with ffmpeg without playing the stream through exoplayer it works fine, but when both processes are running there is packet loss in the video recording or in the exoplayer.

    


    Then I tried to get exoplayer to play the video that ffmpeg was recording, but it gave me an error since the file has not been closed yet.

    


    Could you provide me with a solution? I have found on the internet that it is possible with server sockets, but they do not indicate how to do it.
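
    One possible direction, sketched under the assumption that the camera stream should only be opened once: let a single ffmpeg process both write the recording and republish the stream locally with the tee muxer, and point the player at the local copy instead of opening the camera's RTSP feed a second time. The address, port and file name below are placeholders; the same argument string can be passed to ffmpegkit.

ffmpeg -rtsp_transport tcp -i rtsp://CAMERA_ADDRESS/stream \
       -c copy -f tee -map 0 \
       "recording.mkv|[f=mpegts]udp://127.0.0.1:5000"

    ExoPlayer could then read the local MPEG-TS stream (it ships a UdpDataSource and a TS extractor), so only one connection to the camera competes for the network.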

    


  • Save image during face detection

    13 October 2017, by hidura

    Hello, I have this app that saves an image every time my face moves, something similar to the IG stories. My problem is that the phone gets very slow and the app suddenly closes because of memory allocation problems. I want to know how I can do this without slowing down the phone and without getting the crash.

    The next code opens the SVG:

    @TargetApi(Build.VERSION_CODES.LOLLIPOP)
    private static Bitmap getBitmap(VectorDrawable vectorDrawable) {
       Bitmap bitmap = Bitmap.createBitmap(vectorDrawable.getIntrinsicWidth(),
               vectorDrawable.getIntrinsicHeight(), Bitmap.Config.ARGB_8888);
       Canvas canvas = new Canvas(bitmap);
       vectorDrawable.setBounds(0, 0, canvas.getWidth(), canvas.getHeight());
       vectorDrawable.draw(canvas);
       Log.e("", "getBitmap: 1");
       return bitmap;
    }

    private static Bitmap getBitmap(Context context, int drawableId) {
       Log.e("", "getBitmap: 2");
       Drawable drawable = ContextCompat.getDrawable(context, drawableId);
       if (drawable instanceof BitmapDrawable) {
           return BitmapFactory.decodeResource(context.getResources(), drawableId);
       } else if (drawable instanceof VectorDrawable) {
           return getBitmap((VectorDrawable) drawable);
       } else {
           throw new IllegalArgumentException("unsupported drawable type");
       }
    }

    The next one draws the SVG below the position of the face.

       /**
        * Draws the face annotations for position on the supplied canvas.
        */
       @Override
       public void draw(Canvas canvas) {
           Face face = mFace;
           if (face == null) {
               return;
           }

           // Draws a circle at the position of the detected face, with the face's track id below.
           float x = translateX(face.getPosition().x + face.getWidth() / 2);
           float y = translateY(face.getPosition().y + face.getHeight() / 2);
           canvas.drawCircle(x, y, FACE_POSITION_RADIUS, mFacePositionPaint);
           canvas.drawText("id: " + mFaceId, x + ID_X_OFFSET, y + ID_Y_OFFSET, mIdPaint);
           canvas.drawText("happiness: " + String.format("%.2f", face.getIsSmilingProbability()), x - ID_X_OFFSET, y - ID_Y_OFFSET, mIdPaint);
           canvas.drawText("right eye: " + String.format("%.2f", face.getIsRightEyeOpenProbability()), x + ID_X_OFFSET * 2, y + ID_Y_OFFSET * 2, mIdPaint);
           canvas.drawText("left eye: " + String.format("%.2f", face.getIsLeftEyeOpenProbability()), x - ID_X_OFFSET*2, y - ID_Y_OFFSET*2, mIdPaint);

           // Draws a bounding box around the face.
           float xOffset = scaleX(face.getWidth() / 2.0f);
           float yOffset = scaleY(face.getHeight() / 2.0f);
           float left = x - xOffset;
           float top = y - yOffset;
           float right = x + xOffset;
           float bottom = y + yOffset;
           //bitmap = BitmapFactory.decodeResource(getOverlay().getContext().getResources(), R.drawable.ic_shirt);
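           // getBitmap(...) decodes a fresh Bitmap, and createScaledBitmap(...) below makes
           // another scaled copy, on every single draw() call.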

           bitmap = getBitmap(getOverlay().getContext(), R.drawable.ic_tshirt);
           float eyeX = left-400;
    //        for(Landmark l : face.getLandmarks()){
    //            if(l.getType() == Landmark.LEFT_EYE){
    //                eyeX = l.getPosition().x + bitmap.getWidth() / 2;
    //            }
    //        }

           tshirt = Bitmap.createScaledBitmap(bitmap, (int) scaleX(bitmap.getWidth() / 2),
                   (int) scaleY(bitmap.getHeight()/2), false);
           float top_shirt=(face.getPosition().y + face.getHeight())+200;
           canvas.drawBitmap(tshirt, eyeX, top_shirt, new Paint());


           Canvas myCanvas = new Canvas(tshirt);
           myCanvas.drawBitmap(tshirt, eyeX, top_shirt, new Paint());
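           // A new saveImg AsyncTask (PNG compression plus a file write) is started for
           // every frame in which this face is drawn.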
           HashMap args = new HashMap<>();
           args.put("tshirt", tshirt);
           new saveImg(args).execute();
           //canvas.drawRect(left, top, right, bottom, mBoxPaint);
       }

    This one saves the image on the cellphone.

       package com.google.android.gms.samples.vision.face.facetracker;

    import android.graphics.Bitmap;
    import android.os.AsyncTask;
    import android.os.Environment;

    import java.io.File;
    import java.io.FileOutputStream;
    import java.util.ArrayList;
    import java.util.HashMap;

    /**
    * Created by diegohidalgo on 10/12/17.
    */

    public class saveImg extends AsyncTask<Void, Void, Void> {
       Bitmap tshirt;
       String name;
       saveImg(HashMap args){
           tshirt = (Bitmap)args.get("tshirt");
           File file=new File(Environment.getExternalStorageDirectory() + "/facedetection/");
           File[] list = file.listFiles();
           int count = 0;
           for (File f: list){
               String name = f.getName();
               if (name.endsWith(".png"))
                   count++;

           }
           name="img"+count+".png";
       }

       @Override
       protected Void doInBackground(Void... args) {
           File file = new File(Environment.getExternalStorageDirectory() + "/facedetection/"+ name);

           try {
               tshirt.compress(Bitmap.CompressFormat.PNG, 100, new FileOutputStream(file));
           } catch (Exception e) {
               e.printStackTrace();

           }
           System.gc();
           tshirt.recycle();
           tshirt= null;
           return null;
       }

       protected void onPostExecute() {

       }
    }

    This is the error it gives me before the app closes.

    10-13 10:38:17.526 8443-8443/com.google.android.gms.samples.vision.face.facetracker W/art: Throwing OutOfMemoryError "Failed to allocate a 8916492 byte allocation with 1111888 free bytes and 1085KB until OOM"
    10-13 10:38:18.020 8443-8443/com.google.android.gms.samples.vision.face.facetracker I/Process: Sending signal. PID: 8443 SIG: 9

    Thanks in advance.