Advanced search

Media (1)

Word: - Tags -/iphone

Other articles (44)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Managing the farm

    2 March 2010, by

    The farm as a whole is managed by "super admins".
    Some settings can be made in order to regulate the needs of the different channels.
    Initially it uses the "Gestion de mutualisation" plugin

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first MediaSPIP stable release.
    Its official release date is June 21, 2013, and it is announced here.
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

On other sites (6904)

  • How to fix: GUI application that controls a simple ffmpeg command with Python

    9 October 2019, by maoca

    I want to make a graphical application with only 2 buttons (start/stop) that allows me to launch a subprocess (start) and stop it (stop).
    (I’m using Python 3, PyQt5 and ffmpeg)
    The process captures the screen to a video and saves it to an mp4, using an ffmpeg command launched with Popen.
    To end the recording cleanly, ffmpeg accepts ’q’, which I write to its stdin.

    In a simple script it works for me but I can’t get it to work within the buttons.

    My knowledge is very basic, and however much I look for information I do not understand what I am doing wrong. I appreciate any comments that help me move on. (A sketch of one possible fix is included after the code below.)

    This is my code:

    import sys
    import subprocess
    from PyQt5.QtWidgets import QApplication, QWidget, QPushButton

    class Ventana(QWidget):

       def __init__(self):
           super().__init__()
           # Button 1
           pybutton = QPushButton('REC', self)
           pybutton.clicked.connect(self.clickMethodB1)
           pybutton.resize(50, 32)
           pybutton.move(50, 50)

           # Button 2
           pybutton = QPushButton('STOP', self)
           pybutton.clicked.connect(self.clickMethodB2)
           pybutton.resize(100, 32)
           pybutton.move(150, 50)


           self.initUI()


       def initUI(self):
           self.setGeometry(300, 300, 300, 220)
           self.setWindowTitle('FFMPEG')
           self.move(800, 400)
           self.show()

       def clickMethodB1(self):
           global ffmpeg
           filename_mp4 = 'c://tmp//output.mp4'
           print('REC')

           command = 'ffmpeg -f dshow -i video="screen-capture-recorder" '+ filename_mp4

           ffmpeg = subprocess.Popen(command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, encoding='utf-8', shell=True)


       def clickMethodB2(self):

           print('STOP')
           ffmpeg.stdin.write(str('q'))            


    if __name__ == '__main__':
       app = QApplication(sys.argv)
       ex = Ventana()
       sys.exit(app.exec_())
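
    Not part of the original question: a minimal sketch of how the stop could be made to work, keeping the same ffmpeg invocation as above. The Recorder class and its method names are made up for illustration; the key differences from the code above are holding the Popen handle on the object, flushing stdin after writing 'q', and waiting for ffmpeg to finish writing the mp4.

    import subprocess

    class Recorder:
        """Illustrative sketch, not the poster's code: start/stop a screen capture."""

        def __init__(self):
            self.proc = None

        def start(self, filename_mp4='c://tmp//output.mp4'):
            # Same command as in the question, launched through the shell
            command = 'ffmpeg -f dshow -i video="screen-capture-recorder" ' + filename_mp4
            self.proc = subprocess.Popen(command, stdin=subprocess.PIPE,
                                         stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                                         encoding='utf-8', shell=True)

        def stop(self):
            if self.proc and self.proc.poll() is None:
                self.proc.stdin.write('q')   # ask ffmpeg to finish cleanly
                self.proc.stdin.flush()      # the text wrapper is buffered; without the
                                             # flush the 'q' may never reach ffmpeg
                self.proc.wait(timeout=10)   # give ffmpeg time to finalize the mp4

    In the PyQt window above, clickMethodB1 and clickMethodB2 could then simply call start() and stop() on a Recorder instance kept on the window.
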
  • Joining realtime raw PCM streams with ffmpeg and streaming them back out

    15 April 2024, by Nathan Ladwig

    I am trying to use ffmpeg to join two PCM streams. I have it sort of working, but it's not working well.

    I am using Python to receive two streams from two computers running Scream Audio Driver (https://github.com/duncanthrax/scream).

    I am taking them in over UDP and writing them to pipes. The pipes are read by ffmpeg and mixed, and it writes the mixed stream to another pipe. I'm reading that back in Python and sending it to the target receiver.

    My ffmpeg command is:

    ['ffmpeg', 
'-use_wallclock_as_timestamps', 'true', '-f', 's24le', '-ac', '2', '-ar', '48000', '-i', '/tmp/ffmpeg-fifo-1',
'-use_wallclock_as_timestamps', 'true', '-f', 's24le', '-ac', '2', '-ar', '48000', '-i', '/tmp/ffmpeg-fifo-2',
'-filter_complex', '[0]aresample=async=1[a0],[1]aresample=async=1[a1],[a0][a1]amix', '-y',
'-f', 's24le', '-ac', '2', '-ar', '48000', '/tmp/ffmpeg-fifo-in']

    My main issue is that it should be reading ffmpeg-fifo-1 and ffmpeg-fifo-2 asynchronously, but it appears not to be. When the buffers get more than 50 frames out of sync with each other, ffmpeg hangs and doesn't recover. I would like to fix this.

    In this hacky test code, the number of frames sent over each stream is counted, and empty frames are sent if the gap hits 12. This keeps ffmpeg happy.

    The code below takes in two 48KHz 24-bit stereo PCM streams with Scream's header, mixes them, applies the same header, and sends them back out.

    It works most of the time. Sometimes I get blasted with static; I think this happens when only one or two bytes of a frame make it to ffmpeg and it loses track.

    Each packet is always 1152 bytes of PCM data with a 5-byte header. It's described in the Scream repo readme (a small packing sketch follows the breakdown below).

    This is my header:

    01 18 02 03 00

    01 - 48KHz
    18 - sample size (18h = 24d, 24-bit)
    02 - 2 channels
    03 00 - WAVEFORMATEXTENSIBLE
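
    Not from the original post: a small sketch that packs the 5-byte Scream header exactly as described above and prepends it to one 1152-byte PCM frame. The function names are made up for illustration; the byte meanings follow the breakdown in the question.

import struct

FRAME_BYTES = 1152  # PCM payload per packet, per the question

def pack_header(rate_byte=0x01, sample_bits=0x18, channels=2, channel_mask=0x0003):
    # 0x01 -> 48KHz, 0x18 (24 decimal) -> 24-bit samples, 0x02 -> 2 channels,
    # 0x03 0x00 -> WAVEFORMATEXTENSIBLE channel mask, little-endian
    return struct.pack("<BBBH", rate_byte, sample_bits, channels, channel_mask)

def build_packet(pcm_frame):
    # Prepend the header to one frame of s24le stereo PCM, as the Sender class does
    assert len(pcm_frame) == FRAME_BYTES
    return pack_header() + pcm_frame

def split_packet(packet):
    # Inverse of build_packet: returns (header, pcm_payload)
    return packet[:5], packet[5:]

assert pack_header() == bytes([0x01, 0x18, 0x02, 0x03, 0x00])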

    import socket
import struct
import threading
import os
import sys
import time
import subprocess
import tempfile
import select

class Sender(threading.Thread):
    def __init__(self):
        super().__init__()
        TEMPDIR = tempfile.gettempdir() + "/"
        self.fifoin = TEMPDIR + "ffmpeg-fifo-in"
        self.start()

    def run(self):
        self.fd = open(self.fifoin, "rb")
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        while True:
            try:
                header = bytes([0x01, 0x18, 0x02, 0x03, 0x00])  # 48khz, 24-bit, stereo
                data = self.fd.read(1152)
                sendbuf = header + data
                self.sock.sendto(sendbuf, ("192.168.3.199", 4010))  # Audio sink
            except Exception as e:
                print("Except")
                print(e)

class Receiver(threading.Thread):
    def __init__(self):
        super().__init__()
        TEMPDIR = tempfile.gettempdir() + "/"
        self.fifo1 = TEMPDIR + "ffmpeg-fifo-1"
        self.fifo2 = TEMPDIR + "ffmpeg-fifo-2"
        self.fifoin = TEMPDIR + "ffmpeg-fifo-in"
        self.fifos = [self.fifo1, self.fifo2]
        try:
            try:
                os.remove(self.fifoin)
            except:
                pass
            os.mkfifo(self.fifoin)
        except:
            pass
        self.start()
        sender=Sender()

    def run(self):
        ffmpeg_command=['ffmpeg', '-use_wallclock_as_timestamps', 'true', '-f', 's24le', '-ac', '2', '-ar', '48000', '-i', self.fifo1,
                                  '-use_wallclock_as_timestamps', 'true', '-f', 's24le', '-ac', '2', '-ar', '48000', '-i', self.fifo2,
                                  '-filter_complex', '[0]aresample=async=1[a0],[1]aresample=async=1[a1],[a0][a1]amix', "-y", '-f', 's24le', '-ac', '2', '-ar', '48000', self.fifoin]
        print(ffmpeg_command)

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET,socket.SO_RCVBUF,4096)
        sock.bind(("", 16401))

        recvbuf = bytearray(1157)
        framecount = [0,0]
        closed = 1
        while True:
            ready = select.select([sock], [], [], .2)
            if ready[0]:
                recvbuf, addr = sock.recvfrom(1157)
                if closed == 1:
                    for fifo in self.fifos:
                        try:
                            try:
                                os.remove(fifo)
                            except:
                                pass
                            os.mkfifo(fifo)
                        except:
                            pass
                    framecount = [0,0]
                    print("data, starting ffmpeg")
                    ffmpeg = subprocess.Popen (ffmpeg_command, shell=False, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
                    fifo1_fd = os.open(self.fifo1, os.O_RDWR)
                    fifo1_file = os.fdopen(fifo1_fd, 'wb', 0)
                    fifo2_fd = os.open(self.fifo2, os.O_RDWR)
                    fifo2_file = os.fdopen(fifo2_fd, 'wb', 0)
                    closed = 0
                    for i in range(0,6):
                        fifo1_file.write(bytes([0]*1157))
                        fifo2_file.write(bytes([0]*1157))

                if addr[0] == "192.168.3.199":
                    fifo1_file.write(recvbuf[5:])
                    framecount[0] = framecount[0] + 1

                if addr[0] == "192.168.3.119":
                    fifo2_file.write(recvbuf[5:])
                    framecount[1] = framecount[1] + 1

                # Keep buffers roughly in sync while playing
                targetframes=max(framecount)
                if targetframes - framecount[0] > 11:
                    while (targetframes - framecount[0]) > 0:
                        fifo1_file.write(bytes([0]*1157))
                        framecount[0] = framecount[0] + 1

                if targetframes - framecount[1] > 11:
                    while (targetframes - framecount[1]) > 0:
                        fifo2_file.write(bytes([0]*1157))
                        framecount[1] = framecount[1] + 1
            else:
                if closed == 0:
                    ffmpeg.kill()
                    print("No data, killing ffmpeg")
                    fifo1_file.close()
                    fifo2_file.close()
                    closed = 1
receiver=Receiver()

while True:
    time.sleep(50000)

    Does anybody have any pointers on how I can make this better?

  • Some FFMPEG library commands not working on Android

    21 February 2014, by Saurabh Prajapati

    I need the following 2 commands to work on the Android platform. I found many articles on this site where people report that these commands work fine for them, but they are not working on my end.

    For the fade effect:
    "ffmpeg -i filename1 fade=in:5:8 output.mp4"

    To concatenate video files:
    "ffmpeg -i concat : filename1|filename2 -codec copy output.mp4"

    Error: the app throws errors like unknown command "concate" and "fad-in5:8". (A sketch of how these arguments need to be split is included after the code below.)

    My goal: I need to concatenate 2 "mp4" video files on the Android platform with fade-in/fade-out effects.

    Following is my code

    public class VideoTest extends Activity {

    public static final String LOGTAG = "MJPEG_FFMPEG";
    byte[] previewCallbackBuffer;

    boolean recording = false;
    boolean previewRunning = false;

    File jpegFile;          
    int fileCount = 0;

    FileOutputStream fos;
    BufferedOutputStream bos;
    Button recordButton;

    Camera.Parameters p;

    NumberFormat fileCountFormatter = new DecimalFormat("00000");
    String formattedFileCount;

    ProcessVideo processVideo;

    String[] libraryAssets = {"ffmpeg","ffmpeg.so",
           "libavcodec.so", "libavcodec.so.52", "libavcodec.so.52.99.1",
           "libavcore.so", "libavcore.so.0", "libavcore.so.0.16.0",
           "libavdevice.so", "libavdevice.so.52", "libavdevice.so.52.2.2",
           "libavfilter.so", "libavfilter.so.1", "libavfilter.so.1.69.0",
           "libavformat.so", "libavformat.so.52", "libavformat.so.52.88.0",
           "libavutil.so", "libavutil.so.50", "libavutil.so.50.34.0",
           "libswscale.so", "libswscale.so.0", "libswscale.so.0.12.0"
    };

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);

       for (int i = 0; i < libraryAssets.length; i++) {
           try {
               InputStream ffmpegInputStream = this.getAssets().open(libraryAssets[i]);
               FileMover fm = new FileMover(ffmpegInputStream,"/data/data/com.mobvcasting.mjpegffmpeg/" + libraryAssets[i]);
               fm.moveIt();
           } catch (IOException e) {
               e.printStackTrace();
           }
       }

       Process process = null;

       try {
           String[] args = {"/system/bin/chmod", "755", "/data/data/com.mobvcasting.mjpegffmpeg/ffmpeg"};
           process = new ProcessBuilder(args).start();        
           try {
               process.waitFor();
           } catch (InterruptedException e) {
               e.printStackTrace();
           }
           process.destroy();

       } catch (IOException e) {
           e.printStackTrace();
       }

       File savePath = new File(Environment.getExternalStorageDirectory().getPath() + "/com.mobvcasting.mjpegffmpeg/");
       savePath.mkdirs();

       requestWindowFeature(Window.FEATURE_NO_TITLE);
       getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);

       setContentView(R.layout.main);


       processVideo = new ProcessVideo();
       processVideo.execute();
    }



    @Override
    public void onConfigurationChanged(Configuration conf)
    {
       super.onConfigurationChanged(conf);
    }  


    private class ProcessVideo extends AsyncTask<Void, Void, Void> {
       @Override
       protected Void doInBackground(Void... params) {
           Log.d("test", "VideoTest doInBackground Start");
           /*String videofile = Environment.getExternalStorageDirectory().getPath() + "/com.mobvcasting.mjpegffmpeg/splitter.mp4";
           File file = new File(videofile);
           if(file.exists())
               file.delete();
           file=null;*/

           Process ffmpegProcess = null;
           try {

               String filename1 = Environment.getExternalStorageDirectory().getPath()+ "/com.mobvcasting.mjpegffmpeg/test.mp4";
               String filename2 = Environment.getExternalStorageDirectory().getPath()+ "/com.mobvcasting.mjpegffmpeg/splitter.mp4";
               String StartPath = Environment.getExternalStorageDirectory().getPath() + "/com.mobvcasting.mjpegffmpeg/";

               //String[] ffmpegCommand = {"/data/data/com.mobvcasting.mjpegffmpeg/ffmpeg", "-i", "concat:\""+ filename1+"|"+ filename2+"\"", "-codec", "copy", Environment.getExternalStorageDirectory().getPath() + "/com.mobvcasting.mjpegffmpeg/output.mp4"};
               //String[] ffmpegCommand = {"/data/data/com.mobvcasting.mjpegffmpeg/ffmpeg", "-i", filename1, "fade=in:5:8", Environment.getExternalStorageDirectory().getPath() + "/com.mobvcasting.mjpegffmpeg/output.mp4"};

               ffmpegProcess = new ProcessBuilder(ffmpegCommand).redirectErrorStream(true).start();            

               OutputStream ffmpegOutStream = ffmpegProcess.getOutputStream();
               BufferedReader reader = new BufferedReader(new InputStreamReader(ffmpegProcess.getInputStream()));

               String line;

               Log.d("test", "***Starting FFMPEG***");
               while ((line = reader.readLine()) != null)
               {
                   Log.d("test", "***"+line+"***");
               }
               Log.d("test", "***Ending FFMPEG***");


           } catch (IOException e) {
               e.printStackTrace();
           }

           if (ffmpegProcess != null) {
               ffmpegProcess.destroy();        
           }
           Log.d("test", "doInBackground End");
           return null;
       }

        protected void onPostExecute(Void... result) {
            Log.d("test", "onPostExecute");
            Toast toast = Toast.makeText(VideoTest.this, "Done Processing Video", Toast.LENGTH_LONG);
            toast.show();
        }
    }

    Just for your information, I have copied the source from the following library:

    https://github.com/pvskalyan/Android-MJPEG-Video-Capture-FFMPEG?source=c
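
    Not from the original post: a hedged sketch of how the two commands would need to be tokenized when they are passed to ProcessBuilder as an argument array rather than typed into a shell. The fade filter has to be attached with -vf, and the concat protocol takes the whole file list as a single argument with no spaces around the colon or pipe (whether the concat protocol can join plain mp4 files is a separate question). The lists below are written as Python lists purely as notation for the String[] ffmpegCommand array in the Java code above; the paths are placeholders.

    # Illustrative argument arrays for the two commands described in the question.
    # filename1, filename2 and output stand in for the paths built in the Java code.
    filename1 = "/sdcard/com.mobvcasting.mjpegffmpeg/test.mp4"      # placeholder
    filename2 = "/sdcard/com.mobvcasting.mjpegffmpeg/splitter.mp4"  # placeholder
    output = "/sdcard/com.mobvcasting.mjpegffmpeg/output.mp4"       # placeholder

    # Fade: the filter expression must be passed through -vf, not as a bare argument.
    fade_command = ["ffmpeg", "-i", filename1, "-vf", "fade=in:5:8", output]

    # Concat protocol: a single -i argument, with no spaces around ':' or '|'.
    concat_command = ["ffmpeg", "-i", "concat:" + filename1 + "|" + filename2,
                      "-codec", "copy", output]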