Other articles (47)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your MédiaSpip installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

  • Automated installation script of MediaSPIP

    25 April 2011

    To overcome the difficulties, mainly due to installing server-side software dependencies, an "all-in-one" installation script written in Bash was created to make this step easier on a server running a compatible Linux distribution.
    To use it you need SSH access to your server and a root account, which the script uses to install the dependencies. Contact your hosting provider if you do not have these.
    Documentation on using this installation script is available here.
    The code of this (...)

On other sites (7210)

  • How to encode a 10-bit .tif file into an HDR video with ffmpeg on the command line [closed]

    6 June 2024, by ziyuan

    How can I encode an RGB 10-bit TIFF file into an HDR video with ffmpeg?

    And one more question: does the input TIFF have to be 16-bit, and big- or little-endian?

    I have tried the method described here: How can I encode RGB images into HDR10 videos in ffmpeg command-line?, with the same command and settings as in that answer. However, I got this error: code 1026 : YUV color family cannot have RGB matrix coefficients

    Here is the TIFF file I tried: https://drive.google.com/file/d/1G8oYf9-FQJJPNUxwB0_FmlvBaMnUhCVL/view?usp=drive_link

    I also found, from the procedure information, that ffmpeg seems to set the output format to gbrp16le automatically, even when I set -pix_fmt yuv420p10le.
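
    The error suggests that the encoder is still being handed RGB frames (gbrp16le) while the output is tagged with a YUV BT.2020 matrix. A hedged sketch of one way to avoid this, assuming an ffmpeg build with the zscale (libzimg) filter and libx265, is to convert the frames to BT.2020/PQ 10-bit YUV explicitly before they reach the encoder; the exact option values may need adjusting to the source file, and if zscale complains about unknown input colour properties they can be declared with primariesin/transferin/rangein on the same filter:

    ffmpeg -i input.tif \
      -vf "zscale=primaries=2020:transfer=smpte2084:matrix=2020_ncl,format=yuv420p10le" \
      -c:v libx265 \
      -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
      output.mp4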

    


  • Where do I find the saved image in Android?

    29 October 2016, by FirstStep

    Excuse me, quick question:

    I have this video-streaming routine, where I receive packets, convert them to byte[], then to bitmaps, and then display them on the screen:

    dsocket.receive(packetReceived); // receive packet
    byte[] buff = packetReceived.getData(); // convert packet to byte[]
    final Bitmap ReceivedImage = BitmapFactory.decodeByteArray(buff, 0, buff.length); // convert byte[] to bitmap image

    runOnUiThread(new Runnable()
    {
       @Override
       public void run()
       {
           // this is executed on the main (UI) thread
           imageView.setImageBitmap(ReceivedImage);
       }
    });

    Now I want to implement a recording feature. Suggestions say I need to use FFmpeg (which I have no idea how to do), but first I need to prepare a directory of the ordered images so it can then be converted to a video file. To do that I will save all the images internally, and I am using this answer to save each image:

    if(RecordVideo && !PauseRecording) {
       saveToInternalStorage(ReceivedImage, ImageNumber);
       ImageNumber++;
    }
    else
    {
       if(!RecordVideo)
           ImageNumber = 0;
    }

    // ...

    private void saveToInternalStorage(Bitmap bitmapImage, int counter){
        ContextWrapper cw = new ContextWrapper(getApplicationContext());

        // path to /data/data/yourapp/app_data/imageDir
        File MyDirectory = cw.getDir("imageDir", Context.MODE_PRIVATE);

        // Create the frame file inside imageDir; the extension matches the PNG
        // compression format used below
        File MyPath = new File(MyDirectory, "Image" + counter + ".png");

        FileOutputStream fos = null;
        try {
            fos = new FileOutputStream(MyPath);

            // Use the compress method on the Bitmap object to write the image to the OutputStream
            bitmapImage.compress(Bitmap.CompressFormat.PNG, 100, fos);

        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // fos is null if opening the stream failed, so guard before closing
            if (fos != null) {
                try {
                    fos.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        //return MyDirectory.getAbsolutePath();
    }

    But I can't seem to find the directory on my device (to check whether I actually succeeded in creating it). Where exactly is the // path to /data/data/yourapp/app_data/imageDir located?
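
    For what it is worth, Context.getDir() places the directory in the app's private internal storage and prefixes the name with app_, so it typically ends up at /data/data/<your.package.name>/app_imageDir; on an unrooted device it is only visible to the app itself (or through Android Studio's Device File Explorer / adb run-as). A small, hypothetical verification snippet (assuming android.util.Log and java.io.File are imported) that logs the real path and the files written, so the result can be checked in Logcat:

    // Hypothetical check: log where "imageDir" actually lives and what it contains
    File dir = new ContextWrapper(getApplicationContext()).getDir("imageDir", Context.MODE_PRIVATE);
    Log.d("ImageSaver", "imageDir is at: " + dir.getAbsolutePath());
    File[] frames = dir.listFiles();
    if (frames != null) {
        for (File f : frames) {
            Log.d("ImageSaver", "  " + f.getName() + " (" + f.length() + " bytes)");
        }
    }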

  • How to stop recording dynamically with the FFmpeg CLI wrapper in Java

    14 December 2017, by user2237529

    https://github.com/bramp/ffmpeg-cli-wrapper/issues/13

    public class ScreenCaptureFFMPEG {

       public static void record(String outputVideo, String time) throws Exception
       {
           RunProcessFunction func = new RunProcessFunction();

           FFmpeg ffmpeg = new FFmpeg("C:\\FFMPEG\\ffmpeg.exe");
           FFmpegBuilder builder = new FFmpegBuilder()
                   .addExtraArgs("-rtbufsize", "1500M")
                   .addExtraArgs("-r", "30")
                   .setFormat("dshow")
                   .setInput("video=\"screen-capture-recorder\"")
                   .addOutput(outputVideo)
                   .setFormat("mp4")
                   .addExtraArgs("-crf", "0")
                   .setVideoCodec("libx264")
                   //.addExtraArgs("-ac", "1")
                   .addExtraArgs("-y")
           //overwrite file name

                   // .setAudioCodec("libmp3lame")
                   // .setAudioSampleRate(FFmpeg.AUDIO_SAMPLE_44100)
                   //  .setAudioBitRate(1_000_000)

                   //.addExtraArgs("-ar", "44100")
                   .addExtraArgs("-t", time)

                   //.setVideoPixelFormat("yuv420p")
                   //.setVideoResolution(426, 240)
                   //.setVideoBitRate(2_000_000)
                   //.setVideoFrameRate(30)
                   //.addExtraArgs("-deinterlace")
                   //.addExtraArgs("-preset", "medium")
                   //.addExtraArgs("-g", "30")
                   .done();
           FFmpegExecutor executor = new FFmpegExecutor(ffmpeg);

           executor.createJob(builder).run();


       }

       public static void capture(String name) throws Exception
       {
           BufferedImage image = new Robot().createScreenCapture(new Rectangle(Toolkit.getDefaultToolkit().getScreenSize()));
           ImageIO.write(image, "png", new File(name));

       }

    public static void main(String[] args) throws Exception {

        capture("start.png");
        //record("TC_test.mp4", "00:00:10");
        capture("end.png");
        Thread t = new Thread(new Runnable() {
            @Override
            public void run() {
                // Run the blocking recording job on its own thread
                try {
                    record("TC_test.mp4", "00:00:10");
                }
                catch(Exception e)
                {
                    System.out.println("hello");
                }
            }
        });
        t.start();          // the thread must be started, otherwise nothing is recorded
        Thread.sleep(5000);

        // This only interrupts the Java thread; it does not terminate the external ffmpeg process
        t.interrupt();

    }
    }

    I am trying to stop the recording by killing the subprocess; any other method would be fine too. But I am unable to do that. I know this is supported according to the GitHub link. How do I kill the subprocess? (One possible approach is sketched below.)

    PS: This is the first time I have posted a question on Stack Overflow, so if I made any mistakes please excuse me and tell me how I can improve.
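
    One approach that does not rely on the wrapper's own cancellation support (the subject of the linked GitHub issue) is to launch ffmpeg directly with ProcessBuilder, keep the Process handle, and stop the recording either gracefully, by writing 'q' to ffmpeg's standard input, or forcibly with destroy(). The sketch below only illustrates that idea under the assumptions of the question (same ffmpeg.exe path and dshow device); it is not the ffmpeg-cli-wrapper API.

    import java.io.OutputStream;
    import java.nio.charset.StandardCharsets;

    public class StoppableScreenRecording {

        public static void main(String[] args) throws Exception {
            // Launch ffmpeg directly so we keep a handle to the subprocess.
            // No shell is involved, so the dshow device name needs no extra quotes.
            ProcessBuilder pb = new ProcessBuilder(
                    "C:\\FFMPEG\\ffmpeg.exe",
                    "-rtbufsize", "1500M",
                    "-f", "dshow", "-r", "30",
                    "-i", "video=screen-capture-recorder",
                    "-c:v", "libx264", "-crf", "0",
                    "-y", "TC_test.mp4");
            pb.redirectErrorStream(true);                        // merge ffmpeg's log (stderr) into stdout
            pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);  // print it, so the pipe never fills up
            Process ffmpeg = pb.start();

            Thread.sleep(10_000); // record for roughly ten seconds

            // Graceful stop: ffmpeg finalizes the output file when it reads 'q' on its stdin.
            try (OutputStream stdin = ffmpeg.getOutputStream()) {
                stdin.write("q".getBytes(StandardCharsets.US_ASCII));
                stdin.flush();
            }
            ffmpeg.waitFor();
            // If a clean, playable file is not needed, ffmpeg.destroy() kills the process immediately instead.
        }
    }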