
Other articles (62)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    Explanation of the notable changes made when moving from version 0.1 of MediaSPIP to version 0.3. What's new?
    Regarding software dependencies: use of the latest versions of FFmpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favor of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present the changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form: for a document of the news item type, the default fields are: publication date (customize the publication date) (...)

On other sites (9152)

  • How to apply 'simple' opacity to combined (layered) mp4s in FFmpeg

    27 May 2021, by Cam

    I am not getting the final image results I need when layering together multiple mp4s of the same length and format into a single output MP4. I am using ffmpeg to create a pseudo 'motion blur' effect on animation, and need to layer mp4s together with identical opacities to produce the final video.

    I am using a base 'black' MP4 as the first layer for a background, and then adding a series of source mp4s with equal opacity over the top in each pass. Here I am showing a Photoshop mockup using its 'normal' blending mode, which is exactly the blending effect I am trying to replicate with ffmpeg. I understand that the final composite is less "bright", but that's fine (unless you have any ideas).

    [Image: Photoshop mockup]

    Instead of looking like the result above, I am getting output where the colors are either all pink, garbled, super dark, or hugely overbright, depending on which blend mode I try.

    Here are the commands I am using:

    To create the original (uncompressed?) 'black' MP4 from a sequence of black PNGs:

    ffmpeg -start_number 0 -r 24 -f image2 -s 1920x1080 -i black_seq.%04d.png -vcodec libx264 -crf 0 -pix_fmt yuv420p black_seq.mp4 -y

    I then take that black_seq.mp4 and blend a set of n source mp4s over the top with an opacity value. This runs in a loop: the output.mp4 of each pass becomes the input.mp4 of the next pass until all layers are processed. In this example there are 10 source mp4s in total, so the opacity is set to 0.1 for each pass; a single pass is shown below. The source mp4s are all very similar in appearance and color, essentially the same animation sequence offset in time by fractions of a single frame, and they were generated from color PNGs using the same command that produced the black layer above.

    ffmpeg -i input.mp4 -i n_layer.mp4 -vcodec libx264 -crf 0 -pix_fmt yuv420p -filter_complex "blend=all_mode='overlay':all_opacity=0.1" output.mp4 -y

    Then I finally add some compression to the result as the final "blur.mp4":

    ffmpeg -i "output.mp4" -vcodec libx264 -crf 25 -pix_fmt yuv420p "blur.mp4" -y
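
    For reference, the whole multi-pass pipeline described above can be written as a small shell loop. This is only a sketch of the procedure as described, and it assumes the ten time-offset sources are named n_layer_1.mp4 through n_layer_10.mp4 (hypothetical names; the question only shows a single n_layer.mp4 placeholder):

    #!/bin/bash
    # Sketch of the multi-pass blend described above (assumed file names).

    cp black_seq.mp4 input.mp4    # pass 0: start from the black base layer

    for n in $(seq 1 10); do
        # Each pass blends one more source over the accumulated result at 10% opacity.
        ffmpeg -y -i input.mp4 -i "n_layer_${n}.mp4" -vcodec libx264 -crf 0 -pix_fmt yuv420p \
               -filter_complex "blend=all_mode='overlay':all_opacity=0.1" output.mp4
        # The output of this pass becomes the input of the next pass.
        mv output.mp4 input.mp4
    done

    # Final compression pass, as in the last command above.
    ffmpeg -y -i input.mp4 -vcodec libx264 -crf 25 -pix_fmt yuv420p blur.mp4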


    


    And yes, this is certainly a highly inefficient approach, but I am learning. The main issue I am trying to solve is that, despite the final blur.mp4 being less "bright", its colors do not match the original animation; instead it looks as if the animation has been hue-shifted somehow.

    This image shows a cropped output for comparison (the processed blur is set to zero for clarity):

    [Image: input and output example]

    I would love some insight.
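
    As an aside, and purely as a sketch rather than a tested fix: if the intent is an even average of the ten offset sources, the averaging can be done in a single pass, which avoids accumulating rounding from ten successive 8-bit blends. Recent ffmpeg builds include a mix filter for this; using the same hypothetical n_layer_*.mp4 names as in the sketch above, it could look roughly like this:

    # Single-pass average of the ten sources with the mix filter
    # (the output name blur_mix.mp4 is arbitrary).
    ffmpeg -y \
        -i n_layer_1.mp4 -i n_layer_2.mp4 -i n_layer_3.mp4 -i n_layer_4.mp4 -i n_layer_5.mp4 \
        -i n_layer_6.mp4 -i n_layer_7.mp4 -i n_layer_8.mp4 -i n_layer_9.mp4 -i n_layer_10.mp4 \
        -filter_complex "[0:v][1:v][2:v][3:v][4:v][5:v][6:v][7:v][8:v][9:v]mix=inputs=10[v]" \
        -map "[v]" -vcodec libx264 -crf 25 -pix_fmt yuv420p blur_mix.mp4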

    


  • Make a fifo pipe in Java (Windows), write some data into it, and let another process read the pipe

    9 November 2017, by vs93

    My objective is to create a named pipe (fifo) in Windows (Java), write some data (coming from a camera) into it, and invoke an ffmpeg command to make an mp4 from that data. But I suspect that it is not opening a fifo pipe; rather, it is opening a file. How can I make it open a fifo pipe?
    Here is the code:

    public void run() {
        String mp4Folder = "C://Users/user_2/Desktop/streamDestination"; // where the MP4 is to be kept
        VFrame frame = null;            // frame coming from the camera (acquisition not shown in this excerpt)
        int actualPipeValLen = 0;       // number of bytes written into the pipe so far
        FileOutputStream requestStream = null;
        long lastProcessedTS = 0;
        final String FIFOPATH = "D://FIFO//";
        final String PIPE_NAME = FIFOPATH + "myfifo";
        final String MKFIFOCOMMAND = "mkfifo -m=rw " + PIPE_NAME;
        final String DELFIFOCOMMAND = "rm -r " + PIPE_NAME;
        String mp4FileName = mp4Folder + File.separator + "1.mp4";
        mp4FileName = mp4FileName.replace("\\", "/");
        long firstTimeStamp = 0;

        // Create the fifo by running an external mkfifo command
        try {
            Runtime.getRuntime().exec(MKFIFOCOMMAND);
        } catch (IOException e1) {
            e1.printStackTrace();
        }

        if (firstTimeStamp == 0) {
            firstTimeStamp = frame.getTimestamp();
        }

        if ((frame.getTimestamp() - firstTimeStamp) > (15 * 1000)) {
            // 15 seconds of frames have been written: close the pipe and let ffmpeg read it
            if (requestStream != null) {
                requestStream.close();
                requestStream = null;
            }
            if (actualPipeValLen > 0) {
                String[] ffmpeg = new String[] {"ffmpeg", "-i", PIPE_NAME, "-vcodec", "copy", mp4FileName};
                Process ffmpegProcess = Runtime.getRuntime().exec(ffmpeg);
                actualPipeValLen = 0;
                firstTimeStamp = lastProcessedTS;
                Thread.sleep(2 * 1000);
                try {
                    Runtime.getRuntime().exec(DELFIFOCOMMAND);
                } catch (IOException e1) {
                    e1.printStackTrace();
                }
                System.exit(0);
            }
        } else {
            // Still inside the 15 second window: keep writing raw frame data into the pipe
            System.out.println("Writing into pipe : " + actualPipeValLen);
            if (requestStream == null) {
                requestStream = new FileOutputStream(PIPE_NAME);
            }
            requestStream.write(frame.getFrame());
            actualPipeValLen += frame.getFrame().length;
            lastProcessedTS = frame.getTimestamp();
        }
    }

  • Accurate mp4 cut with avconv or ffmpeg

    3 September 2014, by diogopms
    avconv -y -i "/files/33.mp4" -ss 0.000000 -t 11.8 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-0.000000.ts"
    avconv -y -i "/files/33.mp4" -ss 11.800000 -t 11.768254 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-11.800000.ts"
    avconv -y -i "/files/33.mp4" -ss 23.568254 -t 11.888617 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-23.568254.ts"
    avconv -y -i "/files/33.mp4" -ss 35.456871 -t 11.423129 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-35.456871.ts"
    avconv -y -i "/files/33.mp4" -ss 46.880000 -t 12.075465 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-46.880000.ts"
    avconv -y -i "/files/33.mp4" -ss 58.955465 -t 11.404535 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-58.955465.ts"
    avconv -y -i "/files/33.mp4" -ss 70.360000 -t 12.094059 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-70.360000.ts"
    avconv -y -i "/files/33.mp4" -ss 82.454059 -t 11.385941 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-82.454059.ts"
    avconv -y -i "/files/33.mp4" -ss 93.840000 -t 12.112653 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-93.840000.ts"
    avconv -y -i "/files/33.mp4" -ss 105.952653 -t 11.367347 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-105.952653.ts"
    avconv -y -i "/files/33.mp4" -ss 117.320000 -t 12.131247 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-117.320000.ts"
    avconv -y -i "/files/33.mp4" -ss 129.451247 -t 11.348753 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-129.451247.ts"
    avconv -y -i "/files/33.mp4" -ss 140.800000 -t 12.149841 -async 1 -bsf h264_mp4toannexb -c copy "/encoder/live/bla-140.800000.ts"

    cat bla-*.ts > final.ts ---> this final movie is not fluid... What do I need to do to fix it?

    Can anyone help with the final solution?
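
    As a sketch only (not a verified fix): one alternative to joining the pieces with cat is ffmpeg's concat demuxer, which takes an explicit, ordered list of segments. Note that a shell glob such as bla-*.ts expands in lexicographic order (bla-105.952653.ts sorts before bla-11.800000.ts), so the list should be sorted numerically; list.txt below is an assumed file name:

    # Build a playback-ordered list of the segments (numeric sort on the start time).
    ls bla-*.ts | sort -t- -k2 -g | sed "s/.*/file '&'/" > list.txt

    # Concatenate with stream copy using the concat demuxer.
    ffmpeg -f concat -safe 0 -i list.txt -c copy final.ts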