Advanced search

Media (91)

Other articles (86)

  • Customize by adding your logo, banner, or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013, by

    Present changes to your MédiaSPIP, or news about your projects on your MédiaSPIP, using the news section.
    In MédiaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the form used to create a news item.
    News item creation form. For a document of type news item, the default fields are: publication date (customize the publication date) (...)

  • Managing creation and editing rights for objects

    8 February 2011, by

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, configurable in the form template management; adding notes to articles; adding captions and annotations to images;

On other sites (12016)

  • ffmpeg: Converting animated GIF files to video while upscaling produces a file with inaccurate colors

    14 June 2020, by Metamoran

    I apologize if this is a dumb question, but even after using the search function I have not seen anyone asking about this.

    



    I'm trying to do two things at once:
    (1) convert some low-resolution animated GIF files (pixel art in particular) to video, and
    (2) upscale them at the same time, using nearest-neighbor scaling to preserve the hard edges.

    



    ffmpeg does everything with no warnings or errors whatsoever, but the colors of the end result look off. If I convert without upscaling, the color accuracy is preserved. I have tried both using and not using palettegen, but it makes no difference; for brevity, I'm only pasting the commands that use palettegen. The end results are the same either way.

    



    This is what I've been using for upscaling:

    



    ffmpeg -i input.gif -c:v libx264 -b:v 10000K -y -vf "split[s0][s1];[s0]palettegen[p];[s1][p]paletteuse,scale=2*iw:2*ih:flags=neighbor" output.mp4


    



    This is what I used for testing conversion (with no upscaling):

    



    ffmpeg -i input.gif -c:v libx264 -b:v 10000K -y -vf "split[s0][s1];[s0]palettegen[p];[s1][p]paletteuse" output.mp4


    



    Here's the input file:

    



    Original animated GIF file

    



    Here's a screenshot of how the final video looks if I don't upscale (colors look exactly like in the input file):

    



    Screenshot - Final - No Upscaling

    



    And here's the result if I upscale (2x nearest-neighbor upscaling, with the screenshot downsized 50% to make comparison easier; colors were not altered during the process):

    



    Screenshot - Final - 2x Upscaling (Nearest-Neighbor) [Downsized]

    



    Folder with all relevant files: Google Drive

    



    Is there something I'm missing? Or is there some sort of technical limitation, a step that will alter the colors of the video no matter what I try? I'm not technically inclined, so I'd like to know if that is the case. Thank you for your time.

    


  • avformat/dhav: fix backward scanning for get_duration and optimize seeking

    21 March, by Justin Ruggles
    avformat/dhav: fix backward scanning for get_duration and optimize seeking
    

    The backwards scanning done for incomplete final packets should not
    assume a specific alignment at the end of the file. Truncated files
    result in hundreds of thousands of seeks if the final packet does not
    fall on a specific byte boundary, which can be extremely slow.
    For example, with HTTP, each backwards seek results in a separate
    HTTP request.

    This changes the scanning to check for the end tag 1 byte at a time
    and buffers the last 1 MiB to avoid additional seek operations.

    Co-authored-by: Derek Buitenhuis <derek.buitenhuis@gmail.com>
    Signed-off-by: Justin Ruggles <justinr@vimeo.com>
    Signed-off-by: Derek Buitenhuis <derek.buitenhuis@gmail.com>

    • [DH] libavformat/dhav.c
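    As a rough illustration of the approach described in this commit message (a sketch in Java, not the actual libavformat C code; the class name and the tag bytes are placeholders), the idea is to buffer the last 1 MiB of the file once and scan it backwards one byte at a time for the end tag:

    import java.io.IOException;
    import java.io.RandomAccessFile;

    // Sketch of the buffered backward scan described in the commit message above.
    // This is NOT the libavformat implementation; END_TAG is a placeholder value.
    final class TailScanner {
        private static final int TAIL_SIZE = 1 << 20;                 // read the last 1 MiB once
        private static final byte[] END_TAG = {'d', 'h', 'a', 'v'};   // placeholder end tag

        // Returns the absolute offset of the last end tag within the final 1 MiB, or -1.
        static long findLastEndTag(String path) throws IOException {
            try (RandomAccessFile file = new RandomAccessFile(path, "r")) {
                long size = file.length();
                int toRead = (int) Math.min(TAIL_SIZE, size);
                long start = size - toRead;
                byte[] tail = new byte[toRead];
                file.seek(start);
                file.readFully(tail);

                // Scan backwards one byte at a time: no assumption about how the
                // truncated final packet is aligned relative to the end of the file.
                for (int i = toRead - END_TAG.length; i >= 0; i--) {
                    boolean match = true;
                    for (int j = 0; j < END_TAG.length; j++) {
                        if (tail[i + j] != END_TAG[j]) { match = false; break; }
                    }
                    if (match) {
                        return start + i;
                    }
                }
                return -1;
            }
        }
    }

    Reading the tail once keeps the number of seek operations constant, which is what avoids the per-seek HTTP request problem mentioned in the commit message.
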
  • ProcessBuilder is not called when trying to start a process

    15 June 2022, by xnok

    I am trying to understand more about ffmpeg usage in JavaCV for Android Studio, and for that task I am trying to use ProcessBuilder. I tried writing a simple program to debug pb.start(), but I am not getting a response. What I did was start a default/empty activity and paste in the following program:


    package com.example.myapplication;

    import androidx.annotation.RequiresApi;
    import androidx.appcompat.app.AppCompatActivity;
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.OutputStream;

    import org.bytedeco.javacpp.Loader;

    import android.os.Build;
    import android.os.Bundle;
    import android.util.Log;

    public class MainActivity extends AppCompatActivity {
        static final int cols = 192;
        static final int rows = 108;
        static final String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
        static final String rtmp_url = "test.flv";
        static final String[] command = {ffmpeg,
                "-y",
                "-f", "rawvideo",
                "-vcodec", "rawvideo",
                "-pix_fmt", "bgr24",
                "-s", (Integer.toString(cols) + "x" + Integer.toString(rows)),
                "-r", "10",
                "-i", "pipe:",
                "-c:v", "libx264",
                "-pix_fmt", "yuv420p",
                "-preset", "ultrafast",
                "-f", "flv",
                rtmp_url};
        @RequiresApi(api = Build.VERSION_CODES.O)
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
            new Thread(t1).start();
        }
        private static Runnable t1 = () -> {
            Log.e("TAG", "void OnCreate called successfully!");
            ProcessBuilder pb = new ProcessBuilder(command).redirectErrorStream(true);
            pb.redirectErrorStream(true);
            try {
                Process process = pb.start();
                BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
                OutputStream writer = process.getOutputStream();
                Log.e("TAG", "Something good happened here");
            } catch (IOException e) {
                e.printStackTrace();
                Log.e("TAG", "Nothing good happened here");
            }
        };
    }


    My current problem is that I can't seem to start the ProcessBuilder process properly via pb.start();


    I get the following logs from the Logcat panel:


    2022-06-14 17:24:46.328 13371-13371/com.example.myapplication E/TAG: void OnCreate called successfully!
    2022-06-14 17:24:46.333 13371-13371/com.example.myapplication E/TAG: Nothing good happened here


    I'd like to understand why it ends up in the catch block and never starts the process.


    EDIT: I made some changes as per @g00se's suggestions, and I got the following stack trace from the code above:


    2022-06-15 00:32:26.700 29787-29787/? E/USNET: USNET: appName: com.example.myapplication
    2022-06-15 00:32:29.328 29787-29828/com.example.myapplication E/TAG: void OnCreate called successfully!
    2022-06-15 00:32:29.330 29787-29828/com.example.myapplication E/AndroidRuntime: FATAL EXCEPTION: Thread-4
        Process: com.example.myapplication, PID: 29787
        java.lang.NullPointerException
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
            at com.example.myapplication.MainActivity.lambda$static$0(MainActivity.java:48)
            at com.example.myapplication.MainActivity$$ExternalSyntheticLambda0.run(Unknown Source:0)
            at java.lang.Thread.run(Thread.java:920)

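    One thing worth checking here, offered as a hedged guess rather than a confirmed cause: ProcessBuilder.start() is documented to throw NullPointerException when an element of the command list is null, so the ffmpeg path returned by Loader.load(...) may not have resolved. A minimal diagnostic sketch follows; the class name CommandCheck and method looksValid are hypothetical, invented for the example:

    import android.util.Log;

    // Diagnostic sketch, not part of the original question. It only logs the command
    // array so a null element (most plausibly the ffmpeg path at index 0, if the
    // JavaCPP/FFmpeg natives failed to load) becomes visible before calling start().
    final class CommandCheck {
        private CommandCheck() {}

        static boolean looksValid(String[] command) {
            boolean ok = true;
            for (int i = 0; i < command.length; i++) {
                if (command[i] == null) {
                    Log.e("TAG", "command[" + i + "] is null");
                    ok = false;
                } else {
                    Log.d("TAG", "command[" + i + "] = " + command[i]);
                }
            }
            return ok;
        }
    }

    Calling CommandCheck.looksValid(command) at the top of the Runnable, and skipping pb.start() when it returns false, would confirm or rule out a null command element as the source of the NullPointerException shown in the stack trace.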