
Other articles (63)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to find out.

  • User profiles

    12 April 2011

    Each user has a profile page for editing their personal information. In the default top-of-page menu, a menu entry is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
    Users can also reach the profile editor from their author page, via a "Modifier votre profil" link in the navigation (...)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To enable support for new languages, go to the "Administrer" section of the site.
    From there, the navigation menu gives access to a "Gestion des langues" area where support for new languages can be activated.
    Any newly added language can still be deactivated as long as no object has been created in that language. Once one has, the language becomes greyed out in the configuration and (...)

On other sites (5503)

  • fftools/ffprobe: add analyze_frames option for CC and grain detection

    8 December 2024, by Marth64
    fftools/ffprobe: add analyze_frames option for CC and grain detection
    

    Currently, ffprobe has two stream-level fields that do not work
    (closed_captions and film_grain).

    Their value is always 0 because ffprobe cannot access the internal
    codec properties when it is setting up its stream contexts.

    In this commit, add the new option -analyze_frames to ffprobe,
    allowing the user to read frames up to the interval they have defined
    and fill these fields based on what is exposed in AVPacketSideData.

    Additionally, in the same commit, don't write these fields to
    the output unless analyze_frames is enabled. Finally, fix the
    FATE test refs accordingly and update the docs.

    Signed-off-by: Marth64 <marth64@proxyid.net>

    • [DH] doc/ffprobe.texi
    • [DH] fftools/ffprobe.c
    • [DH] tests/ref/fate/cavs-demux
    • [DH] tests/ref/fate/concat-demuxer-extended-lavf-mxf
    • [DH] tests/ref/fate/concat-demuxer-extended-lavf-mxf_d10
    • [DH] tests/ref/fate/concat-demuxer-simple1-lavf-mxf
    • [DH] tests/ref/fate/concat-demuxer-simple1-lavf-mxf_d10
    • [DH] tests/ref/fate/concat-demuxer-simple2-lavf-ts
    • [DH] tests/ref/fate/ffprobe_compact
    • [DH] tests/ref/fate/ffprobe_csv
    • [DH] tests/ref/fate/ffprobe_default
    • [DH] tests/ref/fate/ffprobe_flat
    • [DH] tests/ref/fate/ffprobe_ini
    • [DH] tests/ref/fate/ffprobe_json
    • [DH] tests/ref/fate/ffprobe_xml
    • [DH] tests/ref/fate/ffprobe_xsd
    • [DH] tests/ref/fate/flv-demux
    • [DH] tests/ref/fate/hapqa-extract-nosnappy-to-hapalphaonly-mov
    • [DH] tests/ref/fate/hapqa-extract-nosnappy-to-hapq-mov
    • [DH] tests/ref/fate/mov-zombie
    • [DH] tests/ref/fate/mxf-probe-applehdr10
    • [DH] tests/ref/fate/mxf-probe-d10
    • [DH] tests/ref/fate/mxf-probe-dnxhd
    • [DH] tests/ref/fate/mxf-probe-dv25
    • [DH] tests/ref/fate/mxf-probe-j2k
    • [DH] tests/ref/fate/ts-demux
    • [DH] tests/ref/fate/ts-small-demux
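    A hypothetical invocation of the new option might look like the following. Only the -analyze_frames flag is named in the commit message; the -read_intervals value, the -show_entries selection, and the input file are assumptions for illustration:

    ```shell
    # Per the commit, closed_captions / film_grain are only written to the
    # output when -analyze_frames is enabled; the values are then filled
    # from AVPacketSideData for frames decoded in the probed interval.
    ffprobe -analyze_frames -read_intervals "%+2" \
            -show_entries stream=index,codec_name,closed_captions,film_grain \
            input.ts
    ```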
  • VLC dead input for RTP stream

    27 March, by CaptainCheese

    I'm working on creating an RTP stream that's meant to display live waveform data from Pioneer ProLink players. The motivation for sending this video out is to be able to receive it in a Flutter frontend. I initially just sent a base-24 encoding of the raw ARGB packed ints per frame across a Kafka topic, but processing that data in Flutter proved untenable and bogged down the main UI thread. I'm not sure this is the most optimal way of going about this; I'm just trying to get anything to work if it means some speedup on the frontend. The issue with the following implementation is what happens when I run vlc --rtsp-timeout=120000 --network-caching=30000 -vvvv stream_1.sdp, where:

    % cat stream_1.sdp
    v=0
    o=- 0 1 IN IP4 127.0.0.1
    s=RTP Stream
    c=IN IP4 127.0.0.1
    t=0 0
    a=tool:libavformat
    m=video 5007 RTP/AVP 96
    a=rtpmap:96 H264/90000

    I see (among other questionable logs) the following:

    [0000000144c44d10] live555 demux error: no data received in 10s, aborting
    [00000001430ee2f0] main input debug: EOF reached
    [0000000144b160c0] main decoder debug: killing decoder fourcc `h264'
    [0000000144b160c0] main decoder debug: removing module "videotoolbox"
    [0000000144b164a0] main packetizer debug: removing module "h264"
    [0000000144c44d10] main demux debug: removing module "live555"
    [0000000144c45bb0] main stream debug: removing module "record"
    [0000000144a64960] main stream debug: removing module "cache_read"
    [0000000144c29c00] main stream debug: removing module "filesystem"
    [00000001430ee2f0] main input debug: Program doesn't contain anymore ES
    [0000000144806260] main playlist debug: dead input
    [0000000144806260] main playlist debug: changing item without a request (current 0/1)
    [0000000144806260] main playlist debug: nothing to play
    [0000000142e083c0] macosx interface debug: Playback has been ended
    [0000000142e083c0] macosx interface debug: Releasing IOKit system sleep blocker (37463)

    This is sort of confusing because when I run ffmpeg -protocol_whitelist file,crypto,data,rtp,udp -i stream_1.sdp -vcodec libx264 -f null -
    I see a number of logs like:

    [h264 @ 0x139304080] non-existing PPS 0 referenced
        Last message repeated 1 times
    [h264 @ 0x139304080] decode_slice_header error
    [h264 @ 0x139304080] no frame!

    After which I see the stream is received and I start getting telemetry on it:

    Input #0, sdp, from 'stream_1.sdp':
      Metadata:
        title           : RTP Stream
      Duration: N/A, start: 0.016667, bitrate: N/A
      Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1200x200, 60 fps, 60 tbr, 90k tbn
    Stream mapping:
      Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    [libx264 @ 0x107f04f40] using cpu capabilities: ARMv8 NEON
    [libx264 @ 0x107f04f40] profile High, level 3.1, 4:2:0, 8-bit
    Output #0, null, to 'pipe:':
      Metadata:
        title           : RTP Stream
        encoder         : Lavf61.7.100
      Stream #0:0: Video: h264, yuv420p(tv, progressive), 1200x200, q=2-31, 60 fps, 60 tbn
          Metadata:
            encoder         : Lavc61.19.101 libx264
          Side data:
            cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
    [out#0/null @ 0x60000069c000] video:144KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
    frame= 1404 fps= 49 q=-1.0 Lsize=N/A time=00:00:23.88 bitrate=N/A speed=0.834x

    Not sure why VLC is turning me down like some kind of Berghain bouncer that lets nobody in the entire night.


    I initially tried just converting the ARGB ints to a YUV420p buffer and using that to create the Frame objects, but I couldn't for the life of me figure out how to initialize it properly; my attempts kept spitting out garbled junk.


    Please go easy on me; I've made an unhealthy habit of resolving nearly all of my coding questions by simply lurking the internet for answers, but that's not really helping me solve this issue.


    Here's the Java I'm working on (the meat of the RTP comms occurs within updateWaveformForPlayer()):

    package com.bugbytz.prolink;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.bytedeco.ffmpeg.global.avcodec;
    import org.bytedeco.ffmpeg.global.avutil;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.FFmpegLogCallback;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameGrabber;
    import org.deepsymmetry.beatlink.CdjStatus;
    import org.deepsymmetry.beatlink.DeviceAnnouncement;
    import org.deepsymmetry.beatlink.DeviceAnnouncementAdapter;
    import org.deepsymmetry.beatlink.DeviceFinder;
    import org.deepsymmetry.beatlink.Util;
    import org.deepsymmetry.beatlink.VirtualCdj;
    import org.deepsymmetry.beatlink.data.BeatGridFinder;
    import org.deepsymmetry.beatlink.data.CrateDigger;
    import org.deepsymmetry.beatlink.data.MetadataFinder;
    import org.deepsymmetry.beatlink.data.TimeFinder;
    import org.deepsymmetry.beatlink.data.WaveformDetail;
    import org.deepsymmetry.beatlink.data.WaveformDetailComponent;
    import org.deepsymmetry.beatlink.data.WaveformFinder;

    import java.awt.*;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.nio.ByteBuffer;
    import java.text.DecimalFormat;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Properties;
    import java.util.Set;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.ScheduledFuture;
    import java.util.concurrent.TimeUnit;

    import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_RGB24;

    public class App {
        public static ArrayList<Track> tracks = new ArrayList<>();
        public static boolean dbRead = false;
        public static Properties props = new Properties();
        private static Map<Integer, FFmpegFrameRecorder> recorders = new HashMap<>();
        private static Map<Integer, Integer> frameCount = new HashMap<>();

        private static final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
        private static final int FPS = 60;
        private static final int FRAME_INTERVAL_MS = 1000 / FPS;

        private static Map<Integer, ScheduledFuture<?>> schedules = new HashMap<>();

        private static Set<Integer> streamingPlayers = new HashSet<>();

        public static String byteArrayToMacString(byte[] macBytes) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < macBytes.length; i++) {
                sb.append(String.format("%02X%s", macBytes[i], (i < macBytes.length - 1) ? ":" : ""));
            }
            return sb.toString();
        }

        private static void updateWaveformForPlayer(int player) throws Exception {
            Integer frame_for_player = frameCount.get(player);
            if (frame_for_player == null) {
                frame_for_player = 0;
                frameCount.putIfAbsent(player, frame_for_player);
            }

            if (!WaveformFinder.getInstance().isRunning()) {
                WaveformFinder.getInstance().start();
            }
            WaveformDetail detail = WaveformFinder.getInstance().getLatestDetailFor(player);

            if (detail != null) {
                WaveformDetailComponent component = (WaveformDetailComponent) detail.createViewComponent(
                        MetadataFinder.getInstance().getLatestMetadataFor(player),
                        BeatGridFinder.getInstance().getLatestBeatGridFor(player)
                );
                component.setMonitoredPlayer(player);
                component.setPlaybackState(player, TimeFinder.getInstance().getTimeFor(player), true);
                component.setAutoScroll(true);
                int width = 1200;
                int height = 200;
                Dimension dimension = new Dimension(width, height);
                component.setPreferredSize(dimension);
                component.setSize(dimension);
                component.setScale(1);
                component.doLayout();

                // Create a fresh BufferedImage and clear it before rendering
                BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
                Graphics2D g = image.createGraphics();
                g.clearRect(0, 0, width, height);  // Clear any old content

                // Draw waveform into the BufferedImage
                component.paint(g);
                g.dispose();

                int port = 5004 + player;
                String inputFile = port + "_" + frame_for_player + ".mp4";
                // Initialize the FFmpegFrameRecorder for YUV420P
                FFmpegFrameRecorder recorder_file = new FFmpegFrameRecorder(inputFile, width, height);
                FFmpegLogCallback.set();  // Enable FFmpeg logging for debugging
                recorder_file.setFormat("mp4");
                recorder_file.setVideoCodec(avcodec.AV_CODEC_ID_H264);
                recorder_file.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);  // Use YUV420P format directly
                recorder_file.setFrameRate(FPS);

                // Set video options
                recorder_file.setVideoOption("preset", "ultrafast");
                recorder_file.setVideoOption("tune", "zerolatency");
                recorder_file.setVideoOption("x264-params", "repeat-headers=1");
                recorder_file.setGopSize(FPS);
                try {
                    recorder_file.start();  // Ensure this is called before recording any frames
                    System.out.println("Recorder started successfully for player: " + player);
                } catch (org.bytedeco.javacv.FFmpegFrameRecorder.Exception e) {
                    e.printStackTrace();
                }

                // Get all pixels in one call
                int[] pixels = new int[width * height];
                image.getRGB(0, 0, width, height, pixels, 0, width);
                recorder_file.recordImage(width, height, Frame.DEPTH_UBYTE, 1, 3 * width, AV_PIX_FMT_RGB24, ByteBuffer.wrap(argbToByteArray(pixels, width, height)));
                recorder_file.stop();
                recorder_file.release();
                final FFmpegFrameRecorder recorder = recorders.get(player);
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputFile);

                try {
                    grabber.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                if (recorder == null) {
                    try {
                        String outputStream = "rtp://127.0.0.1:" + port;
                        FFmpegFrameRecorder initial_recorder = new FFmpegFrameRecorder(outputStream, grabber.getImageWidth(), grabber.getImageHeight());
                        initial_recorder.setFormat("rtp");
                        initial_recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
                        initial_recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
                        initial_recorder.setFrameRate(grabber.getFrameRate());
                        initial_recorder.setGopSize(FPS);
                        initial_recorder.setVideoOption("x264-params", "keyint=60");
                        initial_recorder.setVideoOption("rtsp_transport", "tcp");
                        initial_recorder.start();
                        recorders.putIfAbsent(player, initial_recorder);
                        frameCount.putIfAbsent(player, 0);
                        putToRTP(player, grabber, initial_recorder);
                    }
                    catch (Exception e) {
                        e.printStackTrace();
                    }
                }
                else {
                    putToRTP(player, grabber, recorder);
                }
                File file = new File(inputFile);
                if (file.exists() && file.delete()) {
                    System.out.println("Successfully deleted file: " + inputFile);
                } else {
                    System.out.println("Failed to delete file: " + inputFile);
                }
            }
        }

        public static void putToRTP(int player, FFmpegFrameGrabber grabber, FFmpegFrameRecorder recorder) throws FrameGrabber.Exception {
            final Frame frame = grabber.grabFrame();
            int frameCount_local = frameCount.get(player);
            frame.keyFrame = frameCount_local++ % FPS == 0;
            frameCount.put(player, frameCount_local);
            try {
                recorder.record(frame);
            } catch (FFmpegFrameRecorder.Exception e) {
                throw new RuntimeException(e);
            }
        }

        public static byte[] argbToByteArray(int[] argb, int width, int height) {
            int totalPixels = width * height;
            byte[] byteArray = new byte[totalPixels * 3];  // 3 bytes per pixel (RGB, alpha dropped)

            for (int i = 0; i < totalPixels; i++) {
                int argbPixel = argb[i];

                byteArray[i * 3] = (byte) ((argbPixel >> 16) & 0xFF);      // Red
                byteArray[i * 3 + 1] = (byte) ((argbPixel >> 8) & 0xFF);   // Green
                byteArray[i * 3 + 2] = (byte) (argbPixel & 0xFF);          // Blue
            }

            return byteArray;
        }


        public static void main(String[] args) throws Exception {
            VirtualCdj.getInstance().setDeviceNumber((byte) 4);
            CrateDigger.getInstance().addDatabaseListener(new DBService());
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "com.bugbytz.prolink.CustomSerializer");
            props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "20971520");

            VirtualCdj.getInstance().addUpdateListener(update -> {
                if (update instanceof CdjStatus) {
                    try (Producer<String, DeviceStatus> producer = new KafkaProducer<>(props)) {
                        DecimalFormat df_obj = new DecimalFormat("#.##");
                        DeviceStatus deviceStatus = new DeviceStatus(
                                update.getDeviceNumber(),
                                ((CdjStatus) update).isPlaying() || !((CdjStatus) update).isPaused(),
                                ((CdjStatus) update).getBeatNumber(),
                                update.getBeatWithinBar(),
                                Double.parseDouble(df_obj.format(update.getEffectiveTempo())),
                                Double.parseDouble(df_obj.format(Util.pitchToPercentage(update.getPitch()))),
                                update.getAddress().getHostAddress(),
                                byteArrayToMacString(DeviceFinder.getInstance().getLatestAnnouncementFrom(update.getDeviceNumber()).getHardwareAddress()),
                                ((CdjStatus) update).getRekordboxId(),
                                update.getDeviceName()
                        );
                        ProducerRecord<String, DeviceStatus> record = new ProducerRecord<>("device-status", "device-" + update.getDeviceNumber(), deviceStatus);
                        try {
                            producer.send(record).get();
                        } catch (InterruptedException ex) {
                            throw new RuntimeException(ex);
                        } catch (ExecutionException ex) {
                            throw new RuntimeException(ex);
                        }
                        producer.flush();
                        if (!WaveformFinder.getInstance().isRunning()) {
                            try {
                                WaveformFinder.getInstance().start();
                            } catch (Exception ex) {
                                throw new RuntimeException(ex);
                            }
                        }
                    }
                }
            });
            DeviceFinder.getInstance().addDeviceAnnouncementListener(new DeviceAnnouncementAdapter() {
                @Override
                public void deviceFound(DeviceAnnouncement announcement) {
                    if (!streamingPlayers.contains(announcement.getDeviceNumber())) {
                        streamingPlayers.add(announcement.getDeviceNumber());
                        schedules.putIfAbsent(announcement.getDeviceNumber(), scheduler.scheduleAtFixedRate(() -> {
                            try {
                                Runnable task = () -> {
                                    try {
                                        updateWaveformForPlayer(announcement.getDeviceNumber());
                                    } catch (InterruptedException e) {
                                        System.out.println("Thread interrupted");
                                    } catch (Exception e) {
                                        throw new RuntimeException(e);
                                    }
                                    System.out.println("Lambda thread work completed!");
                                };
                                task.run();
                            } catch (Exception e) {
                                e.printStackTrace();
                            }
                        }, 0, FRAME_INTERVAL_MS, TimeUnit.MILLISECONDS));
                    }
                }

                @Override
                public void deviceLost(DeviceAnnouncement announcement) {
                    if (streamingPlayers.contains(announcement.getDeviceNumber())) {
                        schedules.get(announcement.getDeviceNumber()).cancel(true);
                        streamingPlayers.remove(announcement.getDeviceNumber());
                    }
                }
            });
            BeatGridFinder.getInstance().start();
            MetadataFinder.getInstance().start();
            VirtualCdj.getInstance().start();
            TimeFinder.getInstance().start();
            DeviceFinder.getInstance().start();
            CrateDigger.getInstance().start();

            try {
                LoadCommandConsumer consumer = new LoadCommandConsumer("localhost:9092", "load-command-group");
                Thread consumerThread = new Thread(consumer::startConsuming);
                consumerThread.start();

                Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                    consumer.shutdown();
                    try {
                        consumerThread.join();
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }));
                Thread.sleep(60000);
            } catch (InterruptedException e) {
                System.out.println("Interrupted, exiting.");
            }
        }
    }
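    As a quick sanity check on the channel packing done in argbToByteArray above, the same bit-shifts can be reproduced with plain shell arithmetic (illustrative only; the pixel value is made up):

    ```shell
    # One ARGB pixel, 0xAARRGGBB: extract R, G, B exactly as the Java
    # helper does, dropping the alpha byte.
    px=$((0xFF112233))
    r=$(( (px >> 16) & 0xFF ))
    g=$(( (px >> 8) & 0xFF ))
    b=$(( px & 0xFF ))
    printf 'R=%02x G=%02x B=%02x\n' "$r" "$g" "$b"   # prints R=11 G=22 B=33
    ```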

  • Android ffmpeg. Linker command failed with exit code 1

    8 June 2017, by Wilsom Sanders

    I’m trying to build ffmpeg static libraries and link them into an Android project in order to implement a video player. The goal is a player capable of receiving video files from various sources, similar to p2p file-sharing networks. (The target is API level 21, one level short of the supposed official solution with MediaSource.)

    I managed to compile ffmpeg from this repo (all the code used as-is), but later I got stuck on a linking problem. Whenever I try to compile, I get a list of the ffmpeg methods called in my code and a short, eloquent message:

    Linker command failed with exit code 1

    I have no clue how to pass the -v flag to the linker in Android Studio. It would be great if somebody could give me a hint on that.
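    One possible way to get verbose linker output in a CMake-based build (an assumption added here for illustration, not something from the original post) is to forward the flag through CMake's standard linker-flags variable:

    ```shell
    # Sketch: pass --verbose through to the NDK linker via
    # CMAKE_SHARED_LINKER_FLAGS (a standard CMake variable). From Gradle,
    # the same -D argument would go in externalNativeBuild > cmake > arguments.
    cmake -DANDROID_PLATFORM=android-16 \
          -DCMAKE_SHARED_LINKER_FLAGS="-Wl,--verbose" \
          /path/to/project
    ```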

    I use Android Studio and build with Gradle and CMake.

    Here are my files:
    build.gradle (Project)

    // Top-level build file where you can add configuration options common to all sub-projects/modules.

    buildscript {
       repositories {
           jcenter()
       }
       dependencies {
           classpath 'com.android.tools.build:gradle:2.3.1'

           // NOTE: Do not place your application dependencies here; they belong
           // in the individual module build.gradle files
       }
    }

    allprojects {
       repositories {
           jcenter()
       }
    }

    task clean(type: Delete) {
       delete rootProject.buildDir
    }

    build.gradle (Module)

    apply plugin: 'com.android.application'

       android {
           compileSdkVersion 25
           buildToolsVersion "25.0.3"
           productFlavors {
               x86 {
                   ndk {
                       abiFilter "x86"
                   }
               }
               arm {
                   ndk {
                       abiFilters "armeabi-v7a"
                   }
               }
               armv7 {
                   ndk {
                       abiFilters "armeabi-v7a"
                   }
               }
           }
           defaultConfig {
               applicationId "com.example.ledo.ndkapplication"
               minSdkVersion 22
               targetSdkVersion 25
               versionCode 1
               versionName "1.0"
               testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
               externalNativeBuild {
                   cmake {
                       cppFlags "-std=c++11 -frtti -fexceptions"
                       arguments '-DANDROID_PLATFORM=android-16'
                   }
               }
           }
           buildTypes {
               release {
                   minifyEnabled false
                   proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
               }
           }
           externalNativeBuild {
               cmake {
                   path "CMakeLists.txt"
               }
           }
           splits {
               abi {
                   enable true
                   reset()
                   include 'x86', 'armeabi-v7a'
                   universalApk true
               }
           }
       }

       dependencies {
           compile fileTree(dir: 'libs', include: ['*.jar'])
           androidTestCompile('com.android.support.test.espresso:espresso-core:2.2.2', {
               exclude group: 'com.android.support', module: 'support-annotations'
           })
           compile 'com.android.support:appcompat-v7:25.3.1'
           compile 'com.android.support.constraint:constraint-layout:1.0.2'
           compile 'com.android.support:design:25.3.1'
           compile 'com.writingminds:FFmpegAndroid:0.3.2'
           testCompile 'junit:junit:4.12'
       }

    CMakeLists.txt

    # For more information about using CMake with Android Studio, read the
    # documentation: https://d.android.com/studio/projects/add-native-code.html

    # Sets the minimum version of CMake required to build the native library.

    cmake_minimum_required(VERSION 2.8)

    # Creates and names a library, sets it as either STATIC
    # or SHARED, and provides the relative paths to its source code.
    # You can define multiple libraries, and CMake builds them for you.
    # Gradle automatically packages shared libraries with your APK.

    add_library( # Sets the name of the library.
                native-lib

                # Sets the library as a shared library.
                SHARED

                # Provides a relative path to your source file(s).
                src/main/cpp/native-lib.cpp
                src/main/cpp/NativePlayer.h
                src/main/cpp/NativePlayer.cpp)

    # Searches for a specified prebuilt library and stores the path as a
    # variable. Because CMake includes system libraries in the search path by
    # default, you only need to specify the name of the public NDK library
    # you want to add. CMake verifies that the library exists before
    # completing its build.

    find_library( # Sets the name of the path variable.
                 log-lib

                 # Specifies the name of the NDK library that
                 # you want CMake to locate.
                 log )
    find_library(png-lib png)

    # Specifies libraries CMake should link to your target library. You
    # can link multiple libraries, such as libraries you define in this
    # build script, prebuilt third-party libraries, or system libraries.

    #avcodec
    #avfilter
    #avformat
    #avutil
    #swresample
    #swscale

    #${ANDROID_ABI}

    message(${ANDROID_ABI})
    set(FFMPEG_ROOT_DIR src/main/libs/ffmpeg/${ANDROID_ABI})

    add_library(avcodec STATIC IMPORTED)
    add_library(avformat STATIC IMPORTED)
    add_library(avfilter STATIC IMPORTED)
    add_library(avutil STATIC IMPORTED)
    add_library(swresample STATIC IMPORTED)
    add_library(swscale STATIC IMPORTED)

    #SET_TARGET_PROPERTIES(avcodec PROPERTIES LINKER_LANGUAGE C)
    set_target_properties(avcodec PROPERTIES IMPORTED_LOCATION ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/lib/libavcodec.a)
    #SET_TARGET_PROPERTIES(avfilter PROPERTIES LINKER_LANGUAGE C)
    set_target_properties(avfilter PROPERTIES IMPORTED_LOCATION ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/lib/libavfilter.a)
    #SET_TARGET_PROPERTIES(avformat PROPERTIES LINKER_LANGUAGE C)
    set_target_properties(avformat PROPERTIES IMPORTED_LOCATION ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/lib/libavformat.a)
    #SET_TARGET_PROPERTIES(avutil PROPERTIES LINKER_LANGUAGE C)
    set_target_properties(avutil PROPERTIES IMPORTED_LOCATION ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/lib/libavutil.a)
    #SET_TARGET_PROPERTIES(swresample PROPERTIES LINKER_LANGUAGE C)
    set_target_properties(swresample PROPERTIES IMPORTED_LOCATION ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/lib/libswresample.a)
    #SET_TARGET_PROPERTIES(swscale PROPERTIES LINKER_LANGUAGE C)
    set_target_properties(swscale PROPERTIES IMPORTED_LOCATION ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/lib/libswscale.a)

    include_directories( ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/include )
    include_directories( ${CMAKE_CURRENT_SOURCE_DIR}/${FFMPEG_ROOT_DIR}/lib )
    target_link_libraries( # Specifies the target library.
                          native-lib

                          # Links the target library to the log library
                          # included in the NDK.
                          ${log-lib}
                          avcodec
                          avformat
                          #avfilter
                          #swresample
                          #swscale
                          #avutil
                          GLESv2)

    I have .a files in the following locations:
    %PROJECT_DIR%/app/src/libs/ffmpeg/armeabi-v7a
    %PROJECT_DIR%/app/src/libs/ffmpeg/armeabi-v7a-neon
    %PROJECT_DIR%/app/src/libs/ffmpeg/x86

    It doesn’t look to me like the linker is missing the files themselves. (I get different output if I misplace them.)