
Other articles (112)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Contribute to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use the SPIP translation interface, where all of MediaSPIP’s language modules are available. You simply need to subscribe to the translators’ mailing list to ask for more information.
    At the moment MediaSPIP is only available in French and (...)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including its exact version; as precise a description of the problem as possible; if possible, the steps that lead to the problem; and a link to the site/page in question.
    If you think you have solved a bug, open a ticket and attach a corrective patch to it.
    You may also (...)

On other sites (12907)

  • Date and segment comparison feature

    31 October 2019, by Matomo Core Team — Analytics Tips, Development

    Get a clearer picture with the date and segment comparison feature

    What can you do with it? What are the benefits?

    Make informed decisions faster by easily comparing different segments and dates with each other.

    Compare report data for multiple segments next to each other

    Segment comparison feature

    Directly compare the behaviour of visitors from different segments, e.g. customers with accounts vs. customers without accounts. Segment comparisons are a powerful way to compare different audiences, learn which ones perform better, and see how their actions differ.

    Compare report data for two time periods next to each other

    Comparing date ranges

    See how your website performs compared to the previous month/week/year, including trends over those periods. Perhaps your business always picks up at the same times of year, or every user segment but one shows a sag in business this year compared to last.

    By comparing date ranges you get a quick overview of trends and period-to-period performance. Did a campaign work better in September than in October? Get an instant answer with the side-by-side comparison in Matomo.

    What is it capable of?

    It lets you ask the question, “What is different?”

    Reports on their own only show how people behave overall, and individual segments only show behaviour at face value; comparing data side by side quickly shows you what makes each segment unique. That information is still there when you don’t use the comparison feature, it’s just buried. Comparing data highlights discrepancies and leads to important questions and answers.

    For example, perhaps some class of users has very low engagement on a specific day compared to the rest of your visitors, and perhaps those users account for an outsized share of churn.

    Who could benefit from it, and why?

    Everyone can benefit from using it (and probably should). It’s yours to experiment with! You shouldn’t feel restricted to comparing only the current and previous period, or feel you need a question in mind before you start comparing. Follow your instincts and see what stands out when data from different segments is laid out side by side.

    Where can you find it in Matomo?

    • Segment comparison is activated by the new icon in the segment selector
    Segment comparison feature
    • Date comparison can be found in the calendar section of Matomo
    Date comparison feature
    • The list of active comparisons is visible at the top of the page for all pages that support comparison
    • Comparisons are visible in every report that supports comparing data, and reports that do not support it will display a message saying so

    How do you use it?

    • To compare segments, click the icon in the segment selector
    • To compare periods, click the ‘compare’ checkbox in the period selector, then select what period you want to compare it against in the dropdown (previous period, previous year, or a custom range)
    • When comparisons are active, view your reports as normal

    Take it away!

    The comparison feature, new in Matomo 3.12.0, highlights discrepancies and differences in your data that can lead to more clarity and understanding, so we’d encourage everyone to use it.

    Try it out today in your Matomo and see the power behind this new data comparison mode!

  • How to properly close an FFmpeg stream and AVFormatContext without leaking memory?

    13 December 2019, by Darkwonder

    I have built an app that uses FFmpeg to connect to remote IP cameras in order to receive video and audio frames via RTSP 2.0.

    The app is built using Xcode 10-11 and Objective-C with a custom FFmpeg build config.

    The architecture is as follows:

    MyApp
        Document_0
            RTSPContainerObject_0
                RTSPObject_0
            RTSPContainerObject_1
                RTSPObject_1
            ...
        Document_1
        ...

    GOAL:

    1. After closing Document_0, no FFmpeg objects should be leaked.
    2. The closing process should stop frame reading and destroy all objects that use FFmpeg.

    PROBLEM:

    Somehow, Xcode’s memory debugger shows two instances of MyApp.

    FACTS:

    • macOS’s Activity Monitor doesn’t show two instances of MyApp.

    • macOS’s Activity Monitor doesn’t show any instances of FFmpeg or other child processes.

    • The issue is not related to leftover memory from a late memory snapshot, since it can be reproduced easily.

    • Xcode’s memory debugger shows that the second instance holds only RTSPObject's AVFormatContext and no other objects.

      1. The second instance has an AVFormatContext, and the RTSPObject still has a pointer to that AVFormatContext.

    FACTS:

    • Opening and closing the second document Document_1 leads to the same problem, with two more objects leaked. This means there is a bug that creates a scalable problem: more and more memory is used and becomes unavailable.

    Here is my termination code:

    - (void)terminate
    {
        // * Video and audio frame provisioning termination *
        [self stopVideoStream];
        [self stopAudioStream];
        // *

        // * Video codec termination *
        avcodec_free_context(&_videoCodecContext); // NULL pointer safe.
        self.videoCodecContext = NULL;
        // *

        // * Audio codec termination *
        avcodec_free_context(&_audioCodecContext); // NULL pointer safe.
        self.audioCodecContext = NULL;
        // *

        if (self.packet)
        {
            // Free the packet that was allocated by av_read_frame.
            av_packet_unref(_packet); // The documentation doesn't mention NULL safety.
            self.packet = NULL;
        }

        if (self.currentAudioPacket)
        {
            av_packet_unref(_currentAudioPacket);
            self.currentAudioPacket = NULL;
        }

        // Free raw frame data.
        av_freep(&_rawFrameData); // NULL pointer safe.

        // Free the swscaler context swsContext.
        self.isFrameConversionContextAllocated = NO;
        sws_freeContext(scallingContext); // NULL pointer safe.

        [self.audioPacketQueue removeAllObjects];
        self.audioPacketQueue = nil;
        self.audioPacketQueueLock = nil;
        self.packetQueueLock = nil;
        self.audioStream = nil;
        BXLogInDomain(kLogDomainSources, kLogLevelVerbose, @"%s:%d: All streams have been terminated!", __FUNCTION__, __LINE__);

        // * Session context termination *
        AVFormatContext *pFormatCtx = self.sessionContext;
        BOOL shouldProceedWithInputSessionTermination = self.isInputStreamOpen && self.shouldTerminateStreams && pFormatCtx;
        NSLog(@"\nTerminating session context...");
        if (shouldProceedWithInputSessionTermination)
        {
            NSLog(@"\nTerminating...");
            //av_write_trailer(pFormatCtx);
            // Discard all internally buffered data.
            avformat_flush(pFormatCtx); // The documentation doesn't mention NULL safety.
            // Close an opened input AVFormatContext and free it and all its contents.
            // WARNING: Closing a non-opened stream will cause avformat_close_input to crash.
            avformat_close_input(&pFormatCtx); // The documentation doesn't mention NULL safety.
            NSLog(@"Logging leftovers - %p, %p  %p", self.sessionContext, _sessionContext, pFormatCtx);
            avformat_free_context(pFormatCtx);

            NSLog(@"Logging content = %c", *self.sessionContext);
            //avformat_free_context(pFormatCtx); - Not needed because avformat_close_input is closing it.
            self.sessionContext = NULL;
        }
        // *
    }

    IMPORTANT: The termination sequence is:

       New frame will be read.
    -[(RTSPObject)StreamInput currentVideoFrameDurationSec]
    -[(RTSPObject)StreamInput frameDuration:]
    -[(RTSPObject)StreamInput currentCGImageRef]
    -[(RTSPObject)StreamInput convertRawFrameToRGB]
    -[(RTSPObject)StreamInput pixelBufferFromImage:]
    -[(RTSPObject)StreamInput cleanup]
    -[(RTSPObject)StreamInput dealloc]
    -[(RTSPObject)StreamInput stopVideoStream]
    -[(RTSPObject)StreamInput stopAudioStream]

    Terminating session context...
    Terminating...
    Logging leftovers - 0x109ec6400, 0x109ec6400  0x109ec6400
    Logging content = \330
    -[Document dealloc]

    NOT WORKING SOLUTIONS:

    • Changing the order of object releases (freeing the AVFormatContext first did not lead to any change).
    • Calling RTSPObject's cleanup method much sooner, to give FFmpeg more time to handle object releases.
    • Reading a lot of SO answers and the FFmpeg documentation to find a clean cleanup process, or newer code that might highlight why the objects are not released properly.

    I am currently reading the documentation on AVFormatContext, since I believe that I am forgetting to release something. This belief is based on the memory debugger’s output, which shows that the AVFormatContext is still around.
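    One detail worth keeping in mind: avformat_close_input(), av_packet_free(), avcodec_free_context() and similar FFmpeg functions take a pointer-to-pointer precisely so they can both free the object and NULL the caller's pointer. That is why the extra avformat_free_context(pFormatCtx) after avformat_close_input(&pFormatCtx) is a no-op rather than a double free: close_input has already cleared the local pFormatCtx. The convention can be sketched in plain C (free_and_clear is a hypothetical helper, not an FFmpeg function):

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical helper mirroring the convention used by av_freep()
     * and avformat_close_input(): take a pointer-to-pointer, free the
     * object, and NULL the caller's pointer so that a second call is a
     * harmless no-op and no dangling pointer is left behind. */
    static void free_and_clear(void **p)
    {
        if (p && *p) {
            free(*p);
            *p = NULL;
        }
    }

    int main(void)
    {
        char *buf = malloc(16);
        free_and_clear((void **)&buf); /* frees the buffer, buf becomes NULL */
        free_and_clear((void **)&buf); /* safe no-op: buf is already NULL */
        puts(buf == NULL ? "cleared" : "dangling");
        return 0;
    }
    ```

    Note that only the pointer passed by address gets cleared; any other copy of that pointer (here, the sessionContext property) keeps dangling until it is set to NULL by hand, which matches what the NSLog of leftovers shows.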

    Here is my creation code:

    #pragma mark # Helpers - Start

    - (NSError *)openInputStreamWithVideoStreamId:(int)videoStreamId
                                   audioStreamId:(int)audioStreamId
                                        useFirst:(BOOL)useFirstStreamAvailable
                                          inInit:(BOOL)isInitProcess
    {
       // NSLog(@"%s", __PRETTY_FUNCTION__); // RTSP
       self.status = StreamProvisioningStatusStarting;
       AVCodec *decoderCodec;
       NSString *rtspURL = self.streamURL;
       NSString *errorMessage = nil;
       NSError *error = nil;

       self.sessionContext = NULL;
       self.sessionContext = avformat_alloc_context();

       AVFormatContext *pFormatCtx = self.sessionContext;
       if (!pFormatCtx)
       {
           // Create approp error.
           return error;
       }


       // MUST be called before avformat_open_input().
       av_dict_free(&_sessionOptions);

       self.sessionOptions = 0;
       if (self.usesTcp)
       {
           // "rtsp_transport" - Set RTSP transport protocols.
           // Allowed are: udp_multicast, tcp, udp, http.
           av_dict_set(&_sessionOptions, "rtsp_transport", "tcp", 0);
       }

       // Open an input stream and read the header with the demuxer options.
       // WARNING: The stream must be closed with avformat_close_input()
       if (avformat_open_input(&pFormatCtx, rtspURL.UTF8String, NULL, &_sessionOptions) != 0)
       {
           // WARNING: Note that a user-supplied AVFormatContext (pFormatCtx) will be freed on failure.
           self.isInputStreamOpen = NO;
           // Create approp error.
           return error;
       }

       self.isInputStreamOpen = YES;

       // user-supplied AVFormatContext pFormatCtx might have been modified.
       self.sessionContext = pFormatCtx;

       // Retrieve stream information.
       if (avformat_find_stream_info(pFormatCtx,NULL) < 0)
       {
           // Create approp error.
           return error;
       }

       // Find the first video stream
       int streamCount = pFormatCtx->nb_streams;

       if (streamCount == 0)
       {
           // Create approp error.
           return error;
       }

       int noStreamsAvailable = pFormatCtx->streams == NULL;

       if (noStreamsAvailable)
       {
           // Create approp error.
           return error;
       }

       // Result. An Index can change, an identifier shouldn't.
       self.selectedVideoStreamId = STREAM_NOT_FOUND;
       self.selectedAudioStreamId = STREAM_NOT_FOUND;

       // Fallback.
       int firstVideoStreamIndex = STREAM_NOT_FOUND;
       int firstAudioStreamIndex = STREAM_NOT_FOUND;

       self.selectedVideoStreamIndex = STREAM_NOT_FOUND;
       self.selectedAudioStreamIndex = STREAM_NOT_FOUND;

       for (int i = 0; i < streamCount; i++)
       {
           // Looking for video streams.
           AVStream *stream = pFormatCtx->streams[i];
           if (!stream) { continue; }
           AVCodecParameters *codecPar = stream->codecpar;
           if (!codecPar) { continue; }

           if (codecPar->codec_type==AVMEDIA_TYPE_VIDEO)
           {
               if (stream->id == videoStreamId)
               {
                   self.selectedVideoStreamId = videoStreamId;
                   self.selectedVideoStreamIndex = i;
               }

               if (firstVideoStreamIndex == STREAM_NOT_FOUND)
               {
                   firstVideoStreamIndex = i;
               }
           }
           // Looking for audio streams.
           if (codecPar->codec_type==AVMEDIA_TYPE_AUDIO)
           {
               if (stream->id == audioStreamId)
               {
                   self.selectedAudioStreamId = audioStreamId;
                   self.selectedAudioStreamIndex = i;
               }

               if (firstAudioStreamIndex == STREAM_NOT_FOUND)
               {
                   firstAudioStreamIndex = i;
               }
           }
       }

       // Use first video and audio stream available (if possible).

       if (self.selectedVideoStreamIndex == STREAM_NOT_FOUND && useFirstStreamAvailable && firstVideoStreamIndex != STREAM_NOT_FOUND)
       {
           self.selectedVideoStreamIndex = firstVideoStreamIndex;
           self.selectedVideoStreamId = pFormatCtx->streams[firstVideoStreamIndex]->id;
       }

       if (self.selectedAudioStreamIndex == STREAM_NOT_FOUND && useFirstStreamAvailable && firstAudioStreamIndex != STREAM_NOT_FOUND)
       {
           self.selectedAudioStreamIndex = firstAudioStreamIndex;
           self.selectedAudioStreamId = pFormatCtx->streams[firstAudioStreamIndex]->id;
       }

       if (self.selectedVideoStreamIndex == STREAM_NOT_FOUND)
       {
           // Create approp error.
           return error;
       }

       // See AVCodecID for codec listing.

       // * Video codec setup:
       // 1. Find the decoder for the video stream with the given codec id.
       AVStream *stream = pFormatCtx->streams[self.selectedVideoStreamIndex];
       if (!stream)
       {
           // Create approp error.
           return error;
       }
       AVCodecParameters *codecPar = stream->codecpar;
       if (!codecPar)
       {
           // Create approp error.
           return error;
       }

       decoderCodec = avcodec_find_decoder(codecPar->codec_id);
       if (decoderCodec == NULL)
       {
           // Create approp error.
           return error;
       }

       // Get a pointer to the codec context for the video stream.
       // WARNING: The resulting AVCodecContext should be freed with avcodec_free_context().
       // Replaced:
       // self.videoCodecContext = pFormatCtx->streams[self.selectedVideoStreamIndex]->codec;
       // With:
       self.videoCodecContext = avcodec_alloc_context3(decoderCodec);
       avcodec_parameters_to_context(self.videoCodecContext,
                                     codecPar);

       self.videoCodecContext->thread_count = 4;
       NSString *description = [NSString stringWithUTF8String:decoderCodec->long_name];

       // 2. Open codec.
       if (avcodec_open2(self.videoCodecContext, decoderCodec, NULL) < 0)
       {
           // Create approp error.
           return error;
       }

       // * Audio codec setup:
       if (self.selectedAudioStreamIndex > -1)
       {
           [self setupAudioDecoder];
       }

       // Allocate a raw video frame data structure. Contains audio and video data.
       self.rawFrameData = av_frame_alloc();

       self.outputWidth = self.videoCodecContext->width;
       self.outputHeight = self.videoCodecContext->height;

       if (!isInitProcess)
       {
           // Triggering notifications in the init process won't change the UI, since the object is created locally. All
           // objects which need data access to this object will not be able to get it. That's why we don't notify anyone about the changes.
           [NSNotificationCenter.defaultCenter postNotificationName:NSNotification.rtspVideoStreamSelectionChanged
                                                             object:nil userInfo: self.selectedVideoStream];

           [NSNotificationCenter.defaultCenter postNotificationName:NSNotification.rtspAudioStreamSelectionChanged
                                                             object:nil userInfo: self.selectedAudioStream];
       }

       return nil;
    }

    UPDATE 1

    The initial architecture allowed using any given thread, and most of the code below would run on the main thread. This was not acceptable, since opening the stream input can take several seconds, during which the main thread is blocked waiting for a network response inside FFmpeg. To solve this issue I implemented the following rules:

    • Creation and the initial setup are only allowed on the background_thread (see code snippet "1" below).
    • Changes are allowed on the current_thread(Any).
    • Termination is allowed on the current_thread(Any).

    After removing the main thread checks and the dispatch_asyncs to background threads, the leaking has stopped and I can’t reproduce the issue anymore:

    // Code that produces the issue.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
       // 1 - Create and do initial setup.
       // This block creates the issue.
       self.rtspObject = [[RTSPObject alloc] initWithURL: ... ];
       [self.rtspObject openInputStreamWithVideoStreamId: ...
                                           audioStreamId: ...
                                                useFirst: ...
                                                  inInit: ...];
    });
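    The thread-confinement idea from the update can be sketched in plain C with POSIX threads; blocking_open() below is a hypothetical stand-in for a slow call such as avformat_open_input() on an RTSP URL, not a real FFmpeg API:

    ```c
    #include <pthread.h>
    #include <stdio.h>

    /* Hypothetical stand-in for a blocking network open such as
     * avformat_open_input() on an RTSP URL, which can stall for seconds. */
    static void *blocking_open(void *arg)
    {
        (void)arg;
        /* ...blocking network I/O would happen here... */
        printf("opened on worker thread\n");
        return NULL;
    }

    int main(void)
    {
        pthread_t worker;
        /* Do the potentially slow open on a worker thread so the calling
         * (main/UI) thread is never blocked waiting on the network. */
        pthread_create(&worker, NULL, blocking_open, NULL);
        /* The main thread would stay free to service UI events here. */
        pthread_join(worker, NULL);
        printf("main thread stayed responsive\n");
        return 0;
    }
    ```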

    I still don’t understand why Xcode’s memory debugger says that this block is retained.

    Any advice or idea is welcome.

  • Android: FFmpeg (video creation) crashes with no exception when loading binaries on lower APIs (18 in my case) but works on newer ones

    6 May 2019, by Diego Perez

    I have an app that uses FFmpeg for video creation (the next lines are the relevant build.gradle dependency entries):

    //writingminds
    api 'com.writingminds:FFmpegAndroid:0.3.2'
    //JavaCV video
    api group: 'org.bytedeco', name: 'javacv', version: '1.4.4'
    api group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '4.0.1-1.4.4', classifier: 'android-arm'
    api group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '4.1-1.4.4', classifier: 'android-arm'

    My app works (and creates video) just fine on my phone, which runs a newer Android 8 version, but I’m having weird problems on my old API 18 tablet (where, as a note, I had to install multidex).

    The next lines are the main part of the FFmpeg video creation, where the binaries are loaded; in fact, loading the binaries is where the app crashes on my tablet, at the line "ffmpeg.loadBinary(new LoadBinaryResponseHandler()...".

    As you can see, I have a try/catch around the code where the app crashes, but it crashes with no apparent exception, as the catch blocks are never hit.

    public static String recordVideo(JSONObject objJSON) {

       String strReturn = Enum.Result.OK;

       try {
           fileName = objJSON.has("file_name") ? String.valueOf(objJSON.getString("file_name")) : "";
           videoPath = objJSON.has("video_path") ? String.valueOf(objJSON.getString("video_path")) : "";
       } catch (JSONException e) {
           ExceptionHandler.logException(e);
       }

       FFmpeg ffmpeg = FFmpeg.getInstance(ApplicationContext.get());
       try {
           ffmpeg.loadBinary(new LoadBinaryResponseHandler() {

               @Override
               public void onStart() {}

               @Override
               public void onFailure() {}

               @Override
               public void onSuccess() {}

               @Override
               public void onFinish() {}
           });
       } catch (FFmpegNotSupportedException e) {
           // Handle if FFmpeg is not supported by device
       } catch (Exception e) {

       }
    ...

    The next lines are the relevant part of the logcat, but I cannot figure out where the problem resides. Maybe an out-of-memory problem?

    Any help will be much appreciated.

    04-28 21:44:45.873 13743-13964/com.artandwords.thoughtoftheday A/libc: Fatal signal 11 (SIGSEGV) at 0x00000000 (code=1), thread 13964 (AsyncTask #4)
    04-28 21:44:45.973 144-144/? I/DEBUG: *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
    04-28 21:44:45.983 144-144/? I/DEBUG: Build fingerprint: 'asus/WW_epad/ME302C:4.3/JSS15Q/WW_epad-V5.0.21-20140701:user/release-keys'
    04-28 21:44:45.983 144-144/? I/DEBUG: Revision: '0'
    04-28 21:44:45.983 144-144/? I/DEBUG: pid: 13743, tid: 13964, name: AsyncTask #4  >>> com.artandwords.thoughtoftheday <<<
    04-28 21:44:45.983 144-144/? I/DEBUG: signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 00000000
    04-28 21:44:46.003 144-144/? I/DEBUG:     eax 00000000  ebx 000000c6  ecx 00000000  edx 00000000
    04-28 21:44:46.003 144-144/? I/DEBUG:     esi 00000e59  edi 00000000
    04-28 21:44:46.003 144-144/? I/DEBUG:     xcs 00000073  xds 0000007b  xes 0000007b  xfs 00000043  xss 0000007b
    04-28 21:44:46.003 144-144/? I/DEBUG:     eip 784ed378  ebp 2200ff0c  esp 2200fec4  flags 00210246
    04-28 21:44:46.003 144-144/? I/DEBUG: backtrace:
    04-28 21:44:46.003 144-144/? I/DEBUG:     #00  pc 00087378  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #01  pc 00085d0e  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #02  pc 00073328  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #03  pc 0006f7ff  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #04  pc 0006f3bf  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #05  pc 000b92de  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.003 144-144/? I/DEBUG:     #06  pc ffffffff  <unknown>
    04-28 21:44:46.003 144-144/? I/DEBUG:     #07  pc 001445aa  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.013 144-144/? I/DEBUG: stack:
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436850  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436854  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436858  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43685c  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436860  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436864  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436868  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43686c  3822676c  /system/lib/arm/libc.so
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436870  7b436a98  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436874  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436878  7b4368c8  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43687c  7b436a98  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436880  383003a0  /system/lib/arm/libdl.so
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436884  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436888  7b4368c8  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43688c  785aa5ab  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.013 144-144/? I/DEBUG:     #07  7b436890  7b4368a0  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436894  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b436898  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b43689c  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368a0  7b4368c8  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368a4  7b436890  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368a8  785aa59d  /system/lib/libhoudini.so.3.4.7.44914
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368ac  7b436a98  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368b0  7b437930  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368b4  220001d0  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368b8  7b436a70  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368bc  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368c0  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368c4  00000000  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368c8  7b436a88  
    04-28 21:44:46.013 144-144/? I/DEBUG:          7b4368cc  785f3141  /system/lib/libhoudini.so.3.4.7.44914


       --------- beginning of /dev/log/system
    04-28 21:44:46.063 450-470/? I/BootReceiver: Copying /data/tombstones/tombstone_03 to DropBox (SYSTEM_TOMBSTONE)
    04-28 21:44:46.063 450-13973/? W/ActivityManager:   Force finishing activity com.artandwords.thoughtoftheday/.activities.DisplayThoughtActivity
    04-28 21:44:46.073 145-862/? E/IMGSRV: :0: PVRDRMOpen: TP3, ret = 75
    04-28 21:44:46.093 450-13973/? E/JavaBinder: !!! FAILED BINDER TRANSACTION !!!
    04-28 21:44:46.093 450-483/? W/InputDispatcher: channel '21edd9e8 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity (server)' ~ Consumer closed input channel or an error occurred.  events=0x9
    04-28 21:44:46.093 450-483/? E/InputDispatcher: channel '21edd9e8 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity (server)' ~ Channel is unrecoverably broken and will be disposed!
    04-28 21:44:46.093 155-13945/? W/TimedEventQueue: Event 25 was not found in the queue, already cancelled?
    04-28 21:44:46.093 155-3134/? W/AudioFlinger: session id 324 not found for pid 155
    04-28 21:44:46.103 450-450/? W/InputDispatcher: Attempted to unregister already unregistered input channel '21edd9e8 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity (server)'
    04-28 21:44:46.103 450-755/? I/WindowState: WIN DEATH: Window{21b33a28 u0 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.DisplayThoughtActivity}
    04-28 21:44:46.103 450-13973/? W/ActivityManager: Exception thrown during pause
       android.os.TransactionTooLargeException
           at android.os.BinderProxy.transact(Native Method)
           at android.app.ApplicationThreadProxy.schedulePauseActivity(ApplicationThreadNative.java:642)
           at com.android.server.am.ActivityStack.startPausingLocked(ActivityStack.java:1007)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3905)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3837)
           at com.android.server.am.ActivityManagerService.handleAppCrashLocked(ActivityManagerService.java:8588)
           at com.android.server.am.ActivityManagerService.makeAppCrashingLocked(ActivityManagerService.java:8465)
           at com.android.server.am.ActivityManagerService.crashApplication(ActivityManagerService.java:9170)
           at com.android.server.am.ActivityManagerService.handleApplicationCrashInner(ActivityManagerService.java:8699)
           at com.android.server.am.NativeCrashListener$NativeCrashReporter.run(NativeCrashListener.java:86)
    04-28 21:44:46.103 450-450/? I/WindowState: WIN DEATH: Window{21edd9e8 u0 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.main.MainActivity}
    04-28 21:44:46.103 450-636/? I/WindowState: WIN DEATH: Window{21be0690 u0 com.artandwords.thoughtoftheday/com.artandwords.thoughtoftheday.activities.DisplayThoughtActivity}
    04-28 21:44:46.113 450-13973/? I/WindowManager: computeScreenConfigurationLocked() Enter {1.15 ?mcc?mnc ?locale ?layoutDir ?swdp ?wdp ?hdp ?density ?lsize ?long ?orien ?uimode ?night ?touch ?keyb/?/? ?nav/?}
    04-28 21:44:46.113 450-13973/? I/WindowManager: dw=1200, dh=1920
    04-28 21:44:46.113 450-13973/? I/WindowManager: appWidth=1200, appHeight=1848
    04-28 21:44:46.113 450-13973/? I/WindowManager: tempdm=DisplayMetrics{density=1.5, width=1200, height=1848, scaledDensity=1.5, xdpi=221.201, ydpi=220.591}
    04-28 21:44:46.113 450-13973/? I/WindowManager: dm=DisplayMetrics{density=1.5, width=1200, height=1848, scaledDensity=1.5, xdpi=221.201, ydpi=220.591}, ro.product.device=ME302C
    04-28 21:44:46.113 450-13973/? I/WindowManager: getConfigDisplayWidth=1200, getConfigDisplayHeight=1810
    04-28 21:44:46.113 450-13973/? I/WindowManager: screenWidthDp=800, screenHeightDp=1206
    04-28 21:44:46.113 450-13973/? I/WindowManager: computeScreenConfigurationLocked() Leave {1.15 ?mcc?mnc ?locale ?layoutDir sw800dp w800dp h1206dp 240dpi xlrg port ?uimode ?night finger -keyb/v/h -nav/h}
    04-28 21:44:46.113 450-13973/? I/ActivityManager: Restarting because process died: ActivityRecord{21ab1f80 u0 com.artandwords.thoughtoftheday/.activities.main.MainActivity}
    04-28 21:44:46.113 450-13973/? W/ActivityManager: Exception when starting activity com.artandwords.thoughtoftheday/.activities.main.MainActivity
       android.os.DeadObjectException
           at android.os.BinderProxy.transact(Native Method)
           at android.app.ApplicationThreadProxy.scheduleLaunchActivity(ApplicationThreadNative.java:730)
           at com.android.server.am.ActivityStack.realStartActivityLocked(ActivityStack.java:733)
           at com.android.server.am.ActivityStack.startSpecificActivityLocked(ActivityStack.java:840)
           at com.android.server.am.ActivityStack.resumeTopActivityLocked(ActivityStack.java:1790)
           at com.android.server.am.ActivityStack.resumeTopActivityLocked(ActivityStack.java:1449)
           at com.android.server.am.ActivityStack.startPausingLocked(ActivityStack.java:1058)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3905)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3837)
           at com.android.server.am.ActivityManagerService.handleAppCrashLocked(ActivityManagerService.java:8588)
           at com.android.server.am.ActivityManagerService.makeAppCrashingLocked(ActivityManagerService.java:8465)
           at com.android.server.am.ActivityManagerService.crashApplication(ActivityManagerService.java:9170)
           at com.android.server.am.ActivityManagerService.handleApplicationCrashInner(ActivityManagerService.java:8699)
           at com.android.server.am.NativeCrashListener$NativeCrashReporter.run(NativeCrashListener.java:86)
    04-28 21:44:46.123 450-13973/? W/ContextImpl: Calling a method in the system process without a qualified user: android.app.ContextImpl.startService:1396 com.android.server.am.ActivityStack.sendActivityBroadcastLocked:4923 com.android.server.am.ActivityStack.removeActivityFromHistoryLocked:4089 com.android.server.am.ActivityStack.removeHistoryRecordsForAppLocked:4346 com.android.server.am.ActivityManagerService.handleAppDiedLocked:3163
    04-28 21:44:46.123 450-13973/? I/ActivityManager: Start proc com.artandwords.thoughtoftheday for activity com.artandwords.thoughtoftheday/.activities.main.MainActivity: pid=13975

    Edit 1 :

    Still investigating. While debugging I stepped into the loadBinary method of FFmpeg.java, which I’ll paste below; the line triggering the crash is switch (CpuArchHelper.getCpuArch())

    @Override
    public void loadBinary(FFmpegLoadBinaryResponseHandler ffmpegLoadBinaryResponseHandler) throws FFmpegNotSupportedException {
       String cpuArchNameFromAssets = null;
       switch (CpuArchHelper.getCpuArch()) {
           case x86:
               Log.i("Loading FFmpeg for x86 CPU");
               cpuArchNameFromAssets = "x86";
               break;
           case ARMv7:
               Log.i("Loading FFmpeg for armv7 CPU");
               cpuArchNameFromAssets = "armeabi-v7a";
               break;
           case NONE:
               throw new FFmpegNotSupportedException("Device not supported");
       }

       if (!TextUtils.isEmpty(cpuArchNameFromAssets)) {
           ffmpegLoadLibraryAsyncTask = new FFmpegLoadLibraryAsyncTask(context, cpuArchNameFromAssets, ffmpegLoadBinaryResponseHandler);
           ffmpegLoadLibraryAsyncTask.execute();
       } else {
           throw new FFmpegNotSupportedException("Device not supported");
       }
    }

    I’ll keep on investigating...

    Edit 2 :

    Further debugging has just led me to the exact line where the app crashes, and it’s in CpuArchHelper.java from the FFmpeg library :

    The line causing the crash is the following :

    String archInfo = cpuNativeArchHelper.cpuArchFromJNI();

    and I cannot even step into cpuArchFromJNI() with F7, as the app just crashes.

    package com.github.hiteshsondhi88.libffmpeg;

    import android.os.Build;

    class CpuArchHelper {

       static CpuArch getCpuArch() {
           Log.d("Build.CPU_ABI : " + Build.CPU_ABI);
           // check if device is x86 or x86_64
           if (Build.CPU_ABI.equals(getx86CpuAbi()) || Build.CPU_ABI.equals(getx86_64CpuAbi())) {
               return CpuArch.x86;
           } else {
               // check if device is armeabi
               if (Build.CPU_ABI.equals(getArmeabiv7CpuAbi())) {
                   ArmArchHelper cpuNativeArchHelper = new ArmArchHelper();
                   String archInfo = cpuNativeArchHelper.cpuArchFromJNI();
                   // check if device is arm v7
                   if (cpuNativeArchHelper.isARM_v7_CPU(archInfo)) {
                       // check if device is neon
                       return CpuArch.ARMv7;
                   }
                   // check if device is arm64 which is supported by ARMV7
               } else if (Build.CPU_ABI.equals(getArm64CpuAbi())) {
                   return CpuArch.ARMv7;
               }
           }
           return CpuArch.NONE;
       }

       static String getx86CpuAbi() {
           return "x86";
       }

       static String getx86_64CpuAbi() {
           return "x86_64";
       }

       static String getArm64CpuAbi() {
           return "arm64-v8a";
       }

       static String getArmeabiv7CpuAbi() {
           return "armeabi-v7a";
       }
    }
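
    Since the crash happens inside the native cpuArchFromJNI() call, one idea I’m playing with is to derive the architecture without JNI at all, from the JVM’s os.arch property. This is only a debugging sketch of my own (the class name and the mapping are my guesses, not library code) :

    ```java
    // Debug sketch only: map the JVM's os.arch value to an Android ABI name
    // instead of calling the native cpuArchFromJNI() that crashes for me.
    // The class name and the mapping below are my own guesses, not library code.
    public class ArchFallback {

        static String normalizeArch(String osArch) {
            if (osArch == null) return "unknown";
            String arch = osArch.toLowerCase();
            if (arch.equals("aarch64") || arch.equals("arm64")) return "arm64-v8a";
            if (arch.startsWith("arm")) {
                // e.g. "armv7l" -> armeabi-v7a, plain "arm" -> armeabi
                return arch.contains("v7") ? "armeabi-v7a" : "armeabi";
            }
            if (arch.equals("x86_64") || arch.equals("amd64")) return "x86_64";
            if (arch.equals("x86") || arch.equals("i686") || arch.equals("i386")) return "x86";
            return "unknown";
        }

        public static void main(String[] args) {
            System.out.println(normalizeArch(System.getProperty("os.arch")));
        }
    }
    ```

    I don’t know yet if the library would accept the ABI string obtained this way, but at least it lets me test the mapping off-device.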

    This is ArmArchHelper.java class :

    package com.github.hiteshsondhi88.libffmpeg;

    class ArmArchHelper {
       static {
           System.loadLibrary("ARM_ARCH");
       }

       native String cpuArchFromJNI();

       boolean isARM_v7_CPU(String cpuInfoString) {
           return cpuInfoString.contains("v7");
       }

       boolean isNeonSupported(String cpuInfoString) {
           // check cpu arch for loading correct ffmpeg lib
           return cpuInfoString.contains("-neon");
       }

    }
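
    To sanity-check the branching without the device, I rewrote the same decision tree as a pure-Java function where Build.CPU_ABI and the JNI result are passed in as parameters, so it runs in a plain JVM (names here are mine, just for debugging; the "v7" check mirrors isARM_v7_CPU above) :

    ```java
    // Debug sketch: the same decision tree as CpuArchHelper.getCpuArch(), with
    // Build.CPU_ABI and the JNI string passed in as parameters so it can run
    // in a plain JVM. Names are my own, not library code.
    public class CpuArchDecision {

        enum CpuArch { x86, ARMv7, NONE }

        static CpuArch decide(String cpuAbi, String jniArchInfo) {
            if ("x86".equals(cpuAbi) || "x86_64".equals(cpuAbi)) {
                return CpuArch.x86;
            }
            if ("armeabi-v7a".equals(cpuAbi)) {
                // the device is only accepted when the native probe reports a v7 core
                return jniArchInfo != null && jniArchInfo.contains("v7")
                        ? CpuArch.ARMv7 : CpuArch.NONE;
            }
            if ("arm64-v8a".equals(cpuAbi)) {
                return CpuArch.ARMv7; // arm64 devices can run the armv7 binary
            }
            return CpuArch.NONE;
        }

        public static void main(String[] args) {
            System.out.println(decide("armeabi-v7a", "v7-neon"));
        }
    }
    ```

    Running it with the values I expect on my tablet confirms the branching itself is fine, so the problem really seems to be inside the native call.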

    Edit 3 :

    Reading the LogCat carefully, I’ve noticed there is a TransactionTooLargeException :

    04-28 21:44:46.103 450-13973/? W/ActivityManager: Exception thrown during pause
       android.os.TransactionTooLargeException
           at android.os.BinderProxy.transact(Native Method)
           at android.app.ApplicationThreadProxy.schedulePauseActivity(ApplicationThreadNative.java:642)
           at com.android.server.am.ActivityStack.startPausingLocked(ActivityStack.java:1007)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3905)
           at com.android.server.am.ActivityStack.finishActivityLocked(ActivityStack.java:3837)
           at com.android.server.am.ActivityManagerService.handleAppCrashLocked(ActivityManagerService.java:8588)
           at com.android.server.am.ActivityManagerService.makeAppCrashingLocked(ActivityManagerService.java:8465)
           at com.android.server.am.ActivityManagerService.crashApplication(ActivityManagerService.java:9170)
           at com.android.server.am.ActivityManagerService.handleApplicationCrashInner(ActivityManagerService.java:8699)
           at com.android.server.am.NativeCrashListener$NativeCrashReporter.run(NativeCrashListener.java:86)

    Maybe this is what makes my app crash, but I don’t know what to do about it, as it happens inside the FFmpeg library :s
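
    From what I’ve read, the Binder transaction buffer is roughly 1 MB per process, so sending a big payload (for example a large byte[] inside an Intent or Bundle) can trigger exactly this exception. If that turns out to be my cause, a common workaround is to write the payload to a file and pass only its path across the process boundary. A minimal sketch of my own (plain Java, names are mine, not from the library) :

    ```java
    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    // Workaround sketch (my own, not from the library): instead of parceling a
    // large byte[] through the ~1 MB Binder transaction buffer, write it to a
    // file and hand only the short path string across the process boundary.
    public class PayloadHandoff {

        static Path stashPayload(byte[] payload) {
            try {
                Path file = Files.createTempFile("payload-", ".bin");
                Files.write(file, payload);
                return file; // put file.toString() in the Intent, not the bytes
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }

        static byte[] loadPayload(Path file) {
            try {
                return Files.readAllBytes(file);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }

        public static void main(String[] args) {
            byte[] big = new byte[2_000_000]; // would overflow a Binder parcel
            Path p = stashPayload(big);
            System.out.println(loadPayload(p).length);
        }
    }
    ```

    I’m not sure yet whether the oversized transaction originates in my code or inside the library, so this may not apply.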

    Edit 4 :

    As a note, I’ve tried on an old Android 4.2.2 (API 17) phone I also own, and it works just fine : the video is generated without crashing.