Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (51)

  • Other interesting software

    13 April 2011, by

    We don’t claim to be the only ones doing what we do, and we certainly don’t claim to be the best at it either. We simply try to do what we do well, and to keep getting better.
    The following list presents software that is more or less similar to MediaSPIP, or whose goals MediaSPIP more or less shares.
    We don’t know these projects and we haven’t tried them, but you can take a peek.
    Videopress
    Website : http://videopress.com/
    License : GNU/GPL v2
    Source code : (...)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into another language, which allows MediaSPIP to spread to new linguistic communities.
    To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (7341)

  • Issues with processing media on Windows Azure

    23 September 2015, by Ahmed Mujtaba

    I have a website built on ASP.NET Web Forms that works as a media portal for users to upload videos. I’m using ffmpeg encoders to produce video content to be streamed in the browser, and I’m using the Web Deploy method to publish the site on the Azure server. The website gets deployed properly; however, I get the following issues on the live site.

    1. The video never gets encoded and published; I get some sort of error.

    2. The video gets published, but the process of uploading and encoding it is way too slow on the web server.

    My project solution contains upload.ashx, which handles the upload requests and calls encode.ashx, which is responsible for encoding and publishing the videos. I tried to debug the site remotely, but the debugger never gets to encode.ashx.

    I was wondering if these issues could be resolved by deploying the website on a VM?

    Script that uploads the video file:

    var filesuploaded = 0;
       var faileduploaded = 0;

       $(function () {
           var uploader = new plupload.Uploader({
               runtimes: 'gears,html5,flash,silverlight,browserplus',
               browse_button: '<%= pickfiles.ClientID %>',
               container: 'container',
               max_file_size: '<%= MaxMediaSize %>mb',
               url: '<%=Config.GetUrl() %>videos/upload/upload.ashx',
               flash_swf_url: '<%=Config.GetUrl() %>plupload/js/plupload.flash.swf',
               silverlight_xap_url: '<%=Config.GetUrl() %>plupload/js/plupload.silverlight.xap',
               chunk_size: '4mb',
               <%= UniqueNames %>
               filters: [
               { title: '<%= AllowedFormatsDisplay %>', extensions: '<%= AllowedFormats %>'}],
               headers: { UName: '<%=UserName %>', MTP: '<%= MediaType %>' }
           });
           //uploader.bind('Init', function (up, params) {
           //    $('#filelist').html("<div>Current runtime: " + params.runtime + "</div>");
           //});

           uploader.init();

           $('#uploadfiles').click(function (e) {
               uploader.start();
               e.preventDefault();
               $("#uploadfiles").hide();
               $("#&lt;%= embd.ClientID %>").hide();
           });

           uploader.bind('FilesAdded', function (up, files) {
               $("#uploadfiles").show();
               $("#&lt;%= msg.ClientID %>").html("");
               var count=0;
               $.each(files, function (i, file) {
                   $('#filelist').append(
                       '<div class="item_pad_4 bx_br_bt">' + (count + 1) + ': ' + file.name + ' (' + plupload.formatSize(file.size) + ')  <b></b></div>' );
                   count++;
               });
            var maxupload = <%= MaxVideoUploads %>;
               if(count > maxupload)
               {              
                   $.each(files, function(i, file) {
                       uploader.removeFile(file);
                   });

                   $('#filelist').html("");
                   $("#uploadfiles").hide();
                   Display_Message("#&lt;%= msg.ClientID %>", "Can't upload more than " + maxupload + " records at once!", 1, 1);
                   return false;
               }
               else {
                   $("#tfiles").html(count);
                   $("#uploadfiles").removeClass("disabled");
                   $("#&lt;%= pickfiles.ClientID %>").hide();
               }
               up.refresh(); // Reposition Flash/Silverlight
           });

           uploader.bind('UploadProgress', function (up, file) {
               $('#' + file.id + " b").html(file.percent + "%");
           });

           uploader.bind('Error', function (up, err) {
               $('#filelist').append("<div>Error: " + err.code +
                   ", Message: " + err.message +
                   (err.file ? ", File: " + err.file.name : "") +
                   "</div>"
               );
               up.refresh(); // Reposition Flash/Silverlight
           });

           var failedstatus = 0;
           uploader.bind('FileUploaded', function (up, file, info) {
               // encode started
            if (info.response != "failed" && info.response != "") {
                   EncodeVD(file.id, info.response, file.size);
                   Display_Message('#' + file.id, "Please wait for final processing", 0, 1);
                   if (failedstatus == 0)
                       Redirect(info.response);
                   filesuploaded++;
               }
               else {
                   Display_Message('#' + file.id, "Response is: " + info.response, 0, 1);
               }
           });
       });
       var redcnt = 0;
       function Redirect(filename) {
           var IntervalID = setInterval(function () {
               redcnt++;
               if (redcnt > 2) {
                   clearInterval(IntervalID);
                   var tfiles = $("#tfiles").html();
                   if(tfiles == faileduploaded) { // break further processing all videos failed to upload
                   }
                   else if (filesuploaded >= tfiles) {
                    document.location = "<%=ConfirmPageUrl %>?fn=" + filename + "&gid=<%=GalleryID %>&uvids=" + $("#tfiles").html() + "&mpid=" + $("#maxpid").html().trim() + "<%=GroupParam %>";
                   }
           }
           }, 2000);
    }
    function EncodeVD(mid, mfile, msize) {
    var params = '<%= EncodingParams %>&fn=' + mfile;
           $.ajax({
               type: 'GET',
            url: '<%= Encoding_Handler_Path %>',
               data: params,
               async: true,
               success: function (msg) {
                   if (msg == "Success" || msg == "") {
                    $('#' + mid).html('<strong>Uploading Completed Successfully - Wait for Processing.</strong>');
                   }
                   else {
                       failedstatus = 1;
                       faileduploaded++;
                       Display_Message('#' + mid, "Response is: " + msg, 0, 1);
                   }
               }
           });
       }

    Server-side code for processing the file upload:

    private int MediaType = 0; // 0 : video, 1: audio

       public void ProcessRequest (HttpContext context) {
           try
           {
               context.Response.ContentType = "text/plain";
               context.Response.Write(ProcessMedia(context));
           }
           catch (Exception ex)
           {
               context.Response.Write("error|" + ex.Message);
           }
       }

       public string ProcessMedia(HttpContext context)
       {
           if (context.Request.Files.Count > 0)
           {
               int chunk = context.Request["chunk"] != null ? int.Parse(context.Request["chunk"]) : 0;
               string fileName = context.Request["name"] != null ? context.Request["name"] : string.Empty;
               //string _fileName = fileName.Remove(fileName.LastIndexOf(".")) + "-" + Guid.NewGuid().ToString().Substring(0, 6) + "" + fileName.Remove(0, fileName.LastIndexOf("."));
               HttpPostedFile fileUpload = context.Request.Files[0];

               string upath = "";
               if (context.Request.Headers["UName"] != null)
                   upath = context.Request.Headers["UName"].ToString();

            //if (CloudSettings.EnableCloudStorage && upath != "")
               //    _fileName = upath.Substring(0, 3) + "-" + _fileName; // avoid duplication in cloud storage

               if (context.Request.Headers["MTP"] != null)
                   MediaType = Convert.ToInt32(context.Request.Headers["MTP"]);

               //string extensions = "";
               //if (MediaType == 0)
               //    extensions = Site_Settings.Video_Allowed_Formats;
               //else
               //    extensions = Site_Settings.Audio_Allowed_Formats;

               //bool sts = UtilityBLL.Check_File_Extension(extensions, fileName.ToLower());
               //if (sts == false)
               //{
               //    return "Invalid format, please upload proper video!"; // Invalid video format, please upload proper video
               //}

               int allowable_size_mb = 0;
               if (MediaType == 0)
               {
                   allowable_size_mb = Site_Settings.Video_Max_Size;
               }
               else
               {
                   allowable_size_mb = Site_Settings.Audio_Max_Size;
               }
               int UploadSize = allowable_size_mb * 1000000;
               if (fileUpload.ContentLength > UploadSize)
               {
                   return "Video Limit Exceeds";
               }

               string uploadPath = "";
               // check whether audio / mp3 encoding enabled
               if (this.MediaType == 1)
               {
                   // audio encoding
                   if (fileName.EndsWith(".mp3"))
                   {
                       // upload mp3 directly in mp3 path instead of default path
                       if (upath == "")
                           uploadPath = UrlConfig.MP3_Path(); // source video path
                       else
                           uploadPath = UrlConfig.MP3_Path(upath); // source video path
                   }
                   else
                   {
                       // default path
                       if (upath == "")
                           uploadPath = UrlConfig.Source_Video_Path(); // source video path
                       else
                           uploadPath = UrlConfig.Source_Video_Path(upath); // source video path
                   }
               }
               else
               {//azure
                   // default path
                   if (upath == "")
                       uploadPath = UrlConfig.Source_Video_Path(); // source video path
                   else
                       uploadPath = UrlConfig.Source_Video_Path(upath); // source video path
               }

               FileStream fs;
               using (fs = new FileStream(Path.Combine(uploadPath, fileName), chunk == 0 ? FileMode.Create : FileMode.Append))
               {
                   byte[] buffer = new byte[fileUpload.InputStream.Length];
                   fileUpload.InputStream.Read(buffer, 0, buffer.Length);

                   fs.Write(buffer, 0, buffer.Length);
               }
               return fileName; // "Success";
           }
           else
           {
               return "failed";
           }

           return "";
       }
       public bool IsReusable {
           get {
               return false;
           }
       }

    Code in encode.ashx responsible for encoding the video:

    private string EncodeMedia(HttpContext context)
    {
       string sourcepath = "";
       string publishedpath = "";
       string mp3path = "";
       string thumbpath = "";
       if (this.UserName != "")
       {//azure
           sourcepath = UrlConfig.Source_Video_Path(this.UserName);
           publishedpath = UrlConfig.Published_Video_Path(this.UserName);
           mp3path = UrlConfig.MP3_Path(this.UserName);
           thumbpath = UrlConfig.Thumbs_Path(this.UserName);
       }
       else
       {
           sourcepath = UrlConfig.Source_Video_Path();
           publishedpath = UrlConfig.Published_Video_Path();
           mp3path = UrlConfig.MP3_Path();
           thumbpath = UrlConfig.Thumbs_Path();
       }
    if (this.FileName.EndsWith(".mp3") && this.MediaType == 1)
       {
           // mp3 and audio format
           if (!File.Exists(mp3path + "/" + this.FileName))
           {
               return "Audio file not found!";
           }
       }
       else
       {
           // rest normal video and audio encoding
           if (!File.Exists(sourcepath + "/" + this.FileName))
           {
               return "Source file not found!";
           }
       }

    if (CloudSettings.EnableCloudStorage && this.UserName != "")
           this.FileName = this.UserName.Substring(0, 3) + "-" + this.FileName; // avoid duplication in cloud storage


       //double f_contentlength = 0;
       //if (Site_Settings.Feature_Packages == 1)
       //{
       //    if (Config.GetMembershipAccountUpgradeType() != 1)
       //    {
       //        // Check whether user have enough space to upload content
       //        // Restriction only for normal or premium users
       //        f_contentlength = (double)fileUpload.ContentLength / 1000000;
       //        string media_field_name = "space_video";
       //        if (MediaType == 1)
       //            media_field_name = "space_audio";
    //        if (!User_PackagesBLL.Check_User_Space_Status(upath, media_field_name, f_contentlength) && !isAdmin)
       //        {
       //            // insufficient credits to upload content
       //            return "Insufficient credits to upload media file"; //   Response.Redirect(Config.GetUrl("myaccount/packages.aspx?status=" + media_field_name), true);
       //        }
       //    }
       //}

       this.backgroundpublishing = true; // should be true on direct encoding
       // Video Processing
       string flv_filename = "";
       string original_filename = "";
       string thumb_filename = "";
       string duration = "";
       int duration_sec = 0;

       // set video actions : 1 -> on, 0 -> off
       int isenabled = 1;
       int ispublished = 1;
       int isreviewed = 1;
       int isresponse = 0;
       if (Response_VideoID > 0)
           isresponse = 1;

       string flv_url = "none";
       string thumb_url = "none";
       string org_url = "none";
       string _embed = "";

       string errorcode = "0";
       VideoInfo info = null;

       if (Site_Settings.Content_Approval == 0)
           isreviewed = 0;


       // check whether audio / mp3 encoding enabled
    if (this.FileName.EndsWith(".mp3") && this.MediaType == 1)
       {
           // audio encoding
           // mp3 file already
           // so no encoding happens
           MediaHandler _minfo = new MediaHandler();
           _minfo.FFMPEGPath = Encoding_Settings.FFMPEGPATH;
           _minfo.FileName = FileName;
           _minfo.InputPath = mp3path;
           info = _minfo.Get_Info();

           flv_filename = FileName;
           original_filename = FileName;
           duration = info.Duration;
           duration_sec = info.Duration_Sec;
           isenabled = 1; // enabled
       }
       else if (this.directpublishing)
       {            

           // publish video
           ArrayList itags = new ArrayList();
           MHPEncoder encoder = new MHPEncoder();
           //if (this.FileName.EndsWith(".mpeg") || this.FileName.EndsWith(".mpg")) // use mpg compatible ffmpeg encoder
           //    encoder.FfmpegPath = HttpContext.Current.Server.MapPath(HttpContext.Current.Request.ApplicationPath) + "\\ffmpeg\\ffmpegbk\\ffmpeg.exe";
               //encoder.ThumbFfmpegPath = Encoding_Settings.FFMPEGPATH; // use normal ffmpeg encoder for thumbs processing

           //azure
           encoder.FfmpegPath = Encoding_Settings.FFMPEGPATH; // use normal ffmpeg encoder
           encoder.FlvToolPath = Encoding_Settings.FLVTOOLPATH; // set meta information for flv videos
           encoder.Mp4BoxPath = Encoding_Settings.MP4BoxPath; // set meta information for mp4 videos
           encoder.SourcePath = sourcepath;
           encoder.SourceFileName = this.FileName;

           // No cloud storage on direct encoding
           //if (CloudSettings.EnableCloudStorage)
           //    encoder.EnableCloudStorage = true;


           if (MediaType == 1)
           {
               // audio encoding
               itags.Add("14");
               encoder.iTags = itags;
               encoder.GrabThumbs = false;
               encoder.PublishedPath = mp3path;

               //_vprocess.OutputPath = this.MP3Path;
               //_vprocess.isAudio = true;
           }
           else
           {
               // video encoding
               itags.Add(EncoderSettings.DefaultITagValue.ToString()); // 5 for 360p mp4 encoding    
               //itags.Add(7);   // this will call 7 case settings to publish next video ending with _7.mp4 instead of _5.mp4
               // so there will be 2 videos with different resoultions published at the end of the process?
               // yesmake sure use proper settings first test it directly via command.    
               //okay i got it. But i'm gonna have to use a different media players to incroporate those settings
               // once published you can load different videos for different user by checking _7.mp4 (end) va
              //okay got it.

               //azure
               encoder.PublishedPath = publishedpath;
               encoder.iTags = itags;
               encoder.ThumbsDirectory = thumbpath;
               encoder.TotalThumbs = 15;

               //_vprocess.ThumbPath = this.ThumbPath;
               //_vprocess.OutputPath = this.FLVPath;
               //if (Config.isPostWaterMark())
               //{
               //    // script for posting watermark on video
               //    _vprocess.WaterMarkPath = Server.MapPath(Request.ApplicationPath) + "\\contents\\watermark";
               //    _vprocess.WaterMarkImage = "watermark.gif";

               //}
           }
           int deleteoption = Site_Settings.Video_Delete_Original;
           if (deleteoption == 1)
           {
               encoder.DeleteSource = true;
           }
           // background processing
        if (this.backgroundpublishing && this.MediaType == 0)
           {
               encoder.BackgroundProcessing = true;
               // get information from source video in order to store it in database
               MediaHandler _minfo = new MediaHandler();
               //if (this.FileName.EndsWith(".mpeg") || this.FileName.EndsWith(".mpg")) // use mpg compatible ffmpeg encoder
               //    encoder.FfmpegPath = HttpContext.Current.Server.MapPath(HttpContext.Current.Request.ApplicationPath) + "\\ffmpeg\\ffmpegbk\\ffmpeg.exe";
               //else
                   _minfo.FFMPEGPath = Encoding_Settings.FFMPEGPATH;

               _minfo.FileName = FileName;
               _minfo.InputPath = sourcepath;
               info = _minfo.Get_Info();
           }
           // encode video processing
           Video_Information vinfo = encoder.Process();
           if (vinfo.ErrorCode > 0)
           {
               errorcode = vinfo.ErrorCode.ToString();
               ErrorLgBLL.Add_Log("Encoding Failed Log", "", "encoding error: " + vinfo.ErrorCode.ToString() + "<br />Description: " + vinfo.ErrorDescription.ToString());
                //return vinfo.ErrorDescription;
           }
           // Double check validation
           // if published video exist
           // if thumb exist
           // then proceed further

           if (MediaType == 0)
           {
               if (!File.Exists(encoder.PublishedPath + "/" + vinfo.FLVVideoName))
               {
                   return "Video failed to published properly.";
               }
               if (!File.Exists(encoder.ThumbsDirectory + "/" + vinfo.ThumbFileName))
               {
                   return "Thumbs failed to grab from video properly.";
               }
           }
           else
           {
               if (vinfo.FLVVideoName == "")
               {
                   vinfo.FLVVideoName = this.FileName.Remove(this.FileName.LastIndexOf(".")) + "_14.mp3"; // mp3 file path name
               }
               if (!File.Exists(encoder.PublishedPath + "/" + vinfo.FLVVideoName))
               {
                   return "Audio failed to published properly.";
               }
           }
           // Now thumbs and video published, procceed for data processing
           // get information from vinfo object
        if (this.backgroundpublishing && this.MediaType == 0)
           {
               string OutputFileName = this.FileName.Remove(this.FileName.LastIndexOf("."));
               flv_filename = OutputFileName + "_" + EncoderSettings.DefaultITagValue + "." + EncoderSettings.Return_Output_Extension(EncoderSettings.DefaultITagValue);
               original_filename = vinfo.OriginalVideoName;
               thumb_filename = OutputFileName + "_008.jpg"; // info.ThumbFileName;
               duration = info.Duration;
               duration_sec = info.Duration_Sec;
           }
           else
           {
               flv_filename = vinfo.FLVVideoName;
               original_filename = vinfo.OriginalVideoName;
               thumb_filename = vinfo.ThumbFileName;
               duration = vinfo.Duration;
               duration_sec = vinfo.Duration_Sec;
               isenabled = vinfo.isEnabled;
           }

           // No cloud storage on direct encoding.
        // Note: cloud storage only works if background processing is disabled,
        // or in case of scheduled processing
        if (CloudSettings.EnableCloudStorage && errorcode == "0")
           {
               flv_url = "amazon";
               org_url = "https://s3.amazonaws.com/" + CloudSettings.OriginalVideoBucketName + "/" + this.FileName;
               thumb_url = "https://s3.amazonaws.com/" + CloudSettings.ThumbsBucketName + "/" + thumb_filename;

           }

       }
       else
       {
           // set publishing status off.
           ispublished = 0;
           original_filename = this.FileName;
       }

       // Store video information in database
       string ipaddress = context.Request.ServerVariables["REMOTE_ADDR"].ToString();

       // Store media information in database
       Video_Struct vd = new Video_Struct();
       vd.CategoryID = 0; // store categoryname or term instead of category id
       vd.Categories = Categories;
       vd.UserName = UserName;
       vd.Title = "";
       vd.Description = "";
       vd.Tags = Tags;
       vd.Duration = duration;
       vd.Duration_Sec = duration_sec;
       vd.OriginalVideoFileName = original_filename;
       vd.VideoFileName = flv_filename;
       vd.ThumbFileName = thumb_filename;
       vd.isPrivate = Privacy;
       vd.AuthKey = PAuth;
       vd.isEnabled = isenabled;
       vd.Response_VideoID = Response_VideoID; // video responses
       vd.isResponse = isresponse;
       vd.isPublished = ispublished;
       vd.isReviewed = isreviewed;
       vd.FLV_Url = flv_url;
       vd.Thumb_Url = thumb_url;
       vd.Org_Url = org_url;
       vd.Embed_Script = _embed;
       vd.isExternal = 0; // website own video, 1: embed video
       vd.IPAddress = ipaddress;
       vd.Type = MediaType;
       vd.YoutubeID = "";
       vd.isTagsreViewed = 1;
       vd.Mode = 0; // filter videos based on website sections
       //vd.ContentLength = f_contentlength;
       vd.GalleryID = GID;
       vd.ErrorCode = Convert.ToInt32(errorcode);
       long videoid = VideoBLL.Process_Info(vd, false);

       // Process tags
       if (Tags != "")
       {
           int tag_type = 0; // represent videos
           if (MediaType == 1)
               tag_type = 4; // represent audio file
           TagsBLL.Process_Tags(Tags, tag_type, 0);
       }

       if (Response_VideoID > 0)
       {
           VideoBLL.Update_Responses(Response_VideoID);
       }

       return "Success";
    }
  • Compile Shotdetect on CentOS 7

    14 September 2015, by BitLegacy01

    I’m migrating a service from a Debian server to a CentOS 7 one.

    The service needs the Shotdetect command (github, official site).

    I could not make it work on CentOS even though I followed all the steps:

    1. Compile ffmpeg (following the compilation guide)

      # ffmpeg
      ffmpeg version git-2015-08-21-7a806c6 Copyright (c) 2000-2015 the FFmpeg developers
       built with gcc 4.8.3 (GCC) 20140911 (Red Hat 4.8.3-9)
       configuration: --prefix=/root/ffmpeg_build --extra-cflags=-I/root/ffmpeg_build/include --extra-ldflags=-L/root/ffmpeg_build/lib --bindir=/root/bin --pkg-config-flags=--static --enable-gpl --enable-nonfree --enable-libfdk-aac --enable-libfreetype --disable-libmp3lame --disable-libopus --disable-libvorbis --disable-libvpx --enable-libx264 --disable-libx265 --disable-lzma
       libavutil      54. 31.100 / 54. 31.100
       libavcodec     56. 58.100 / 56. 58.100
       libavformat    56. 40.101 / 56. 40.101
       libavdevice    56.  4.100 / 56.  4.100
       libavfilter     5. 36.100 /  5. 36.100
       libswscale      3.  1.101 /  3.  1.101
       libswresample   1.  2.101 /  1.  2.101
       libpostproc    53.  3.100 / 53.  3.100
      Hyper fast Audio and Video encoder
      usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
      2. Install needed packages

        # yum install gd-devel libxml2-devel libxslt-devel clang libvpx libvorbis
      3. Add pthread to TARGET_LINK_LIBRARIES in the CMakeLists file, because otherwise the linker reports a missing lpthread error

      4. Compile Shotdetect

        # FFMPEG_DIR="/root/ffmpeg_build"  ./compile.sh cmd

    5. Lost in contemplation

    -- avformat library found: /root/ffmpeg_build/lib/libavformat.a
    -- avcodec library found: /root/ffmpeg_build/lib/libavcodec.a
    -- avutil library found: /root/ffmpeg_build/lib/libavutil.a
    -- swscale library found: /root/ffmpeg_build/lib/libswscale.a
    Found all FFmpeg libraries in /root/ffmpeg_build/lib/libavformat.a;/root/ffmpeg_build/lib/libavdevice.a;/root/ffmpeg_build/lib/libavcodec.a;/root/ffmpeg_build/lib/libavutil.a;/root/ffmpeg_build/lib/libswscale.a.
    -- Found GD: /usr/lib64/libgd.so
    Found libgd: /usr/lib64/libgd.so;/usr/lib64/libpng.so;/usr/lib64/libz.so;/usr/lib64/libjpeg.so in /usr/include
    Found libxml2: /usr/lib64/libxml2.so in /usr/include/libxml2
    Found libxslt: /usr/lib64/libxslt.so in /usr/include
    -- Configuring done
    -- Generating done
    -- Build files have been written to: /root/Shotdetect-master/build
    [ 71%] Built target shotdetect
    Linking CXX executable shotdetect-cmd
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacdec.o): in the function « fdk_aac_decode_frame »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:323: undefined reference to « aacDecoder_Fill »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:329: undefined reference to « aacDecoder_DecodeFrame »
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacdec.o): in the function « get_stream_info »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:88: undefined reference to « aacDecoder_GetStreamInfo »
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacdec.o): in the function « fdk_aac_decode_close »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:201: undefined reference to « aacDecoder_Close »
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacdec.o): in the function « fdk_aac_decode_init »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:214: undefined reference to « aacDecoder_Open »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:221: undefined reference to « aacDecoder_ConfigRaw »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:228: undefined reference to « aacDecoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:252: undefined reference to « aacDecoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:270: undefined reference to « aacDecoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:277: undefined reference to « aacDecoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:284: undefined reference to « aacDecoder_SetParam »
     /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacdec.o):/root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:291: more undefined references to « aacDecoder_SetParam » follow
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacdec.o): in the function « fdk_aac_decode_init »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:261: undefined reference to « aacDecoder_AncDataInit »
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacdec.o): in the function « fdk_aac_decode_flush »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacdec.c:367: undefined reference to « aacDecoder_SetParam »
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacenc.o): in the function « aac_encode_close »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:97: undefined reference to « aacEncClose »
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacenc.o): in the function « aac_encode_frame »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:357: undefined reference to « aacEncEncode »
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacenc.o): in the function « aac_encode_init »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:114: undefined reference to « aacEncOpen »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:123: undefined reference to « aacEncoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:130: undefined reference to « aacEncoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:138: undefined reference to « aacEncoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:172: undefined reference to « aacEncoder_SetParam »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:179: undefined reference to « aacEncoder_SetParam »
     /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacenc.o):/root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:197: more undefined references to « aacEncoder_SetParam » follow
    /root/ffmpeg_build/lib/libavcodec.a(libfdk-aacenc.o): in the function « aac_encode_init »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:276: undefined reference to « aacEncEncode »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libfdk-aacenc.c:282: undefined reference to « aacEncInfo »
    /root/ffmpeg_build/lib/libavcodec.a(libx264.o): in the function « X264_frame »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:260: undefined reference to « x264_picture_init »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:262: undefined reference to « x264_bit_depth »
    /root/ffmpeg_build/lib/libavcodec.a(libx264.o): in the function « reconfig_encoder »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:246: undefined reference to « x264_encoder_reconfig »
    /root/ffmpeg_build/lib/libavcodec.a(libx264.o): in the function « X264_frame »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:283: undefined reference to « x264_encoder_encode »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:289: undefined reference to « x264_encoder_delayed_frames »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:283: undefined reference to « x264_encoder_encode »
    /root/ffmpeg_build/lib/libavcodec.a(libx264.o): in the function « reconfig_encoder »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:173: undefined reference to « x264_encoder_reconfig »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:178: undefined reference to « x264_encoder_reconfig »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:185: undefined reference to « x264_encoder_reconfig »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:205: undefined reference to « x264_encoder_reconfig »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:211: undefined reference to « x264_encoder_reconfig »
     /root/ffmpeg_build/lib/libavcodec.a(libx264.o):/root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:191: more undefined references to « x264_encoder_reconfig » follow
    /root/ffmpeg_build/lib/libavcodec.a(libx264.o): in the function « X264_init_static »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:806: undefined reference to « x264_bit_depth »
    /root/ffmpeg_build/lib/libavcodec.a(libx264.o): in the function « X264_close »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:339: undefined reference to « x264_encoder_close »
    /root/ffmpeg_build/lib/libavcodec.a(libx264.o): in the function « X264_init »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:413: undefined reference to « x264_param_default »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:418: undefined reference to « x264_param_default_preset »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:440: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:470: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:514: undefined reference to « x264_levels »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:516: undefined reference to « x264_levels »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:540: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:541: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:542: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:543: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:601: undefined reference to « x264_param_apply_fastfirstpass »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:651: undefined reference to « x264_param_apply_profile »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:703: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:705: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:717: undefined reference to « x264_param_parse »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:735: undefined reference to « x264_encoder_open_148 »
    /root/ffmpeg_sources/ffmpeg/libavcodec/libx264.c:744: undefined reference to « x264_encoder_headers »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_decode_subpacket »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:379: undefined reference to « swr_is_initialized »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_decode_frame »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:220: undefined reference to « swr_is_initialized »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_init_resample »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:161: undefined reference to « swr_init »
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:167: undefined reference to « swr_convert »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_decode_frame »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:234: undefined reference to « swr_convert »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_flush_resample »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:115: undefined reference to « swr_convert »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_decode_subpacket »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:411: undefined reference to « swr_close »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_decode_flush »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:614: undefined reference to « swr_close »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_decode_close »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:638: undefined reference to « swr_free »
    /root/ffmpeg_build/lib/libavcodec.a(opusdec.o): in the function « opus_decode_init »:
    /root/ffmpeg_sources/ffmpeg/libavcodec/opusdec.c:705: undefined reference to « swr_alloc »
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    make[2]: *** [shotdetect-cmd] Error 1
    make[1]: *** [CMakeFiles/shotdetect-cmd.dir/all] Error 2
    make: *** [all] Error 2

    I’ve done a lot of research on the above errors, but I didn’t find any clue.

    How can I successfully compile Shotdetect on CentOS 7?

    Thanks to tryp, here is a working CMakeLists file:

    # CMake integration by Christian Frisson
    cmake_minimum_required(VERSION 2.8)
    PROJECT(shotdetect)

    SET(CMAKE_C_COMPILER "/usr/bin/clang")
    SET(CMAKE_CXX_COMPILER "/usr/bin/clang++")
    SET(CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake")


    # Get git version information for automatic embedding in version string:

    include(GetGitRevisionDescription)
    get_git_head_revision(GIT_REFSPEC GIT_SHA1)
    # Take the first 12 characters of the SHA1 as short identifier:
    STRING(SUBSTRING ${GIT_SHA1} 0 11 GIT_SHA1_SHORT)
    configure_file("${CMAKE_CURRENT_SOURCE_DIR}/src/version.cc.in" "${CMAKE_CURRENT_BINARY_DIR}/version.cc" @ONLY)
    list(APPEND SOURCES "${CMAKE_CURRENT_BINARY_DIR}/version.cc" src/version.h)

    # Miscellaneous compilation options:

    OPTION(USE_WXWIDGETS "Compile GUI app with wxWidgets, otherwise commandline app" ON)
    OPTION(USE_POSTGRESQL "Compile with PostgreSQL support" OFF)

    # Dependency: pkg-config (required if cross-compiling with MXE)

    IF ( MINGW AND CMAKE_TOOLCHAIN_FILE)
       FIND_PACKAGE (PkgConfig)
       IF(NOT PKG_CONFIG_FOUND)
           MESSAGE(FATAL_ERROR "pkgconfig required for cross-compiling with MXE for windows")
       ENDIF()
       SET(PKG_CONFIG_EXECUTABLE ${PKG_CONFIG_EXECUTABLE} CACHE STRING "pkg-config")
    ENDIF()

    # Dependency: FFmpeg (required)

    FIND_PACKAGE( FFmpeg )
    IF ( MINGW AND CMAKE_TOOLCHAIN_FILE)
       PKG_CHECK_MODULES (FFMPEG_PKG REQUIRED libavcodec libavfilter libavutil libavdevice libavformat libavformat libswscale)
       IF (FFMPEG_PKG_FOUND)
           MESSAGE( "FFmpeg cflags found through pkg-config: ${FFMPEG_PKG_CFLAGS}" )
           MESSAGE( "FFmpeg ldflags found through pkg-config: ${FFMPEG_PKG_LDFLAGS}" )
           STRING(REGEX REPLACE ";" " " FFMPEG_PKG_CFLAGS "${FFMPEG_PKG_CFLAGS}")
           SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${FFMPEG_PKG_CFLAGS}")
           SET(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${FFMPEG_PKG_CFLAGS}")
           SET(FFMPEG_LIBRARIES ${FFMPEG_LIBRARY} ${FFMPEG_PKG_LDFLAGS})
           MESSAGE("FFMPEG_LIBRARIES ${FFMPEG_LIBRARIES}")
       ELSE (FFMPEG_PKG_FOUND)
           MESSAGE( FATAL_ERROR "FFmpeg cflags/ldflags not found through pkg-config." )
       ENDIF (FFMPEG_PKG_FOUND)
    ENDIF()
    IF(FFMPEG_LIBAVFORMAT_FOUND)
       MESSAGE(STATUS "avformat library found: ${FFMPEG_LIBAVFORMAT_LIBRARIES}")
    ELSE()
       MESSAGE(STATUS "avformat library not found.")
    ENDIF()
    IF(FFMPEG_LIBAVCODEC_FOUND)
       MESSAGE(STATUS "avcodec library found: ${FFMPEG_LIBAVCODEC_LIBRARIES}")
    ELSE()
       MESSAGE(STATUS "avcodec library not found.")
    ENDIF()
    IF(FFMPEG_LIBAVUTIL_FOUND)
       MESSAGE(STATUS "avutil library found: ${FFMPEG_LIBAVUTIL_LIBRARIES}")
    ELSE()
       MESSAGE(STATUS "avutil library not found.")
    ENDIF()
    IF(FFMPEG_LIBSWSCALE_FOUND)
       MESSAGE(STATUS "swscale library found: ${FFMPEG_LIBSWSCALE_LIBRARIES}")
    ELSE()
       MESSAGE(STATUS "swscale library not found.")
    ENDIF()
    IF(FFMPEG_FOUND)
       INCLUDE_DIRECTORIES(${FFMPEG_INCLUDE_DIR} ${FFMPEG_INCLUDE_DIRS})
       SET(FFMPEG_LIBRARIES "${FFMPEG_LIBRARIES};/root/ffmpeg_build/lib/libswresample.a;/root/ffmpeg_build/lib/libx264.a;/lib64/libdl.so.2;/root/ffmpeg_build/lib/libfdk-aac.a;/root/ffmpeg_build/lib/libswresample.a")
       MESSAGE("Found all FFmpeg libraries in ${FFMPEG_LIBRARIES}.")
    ELSE()
       MESSAGE(FATAL_ERROR "Some FFmpeg libraries are missing.")
    ENDIF()

    # Dependency: GD (required)

    FIND_PACKAGE(GD REQUIRED)
    IF ( MINGW AND CMAKE_TOOLCHAIN_FILE)
       EXEC_PROGRAM(${CMAKE_FIND_ROOT_PATH}/bin/gdlib-config ARGS "--cflags" OUTPUT_VARIABLE GD_PKG_CFLAGS)
       EXEC_PROGRAM(${CMAKE_FIND_ROOT_PATH}/bin/gdlib-config ARGS "--libs" OUTPUT_VARIABLE GD_PKG_LDFLAGS)
       MESSAGE( "gd cflags found through gdlib-config: ${GD_PKG_CFLAGS}" )
       MESSAGE( "gd ldflags found through gdlib-config: ${GD_PKG_LDFLAGS}" )
       SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${GD_PKG_CFLAGS}")
       SET(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${GD_PKG_CFLAGS}")
       SET(GD_LIBRARIES ${GD_LIBRARY} ${GD_PKG_LDFLAGS})
       MESSAGE("GD_LIBRARIES ${GD_LIBRARIES}")
    ENDIF()
    IF(GD_FOUND)
       INCLUDE_DIRECTORIES(${GD_INCLUDE_DIR})
       MESSAGE("Found libgd: ${GD_LIBRARIES} in ${GD_INCLUDE_DIR}")
    ELSE()
       MESSAGE(FATAL_ERROR "Couldn't find libgd")
    ENDIF()

    # Dependency: libxml2 (required)

    FIND_PACKAGE(LibXml2 2.7 REQUIRED)
    IF ( MINGW AND CMAKE_TOOLCHAIN_FILE)
       PKG_CHECK_MODULES (LIBXML2_PKG REQUIRED libxml-2.0)
       IF (LIBXML2_PKG_FOUND)
           MESSAGE( "LibXml2 cflags found through pkg-config: ${LIBXML2_PKG_CFLAGS}" )
           MESSAGE( "LibXml2 ldflags found through pkg-config: ${LIBXML2_PKG_LDFLAGS}" )
           STRING(REGEX REPLACE ";" " " LIBXML2_PKG_CFLAGS "${LIBXML2_PKG_CFLAGS}")
           SET(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${LIBXML2_PKG_CFLAGS}")
           SET(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} ${LIBXML2_PKG_CFLAGS}")
           SET(LIBXML2_LIBRARIES ${LIBXML2_LIBRARY} ${LIBXML2_PKG_LDFLAGS})
           MESSAGE("LIBXML2_LIBRARIES ${LIBXML2_LIBRARIES}")
       ELSE (LIBXML2_PKG_FOUND)
           MESSAGE( FATAL_ERROR "LibXml2 cflags/ldflags not found through pkg-config." )
       ENDIF (LIBXML2_PKG_FOUND)
    ENDIF()
    IF(LIBXML2_FOUND)
       MESSAGE("Found libxml2: ${LIBXML2_LIBRARIES} in ${LIBXML2_INCLUDE_DIR}")
       INCLUDE_DIRECTORIES(${LIBXML2_INCLUDE_DIR})
    ELSE()
       MESSAGE(FATAL_ERROR "Couldn't find libxml2")
    ENDIF()

    # Dependency: libxslt (required)

    FIND_PACKAGE(LibXslt REQUIRED)
    IF(LIBXSLT_FOUND)
       MESSAGE("Found libxslt: ${LIBXSLT_LIBRARIES} in ${LIBXSLT_INCLUDE_DIR}")
       INCLUDE_DIRECTORIES(${LIBXSLT_INCLUDE_DIR})
    ELSE()
       MESSAGE(FATAL_ERROR "Couldn't find libxslt")
    ENDIF()

    # Dependency: wxWidgets (optional)
    IF(USE_WXWIDGETS)
       FIND_PACKAGE(wxWidgets)# COMPONENTS core base)
       IF(wxWidgets_FOUND)
           MESSAGE("Found wxWidgets: ${wxWidgets_LIBRARIES} in ${wxWidgets_INCLUDE_DIRS}")
           INCLUDE_DIRECTORIES(${wxWidgets_INCLUDE_DIRS})
           ADD_DEFINITIONS(-DWXWIDGETS)
           INCLUDE("${wxWidgets_USE_FILE}")
       ELSE()
           MESSAGE(FATAL_ERROR "Couldn't find wxWidgets. Set USE_WXWIDGETS to OFF or install wxWidgets.")
       ENDIF()
    ENDIF()

    # Dependency: PostgreSQL (optional)
    IF(USE_POSTGRESQL)
       FIND_PACKAGE(PostgreSQL)
       IF(PostgreSQL_FOUND)
           MESSAGE("Found PostgreSQL: ${PostgreSQL_LIBRARIES} in ${PostgreSQL_INCLUDE_DIR}")
           INCLUDE_DIRECTORIES(${PostgreSQL_INCLUDE_DIR})
       ELSE()
           MESSAGE(FATAL_ERROR "Couldn't find PostgreSQL. Set USE_POSTGRESQL to OFF or install PostgreSQL.")
       ENDIF()
    ENDIF()

    # shotdetect

    SET(TARGET_NAME "shotdetect")
    INCLUDE_DIRECTORIES(.)

    # shotdetect library

    SET(${TARGET_NAME}_LIBRARY_SRCS src/film.cc src/graph.cc src/image.cc src/shot.cc src/xml.cc)
    SET(${TARGET_NAME}_LIBRARY_HDRS  src/film.h src/graph.h src/image.h src/shot.h src/xml.h)
    IF(USE_POSTGRESQL)
       SET(${TARGET_NAME}_LIBRARY_SRCS ${${TARGET_NAME}_LIBRARY_SRCS} src/bdd.cc)
       SET(${TARGET_NAME}_LIBRARY_HDRS ${${TARGET_NAME}_LIBRARY_HDRS} src/bdd.h)
    ENDIF()
    ADD_LIBRARY(${TARGET_NAME} ${${TARGET_NAME}_LIBRARY_SRCS} ${${TARGET_NAME}_LIBRARY_HDRS})
    TARGET_LINK_LIBRARIES(${TARGET_NAME} ${FFMPEG_LIBRARIES} ${LIBXML2_LIBRARIES} ${LIBXSLT_LIBRARIES} ${GD_LIBRARIES} "pthread" "m" "z")
    IF(USE_WXWIDGETS AND wxWidgets_FOUND)
       TARGET_LINK_LIBRARIES(${TARGET_NAME} ${wxWidgets_LIBRARIES})
    ENDIF()
    IF(USE_POSTGRESQL AND PostgreSQL_FOUND)
       TARGET_LINK_LIBRARIES(${TARGET_NAME} ${PostgreSQL_LIBRARY})
    ENDIF()

    # shotdetect application: gui or commandline

    IF(APPLE)
       SET(APP_TYPE "MACOSX_BUNDLE")
    ELSEIF(WIN32)
       SET(APP_TYPE "WIN32")
    ENDIF()

    IF(USE_WXWIDGETS AND wxWidgets_FOUND)
       LIST(APPEND ${TARGET_NAME}_GUI_SRCS src/main.cc src/ui/dialog_help.cc src/ui/dialog_shotdetect.cc src/ui/process_video_thread.cc)
       LIST(APPEND ${TARGET_NAME}_GUI_HDRS src/ui/dialog_help.h src/ui/dialog_shotdetect.h src/ui/process_video_thread.h)
       ADD_EXECUTABLE(${TARGET_NAME}-gui ${APP_TYPE} ${SOURCES} ${${TARGET_NAME}_GUI_SRCS} ${${TARGET_NAME}_GUI_HDRS})
       TARGET_LINK_LIBRARIES(${TARGET_NAME}-gui ${TARGET_NAME})
       # Make this target optional to install:
       SET(TARGETS_TO_INSTALL ${TARGET_NAME}-gui)
    ELSE()
       LIST(APPEND ${TARGET_NAME}_COMMANDLINE_SRCS src/commandline.cc)
       ADD_EXECUTABLE(${TARGET_NAME}-cmd ${SOURCES} ${${TARGET_NAME}_COMMANDLINE_SRCS})
       TARGET_LINK_LIBRARIES(${TARGET_NAME}-cmd ${TARGET_NAME})
       # Make this target optional to install:
       SET(TARGETS_TO_INSTALL ${TARGET_NAME}-cmd)
    ENDIF()


    # Routines for installing shotdetect.
    # Taken from official documentation (http://www.cmake.org/cmake/help/cmake2.6docs.html#command:install)
    install(
       TARGETS ${TARGETS_TO_INSTALL}
       RUNTIME DESTINATION bin
       LIBRARY DESTINATION lib
       ARCHIVE DESTINATION lib/static
    )

    The built binary will be in the build directory; you can move there and execute

    make install

    to make the binary available system-wide.

  • need help configuring ffmpeg to decode raw AAC with android ndk

    24 October 2016, by Matt Wolfe

    I’ve got an Android app that gets raw AAC bytes from an external device, and I want to decode that data, but I can’t get the decoder to work, even though ffmpeg decodes an mp4 file containing the same audio data just fine (verified with isoviewer). I was recently able to get this ffmpeg library on Android to decode video frames from the same external device, but audio doesn’t seem to work.

    Here is the ffmpeg output for the file with the same data:

    $ ffmpeg -i Video_2000-01-01_0411.mp4
    ffmpeg version 2.6.1 Copyright (c) 2000-2015 the FFmpeg developers
     built with Apple LLVM version 6.0 (clang-600.0.57) (based on LLVM 3.5svn)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/2.6.1 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libmp3lame --enable-libvo-aacenc --enable-libxvid --enable-vda
     libavutil      54. 20.100 / 54. 20.100
     libavcodec     56. 26.100 / 56. 26.100
     libavformat    56. 25.101 / 56. 25.101
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 11.102 /  5. 11.102
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  3.100 / 53.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'AXON_Flex_Video_2000-01-01_0411.mp4':
     Metadata:
       major_brand     : mp42
       minor_version   : 1
       compatible_brands: isom3gp43gp5
     Duration: 00:00:15.73, start: 0.000000, bitrate: 1134 kb/s
       Stream #0:0(eng): Audio: aac (LC) (mp4a / 0x6134706D), 8000 Hz, mono, fltp, 40 kb/s (default)
       Metadata:
         handler_name    : soun
       Stream #0:1(eng): Video: mpeg4 (Simple Profile) (mp4v / 0x7634706D), yuv420p, 640x480 [SAR 1:1 DAR 4:3], 1087 kb/s, 29.32 fps, 26.58 tbr, 90k tbn, 1k tbc (default)
       Metadata:
         handler_name    : vide

    Here is my NDK code for setting up and decoding the audio:

    jint ffmpeg_init(JNIEnv * env, jobject this) {
       audioCodec = avcodec_find_decoder(AV_CODEC_ID_AAC);
       if (!audioCodec) {
           LOGE("audio codec %d not found", AV_CODEC_ID_AAC);
           return -1;
       }

       audioContext = avcodec_alloc_context3(audioCodec);
       if (!audioContext) {
           LOGE("Could not allocate codec context");
           return -1;
       }

       int openRet = avcodec_open2(audioContext, audioCodec, NULL);
       if (openRet < 0) {
           LOGE("Could not open codec, error:%d", openRet);
           return -1;
       }

       audioContext->sample_rate = 8000;
       audioContext->channel_layout = AV_CH_LAYOUT_MONO;
       audioContext->profile = FF_PROFILE_AAC_LOW;
       audioContext->bit_rate = 48 * 1024;
       audioContext->sample_fmt = AV_SAMPLE_FMT_FLTP;

     //  unsigned char extradata[] = {0x15, 0x88};
     //  audioContext->extradata = extradata;
     //  audioContext->extradata_size = sizeof(extradata);
       audioFrame = av_frame_alloc();
       if (!audioFrame) {
           LOGE("Could not create audio frame");
           return -1;
       }
       return 0;
    }


    jint ffmpeg_decodeAudio(JNIEnv *env, jobject this, jbyteArray aacData, jbyteArray output, int offset, int len) {

       LOGI("ffmpeg_decodeAudio()");
       char errbuf[128];
       AVPacket avpkt = {0};
       av_init_packet(&avpkt);
       LOGI("av_init_packet()");
       int error, got_frame;    
       uint8_t* buffer = (uint8_t *) (*env)->GetByteArrayElements(env, aacData,0);
       uint8_t* copy = av_malloc(len);  
       memcpy(copy, &buffer[offset], len);
       av_packet_from_data(&avpkt, copy, len);


       if ((error = avcodec_decode_audio4(audioContext, audioFrame, &got_frame, &avpkt)) < 0) {
           ffmpeg_log_error(error);
           av_free_packet(&avpkt);
           return error;
       }
       if (got_frame) {
           LOGE("Copying audioFrame->extended_data to output jbytearray, linesize[0]:%d", audioFrame->linesize[0]);
           (*env)->SetByteArrayRegion(env, output, 0, audioFrame->linesize[0],  *audioFrame->extended_data);
       }

       return 0;

    }

    As you can see, I’ve got an init function that opens the decoder and creates the context; these things all work fine, without error. However, when I call avcodec_decode_audio4 I get an error:

    FFMPEG error : -1094995529, Invalid data found when processing input
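
     For reference, -1094995529 is the numeric value of AVERROR_INVALIDDATA. Below is a minimal sketch, assuming the libavutil error API of this FFmpeg generation, of turning such codes into readable strings; presumably the ffmpeg_log_error call above already does something like this:

     /* Sketch only: map an FFmpeg error code to its message text. */
     char errstr[AV_ERROR_MAX_STRING_SIZE];            /* from libavutil/error.h */
     if (av_strerror(error, errstr, sizeof(errstr)) == 0)
         LOGE("FFMPEG error : %d, %s", error, errstr); /* -1094995529 -> "Invalid data found when processing input" */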

    I’ve tried all sorts of combinations of AVCodecContext properties. I’m not sure which ones I need to set for the decoder to do its job, but from reading online I should just need to set the channel layout and the sample_rate (which I’ve tried by themselves). I’ve also tried setting the extradata/extradata_size parameters to values that should match the stream’s settings per http://wiki.multimedia.cx/index.php?title=MPEG-4_Audio, but no luck.
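
     One detail worth noting when comparing the init code above with the esds data quoted below: avcodec_open2 is called before any of the context parameters are set, and the commented-out extradata lines point the context at a stack array. As a hedged sketch only (not a verified fix for this device), this is how the two AudioSpecificConfig bytes from the esds box (0x15 0x88) would normally be handed to the decoder before opening it, using padded heap memory; FF_INPUT_BUFFER_PADDING_SIZE is assumed to be the padding constant of the libavcodec 56 generation shown above:

     /* Sketch only: give the decoder the AudioSpecificConfig from the esds box
      * (0x15 0x88 = AAC LC, 8000 Hz, mono) before avcodec_open2(). */
     audioContext = avcodec_alloc_context3(audioCodec);
     audioContext->extradata = av_mallocz(2 + FF_INPUT_BUFFER_PADDING_SIZE);
     if (!audioContext->extradata)
         return -1;
     audioContext->extradata[0] = 0x15;
     audioContext->extradata[1] = 0x88;
     audioContext->extradata_size = 2;

     if (avcodec_open2(audioContext, audioCodec, NULL) < 0) {
         LOGE("Could not open codec");
         return -1;
     }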

    Since the device we’re getting packets from sends AAC data that has no sound at the beginning (but the packets are valid), I’ve tried sending just those, since they should definitely decode correctly.

    Here is an example of an initial audio packet containing silence:

    010c9eb43f21f90fc87e46fff10a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5a5dffe214b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4b4bbd1c429696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696969696978

    Note that the data shown above is just a hex encoding of the data that I’m putting into the AVPacket, as sent from the external device to the Android application. My application doesn’t have direct access to the file, though, so I need to decode the raw frames/samples as I get them. When I look at the audio track data in isoviewer, I can see that the audio track’s first sample is the same data I got from the device containing that file (thus, the external device is just sending me the samples’ raw data). I believe this data can be located by reading the stsz (sample size) and stco (chunk offset) boxes and then reading the samples out of the mdat box of the file.
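
     As a rough illustration of that last point (a hypothetical helper, assuming the stco, stsz and stsc boxes have already been parsed into arrays), a sample’s absolute file offset inside mdat follows from the chunk offset plus the sizes of the preceding samples in that chunk:

     /* Hypothetical: chunk_offsets[] from stco, sample_sizes[] from stsz,
      * and first_sample_of_chunk derived from stsc.  Returns the absolute
      * file offset of sample n, whose bytes live inside the mdat box. */
     long sample_file_offset(const long *chunk_offsets, const long *sample_sizes,
                             int chunk, int first_sample_of_chunk, int n)
     {
         long off = chunk_offsets[chunk];
         for (int s = first_sample_of_chunk; s < n; s++)
             off += sample_sizes[s];
         return off;
     }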

    Also, isoviewer shows the esds box as having the following:

    ESDescriptor{esId=0, streamDependenceFlag=0, URLFlag=0, oCRstreamFlag=0, streamPriority=0, URLLength=0, URLString='null', remoteODFlag=0, dependsOnEsId=0, oCREsId=0, decoderConfigDescriptor=DecoderConfigDescriptor{objectTypeIndication=64, streamType=5, upStream=0, bufferSizeDB=513, maxBitRate=32000, avgBitRate=32000, decoderSpecificInfo=null, audioSpecificInfo=AudioSpecificConfig{configBytes=1588, audioObjectType=2 (AAC LC), samplingFrequencyIndex=11 (8000), samplingFrequency=0, channelConfiguration=1, syncExtensionType=0, frameLengthFlag=0, dependsOnCoreCoder=0, coreCoderDelay=0, extensionFlag=0, layerNr=0, numOfSubFrame=0, layer_length=0, aacSectionDataResilienceFlag=false, aacScalefactorDataResilienceFlag=false, aacSpectralDataResilienceFlag=false, extensionFlag3=0}, configDescriptorDeadBytes=, profileLevelIndicationDescriptors=[[]]}, slConfigDescriptor=SLConfigDescriptor{predefined=2}}

    And the binary is this:

    00 00 00 30 65 73 64 73 00 00 00 00 03 80 80 80
    1f 00 00 00 04 80 80 80 14 40 15 00 02 01 00 00
    7d 00 00 00 7d 00 05 80 80 80 02 15 88 06 01 02
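
     As a worked example of why those two DecoderSpecificInfo bytes (15 88) match the parsed values above, this is how the AudioSpecificConfig bit fields unpack (illustrative only):

     /* AudioSpecificConfig 0x15 0x88 = 0001 0101 1000 1000 (binary) */
     uint16_t asc = 0x1588;
     int audioObjectType        = (asc >> 11) & 0x1F; /* 2  -> AAC LC  */
     int samplingFrequencyIndex = (asc >> 7)  & 0x0F; /* 11 -> 8000 Hz */
     int channelConfiguration   = (asc >> 3)  & 0x0F; /* 1  -> mono    */
     /* remaining 3 bits (frameLengthFlag, dependsOnCoreCoder, extensionFlag) are 0 */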