Other articles (99)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    The central/master site of the farm needs several additional plugins, compared with the channel sites, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and the requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Frequent problems

    10 March 2010

    PHP with safe_mode enabled
    One of the main sources of problems comes from the PHP configuration, in particular from safe_mode being enabled.
    The solution would be either to disable safe_mode or to place the script in a directory that Apache can access for the site.
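
    A quick way to see which situation applies is to look at the PHP configuration itself; a minimal sketch, assuming shell access to the server (safe_mode only exists in PHP versions before 5.4):

      # Check whether safe_mode is enabled in the PHP build used by the site
      php -i | grep -i safe_mode

      # If it reports "On", one option is to set the following in php.ini and restart Apache:
      # safe_mode = Off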

On other sites (10545)

  • Frame loss while FFmpeg video splitting

    27 May 2016, by Gurinderbeer Singh

    I am using ffmpeg to split a video file into multiple parts with the following command, but there is frame loss.

      ffmpeg -i input.mp4 -ss 00:00:00 -t 00:10:00 -c copy output.mp4

    For example, there are 17983 frames in total in the complete video file, but the total number of frames across all the split parts is 17970, so 13 frames are missing.

    Can anyone please tell me if there is any method by which the video can be split without any frame loss, even using a tool other than ffmpeg?

    Thanks....
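
    With -c copy, ffmpeg does not re-encode, so the cuts can only fall on existing keyframe/packet boundaries; a few frames around each cut point can be dropped or duplicated, which would explain a small mismatch like the 13 frames reported here. Two hedged alternatives (the output names below are placeholders, not taken from the question):

      # Split into 10-minute parts without re-encoding; each cut snaps to a keyframe
      ffmpeg -i input.mp4 -c copy -f segment -segment_time 600 -reset_timestamps 1 out%03d.mp4

      # Frame-accurate split: re-encode so the cut can fall on any frame (slower)
      ffmpeg -i input.mp4 -ss 00:00:00 -t 00:10:00 -c:v libx264 -c:a aac part1.mp4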

  • how to merge an audio file with an image in android using javaCV

    11 May 2016, by Darshan Soni

    I have already tried some code, but it throws an exception like this:

      05-11 17:34:00.449 1838-2316/darshan.fragments_demo E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #1
      Process: darshan.fragments_demo, PID: 1838
      java.lang.RuntimeException: An error occurred while executing doInBackground()
          at android.os.AsyncTask$3.done(AsyncTask.java:309)
          at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
          at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
          at java.util.concurrent.FutureTask.run(FutureTask.java:242)
          at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
          at java.lang.Thread.run(Thread.java:818)
      Caused by: java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil
          at java.lang.Class.classForName(Native Method)
          at java.lang.Class.forName(Class.java:324)
          at org.bytedeco.javacpp.Loader.load(Loader.java:413)
          at org.bytedeco.javacpp.Loader.load(Loader.java:381)
          at org.bytedeco.javacpp.avformat$AVFormatContext.<clinit>(avformat.java:2597)
          at org.bytedeco.javacv.FFmpegFrameGrabber.startUnsafe(FFmpegFrameGrabber.java:386)
          at org.bytedeco.javacv.FFmpegFrameGrabber.start(FFmpegFrameGrabber.java:380)
          at darshan.fragments_demo.Layout1$Test.doInBackground(Layout1.java:166)
          at darshan.fragments_demo.Layout1$Test.doInBackground(Layout1.java:136)
          at android.os.AsyncTask$2.call(AsyncTask.java:295)
          at java.util.concurrent.FutureTask.run(FutureTask.java:237)
          at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
          at java.lang.Thread.run(Thread.java:818)

     
    Now, in the Gradle file I have included some libraries; my Gradle file is as follows:

    apply plugin: 'com.android.application'

    android {
       compileSdkVersion 23
       buildToolsVersion "23.0.2"

           packagingOptions {
               exclude 'META-INF/LICENSE'
               exclude 'META-INF/LICENSE-FIREBASE.txt'
               exclude 'META-INF/NOTICE'
               exclude 'META-INF/services/javax.annotation.processing.Processor'
               pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
               pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
               pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
               pickFirst  'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'

           }


       defaultConfig {
           applicationId "darshan.fragments_demo"
           minSdkVersion 16
           targetSdkVersion 23
           versionCode 1
           versionName "1.0"
       }
       buildTypes {
           release {
               minifyEnabled false
               proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
           }
       }
    }

    dependencies {
       compile fileTree(dir: 'libs', include: ['*.jar'])
       testCompile 'junit:junit:4.12'

       compile 'com.android.support:appcompat-v7:23.1.1'
       compile 'com.android.support:design:23.1.1'
       compile group: 'org.bytedeco', name: 'javacv', version: '1.1'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '2.4.11-0.11', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '2.4.11-0.11', classifier: 'android-x86'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.6.1-0.11', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.6.1-0.11', classifier: 'android-x86'


    }

    And the Java file is as follows:

    public class Test extends AsyncTask<Void, Void, Void> {

       ProgressDialog dialog;
       FFmpegFrameRecorder recorder;
       String videoPath;
       protected void onPreExecute() {
           dialog = new ProgressDialog(Layout1.this);
           dialog.setMessage("Genrating video, Please wait.........");
           dialog.setCancelable(false);
           //dialog.show();
       };
       @Override
       protected Void doInBackground(Void... params) {

           File folder = Environment.getExternalStorageDirectory();
           String path = folder.getAbsolutePath() + "/DCIM/Camera";
           // ArrayList<String> paths = (ArrayList<String>) getListOfFiles(path, "jpg");
           long millis = System.currentTimeMillis();
           videoPath = path + "/" + "test_sham_"+millis+".3gp";

           try {



               //audio grabber

               FFmpegFrameGrabber grabber2 = new FFmpegFrameGrabber(folder.getAbsolutePath()+"/bluetooth/abcd.mp3");

               //video grabber
               FFmpegFrameGrabber grabber1 = new FFmpegFrameGrabber(path+"/1.jpg");
               grabber1.start();

               grabber2.start();

               recorder = new FFmpegFrameRecorder(path
                       + "/" + "test_sham_"+millis+".mp4",  grabber1.getImageWidth(), grabber1.getImageHeight(),2);

               //recorder.setVideoCodec(5);
               recorder.setVideoCodec(avcodec.AV_CODEC_ID_MPEG4);
               // recorder.setVideoCodec(avcodec.AV_CODEC_ID_MP4ALS);

               //recorder.setFormat("3gp");
                 recorder.setFormat("mp4");
               recorder.setFrameRate(30);
               recorder.setSampleRate(grabber2.getSampleRate());
               recorder.setVideoBitrate(30);
               long startTime = System.currentTimeMillis();
               recorder.start();

               Frame frame1, frame2 = null;

               while ((frame1 = grabber1.grabFrame()) != null ||

                       (frame2 = grabber2.grabFrame()) != null) {

                   recorder.record(frame1);

                   recorder.record(frame2);

               }

               recorder.stop();

               grabber1.stop();

               grabber2.stop();




               System.out.println("Total Time:- " + recorder.getTimestamp());

           } catch (Exception e) {
               e.printStackTrace();
           }

           return null;
       }
    }

    I just can't figure out what is wrong with the implementation.
    Please help.
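
    For reference, the operation this code attempts (muxing one still image with an audio track) can also be expressed with the ffmpeg command-line tool. This is only a hedged sketch of the intended result, not a fix for the UnsatisfiedLinkError above, which indicates that the native avutil library could not be loaded on the device; the paths mirror the ones used in the Java code, with /sdcard standing in for the external-storage root:

      # Loop a single JPEG for the duration of the MP3 and mux both into an MP4
      ffmpeg -loop 1 -framerate 30 -i /sdcard/DCIM/Camera/1.jpg \
             -i /sdcard/bluetooth/abcd.mp3 \
             -c:v mpeg4 -c:a aac -pix_fmt yuv420p -shortest test_sham.mp4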

  • yet another screenshot encoding exercise with ffmpeg - stuck at getting AVFrame from ATL::CImage - VC++

    11 September 2013, by sith

    Total AV newbie here, trying to learn the ropes of using FFmpeg functions to encode movies. While searching for tutorials I found a few similar questions that I have linked here for reference:

    Encoding a screenshot into a video using FFMPEG

    [Libav-user] Encoding a screenshot into a video using FFMPEG

    Save bitmap to video (libavcodec ffmpeg)

    When converting from RGB to YUV using ffmpeg the video file the color is spread why ?

    How to convert RGB from YUV420p for ffmpeg encoder ?

    Encode bmp sequence with libavcodec...Help !

    Not able to encode image with ffmpeg

    For my setup, FFmpeg is used from VS12 (VC++) with MFC on Windows 7.

    With the help of the above samples, I am able to get "some" output from the encoder, but I am not sure in what format or state the output has been encoded. Neither VLC nor WMP can play this file; they do not even recognize the metadata in the file to display the FPS or video length. What would normally cause that? Also, any pointers on what could be going wrong and how to approach fixing the problems would be great.

    Here is the flow of my code:

    Step 1: capture the desktop into a CImage:

    int W = GetSystemMetrics(SM_CXSCREEN), H = GetSystemMetrics(SM_CYSCREEN), bpp = 24;
    CImage cImg; cImg.Create(W, H, bpp);
    HDC hDC = cImg.GetDC();
    CWindowDC winDC(GetDesktopWindow());

    BitBlt(hDC, 0, 0, W, H, winDC.m_hDC, 0, 0, SRCCOPY);  // copy the full desktop into the CImage

    At this point I am able to dump a screenshot into a BMP file
    using cImg.Save(_T("test.bmp"), Gdiplus::ImageFormatBMP);

    Step 2: extract the BMP bits from the CImage.

    HBITMAP hBitmap = (HBITMAP)cImg;
    HDC memDC = CreateCompatibleDC(NULL);
    SelectObject( memDC, hBitmap );

    BITMAPINFO bmi; // initialized bmi with {W,-H, plane=1, bitCount=24, comp=BI_RGB, size=W*H*3 }
    << removed bmi init code for conciseness >>

    BYTE *rgb24Data = new BYTE[W*H*3]; // 3 for 24bpp. 4 for 32...
    int ret = GetDIBits(memDC, hBitmap, 0, H, rgb24Data, &bmi, DIB_RGB_COLORS);

    At this point I faithfully believe rgb24Data points to pixel data :) - copied out of the cImg bitmap

    Step 3: next I try to create an AVFrame from the rgb24Data obtained from this CImage. This is also where I have a massive knowledge gap.

    // set up the codecs and contexts here as per mohM's post

    AVCodec *currCodec = avcodec_find_encoder(CODEC_ID_MPEG4);

    AVCodecContext *codeCtxt = avcodec_alloc_context();  // init this with bitrate=400k, W, H,
    << removed codeCtxt init code for conciseness >>     // time base 1/25, gop=10, max_b=1, fmt=YUV420

    avcodec_open(codeCtxt, currCodec);

    SwsContext *currSWSCtxt = sws_getContext( W, H, AV_PIX_FMT_RGB24, // FROM
                                             W, H, AV_PIX_FMT_YUV420P, // TO
                                             SWS_FAST_BILINEAR,
                                             NULL, NULL, NULL);

    // allocate and fill AVFrame
    int numBytes = avpicture_get_size(PIX_FMT_YUV420P, W, H);
    uint8_t *buffer=new uint8_t[numBytes];
    AVFrame *avFrame = avcodec_alloc_frame();
    avpicture_fill( (AVPicture*)avFrame, buffer, PIX_FMT_YUV420P, W, H );

    Step 4: transform the data into YUV420P as we fill the frame.

    uint8_t * inData[1] = { rgb24Data };
    int inLinesize[1] = { 3*W }; // RGB stride
    sws_scale( currSWSCtxt, inData, inLinesize, 0, H,
              avFrame->data, avFrame->linesize);

    Step 5: encode the frame and write the output buffer out to a file.

    int out_size = avcodec_encode_video( codeCtxt,
                                        outBuf,
                                        outBufSize,
                                        avFrame );

    fwrite(outBuf, 1, outBufSize, outFile );

    Finally I close the file off with [0x00 0x00 0x01 0xb7].
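
    Because the encoded packets are written straight to a file and only terminated with the 0x000001b7 sequence end code, the output is a raw elementary video stream with no container around it; the FPS and duration that VLC or WMP display live in a container (MP4, AVI, ...), not in the raw bitstream, which is one plausible reason no metadata is shown. A hedged command-line sketch for checking this, with placeholder file names:

      # Probe what the encoder actually produced
      ffprobe capture_raw.m4v

      # Wrap a raw MPEG-4 elementary stream into an MP4 container without re-encoding
      ffmpeg -f m4v -framerate 25 -i capture_raw.m4v -c copy capture.mp4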

    The first hint that things have gone haywire is that 50 screens of 1920x1080 at 24 bpp, encoded at 25 fps, give me a 507 MB unplayable MPEG file.


    Any guidance is much appreciated.