Advanced search

Media (0)

Word: - Tags - /diogene

No media matching your criteria is available on the site.

Other articles (21)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (HTML, CSS), LaTeX, Google Earth and (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable MediaSPIP release.
    Its official release date is June 21, 2013, and it is announced here.
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

On other sites (5614)

  • Watson NarrowBand Speech to Text not accepting ogg file

    19 January 2017, by Bob Dill

    A NodeJS app uses ffmpeg to create ogg files from mp3 and mp4 sources. If the source file is broadband, Watson Speech to Text accepts the file with no issues. If the source file is narrowband, Watson Speech to Text fails to read the ogg file. I've tested the output from ffmpeg, and the narrowband ogg file has the same audio content (e.g. I can listen to it and hear the same people) as the mp3 file. Yes, in advance: I am changing the call to Watson to correctly specify the model and content_type. Code follows:

    exports.createTranscript = function(req, res, next) {
      var _name = getNameBase(req.body.movie);
      var _type = getType(req.body.movie);
      var _voice = (_type == "mp4") ? "en-US_BroadbandModel" : "en-US_NarrowbandModel";
      var _contentType = (_type == "mp4") ? "audio/ogg" : "audio/basic";
      var _audio = process.cwd() + "/HTML/movies/" + _name + 'ogg';
      var transcriptFile = process.cwd() + "/HTML/movies/" + _name + 'json';

      speech_to_text.createSession({model: _voice}, function(error, session) {
        if (error) { console.log('error:', error); }
        else {
          var params = {
            content_type: _contentType,
            continuous: true,
            audio: fs.createReadStream(_audio),
            session_id: session.session_id
          };
          speech_to_text.recognize(params, function(error, transcript) {
            if (error) { console.log('error:', error); }
            else {
              fs.writeFile(transcriptFile, JSON.stringify(transcript), function(err) {
                if (err) { console.log(err); }
              });
              res.send(transcript);
            }
          });
        }
      });
    }

    _type is either mp3 (narrowband from phone recording) or mp4 (broadband)
    model: _voice has been traced to ensure correct setting
    content_type: _contentType has been traced to ensure correct setting

    Any ogg file submitted to Speech to Text with narrowband settings fails with Error: No speech detected for 30s. I have tested with both real narrowband files and with a broadband ogg file (created from mp4) submitted as narrowband; the error message is the same. What am I missing?
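
    One way to narrow this down (a hedged suggestion, not part of the original post) is to confirm what ffmpeg actually wrote into the narrowband ogg before suspecting the recognize call itself; the codec and sample rate it reports are what the narrowband model ultimately sees. The sketch below assumes ffprobe is installed on the server and reuses the _audio path from the code above; probeAudio is a hypothetical helper name.

    // Hypothetical diagnostic helper (not in the original post): print the codec
    // and sample rate of the generated ogg so the narrowband assumption can be
    // checked against what ffmpeg really produced.
    var execFile = require('child_process').execFile;

    function probeAudio(file, cb) {
      execFile('ffprobe', [
        '-v', 'error',
        '-select_streams', 'a:0',
        '-show_entries', 'stream=codec_name,sample_rate',
        '-of', 'default=noprint_wrappers=1',
        file
      ], function (err, stdout) {
        if (err) { return cb(err); }
        cb(null, stdout.trim());   // e.g. "codec_name=vorbis\nsample_rate=8000"
      });
    }

    // Usage: probeAudio(_audio, function (err, info) { console.log(err || info); });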

  • ffmpeg transpose corrupts video [on hold]

    15 December 2016, by Victor.dMdB

    I'm trying to set up a transcoding pipeline with the ffmpeg C libraries, but if I transpose the video it comes out corrupted, as shown below.

    If I don't transpose, the video is fine, i.e. the rest of the pipeline is set up correctly.

    I'm not sure what the actual issue is. Is it a problem with the pixel format? Why does the transpose corrupt the video stream? Is there something wrong with my code (added below)?

    [Screenshot: the video is rotated but corrupted.]

    The constructor (code was taken from here)

    MyFilter::MyFilter(const std::string filter_desc, AVCodecContext *data_ctx) {
      avfilter_register_all();
      buffersrc_ctx = NULL;
      buffersink_ctx = NULL;

      filter_graph = avfilter_graph_alloc();

      AVFilter *buffersink = avfilter_get_by_name("buffersink");
      if (!buffersink) {
        throw error("filtering sink element not found\n");
      }

      if (avfilter_graph_create_filter(&buffersink_ctx, buffersink, "out", NULL, NULL, filter_graph) < 0) {
        throw error("Cannot create buffer sink\n");
      }

      filterInputs = avfilter_inout_alloc();
      filterInputs->name       = av_strdup("out");
      filterInputs->filter_ctx = buffersink_ctx;
      filterInputs->pad_idx    = 0;
      filterInputs->next       = NULL;

      AVFilter *buffersrc = avfilter_get_by_name("buffer");
      if (!buffersrc) {
        throw error("filtering source element not found\n");
      }

      char args[512];
      snprintf(args, sizeof(args), "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
               data_ctx->width, data_ctx->height, data_ctx->pix_fmt,
               data_ctx->time_base.num, data_ctx->time_base.den,
               data_ctx->sample_aspect_ratio.num, data_ctx->sample_aspect_ratio.den);

      log(Info, "Setting filter input with %s", args);

      if (avfilter_graph_create_filter(&buffersrc_ctx, buffersrc, "in", args, NULL, filter_graph) < 0) {
        throw error("Cannot create buffer source\n");
      }

      filterOutputs = avfilter_inout_alloc();
      filterOutputs->name       = av_strdup("in");
      filterOutputs->filter_ctx = buffersrc_ctx;
      filterOutputs->pad_idx    = 0;
      filterOutputs->next       = NULL;

      if ((avfilter_graph_parse(filter_graph, filter_desc.c_str(), filterInputs, filterOutputs, NULL)) < 0)
        log(Warning, "Could not parse input filters");

      if ((avfilter_graph_config(filter_graph, NULL)) < 0)
        log(Warning, "Could not configure filter graph");
    }

    And the process

    AVFrame *MyFilter::process(AVFrame *inFrame) {
      if (av_buffersrc_add_frame_flags(buffersrc_ctx, inFrame, AV_BUFFERSRC_FLAG_PUSH | AV_BUFFERSRC_FLAG_KEEP_REF) < 0) {
        throw error("Error while feeding the filtergraph\n");
      }

      AVFrame *outFrame = av_frame_alloc();
      if (av_buffersink_get_frame(buffersink_ctx, outFrame) < 0) {
        throw error("Couldn't find a frame\n");
      }
      return outFrame;
    }

    And the filter I'm using is:

    std::string filter_desc = "transpose=cclock";

    As an extra note, it seems like the top bar (visible in the screen capture above) is actually composed of properly rotated pixels, and this holds for the whole video. It just degrades for the remaining 99% of the pixels.

    EDIT:

    Using std::string filter_desc = "rotate=1.58" works, but then the resolution is not swapped accordingly.
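
    Since a transpose swaps the output geometry, one thing worth checking (a hedged sketch, not taken from the question) is the width, height and linesize of the frame the buffersink actually returns; any scaler or encoder downstream that is still configured with the decoder's original dimensions would read the buffer with the wrong stride and produce exactly this kind of smearing. The fields used below are standard AVFrame members; "filter" is a hypothetical MyFilter instance and inFrame a decoded frame from the caller.

    // Hypothetical debugging snippet (not from the original code): inspect the
    // geometry of the frame coming out of the filter graph. After "transpose",
    // width and height are swapped relative to data_ctx, so anything downstream
    // that still uses data_ctx->width/height will corrupt the picture.
    AVFrame *out = filter.process(inFrame);
    av_log(NULL, AV_LOG_INFO, "filtered frame: %dx%d fmt=%d linesize[0]=%d\n",
           out->width, out->height, out->format, out->linesize[0]);
    // Any later sws_getContext()/encoder configuration should use out->width,
    // out->height and (AVPixelFormat)out->format rather than the decoder values.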

  • ffmpeg Exception: Working Directory: null Environment: null

    13 December 2016, by Dylan

    While creating an Android app, I am trying to crop a video using the ffmpeg library and store it in the app directory. I created the command:

    String command = "ffmpeg -i /storage/emulated/0/DMC/diamondVideo.mp4 -vf crop=471:592:162:462 -c:a copy /storage/sdcard0/DMC/diamondVideoCropped.mp4";
    String[] cmd = command.toString().split(" ");

    After that I pass this command to the AsyncTask:

    new CropVideoTask().execute(cmd);

    And in my AsyncTask I execute this command:

    private class CropVideoTask extends AsyncTask<String[], Void, Void> {
        protected Void doInBackground(String[]... cmd) {
            FFmpeg ffmpeg = FFmpeg.getInstance(getContext());
            try {
                ffmpeg.execute(cmd[0], new ExecuteBinaryResponseHandler() {

                    @Override
                    public void onStart() {}

                    @Override
                    public void onProgress(String message) {}

                    @Override
                    public void onFailure(String message) {}

                    @Override
                    public void onSuccess(String message) {}

                    @Override
                    public void onFinish() {}

                });
            } catch (FFmpegCommandAlreadyRunningException e) {
                // Handle if FFmpeg is already running
            }
            return null;
        }
    }

    After the code enters the onStart method, it throws an exception:

    12-13 15:43:02.832 28941-32324/com.example.dmc E/FFmpeg: Exception while trying to run: [Ljava.lang.String;@42851b38
    java.io.IOException: Error running exec(). Command: [/data/data/com.studioidan.dmc/files/ffmpeg, ffmpeg, -i, /storage/sdcard0/DMC/diamondVideo.mp4, -vf, crop=471:592:162:462, -c:a, copy, /storage/sdcard0/DMC/diamondVideoCropped.mp4] Working Directory: null Environment: null
        at java.lang.ProcessManager.exec(ProcessManager.java:211)
        at java.lang.Runtime.exec(Runtime.java:168)
        at java.lang.Runtime.exec(Runtime.java:123)
        at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
        at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
        at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
        at android.os.AsyncTask$2.call(AsyncTask.java:287)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
        at java.util.concurrent.FutureTask.run(FutureTask.java:137)
        at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:230)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
        at java.lang.Thread.run(Thread.java:856)
    Caused by: java.io.IOException: No such file or directory
        at java.lang.ProcessManager.exec(Native Method)
        at java.lang.ProcessManager.exec(ProcessManager.java:209)
        at java.lang.Runtime.exec(Runtime.java:168)
        at java.lang.Runtime.exec(Runtime.java:123)
        at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
        at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
        at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
        at android.os.AsyncTask$2.call(AsyncTask.java:287)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:305)
        at java.util.concurrent.FutureTask.run(FutureTask.java:137)
        at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:230)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1076)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:569)
        at java.lang.Thread.run(Thread.java:856)

    Manifest.xml contains the permission.

    Did I miss something?
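
    Two details stand out in the log above (a hedged observation, not a confirmed fix): the exec command already contains the library's own binary path followed by a second "ffmpeg" token, and the IOException ("No such file or directory") is raised before ffmpeg even parses its arguments. A minimal sketch of what one might try instead, assuming the same libffmpeg imports and Context as the code above: load the bundled binary with loadBinary() before calling execute(), and drop the leading "ffmpeg" word from the command string, since the library prepends the binary itself.

    // Hypothetical adjustment (not a confirmed fix): the library prepends its own
    // binary path, so the command should not start with "ffmpeg", and loadBinary()
    // should have completed successfully before execute() is called.
    FFmpeg ffmpeg = FFmpeg.getInstance(getContext());
    try {
        ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
            @Override
            public void onFailure() {
                Log.e("FFmpeg", "ffmpeg binary could not be loaded on this device");
            }
        });
    } catch (FFmpegNotSupportedException e) {
        Log.e("FFmpeg", "ffmpeg is not supported on this architecture", e);
    }

    String command = "-i /storage/emulated/0/DMC/diamondVideo.mp4"
            + " -vf crop=471:592:162:462 -c:a copy"
            + " /storage/sdcard0/DMC/diamondVideoCropped.mp4";
    new CropVideoTask().execute(command.split(" "));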