
Other articles (61)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into other languages, allowing it to spread to new linguistic communities.
    To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

  • Frequent problems

    10 March 2010

    PHP with safe_mode enabled
    One of the main sources of problems stems from the PHP configuration, in particular from safe_mode being enabled.
    The solution is either to disable safe_mode or to place the script in a directory that Apache can access for the site.

  • Libraries and binaries specific to video and audio processing

    31 January 2010

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFMpeg: the main encoder, able to transcode almost any type of video or audio file into formats playable on the Internet. See this tutorial for its installation; Oggz-tools: tools for inspecting ogg files; Mediainfo: retrieves information from most video and audio formats;
    Optional complementary binaries: flvtool2: (...)

On other sites (11849)

  • FFMPEG: RGB to YUV conversion by the ffmpeg binary and by C++ code give different results

    15 March 2015, by muocdich

    I am trying to convert RGB frames (PPM format) to the YUV420P format using ffmpeg. To make sure my C++ code is correct, I compared its output with the one created by this command (using the same BILINEAR filter):
    ffmpeg -start_number 1 -i data/test512x512%d.ppm -sws_flags 'bilinear' -pix_fmt yuv420p data/test-yuv420p.yuv

    My code:

    static unsigned char *readPPM(int i)
    {
     FILE *pF;
     unsigned char *imgRGB;
     unsigned char *imgBGR;
     int w, h;
     int c;
     int bit;
     char buff[16];

     char *filename;
     asprintf(&filename, "test512x512%d.ppm", i);
     pF = fopen(filename, "rb");

     if (!pF) {
       free(filename);
       return nullptr;
     }
     if (!fgets(buff, sizeof(buff), pF)) {
       fclose(pF);
       free(filename);
       return nullptr;
     }
     if (buff[0] != 'P' || buff[1] != '6') {
       fprintf(stderr, "Invalid image format (must be 'P6')\n");
       fclose(pF);
       free(filename);
       return nullptr;
     }
     // skip comment lines
     c = getc(pF);
     while (c == '#') {
       while (getc(pF) != '\n')
         ;
       c = getc(pF);
     }
     ungetc(c, pF);
     // read size
     if (fscanf(pF, "%d %d", &w, &h) != 2) {
       fprintf(stderr, "Invalid image size (error loading '%s')\n", filename);
       fclose(pF);
       free(filename);
       return nullptr;
     }
     // read the maximum colour value
     if (fscanf(pF, "%d", &bit) != 1) {
       fprintf(stderr, "Invalid rgb component (error loading '%s')\n", filename);
       exit(1);
     }
     // consume the single whitespace character that separates the header
     // from the pixel data; otherwise it is read as the first pixel byte
     getc(pF);

     imgRGB = (unsigned char *)malloc(3 * h * w);
     imgBGR = (unsigned char *)malloc(3 * h * w);
     // read pixel data from file
     int length = fread(imgBGR, sizeof(unsigned char) * 3, w * h, pF);
     if (length != w * h) {
       fprintf(stderr, "Error loading image '%s'\n", filename);
       fclose(pF);
       free(filename);
       return nullptr;
     }
     free(filename); // freed only after its last use in the error messages

     // PPM already stores RGB, so a plain copy suffices
     int start = 0;
     for (int p = 0; p < h * w; p++) {
       imgRGB[start]     = imgBGR[start];
       imgRGB[start + 1] = imgBGR[start + 1];
       imgRGB[start + 2] = imgBGR[start + 2];
       start += 3;
     }

     fclose(pF);
     free(imgBGR);
     return imgRGB;
    }

    void Test_FFMPEG::FillFrame(uint8_t *pic, int index)
    {
     avpicture_fill((AVPicture *)RGBFrame, pic, AV_PIX_FMT_RGB24, encodeContext->width, encodeContext->height);

     struct SwsContext *fooContext = sws_getContext(encodeContext->width, encodeContext->height,
                                                    AV_PIX_FMT_RGB24,
                                                    encodeContext->width, encodeContext->height,
                                                    AV_PIX_FMT_YUV420P,
                                                    SWS_BILINEAR, nullptr, nullptr, nullptr);
     sws_scale(fooContext, RGBFrame->data, RGBFrame->linesize, 0, encodeContext->height, OrgFrame->data, OrgFrame->linesize);
     sws_freeContext(fooContext); // was previously leaked on every frame

     OrgFrame->pts = index;
    }

    The comparison result is not good. There are slight differences in Y and V, but large ones in U. I cannot post my images, but part of the Y plane appears in the U image, which shifts the colors slightly.

    Can you tell me where my error is? Thank you.
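    When two raw yuv420p dumps are compared, both sides must agree on the plane layout: for a w×h frame, the Y plane is w*h bytes, followed by U and V planes of (w/2)*(h/2) bytes each. If the comparison (or either writer) uses the wrong offsets, luma data shows up in the U image, much like the symptom described above. A minimal layout check, sketched in Go to match the other examples on this page (`planeSizes` is a hypothetical helper, not from the question):

```go
package main

import "fmt"

// planeSizes returns the byte sizes of the Y, U and V planes of a
// w×h YUV420p frame: chroma is subsampled 2x2, so U and V each hold
// a quarter of the luma sample count.
func planeSizes(w, h int) (y, u, v int) {
	y = w * h
	u = (w / 2) * (h / 2)
	v = u
	return y, u, v
}

func main() {
	// For the 512x512 test frames from the question:
	y, u, v := planeSizes(512, 512)
	fmt.Println(y, u, v)   // plane sizes in bytes
	fmt.Println(y, y+u)    // byte offsets of the U and V planes
	fmt.Println(y + u + v) // total frame size: 1.5 bytes per pixel
}
```

    Comparing the planes at these offsets, rather than byte-for-byte from offset zero, quickly shows whether the two files disagree on layout or on pixel values.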

  • ffprobe reports different video durations when reading from a pipe vs. from the file system

    5 February 2024, by alex

    I have a method that converts a video file. After processing, I pass the bytes through a pipe to a method that reads the file's meta information. In that case I get the wrong video duration, 8.22; but if I save the file to the file system and read it to get the meta information, the result is 15.85. Why is this happening?

    


    Video convert method:

    


    // ConvertVideoWithPath converts a video file specified by its path using FFmpeg.
// It returns the converted video data and any error that occurred during conversion.
func (f *FFmpeg) ConvertVideoWithPath(filePath string) (bytes []byte, err error) {
    if filePath == "" {
        return nil, ErrEmptyPath
    }

    // Create a CmdRunner instance for executing FFmpeg.
    commander := &CmdRunner{}
    commander.Command = "ffmpeg"
    args := []string{
        "-loglevel", "fatal",
        "-i", filePath,
        "-y",
        "-filter:v", "crop=trunc(iw/2)*2:trunc(ih/2)*2",
        "-c:v", f.videoCodec, // libx264
        "-c:a", f.audioCodec, // aac
        "-pix_fmt", "yuv420p",
        "-movflags", "frag_keyframe+faststart",
        "-map_metadata", "-1",
        "-crf", "5",
        "-vsync", "2",
        "-bufsize", "15000000",
        "-maxrate", "5000000",
        "-preset", "medium",
        "-f", "mp4",
        "pipe:1",
    }
    commander.Args = args

    // Initialize output pipe.
    reader := commander.InitStdOutPipe()

    // Use WaitGroup to synchronize goroutines.
    wg := &sync.WaitGroup{}
    wg.Add(1)

    // Goroutine for reading data from the output pipe.
    go func() {
        defer reader.Close()
        defer wg.Done()

        // Read data from the output pipe.
        data, _ := io.ReadAll(reader)
        // Safely update the 'bytes' variable.
        f.mutex.Lock()
        bytes = data
        f.mutex.Unlock()
    }()

    // Run the FFmpeg command with pipes and wait for completion.
    err = <-commander.RunWithPipe()
    wg.Wait()

    return
}


    


    // MetadataWithReader retrieves metadata from media data provided by an io.Reader using FFprobe.
// It returns the metadata and any error that occurred during metadata retrieval.
func (f *FFmpeg) MetadataWithReader(fileBytes io.Reader) (*Metadata, error) {
    if fileBytes == nil {
        return nil, ErrInvalidArgument
    }

    // Create a CmdRunner instance for executing FFprobe.
    commander := &CmdRunner{}
    commander.Command = "ffprobe"
    args := []string{
        "-loglevel", "fatal",
        "-i", "pipe:0",
        "-print_format", "json",
        "-show_format", "-show_streams",
        "-show_error",
    }
    commander.Args = args

    // Get output data from FFprobe with pipes.
    err := commander.GetOutputWithPipe(fileBytes)
    if err != nil {
        return nil, err
    }

    // Unmarshal JSON output into a Metadata struct.
    output := &Metadata{}
    err = json.Unmarshal(commander.GetOutput(), output)
    if err != nil {
        return nil, err
    }

    return output, err
}


    


    // MetadataWithPath extracts metadata of a file using FFprobe.
// It returns a Metadata struct or an error if the operation fails.
func (f *FFmpeg) MetadataWithPath(filePath string) (*Metadata, error) {
    if filePath == "" {
        return nil, ErrEmptyPath
    }

    // Create a CmdRunner instance for executing FFprobe.
    commander := &CmdRunner{}
    commander.Command = "ffprobe"
    args := []string{
        "-loglevel", "fatal",
        "-i", filePath,
        "-print_format", "json",
        "-show_format", "-show_streams", "-show_error",
    }
    commander.Args = args
    buffer := bytes.NewBuffer([]byte{})
    commander.StdOutWriter = buffer

    err := commander.Run()
    if err != nil {
        return nil, err
    }

    // Unmarshal JSON output into a Metadata struct.
    output := &Metadata{}
    err = json.Unmarshal(buffer.Bytes(), output)
    if err != nil {
        return nil, err
    }

    return output, nil
}


    


    The source code of the CmdRunner library can be found here (link), so as not to overload the question with a large piece of code.

    


    Unit test code

    


    t.Run("convert video", func(t *testing.T) {
        ffmpeg := NewFFmpeg("aac", "libx264", "24M", "12M")

        filePath := "../../test/testdata/input_video_ts.mp4"
        firstMeta, err := ffmpeg.MetadataWithPath(filePath)
        assert.NoError(t, err)
        fmt.Print("first meta duration: ", firstMeta.Format.DurationSeconds) // 15.75

        outFile := "../../test/testdata/output_mp4.mp4"
        newVideoOut, err := ffmpeg.ConvertVideoWithPath(filePath)
        assert.NoError(t, err)
        assert.NotEmpty(t, newVideoOut)

        meta, err := ffmpeg.MetadataWithReader(bytes.NewBuffer(newVideoOut))
        assert.NoError(t, err)
        assert.NotEmpty(t, meta)

        err = os.WriteFile(outFile, newVideoOut, 0644)
        assert.NoError(t, err)
        assert.FileExists(t, outFile)

        fmt.Print("meta duration: ", meta.Format.DurationSeconds) // 8.22

        secondMeta, err := ffmpeg.MetadataWithPath(outFile)
        assert.NoError(t, err)
        fmt.Print("second meta duration: ", secondMeta.Format.DurationSeconds) //15.85

        err = os.Remove(outFile)
        assert.NoError(t, err)
    })


    


  • Rotating a video during encoding with ffmpeg and libav API results in half of video corrupted

    11 May 2020, by Daniel Kobe

    I'm using the C API for ffmpeg/libav to rotate a vertically filmed iPhone video during the encoding step. There are other questions asking how to do something similar, but they all use the CLI tool.

    



    So far I have figured out how to use AVFilter to rotate the video, based on this example https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/filtering_video.c

    



    The problem is that half the output file is corrupt.
Corrupt Video Screenshot

    



    Here is the code for my encoding logic. It is written in Go, using cgo to interface with the C API.

    



    // Encode encodes an AVFrame and returns it
func Encode(enc Encoder, frame *C.AVFrame) (*EncodedFrame, error) {
    ctx := enc.Context()

    if ctx.buffersrcctx == nil {
        // initialize filter
        outputs := C.avfilter_inout_alloc()
        inputs  := C.avfilter_inout_alloc()
        m_pFilterGraph := C.avfilter_graph_alloc()
        buffersrc := C.avfilter_get_by_name(C.CString("buffer"))
        argsStr := fmt.Sprintf("video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d", ctx.avctx.width, ctx.avctx.height, ctx.avctx.pix_fmt, ctx.avctx.time_base.num, ctx.avctx.time_base.den, ctx.avctx.sample_aspect_ratio.num, ctx.avctx.sample_aspect_ratio.den)
        Log.Info.Println("yakotest")
        Log.Info.Println(argsStr)
        args := C.CString(argsStr)
        ret := C.avfilter_graph_create_filter(&ctx.buffersrcctx, buffersrc, C.CString("my_buffersrc"), args, nil, m_pFilterGraph)
        if ret < 0 {
            Log.Info.Printf("\n problem creating filter %v\n", AVError(ret).Error())
        }

        buffersink := C.avfilter_get_by_name(C.CString("buffersink"))
        ret = C.avfilter_graph_create_filter(&ctx.buffersinkctx, buffersink, C.CString("my_buffersink"), nil, nil, m_pFilterGraph)
        if ret < 0 {
            Log.Info.Printf("\n problem creating filter %v\n", AVError(ret).Error())
        }

        /*
         * Set the endpoints for the filter graph. The filter_graph will
         * be linked to the graph described by filters_descr.
         */

        /*
         * The buffer source output must be connected to the input pad of
         * the first filter described by filters_descr; since the first
         * filter input label is not specified, it is set to "in" by
         * default.
         */
        outputs.name       = C.av_strdup(C.CString("in"))
        outputs.filter_ctx = ctx.buffersrcctx
        outputs.pad_idx    = 0
        outputs.next       = nil

        /*
         * The buffer sink input must be connected to the output pad of
         * the last filter described by filters_descr; since the last
         * filter output label is not specified, it is set to "out" by
         * default.
         */
        inputs.name       = C.av_strdup(C.CString("out"))
        inputs.filter_ctx = ctx.buffersinkctx
        inputs.pad_idx    = 0
        inputs.next       = nil

        ret = C.avfilter_graph_parse_ptr(m_pFilterGraph, C.CString("transpose=clock,scale=-2:1080"),
            &inputs, &outputs, nil)
        if ret < 0 {
            Log.Info.Printf("\n problem with avfilter_graph_parse %v\n", AVError(ret).Error())
        }

        ret = C.avfilter_graph_config(m_pFilterGraph, nil)
        if ret < 0 {
            Log.Info.Printf("\n problem with graph config %v\n", AVError(ret).Error())
        }
    }

    filteredFrame :=  C.av_frame_alloc()

    /* push the decoded frame into the filtergraph */
    ret := C.av_buffersrc_add_frame_flags(ctx.buffersrcctx, frame, C.AV_BUFFERSRC_FLAG_KEEP_REF)
    if ret < 0 {
        Log.Error.Printf("\nError while feeding the filter graph, err = %v\n", AVError(ret).Error())
        return nil, errors.New(ErrorFFmpegCodecFailure)
    }

    /* pull filtered frames from the filtergraph */
    for {
        ret = C.av_buffersink_get_frame(ctx.buffersinkctx, filteredFrame)
        if ret == C.AVERROR_EAGAIN || ret == C.AVERROR_EOF {
            break
        }
        if ret < 0 {
            Log.Error.Printf("\nCouldn't find a frame, err = %v\n", AVError(ret).Error())
            return nil, errors.New(ErrorFFmpegCodecFailure)
        }

        filteredFrame.pts = frame.pts
        frame = filteredFrame
        defer C.av_frame_free(&filteredFrame)
    }

    if frame != nil {
        frame.pict_type = 0 // reset pict type for the encoder
        if C.avcodec_send_frame(ctx.avctx, frame) != 0 {
            Log.Error.Printf("%+v\n", StackErrorf("codec error, could not send frame"))
            return nil, errors.New(ErrorFFmpegCodecFailure)
        }
    }

    for {
        ret := C.avcodec_receive_packet(ctx.avctx, ctx.pkt)
        if ret == C.AVERROR_EAGAIN {
            break
        }
        if ret == C.AVERROR_EOF {
            return nil, fmt.Errorf("EOF")
        }
        if ret < 0 {
            Log.Error.Printf("%+v\n", StackErrorf("codec error, receiving packet"))
            return nil, errors.New(ErrorFFmpegCodecFailure)
        }

        data := C.GoBytes(unsafe.Pointer(ctx.pkt.data), ctx.pkt.size)
        return &EncodedFrame{data, int64(ctx.pkt.pts), int64(ctx.pkt.dts),
            (ctx.pkt.flags & C.AV_PKT_FLAG_KEY) != 0}, nil
    }

    return nil, nil
}


    



    It seems like I need to do something with the scaling here, but I'm struggling to find helpful information online.
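    One plausible explanation for the half-corrupted frames, offered as an assumption rather than a verified diagnosis: after `transpose=clock,scale=-2:1080` the frames coming out of the buffersink have swapped, rescaled dimensions, while the encoder context still carries the pre-filter width and height, so avcodec_send_frame consumes the new frames with the old strides. The fix would be to configure the encoder with the dimensions the buffersink negotiated. A small sketch of the dimension change (`afterTranspose` and `afterScale` are hypothetical models of the filters, not libav calls):

```go
package main

import "fmt"

// afterTranspose models FFmpeg's transpose filter: a 90-degree
// rotation swaps width and height.
func afterTranspose(w, h int) (int, int) { return h, w }

// afterScale models scale=-2:targetH: the height is forced to targetH
// and the width follows the input aspect ratio, rounded to an even value.
func afterScale(w, h, targetH int) (int, int) {
	outW := (w*targetH/h + 1) / 2 * 2
	return outW, targetH
}

func main() {
	// A vertically filmed iPhone clip enters the graph as 1080x1920.
	w, h := afterTranspose(1080, 1920)
	w, h = afterScale(w, h, 1080)
	// The encoder must be opened for these dimensions, not 1080x1920.
	fmt.Println(w, h) // 1920 1080
}
```

    If the encoder was opened with the original 1080x1920 geometry, every row of the 1920-wide filtered frame overruns the expected stride, which would produce exactly the kind of large corrupt region shown in the screenshot.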