Advanced search

Media (0)

Word: - Tags -/serveur

No media matching your criteria is available on the site.

Other articles (59)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps that led to the problem; and a link to the site / page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

On other sites (5543)

  • After upgrading ffmpeg, code no longer works to build clips

    11 February, by Tchoune

    I have a problem after upgrading ffmpeg from 4.2.2 to 5.2.2: my code no longer works. When I upload a video from my React-Native application, I get a file-corruption error in my FFmpeg Python agent.
-> the app sends the video to Laravel, which stores it on the MinIO storage; the video is available -> an HTTP call sends the MinIO key to the agent so it can download the mp4 locally, but the video is corrupted on MinIO too...
I have the impression that it's an error while downloading the video locally that corrupts it, but I have no idea how to debug this problem.
If I upload the video directly from my web interface I don't have this problem. The only difference is processClipSynchronously, which is set to True on mobile and False on web.

    


    The Laravel agent sends the request to the Python microservice:

    


// Store uploaded video file
$videoFilePath = $this->storeVideoFile($learningGoal, $videoFile);

// Add video to storyboard
$agentResponse = Http::post(
    $this->agentUrl . 'learning-goals/' . $learningGoal->id . '/storyboards/' . $storyboardId . '/chapters/' . $chapterId . '/videos',
    [
        'clip' => $videoFilePath,
        'processClipSynchronously' => $processClipSynchronously
    ]
);


    


    The Python video agent:

    


@app.route('/learning-goals/<learning_goal_id>/storyboards/<storyboard_id>/chapters/<chapter_id>/videos',
           methods=['POST'])
def post_storyboard_videos(learning_goal_id, storyboard_id, chapter_id):
    storyboard = get_storyboard(learning_goal_id, storyboard_id)
    chapter, position = get_chapter(storyboard, chapter_id)

    if 'clip' in request.get_json():
        chapter['clip'] = request.get_json()['clip']
        if 'duration' in storyboard:
            del chapter['duration']
        if 'thumbnail' in storyboard:
            del chapter['thumbnail']
        if 'ncAudioPreviewPath' in chapter:
            del chapter['ncAudioPreviewPath']
        if 'trim_start' in chapter:
            del chapter['trim_start']
        if 'trim_end' in chapter:
            del chapter['trim_end']
        if 'perform_nc' in chapter:
            del chapter['perform_nc']
    else:
        abort(400)

    new_storyboard = create_new_version_storyboard(storyboard)

    if 'processClipSynchronously' in request.get_json() and request.get_json()['processClipSynchronously']:
        treat_clip(new_storyboard, chapter) #Mobile trigger here
    else:
        thread = StoppableThread(target=treat_clip, args=(new_storyboard, chapter))
        thread.daemon = True
        thread.start()

    chapter, position = get_chapter(new_storyboard, chapter_id)

    return json.loads(dumps(chapter))

def treat_clip(storyboard, chapter):
    logging.info(
        'start treating clip (' + chapter['clip'] + ') for learning goal : ' + str(storyboard['learningGoalId']))
    file = app.config['VOLUME_PATH'] + chapter['clip']
    os.makedirs(dirname(file), exist_ok=True)
    temp_files_to_remove = []

    if not os.path.exists(file):
        # Download file from S3 storage.
        s3.download_file(chapter['clip'], file)
        # Clean the file at the end (it's already in S3).
        temp_files_to_remove.append(file)
    else:
        logging.warn(f'Not downloading {chapter["clip"]} from S3 as it already exists on the filesystem')

    resolution_width, resolution_height = get_resolution(file)
    is_rotated_video = is_rotated(file)
    sample_aspect_ratio = get_sample_aspect_ratio(file)
    frame_rate = get_frame_rate(file)
    if not file.endswith(
            '.mp4') or resolution_width != 1920 or resolution_height != 1080 or is_rotated_video or sample_aspect_ratio != '1:1' or frame_rate > 60:
        chapter['clip'] = format_video(chapter['clip'], resolution_width, resolution_height, frame_rate,
                                       is_rotated_video, str(storyboard['learningGoalId']), 1920, 1080)
        file = app.config['VOLUME_PATH'] + chapter['clip']

        # Update file to S3 storage
        s3.upload_file(file, chapter['clip'])

        # Clean the new file at the end.
        temp_files_to_remove.append(file)

    clip = VideoFileClip(file)
    chapter['duration'] = float(clip.duration)
    thumbnail_relative_path = create_video_thumbnail(storyboard, clip, 0)
    ....


    


    It's VideoFileClip from moviepy that generates the error: "Moov atom not found".
I think S3 doesn't have time to finish downloading the file, which leaves it corrupted, but I don't know how to test or fix that.
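
    The "Moov atom not found" message from moviepy usually means the MP4 on disk is truncated or incomplete, so one way to test the "download corrupts the file" hypothesis is to verify the clip right after s3.download_file, before VideoFileClip opens it. The sketch below compares sizes and lets ffprobe parse the file; the boto3 client, endpoint and bucket name are assumptions (the post uses a custom s3 wrapper), so adapt them to whatever your storage layer exposes.

import os
import subprocess

import boto3

# Assumptions: endpoint URL and bucket name are placeholders; the custom `s3`
# wrapper used in treat_clip may already expose an equivalent of head_object.
S3_BUCKET = "videos"
s3_client = boto3.client("s3", endpoint_url="http://minio:9000")


def verify_downloaded_clip(key, local_path):
    """Raise if the local copy is smaller than the stored object or unreadable."""
    remote_size = s3_client.head_object(Bucket=S3_BUCKET, Key=key)["ContentLength"]
    local_size = os.path.getsize(local_path)
    if local_size != remote_size:
        raise RuntimeError("incomplete download of %s: local=%d, remote=%d"
                           % (key, local_size, remote_size))
    # ffprobe exits non-zero on a truncated MP4 (the same missing moov atom).
    subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration", local_path],
        check=True, capture_output=True)


    Calling verify_downloaded_clip(chapter['clip'], file) right after the download in treat_clip should show whether the object is already short on MinIO (upload side) or only becomes short locally (download side).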

    


    Thanks in advance.

    


  • fftp server stream server error - connection to tcp:10.10.X.X timeout=0 failed: no route to host

    1 July 2023, by slowmonk

    When attempting to stream the webcam using RTSP, I encountered persistent timeout issues.

    


    https://hub.docker.com/r/aler9/rtsp-simple-server

    


    docker run -p 6020:8000 aler9/rtsp-simple-server:latest
2023/07/01 20:15:24 INF MediaMTX v0.23.7
2023/07/01 20:15:24 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2023/07/01 20:15:24 INF [RTMP] listener opened on :1935
2023/07/01 20:15:24 INF [HLS] listener opened on :8888
2023/07/01 20:15:24 INF [WebRTC] listener opened on :8889 (HTTP)

$ sudo ffmpeg -f v4l2 -framerate 24 -video_size 480x480 -i /dev/video0 -f rtsp -rtsp_transport tcp rtsp://192.168.XXX.XXX:8554/webCamStream


    


    The following error occurs:

    


    ffmpeg version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
[video4linux2,v4l2 @ 0x5642e55137c0] The V4L2 driver changed the video from 480x480 to 640x480
[video4linux2,v4l2 @ 0x5642e55137c0] The driver changed the time per frame from 1/24 to 1/30
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 46142.014949, bitrate: 147456 kb/s
  Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (rawvideo (native) -> mpeg4 (native))
Press [q] to stop, [?] for help
[tcp @ 0x5642e55e34c0] Connection to tcp://10.10.1068:9988?timeout=0 failed: No route to host
Could not write header for output file #0 (incorrect codec parameters ?): No route to host
Error initializing output stream 0:0 --
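
    The log shows ffmpeg failing on the plain TCP connect ("No route to host") before any RTSP exchange happens, and the docker run above only publishes 6020:8000 while the MediaMTX log reports its RTSP TCP listener on 8554. A quick way to separate a routing/port-mapping problem from an ffmpeg problem is to test whether the host and port used in the rtsp:// URL are reachable at all. A minimal sketch, assuming HOST is the address from your rtsp:// URL and 8554 is the RTSP TCP port shown in the MediaMTX log:

import socket

# Assumptions: HOST is a placeholder for the machine running rtsp-simple-server;
# PORT is the RTSP TCP listener reported in the MediaMTX log (8554 by default).
HOST = "192.168.0.10"
PORT = 8554

try:
    with socket.create_connection((HOST, PORT), timeout=3):
        print("reachable: %s:%d" % (HOST, PORT))
except OSError as exc:
    print("cannot reach %s:%d: %s" % (HOST, PORT, exc))


    If this check fails the same way, the problem is network routing or the container's published ports rather than the ffmpeg command itself.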


    


  • Getting ffmpeg to work on a nodejs server: Error: ffmpeg exited with code 1: Output #0, wav, to 'output.wav'

    14 May 2023, by Djaenike

    I need to convert an m4a buffer stream to a wav buffer stream. I'm trying to accomplish this through ffmpeg-static and fluent-ffmpeg on a nodejs backend. I'm using multer to process the buffer stream coming from the client, then pass it through a convertM4AToWAV function, which is supposed to return the converted WAV buffer.

    


    Here is the code I'm using to set up ffmpeg and convert the buffer stream.

    


    const ffmpeg = require('fluent-ffmpeg');
const ffmpegStatic = require('ffmpeg-static');
ffmpeg.setFfmpegPath(ffmpegStatic);

...more code here for the api route...

const convertM4AToWAV = async (m4aBuffer) => {
  console.log(`buffer in: ${m4aBuffer}`);
  try {
    const wavBuffer = await ffmpeg(m4aBuffer)
      .toFormat('wav')
      .on('start', (command) => console.log('FFmpeg command:', command))
      .on('end', () => {
        console.log('Conversion complete.');
      })
      .save('output.wav');
    return wavBuffer;
  } catch (error) {
    console.error('Error converting audio buffer stream to wav:', error);
    throw error;
  }
};


    


    And here is the error message I'm seeing:

    


    //I can see the buffer come in
buffer in: ↑ftyp3gp4isom3gp4☺mdat????????.....
//I can see the command
FFmpeg command: ffmpeg -y -f wav output.wav
//Error message thrown
Error transcribing audio: Error: 13 INTERNAL: Request message serialization failure: invalid encoding
    at callErrorFromStatus
...file directory jibberish...
code: 13,
  details: 'Request message serialization failure: invalid encoding',
  metadata: Metadata { internalRepr: Map(0) {}, options: {} },
  note: 'Exception occurred in retry method that was not classified as transient'
}
error - uncaughtException: Error: ffmpeg exited with code 1: Output #0, wav, to 'output.wav':
Output file #0 does not contain any stream


    


    Any thoughts on why this error is thrown? My gut is telling me it's because I haven't configured ffmpeg correctly on the server.