Advanced search

Media (0)

Word: - Tags -/serveur

No media matching your criteria is available on this site.

Other articles (32)

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing software packages and platforms, MediaSPIP aims to handle as many different document formats as possible, whether of type: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or others (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Other interesting software

    13 April 2011, by

    We don’t claim to be the only ones doing what we do, and certainly not the best at it. We simply try to do it well and to keep getting better.
    The following list covers software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to emulate.
    We haven’t used or tested them, but you can take a peek.
    Videopress
    Website : http://videopress.com/
    License : GNU/GPL v2
    Source code : (...)

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site and around MediaSPIP in general, aims to avoid reference to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

On other sites (5020)

  • avformat/rtsp: extend the PATH buffer to 2048

    7 August 2024, by Stefano Mandelli
    avformat/rtsp: extend the PATH buffer to 2048
    

    Recently, I have been seeing an increasing number of users who use ffmpeg
    to retrieve RTSP streams from personal media proxies (e.g. MediaMtx) with
    authorization based on JWT. The current length of PATH does not allow the
    token to be inserted in the URL, so authorization fails and there is no way
    to get the video.

    VLC has just increased its maximum RTSP URL length, which permits using a
    token inside the URL.

    For these reasons, I propose this patch to extend the PATH buffer from 1024 to
    2048 so that tokens and JWT-based authorization can be used.

    Signed-off-by: Marton Balint <cus@passwd.hu>

    • [DH] libavformat/rtsp.c
  • FFmpeg streaming video SpringBoot endpoint does not show video duration in video players

    7 April 2024, by lxluxo23

    It turns out that I've been working on a personal project just out of curiosity. Its main function is to stream video by first encoding it through ffmpeg, then playing that video back from any other device. Call it a very, very primitive "plex".

    Although I achieve my goal, which is to encode and send the video to the devices that request it, the video is sent as if it were a live broadcast: I can only pause it, not seek forward or rewind. Does anyone have any idea what I am doing wrong, or whether I should take some other approach in my service or controller?

    Here are fragments of my code.

    THE CONTROLLER


    @RestController
    @RequestMapping("/api")
    @Log4j2
    public class StreamController {

        @Autowired
        VideoStreamingService videoStreamingService;

        @Autowired
        VideoService videoService;

        @GetMapping("/stream/{videoId}")
        public ResponseEntity<StreamingResponseBody> livestream(@PathVariable Long videoId, @RequestParam(required = false) String codec) {
            Video video = videoService.findVideoById(videoId);
            if (video != null) {
                Codec codecEnum = Codec.fromString(codec);
                return ResponseEntity.ok()
                        .contentType(MediaType.valueOf("video/mp4"))
                        .body(outputStream -> videoStreamingService.streamVideo(video.getPath(), outputStream, codecEnum));
            }
            return ResponseEntity.notFound().build();
        }
    }

    THE SERVICE


    @Service
    public class VideoStreamingService {

        public void streamVideo(String videoPath, OutputStream outputStream, Codec codec) {

            FFmpeg ffmpeg = FFmpeg.atPath()
                    .addArguments("-i", videoPath)
                    .addArguments("-b:v", "5000k")
                    .addArguments("-maxrate", "5000k")
                    .addArguments("-bufsize", "10000k")
                    .addArguments("-c:a", "aac")
                    .addArguments("-b:a", "320k")
                    .addArguments("-movflags", "frag_keyframe+empty_moov+faststart")
                    .addOutput(PipeOutput.pumpTo(outputStream)
                            .setFormat("mp4"))
                    .addArgument("-nostdin");
            if (codec == Codec.AMD) {
                ffmpeg.addArguments("-profile:v", "high");
            }
            ffmpeg.addArguments("-c:v", codec.getFfmpegArgument());
            ffmpeg.execute();
        }
    }

    I have some enums to vary the encoding and use hardware acceleration or not.


    And here is an example from my player; the endpoint is the following:

    http://localhost:8080/api/stream/2?codec=AMD
    (screenshot)

    If there's someone with more knowledge of either FFmpeg or Spring who could help me with this issue, I would greatly appreciate it. I've included the URL of the repository in case anyone would like to review it when answering my question.

    repo


    I tried changing the encoding format. I tried copying the metadata from the original video. I tried sending a custom header. None of this has worked.

    I would like to achieve what many sites do: when you load a video from the network, the player shows how much of that video is already in the local "buffer".

  • Anomaly in raw I420 video generated by GStreamer

    16 April 2024, by Lea

    Situation


    I'm trying to convert RGBA image data to YUV420P in multiple threads, then send this data to a main thread, which splits the data it receives from each thread into separate frames and combines them, in order, into a video. Currently I'm using FFmpeg for this task, but I've found GStreamer to do a quicker job at colorspace conversion than FFmpeg.


    Problem


    The raw video generated by GStreamer does not match the expectations for YUV 4:2:0 planar video data. To test this, I've made a raw RGBA test video of 3 red 4x4 (16 pixel) frames.


    ffmpeg -f lavfi -i color=color=red -t 3 -r 1 -s 4x4 -f rawvideo -pix_fmt rgba ./input.rgba

    Example data


    FFmpeg


    Now, first trying to convert it via FFmpeg, as I'm doing it currently:

    ffmpeg -f rawvideo -pix_fmt rgba -s 4x4 -i input.rgba -f rawvideo -pix_fmt yuv420p ./ffmpeg.yuv420p

    This creates a 72-byte file => 1.5 bytes per pixel, 24 bytes per frame: as expected for yuv420p data.

    $ hexdump -C ./ffmpeg.yuv420p
    00000000  51 51 51 51 50 50 50 50  50 50 50 50 50 50 50 50  |QQQQPPPPPPPPPPPP|
    00000010  5b 5b 5b 5b ee ee ee ee  51 51 51 51 50 50 50 50  |[[[[....QQQQPPPP|
    00000020  50 50 50 50 50 50 50 50  5b 5b 5b 5b ee ee ee ee  |PPPPPPPP[[[[....|
    00000030  51 51 51 51 50 50 50 50  50 50 50 50 50 50 50 50  |QQQQPPPPPPPPPPPP|
    00000040  5b 5b 5b 5b ee ee ee ee                           |[[[[....|

    GStreamer


    Now trying to do the same via GStreamer, with the I420 format, which corresponds to yuv420p as per their documentation:

    gst-launch-1.0 filesrc location=./input.rgba ! rawvideoparse format=rgba width=4 height=4 \
    ! videoconvert ! video/x-raw,format=I420 ! filesink location=./gstreamer.yuv420p

    This creates a 96-byte file => 2 bytes per pixel, 32 bytes per frame (?): unusual for yuv420p data. Additionally, none of the sections match the FFmpeg output, ruling out some kind of padding.

    $ hexdump -C ./gstreamer.yuv420p
    00000000  50 50 50 50 50 50 50 50  50 50 50 50 50 50 50 50  |PPPPPPPPPPPPPPPP|
    00000010  5a 5a 00 00 5a 5a 00 00  ee ee 00 00 ed ed 00 00  |ZZ..ZZ..........|
    00000020  50 50 50 50 50 50 50 50  50 50 50 50 50 50 50 50  |PPPPPPPPPPPPPPPP|
    00000030  5a 5a 63 6b 5a 5a 77 62  ee ee 78 6d ed ed 2c 20  |ZZckZZwb..xm.., |
    00000040  50 50 50 50 50 50 50 50  50 50 50 50 50 50 50 50  |PPPPPPPPPPPPPPPP|
    00000050  5a 5a 00 00 5a 5a 00 00  ee ee 00 00 ed ed 00 00  |ZZ..ZZ..........|

    This output also cannot be interpreted correctly as yuv420p by FFmpeg, leading to corrupted frames when trying to do so:

    ffmpeg -f rawvideo -pix_fmt yuv420p -s 4x4 -i gstreamer.yuv420p -f image2 "./%d.png"

    (Screenshot: corrupted yuv420p frames generated by GStreamer)

    Solution?

    For my personal problem I need a way to chop up raw I420 video generated by GStreamer into separate frames to work with. However, I would also like to understand why GStreamer behaves this way and which key piece I'm missing here.


    Additional notes


    I've ruled out an issue with the input in GStreamer, as piping it to autovideosink leads to a normal result. I'm also aware of multifilesink, but I would like to avoid writing to disk and rather work with the data directly in buffers.
