Advanced search

Media (0)


No media matching your criteria is available on this site.

Other articles (4)

  • Use, discuss, criticize

    13 April 2011, by

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • The plugin: Podcasts

    14 July 2010, by

    The problem with podcasting is, once again, one that reveals the state of standardization of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, heavily geared toward the use of iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably supported by Yahoo and the Miro software.
    File types supported in feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

On other sites (3058)

  • Why Do Some of the Codecs in FFmpeg's Supported Codec List Show "encoders:" or "decoders:" in Parentheses?

    4 May 2020, by spaceman

    If you open the Command Prompt and run ffmpeg -codecs, you will get a long list of the codecs that FFmpeg supports.

    Here's a small sample of the list:
     DEV.L. h261                 H.261
     DEV.L. h263                 H.263 / H.263-1996, H.263+ / H.263-1998 / H.263 version 2
     D.V.L. h263i                Intel H.263
     DEV.L. h263p                H.263+ / H.263-1998 / H.263 version 2
     DEV.LS h264                 H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 (encoders: libx264 libx264rgb)
     D.V.LS hevc                 H.265 / HEVC

    Now, if you briefly go over the whole list, you will see that most codecs appear with just their name and description, but some also include parentheses in the description, specifying "encoders:" or "decoders:".

    For example:

    1)

     DEV.LS h264                 H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 (encoders: libx264 libx264rgb)

    2)

     DEVILS jpeg2000             JPEG 2000 (decoders: jpeg2000 libopenjpeg) (encoders: jpeg2000 libopenjpeg)

    3)

     DEV.L. msmpeg4v3            MPEG-4 part 2 Microsoft variant version 3 (decoders: msmpeg4) (encoders: msmpeg4)

    4)

     DEA.L. aac                  AAC (Advanced Audio Coding) (encoders: aac libvo_aacenc)

    5)

     DEA.L. amr_nb               AMR-NB (Adaptive Multi-Rate NarrowBand) (decoders: amrnb libopencore_amrnb) (encoders: libopencore_amrnb)
     DEA.L. amr_wb               AMR-WB (Adaptive Multi-Rate WideBand) (decoders: amrwb libopencore_amrwb) (encoders: libvo_amrwbenc)

    My question:

    Why do some codecs have those parentheses specifying encoders/decoders, while other (in fact, most) codecs don't?
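
    For anyone who wants to dig further, ffmpeg can also be asked directly which encoder and decoder implementations a build ships with, and which ones back a particular codec entry (a quick sketch; the codec names below are just examples from the list above):

     # list every encoder / decoder implementation in this build
     ffmpeg -encoders
     ffmpeg -decoders

     # show the details of one specific implementation, e.g. the
     # libx264 encoder named in the h264 line above
     ffmpeg -h encoder=libx264
     ffmpeg -h decoder=h264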

    


  • FFmpeg Streaming Video Spring Boot endpoint does not show video duration in video players

    7 April 2024, by lxluxo23

    It turns out that I've been working on a personal project just out of curiosity. The main function is to stream video by first encoding it through ffmpeg, then playing that video back from any other device; call it a very, very primitive "plex".

    Although I achieve my goal, which is to encode and send the video to the devices that request it, the video is sent, so to speak, as a live broadcast: I can only pause it, not seek forward or backward. Does anyone have any idea what I am doing wrong, or whether I should take some other approach in my service or controller?

    


    Here are fragments of my code:

    


    THE CONTROLLER

    


    @RestController
    @RequestMapping("/api")
    @Log4j2
    public class StreamController {

        @Autowired
        VideoStreamingService videoStreamingService;

        @Autowired
        VideoService videoService;

        @GetMapping("/stream/{videoId}")
        public ResponseEntity<StreamingResponseBody> livestream(@PathVariable Long videoId, @RequestParam(required = false) String codec) {
            Video video = videoService.findVideoById(videoId);
            if (video != null) {
                Codec codecEnum = Codec.fromString(codec);
                return ResponseEntity.ok()
                        .contentType(MediaType.valueOf("video/mp4"))
                        .body(outputStream -> videoStreamingService.streamVideo(video.getPath(), outputStream, codecEnum));
            }
            return ResponseEntity.notFound().build();
        }
    }


    THE SERVICE


    @Service
    public class VideoStreamingService {

        public void streamVideo(String videoPath, OutputStream outputStream, Codec codec) {

            FFmpeg ffmpeg = FFmpeg.atPath()
                    .addArguments("-i", videoPath)
                    .addArguments("-b:v", "5000k")
                    .addArguments("-maxrate", "5000k")
                    .addArguments("-bufsize", "10000k")
                    .addArguments("-c:a", "aac")
                    .addArguments("-b:a", "320k")
                    .addArguments("-movflags", "frag_keyframe+empty_moov+faststart")
                    .addOutput(PipeOutput.pumpTo(outputStream)
                            .setFormat("mp4"))
                    .addArgument("-nostdin");
            if (codec == Codec.AMD) {
                ffmpeg.addArguments("-profile:v", "high");
            }
            ffmpeg.addArguments("-c:v", codec.getFfmpegArgument());
            ffmpeg.execute();
        }
    }


    I have some enums to vary the encoding and to use hardware acceleration or not.


    And here is an example from my player; the endpoint is the following:


    http://localhost:8080/api/stream/2?codec=AMD (screenshot)
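
    One way to confirm what players are seeing: ffprobe can read the HTTP stream directly and report the container duration (a diagnostic sketch, assuming the endpoint above is running locally):

    # if this prints N/A (or 0), the mp4 stream carries no usable duration
    # metadata, which is what keeps players from showing a timeline
    ffprobe -v error -show_entries format=duration \
            -of default=noprint_wrappers=1 \
            "http://localhost:8080/api/stream/2?codec=AMD"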


    I'm not sure if there's someone with more knowledge of either FFmpeg or Spring who could help me with this minor issue; I would greatly appreciate it. I've included the URL of the repository in case anyone would like to review it when answering my question.


    repo


    I tried changing the encoding format.
    I tried copying the metadata from the original video.
    I tried sending a custom header.
    None of this has worked.


    I would like to achieve what is mentioned on many sites: when you load a video from the network, the player shows how much of that video you already have in the local "buffer".
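
    A likely culprit is the muxer setup itself: -movflags frag_keyframe+empty_moov makes the mp4 muxer write an empty moov box up front and never record a total duration (it cannot seek back, since the output is a pipe), so players treat the stream as live: pausable, but with no timeline. One alternative, sketched here with placeholder paths, is to pre-encode to a real file and serve that file with HTTP Range support instead of piping:

    # writing to a seekable file lets the muxer finalize the moov atom;
    # +faststart then moves it to the front of the file so players learn
    # the duration before the download finishes
    ffmpeg -i input.mp4 -c:v libx264 -b:v 5000k -c:a aac -b:a 320k \
           -movflags +faststart output.mp4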


  • Streaming UDP packets to two different ports (for video and audio). Video works fine, but the audio does not show

    8 April 2018, by Winston Chen

    I am taking an RTSP stream, splitting the video and audio out, and streaming them to two different ports using GStreamer, so that my ffserver can display the stream in my browser.

    My GStreamer pipeline:

    gst-launch-1.0 -v rtspsrc location=rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov latency=300 timeout=0 drop-on-latency=true rtp-blocksize=4096 name=rtsp_source ! \
     queue ! capsfilter caps="application/x-rtp,media=video" ! rtph264depay ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=9527 rtsp_source. ! \
     queue ! rtpmp4apay pt=97 ! udpsink host=127.0.0.1 port=9327

    Here are the SDP and my ffmpeg command:

    m=video 9527 RTP/AVP 96
    a=rtpmap:96 H264/90000
    c=IN IP4 127.0.0.1

    m=audio 9327 RTP/AVP 97
    a=rtpmap:97 mpeg4-generic/48000/6
    c=IN IP4 127.0.0.1

    ffmpeg -protocol_whitelist "file,tcp,rtp,udp" -i ~/test.sdp -max_muxing_queue_size 1024 http://localhost:8090/feed1.ffm
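
    As a quick sanity check, ffplay can consume the SDP directly, before ffserver is involved at all (a diagnostic sketch, assuming the GStreamer pipeline above is already running; if the audio is missing here too, the problem lies in the pipeline or the SDP rather than in ffserver):

    ffplay -protocol_whitelist "file,udp,rtp" ~/test.sdp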

    And finally, this is my ffserver config (the important part):

    <feed>               # This is the input feed where FFmpeg will send
      File ./feed1.ffm            # video stream.
      FileMaxSize 1GB             # Maximum file size for buffering video
      ACL allow 127.0.0.1         # Allowed IPs
    </feed>

    <stream>              # Output stream URL definition
      Feed feed1.ffm              # Feed from which to receive video
      Format webm

      # NoDefaults
      # NoAudio

      # Audio settings
      AudioCodec vorbis
      AudioBitRate 64             # Audio bitrate

      # Video settings
      VideoCodec libvpx
      VideoSize 240x160           # Video resolution
      VideoFrameRate 10           # Video FPS
      AVOptionVideo flags +global_header  # Parameters passed to encoder
                                          # (same as ffmpeg command-line parameters)

      PreRoll 0
      StartSendOnKey
      VideoGopSize 12
      VideoBitRate 256
    </stream>

    The thing is, if I take away the audio part and apply NoAudio, the video streams fine. However, I cannot get the audio to work. Am I doing anything wrong?
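
    One detail worth double-checking: the SDP advertises the audio as mpeg4-generic (RFC 3640), but rtpmp4apay produces MP4A-LATM payloading (RFC 3016), and the audio branch never depayloads the RTP coming out of rtspsrc in the first place. A sketch of an audio branch that would match the mpeg4-generic rtpmap, assuming the source audio really is AAC (untested):

    gst-launch-1.0 -v rtspsrc location=rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov latency=300 timeout=0 drop-on-latency=true rtp-blocksize=4096 name=rtsp_source ! \
     queue ! capsfilter caps="application/x-rtp,media=video" ! rtph264depay ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=9527 rtsp_source. ! \
     queue ! capsfilter caps="application/x-rtp,media=audio" ! rtpmp4gdepay ! aacparse ! rtpmp4gpay pt=97 ! udpsink host=127.0.0.1 port=9327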