
Other articles (34)
-
The SPIPmotion queue
28 November 2010. A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...) -
Contribute to documentation
13 April 2011. Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation from users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; and translations of existing documentation into other languages.
To contribute, register on the project users' mailing (...) -
HTML5 audio and video support
10 April 2011. MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used instead.
The HTML5 player in use was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (3346)
-
avformat/rtsp: extend the PATH buffer to 2048
7 August 2024, by Stefano Mandelli
Recently, I have been experiencing an increasing number of users who use ffmpeg to retrieve RTSP streams from personal media proxies (e.g. MediaMtx) with authorization based on JWT. The current length of PATH does not allow the token to be inserted into the URL, so authorization fails and there is no way to get the video.
VLC has just enlarged its maximum RTSP URL length, which permits the use of tokens inside the URL.
For these reasons, I propose this patch to extend the PATH buffer from 1024 to 2048 bytes in order to support tokens and the JWT-based authorization process.
Signed-off-by: Marton Balint <cus@passwd.hu>
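For context, the kind of request involved looks like the following (hypothetical host and token; real JWTs routinely push the full URL well past 1024 characters):

ffmpeg -i "rtsp://mediaproxy.example.com:8554/cam1?jwt=eyJhbGciOi...<token continues for over 1024 characters>" -c copy out.mp4

With the previous 1024-byte buffer, a path of that length cannot be carried in full, so the authorization described above fails.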
-
FFmpeg Streaming Video SpringBoot endpoint does not show video duration in video players
7 April 2024, by lxluxo23
It turns out that I've been working on a personal project just out of curiosity.
Its main function is to stream video, by first encoding it through ffmpeg and then playing that video back from any other device.
Call it a very, very primitive "plex".


Although I achieve my goal, which is to encode and send the video to the devices that make the request, the video is sent, so to speak, as a live broadcast:
I can only pause it, with no forward or rewind. Does anyone have any idea what I am doing wrong, or whether I should take some other approach in my service or controller?


Here are some fragments of my code.


THE CONTROLLER


@RestController
@RequestMapping("/api")
@Log4j2
public class StreamController {

    @Autowired
    VideoStreamingService videoStreamingService;

    @Autowired
    VideoService videoService;

    @GetMapping("/stream/{videoId}")
    public ResponseEntity<StreamingResponseBody> livestream(@PathVariable Long videoId,
                                                            @RequestParam(required = false) String codec) {
        Video video = videoService.findVideoById(videoId);
        if (video != null) {
            Codec codecEnum = Codec.fromString(codec);
            return ResponseEntity.ok()
                    .contentType(MediaType.valueOf("video/mp4"))
                    // stream the transcoded bytes to the client as ffmpeg produces them
                    .body(outputStream -> videoStreamingService.streamVideo(video.getPath(), outputStream, codecEnum));
        }
        return ResponseEntity.notFound().build();
    }
}


THE SERVICE


@Service
public class VideoStreamingService {

    public void streamVideo(String videoPath, OutputStream outputStream, Codec codec) {
        // Build the ffmpeg invocation via the Jaffree wrapper
        FFmpeg ffmpeg = FFmpeg.atPath()
                .addArguments("-i", videoPath)
                .addArguments("-b:v", "5000k")
                .addArguments("-maxrate", "5000k")
                .addArguments("-bufsize", "10000k")
                .addArguments("-c:a", "aac")
                .addArguments("-b:a", "320k")
                // fragmented MP4, so the muxer can write to a non-seekable pipe
                .addArguments("-movflags", "frag_keyframe+empty_moov+faststart")
                .addOutput(PipeOutput.pumpTo(outputStream)
                        .setFormat("mp4"))
                .addArgument("-nostdin");
        if (codec == Codec.AMD) {
            ffmpeg.addArguments("-profile:v", "high");
        }
        ffmpeg.addArguments("-c:v", codec.getFfmpegArgument());
        ffmpeg.execute();
    }
}
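For reference, the pipeline built above corresponds roughly to the command line below, with standard output pumped into the servlet's OutputStream (a sketch: video.mp4 stands in for the resolved path, Jaffree manages the pipe plumbing itself, and the actual -c:v value comes from the Codec enum):

ffmpeg -nostdin -i video.mp4 -b:v 5000k -maxrate 5000k -bufsize 10000k -c:a aac -b:a 320k -movflags frag_keyframe+empty_moov+faststart -c:v libx264 -f mp4 pipe:1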



I have some enums to vary the encoding and to enable hardware acceleration or not.
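The enum itself is not included in the question, but a minimal sketch consistent with the calls above (Codec.fromString, Codec.AMD, codec.getFfmpegArgument()) could look like the following; every constant and encoder name except AMD is an assumption:

// Hypothetical reconstruction of the Codec enum referenced above;
// the real project may use different constants and encoder names.
public enum Codec {
    CPU("libx264"),       // assumed software default
    AMD("h264_amf"),      // assumed AMD hardware encoder
    NVIDIA("h264_nvenc"); // assumed NVIDIA hardware encoder

    private final String ffmpegArgument;

    Codec(String ffmpegArgument) {
        this.ffmpegArgument = ffmpegArgument;
    }

    /** Value passed to ffmpeg's -c:v option. */
    public String getFfmpegArgument() {
        return ffmpegArgument;
    }

    /** Tolerant parser for the optional ?codec= request parameter. */
    public static Codec fromString(String value) {
        if (value == null) {
            return CPU; // no ?codec= parameter supplied
        }
        try {
            return valueOf(value.toUpperCase());
        } catch (IllegalArgumentException e) {
            return CPU; // unknown value: fall back to software encoding
        }
    }
}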


And here is an example from my player; the endpoint is the following:


http://localhost:8080/api/stream/2?codec=AMD

[screenshot]

If someone with more knowledge of either FFmpeg or Spring could help me with this minor issue, I would greatly appreciate it. I've included the URL of the repository in case anyone would like to review it when answering my question.




I tried changing the encoding format.
I tried copying the metadata from the original video.
I tried sending a custom header.
None of this has worked.


I would like to achieve what is described on many sites: when you load a video from the network, the player shows how much of that video is already in the local "buffer".


-
Anomaly in raw I420 video generated by GStreamer
16 April 2024, by Lea

Situation


I'm trying to convert RGBA image data to YUV420P in multiple threads, then send this data to a main thread, which splits the data it receives from each thread into separate frames and combines them, in order, into a video. Currently I'm using FFmpeg for this task, but I've found GStreamer to do a quicker job at colorspace conversion than FFmpeg.


Problem


The raw video generated by GStreamer does not match the expectations for YUV 4:2:0 planar video data. To test this, I've made a raw RGBA test video of 3 red 4x4 (16 pixel) frames.


ffmpeg -f lavfi -i color=color=red -t 3 -r 1 -s 4x4 -f rawvideo -pix_fmt rgba ./input.rgba





FFmpeg


Now, first trying to convert it via FFmpeg, as I'm doing it currently:


ffmpeg -f rawvideo -pix_fmt rgba -s 4x4 -i input.rgba -f rawvideo -pix_fmt yuv420p ./ffmpeg.yuv420p



This creates a 72-byte file => 1.5 bytes per pixel, 24 bytes per frame (16 Y bytes plus 4 U and 4 V bytes for a 4x4 frame): as expected for yuv420p data.


$ hexdump -C ./ffmpeg.yuv420p 
00000000 51 51 51 51 50 50 50 50 50 50 50 50 50 50 50 50 |QQQQPPPPPPPPPPPP|
00000010 5b 5b 5b 5b ee ee ee ee 51 51 51 51 50 50 50 50 |[[[[....QQQQPPPP|
00000020 50 50 50 50 50 50 50 50 5b 5b 5b 5b ee ee ee ee |PPPPPPPP[[[[....|
00000030 51 51 51 51 50 50 50 50 50 50 50 50 50 50 50 50 |QQQQPPPPPPPPPPPP|
00000040 5b 5b 5b 5b ee ee ee ee |[[[[....|



GStreamer


Now trying to do the same via GStreamer, with the I420 format, which corresponds to yuv420p as per their documentation:

gst-launch-1.0 filesrc location=./input.rgba ! rawvideoparse format=rgba width=4 height=4 \
! videoconvert ! video/x-raw,format=I420 ! filesink location=./gstreamer.yuv420p



This creates a 96-byte file => 2 bytes per pixel, 32 bytes per frame (?): unusual for yuv420p data. Additionally, none of the sections match the FFmpeg output, ruling out some kind of padding.


$ hexdump -C ./gstreamer.yuv420p 
00000000 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 |PPPPPPPPPPPPPPPP|
00000010 5a 5a 00 00 5a 5a 00 00 ee ee 00 00 ed ed 00 00 |ZZ..ZZ..........|
00000020 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 |PPPPPPPPPPPPPPPP|
00000030 5a 5a 63 6b 5a 5a 77 62 ee ee 78 6d ed ed 2c 20 |ZZckZZwb..xm.., |
00000040 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 50 |PPPPPPPPPPPPPPPP|
00000050 5a 5a 00 00 5a 5a 00 00 ee ee 00 00 ed ed 00 00 |ZZ..ZZ..........|



This output also cannot be interpreted correctly as yuv420p by FFmpeg, leading to corrupted frames when trying to do so:


ffmpeg -f rawvideo -pix_fmt yuv420p -s 4x4 -i gstreamer.yuv420p -f image2 "./%d.png"





Solution?


For my personal problem I need a way to chop up raw I420 video generated by GStreamer into separate frames to work with. However, I would also like to understand why GStreamer behaves this way and which key piece I'm missing here.


Additional notes


I've ruled out an issue with the input in GStreamer, as piping it to autovideosink leads to a normal result. I'm also aware of multifilesink, but I would like to avoid writing to disk and would rather work with the data directly in buffers.