
Media (1)
-
SPIP - plugins - embed code - Exemple
2 September 2013
Updated: September 2013
Language: French
Type: Image
Other articles (106)
-
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Customising by adding your logo, banner or background image
5 September 2013. Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
List of compatible distributions
26 April 2011. The table below lists the Linux distributions compatible with the MediaSPIP automatic installation script.
Distribution name / Version name / Version number
Debian / Squeeze / 6.x.x
Debian / Wheezy / 7.x.x
Debian / Jessie / 8.x.x
Ubuntu / The Precise Pangolin / 12.04 LTS
Ubuntu / The Trusty Tahr / 14.04
If you would like to help us improve this list, you can give us access to a machine running a distribution not listed above, or send us the (...)
On other sites (16192)
-
Cut video into chunks at the nearest keyframe to the nearest scene change (ffmpeg)
11 January 2024, by Toxiro. I want to cut a video into, let's say, approximately 30-minute chunks. I first want to find the nearest scene change every 30 minutes, and then cut at the keyframe nearest to that scene change (without re-encoding).


I guess I can't do that in a single ffmpeg command, but maybe there are some pros here with very helpful tips. Is it possible, for example, to output just the nearest scene change (or maybe just the next scene change) after a given timecode, so that ffmpeg does not need to analyze the whole video?


I hope someone can help. Once I have a solution, probably a bash script, I will post it here.
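
One possible two-step sketch of that idea, assuming an input file named input.mp4 and a rough target point around 00:30:00 (the file name, timestamps, thresholds and window sizes are all placeholders to adjust):

# 1) Look for scene changes only in a small window around the target time;
#    showinfo prints the pts_time of each selected frame to stderr.
#    Because -ss is an input option here, the printed times are relative
#    to the 00:29:00 seek point.
ffmpeg -ss 00:29:00 -t 120 -i input.mp4 \
       -vf "select='gt(scene,0.4)',showinfo" -an -f null - 2>&1 \
  | grep pts_time

# 2) List keyframe timestamps (absolute) in the same window with ffprobe.
ffprobe -v error -read_intervals 00:29:00%+120 -select_streams v:0 \
        -skip_frame nokey -show_entries frame=pts_time -of csv=p=0 input.mp4

# 3) Cut at the chosen keyframe timestamp without re-encoding;
#    <keyframe_time> stands for whichever timestamp from step 2 is
#    closest to the scene change found in step 1.
ffmpeg -ss <keyframe_time> -i input.mp4 -t 1800 -c copy chunk01.mp4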


-
ffmpeg overlays the video on the background video and deletes the black background of the original video
4 April 2023, by codecapture. I use this command to remove the black background:


ffmpeg -i bg.mp4 -i video.mp4 -filter_complex "[1:v]colorkey=0x000000:0.1:0.1[ckout];[0:v][ckout]overlay[out]" -map "[out]" -t 5 -c:a copy -c:v libx264 -y result.mp4



The black color on the person in the original video is also removed. How can I solve this?
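
One thing to experiment with is a much stricter similarity and zero blend, so that only near-pure black becomes transparent; the 0.02 and 0.0 values below are placeholders to tune. If the person genuinely contains pure black, a plain colorkey cannot tell it apart from the background, and a real alpha channel or a green/blue screen shoot is the robust fix:

ffmpeg -i bg.mp4 -i video.mp4 \
       -filter_complex "[1:v]colorkey=0x000000:0.02:0.0[ckout];[0:v][ckout]overlay[out]" \
       -map "[out]" -t 5 -c:a copy -c:v libx264 -y result.mp4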


-
FFmpeg streaming video: SpringBoot endpoint does not show video duration in video players
7 April 2024, by lxluxo23. It turns out that I've been working on a personal project just out of curiosity.
Its main function is to stream video by first encoding it through ffmpeg,
then playing that video back from any other device;
call it "plex", very, very primitive.


Although I achieve my goal, which is to encode and send the video to the devices that make the request,
the video is sent, so to speak, as a live broadcast.
I can only pause it, with no forward or rewind. Does anyone have any idea what I am doing wrong, or whether I should take some other approach, either in my service or in my controller?


Here are fragments of my code.


THE CONTROLLER


@RestController
@RequestMapping("/api")
@Log4j2
public class StreamController {

    @Autowired
    VideoStreamingService videoStreamingService;

    @Autowired
    VideoService videoService;

    @GetMapping("/stream/{videoId}")
    public ResponseEntity<StreamingResponseBody> livestream(@PathVariable Long videoId,
                                                            @RequestParam(required = false) String codec) {
        Video video = videoService.findVideoById(videoId);
        if (video != null) {
            Codec codecEnum = Codec.fromString(codec);
            return ResponseEntity.ok()
                    .contentType(MediaType.valueOf("video/mp4"))
                    .body(outputStream -> videoStreamingService.streamVideo(video.getPath(), outputStream, codecEnum));
        }
        return ResponseEntity.notFound().build();
    }
}


THE SERVICE


@Service
public class VideoStreamingService {

    public void streamVideo(String videoPath, OutputStream outputStream, Codec codec) {

        FFmpeg ffmpeg = FFmpeg.atPath()
                .addArguments("-i", videoPath)
                .addArguments("-b:v", "5000k")
                .addArguments("-maxrate", "5000k")
                .addArguments("-bufsize", "10000k")
                .addArguments("-c:a", "aac")
                .addArguments("-b:a", "320k")
                .addArguments("-movflags", "frag_keyframe+empty_moov+faststart")
                .addOutput(PipeOutput.pumpTo(outputStream)
                        .setFormat("mp4"))
                .addArgument("-nostdin");
        if (codec == Codec.AMD) {
            ffmpeg.addArguments("-profile:v", "high");
        }
        ffmpeg.addArguments("-c:v", codec.getFfmpegArgument());
        ffmpeg.execute();
    }
}
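
The -movflags combination used above can be reproduced on the command line; a minimal sketch, assuming an input file named input.mp4 and libx264 as a stand-in for whatever Codec.getFfmpegArgument() returns (both are placeholders):

# frag_keyframe+empty_moov produces a fragmented MP4 that can be written
# to a non-seekable pipe; the initial moov box then has no sample table
# and no usable duration, which is why players treat the stream as a live
# broadcast. faststart needs a seekable output, so it is ignored here.
ffmpeg -nostdin -i input.mp4 \
       -c:v libx264 -b:v 5000k -maxrate 5000k -bufsize 10000k \
       -c:a aac -b:a 320k \
       -movflags frag_keyframe+empty_moov+faststart \
       -f mp4 pipe:1 > fragmented.mp4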



I have some enums to vary the encoding and to choose whether or not to use hardware acceleration.


And here is an example from my player; the endpoint is the following:


http://localhost:8080/api/stream/2?codec=AMD
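
A quick way to see the seeking side of the problem, assuming the application is running locally as in the URL above, is to send a Range request and look at the response headers; a sketch:

# A seekable endpoint would answer 206 Partial Content and advertise
# Accept-Ranges; this StreamingResponseBody endpoint answers 200 and
# simply starts streaming, so players cannot seek.
curl -v --max-time 5 -r 0-1023 -o /dev/null \
     "http://localhost:8080/api/stream/2?codec=AMD"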


If there is someone with more knowledge of either FFmpeg or Spring who could help me with this minor issue, I would greatly appreciate it. I've included the URL of the repository in case anyone would like to review it when answering my question.




I tried changing the encoding format.
I tried copying the metadata from the original video.
I tried sending a custom header.
None of this has worked.


I would like to achieve what is mentioned on many sites: when you load a video from the network, the player shows how much of that video is already in the local "buffer".
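
Not a fix for the live-encoding approach itself, but one commonly suggested alternative is to pre-encode each video once to a regular, non-fragmented MP4 with the moov box moved to the front, and then serve that file (with HTTP Range support) so the player gets the full duration up front and shows the usual buffer bar. A sketch only; file names are placeholders:

# +faststart relocates the moov box (which carries the real duration) to
# the beginning of the file, so the timeline appears and playback starts
# before the whole file has been downloaded.
ffmpeg -i input.mp4 \
       -c:v libx264 -b:v 5000k -maxrate 5000k -bufsize 10000k \
       -c:a aac -b:a 320k \
       -movflags +faststart output.mp4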