
Other articles (74)
-
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
Managing creation and editing rights for objects
8 February 2011
By default, many features are restricted to administrators, but each remains configurable independently so that the minimum status required to use it can be changed, notably: writing content on the site, adjustable through the form template management; adding notes to articles; adding captions and annotations to images;
-
Uploading media and themes via FTP
31 May 2013
MediaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
From the start you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the site's cache directory; themes/: custom themes and stylesheets; tmp/: working folder (...)
On other sites (10557)
-
mpegvideo: drop support for real (non-emulated) edges
20 December 2013, by Anton Khirnov
mpegvideo: drop support for real (non-emulated) edges
Several decoders disable those anyway and they are not measurably faster
on x86. They might be somewhat faster on other platforms due to missing
emu edge SIMD, but the gain is not large enough (and those decoders
relevant enough) to justify the added complexity.
-
Live streaming through video tag
1 February 2018, by ElmiS
I am currently trying to make a video streaming service without using plugins.
My server uses ffmpeg to transmux an RTSP stream to a fragmented MP4 so that the HTML5 video tag can play it.
The ffmpeg command is:
> ffmpeg -rtsp_transport tcp -i rtsp_address -movflags frag_keyframe+empty_moov -c:v copy -f mp4 -
(My RTSP stream has no audio.)
Then I feed it to the client page through a WebSocket, and the page uses the Media Source Extensions API to pass the data to the video tag.
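The server side is not shown in the question. Purely as an illustration of the pipeline described above, the relay could be sketched in Python along the following lines; the third-party websockets package, the port, and the chunk size are assumptions, not part of the original post:

import asyncio
import websockets  # third-party package, assumed for this sketch

# Same ffmpeg invocation as above, writing fragmented MP4 to stdout
FFMPEG_CMD = [
    "ffmpeg", "-rtsp_transport", "tcp", "-i", "rtsp_address",
    "-movflags", "frag_keyframe+empty_moov",
    "-c:v", "copy", "-f", "mp4", "-",
]

async def relay(websocket):
    # Spawn ffmpeg and forward each chunk of its output to the client
    proc = await asyncio.create_subprocess_exec(
        *FFMPEG_CMD, stdout=asyncio.subprocess.PIPE)
    try:
        while True:
            chunk = await proc.stdout.read(64 * 1024)
            if not chunk:
                break
            await websocket.send(chunk)
    finally:
        proc.kill()

async def main():
    async with websockets.serve(relay, "0.0.0.0", 8765):
        await asyncio.Future()  # serve until interrupted

asyncio.run(main())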
This all works on Firefox but does not work on Chrome or Edge/IE, and I have searched all over but could not find an answer as to why it won't play.
The initial data, such as the width and height of the video, seems to go through, since the size of the video frame changes. However, the video won't start, and when I call video.play() so that it autoplays once loaded, I get an error saying:
Uncaught (in promise) DOMException: The play() request was interrupted by a call to pause().
It seems like the video automatically pauses itself, since I did not call pause() anywhere in my code.
My client-side code is:
var webSocket;
var video = document.getElementById("rtspPlayer");
var mediaSource;
var sourceBuffer; // SourceBuffer created once the MediaSource opens
var strCodec = 'video/mp4; codecs="avc1.420029"';
var buffer = [];

window.onload = function(){
    var selector = document.getElementById("select");
    var url = selector.options[selector.selectedIndex].value;
    setUpWebSocket(url);
    setUpMediaSource();
}

function setUpWebSocket(url){
    webSocket = new WebSocket(url);
    webSocket.onopen = onWebSocketOpen;
    webSocket.onmessage = onWebSocketData;
}

function setUpMediaSource(){
    mediaSource = new window.MediaSource();
    var uri = window.URL.createObjectURL(mediaSource);
    video.setAttribute('src', uri);
    video.setAttribute('type', 'video/mp4');
    mediaSource.onsourceopen = function(){
        mediaSource.duration = Infinity;
        sourceBuffer = mediaSource.addSourceBuffer(strCodec);
        // Drain the queue whenever the previous append completes
        // (assign the handler, do not call it here)
        sourceBuffer.onupdateend = readFromBuffer;
    }
}

function onWebSocketOpen(){
    video.play();
}

function onWebSocketData(data){
    var blob = data.data;
    var fileReader = new FileReader();
    fileReader.onload = function() {
        // Queue the received segment and try to append it
        buffer.push(this.result);
        readFromBuffer();
    };
    fileReader.readAsArrayBuffer(blob);
}

function readFromBuffer(){
    if(buffer.length === 0){
        //console.log("Buffer length 0");
        return;
    }
    else if(!sourceBuffer){
        //console.log("No Source Buffer");
        return;
    }
    else if(sourceBuffer.updating){
        //console.log("SourceBuffer Updating");
        return;
    }
    try{
        var data = buffer.shift();
        sourceBuffer.appendBuffer(data);
    }
    catch(e){
        return;
    }
}

I'm certain that this is the right codec for the video I want to play, as I saved 30 seconds of the video using ffmpeg and used Bento4's mp4info to check it. That 30-second clip also played well through the Media Source. However, playing it live seems impossible for me to solve.
Please help! I've been trying to solve this problem for days.
-
ffmpeg - stretched pixel issue
5 June 2023, by Adunato
Context


I'm converting a PNG sequence into a video using FFMPEG. The images are semi-transparent portraits where the background has been removed digitally.


Issue


The edge pixels of the subject are stretched all the way to the frame border, creating a fully opaque video.


Cause Analysis


The process worked fine in the previous workflow, which used rembg from the command line. However, since I started calling rembg from a Python script with alpha_matting to obtain higher-quality results, the resulting video has had these issues.


The issue is present in both webm format (target) and mp4 (used for testing).


Command Used


Command used for webm is:

ffmpeg -thread_queue_size 64 -framerate 30 -i <png sequence location> -c:v libvpx -b:v 0 -crf 18 -pix_fmt yuva420p -auto-alt-ref 0 -c:a libvorbis <output>
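Since rembg is already being driven from a Python script, the same encode step could be launched from that script as well. This is only a sketch: the frame pattern frames/frame_%04d.png and the output name output.webm are placeholders, not values from the original post.

import subprocess

# Placeholder paths; substitute the real PNG sequence and output location
cmd = [
    "ffmpeg", "-thread_queue_size", "64", "-framerate", "30",
    "-i", "frames/frame_%04d.png",
    "-c:v", "libvpx", "-b:v", "0", "-crf", "18",
    "-pix_fmt", "yuva420p", "-auto-alt-ref", "0",
    "-c:a", "libvorbis",
    "output.webm",
]
subprocess.run(cmd, check=True)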


Troubleshooting Steps Taken


- PNG visual inspection: the PNG images have a fully transparent background, as desired.
- PNG alpha measurement: I have created a couple of Python scripts to look at the alpha level of the pixels and confirmed that there is no subtle alpha in the background pixels (see the sketch after this list).
- Exported MP4 with AE: using the native After Effects renderer, the resulting MP4/H.265 has a black background, so it does not show the stretched-pixel issue.
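For reference, the alpha measurement mentioned in the second item could be done with a short Pillow script along these lines; the file name frame_0001.png is only an illustration, not from the original post:

from PIL import Image

def alpha_stats(path, threshold=0):
    """Return the maximum alpha in the frame and how many pixels exceed the threshold."""
    img = Image.open(path).convert("RGBA")
    alphas = [a for (_, _, _, a) in img.getdata()]
    above = sum(1 for a in alphas if a > threshold)
    return max(alphas), above

# Example: inspect a single frame of the sequence
print(alpha_stats("frame_0001.png"))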
[Image of the issue]

[Sample PNG image from the sequence]
Code Context


The rembg call via the API, using alpha_matting, seems to generate a premultiplied alpha that leaves non-black RGB values in zero-alpha pixels.


from rembg import remove

output_data = remove(input_data, alpha_matting=True, alpha_matting_foreground_threshold=250,
                     alpha_matting_background_threshold=250, alpha_matting_erode_size=12)



A test using a rough RGB reset of zero-alpha pixels confirms that the images are being played back with their RGB values while the alpha is ignored.


def reset_alpha_pixels(img):
    # Process each pixel of the already-opened image
    data = list(img.getdata())
    new_data = []
    for item in data:
        if item[3] == 0:
            # Fully transparent pixel: reset the RGB to black as well
            new_data.append((0, 0, 0, 0))
        else:
            # Keep the RGB and the alpha value unchanged
            new_data.append((item[0], item[1], item[2], item[3]))

    # Update the image data
    img.putdata(new_data)

    return img
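For completeness, this is how the workaround above might be applied to the whole sequence before running ffmpeg; the frames/ folder name, the in-place overwrite, and the use of Pillow are assumptions rather than part of the original post:

from pathlib import Path
from PIL import Image

# Hypothetical folder holding the RGBA PNG sequence
for png_path in sorted(Path("frames").glob("*.png")):
    img = Image.open(png_path).convert("RGBA")
    cleaned = reset_alpha_pixels(img)  # defined above
    cleaned.save(png_path)             # overwrite in place before encoding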



Updates


- Added Python context to make the question more relevant within SO scope.