
Media (1)
-
The Great Big Beautiful Tomorrow
28 October 2011
Updated: October 2011
Language: English
Type: Text
Other articles (100)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)
-
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011
MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects or individuals, rapid deployment of multiple unique sites, and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (14110)
-
src/libFLAC/stream_decoder.c: Fix buffer read overflow.
18 November 2014, by Erik de Castro Lopo
-
How to send continuous stream of frames to server most efficiently
27 April 2019, by Duthopi
I am trying to send frames from a local camera (a Raspberry Pi camera, but it could also be my laptop's webcam) to a Google Cloud instance, on which I am running AI processing of the frames.
I am managing to send frames captured with OpenCV over HTTP (i.e. TCP?) and to receive them on a Flask server. When the Flask server runs locally I get a good frame rate (50+ fps at an image size of 640x480), but once I send the frames to a Flask app on the Google instance the rate drops drastically to 5 fps.
How I currently send frames:
while True:
    frame = vs.read()  # separate thread, using cv2 to get the frame
    ret, jpeg = cv2.imencode('.jpg', frame)
    imgdata = jpeg.tobytes()
    response = requests.post(
        url='http://<ip address of google instance>:<port>',
        data=imgdata,
        headers={'content-type': 'image/jpeg'},
    )
I see two problems with this:
1 - Using TCP means I am slower than UDP, but UDP datagrams are limited in size. Correct me if I am wrong, but it seems very complex to send frames in chunks and reassemble them on the server.
2 - Even if I had UDP working, there is no compression of the frames, so I will never reach an efficient transfer.
I expect the answer to be something like using ffmpeg, but so far I have only figured out how to stream frames to a local port with ffmpeg; I do not know whether it is possible to send frames to a remote server.
Any recommendations on the best way forward?
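Since the post mentions ffmpeg but leaves the remote case open, here is a minimal sketch, not taken from the original thread, of one way to hand compression and transport off to ffmpeg: raw BGR frames from OpenCV are piped into an ffmpeg child process, which encodes them with H.264 and streams MPEG-TS over UDP to a remote address. REMOTE_HOST, PORT and the camera index are placeholder assumptions.

# Sketch only: pipe OpenCV frames into ffmpeg, which encodes H.264 and
# streams MPEG-TS over UDP to a remote host. Placeholders: REMOTE_HOST, PORT.
import subprocess
import cv2

REMOTE_HOST = "203.0.113.10"   # placeholder: the cloud instance's address
PORT = 5000                    # placeholder port
FPS = 30

cap = cv2.VideoCapture(0)      # local camera
ok, frame = cap.read()         # grab one frame to learn the real resolution
if not ok:
    raise RuntimeError("could not read from the camera")
height, width = frame.shape[:2]

ffmpeg = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "bgr24",        # raw frames on stdin
        "-s", f"{width}x{height}", "-r", str(FPS),
        "-i", "-",
        "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
        "-f", "mpegts", f"udp://{REMOTE_HOST}:{PORT}",
    ],
    stdin=subprocess.PIPE,
)

try:
    while ok:
        ffmpeg.stdin.write(frame.tobytes())  # one raw BGR frame per write
        ok, frame = cap.read()
finally:
    ffmpeg.stdin.close()
    ffmpeg.wait()
    cap.release()

On the receiving side, ffmpeg, ffplay, or OpenCV built with the FFmpeg backend can open the same udp:// URL and decode the stream, so the encoder takes care of both the inter-frame compression and the packetising over UDP that the post is worried about.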
-
Using ffmpeg to remove green screen from video
27 May 2024, by CYAD
I have a video: https://drive.google.com/file/d/1tiP2fX0Xfc6YjIymcHyEXP8-JqEJrlgG/view


I'm trying to use ffmpeg to remove the green screen, but none of the commands I use work. I saved a frame as a PNG and was able to remove the green from it:


ffmpeg -i green.png -vf chromakey=green:0.1 out.png


But the equivalent edit on the video does nothing:


ffmpeg -i video.mp4 -vf chromakey=DarkGreen:similarity=0.2:blend=0.3 output4.mov


I'm on a Windows machine and need to output a format that iOS and Android can use, though multiple formats are fine. Any thoughts?
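One possible explanation, offered here as an assumption rather than a confirmed diagnosis: chromakey does produce transparency, but the encoder used for the .mov/.mp4 output cannot store an alpha channel, so the keyed pixels are written back with their original colour and the result looks unchanged. A minimal sketch, wrapping ffmpeg from Python with placeholder file names, keys the green and keeps the alpha by encoding VP9 into WebM:

# Sketch only: key out green and keep the resulting alpha channel by using a
# pixel format and codec that support transparency (VP9 in WebM).
# "video.mp4" and "output.webm" are placeholder file names.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "video.mp4",
        # key out green, then force a pixel format that carries alpha
        "-vf", "chromakey=0x00FF00:0.2:0.1,format=yuva420p",
        "-c:v", "libvpx-vp9",   # VP9 can store an alpha channel in WebM
        "output.webm",
    ],
    check=True,
)

If transparency is not actually needed at playback time, a simpler route is to overlay the keyed stream onto a replacement background with the overlay filter and export a normal H.264 MP4, which both iOS and Android play without special handling.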