
Advanced search
Media (91)
-
Spitfire Parade - Crisis
15 May 2011
Updated: September 2011
Language: English
Type: Audio
-
Wired NextMusic
14 May 2011
Updated: February 2012
Language: English
Type: Video
-
Bee video in portrait
14 May 2011
Updated: February 2012
Language: French
Type: Video
-
Sintel MP4 Surround 5.1 Full
13 May 2011
Updated: February 2012
Language: English
Type: Video
-
Map of Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (19)
-
Customizing by adding your logo, banner, or background image
5 September 2013. Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013. Present changes to your MediaSPIP, or news about your projects, using the news section.
In MediaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the form used to create a news item.
News creation form: for a document of type "news", the default fields are: publication date (customize the publication date) (...)
-
Publishing on MédiaSPIP
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or later. If needed, contact your MediaSPIP administrator to find out.
On other sites (4933)
-
Pygame: frame ghosting?
5 October 2022, by Sam Tubb
I am working on an animation environment in Python using pygame. The user draws each frame, and then the animation is saved as an .avi movie using ffmpeg. I would like to implement a feature but am not sure how: frame ghosting, i.e. displaying the previous frame while you draw the current one.



I tried creating a surface called ghost that copies the current frame when the next-frame key is pressed, then drawing it with an alpha level of 10, but this didn't work out correctly.


I am not sure what to do; here is the source code for anyone who thinks they have an idea:



#Anim8

import pygame, subprocess, shutil
from os import makedirs
from pygame.locals import *
from random import randrange

pygame.init()
screen = pygame.display.set_mode((740, 580))
draw = pygame.Surface((740, 540))
draw.fill((200, 200, 200))
bcol = (200, 200, 200)
gui = pygame.Surface((740, 40))
gui.fill((50, 50, 50))
size = 2
color = (0, 0, 0)
screen.fill((200, 200, 200))
prevcol = 0
newcol = 0
f = 0
msg = ''
framerate = 60
lineStart = (0, 0)  # avoids a NameError if the mouse is already pressed on the first motion event
try:
    makedirs('anim')
except OSError:
    pass

def DrawColors(x, y):
    # the small color palette in the top-left of the GUI bar
    pygame.draw.rect(gui, (255, 0, 0), (x+3, y+3, 15, 15), 0)
    pygame.draw.rect(gui, (0, 0, 0), (x+3, y+21, 15, 15), 0)
    pygame.draw.rect(gui, (0, 255, 0), (x+21, y+3, 15, 15), 0)
    pygame.draw.rect(gui, (200, 200, 200), (x+21, y+21, 15, 15), 0)
    pygame.draw.rect(gui, (0, 0, 255), (x+39, y+3, 15, 15), 0)

while True:
    pygame.display.set_caption('Anim8 - Sam Tubb - Frame: '+str(f)+' '+str(msg))
    mse = pygame.mouse.get_pos()
    screen.blit(gui, (0, 0))
    DrawColors(0, 0)
    screen.blit(draw, (0, 40))
    key = pygame.key.get_pressed()
    # number keys 1-0 set the render frame rate (10-100 fps)
    if key[K_1]:
        framerate = 10
        msg = 'Frame Rate set to 10'
    if key[K_2]:
        framerate = 20
        msg = 'Frame Rate set to 20'
    if key[K_3]:
        framerate = 30
        msg = 'Frame Rate set to 30'
    if key[K_4]:
        framerate = 40
        msg = 'Frame Rate set to 40'
    if key[K_5]:
        framerate = 50
        msg = 'Frame Rate set to 50'
    if key[K_6]:
        framerate = 60
        msg = 'Frame Rate set to 60'
    if key[K_7]:
        framerate = 70
        msg = 'Frame Rate set to 70'
    if key[K_8]:
        framerate = 80
        msg = 'Frame Rate set to 80'
    if key[K_9]:
        framerate = 90
        msg = 'Frame Rate set to 90'
    if key[K_0]:
        framerate = 100
        msg = 'Frame Rate set to 100'

    if key[K_a]:
        # hold A to save frames continuously
        pygame.image.save(draw, 'anim/frame'+str(f)+'.png')
        f += 1

    for e in pygame.event.get():
        if e.type == QUIT:
            shutil.rmtree('anim')
            exit()
        if e.type == KEYDOWN:
            if e.key == K_s:
                msg = 'Added Frame!'
                pygame.image.save(draw, 'anim/frame'+str(f)+'.png')
                f += 1
            if e.key == K_c:
                draw.fill(bcol)
            if e.key == K_r:
                # render all saved frames to an .avi with ffmpeg
                name = 'anim'+str(randrange(0, 999))+str(randrange(0, 999))+'.avi'
                msg = 'Rendering: '+name
                pygame.display.set_caption('Anim8 - Sam Tubb - Frame: '+str(f)+' '+str(msg))
                subprocess.call('ffmpeg -f image2 -s 640x480 -i anim/frame%01d.png -r '+str(framerate)+' '+name, shell=True)
                msg = 'Done!'
            if e.key == K_p:
                # preview the last rendered movie (press R first so name exists)
                subprocess.call('ffplay '+name, shell=True)
        if e.type == MOUSEBUTTONDOWN:
            if e.button == 1:
                # left click on the palette picks the drawing color
                try:
                    prevcol = color
                    newcol = gui.get_at(mse)
                    if newcol == (50, 50, 50):
                        newcol = prevcol
                    color = newcol
                except IndexError:
                    pass
            if e.button == 3:
                # right click picks a background color and clears the canvas with it
                try:
                    prevcol = bcol
                    newcol = gui.get_at(mse)
                    if newcol == (50, 50, 50):
                        newcol = prevcol
                    draw.fill(newcol)
                    bcol = newcol
                except IndexError:
                    pass
            if e.button == 4:  # scroll up: bigger brush
                size += 1
                if size > 7:
                    size = 7
            if e.button == 5:  # scroll down: smaller brush
                size -= 1
                if size == 0:
                    size = 1
        if e.type == pygame.MOUSEMOTION:
            lineEnd = pygame.mouse.get_pos()
            lineEnd = (lineEnd[0], lineEnd[1]-40)
            if pygame.mouse.get_pressed() == (1, 0, 0):
                pygame.draw.line(draw, color, lineStart, lineEnd, size)
            lineStart = lineEnd

    pygame.display.flip()
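For the ghosting itself, the set_alpha idea is sound; the catch is the value. Per-surface alpha runs 0-255, so an alpha of 10 leaves the ghost at about 4% opacity, which is nearly invisible against the grey canvas. Below is a rough numpy sketch of the blend pygame applies when blitting a surface after ghost.set_alpha(alpha); the blend function and sample values are illustrative, not part of the original program:

```python
import numpy as np

def blend(background, ghost, alpha):
    # The per-surface alpha blend pygame performs when blitting a
    # surface after ghost.set_alpha(alpha); alpha ranges 0-255.
    a = alpha / 255.0
    return background * (1.0 - a) + ghost * a

canvas = np.array([200.0, 200.0, 200.0])  # the grey drawing surface
stroke = np.array([0.0, 0.0, 0.0])        # a black stroke on the previous frame

print(blend(canvas, stroke, 10))   # ~[192 192 192]: barely distinguishable from the canvas
print(blend(canvas, stroke, 100))  # ~[122 122 122]: a visible but faded ghost
```

So copying the finished frame into a ghost surface on the next-frame key, calling something like ghost.set_alpha(100), and blitting the ghost onto the canvas before drawing should give a usable onion-skin effect.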




Oh, and on another note, in case anyone was curious, here is what the output looks like; I made a little New Year's animation:





-
How to play a video file with audio with DearPyGUI (Python)?
1 March 2023, by Vi Tiet
I'm using DearPyGUI to make a simple media player that can play a video file (mp4, etc.) together with its audio. The prerequisite is that DearPyGUI is a must; however, the video feature will not exist until v2.0, which is still far in the future.


Currently, I can only render the frames using the OpenCV library for Python; the problem is how to play the audio as well, in sync with the rendered video frames.


For context, I'm quite new to Python, and I don't know much about video and audio streaming, but I've thought of some approaches to this problem by looking through help posts online (however, I still have no idea how to implement any of these seamlessly):


- 

-
OpenCV for the video frames, and for audio... some library like ffmpeg-python or miniaudio to play the sound (how...?)


-
Extract video frames and audio here and then use the raw data to play it (How...?)


-
This example is pretty close to what I want, excluding the part that actually plays the video and audio, but I have no idea where to go from there. The video stream and the audio stream are instances of ffmpeg.nodes.FilterableStream, and they appear to hold addresses to somewhere. (No idea...)


-
Another very close idea: using ffpyplayer, I was able to get the video frames. However, the code below yields a bluish-purple color tint on the video, and the frame rate is very slow compared to the original. (So close...)












import time
import numpy as np
import cv2 as cv
import dearpygui.dearpygui as dpg
from ffpyplayer.player import MediaPlayer

# VIDEO_WIDTH, VIDEO_HEIGHT, VIDEO_CANVAS_TAG, DEFAULT_VIDEO_TEXTURE
# and is_playing are defined elsewhere in the application.


# https://github.com/Kazuhito00/Image-Processing-Node-Editor/blob/main/node_editor/util.py
def cv2dpg(frame):
    data = cv.resize(frame, (VIDEO_WIDTH, VIDEO_HEIGHT))
    data = np.flip(frame, 2)
    data = data.ravel()
    data = np.asfarray(data, dtype=np.float32)
    return np.true_divide(data, 255.0)


# https://stackoverflow.com/questions/59611075/how-would-i-go-about-playing-a-video-stream-with-ffpyplayer
# https://matham.github.io/ffpyplayer/examples.html#examples
def play_video(loaded_file_path):
    global player, is_playing
    player = MediaPlayer(loaded_file_path)

    while is_playing:
        frame, val = player.get_frame()

        if val == 'eof':
            is_playing = False
            break
        elif not frame:
            time.sleep(0.01)
        elif val != 'eof' and frame is not None:
            img, t = frame
            w = img.get_size()[0]
            h = img.get_size()[1]
            cv_mat = np.uint8(np.asarray(list(img.to_bytearray()[0])).reshape((h, w, 3)))
            texture_data = cv2dpg(cv_mat)
            dpg.set_value(VIDEO_CANVAS_TAG, texture_data)

    dpg.set_value(VIDEO_CANVAS_TAG, DEFAULT_VIDEO_TEXTURE)



I still need to do more research, but any pointer to a good starting point (either handling raw data or using different libraries) would be greatly appreciated!


EDIT:
For more context, I'm using a raw texture, as in this example from the official DearPyGUI documentation, to render the video frames extracted in the while loop.
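Both symptoms have likely culprits in cv2dpg: list(img.to_bytearray()[0]) copies every byte through a Python list (slow), and np.flip(..., 2) swaps RGB to BGR even though both ffpyplayer's default output and DearPyGUI raw textures use RGB, which would produce exactly that blue/purple tint. A sketch of a leaner conversion; frame_to_texture is a hypothetical helper name, and it assumes ffpyplayer's default rgb24 frame format:

```python
import numpy as np

def frame_to_texture(rgb_bytes, width, height):
    # Convert one ffpyplayer frame buffer (img.to_bytearray()[0],
    # already RGB) into the flat float32 [0, 1] array a DearPyGUI raw
    # texture expects. np.frombuffer avoids the slow list(...) copy,
    # and no channel flip is done, since flipping RGB to BGR is what
    # produces a blue/purple tint.
    arr = np.frombuffer(bytes(rgb_bytes), dtype=np.uint8)
    return arr.reshape((height, width, 3)).astype(np.float32).ravel() / 255.0
```

In the loop this would replace cv2dpg: texture_data = frame_to_texture(img.to_bytearray()[0], *img.get_size()). For pacing, the ffpyplayer examples sleep for the val returned by get_frame() (the delay until the frame is due) instead of pushing frames as fast as they decode, which should bring the frame rate back toward real time.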


-
How to set path to Opus on Heroku
19 December 2016, by onnex
Hi, I'm stuck on getting opuslib to work on Heroku. I've installed the opus library buildpack from https://elements.heroku.com/buildpacks/dubsmash/heroku-buildpack-opus
However, my app throws an error:
Error in command 'play' - OpusNotLoaded:
I'm assuming my app is not detecting the opus buildpack. This is where I get stuck. After pushing changes to my repo, the log shows:
remote: ----------------------------------------------------------------------
remote: Libraries have been installed in:
remote: /tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/build/lib
remote:
remote: If you ever happen to want to link against installed libraries
remote: in a given directory, LIBDIR, you must either use libtool, and
remote: specify the full pathname of the library, or use the `-LLIBDIR'
remote: flag during linking and do at least one of the following:
remote: - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
remote: during execution
remote: - add LIBDIR to the `LD_RUN_PATH' environment variable
remote: during linking
remote: - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
remote: - have your system administrator add LIBDIR to `/etc/ld.so.conf'
remote:
remote: See any operating system documentation about shared libraries for
remote: more information, such as the ld(1) and ld.so(8) manual pages.
remote: ----------------------------------------------------------------------
remote: make[4]: Entering directory `/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/opus-1.1/doc'
remote: make[5]: Entering directory `/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/opus-1.1/doc'
remote: make[5]: Nothing to be done for `install-exec-am'.
remote: make[5]: Nothing to be done for `install-data-am'.
remote: make[5]: Leaving directory `/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/opus-1.1/doc'
remote: make[4]: Leaving directory `/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/opus-1.1/doc'
remote: /bin/mkdir -p '/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/build/share/aclocal'
remote: /usr/bin/install -c -m 644 opus.m4 '/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/build/share/aclocal'
remote: /bin/mkdir -p '/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/build/lib/pkgconfig'
remote: /usr/bin/install -c -m 644 opus.pc '/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/build/lib/pkgconfig'
remote: /bin/mkdir -p '/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/build/include/opus'
remote: /usr/bin/install -c -m 644 include/opus.h include/opus_multistream.h include/opus_types.h include/opus_defines.h '/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/build/include/opus'
remote: make[3]: Leaving directory `/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/opus-1.1'
remote: make[2]: Leaving directory `/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/opus-1.1'
remote: make[1]: Leaving directory `/tmp/build_b191a43c9f4f1541f35409377bad92ed/libopus/opus-1.1'
remote: -----> Python app detected
remote: $ pip install -r requirements.txt
remote:
remote: -----> ffmpeg app detected
remote: -----> Installing ffmpeg, ffmpeg-10bit, ffprobe, ffserver and qt-faststart ...
remote: exporting PATH
remote: -----> Discovering process types
remote: Procfile declares types -> worker
remote:
remote: -----> Compressing...
remote: Done: 220.2M
How would I set my library path so my app can detect opus? Your help would be greatly appreciated.
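The build log's own libtool notice points at the usual fix: the runtime dynamic linker has to find libopus via LD_LIBRARY_PATH. A hedged sketch follows; the /app/... path is an assumption, since the /tmp/build_... paths in the log no longer exist once the dyno is running, so locate the real install directory first:

```shell
# Find where the buildpack actually left libopus at runtime
# (the build-time /tmp/build_... directories are gone after deploy):
heroku run bash
find /app -name 'libopus.so*' 2>/dev/null

# Suppose it turns up under /app/libopus/build/lib (hypothetical path):
heroku config:set LD_LIBRARY_PATH=/app/libopus/build/lib

# Restart so the worker picks up the new environment:
heroku ps:restart
```

If LD_LIBRARY_PATH is already set for another library, append the new directory with a colon separator rather than overwriting it.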