
Media (2)
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
-
Publishing an image simply
13 April 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (36)
-
Customizing by adding your logo, banner, or background image
5 September 2013, by
Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013, by
Present the changes in your MediaSPIP, or news about your projects on your MediaSPIP, using the news section.
In the default MediaSPIP theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news item creation form.
News item creation form: for a document of the news type, the default fields are: Publication date (customize the publication date) (...)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add yours using the form at the bottom of the page.
On other sites (6933)
-
Matplotlib animation quality loss
28 June 2017, by Doe a
I'm currently trying to use matplotlib to animate a grid using imshow, but I am seeing a fairly significant quality loss in the animation.
Below is a simple animation that illustrates the issue fairly well. If you look at any particular frame of imagelist, you will see that there is no aliasing or gradient between colours. But if I look at one frame of my animation, that is no longer the case: there is now a 4-5 pixel gradient between my blocks of colour. Does anyone know how I can get around this compression? The file size of the animation doesn't matter too much, as long as I can get a good-quality animation.
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

# Create list of images
image = np.array([[[0, 0, 0], (0.7, 0, 0), (0, 0, 0), (0, 0, 0.8)]])
imagelist = []
for i in range(0, 100):
    image[0][1][0] = i / 100
    imagelist.append(np.copy(image))

# Create figure
fig = plt.figure()
plt.axis('off')
im = plt.imshow(imagelist[0], vmin=0, vmax=255, interpolation='none')

# Animation
def updatefig(j):
    im.set_array(imagelist[j])
    return [im]

ani = animation.FuncAnimation(fig, updatefig, frames=range(len(imagelist)), interval=20, blit=False)

# Save animation
FFMpegWriter = animation.writers['ffmpeg']
mywriter = FFMpegWriter(fps=30, bitrate=5000)
ani.save("test.mp4", writer=mywriter, codec="libx264")
Live video from raw tcp packets
7 June 2017, by benuuts
We are trying to make a small Python app that displays live video from sniffed packets, using scapy and ffplay. This is part of our master's degree research project. The goal is to make a proof-of-concept app that spies on video transmitted over TCP.
We have a working script that writes into a .dat file, which we then read with ffplay. It works OK but has a lot of latency, and we think we could do better: stream directly into ffplay without needing to write the raw data to a file. Here's our script:
from scapy.all import *
import os

export_dat = open("data.dat", "a")

def write_packet_raw(packet):
    export_dat.write(str(packet.getlayer(Raw)))

def realtime_packet():
    p = sniff(iface="wlan0", filter="tcp and (port 5555)", count=5000, prn=write_packet_raw)

realtime_packet()
export_dat.close()

And then we launch:
ffplay -window_title Videostream -framedrop -infbuf -f h264 -i data.dat
Any idea how we can achieve that? Thanks.
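One possible approach (a sketch under the assumptions that ffplay is on the PATH and that, as in the script above, the sniffed TCP payload is a raw H.264 elementary stream): start ffplay reading from its standard input and forward each packet's Raw payload to that pipe from the sniff callback, so no intermediate .dat file is needed.

import subprocess
from scapy.all import sniff, Raw

# Hedged sketch: ffplay reads the stream from stdin ('-i -') instead of a file.
player = subprocess.Popen(
    ["ffplay", "-window_title", "Videostream", "-framedrop", "-infbuf",
     "-f", "h264", "-i", "-"],
    stdin=subprocess.PIPE,
)

def forward_payload(packet):
    # Forward only packets that actually carry a TCP payload.
    if packet.haslayer(Raw):
        player.stdin.write(bytes(packet[Raw].load))
        player.stdin.flush()

sniff(iface="wlan0", filter="tcp and (port 5555)", prn=forward_payload)
player.stdin.close()
player.wait()

Latency should drop because ffplay starts decoding as soon as the first payload bytes arrive, but note that this forwards packets in capture order rather than reassembling the TCP stream, so retransmitted or out-of-order segments would still corrupt the picture.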
-
MPEG-DASH Livestreaming using ffmpeg, avconv, MP4Box and Dashjs
28 April 2017, by Sushant Mongia
I'm working on delivering live streaming with DASH capabilities. Long story short, it's a very crude testbed setup, so I might be off the mark in some respects. I'm also posting this as a simple setup for the community and for people out there struggling to find a tutorial on live streaming with DASH.
Setup:
OS: Ubuntu 16.04
Encoding Tools:
ffmpeg: To record a livestream using a desktop webcam in mpeg2 format
avconv: To convert mpeg2 to the mpeg4 file format
MP4Box: To DASH it, i.e. produce the .mpd, some conf files, seg_init and the segments
Dashjs: Reference Client 2.4.1
Server: Apache
Process:
I've written 3 bash scripts, each basically an infinite while loop around one of the ffmpeg, avconv and MP4Box commands. I first run the ffmpeg script that records a video using the desktop webcam, then I run the avconv script that kills the ffmpeg command and converts the file from the mpeg2 format to the mpeg4 format. Since the ffmpeg command is in an infinite while loop, it restarts. Then I run the MP4Box command that DASHes the avconv command's output. Everything is then sent to the Dashjs client, and pretty much the whole setup gets repeated every 5 seconds.
Commands:
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 livestream
avconv -i livestream out.mp4
MP4Box command, run in a loop: MP4Box -dash-live 4000 -fps 24 -frag 6000 -profile dashavc264:live -dynamic -mpd-refresh 5000 -dash-ctx dashtest.txt -time-shift -1 -inter 0 -segment-name output-seg -bs-switching no out.mp4
Problem:
ffmpeg sends a chunk of 5 seconds (due to the sleep command in my avconv bash script), and MP4Box reads that 5-second chunk and loops it. So when the next chunk comes in, newer segments are produced, but the player is still playing the older segments, typically just the very first few segments, in a loop.
Questions:
1) Am I missing out on some core concept here? Are the commands and their respective attributes using the right parameters and the right values?
2) I believe there should be a way to pipeline these processes in a better manner; should I be looking into writing a Python script maybe?
Happy to provide more info! Cheers
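On question 2, one way the three free-running bash loops could be collapsed into a single pipeline (a rough sketch only; the chunk length, file names and output paths are assumptions, while the command options are copied from the commands above): drive the steps sequentially from one Python loop, so MP4Box only ever sees a chunk that has finished encoding and the DASH context stays continuous between iterations.

import subprocess

CHUNK_SECONDS = 5  # assumed chunk length, matching the 5-second cadence described above

while True:
    # 1) Record one fixed-length chunk from the webcam; ffmpeg exits on its own after -t.
    subprocess.run([
        "ffmpeg", "-y", "-f", "v4l2", "-framerate", "25",
        "-video_size", "640x480", "-i", "/dev/video0",
        "-t", str(CHUNK_SECONDS), "-f", "mpegts",
        "-codec:v", "mpeg1video", "-b:v", "1000k", "-bf", "0",
        "livestream.ts",
    ], check=True)

    # 2) Convert the finished chunk to mp4 (avconv in the original setup; ffmpeg shown here).
    subprocess.run(["ffmpeg", "-y", "-i", "livestream.ts", "out.mp4"], check=True)

    # 3) DASH the chunk; -dash-ctx carries the live session state across iterations.
    subprocess.run([
        "MP4Box", "-dash-live", "4000", "-fps", "24", "-frag", "6000",
        "-profile", "dashavc264:live", "-dynamic", "-mpd-refresh", "5000",
        "-dash-ctx", "dashtest.txt", "-time-shift", "-1", "-inter", "0",
        "-segment-name", "output-seg", "-bs-switching", "no", "out.mp4",
    ], check=True)

Because each step only starts once the previous one has completed, the segments MP4Box publishes always correspond to the most recently recorded chunk, which should address the loop-on-first-segments symptom, at the cost of a few seconds of end-to-end delay.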