Other articles (65)

  • General document management

    13 May 2011, by

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and retrieving the metadata of the original document to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add yours using the form at the bottom of the page.

  • HTML5 audio and video support

    13 avril 2011, par

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (5729)

  • fftools/ffmpeg_mux_init: apply encoder options manually

    9 April 2024, by Anton Khirnov

    Do not pass an options dictionary to the avcodec_open2() in enc_open().

    This is cleaner and more robust, as previously various bits of code
    would try to interpret the contents of the options dictionary, with
    varying degrees of correctness. Now they can just access the encoder
    AVCodecContext directly.

    Cf. 372c78dd42f2b1ca743473b9c32fad71c65919e0 - analogous change for
    decoding.

    A non-progressive field order is now written on the container level in
    interlaced ProRes encoding tests.

    • [DH] fftools/ffmpeg_enc.c
    • [DH] fftools/ffmpeg_mux_init.c
    • [DH] tests/ref/vsynth/vsynth1-prores_444_int
    • [DH] tests/ref/vsynth/vsynth1-prores_int
    • [DH] tests/ref/vsynth/vsynth2-prores_444_int
    • [DH] tests/ref/vsynth/vsynth2-prores_int
    • [DH] tests/ref/vsynth/vsynth3-prores_444_int
    • [DH] tests/ref/vsynth/vsynth3-prores_int
    • [DH] tests/ref/vsynth/vsynth_lena-prores_444_int
    • [DH] tests/ref/vsynth/vsynth_lena-prores_int
  • Can't view and record graph at the same time using FFMpegWriter [closed]

    7 July 2024, by Barkın Özer

    This code is used for graphing and logging sensor data coming from Bluetooth ports. I wanted to add a function that records the graph in MP4 format. To achieve this, I used FFMpegWriter. The issue is that while this code records the graph, I can't view the graph at the same time.

    


import serial
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation, FFMpegWriter
import openpyxl
from datetime import datetime

# Constants
GRAVITY = 9.81  # Standard gravity in m/s²

# Initialize serial connections to HC-06 devices
ser_x_accel = serial.Serial('COM4', 9600, timeout=1)  # X-axis acceleration data
ser_y_angle = serial.Serial('COM11', 9600, timeout=1)  # Y-axis angle data

# Initialize empty lists to store data
x_accel_data = []
y_angle_data = []
timestamps = []

# Initialize Excel workbook
wb = openpyxl.Workbook()
ws = wb.active
ws.title = "Sensor Data"
ws.append(["Timestamp", "X Acceleration (m/s²)", "Y Angle (degrees)"])

# Function to update the plot and log data
def update(frame):
    # Read data from serial connections
    line_x_accel = ser_x_accel.readline().decode('utf-8').strip()
    line_y_angle = ser_y_angle.readline().decode('utf-8').strip()
    
    try:
        # Parse and process X-axis acceleration data
        x_accel_g = float(line_x_accel)  # Acceleration in g read from serial
        x_accel_ms2 = x_accel_g * GRAVITY  # Convert from g to m/s²
        x_accel_data.append(x_accel_ms2)
        
        # Parse and process Y-axis angle data
        y_angle = float(line_y_angle)
        y_angle_data.append(y_angle)
        
        # Append timestamp
        timestamps.append(datetime.now())

        # Limit data points to show only the latest 100
        if len(x_accel_data) > 100:
            x_accel_data.pop(0)
            y_angle_data.pop(0)
            timestamps.pop(0)

        # Log data to Excel with timestamp
        timestamp_str = timestamps[-1].strftime("%H:%M:%S")
        ws.append([timestamp_str, x_accel_data[-1], y_angle_data[-1]])

        # Clear and update plots
        ax1.clear()
        ax1.plot(timestamps, x_accel_data, label='X Acceleration', color='b')
        ax1.legend(loc='upper left')
        ax1.set_ylim([-20, 20])  # Adjust based on expected acceleration range in m/s²
        ax1.set_title('Real-time X Acceleration Data')
        ax1.set_xlabel('Time')
        ax1.set_ylabel('Acceleration (m/s²)')
        ax1.grid(True)

        ax2.clear()
        ax2.plot(timestamps, y_angle_data, label='Y Angle', color='g')
        ax2.legend(loc='upper left')
        ax2.set_ylim([-180, 180])
        ax2.set_title('Real-time Y Angle Data')
        ax2.set_xlabel('Time')
        ax2.set_ylabel('Angle (degrees)')
        ax2.grid(True)

        # Update text boxes with latest values
        text_box.set_text(f'X Acceleration: {x_accel_data[-1]:.2f} m/s²')
        text_box2.set_text(f'Y Angle: {y_angle_data[-1]:.2f}°')
        
        # Save the workbook periodically (every 100 updates)
        if frame % 100 == 0:
            wb.save("sensor_data.xlsx")
        
    except ValueError:
        pass  # Ignore lines that are not properly formatted

# Setup the plots
fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(10, 8))
text_box = ax1.text(0.05, 0.95, '', transform=ax1.transAxes, fontsize=12, verticalalignment='top', bbox=dict(boxstyle='round', facecolor='wheat', alpha=0.5))
text_box2 = ax2.text(0.05, 0.95, '', transform=ax2.transAxes, fontsize=12, verticalalignment='top', bbox=dict(boxstyle='round', facecolor='wheat', alpha=0.5))

# Animate the plots
ani = FuncAnimation(fig, update, interval=100)  # Update interval of 100ms

# Save the animation as a video file
writer = FFMpegWriter(fps=10, metadata=dict(artist='Me'), bitrate=1800)
ani.save("sensor_data.mp4", writer=writer)

plt.tight_layout()
plt.show()

# Save the workbook at the end of the session
wb.save("sensor_data.xlsx")



    


    I tried using OpenCV to record the graph, but then I didn't even get any recording. I think solving this issue within my original code would be a better approach.
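
    One possible direction (a sketch under assumptions, not a verified fix): ani.save() drives the animation itself to write the file and never enters the interactive event loop, so saving and viewing with it are mutually exclusive. An alternative is to keep FuncAnimation for the live window and push each drawn frame to the FFMpegWriter manually from the update callback; the dpi value and the stripped-down update() body below are placeholders.

import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation, FFMpegWriter

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(10, 8))

# Open the output file once; dpi is an assumption, adjust as needed.
writer = FFMpegWriter(fps=10, metadata=dict(artist='Me'), bitrate=1800)
writer.setup(fig, "sensor_data.mp4", dpi=100)

def update(frame):
    # ... read the serial ports and redraw ax1/ax2 exactly as in the
    # original update() above ...
    writer.grab_frame()  # record the frame that was just drawn

ani = FuncAnimation(fig, update, interval=100, cache_frame_data=False)

plt.tight_layout()
plt.show()        # the interactive window stays usable while frames are grabbed

writer.finish()   # finalize sensor_data.mp4 after the window is closed

    The writer is opened once with setup() and finalized with finish() only after the window is closed, so the video covers exactly what was shown live.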

    


  • washed out colors when converting h264 into vp9 using ffmpeg [closed]

    31 July 2024, by apes-together-strong

    I'm trying to convert an h264 video to vp9/opus webm.
Every attempt so far has had washed-out colors.
Is there a fix for this issue, or is it just the way h264->vp9 conversion is?

    


    ffprobe of the source file:

    


    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test1.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 0
    compatible_brands: isommp42
    creation_time   : 2024-07-30T17:03:10.000000Z
    com.android.version: 10
  Duration: 00:01:04.07, start: 0.000000, bitrate: 20198 kb/s
  Stream #0:0[0x1](eng): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, bt470bg/bt470bg/smpte170m, progressive), 1920x1080, 20001 kb/s, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 90k tbn (default)
    Metadata:
      creation_time   : 2024-07-30T17:03:10.000000Z
      handler_name    : VideoHandle
      vendor_id       : [0][0][0][0]
    Side data:
      displaymatrix: rotation of -90.00 degrees
  Stream #0:1[0x2](eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 96 kb/s (default)
    Metadata:
      creation_time   : 2024-07-30T17:03:10.000000Z
      handler_name    : SoundHandle
      vendor_id       : [0][0][0][0]
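
    A hedged sketch of one direction to try (an assumption based on the stream being tagged yuvj420p with pc, i.e. full, colour range; not a confirmed fix): washed-out output after re-encoding often comes from a full-range source being reinterpreted as limited range, so converting the range explicitly and tagging the colour metadata on the VP9 stream may help. File names and quality settings below are placeholders.

import subprocess

cmd = [
    "ffmpeg", "-i", "test1.mp4",
    # Convert the full-range (pc) source to limited (tv) range explicitly
    # instead of letting the encoder guess, and output plain yuv420p.
    "-vf", "scale=in_range=pc:out_range=tv,format=yuv420p",
    # Tag the colour metadata so players interpret the output like the source.
    "-color_range", "tv",
    "-colorspace", "bt470bg",
    "-color_primaries", "bt470bg",
    "-color_trc", "smpte170m",
    "-c:v", "libvpx-vp9", "-crf", "30", "-b:v", "0",
    "-c:a", "libopus",
    "out.webm",
]
subprocess.run(cmd, check=True)

    The scale filter does the actual range conversion; the -color_* options only tag metadata on the output stream.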