
Media (91)

Other articles (75)

  • User profiles

    12 April 2011

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
    The user can reach the profile editing page from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section where support for new languages can be enabled.
    Each newly added language can still be disabled as long as no object has been created in that language; once one has, the language is greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it manages a set of dynamic tags for use in the context of the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)

On other sites (7658)

  • Visualizing Call Graphs Using Gephi

    1 September 2014, by Multimedia Mike — General

    When I was at university studying computer science, I took a basic chemistry course. During an accompanying lab, the teaching assistant chatted me up and asked about my major. He then said, “Computer science? Well, that’s just typing stuff, right?”

    My impulsive retort: “Sure, and chemistry is just about mixing together liquids and coming up with different colored liquids, as seen on the cover of my high school chemistry textbook, right?”


    [Image: Chemistry fun]

    In fact, pure computer science has precious little to do with typing (as is joked in CS circles, computer science is about computers in the same way that astronomy is about telescopes). However, people who study computer science often pursue careers as programmers, or to put it in fancier professional language, software engineers.

    So, what’s a software engineer’s job? Isn’t it just typing? That’s where I’ve been going with this overly long setup. After thinking about it for long enough, I like to say that a software engineer’s trade is managing complexity.

    A few years ago, I discovered Gephi, an open source tool for graph and data visualization. It looked neat but I didn’t have much use for it at the time. Recently, however, I was trying to get a better handle on a large codebase. I.e., I was trying to manage the project’s complexity. And then I thought of Gephi again.

    Prior Work
    One way to get a grip on a large C codebase is to instrument it for profiling and extract details from the profiler. On Linux systems, this means compiling and linking the code using the -pg flag. After running the executable, there will be a gmon.out file which is post-processed using the gprof command.

    GNU software development tools have a reputation for being rather powerful and flexible, but also extremely raw. This first hit home when I was learning how to use the GNU tool for code coverage — gcov — and the way it outputs very raw data that you need to massage with other tools in order to get really useful intelligence.

    And so it is with gprof output. The output gives you a list of functions sorted by the amount of processing time spent in each. Then it gives you a flattened call tree. This is arranged as “during the profiled executions, function c was called by functions a and b and called functions d, e, and f; function d was called by function c and called functions g and h”.
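
    Put another way, each entry in that flattened tree pairs a function with the callers and callees observed during the run. As a toy Python illustration (hypothetical, using only the placeholder function names from the sentence above):

    # Toy illustration only: the shape of the information in gprof's flattened
    # call tree, keyed by function name.
    call_tree = {
        "c": {"callers": ["a", "b"], "callees": ["d", "e", "f"]},
        "d": {"callers": ["c"], "callees": ["g", "h"]},
    }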

    How can this call tree data be represented in a more instructive manner that is easier to navigate? My first impulse (and I don’t think I’m alone in this) is to convert the gprof call tree into a representation suitable for interpretation by Graphviz. Unfortunately, doing so tends to generate some enormous and unwieldy static images.

    Feeding gprof Data To Gephi
    I learned of Gephi a few years ago and recalled it when I developed an interest in gaining better perspective on a large base of alien C code. To understand what this codebase is doing for a particular use case, instrument it with gprof, gather execution data, and then study the code paths.

    How could I feed the gprof data into Gephi? Gephi supports numerous graphing formats including an XML-based format named GEXF.

    Thus, the challenge becomes converting gprof output to GEXF.

    Which I did.
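
    The actual conversion lives in the script mentioned at the end of this post. Purely as a sketch of the general shape (hypothetical code, not the published convert-gprof-to-gexf.py), such a converter could take (caller, callee, call count) tuples pulled from gprof's call-graph section and emit them as GEXF nodes and weighted edges:

    # Hypothetical sketch, not the published convert-gprof-to-gexf.py: emit a
    # minimal GEXF graph from (caller, callee, call_count) tuples that have
    # already been extracted from gprof's call-graph section.
    import sys
    import xml.etree.ElementTree as ET

    def write_gexf(edges, out_path):
        gexf = ET.Element("gexf", xmlns="http://www.gexf.net/1.2draft", version="1.2")
        graph = ET.SubElement(gexf, "graph", defaultedgetype="directed")
        nodes_el = ET.SubElement(graph, "nodes")
        edges_el = ET.SubElement(graph, "edges")

        # One node per unique function name
        node_ids = {}
        for caller, callee, _ in edges:
            for name in (caller, callee):
                if name not in node_ids:
                    node_ids[name] = str(len(node_ids))
                    ET.SubElement(nodes_el, "node", id=node_ids[name], label=name)

        # One directed edge per caller/callee pair, weighted by call count
        for i, (caller, callee, count) in enumerate(edges):
            ET.SubElement(edges_el, "edge", id=str(i),
                          source=node_ids[caller], target=node_ids[callee],
                          weight=str(count))

        ET.ElementTree(gexf).write(out_path, encoding="UTF-8", xml_declaration=True)

    if __name__ == "__main__":
        # Toy input; a real converter would parse these pairs out of the gprof report.
        sample = [("decode_coeffs_b", "iwht_iwht_4x4_add_c", 18774)]
        write_gexf(sample, sys.argv[1] if len(sys.argv) > 1 else "callgraph.gexf")

    Carrying the call count as the edge weight is one way to end up with the edge thicknesses discussed below.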

    Demonstration
    I have been absent from FFmpeg development for a long time, which is a pity because a lot of interesting development has occurred over the last 2-3 years after a troubling period of stagnation. I know that 2 big video codec developments have been HEVC (next in the line of MPEG codecs) and VP9 (heir to VP8’s throne). FFmpeg implements them both now.

    I decided I wanted to study the code flow of VP9. So I got the latest FFmpeg code from git and built it using the options "--extra-cflags=-pg --extra-ldflags=-pg". Annoyingly, I also needed to specify "--disable-asm" because gcc complains of some register allocation snafus when compiling inline ASM in profiling mode (and this is on x86_64). No matter; ASM isn’t necessary for understanding overall code flow.

    After compiling, the binary ‘ffmpeg_g’ will have symbols and be instrumented for profiling. I grabbed a sample from this VP9 test vector set and went to work.

    ./ffmpeg_g -i vp90-2-00-quantizer-00.webm -f null /dev/null
    gprof ./ffmpeg_g > vp9decode.txt
    convert-gprof-to-gexf.py vp9decode.txt > /bigdisk/vp9decode.gexf
    

    Gephi loads vp9decode.gexf with no problem. Using Gephi, however, can be a bit challenging if one is not versed in any data exploration jargon. I recommend this Gephi getting started guide in slide deck form. Here’s what the default graph looks like:


    [Image: gprof-ffmpeg-gephi-1]

    Not very pretty or helpful. BTW, that beefy arrow running from mid-top to lower-right is the call from decode_coeffs_b -> iwht_iwht_4x4_add_c. There were 18774 calls from the former to the latter in this execution. Right now, the edge thicknesses correlate to the number of calls between the nodes, which I’m not sure is the best representation.

    Following the tutorial slide deck, I at least learned how to enable the node labels (function symbols in this case) and apply a layout algorithm. The tutorial shows the force atlas layout. Here’s what the node neighborhood looks like for probing file type:


    [Image: gprof-ffmpeg-gephi-2]

    Okay, so that’s not especially surprising: avprobe_input_format3 calls all of the *_probe functions in order to automatically determine input type. Let’s find that decode_coeffs_b function and see what its neighborhood looks like:


    [Image: gprof-ffmpeg-gephi-3]

    That’s not very useful. Perhaps another algorithm might help. I select the Fruchterman–Reingold algorithm instead and get a slightly more coherent representation of the decoding node neighborhood:


    [Image: gprof-ffmpeg-gephi-4]

    Further Work
    Obviously, I’m just getting started with this data exploration topic. One thing I would really appreciate in such a tool is the ability to interactively travel the graph since that’s what I’m really hoping to get out of this experiment: watching the code flows.

    Perhaps someone else can find better use cases for visualizing call graph data. Thus, I have published the source code for this tool on GitHub.

  • Can't view and record graph at the same time using FFMpegWriter [closed]

    7 July 2024, by Barkın Özer

    So this code is used for graphing and logging sensor data coming from Bluetooth ports. I wanted to add a function that records the graph in mp4 format. To achieve this, I used FFMpegWriter. The issue is that while this code records the graph, I can't view the graph at the same time.

    


    import serial
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation, FFMpegWriter
import openpyxl
from datetime import datetime

# Constants
GRAVITY = 9.81  # Standard gravity in m/s²

# Initialize serial connections to HC-06 devices
ser_x_accel = serial.Serial('COM4', 9600, timeout=1)  # X-axis acceleration data
ser_y_angle = serial.Serial('COM11', 9600, timeout=1)  # Y-axis angle data

# Initialize empty lists to store data
x_accel_data = []
y_angle_data = []
timestamps = []

# Initialize Excel workbook
wb = openpyxl.Workbook()
ws = wb.active
ws.title = "Sensor Data"
ws.append(["Timestamp", "X Acceleration (m/s²)", "Y Angle (degrees)"])

# Function to update the plot and log data
def update(frame):
    # Read data from serial connections
    line_x_accel = ser_x_accel.readline().decode('utf-8').strip()
    line_y_angle = ser_y_angle.readline().decode('utf-8').strip()
    
    try:
        # Parse and process X-axis acceleration data
        x_accel_g = float(line_x_accel)  # Acceleration in g read from serial
        x_accel_ms2 = x_accel_g * GRAVITY  # Convert from g to m/s²
        x_accel_data.append(x_accel_ms2)
        
        # Parse and process Y-axis angle data
        y_angle = float(line_y_angle)
        y_angle_data.append(y_angle)
        
        # Append timestamp
        timestamps.append(datetime.now())

        # Limit data points to show only the latest 100
        if len(x_accel_data) > 100:
            x_accel_data.pop(0)
            y_angle_data.pop(0)
            timestamps.pop(0)

        # Log data to Excel with timestamp
        timestamp_str = timestamps[-1].strftime("%H:%M:%S")
        ws.append([timestamp_str, x_accel_data[-1], y_angle_data[-1]])

        # Clear and update plots
        ax1.clear()
        ax1.plot(timestamps, x_accel_data, label='X Acceleration', color='b')
        ax1.legend(loc='upper left')
        ax1.set_ylim([-20, 20])  # Adjust based on expected acceleration range in m/s²
        ax1.set_title('Real-time X Acceleration Data')
        ax1.set_xlabel('Time')
        ax1.set_ylabel('Acceleration (m/s²)')
        ax1.grid(True)

        ax2.clear()
        ax2.plot(timestamps, y_angle_data, label='Y Angle', color='g')
        ax2.legend(loc='upper left')
        ax2.set_ylim([-180, 180])
        ax2.set_title('Real-time Y Angle Data')
        ax2.set_xlabel('Time')
        ax2.set_ylabel('Angle (degrees)')
        ax2.grid(True)

        # Update text boxes with latest values
        text_box.set_text(f'X Acceleration: {x_accel_data[-1]:.2f} m/s²')
        text_box2.set_text(f'Y Angle: {y_angle_data[-1]:.2f}°')
        
        # Save the workbook periodically (every 100 updates)
        if frame % 100 == 0:
            wb.save("sensor_data.xlsx")
        
    except ValueError:
        pass  # Ignore lines that are not properly formatted

# Setup the plots
fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(10, 8))
text_box = ax1.text(0.05, 0.95, '', transform=ax1.transAxes, fontsize=12, verticalalignment='top', bbox=dict(boxstyle='round', facecolor='wheat', alpha=0.5))
text_box2 = ax2.text(0.05, 0.95, '', transform=ax2.transAxes, fontsize=12, verticalalignment='top', bbox=dict(boxstyle='round', facecolor='wheat', alpha=0.5))

# Animate the plots
ani = FuncAnimation(fig, update, interval=100)  # Update interval of 100ms

# Save the animation as a video file
writer = FFMpegWriter(fps=10, metadata=dict(artist='Me'), bitrate=1800)
ani.save("sensor_data.mp4", writer=writer)

plt.tight_layout()
plt.show()

# Save the workbook at the end of the session
wb.save("sensor_data.xlsx")



    


    I tried using OpenCV to record the graph, but then I didn't even get a recording. I think solving this issue within my original code would be a better approach.
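
    One direction that might help (a sketch only, not verified against the asker's setup): skip ani.save(), which renders its frames offline before plt.show() ever runs, and instead drive the writer by hand with FFMpegWriter's setup()/grab_frame()/finish(), so every frame drawn in the interactive window is also written to the mp4. The data source below is a stand-in for the serial reads so the sketch runs on its own:

import random
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation, FFMpegWriter

fig, ax = plt.subplots()
values = []

# Open the output file up front instead of calling ani.save() at the end
writer = FFMpegWriter(fps=10, metadata=dict(artist='Me'), bitrate=1800)
writer.setup(fig, "sensor_data.mp4", dpi=100)

def update(frame):
    values.append(random.uniform(-20, 20))  # stand-in for the serial read
    ax.clear()
    ax.plot(values[-100:], color='b')
    ax.set_title('Real-time data')
    writer.grab_frame()  # capture the frame currently shown in the window

ani = FuncAnimation(fig, update, interval=100, cache_frame_data=False)
plt.show()        # the window stays interactive while frames are recorded
writer.finish()   # finalize the mp4 after the window is closed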

    


  • How to fix ffmpeg.js length error in react-js project | fix error in react project

    19 February 2024, by X3R0

    I've tried to import the ffmpeg.js library into my React + TypeScript project. I can't really update my project's React version or react-scripts because of the current code base.

    


    Error

    


    ./node_modules/@ffmpeg/ffmpeg/dist/umd/ffmpeg.js
TypeError: REDACTED_PROJECT_ROOT\node_modules\@ffmpeg\ffmpeg\dist\umd\ffmpeg.js: Cannot read properties of undefined (reading 'length')


    


    Code

    


    import React, { useEffect, useRef, useState } from 'react';
import { FFmpeg } from '@ffmpeg/ffmpeg';

/* MORE CODE HERE */

const ffmpeg = new FFmpeg();
const baseURL = 'https://unpkg.com/@ffmpeg/core@0.12.6/dist/umd';
const coreURL = `${baseURL}/ffmpeg-core.js`;
const wasmURL = `${baseURL}/ffmpeg-core.wasm`;
const coreData = await fromURLToBlob(coreURL);
const wasmData = await fromURLToBlob(wasmURL);
const coreBlob = new Blob([coreData], { type: "text/javascript"});
const wasmBlob = new Blob([wasmData], { type: "application/wasm"});
await ffmpeg.load({
    coreURL: fromBlobToURL(coreBlob),
    wasmURL: fromBlobToURL(wasmBlob),
});       

/* MORE CODE HERE */


    


    Versions

    


    npm v8.19.4
node v16.20.2


    


    tsconfig.json

    


    {
  "compilerOptions": {
    "target": "es5",
    "lib": [
      "dom",
      "dom.iterable",
      "esnext"
    ],
    "allowJs": true,
    "skipLibCheck": true,
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true,
    "strict": false,
    "forceConsistentCasingInFileNames": true,
    "noFallthroughCasesInSwitch": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "react-jsx"
  },
  "include": [
    "src"
  ]
}



    


    package.json

    


    {
  "name": "frontend",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "@emotion/react": "^11.10.4",
    "@emotion/styled": "^11.10.4",
    "@ffmpeg/ffmpeg": "^0.12.10",
    "@material-ui/core": "^4.12.3",
    "@mui/base": "^5.0.0-beta.36",
    "@mui/icons-material": "^5.10.3",
    "@mui/material": "^5.15.10",
    "@mui/x-data-grid": "^6.19.2",
    "@mui/x-data-grid-pro": "^6.19.2",
    "@reduxjs/toolkit": "^1.8.5",
    "@testing-library/jest-dom": "^5.14.1",
    "@testing-library/react": "^11.2.7",
    "@testing-library/user-event": "^12.8.3",
    "@toast-ui/editor": "^3.1.3",
    "@toast-ui/react-editor": "^3.1.3",
    "@types/jest": "^29.0.1",
    "@types/node": "^18.7.17",
    "@types/react": "^17.0.49",
    "@types/react-dom": "^18.0.6",
    "@zalando/oauth2-client-js": "^0.0.18",
    "ajv": "^8.12.0",
    "ajv-errors": "^3.0.0",
    "apexcharts": "^3.28.1",
    "arraybuffer-concat": "^0.0.1",
    "axios": "^1.4.0",
    "base64-blob": "^1.4.1",
    "bootstrap": "^5.1.1",
    "datetime-diff": "^0.2.1",
    "fuzzy-time": "^1.0.7",
    "jquery": "^3.6.0",
    "jso": "^4.1.1",
    "luxon": "^2.3.0",
    "pretty-bytes": "^5.6.0",
    "react": "^17.0.2",
    "react-apexcharts": "^1.3.9",
    "react-beautiful-dnd": "^13.1.0",
    "react-bootstrap": "^2.0.0-rc.0",
    "react-dom": "^17.0.2",
    "react-export-excel": "^0.5.3",
    "react-facebook": "^9.0.12",
    "react-helmet": "^6.1.0",
    "react-icons": "^4.3.1",
    "react-media-recorder": "^1.6.6",
    "react-notifications": "^1.7.2",
    "react-pages": "^0.4.4",
    "react-redux": "^7.2.8",
    "react-router": "^5.2.1",
    "react-router-dom": "^5.2.1",
    "react-scripts": "4.0.3",
    "react-toastify": "^8.0.2",
    "react-tooltip": "^4.2.21",
    "react-webcam": "^6.0.0",
    "recharts": "^2.1.8",
    "redux": "^4.2.0",
    "redux-thunk": "^2.4.1",
    "typescript": "^4.8.3",
    "web-vitals": "^1.1.2",
    "website-popup": "^3.0.0"
  },
  "scripts": {
    "start": "react-scripts start",
    "build": "set NODE_OPTIONS=--max-old-space-size=4096 && react-scripts build --GENERATE_SOURCEMAP=false",
    "test": "react-scripts test",
    "eject": "react-scripts eject"
  },
  "eslintConfig": {
    "extends": [
      "react-app"
    ]
  },
  "browserslist": {
    "production": [
      ">0.2%",
      "not dead",
      "not op_mini all"
    ],
    "development": [
      "last 1 chrome version",
      "last 1 firefox version",
      "last 1 safari version"
    ]
  },
  "devDependencies": {
    "@babel/plugin-proposal-nullish-coalescing-operator": "^7.18.6"
  }
}