Advanced search

Media (1)

Keyword: - Tags -/3GS

Other articles (89)

  • Mediabox: open images in the maximum space available to the user

    8 February 2011, by

    Image display is constrained by the width allowed by the site design (which depends on the theme in use), so images are shown in a reduced format. To take advantage of all the space available on the user's screen, a feature can be added that displays the image in a multimedia box overlaid on top of the rest of the content.
    To do this, the "Mediabox" plugin must be installed.
    Configuring the multimedia box
    As soon as (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Other interesting software

    12 April 2011, by

    We don't claim to be the only ones doing what we do... and we certainly don't claim to be the best either... We simply try to do what we do well, and better and better...
    The following list covers software that more or less tries to do what MediaSPIP does, or that MediaSPIP more or less tries to do the same as; it doesn't really matter...
    We don't know them and haven't tried them, but you may want to take a look at them.
    Videopress
    Website: (...)

On other sites (9491)

  • Remove flicker, crop and upscale in ffmpeg

    22 August 2024, by Sabha

    I spent a full day on ffmpeg command lines, searching a lot on Google, but could not achieve what I wanted, so I came here to seek some advice.
    I have a video testinput.mpg which I believe is an MPEG-2 video. It is 720x576, 25 fps, with a total bitrate of 4224 kbps.

    


    The first problem is that the exported footage flickers, which I wasn't able to remove using ffmpeg despite the many commands I tried, like adjusting brightness, contrast, hue and saturation.

    


    The second problem was to extract the center portion, which I was able to do with the crop filter using the following command.

    


    ffmpeg -i testinput.mpg -filter:v "crop=468:374" testoutput.mpg


    


    But after cropping I observed that the bitrate fell from 4224 kbps to 761 kbps, and I assume this has reduced the quality of the video.
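
    That drop is expected: applying -filter:v forces a re-encode, and without explicit options ffmpeg falls back to its default (fairly low) encoder settings. As a hedged sketch, the crop could keep roughly the source quality by re-encoding with an explicit target (the mpeg2video codec and the 4224k bitrate mirror the source file; the exact numbers are assumptions to adjust):

    ffmpeg -i testinput.mpg -filter:v "crop=468:374" -c:v mpeg2video -b:v 4224k -c:a copy testoutput.mpg

    Alternatively, -qscale:v 2 instead of a fixed bitrate gives near-constant quality with MPEG-2.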

    


    What I want to achieve is:

    


      

    1. Crop the video properly while keeping the same quality (acodec copy / vcodec copy) -> ffmpeg did not allow me to do both things together (cropping and keeping the same codec)
    2. Remove the flicker from the video and upscale it to 4K or HD quality so that it looks nice on a big television (preferably 4K) - a rough command sketch follows below
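
    For the two goals above, a minimal combined sketch, assuming a reasonably recent ffmpeg build where the deflicker and scale filters and libx264 are available (the crop size, Lanczos scaler flag and CRF/preset values are placeholders to tune):

    ffmpeg -i testinput.mpg -filter:v "crop=468:374,deflicker,scale=3840:-2:flags=lanczos" -c:v libx264 -crf 18 -preset slow -c:a aac -b:a 192k testoutput_4k.mp4

    scale=3840:-2 keeps the cropped aspect ratio at a 3840-pixel width; a plain stream copy (-c copy) is not possible here because any filter requires re-encoding.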


    


    I would appreciate some help on how to achieve the desired result. Can someone shed some light on it?

    


    Here are 10-second sample videos on Google Drive that I am working on:

    


    testinput.mpg

    


    testoutput.mpg

    


    Thanks

    


  • How to work with data received from streaming services in my Java application?

    24 November 2020, by gabriel garcia

    I'm currently trying to develop a "streaming client" as a way to organize multiple streaming services (twitch, yt, mitele...) in a single desktop application written in Java.

    


    It basically relies on streamlink (which in turn relies on ffmpeg) thanks to all its features, so my project could be defined as a frontend for streamlink.

    


    Straight to the point, one of the features I'd like to add is the option to programmatically record streams in the background and show the video stream to the user when requested. Since there's also the possibility that the user wants to watch the stream without recording it, I have to work with all the raw byte data sent from those streaming sources.

    


    So, the problem is basically that I do not know much about video encoding/decoding/muxing/demuxing or video theory such as container structure, video formats and so on.

    


    But the idea is to work with all the data sent from the stream source (let's say twitch, for example), read these bytes (I'm not sure what kind of information is sent to the client, nor in what format) from the java.lang.Process's stdout and then present it to the client.
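
    Before handling those bytes in Java, it may help to inspect from a shell what streamlink actually writes to stdout; for HLS sources such as Twitch this is typically an MPEG-TS container. A minimal sketch (the channel URL and the "best" quality name are placeholders):

    streamlink --stdout https://www.twitch.tv/some_channel best | ffprobe -hide_banner pipe:0

    Whatever streams ffprobe reports there (for example H.264 video plus AAC audio muxed in MPEG-TS) are exactly the bytes that will arrive on the java.lang.Process stdout, so they have to be demuxed and decoded rather than read line by line.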

    


    Here's another problem: I don't know how to play video streams in JavaFX, and I don't think it's even supported right now. So I would have to extract each frame and its associated sound from stdout and show them to the user each time a new frame is received (oops, yet another problem, since I don't know where each frame starts/ends when I'm reading stdout line by line).

    


    As a summary:

    


      

    • What kind of data am I receiving from the streaming source?
    • How can I know where each frame starts/stops?
    • How can I extract the image and sound from each frame?


    


    I hope I'm not asking too much and that you could shed some light upon my darkness.

    

