
Other articles (41)

  • Customizing categories

    21 June 2013, by

    Category creation form
    For those who know SPIP well, a category can be thought of as a rubrique (section).
    For a category-type document, the fields offered by default are: Text
    This form can be modified under:
    Administration > Configuration des masques de formulaire.
    For a media-type document, the fields not displayed by default are: Descriptif rapide (short description)
    It is also in this configuration section that you can specify the (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your installed MediaSPIP is at version 0.2 or higher. If in doubt, contact the administrator of your MediaSPIP to find out.

  • Accepted formats

    28 January 2010, by

    The following commands report which formats and codecs the local ffmpeg installation supports:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
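
    The two capability queries above can also be scripted. A minimal sketch in Java (the class name is illustrative; it only assumes ffmpeg is on the PATH):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    // Runs "ffmpeg -codecs" and "ffmpeg -formats" and prints their output,
    // the same queries suggested above for inspecting a local install.
    public class FfmpegCapabilities {
        public static void main(String[] args) throws Exception {
            for (String query : new String[] { "-codecs", "-formats" }) {
                Process p = new ProcessBuilder("ffmpeg", query)
                        .redirectErrorStream(true)   // ffmpeg logs to stderr
                        .start();
                try (BufferedReader r = new BufferedReader(
                        new InputStreamReader(p.getInputStream()))) {
                    String line;
                    while ((line = r.readLine()) != null) {
                        System.out.println(line);
                    }
                }
                p.waitFor();
            }
        }
    }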

On other sites (8889)

  • Managing Music Playback Channels

    30 June 2013, by Multimedia Mike (General)

    My Game Music Appreciation site allows users to interact with old video game music by toggling various channels, as long as the underlying synthesizer engine supports it.


    5 NES voices

    Users often find their way to the Nintendo DS section pretty quickly. This is when they notice an obnoxious quirk with the channel toggling feature: specifically, one channel doesn’t seem to map to a particular instrument or track.

    When it comes to computer music playback methodologies, I have long observed that there are 2 general strategies: fixed channel allocation and dynamic channel allocation.

    Fixed Channel Approach
    One of my primary sources of computer-based entertainment used to be watching music. Sure, I listened to it as well. But for things like Amiga MOD files and related tracker formats, there was a rich ecosystem of fun music playback programs that visualized the music. Music visualization modes exist in various music players these days (such as iTunes and Windows Media Player), but those largely just show you a single waveform. These files were synthesized in real time from multiple audio channels, and the players usually showed some form of analysis for each channel. My personal favorite was Cubic Player:


    Open Cubic Player -- oscilloscopes

    Most of these players supported the concept of masking individual channels. In doing so, the user could isolate, study, and enjoy different components of the song. For many 4-channel Amiga MOD files, I observed that the common arrangement was to use the 4 channels for beat (percussion track), bass line, chords, and melody. Thus, it was easy to just listen to, e.g., the bass line in isolation.
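
    Masking like this amounts to little more than a per-channel mute flag consulted in the mixing loop. Here is a minimal hypothetical sketch in Java (illustrative names, not any particular player’s code):

    // Hypothetical fixed-channel mixer: each tracker channel renders into
    // its own buffer, and a per-channel mask decides whether it reaches the
    // final mix. This is the mechanism behind "solo the bass line".
    public class ChannelMasker {
        private final boolean[] enabled;

        public ChannelMasker(int channelCount) {
            enabled = new boolean[channelCount];
            java.util.Arrays.fill(enabled, true);   // all channels audible by default
        }

        public void setEnabled(int channel, boolean on) {
            enabled[channel] = on;
        }

        // channelSamples[c][i] is sample i rendered by channel c
        public float[] mix(float[][] channelSamples, int frameCount) {
            float[] out = new float[frameCount];
            for (int c = 0; c < channelSamples.length; c++) {
                if (!enabled[c]) continue;           // masked channels stay silent
                for (int i = 0; i < frameCount; i++) {
                    out[i] += channelSamples[c][i];
                }
            }
            return out;
        }
    }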

    MODs and similar formats specified precisely which digital audio sample to play at what time and on which specific audio channel. To view the internals of one of these formats, one gets the impression that they contain an extremely computer-centric view of music.

    Dynamic Channel Allocation Algorithm
    MODs et al. enjoyed a lot of popularity, but the standard for computer music is MIDI. While MOD and friends took a computer-centric view of music, MIDI takes, well, a music-centric view of music.

    There are MIDI visualization programs as well. The one that came with my Gravis Ultrasound was called PLAYMIDI.EXE. It looked like this…


    Gravis Ultrasound PLAYMIDI.EXE application

    … and it confused me. There are 16 distinct channels being visualized but some channels are shown playing multiple notes. When I dug into the technical details, I learned that MIDI just specifies what notes need to be played, at what times and frequencies and using which instrument samples, and it was the MIDI playback program’s job to make it happen.

    Thus, if a MIDI file specifies that track 1 should play a C major chord consisting of notes C, E, and G, it would transmit the events “key-on C; delta time 0; key-on E; delta time 0; key-on G; delta time …; [other commands]”. If the playback program has access to multiple channels (say, up to 32, in the case of the GUS), the intuitive approach would be to maintain a pool of all available channels. Then, when it’s time to process the “key-on C” event, fetch the first available channel from the pool, mark it as in-use, play C on the channel, and return that channel to the pool when either the sample runs its course or the corresponding “key-off C” event is encountered in the MIDI command stream.
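
    That pool strategy is straightforward to sketch. The following is a hypothetical Java allocator (illustrative names; nothing here is GUS-specific):

    import java.util.ArrayDeque;
    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical dynamic channel allocator: key-on grabs the first free
    // hardware channel; key-off (or the sample ending) returns it to the pool.
    public class ChannelPool {
        private final ArrayDeque<Integer> free = new ArrayDeque<>();
        // maps a sounding note to the hardware channel currently playing it
        private final Map<Integer, Integer> sounding = new HashMap<>();

        public ChannelPool(int hardwareChannels) {   // e.g. 32 on a GUS
            for (int ch = 0; ch < hardwareChannels; ch++) {
                free.add(ch);
            }
        }

        // returns the channel to play the note on, or -1 if none are free
        public int keyOn(int note) {
            Integer ch = free.poll();
            if (ch == null) {
                return -1;   // a real player would steal a voice here
            }
            sounding.put(note, ch);
            return ch;
        }

        public void keyOff(int note) {
            Integer ch = sounding.remove(note);
            if (ch != null) {
                free.add(ch);   // channel becomes available again
            }
        }
    }

    For the C major chord above, the three key-on events would simply claim the next three free channels from the pool, which is exactly why a single pool channel does not correspond to any one instrument or track.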

    About That Game Music
    Circling back around to my game music website, numerous supported systems use the fixed channel approach for playback while others use the dynamic channel allocation approach, including every Nintendo DS game I have analyzed so far.

    Which approach is better? As in many technical matters, there are trade-offs either way. For many older audio synthesis systems, the fixed channel approach is necessary because different channels had very specific purposes. The 8-bit NES had 5 channels: 2 square wave generators (used musically for melody/treble), 1 triangle wave generator (usually used for the bass line), a noise generator (subverted for all manner of percussive sounds), and a limited digital channel (sometimes assigned richer percussive sounds). Dynamic channel allocation wouldn’t work here.

    But the dynamic approach works great on hardware with 16 digital channels available like, for example, the Nintendo DS. Digital channels are very general-purpose. What about the SNES, with its 8 digital channels? Either approach could work. In practice, most games used a fixed channel approach: games might use 4-6 channels for music while reserving the remainder for various in-game sound effects. Some notable exceptions to this pattern were David Wise’s compositions for Rare’s SNES games (think Battletoads and the various Donkey Kong Country titles). These clearly use some dynamic channel allocation approach, since masking all but one channel will give you a variety of instrument sounds.

    Epilogue
    There! That took a long time to explain, but I find it fascinating for some reason. I need to distill it down to far fewer words, because I want to make it a FAQ on my website for “Why can’t I isolate specific tracks for Nintendo DS games?”

    Actually, perhaps I should remove the ability to toggle Nintendo DS channels in the first place. Here’s a funny tale of needless work: I found the Vio2sf engine for synthesizing Nintendo DS music and incorporated it into the program. It didn’t support toggling of individual channels, so I figured out a way to add that feature to the engine. And then I noticed that most Nintendo DS games render that feature moot. After I released the webapp, I learned that my copy of the Vio2sf engine was out of date, and the final insult was that the latest version already supports channel toggling. So I did the work for nothing; and since I want to remove the feature from the UI anyway, doubly so.

  • Evolution #3518 (New): #contenu and user cfg Large screen / Small screen

    29 July 2015, by PIerre LASZCZAK

    Hello,

    Switching from Small screen -> Large screen has no effect on the

    In the end, this option only widens the banner and the footer...
    As it stands, the feature seems almost useless.

    For my part, a simple

     body.large #contenu { width: 756px !important; }

    rescues the "Large screen" option (I have never seen a right-hand column in SPIP's private area, apart from an "in the same section" box on an article).

  • Video processing: extract frames and encrypt them, then insert them back into the video in Java using Xuggler

    24 July 2015, by Anas M. Jubara

    I’m working on a video encryption application. The main idea is to feed a video file to the application, which should output it in encrypted form. I am using the Xuggler library to manipulate the video and get at the frames, and AES for the encryption.
    My code works fine for accessing the frames and encrypting them; what I need help with is how to write each encrypted frame back to the video file, replacing the original one, without corrupting the file for video players.
    Here is my code:

    package xuggler;

    import com.xuggle.mediatool.IMediaReader;
    import com.xuggle.mediatool.IMediaWriter;
    import com.xuggle.mediatool.MediaListenerAdapter;
    import com.xuggle.mediatool.ToolFactory;
    import com.xuggle.mediatool.event.IVideoPictureEvent;
    import com.xuggle.xuggler.Global;
    import com.xuggle.xuggler.ICodec;

    import java.awt.Graphics2D;
    import java.awt.Point;
    import java.awt.Transparency;
    import java.awt.image.BufferedImage;
    import java.awt.image.ColorModel;
    import java.awt.image.ComponentColorModel;
    import java.awt.image.DataBuffer;
    import java.awt.image.DataBufferByte;
    import java.awt.image.Raster;
    import java.awt.image.WritableRaster;

    import java.io.File;

    import java.security.InvalidKeyException;
    import java.security.NoSuchAlgorithmException;
    import java.security.SecureRandom;

    import java.util.concurrent.TimeUnit;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    import javax.crypto.BadPaddingException;
    import javax.crypto.Cipher;
    import javax.crypto.IllegalBlockSizeException;
    import javax.crypto.KeyGenerator;
    import javax.crypto.NoSuchPaddingException;
    import javax.crypto.SecretKey;

    import javax.imageio.ImageIO;


    public class DecodeAndCaptureFrames extends MediaListenerAdapter
    {
        // The number of seconds between captured frames.
        public static final double SECONDS_BETWEEN_FRAMES = 5;

        // The number of micro-seconds between captured frames.
        public static final long MICRO_SECONDS_BETWEEN_FRAMES =
            (long) (Global.DEFAULT_PTS_PER_SECOND * SECONDS_BETWEEN_FRAMES);

        // Time of the last frame write.
        private static long mLastPtsWrite = Global.NO_PTS;

        private static final double FRAME_RATE = 50;

        private static final int SECONDS_TO_RUN_FOR = 20;

        private static final String outputFilename = "D:\\K.mp4";

        public static IMediaWriter writer = ToolFactory.makeWriter(outputFilename);

        // Receives a BufferedImage and returns its raw byte data.
        public static byte[] get_byte_data(BufferedImage image) {
            WritableRaster raster = image.getRaster();
            DataBufferByte buffer = (DataBufferByte) raster.getDataBuffer();
            return buffer.getData();
        }


        // Copies image into a new BufferedImage backed by 3-byte BGR data.
        public static BufferedImage user_space(BufferedImage image) {
            BufferedImage new_img = new BufferedImage(image.getWidth(), image.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
            Graphics2D graphics = new_img.createGraphics();
            graphics.drawRenderedImage(image, null);
            graphics.dispose(); // release the graphics context's resources
            return new_img;
        }

        // Rebuilds a BufferedImage from raw interleaved BGR bytes.
        public static BufferedImage toImage(byte[] imagebytes, int width, int height) {
            DataBuffer buffer = new DataBufferByte(imagebytes, imagebytes.length);
            WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height,
                    3 * width, 3, new int[]{2, 1, 0}, (Point) null);
            ColorModel cm = new ComponentColorModel(ColorModel.getRGBdefault().getColorSpace(),
                    false, true, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);
            return new BufferedImage(cm, raster, true, null);
        }

        public static byte[] encrypt(byte[] orgnlbytes, String key) throws IllegalBlockSizeException, BadPaddingException {
            byte[] encbytes = null;
            try {
                Cipher cipher = Cipher.getInstance("AES");
                KeyGenerator keyGen = KeyGenerator.getInstance("AES");
                // deterministically seed a cryptographic PRNG so the same key
                // string always derives the same 128-bit AES key
                SecureRandom random = SecureRandom.getInstance("SHA1PRNG");
                random.setSeed(key.getBytes());
                keyGen.init(128, random);
                SecretKey secretKey = keyGen.generateKey();
                try {
                    cipher.init(Cipher.ENCRYPT_MODE, secretKey);
                } catch (InvalidKeyException ex) {
                    Logger.getLogger(DecodeAndCaptureFrames.class.getName()).log(Level.SEVERE, null, ex);
                }
                encbytes = cipher.doFinal(orgnlbytes);
            } catch (NoSuchAlgorithmException ex) {
                Logger.getLogger(DecodeAndCaptureFrames.class.getName()).log(Level.SEVERE, null, ex);
            } catch (NoSuchPaddingException ex) {
                System.out.print("cannot encrypt buffer");
            }
            return encbytes;
        }


        /**
         * The video stream index, used to ensure we display frames from one
         * and only one video stream from the media container.
         */
        private int mVideoStreamIndex = -1;

        /**
         * Opens a media container (file) and writes some of its video
         * frames out as encrypted PNG images in the temporary directory.
         *
         * @param args unused; the input filename is hardcoded below
         */
        public static void main(String[] args)
        {
            // create the decoder/capturer for the hardcoded input file
            DecodeAndCaptureFrames decodeAndCaptureFrames;
            decodeAndCaptureFrames = new DecodeAndCaptureFrames("D:\\K.mp4");
        }

        /**
         * Constructs a DecodeAndCaptureFrames which reads and captures
         * frames from a video file.
         *
         * @param filename the name of the media file to read
         */
        public DecodeAndCaptureFrames(String filename)
        {
            // create a media reader for processing video
            IMediaReader reader = ToolFactory.makeReader(filename);

            // stipulate that we want BufferedImages created in BGR 24-bit color space
            reader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);

            // DecodeAndCaptureFrames is derived from MediaListenerAdapter and
            // thus may be added as a listener to the MediaReader; it implements
            // onVideoPicture().
            reader.addListener(this);

            // read out the contents of the media file; nothing else happens
            // here. The action is in onVideoPicture(), which is called as
            // complete video pictures are extracted from the media source.
            while (reader.readPacket() == null)
            {
            }
        }

        /**
         * Called after a video frame has been decoded from a media stream.
         * Optionally a BufferedImage version of the frame may be passed
         * if the calling {@link IMediaReader} instance was configured to
         * create BufferedImages.
         *
         * This method blocks, so return quickly.
         */
        public void onVideoPicture(IVideoPictureEvent event)
        {
            try
            {
                // if the stream index does not match the selected stream index,
                // then have a closer look
                if (event.getStreamIndex() != mVideoStreamIndex)
                {
                    // if the selected video stream id is not yet set, go ahead and
                    // select this lucky video stream
                    if (-1 == mVideoStreamIndex)
                        mVideoStreamIndex = event.getStreamIndex();
                    // otherwise return; no need to process frames from this stream
                    else
                        return;
                }

                // if uninitialized, backdate mLastPtsWrite so we get the very
                // first frame
                if (mLastPtsWrite == Global.NO_PTS)
                    mLastPtsWrite = event.getTimeStamp() - MICRO_SECONDS_BETWEEN_FRAMES;

                // if it's time to write the next frame
                if (event.getTimeStamp() - mLastPtsWrite >= MICRO_SECONDS_BETWEEN_FRAMES)
                {
                    // convert the decoded frame to raw BGR bytes, encrypt them,
                    // and rebuild a BufferedImage from the ciphertext
                    BufferedImage orgnlimage = event.getImage();
                    orgnlimage = user_space(orgnlimage);
                    byte[] orgnlimagebytes = get_byte_data(orgnlimage);
                    byte[] encryptedbytes = encrypt(orgnlimagebytes, "abc");
                    BufferedImage encryptedimage = toImage(encryptedbytes, orgnlimage.getWidth(), orgnlimage.getHeight());

                    // for now the encrypted frame only goes to a temporary PNG;
                    // writing it back into the video is the open question
                    ImageIO.write(encryptedimage, "png", File.createTempFile("frame", ".png"));

                    // update last write time
                    mLastPtsWrite += MICRO_SECONDS_BETWEEN_FRAMES;
                }
            }
            catch (Exception e)
            {
                e.printStackTrace();
            }
        }

    }
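
    As for the open question of writing the frames back, one possible direction (a sketch, not a tested answer) is to use the IMediaWriter the class already creates: add a video stream once, then feed it each encrypted BufferedImage from onVideoPicture(). The writeFrame method and writerInitialized flag below are names introduced here for illustration. One important caveat: any lossy codec will alter the encrypted pixel data during re-encoding, making later decryption impossible, so a lossless codec would be required in practice.

    // Sketch of additions to DecodeAndCaptureFrames:
    private static boolean writerInitialized = false;

    private static void writeFrame(BufferedImage encryptedimage, long timeStampMicros) {
        if (!writerInitialized) {
            // set up one output video stream sized to the frames being written;
            // CODEC_ID_MPEG4 is illustrative only, and being lossy it would
            // corrupt the ciphertext: a lossless codec is needed in practice
            writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_MPEG4,
                    encryptedimage.getWidth(), encryptedimage.getHeight());
            writerInitialized = true;
        }
        // pass the source timestamp through so the output keeps its timing
        writer.encodeVideo(0, encryptedimage, timeStampMicros, TimeUnit.MICROSECONDS);
    }

    // In onVideoPicture(): writeFrame(encryptedimage, event.getTimeStamp());
    // then call writer.close() after the read loop in the constructor returns.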