
Media (91)

Other articles (80)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm" mode, you will also need to make other modifications (...)

  • Updating from version 0.1 to 0.2

    24 June 2013

    Explanation of the notable changes involved in moving from MediaSPIP version 0.1 to version 0.3. What’s new?
    Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

On other sites (11256)

  • dash.js: four-channel audio streaming

    19 July 2016, by Carlos Chacon

    Does dash.js support 4 channels in the audio stream?

    I’m using an mp4 container: video: h264 and audio: AAC.

    When the AAC audio stream has 2 channels it works fine: both video and audio play correctly. But 4 channels does NOT work.

    I’m using FFMPEG to create the files and WOWZA to set up the DASH streaming.

    When the AAC audio stream has 4 channels I get the following error:

    Video can't be played because the file is corrupt.

    Command line to generate the 2-channel file:

    ffmpeg -i video_in.mp4 -i audio_in_4ch.wav -c:v copy -c:a aac -ac 2 output_2channels.mp4

    Command line to generate the 4-channel file:

    ffmpeg -i video_in.mp4 -i audio_in_4ch.wav -c:v copy -c:a aac -ac 4 output_4channels.mp4

    This is the ffmpeg -i information printed for each file:

    2-channel output file details:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'output_2channels.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.37.101
     Duration: 00:00:17.59, start: 0.000000, bitrate: 901 kb/s
       Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 1024x512 [SAR 1:1 DAR 2:1], 894 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 129 kb/s (default)
       Metadata:
         handler_name    : SoundHandler

    4-channel output file details:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'output_4channels.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf57.37.101
     Duration: 00:00:17.59, start: 0.000000, bitrate: 1038 kb/s
       Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 1024x512 [SAR 1:1 DAR 2:1], 894 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
       Metadata:
         handler_name    : VideoHandler
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, 4.0, fltp, 265 kb/s (default)
       Metadata:
         handler_name    : SoundHandler
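
    For reference, one way to double-check the channel count and layout that actually ended up in the container is ffprobe; the audio stream index a:0 is assumed here:

    ffprobe -v error -select_streams a:0 -show_entries stream=channels,channel_layout -of default=noprint_wrappers=1 output_4channels.mp4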

    4-channel output file mpd file:

    <?xml version="1.0" encoding="UTF-8"?>
    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" publishTime="2016-07-19T05:56:28Z" mediaPresentationDuration="PT17.585S" minBufferTime="PT1.5S">
    <ProgramInformation>
    </ProgramInformation>
    <Location>http://192.168.0.103:1935/vod/_definst_/mp4:output_4channels.mp4/manifest_w882219731.mpd</Location>
    <Period start="PT0.0S">
       <BaseURL>http://192.168.0.103:1935/vod/_definst_/mp4:output_4channels.mp4/</BaseURL>
       <AdaptationSet mimeType="video/mp4" width="1024" height="512" par="1024:512" frameRate="30000/1001" segmentAlignment="true" startWithSAP="1" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
           <SegmentTemplate presentationTimeOffset="0" timescale="90000" media="chunk_ctvideo_rid$RepresentationID$_cs$Time$_w882219731_mpd.m4s" initialization="chunk_ctvideo_rid$RepresentationID$_cinit_w882219731_mpd.m4s">
               <SegmentTimeline>
                   <S t="0" d="1582650"/>
               </SegmentTimeline>
           </SegmentTemplate>
           <Representation codecs="avc1.42c01e" sar="1:1" bandwidth="894760"/>
       </AdaptationSet>
       <AdaptationSet mimeType="audio/mp4" lang="eng" segmentAlignment="true" startWithSAP="1" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
           <AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="2"/>
           <SegmentTemplate presentationTimeOffset="0" timescale="48000" media="chunk_ctaudio_rid$RepresentationID$_cs$Time$_w882219731_mpd.m4s" initialization="chunk_ctaudio_rid$RepresentationID$_cinit_w882219731_mpd.m4s">
               <SegmentTimeline>
                   <S t="0" d="844080"/>
               </SegmentTimeline>
           </SegmentTemplate>
           <Representation codecs="mp4a.40.2" audioSamplingRate="48000" bandwidth="265851"/>
       </AdaptationSet>
    </Period>
    </MPD>

    HTML Player Code:

    <script src="http://cdn.dashjs.org/latest/dash.all.min.js"></script>
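
    Only the script include survived in this excerpt; a minimal, illustrative way to wire dash.js up to the Wowza manifest above might look like the sketch below (the element id and autoplay flag are hypothetical, the manifest URL is the one from the MPD's Location element):

    <video id="videoPlayer" controls></video>
    <script>
        // Hypothetical sketch: point dash.js at the generated MPD and start playback.
        var url = "http://192.168.0.103:1935/vod/_definst_/mp4:output_4channels.mp4/manifest_w882219731.mpd";
        var player = dashjs.MediaPlayer().create();
        player.initialize(document.querySelector("#videoPlayer"), url, true);
    </script>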

  • Using the openCV library with Java works well on Linux but not on Windows

    6 August 2016, by user3586330

    I have a method that takes screenshots at fixed relative positions (25%, 50%, 75% and 100%) of a video file and saves each of them to a separate .png file. I use OpenCV with the JavaCV wrapper library to do that. The class/method of interest is:

    package de.stal.videoreporter;

    import org.bytedeco.javacpp.opencv_core;
    import static org.bytedeco.javacpp.opencv_imgcodecs.cvSaveImage;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameGrabber;
    import org.bytedeco.javacv.OpenCVFrameConverter;

    class VideoThumbnailer {
    public void createThumbnails(String videoname) throws FrameGrabber.Exception {

       FFmpegFrameGrabber g = new FFmpegFrameGrabber("videos/" + videoname);
       g.start();
       OpenCVFrameConverter.ToIplImage converterToIplImage = new OpenCVFrameConverter.ToIplImage();
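       // Compute the frame indices that fall at roughly 25%, 50%, 75% and 100% of the clip.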
       int length = g.getLengthInFrames();
       int fifty = length / 2;
       int twentyfive = fifty / 2;
       int seventyfive = fifty + twentyfive;
       int hundred = length - 1;

       //each frame of video
       for (int j = 0; j < length; j++) {

           if (j == twentyfive || j == fifty || j == seventyfive || j == hundred) {
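               // This frame is one of the four thumbnail positions: label it by percentage, then decode and save it as a PNG.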
               String ss = "";
               if (j == twentyfive) {
                   ss = "25";
               } else if (j == fifty) {
                   ss = "50";
               } else if (j == seventyfive) {
                   ss = "75";
               } else if (j == hundred) {
                   ss = "100";
               }

               g.setFrameNumber(j);

               Frame f = g.grabImage();

               opencv_core.IplImage image = converterToIplImage.convert(f);
               String img_path = "thumbnails/" + videoname + "." + ss + ".png";
               cvSaveImage(img_path, image);
           }

       }
       g.stop();
    }
    }

    That works fine in this environment: Ubuntu 15.10 x64, Java 1.7.0_101 and NetBeans 8.0.2 with Maven. So I exported the project to a runnable jar file (with all dependencies included) and tried to start it on Windows 10 x64 via:

    java -jar VideoReporter-1.0-SNAPSHOT-jar-with-dependencies.jar

    On Windows, an exception is thrown when executing the .jar file:

    Error putting member offsets for class org/bytedeco/javacpp/avutil$Pool_free_Pointer.
    Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.bytedeco.javacpp.avutil
       at java.lang.Class.forName0(Native Method)
       at java.lang.Class.forName(Unknown Source)
       at org.bytedeco.javacpp.Loader.load(Loader.java:472)
       at org.bytedeco.javacpp.Loader.load(Loader.java:417)
       at org.bytedeco.javacpp.avformat$AVFormatContext.<clinit>(avformat.java:2719)
       at org.bytedeco.javacv.FFmpegFrameGrabber.startUnsafe(FFmpegFrameGrabber.java:391)
       at org.bytedeco.javacv.FFmpegFrameGrabber.start(FFmpegFrameGrabber.java:385)
       at de.stal.videoreporter.VideoThumbnailer.createThumbnails(VideoThumbnailer.java:15)
       at de.stal.videoreporter.MetaReader.slurpMetadata(MetaReader.java:67)
       at de.stal.videoreporter.VideoReporter.main(VideoReporter.java:19)

    My pom.xml is:

    <?xml version="1.0" encoding="UTF-8"?>
    <project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>de.stal</groupId>
    <artifactId>VideoReporter</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <properties>
       <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
       <maven.compiler.source>1.7</maven.compiler.source>
       <maven.compiler.target>1.7</maven.compiler.target>
    </properties>
    <build>
       <plugins>
           <plugin>
               <artifactId>maven-assembly-plugin</artifactId>
               <configuration>
                   <archive>
                       <manifest>
                           <mainClass>de.stal.videoreporter.VideoReporter</mainClass>
                       </manifest>
                   </archive>
                   <descriptorRefs>
                       <descriptorRef>jar-with-dependencies</descriptorRef>
                   </descriptorRefs>
               </configuration>
               <executions>
                   <execution>
                       <id>make-assembly</id>
                       <phase>package</phase>
                       <goals>
                           <goal>single</goal>
                       </goals>
                   </execution>
               </executions>
           </plugin>
       </plugins>
    </build>
    <dependencies>
       <dependency>
           <groupId>org.apache.commons</groupId>
           <artifactId>commons-csv</artifactId>
           <version>1.1</version>
       </dependency>
       <dependency>
           <groupId>org.bytedeco</groupId>
           <artifactId>javacpp</artifactId>
           <version>1.2.1</version>
       </dependency>
       <dependency>
           <groupId>org.bytedeco</groupId>
           <artifactId>javacv</artifactId>
           <version>1.2</version>
       </dependency>
       <dependency>
           <groupId>com.fasterxml.jackson.dataformat</groupId>
           <artifactId>jackson-dataformat-csv</artifactId>
           <version>2.8.0.rc2</version>
       </dependency>
       <dependency>
           <groupId>javassist</groupId>
           <artifactId>javassist</artifactId>
           <version>3.12.1.GA</version>
       </dependency>
       <dependency>
           <groupId>commons-collections</groupId>
           <artifactId>commons-collections</artifactId>
           <version>3.2.1</version>
       </dependency>
       <dependency>
           <groupId>com.opencsv</groupId>
           <artifactId>opencsv</artifactId>
           <version>3.3</version>
       </dependency>
    </dependencies>
    </project>

    What might be the problem on Windows? In my opinion there shouldn’t be a problem accessing the OpenCV/FFMPEG classes, because they have all been included in the .jar file. Is this a problem with the classpath?

    Thanks, Peter

  • Writing A Dreamcast Media Player

    6 January 2017, by Multimedia Mike — Sega Dreamcast

    I know I’m not the only person to have the idea to port a media player to the Sega Dreamcast video game console. But I did make significant progress on an implementation. I’m a little surprised to realize that I haven’t written anything about it on this blog yet, given my propensity for publishing my programming misadventures.


    3 Dreamcast consoles in a row

    This old effort had been on my mind lately due to its architectural similarities to something else I was recently brainstorming.

    Early Days
    Porting a multimedia player was one of the earliest endeavors that I embarked upon in the multimedia domain. It’s a bit fuzzy for me now, but I’m pretty sure that my first exposure to the MPlayer project in 2001 arose from looking for a multimedia player to port. I fed it through the Dreamcast development toolchain but encountered roadblocks pretty quickly. However, this got me looking at the MPlayer source code and made me wonder how I could contribute, which is how I finally broke into practical open source multimedia hacking after studying the concepts and technology for more than a year at that point.

    Eventually, I jumped over to the xine project. After hacking on that for awhile, I remembered my DC media player efforts and endeavored to compile xine to the console. The first attempt was to simply compile the codebase using the Dreamcast hobbyist community’s toolchain. This is when I came to fear the multithreaded snake pit in xine’s core. Again, my memories are hazy on the specifics, but I remember the engine having a bunch of threading hacks with comments along the lines of “this code deadlocks sometimes, so on shutdown, monitor this lock and deliberately break it if it has been more than 3 seconds”.

    Something Workable
    Eventually, I settled on a combination of FFmpeg’s libavcodec library for audio and video decoders, xine’s demuxer library, and xine’s input API, combined with my own engine code to tie it all together along with video and output drivers provided by the KallistiOS hobbyist OS for Dreamcast. Here is a simple diagram of the data movement through this player :


    Architecture diagram for a Sega Dreamcast media player

    Details and Challenges
    This is a rare occasion when I actually got to write the core of a media player engine. I made some mistakes.

    xine’s internal clock ran at 90000 Hz. At least, its internal timestamps were all in reference to a 90 kHz clock. I got this brilliant idea to trigger timer interrupts at 6000 Hz to drive the engine. Whatever the timer facilities on the Dreamcast, I found that 6 kHz was the greatest common divisor with 90 kHz. This means that if I could have found an even higher GCD frequency, I would have used that instead.

    So the idea was that, for a 30 fps video, the engine would know to render a frame on every 200th timer interrupt. I eventually realized that servicing 6000 timer interrupts every second would incur a ridiculous amount of overhead. After that, my engine’s philosophy was to set a timer to fire for the next frame while beginning to process the current frame. I.e., when rendering a frame, set a timer to call back in 1/30th of a second. That worked a lot better.

    As I was still keen on 8-bit paletted image codecs at the time (especially since they were simple and small for bootstrapping this project), I got to use the paletted output images directly thanks to the Dreamcast’s paletted textures. So that was exciting. The engine didn’t need to convert the paletted images to a different colorspace before rendering. However, I seem to recall that the Dreamcast’s PowerVR graphics hardware required that 8-bit textures be twiddled/swizzled. Thus, it was still required to manipulate the 8-bit image before rendering.

    I made good progress on this player concept. However, a huge blocker for me was that I didn’t know how to make a proper user interface for the media player. Obviously, programming the Dreamcast occurred at a very low level (at least with the approach I was using), so there were no UI widgets easily available.

    This was circa 2003. I assumed there must have been some embedded UI widget libraries with amenable open source licenses that I could leverage. I remember searching and checking out a library named libSTK. I think STK stood for “set-top toolkit” and was positioned specifically for doing things like media player UIs on low-spec embedded computing devices. The domain hosting the project is no longer useful but this appears to be a backup of the core code.

    It sounded promising, but the libSTK developers had a different definition of “low-spec embedded” device than I did. I seem to recall that they were targeting something along the lines of a Pentium III clocked at 800 MHz with 128 MB RAM. The Dreamcast, by contrast, has a 200 MHz SH-4 CPU and 16 MB RAM. LibSTK was also authored in C++ and leveraged the Boost library (my first exposure to that code), and this all had the effect of making binaries quite large while I was trying to keep the player in lean C.

    Regrettably, I never made any serious progress on a proper user interface. I think that’s when the player effort ran out of steam.

    The Code
    So, that’s another project that I never got around to finishing or publishing. I was able to find the source code so I decided to toss it up on github, along with 2 old architecture outlines that I was able to dig up. It looks like I was starting small, just porting over a few of the demuxers and decoders that I knew well.

    I’m wondering if it would still be as straightforward to separate out such components now, more than 13 years later?

    The post Writing A Dreamcast Media Player first appeared on Breaking Eggs And Making Omelettes.