Advanced search

Media (0)

Keyword: - Tags -/signalement

No media matching your criteria is available on the site.

Other articles (70)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To enable support for new languages, you need to go to the "Administer" section of the site.
    From there, the navigation menu gives access to a "Language management" section where support for new languages can be activated.
    Each newly added language can still be deactivated as long as no object has been created in that language; once one exists, the language is greyed out in the configuration and (...)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing platforms and other software, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other documents (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

On other sites (7036)

  • Convert H.264 Annex B to MPEG-TS

    4 June 2014, by Lane

    SO...

    I have raw H.264 video data captured via RTSP in a local file, and I am attempting to play the video back in a JavaFX application. In order to do this, I need to use HTTP Live Streaming (HLS).

    I have successfully prototyped a JavaFX architecture that can play a video via HLS from a local server, using a local folder containing an .m3u8 (HLS index) file and a collection of .ts (MPEG-TS) files. The last piece for me is to replace the .ts files with .264 / .h264 files and, in the local server, perform the conversion / wrapping of the H.264 Annex B data into MPEG-TS.

    I am having trouble figuring out what is required to get H.264 Annex B into MPEG-TS. I have found the following information...

    "Annex B is commonly used in live and streaming formats such as
    transport streams..."

    szatmary.org/blog/25

    "Annex B of the document specifies one such format, which wraps NAL
    units in a format resembling a traditional MPEG video elementary
    stream, thus making it suitable for use with containers like MPEG
    PS/TS unable to provide the required framing..."

    wiki.multimedia.cx/?title=H.264

    "Java FX supports a number of different media types. A media type is
    considered to be the combination of a container format and one or more
    encodings. In some cases the container format might simply be an
    elementary stream containing the encoded data."

    docs.oracle.com/javafx/2/api/javafx/scene/media/package-summary.html

    "Use the CODECS attribute of the EXT-X-STREAM-INF tag. When this
    attribute is present, it must include all codecs and profiles required
    to play back the stream..."

    developer.apple.com/library/ios/documentation/networkinginternet/conceptual/streamingmediaguide/FrequentlyAskedQuestions/FrequentlyAskedQuestions.html

    It seems like I am missing something simple about elementary and transport streams. I have used ffmpeg to convert my H.264 file into a TS file and tried to understand the differences. I have an idea of the approximate format differences, but I am still lacking the details to do it myself. Does anyone have a link showcasing this, or know something simple about how to serve H.264 Annex B data over MPEG-TS?

    I am not looking to use a tool; I need a custom file format locally where I parse out the H.264 Annex B data and perform the format change in memory, on the fly. I know of a way to use ffmpeg with pipes to accomplish this, but I do not want any dependencies, and performance is important.
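    The step described above (taking the Annex B NAL units and wrapping them into MPEG-TS) is essentially a remux with no re-encoding. As a rough sketch of the moving parts it could look like the following on a recent FFmpeg; it leans on libavformat, which may conflict with the no-dependency goal, and the file names, the 25 fps clock and the no-B-frames assumption are all illustrative, not taken from the question.

    /* remux_annexb_to_ts.c: a rough sketch, not production code.
       Reads a raw H.264 Annex B elementary stream and writes the same
       packets into an MPEG-TS container via libavformat.
       Assumptions (all illustrative): recent FFmpeg, a single video stream,
       no B-frames (pts == dts), 25 fps when the input has no timestamps. */
    #include <libavformat/avformat.h>
    #include <libavutil/rational.h>

    int main(void)
    {
        const char *in_name  = "input.h264";    /* raw Annex B capture */
        const char *out_name = "segment.ts";    /* MPEG-TS output */
        AVFormatContext *in = NULL, *out = NULL;
        AVRational fps = {25, 1};               /* assumed frame rate */
        int64_t frame_index = 0;

        if (avformat_open_input(&in, in_name, NULL, NULL) < 0)
            return 1;
        avformat_find_stream_info(in, NULL);

        avformat_alloc_output_context2(&out, NULL, "mpegts", out_name);
        AVStream *ist = in->streams[0];
        AVStream *ost = avformat_new_stream(out, NULL);
        avcodec_parameters_copy(ost->codecpar, ist->codecpar);
        ost->codecpar->codec_tag = 0;

        avio_open(&out->pb, out_name, AVIO_FLAG_WRITE);
        avformat_write_header(out, NULL);       /* mpegts muxer uses a 90 kHz clock */

        AVPacket *pkt = av_packet_alloc();
        while (av_read_frame(in, pkt) >= 0) {
            if (pkt->pts == AV_NOPTS_VALUE) {
                /* Raw Annex B carries no timestamps: stamp one frame per tick of
                   the assumed frame rate (a real server would use the RTSP
                   capture clock instead). */
                pkt->pts = pkt->dts = av_rescale_q(frame_index++, av_inv_q(fps), ost->time_base);
                pkt->duration = av_rescale_q(1, av_inv_q(fps), ost->time_base);
            } else {
                pkt->pts      = av_rescale_q(pkt->pts,      ist->time_base, ost->time_base);
                pkt->dts      = av_rescale_q(pkt->dts,      ist->time_base, ost->time_base);
                pkt->duration = av_rescale_q(pkt->duration, ist->time_base, ost->time_base);
            }
            pkt->stream_index = 0;
            av_interleaved_write_frame(out, pkt);
            av_packet_unref(pkt);
        }

        av_write_trailer(out);
        av_packet_free(&pkt);
        avformat_close_input(&in);
        avio_closep(&out->pb);
        avformat_free_context(out);
        return 0;
    }

    A dependency-free server would re-create the same output by hand: wrap each access unit in a PES packet, slice the PES data into 188-byte TS packets (4-byte header, up to 184 bytes of payload), and emit PAT/PMT tables so the player knows the stream is H.264.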

  • FFmpeg command help and pointers for documentation

    27 November 2011, by mahi

    I was working with FFmpeg for one of my Android projects. So far, I have been able to successfully compile FFmpeg for ARM. Now my approach is to write a JNI interface for playing videos using FFmpeg.

    I tried executing the command ./ffmpeg --help to see the options available in FFmpeg, and so far I have only been able to work out that the input filename can be provided with the -i fileName option.

    I have been searching for online tutorials / blogs on FFmpeg commands and how to play a video / RTMP stream, but I couldn't find a suitable link.

    I'd like to know the following:

    1. What is the command to play a video using FFmpeg?
    2. What is the command to play a local file using FFmpeg?
    3. What is the command to play an RTMP stream using FFmpeg?
    4. Java / C sample code to play video using FFmpeg
    5. Is it possible to extract some piece of code from ffplay.c and write custom code around it?

    Any help with the above and / or any pointers to relevant links is highly appreciated.

    Thanks.
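    On questions 4 and 5: rather than lifting pieces out of ffplay.c wholesale, the usual starting point is the short demux-and-decode loop that players such as ffplay are built around. A minimal sketch against the libavformat / libavcodec API follows; it assumes a recent FFmpeg (the 3.1+ send/receive decoding API), and the function name decode_all and the JNI hand-off comment are illustrative, not part of FFmpeg.

    /* decode_all.c: a minimal sketch of the demux/decode loop, not a player.
       Rendering of the decoded frames is left to the caller, e.g. via JNI. */
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    int decode_all(const char *url)   /* "video.mp4", "rtmp://host/app/stream", ... */
    {
        AVFormatContext *fmt = NULL;
        if (avformat_open_input(&fmt, url, NULL, NULL) < 0)
            return -1;
        if (avformat_find_stream_info(fmt, NULL) < 0)
            return -1;

        int vstream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        if (vstream < 0)
            return -1;

        const AVCodec *dec = avcodec_find_decoder(fmt->streams[vstream]->codecpar->codec_id);
        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        avcodec_parameters_to_context(ctx, fmt->streams[vstream]->codecpar);
        if (avcodec_open2(ctx, dec, NULL) < 0)
            return -1;

        AVPacket *pkt = av_packet_alloc();
        AVFrame *frame = av_frame_alloc();
        while (av_read_frame(fmt, pkt) >= 0) {
            if (pkt->stream_index == vstream) {
                avcodec_send_packet(ctx, pkt);
                while (avcodec_receive_frame(ctx, frame) == 0) {
                    /* frame->data / frame->linesize now hold a decoded picture;
                       convert it (e.g. with sws_scale) and push it to the UI here. */
                }
            }
            av_packet_unref(pkt);
        }

        av_frame_free(&frame);
        av_packet_free(&pkt);
        avcodec_free_context(&ctx);
        avformat_close_input(&fmt);
        return 0;
    }

    The same call takes either a local path or an rtmp:// URL, provided the FFmpeg build includes RTMP protocol support; the decoded frames still have to be converted and handed to the Java side from the JNI layer.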

  • Convert wmv to mp4 with ffmpeg failing

    10 January 2012, by Morph

    I've seen quite a few posts on this, but I can't piece together whether I am doing things right, wrong, or need to download more stuff. The conversion from wmv to mp4 runs without complaints, but then when I go to play the result in the browser (HTML5) the player just turns grey and the controls blank out.

    To install ffmpeg I do

    ./configure --disable-yasm ; make ; make install

    Unless I include --disable-yasm it won't go any further. Then I do

    ffmpeg -i myvideo.wmv myvideo.mp4

    All good so far. In my HTML source I have:

     <video width="320" height="240" controls="controls">
       <source src="myvideo.mp4" type='video/mp4; codecs="avc1.42E01E, mp4a.40.2"'>
       Your browser does not support the video tag.
     </video>

    I am playing this in Chrome 15 and ffmpeg -v is

    ffmpeg version 0.8.6, Copyright (c) 2000-2011 the FFmpeg developers
    built on Dec  1 2011 15:42:06 with gcc 4.1.2 20080704 (Red Hat 4.1.2-51)
    configuration: --disable-yasm
    libavutil    51.  9. 1 / 51.  9. 1
    libavcodec   53.  7. 0 / 53.  7. 0
    libavformat  53.  4. 0 / 53.  4. 0
    libavdevice  53.  1. 1 / 53.  1. 1
    libavfilter   2. 23. 0 /  2. 23. 0
    libswscale    2.  0. 0 /  2.  0. 0

    So I get the HTML5 player and click on it to play the movie, but then the control bar greys out; the play button remains but can no longer be clicked, and nothing plays.

    Is there something wrong with what I have done above? Do I need to download some separate mp4 codec and compile it in? I see people referring to H.264, but I thought ffmpeg already had that included...
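    One thing worth checking, offered as a guess rather than a definite answer: a build configured with nothing but --disable-yasm has no libx264, and without it the mp4 muxer falls back to the native MPEG-4 Part 2 encoder, which Chrome will not play in an H.264 <video> setup. ffmpeg decodes H.264 out of the box, but encoding it needs an external encoder such as libx264 (--enable-gpl --enable-libx264 at configure time). A small check of what the installed build can actually encode (recent FFmpeg assumed; older releases also need avcodec_register_all() first):

    /* check_h264.c: print whether this FFmpeg build has an H.264 encoder.
       Sketch only; compile with: gcc check_h264.c -lavcodec -lavutil */
    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    int main(void)
    {
        /* The decoder is built in; an encoder is only present if an external
           library such as libx264 was enabled at configure time. */
        const AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_H264);
        const AVCodec *enc = avcodec_find_encoder(AV_CODEC_ID_H264);
        printf("H.264 decoder: %s\n", dec ? dec->name : "missing");
        printf("H.264 encoder: %s\n", enc ? enc->name : "missing (rebuild with --enable-libx264)");
        return 0;
    }

    If the encoder turns out to be missing, rebuilding with libx264 and re-running the conversion (or forcing the codec with -vcodec libx264) should give an H.264 MP4 that Chrome can play.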