Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (56)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, announced here.
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make other modifications (...)

  • Making files available

    14 April 2011, by

    By default, on initialization, MediaSPIP does not allow visitors to download files, whether originals or the results of their transformation or encoding. It only allows them to be viewed.
    However, it is possible, and easy, to give visitors access to these documents in various forms.
    All of this happens on the skeleton's configuration page. You need to go to the channel's administration area and choose, in the navigation (...)

On other sites (7118)

  • ffmpeg "End mismatch 1" warning, jpeg2000 to avi

    11 April 2023, by jklebes

    Trying to convert a directory of jpeg2000 grayscale images to a video with ffmpeg, I get warnings

    [0;36m[jpeg2000 @ 0x55d8fa1b68c0] [0m[0;33mEnd mismatch 1

    (and lots of

    Last message repeated <n> times

    )

    The command was

    ffmpeg -y -r 10 -start_number 1 -i <path>/surface_30///img_000%01d.jp2 -vcodec msmpeg4 -vf scale=1920:-1 -q:v 8 <path>//surface_30///surface_30.avi

    The output is

    ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg developers
      built with gcc 7.3.0 (crosstool-NG 1.23.0.449-a04d0)
      configuration: --prefix=/home/jklebes001/miniconda3 --cc=/tmp/build/80754af9/ffmpeg_1587154242452/_build_env/bin/x86_64-conda_cos6-linux-gnu-cc --disable-doc --enable-avresample --enable-gmp --enable-hardcoded-tables --enable-libfreetype --enable-libvpx --enable-pthreads --enable-libopus --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-static --enable-version3 --enable-zlib --enable-libmp3lame --disable-nonfree --enable-gpl --enable-gnutls --disable-openssl --enable-libopenh264 --enable-libx264
      libavutil      56. 31.100 / 56. 31.100
      libavcodec     58. 54.100 / 58. 54.100
      libavformat    58. 29.100 / 58. 29.100
      libavdevice    58.  8.100 / 58.  8.100
      libavfilter     7. 57.100 /  7. 57.100
      libavresample   4.  0.  0 /  4.  0.  0
      libswscale      5.  5.100 /  5.  5.100
      libswresample   3.  5.100 /  3.  5.100
      libpostproc    55.  5.100 / 55.  5.100
    [0;36m[jpeg2000 @ 0x55cb44144480] [0m[0;33mEnd mismatch 1
    [0m    Last message repeated 1 times
        Last message repeated 2 times
        Last message repeated 3 times

    ...

        Last message repeated 73 times
    Input #0, image2, from '<path>//surface_30///img_000%01d.jp2':
      Duration: 00:00:00.20, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: jpeg2000, gray, 6737x4869, 25 tbr, 25 tbn, 25 tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (jpeg2000 (native) -> msmpeg4v3 (msmpeg4))
    Press [q] to stop, [?] for help
    [0;36m[jpeg2000 @ 0x55cb4418e200] [0m[0;33mEnd mismatch 1
    [0m[0;36m[jpeg2000 @ 0x55cb441900c0] [0m[0;33mEnd mismatch 1

    ...

    (about 600 lines of "end mismatch" and "last message repeated" cut)

    ...

    [0m[0;36m[jpeg2000 @ 0x55cb4418e8c0] [0m[0;33mEnd mismatch 1
    [0mOutput #0, avi, to '<path>/surface_30///surface_30.avi':
      Metadata:
        ISFT            : Lavf58.29.100
        Stream #0:0: Video: msmpeg4v3 (msmpeg4) (MP43 / 0x3334504D), yuv420p, 1920x1388, q=2-31, 200 kb/s, 10 fps, 10 tbn, 10 tbc
        Metadata:
          encoder         : Lavc58.54.100 msmpeg4
        Side data:
          cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
    frame=    2 fps=0.8 q=8.0 size=       6kB time=00:00:00.20 bitrate= 227.1kbits/s speed=0.0844x
    frame=    5 fps=1.7 q=8.0 size=       6kB time=00:00:00.50 bitrate=  90.8kbits/s speed=0.172x
    frame=    5 fps=1.7 q=8.0 Lsize=     213kB time=00:00:00.50 bitrate=3494.7kbits/s speed=0.172x
    video:208kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.732246%

    What is the meaning of characters like [0;33m here?
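    (These appear to be ANSI terminal colour escapes: ffmpeg colours its log output, and the leading ESC byte (\x1b) is usually lost when a log is copied out of a terminal, leaving fragments such as [0;33m for yellow and [0m for reset. Setting the environment variable AV_LOG_FORCE_NOCOLOR disables the colouring at the source. A minimal TypeScript sketch for stripping such codes from captured output:)

    // Strip ANSI colour escapes of the form ESC [ <params> m from a string.
    const stripAnsi = (s: string): string => s.replace(/\x1b\[[0-9;]*m/g, "");

    console.log(stripAnsi("\x1b[0;33mEnd mismatch 1\x1b[0m")); // -> "End mismatch 1"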

    I thought it might have something to do with bit depth and color format. Setting -pix_fmt gray had no effect, and indeed the format of the jp2 images is already detected as 8-bit gray.


    The output .avi exists and seems fine.


    The command line was previously used on jpeg files and works fine on jpeg. With jpeg, the output has the lines

    Input #0, image2, from '<path>/surface_30///img_000%01d.jpeg':
      Duration: 00:00:00.16, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: mjpeg (Baseline), gray(bt470bg/unknown/unknown), 6737x4869 [SAR 1:1 DAR 6737:4869], 25 tbr, 25 tbn, 25 tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (mjpeg (native) -> msmpeg4v3 (msmpeg4))
    Press [q] to stop, [?] for help
    Output #0, avi, to '<path>/surface_30///surface_30.avi':
      Metadata:
        ISFT            : Lavf58.29.100
        Stream #0:0: Video: msmpeg4v3 (msmpeg4) (MP43 / 0x3334504D), yuv420p, 6737x4869 [SAR 1:1 DAR 6737:4869], q=2-31, 200 kb/s, 10 fps, 10 tbn, 10 tbc
        Metadata:
          encoder         : Lavc58.54.100 msmpeg4
        Side data:
          cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
    frame=    2 fps=0.0 q=8.0 size=    6662kB time=00:00:00.20 bitrate=272859.9kbits/s speed=0.334x
    frame=    3 fps=2.2 q=10.0 size=   10502kB time=00:00:00.30 bitrate=286764.2kbits/s speed=0.22x
    frame=    4 fps=1.9 q=12.3 size=   13574kB time=00:00:00.40 bitrate=277987.7kbits/s speed=0.19x
    frame=    4 fps=1.4 q=12.3 size=   13574kB time=00:00:00.40 bitrate=277987.7kbits/s speed=0.145x
    frame=    4 fps=1.4 q=12.3 Lsize=   13657kB time=00:00:00.40 bitrate=279702.3kbits/s speed=0.145x
    video:13652kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.041926%

    detecting mjpeg format and a similar, but more detailed, pixel format gray(bt470bg/unknown/unknown), 6737x4869 [SAR 1:1 DAR 6737:4869].


    What is the difference when switching the input to jp2?


  • bitmap to yuv, video recorded has only green pixels

    19 January 2016, by UserAx

    I am trying to convert a bitmap to yuv and record this yuv with the ffmpeg frame recorder...
    I am getting a video output with only green pixels, though when I check the properties of this video it shows the set frame rate and resolution...

    The yuv encoding part is correct, but I feel I am making a mistake somewhere else, most likely in passing the yuv bytes to the recording part (getByte(byte[] yuv)), because only there the yuv.length displayed in the console is 0; all the other methods print a large value in the console...

    Kindly help...

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);
       directory.mkdirs();

       addListenerOnButton();

       play=(Button)findViewById(R.id.buttonplay);
       stop=(Button)findViewById(R.id.buttonstop);
       record=(Button)findViewById(R.id.buttonstart);

       stop.setEnabled(false);
       play.setEnabled(false);


       record.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               startRecording();
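               // NOTE: a brand-new empty array is passed here, so yuv.length printed in getByte() is 0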
               getByte(new byte[]{});
           }
       });

       stop.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               stopRecording();
           }
       });


       play.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) throws IllegalArgumentException, SecurityException, IllegalStateException {
               Intent intent = new Intent(Intent.ACTION_VIEW, Uri.parse(String.valueOf(asmileys)));
               intent.setDataAndType(Uri.parse(String.valueOf(asmileys)), "video/mp4");
               startActivity(intent);
               Toast.makeText(getApplicationContext(), "Playing Video", Toast.LENGTH_LONG).show();
           }
       });

    }

    ......//......



    public void getByte(byte[] yuv) {
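       // NOTE: the frame built by getNV21() is discarded here; what is written to the
       // recorder below is this method's 'yuv' parameter, which at the call site above
       // is an empty array.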
       getNV21(640, 480, bitmap);
       System.out.println(yuv.length + " ");
       if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
           startTime = System.currentTimeMillis();
           return;
       }
       if (RECORD_LENGTH > 0) {
           int i = imagesIndex++ % images.length;
           yuvimage = images[i];
           timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
       }
           /* get video data */
       if (yuvimage != null && recording) {
               ((ByteBuffer) yuvimage.image[0].position(0)).put(yuv);

               if (RECORD_LENGTH <= 0) {
                   try {
                       long t = 1000 * (System.currentTimeMillis() - startTime);
                       if (t > recorder.getTimestamp()) {
                           recorder.setTimestamp(t);
                       }
                       recorder.record(yuvimage);
                   } catch (FFmpegFrameRecorder.Exception e) {

                       e.printStackTrace();
                   }
               }
           }
    }

    public byte [] getNV21(int inputWidth, int inputHeight, Bitmap bitmap) {

       int[] argb = new int[inputWidth * inputHeight];

       bitmap.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);

       byte[] yuv = new byte[inputWidth * inputHeight * 3 / 2];
       encodeYUV420SP(yuv, argb, inputWidth, inputHeight);

       bitmap.recycle();
       System.out.println(yuv.length + " ");
       return yuv;

    }

    void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
       final int frameSize = width * height;

       int yIndex = 0;
       int uIndex = frameSize;
       int vIndex = frameSize;
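       // NOTE: uIndex and vIndex start at the same offset and advance together, so in
       // the loop below V overwrites U at the same byte and only half of the interleaved
       // VU plane is ever written.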
       System.out.println(yuv420sp.length + " " + frameSize);

       int a, R, G, B, Y, U, V;
       int index = 0;
       for (int j = 0; j < height; j++) {
           for (int i = 0; i < width; i++) {

               a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
               R = (argb[index] & 0xff0000) >> 16;
               G = (argb[index] & 0xff00) >> 8;
               B = (argb[index] & 0xff) >> 0;

               // well known RGB to YUV algorithm

               Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
               U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
               V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

               // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2
               //    meaning for every 4 Y pixels there are 1 V and 1 U.  Note the sampling is every other
               //    pixel AND every other scanline.
               yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
               if (j % 2 == 0 && index % 2 == 0) {
                   yuv420sp[uIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
                   yuv420sp[vIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
               }

               index++;
           }
       }
    }

    .....//.....

    public void addListenerOnButton() {
    image = (ImageView) findViewById(R.id.imageView);
    image.setDrawingCacheEnabled(true);
    image.buildDrawingCache();
    bitmap = image.getDrawingCache();
    System.out.println(bitmap.getByteCount() + " " );

    button = (Button) findViewById(R.id.btn1);
    button.setOnClickListener(new OnClickListener() {
    @Override
    public void onClick(View view){
       image.setImageResource(R.drawable.image1);
     }
    });

    ......//......

    EDIT 1:

    I made a few changes in the above code:

    record.setOnClickListener(new View.OnClickListener() {
           @Override
           public void onClick(View v) {
               startRecording();
               getByte();
           }
       });
    .....//....

    public void getByte() {
       byte[] yuv = getNV21(640, 480, bitmap);

    So now in the console I get the same yuv length in this method as the yuv length from the getNV21 method...

    But now I am getting half of the screen black and half green (black above, green below) in the recorded video...

    If I add these lines to the onCreate method:

    image = (ImageView) findViewById(R.id.imageView);
    image.setDrawingCacheEnabled(true);
    image.buildDrawingCache();
    bitmap = image.getDrawingCache();

    I do get distorted frames (frames are 1/4th of the image, with colors mixed up here and there) in the video...

    All I am trying to learn is the image processing and the flow of byte[] from one method to another, but I am still a noob...

    Kindly help..!

  • nodejs ffmpeg play video at specific time and stream it to client

    12 March 2020, by bluejayke

    I’m trying to make a basic online video editor with nodeJS and ffmpeg.

    To do this I need 2 steps:

    1. Set the in-and-out times of the videos from the client, which requires the client to view the video at specific times and switch the position of the video. Meaning, if a single video is used as an input and split into smaller parts, playback needs to resume from the starting time of the next edited segment, if that makes sense.

    2. Send the input-output data to nodeJS and export it with ffmpeg as a finished video.

    At first I wanted to do step 1 purely on the client, then upload the source video(s) to nodeJS, generate the same result with ffmpeg, and send back the result.

    But there are many problems with video processing on the client side in HTML at the moment, so now I have a change of plans: do all of the processing on the nodeJS server, including the video playing.

    This is the part I am stuck at now. I'm aware that ffmpeg can be used in many different ways from nodeJS, but I have not found a way to play an .mp4/webm video in realtime with ffmpeg, at a specific timestamp, and send the streaming video (again, at a certain timestamp) to the client.

    I've seen the pipe:1 attribute from ffmpeg, but I couldn't find any tutorials on getting it working with an mp4/webm video, or on parsing the stdout data somehow with nodejs and sending it to the client. And even if I could get that part to work, I still have no idea how to play the video, in realtime, at a certain timestamp.

    I've also seen ffplay, but that's only for testing as far as I know; I haven't seen any way of getting the video data from it in realtime with nodejs.

    So:

    How can I play a video in nodeJS, at a specific time (preferably with ffmpeg), and send it back to the client in realtime?
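    (A minimal sketch of the pipe:1 pattern mentioned above, assuming ffmpeg is on PATH; the input file name and port are hypothetical. Seek with -ss before -i, mux to a stream-friendly container on stdout, and pipe that stream to the HTTP response:)

    // stream.ts: start ffmpeg at a requested timestamp and pipe its output to the client
    import { spawn } from "child_process";
    import { createServer } from "http";

    createServer((req, res) => {
      // e.g. GET /video?t=12.5 starts playback 12.5 seconds into the file
      const t = new URL(req.url ?? "/", "http://localhost").searchParams.get("t") ?? "0";
      res.writeHead(200, { "Content-Type": "video/webm" });

      const ff = spawn("ffmpeg", [
        "-ss", t,          // seeking before -i is fast (input seeking)
        "-i", "input.mp4", // hypothetical source file
        "-f", "webm",      // a pipe is not seekable, so pick a streamable container
        "pipe:1",          // write the muxed output to stdout
      ]);

      ff.stdout.pipe(res);                            // send encoded data as it is produced
      req.on("close", () => { ff.kill("SIGKILL"); }); // stop encoding if the client disconnects
    }).listen(8080);

    (Fragmented MP4 via -movflags frag_keyframe+empty_moov works the same way if mp4 output is required; plain mp4 cannot be written to a pipe because its index is written at the end with a backward seek.)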

    What I have already seen:

    Best approach to real time http streaming to HTML5 video client

    Live streaming using FFMPEG to web audio api

    Ffmpeg - How to force MJPEG output of whole frames?

    ffmpeg : Render webm from stdin using NodeJS

    No data written to stdin or stderr from ffmpeg

    node.js live streaming ffmpeg stdout to res

    Realtime video conversion using nodejs and ffmpeg

    Pipe output of ffmpeg using nodejs stdout

    can’t re-stream using FFMPEG to MP4 HTML5 video

    FFmpeg live streaming webm video to multiple http clients over Nodejs

    http://www.mobiuso.com/blog/2018/04/18/video-processing-with-node-ffmpeg-and-gearman/

    stream mp4 video with node fluent-ffmpeg

    How to get specific start & end time in ffmpeg by Node JS?

    Live streaming : node-media-server + Dash.js configured for real-time low latency

    Low Latency (50ms) Video Streaming with NODE.JS and html5

    Server node.js for livestreaming

    HLS Streaming using node JS

    Stream part of the video to the client

    Video streaming with HTML 5 via node.js

    Streaming a video file to an html5 video player with Node.js so that the video controls continue to work?

    How to (pseudo) stream H.264 video - in a cross browser and html5 way?

    Pseudo Streaming an MP4 file

    How to stream video data to a video element?

    How do I convert an h.264 stream to MP4 using ffmpeg and pipe the result to the client?

    https://medium.com/@brianshaler/on-the-fly-video-rendering-with-node-js-and-ffmpeg-165590314f2


    Can Node.js edit video files?