
Other articles (56)

  • Customize by adding your logo, banner, or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Making files available

    14 April 2011

    By default, on initialization, MediaSPIP does not let visitors download files, whether originals or the results of transformation or encoding; it only lets them view the files.
    However, it is easy to give visitors access to these documents in various forms.
    All of this is handled on the skeleton configuration page: go to the channel's administration area and choose in the navigation (...)

On other sites (4825)

  • Android ffmpeg runtime error

    6 September 2013, by Syntaxicated

    I'm new to Stack Overflow and, although I've tried to find an answer to my question, I can't see anything that fits my circumstances. Apologies if it is a duplicate.

    I'm trying my hand at some Android development and have hit a problem with ffmpeg.

    I have built a basic app which should take a file on the sdcard, rotate it and save it as a new file to the same folder.

    I'm using ffmpeg to do the transposing and cropping but so far I cannot get ffmpeg to execute.
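    For reference, the plain-CLI equivalent of the intended rotation would be something like the sketch below; transpose=1 is ffmpeg's 90°-clockwise mode, and the paths simply mirror the ones used later in this question:

    ffmpeg -i /sdcard/Video/test.mp4 -vf "transpose=1" /sdcard/Video/out.mp4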

    I've used the 'guardianproject' android-ffmpeg library from GitHub (here) to execute the ffmpeg commands, but it doesn't actually do anything.

    Here's my code:

    package com.tw.videwell;

    import java.io.File;

    import org.ffmpeg.android.FfmpegController;
    import org.ffmpeg.android.MediaDesc;
    import org.ffmpeg.android.ShellUtils.ShellCallback;
    import org.ffmpeg.android.filters.TransposeVideoFilter;

    import android.annotation.SuppressLint;
    import android.app.Activity;
    import android.os.Bundle;
    import android.view.Menu;
    import android.view.View;
    import android.widget.Button;

    public class MainActivity extends Activity {

        Button b;
        FfmpegController ff;
        MediaDesc in;
        MediaDesc out;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_main);
            run();
        }

        @Override
        public boolean onCreateOptionsMenu(Menu menu) {
            // Inflate the menu; this adds items to the action bar if it is present.
            getMenuInflater().inflate(R.menu.main, menu);
            return true;
        }

        private void run() {
            b = (Button) findViewById(R.id.button1);

            // Run the transcode when the button is pressed.
            b.setOnClickListener(new View.OnClickListener() {
                public void onClick(View v) {
                    process();
                }
            });
        }

        @SuppressLint("SdCardPath")
        private void process() {
            try {
                in = new MediaDesc();
                out = new MediaDesc();
                in.path = "/sdcard/Video/test.mp4";
                TransposeVideoFilter tvf = new TransposeVideoFilter(TransposeVideoFilter.NINETY_CLOCKWISE);
                out.path = "/sdcard/Video/out.mp4";
                out.videoFilter = tvf.toString();
                File fileTemp = null;
                // The constructor calls checkBinary(), which is where the
                // exception in the log below is thrown.
                ff = new FfmpegController(getApplicationContext(), fileTemp);
                //ff.processVideo(in, out, true, scb);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        private ShellCallback scb = new ShellCallback() {

            @Override
            public void shellOut(String shellLine) {
                System.out.println("Out");
            }

            @Override
            public void processComplete(int exitValue) {
                System.out.println("Complete");
            }
        };
    }

    When I run the app on my device, the log reports the following:

    W/System.err(32114): android.content.res.Resources$NotFoundException: File res/raw/ffmpeg from drawable resource ID #0x7f040000
    W/System.err(32114):    at android.content.res.Resources.openRawResourceFd(Resources.java:982)
    W/System.err(32114):    at org.ffmpeg.android.FfmpegController.checkBinary(FfmpegController.java:49)
    W/System.err(32114):    at org.ffmpeg.android.FfmpegController.<init>(FfmpegController.java:41)
    W/System.err(32114):    at com.tw.videwell.MainActivity.process(MainActivity.java:61)
    W/System.err(32114):    at com.tw.videwell.MainActivity.access$0(MainActivity.java:52)
    W/System.err(32114):    at com.tw.videwell.MainActivity$2.onClick(MainActivity.java:44)
    W/System.err(32114):    at android.view.View.performClick(View.java:4211)
    W/System.err(32114):    at android.view.View$PerformClick.run(View.java:17362)
    W/System.err(32114):    at android.os.Handler.handleCallback(Handler.java:725)
    W/System.err(32114):    at android.os.Handler.dispatchMessage(Handler.java:92)
    W/System.err(32114):    at android.os.Looper.loop(Looper.java:137)
    W/System.err(32114):    at android.app.ActivityThread.main(ActivityThread.java:5227)
    W/System.err(32114):    at java.lang.reflect.Method.invokeNative(Native Method)
    W/System.err(32114):    at java.lang.reflect.Method.invoke(Method.java:511)
    W/System.err(32114):    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:795)
    W/System.err(32114):    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:562)
    W/System.err(32114):    at dalvik.system.NativeStart.main(Native Method)
    W/System.err(32114): Caused by: java.io.FileNotFoundException: This file can not be opened as a file descriptor; it is probably compressed
    W/System.err(32114):    at android.content.res.AssetManager.openNonAssetFdNative(Native Method)
    W/System.err(32114):    at android.content.res.AssetManager.openNonAssetFd(AssetManager.java:450)
    W/System.err(32114):    at android.content.res.Resources.openRawResourceFd(Resources.java:979)
    W/System.err(32114):    ... 16 more

    From what I can deduce, the ffmpeg executable can't be opened. It is stored in the /res/raw folder, as I believe it should be, and appears in the app folder on the device when I run it, but the library still can't open it.

    I'm developing on a Mac in Eclipse 4.2.1 with the latest Android SDK and am running the app on a Sony Xperia T with Android 4.2.2.

    I compiled the ffmpeg library using the installation method described on GitHub, Android NDK 9, and the Mac Xcode command-line developer tools (make, etc.).

    What am I doing wrong? Do I need to store the FFmpeg binary somewhere else? Is there a massive error in my, albeit basic, code?
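    Given the "Caused by" line above ("This file can not be opened as a file descriptor; it is probably compressed"), one plausible reading is that aapt compresses res/raw entries whose file names lack a recognized extension, so the openRawResourceFd() call inside FfmpegController.checkBinary() fails. Below is a minimal sketch of a common workaround, not something the guardianproject library itself provides: stream the resource out with openRawResource(), which works even on compressed entries, copy it to private storage, and mark it executable.

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    import android.content.Context;

    // Hypothetical helper class, not part of android-ffmpeg.
    public class FfmpegInstaller {

        // Copies res/raw/ffmpeg to the app's private files directory and
        // marks it executable, side-stepping openRawResourceFd() entirely.
        public static File install(Context context) throws IOException {
            File target = new File(context.getFilesDir(), "ffmpeg");
            if (target.exists()) {
                return target;
            }
            // openRawResource() returns a plain InputStream, which is valid
            // even when aapt has compressed the entry inside the APK.
            InputStream in = context.getResources().openRawResource(R.raw.ffmpeg);
            OutputStream out = new FileOutputStream(target);
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) > 0) {
                    out.write(buf, 0, n);
                }
            } finally {
                out.close();
                in.close();
            }
            target.setExecutable(true); // available from API 9 onwards
            return target;
        }
    }

    The resulting path, rather than the raw resource, could then be handed to whatever runs the binary; renaming the resource so it keeps an extension that aapt stores uncompressed (e.g. res/raw/ffmpeg.jpg) is another commonly cited workaround.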

  • Using ffserver to stream older IP cam MJPEG to RTSP

    26 May 2016, by tmar89

    I have an older Sony IP camera that provides an MJPEG stream. I need to connect it to an NVR that only accepts ONVIF or RTSP, so I'm trying to use ffserver and ffmpeg to convert the MJPEG stream to RTSP, but it's not working. Does anyone have an idea of what I may be doing wrong? I saw an error about an unsupported protocol in the attempted playback.
    Here is my ffserver config:

    Port 8090
    RTSPPort 5544
    BindAddress 0.0.0.0
    RTSPBindAddress 0.0.0.0
    MaxClients 100
    MaxBandwidth 10000

    <Feed feed27.ffm>
    File /tmp/feed27.ffm
    FileMaxSize 5M
    ACL allow 127.0.0.1
    </Feed>

    <Stream stream27.mpg>
    Format rtp
    Feed feed27.ffm
    NoAudio
    VideoCodec mjpeg
    VideoFrameRate 30
    VideoSize 736x480
    </Stream>

    And here is the ffmpeg command I am using:

    [tm@tele ffserver-rtsp]# ffmpeg -f mjpeg -r 30 -s 736x480 -i http://[CAMIP]/image http://localhost:8090/feed27.ffm
       FFmpeg version 0.6.5, Copyright (c) 2000-2010 the FFmpeg developers
         built on Jan 29 2012 17:52:15 with gcc 4.4.5 20110214 (Red Hat 4.4.5-6)
         configuration: --prefix=/usr --libdir=/usr/lib64 --shlibdir=/usr/lib64 --mandir=/usr/share/man --incdir=/usr/include --disable-avisynth --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -fPIC' --enable-avfilter --enable-avfilter-lavf --enable-libdc1394 --enable-libdirac --enable-libfaac --enable-libfaad --enable-libfaadbin --enable-libgsm --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libx264 --enable-gpl --enable-nonfree --enable-postproc --enable-pthreads --enable-shared --enable-swscale --enable-vdpau --enable-version3 --enable-x11grab
         libavutil     50.15. 1 / 50.15. 1
         libavcodec    52.72. 2 / 52.72. 2
         libavformat   52.64. 2 / 52.64. 2
         libavdevice   52. 2. 0 / 52. 2. 0
         libavfilter    1.19. 0 /  1.19. 0
         libswscale     0.11. 0 /  0.11. 0
         libpostproc   51. 2. 0 / 51. 2. 0
       [mjpeg @ 0x1ece670]Estimating duration from bitrate, this may be inaccurate
       Input #0, mjpeg, from 'http://[CAMIP]/image':
         Duration: N/A, bitrate: N/A
           Stream #0.0: Video: mjpeg, yuvj422p, 736x480, 30 fps, 30 tbr, 1200k tbn, 30 tbc
       Output #0, ffm, to 'http://localhost:8090/feed27.ffm':
         Metadata:
           encoder         : Lavf52.64.2
           Stream #0.0: Video: mjpeg, yuvj420p, 736x480, q=2-31, 200 kb/s, 1000k tbn, 30 tbc
       Stream mapping:
         Stream #0.0 -> #0.0
       Press [q] to stop encoding
       [mjpeg @ 0x222d110]rc buffer underflow
       frame=  640 fps= 17 q=31.4 size=   12884kB time=21.33 bitrate=4947.5kbits/s

    When I use VLC to open the stream, it fails:

    Your input can't be opened:
       VLC is unable to open the MRL 'rtsp://localhost:5544/stream27.mpg'. Check the log for details.

    Finally, using ffplay on the same machine:

    [tm@tele tmp]# ffplay rtsp://localhost:5544/stream27.sdp
    FFplay version 0.6.5, Copyright (c) 2003-2010 the FFmpeg developers
     built on Jan 29 2012 17:52:15 with gcc 4.4.5 20110214 (Red Hat 4.4.5-6)
     configuration: --prefix=/usr --libdir=/usr/lib64 --shlibdir=/usr/lib64 --mandir=/usr/share/man --incdir=/usr/include --disable-avisynth --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic -fPIC' --enable-avfilter --enable-avfilter-lavf --enable-libdc1394 --enable-libdirac --enable-libfaac --enable-libfaad --enable-libfaadbin --enable-libgsm --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libx264 --enable-gpl --enable-nonfree --enable-postproc --enable-pthreads --enable-shared --enable-swscale --enable-vdpau --enable-version3 --enable-x11grab
     libavutil     50.15. 1 / 50.15. 1
     libavcodec    52.72. 2 / 52.72. 2
     libavformat   52.64. 2 / 52.64. 2
     libavdevice   52. 2. 0 / 52. 2. 0
     libavfilter    1.19. 0 /  1.19. 0
     libswscale     0.11. 0 /  0.11. 0
     libpostproc   51. 2. 0 / 51. 2. 0
    ALSA lib pulse.c:229:(pulse_connect) PulseAudio: Unable to connect: Connection refused

    rtsp://localhost:5544/stream27.sdp: Protocol not supported

    And here was the log from ffserver:

    127.0.0.1:5000 - - "PLAY stream27.mpg/streamid=0 RTP/UDP"
    [rtp @ 0x721dc0]Unsupported codec 8
    127.0.0.1:0 - - "PLAY stream27.mpg/streamid=0 RTP/TCP"
    [rtp @ 0x728cb0]Unsupported codec 8
    127.0.0.1 - - [SETUP] "rtsp://localhost:5544/stream27.mpg/streamid=0 RTSP/1.0" 200 641
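    For what it's worth, codec id 8 in libavcodec of that vintage appears to be CODEC_ID_MJPEG, and the RTP muxer in FFmpeg 0.6.x has no RTP payload handler for MJPEG, which would explain the "Unsupported codec 8" lines. Below is a hedged sketch of a <Stream> block that re-encodes to MPEG-4, which does have a standard RTP payload type; the bitrate is an illustrative guess:

    <Stream stream27.mpg>
    Format rtp
    Feed feed27.ffm
    NoAudio
    VideoCodec mpeg4
    VideoBitRate 2000
    VideoFrameRate 30
    VideoSize 736x480
    </Stream>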
  • Buffer overrun Blackmagic Intensity 4K as input to FFmpeg

    24 May 2016, by colossus47

    I am trying to take direct video output from a 4K Sony Handycam via HDMI into a Blackmagic Intensity Pro 4K. I can verify that the camera, HDMI cable, and Blackmagic card are working, as I can capture and view video using the provided "Media Express" program. When I use ffmpeg I do get video output, but I also get a buffer overrun.

    Here is the command:

    time ffmpeg -f decklink -i "Intensity Pro 4K@20" -c:v nvenc -b:v 100M -vf "yadif=0:-1:0" -pix_fmt yuv420p -crf 29.97 -strict -2 output.mp4

    And I get the following output:

    ffmpeg version N-76538-gb83c849 Copyright (c) 2000-2015 the FFmpeg developers
    built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
    configuration: --enable-nonfree --enable-nvenc --enable-nvresize --extra-cflags=-I../cudautils --extra-ldflags=-L../cudautils --enable-gpl --enable-libx264 --enable-libx265 --enable-decklink --extra-cflags=-I/home/tristan/Downloads/BlackmagicDeckLinkSDK10.6.5/Linux/include --extra-ldflags=-L/home/tristan/Downloads/BlackmagicDeckLinkSDK10.6.5/Linux/include
    libavutil      55.  5.100 / 55.  5.100
    libavcodec     57. 15.100 / 57. 15.100
    libavformat    57. 14.100 / 57. 14.100
    libavdevice    57.  0.100 / 57.  0.100
    libavfilter     6. 15.100 /  6. 15.100
    libswscale      4.  0.100 /  4.  0.100
    libswresample   2.  0.101 /  2.  0.101
    libpostproc    54.  0.100 / 54.  0.100
    [decklink @ 0x1ccd6e0] Found Decklink mode 3840 x 2160 with rate 29.97
    [decklink @ 0x1ccd6e0] Stream #1: not enough frames to estimate rate; consider increasing probesize
    Guessed Channel Layout for  Input Stream #0.0 : stereo
    Input #0, decklink, from 'Intensity Pro 4K@20':
    Duration: N/A, start: 0.000000, bitrate: 1536 kb/s
    Stream #0:0: Audio: pcm_s16le, 48000 Hz, 2 channels, s16, 1536 kb/s
    Stream #0:1: Video: rawvideo (UYVY / 0x59565955), uyvy422, 3840x2160, -5 kb/s, 29.97 tbr, 1000k tbn, 29.97 tbc
    Codec AVOption crf (Select the quality for constant quality mode) specified for output file #0 (output.mp4) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
    File 'output.mp4' already exists. Overwrite ? [y/N] y
    Output #0, mp4, to 'output.mp4':
    Metadata:
    encoder         : Lavf57.14.100
    Stream #0:0: Video: h264 (nvenc) ([33][0][0][0] / 0x0021), yuv420p, 3840x2160, q=-1--1, 100000 kb/s, 29.97 fps, 30k tbn, 29.97 tbc
    Metadata:
    encoder         : Lavc57.15.100 nvenc
    Stream #0:1: Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 128 kb/s
    Metadata:
    encoder         : Lavc57.15.100 aac
    Stream mapping:
    Stream #0:1 -> #0:0 (rawvideo (native) -> h264 (nvenc))
    Stream #0:0 -> #0:1 (pcm_s16le (native) -> aac (native))
    Press [q] to stop, [?] for help
    [decklink @ 0x1ccd6e0] Decklink input buffer overrun!:03.15 bitrate=70411.7kbits/s  
    Last message repeated 1 times
    [decklink @ 0x1ccd6e0] Decklink input buffer overrun!:03.54 bitrate=73110.9kbits/s  
    Last message repeated 20 times
    [decklink @ 0x1ccd6e0] Decklink input buffer overrun!:03.92 bitrate=76270.2kbits/s  
    Last message repeated 15 times
    [decklink @ 0x1ccd6e0] Decklink input buffer overrun!:04.28 bitrate=78367.6kbits/s  
    Last message repeated 61 times
    frame=  140 fps= 22 q=-0.0 Lsize=   57266kB time=00:00:04.67 bitrate=100425.2kbits/s  
    video:57187kB audio:72kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.009844%
    [decklink @ 0x1ccd6e0] Decklink input buffer overrun!
    Last message repeated 7 times
    [aac @ 0x1cd7020] Qavg: 215.556

    real   0m8.808s
    user   0m5.785s
    sys   0m1.749s

    Any insight into this would be appreciated, whether that's just some commands that might fix the issue, or otherwise.
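    One hedged starting point: the log above shows that -crf was ignored (this nvenc invocation is bitrate-driven via -b:v, and 29.97 looks like a frame rate rather than a quality value), and yadif at 3840x2160 runs on the CPU, which can easily stall a real-time capture. Assuming the Handycam's HDMI feed is progressive, so the deinterlacer can be dropped, a cleaned-up command might look like the sketch below; -thread_queue_size enlarges the input packet queue to absorb bursts, and 512 is an illustrative value:

    time ffmpeg -f decklink -thread_queue_size 512 -i "Intensity Pro 4K@20" -c:v nvenc -b:v 100M -pix_fmt yuv420p -r 29.97 -strict -2 output.mp4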