Media (91)

Other articles (111)

  • Embellishing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and templates ("squelettes"). The templates define where information is placed on the page, defining a specific use of the platform, while the themes provide the overall graphic design.
    Anyone can contribute a new graphic theme or template and make it available to the community.

  • XMP PHP

    13 May 2011

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use in the context of the Semantic Web.
    XMP makes it possible to record, in the form of an XML document, information about a file: title, author, history (...)

  • A selection of projects using MediaSPIP

    29 April 2011

    The examples cited below are representative of specific uses of MediaSPIP in certain projects.
    Do you think you have built a "remarkable" site with MediaSPIP? Let us know here.
    MediaSPIP farm @ Infini
    The Infini association develops activities around community access, public internet access points, training, and the running of innovative projects in the field of Information and Communication Technologies, as well as website hosting. It plays a unique role in this area (...)

On other sites (13651)

  • Stream low latency RTSP video to android with ffmpeg

    21 October 2014, by grzebyk

    I am trying to stream live webcam video from an Ubuntu 12.04 PC to an Android device running KitKat. So far I have written an ffserver config file to receive an ffm feed and broadcast it over the RTSP protocol. I am able to watch the stream on another computer on the same LAN with ffplay.

    How can I watch the stream on the Android device? The following code works well when the webcam image is streamed with VLC, but it doesn't with ffmpeg:

    import android.app.Activity;
    import android.content.Context;
    import android.media.MediaPlayer;
    import android.net.Uri;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.Window;
    import android.view.WindowManager;

    public class MainActivity extends Activity implements MediaPlayer.OnPreparedListener,
            SurfaceHolder.Callback {

        private static final String TAG = "MainActivity";
        static final String RTSP_URL = "rtsp://192.168.1.54:4424/test.sdp";

        private MediaPlayer _mediaPlayer;
        private SurfaceHolder _surfaceHolder;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Set up a full-screen black window.
            requestWindowFeature(Window.FEATURE_NO_TITLE);
            Window window = getWindow();
            window.setFlags(
                    WindowManager.LayoutParams.FLAG_FULLSCREEN,
                    WindowManager.LayoutParams.FLAG_FULLSCREEN);
            window.setBackgroundDrawableResource(android.R.color.black);
            window.addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
            setContentView(R.layout.activity_main);

            // Configure the view that renders live video.
            // R.id.videoView is a plain SurfaceView element in the layout XML file.
            SurfaceView videoView = (SurfaceView) findViewById(R.id.videoView);
            _surfaceHolder = videoView.getHolder();
            _surfaceHolder.addCallback(this);
            _surfaceHolder.setFixedSize(320, 240);
        }

        @Override
        public void surfaceCreated(SurfaceHolder surfaceHolder) {
            _mediaPlayer = new MediaPlayer();
            _mediaPlayer.setDisplay(_surfaceHolder);
            Context context = getApplicationContext();
            Uri source = Uri.parse(RTSP_URL);
            try {
                // Point the player at the RTSP stream.
                _mediaPlayer.setDataSource(context, source);

                // Begin the asynchronous setup of the video stream.
                _mediaPlayer.setOnPreparedListener(this);
                _mediaPlayer.prepareAsync();
            } catch (Exception e) {
                // Don't swallow failures silently.
                Log.e(TAG, "Failed to set data source", e);
            }
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            // Nothing to do; the surface size is fixed in onCreate().
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            if (_mediaPlayer != null) {
                _mediaPlayer.release();
                _mediaPlayer = null;
            }
        }

        @Override
        public void onPrepared(MediaPlayer mediaPlayer) {
            _mediaPlayer.start();
        }
    }

    My ffserver.config file:

    HTTPPort 8090
    RTSPBindAddress 0.0.0.0
    RTSPPort 4424
    MaxBandwidth 10000
    CustomLog -

    <Feed feed1.ffm>
        File /tmp/feed1.ffm
        FileMaxSize 20M
        ACL allow 127.0.0.1
    </Feed>

    <Stream test.sdp>
        Feed feed1.ffm
        Format rtp
        VideoCodec libx264
        VideoSize 640x480
        AVOptionVideo flags +global_header
        AVOptionVideo me_range 16
        AVOptionVideo qdiff 4
        AVOptionVideo qmin 10
        AVOptionVideo qmax 51
        NoAudio
        ACL allow localhost
        ACL allow 192.168.0.0 192.168.255.255
    </Stream>

    I start the stream with this command: ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -b:v 600k http://localhost:8090/feed1.ffm
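    Much of the latency in a chain like this comes from the encoder itself. A possible variant of that command (a sketch, not from the original post: `-preset ultrafast` and `-tune zerolatency` are standard libx264 options that disable lookahead and B-frames, though the actual gain depends on the setup):

```shell
# Same capture source and feed URL as above, but with x264 tuned
# to minimize encoding delay at the cost of compression efficiency.
ffmpeg -f v4l2 -i /dev/video0 \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -b:v 600k http://localhost:8090/feed1.ffm
```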

  • Adobe AIR, NativeProcess and ffmpeg

    18 August 2013, by Tom Lecoz

    First of all, please excuse me if my English is not perfect; I hope you can understand me...

    Yesterday I discovered this video tutorial about the AIR NativeProcess API and FFMPEG; it shows how to create an AIR app that can play every video format (avi, mkv, ...).

    http://www.youtube.com/watch?v=6N7eN9wvAGQ

    I tried to reproduce it, but obviously it didn't work...

    Then I looked on the internet for an example of it with working source code. I found this:

    http://suzhiyam.wordpress.com/2011/05/05/as3ffmpeg-play-multi-video-formats-in-air/

    The author says it's a document class for a Flash project; it just needs a button component, a text area and a Video object.

    I tried it inside a Flash project, and it didn't work...
    So I made some adjustments to run it outside Flash (I just replaced the DisplayObjects on the stage with ActionScript code), simply because that makes it easier to test.

    When I say "it didn't work", I mean my video is not played at all. The NativeProcess is working and my executable is recognized, but I always get some ProgressEvent.STANDARD_ERROR_DATA events and then the process stops...

    I'm working on Windows 7 64-bit.
    I tried with both the 32-bit and 64-bit builds of FFMPEG and got exactly the same (non-)result.

    You can find my code here
    pastebin.com/U3xRUKWe

    Can someone help me?

    Please! :)

    Thanks in advance!

    Tom

  • run ffmpeg commands from my own project

    28 October 2013, by bruno

    I'm starting a project where I want people to upload a video of a talk and a video of the slides for that talk; I want to merge the two (so that they play at the same time) and then show the result.

    My question is: is it possible to do that from code? If so, can you point me to the right documentation?
    I was able to do it from the command line, but since I want this to run on a server with different people uploading their videos, I don't think that would be the best approach.
    I have a preference for Java if it's possible, but I can manage other languages. What do you suggest?

    The idea would be to have a service where I can point at the URLs of videos stored on my server, and it would merge them and save a file that I can later stream, with different people uploading videos at the same time and the results viewable within a reasonable amount of time.

    I used this tutorial for testing:
    https://trac.ffmpeg.org/wiki/Create%20a%20mosaic%20out%20of%20several%20input%20videos

    Thanks for your time
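    Calling ffmpeg from Java typically comes down to spawning the binary as a subprocess and draining its output. A minimal sketch (the file names, the `hstack` side-by-side layout, and the `FfmpegRunner` class are illustrative assumptions, not part of the original post; it assumes ffmpeg is on the server's PATH):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Arrays;
import java.util.List;

public class FfmpegRunner {

    // Build the ffmpeg argument list that places the two videos
    // side by side with the hstack filter. Paths are placeholders.
    static List<String> buildCommand(String talk, String slides, String out) {
        return Arrays.asList(
                "ffmpeg", "-y",
                "-i", talk,
                "-i", slides,
                "-filter_complex", "hstack=inputs=2",
                "-c:v", "libx264",
                out);
    }

    // Run a command and return its exit code. ffmpeg writes its log
    // to stderr, so merge it into stdout and drain the combined
    // stream; otherwise the OS pipe buffer can fill and block ffmpeg.
    static int run(List<String> command) throws Exception {
        Process p = new ProcessBuilder(command)
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildCommand("talk.mp4", "slides.mp4", "merged.mp4");
        System.out.println(String.join(" ", cmd));
        // run(cmd);  // uncomment on a machine with ffmpeg on the PATH
    }
}
```

    For several people uploading at once, each `run()` call can be handed to a thread pool (e.g. an `ExecutorService`) so that merges are queued rather than spawning an unbounded number of ffmpeg processes.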