Other articles (86)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    For a working installation, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications are also required (...)

  • General document management

    13 May 2011, by

    MediaSPIP never modifies the original uploaded document.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original downloadable in case it cannot be read in a web browser; and retrieving the original document's metadata to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields. Compare the two following images.
    Simply enable the Chosen plugin (General site configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, e.g. select[multiple] for multiple-selection lists (...)

On other sites (10565)

  • Open USB camera with OpenCV and stream to rtsp server

    12 September 2017, by user2594166

    I have a Logitech C920 camera connected via USB to an NVIDIA TX1. I am trying to stream the camera feed over RTSP to a server while also doing some computer vision in OpenCV. I managed to read H264 video from the USB camera in OpenCV:

    #include <iostream>
    #include <opencv2/opencv.hpp>

    using namespace cv;
    using namespace std;

    int main()
    {
       Mat img;
       VideoCapture cap;
       int heightCamera = 720;
       int widthCamera = 1280;

       // Start video capture port 0
       cap.open(0);


       // Check if we succeeded
       if (!cap.isOpened())
       {
           cout << "Unable to open camera" << endl;
           return -1;
       }
       // Set frame width and height
       cap.set(CV_CAP_PROP_FRAME_WIDTH, widthCamera);
       cap.set(CV_CAP_PROP_FRAME_HEIGHT, heightCamera);
       cap.set(CV_CAP_PROP_FOURCC, CV_FOURCC('X','2','6','4'));

       // Set camera FPS
       cap.set(CV_CAP_PROP_FPS, 30);

       while (true)
       {
           // Copy the current frame to an image
           cap >> img;

           // Show video streams
           imshow("Video stream", img);

           waitKey(1);
       }

       // Release video stream
       cap.release();

       return 0;
    }

    I have also streamed the USB camera to an RTSP server using ffmpeg:
    ffmpeg -f v4l2 -input_format h264 -timestamps abs -video_size hd720 -i /dev/video0 -c:v copy -c:a none -f rtsp rtsp://10.52.9.104:45002/cameraTx1

    I tried to google how to combine these two functions, i.e. opening the USB camera in OpenCV and using OpenCV to stream H264 RTSP video. However, all I can find is people trying to open an RTSP stream in OpenCV.

    Has anyone successfully streamed H264 RTSP video using OpenCV with ffmpeg?

    Best regards
    Sondre
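
    One way to combine the two (a sketch, untested on the TX1; the geometry, frame rate, and URL are placeholders) is to keep OpenCV for capture and pipe each frame's raw BGR bytes into an ffmpeg child process that does the H264 encoding and RTSP publishing:

```cpp
#include <cstdio>
#include <string>

// Build the ffmpeg command that reads headerless BGR frames from stdin
// and publishes them over RTSP. Raw video carries no metadata, so the
// pixel format, geometry, and rate must be declared explicitly.
std::string ffmpegRtspCmd(int width, int height, int fps, const std::string& url)
{
    return "ffmpeg -f rawvideo -pix_fmt bgr24"
           " -s " + std::to_string(width) + "x" + std::to_string(height) +
           " -r " + std::to_string(fps) +
           " -i pipe:0 -c:v libx264 -preset ultrafast -f rtsp " + url;
}

// Usage sketch inside the capture loop (needs a camera and a listening
// RTSP server; cap and img are the VideoCapture and Mat from above):
// FILE* ff = popen(ffmpegRtspCmd(1280, 720, 30,
//                  "rtsp://10.52.9.104:45002/cameraTx1").c_str(), "w");
// while (true) {
//     cap >> img;
//     fwrite(img.data, 1, img.total() * img.elemSize(), ff);
// }
```

    OpenCV's Mat stores packed BGR pixels, hence bgr24; encoding is left entirely to the ffmpeg child, so OpenCV's own FOURCC setting no longer matters.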

  • FFMPEG Android Camera Streaming to RTMP

    2 February 2017, by Omer Abbas

    I need help streaming the Android camera to an RTMP server using FFmpeg. I have compiled FFmpeg for Android as a shared library, and everything on the FFmpeg side works perfectly: I streamed an existing video file to RTMP and it worked great. I then used the camera, wrote its raw video to a file, and streamed that file to the RTMP server while it was still being written. That also works, but the file keeps growing. I have read that MediaRecorder can stream data to ffmpeg through a LocalSocket (a Unix domain socket), so I took the MediaRecorder sample from https://github.com/googlesamples/android-MediaRecorder and modified it to use a local socket:

    receiver = new LocalSocket();
       try {

           localSocketServer = new LocalServerSocket("camera2rtsp");

           // FileDescriptor the Camera can send to
           sender = localSocketServer.accept();
           sender.setReceiveBufferSize(500000);
           sender.setSendBufferSize(500000);

       } catch (IOException e1) {
           e1.printStackTrace();
           super.onResume();
           finish();
           //return;
       }

       mMediaRecorder.setOutputFile(sender.getFileDescriptor());

    I tried to access this socket with the ffmpeg command below, but it failed with the error unix://camera2rtsp: no such file or directory.

    ffmpeg -i unix://camera2rtsp -vcodec libx264 -f flv rtmp://server

    So I tried a ParcelFileDescriptor pipe:

    pipe = getPipe();
    ParcelFileDescriptor parcelWrite  = new ParcelFileDescriptor(pipe[1]);
    mMediaRecorder.setOutputFile(parcelWrite.getFileDescriptor());

    With command

    "ffmpeg -re -r 30 -f rawvideo -i pipe:" + pipe[0].getFd() + " -vcodec libx264 -f flv rtmp://server"

    But this does not work either; ffmpeg seems to read an empty pipe and warns about the size of Stream #0:0.

    Also, MediaRecorder with a ParcelFileDescriptor as output file gives the error "E/MediaRecorder: start failed: -2147483648" on a Galaxy S7, but works on a Motorola phone running KitKat.

    I might be using the pipe or socket incorrectly. If anyone has experience streaming the Android camera with ffmpeg, please help.

    I have figured out that MediaRecorder encodes in MP4 format, which is not seekable or streamable because of its headers and metadata.

    So I read a tutorial, http://www.oodlestechnologies.com/blogs/Stream-video-from-camera-preview-to-wowza-server-in-Android, which shows that we can write raw video to a file and pass this file to ffmpeg. It works great, but the issue is that the file size keeps increasing. So my question is: can I pass camera raw video to ffmpeg with a ParcelFileDescriptor pipe or a LocalSocket?

    Code :

    File f = new File(Environment.getExternalStorageDirectory() + "/test.data");
    if (!f.exists()) {
        f.createNewFile();
    }

    // Write the raw preview frames into the file created above
    OutputStream outStream = new FileOutputStream(f);

    Camera.Parameters parameters = mCamera.getParameters();
    int imageFormat = parameters.getPreviewFormat();
    if (imageFormat == ImageFormat.NV21) {
        Camera.Size previewSize = parameters.getPreviewSize();
        int frameWidth = previewSize.width;
        int frameHeight = previewSize.height;

        Rect rect = new Rect(0, 0, frameWidth, frameHeight);
        YuvImage img = new YuvImage(arg0, ImageFormat.NV21, frameWidth, frameHeight, null);

        // arg0 is the NV21 byte[] delivered by onPreviewFrame
        outStream.write(arg0);
        outStream.flush();

        //img.compressToJpeg(rect, 50, processIn);
    }
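
    If the NV21 preview frames are piped to ffmpeg instead of a file, ffmpeg must be told the exact layout of the headerless stream. A sketch of the per-frame byte count and the corresponding command line (resolution, rate, and server URL are placeholders):

```cpp
#include <string>

// NV21 stores a full-resolution Y plane plus one interleaved VU plane
// at quarter resolution: w*h + w*h/2 bytes per frame.
long nv21FrameBytes(int width, int height)
{
    return (long)width * height * 3 / 2;
}

// Command ffmpeg would need to read headerless NV21 frames from a pipe;
// the preview geometry and rate must match what the camera delivers.
std::string rawvideoRtmpCmd(int width, int height, int fps, const std::string& url)
{
    return "ffmpeg -f rawvideo -pix_fmt nv21"
           " -s " + std::to_string(width) + "x" + std::to_string(height) +
           " -r " + std::to_string(fps) +
           " -i pipe:0 -c:v libx264 -pix_fmt yuv420p -f flv " + url;
}
```

    For a 1280x720 preview this is 1382400 bytes per frame; if ffmpeg reads anything other than whole frames of exactly this size, the stream desynchronizes, which is one way to get the empty-pipe warnings described above.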
  • FFmpeg IP camera hls video not playing in IOS

    13 June 2019, by Devesh Kumar

    We have created Node.js code that generates an HLS stream from an IP camera using ffmpeg.

    It works in desktop browsers and on Android devices, using a React HLS player based on video.js.

    It does not work on iOS. I have attached the sample code for the stream.

    I have also checked with Flowplayer, which works on iOS with Flowplayer's sample HLS video but does not play our camera stream.

    Here's a sample HLS video which works on iOS.

    I have also attached screenshots of both videos with their codec info.

    Video 1: our camera stream (not working on iOS)
    Video 2: Flowplayer sample video (working on iOS)

    Here's the sample video codec info.

    Here is the Node.js sample used to create the HLS stream from the live IP camera RTSP feed using FFmpeg:

    this.ffmpeg = child_process.spawn(`${process.cwd()}/ffmpeg/bin/ffmpeg.exe`,
    [
       "-fflags", "nobuffer",
       "-i", url, "-y",
       //"-vcodec", "libx264",
       "-c:v", "h264",
       "-preset:v", "ultrafast",
       "-acodec", "aac",
       "-ac", "1",
       "-strict", "-2",
       "-crf", "30",
       "-profile:v", "baseline",
       "-maxrate", "400k",
       "-bufsize", "535k",
       "-pix_fmt", "yuv420p",
       "-r", "30",
       "-flags", "-global_header",
       "-hls_time", "10",
       "-hls_list_size", "4",
       //"-hls_wrap", "4",
       "-hls_flags", "delete_segments+append_list+omit_endlist",
       "-hls_base_url", "segment/",
       "-start_number", "1",
       "-s", (size || '568x320'),
       `public/${this.camId}/out_${util.storeId}_${this.camId}_.m3u8`,
       //"-vcodec", "libx264",
       "-c:v", "h264",
       "-preset:v", "ultrafast",
       "-acodec", "aac",
       "-ac", "1",
       "-strict", "-2",
       "-crf", "30",
       "-profile:v", "baseline",
       "-maxrate", "2996k",
       "-bufsize", "4200k",
       "-pix_fmt", "yuv420p",
       "-r", "30",
       "-flags", "-global_header",
       "-hls_time", "10",
       "-hls_list_size", "4",
       //"-hls_wrap", "4",
       "-hls_flags", "delete_segments+append_list+omit_endlist",
       "-hls_base_url", "segment/",
       "-start_number", "1",
       "-s", (size || '1280x720'),
       `public/${this.camId}/out_${util.storeId}_${this.camId}_720.m3u8`
    ], { detached: true });
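
    One thing worth checking (an assumption about the setup above, not a confirmed fix): the command produces two variant playlists but no master playlist, and HLS clients, Apple's in particular, select renditions through a master playlist whose EXT-X-STREAM-INF entries carry a BANDWIDTH attribute. A sketch of generating one from the -maxrate figures above (the file names are simplified stand-ins for the templated names the command writes):

```cpp
#include <sstream>
#include <string>
#include <vector>

struct Variant {
    std::string uri;        // variant playlist file name
    long bandwidth;         // peak bits per second (from -maxrate)
    std::string resolution; // "WxH" as passed to -s
};

// Emit an HLS master playlist listing each rendition with the
// BANDWIDTH and RESOLUTION attributes players use to choose one.
std::string masterPlaylist(const std::vector<Variant>& variants)
{
    std::ostringstream out;
    out << "#EXTM3U\n#EXT-X-VERSION:3\n";
    for (const Variant& v : variants) {
        out << "#EXT-X-STREAM-INF:BANDWIDTH=" << v.bandwidth
            << ",RESOLUTION=" << v.resolution << "\n"
            << v.uri << "\n";
    }
    return out.str();
}
```

    Writing the result of masterPlaylist({{"out_store_cam_.m3u8", 400000, "568x320"}, {"out_store_cam_720.m3u8", 2996000, "1280x720"}}) next to the two variant playlists gives players a single entry point; whether this alone resolves the iOS playback issue depends on the player and the stream's codecs.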