
Other articles (106)

  • No talk of market, cloud, etc.

    10 April 2011

    The vocabulary used on this site tries to avoid any reference to the fashions flourishing
    on the web 2.0 and in the businesses that live off it.
    You are therefore invited to avoid using terms such as "Brand", "Cloud", "Market", etc.
    Our motivation is above all to create a simple tool, accessible to everyone, that encourages
    the sharing of creations on the Internet and lets authors keep as much autonomy as possible.
    No "Gold or Premium contract" is therefore planned, no (...)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM for HTML5 playback and in MP4 for Flash playback.
    Audio files are encoded in Ogg for HTML5 playback and in MP3 for Flash playback.
    Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • Other interesting software

    13 April 2011

    We don’t claim to be the only ones doing what we do, and certainly not the best; we simply try to do it well and to keep improving.
    The list below covers software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to resemble.
    We don’t know these projects and haven’t tried them, but you can take a peek.
    Videopress
    Website : http://videopress.com/
    License : GNU/GPL v2
    Source code : (...)

On other sites (4730)

  • Live video stream on server (PC) from images sent by robot through UDP

    3 February 2018, by Richard Knop

    Hmm. I found this, which seems promising:

    http://sourceforge.net/projects/mjpg-streamer/


    OK, I will try to explain clearly and in detail what I am trying to do.

    I have a small humanoid robot with a camera and a wifi stick (this is the robot). The robot’s wifi stick has an average transfer rate of 1769 KB/s. The robot has a 500 MHz CPU and 256 MB RAM, which is not enough for any serious computation (moreover, there are already a couple of modules running on the robot for motion, vision, sonar, speech, etc.).

    I have a PC from which I control the robot. I am trying to have the robot walk around the room while I watch, on the PC, a live video stream of what the robot sees.

    What I already have working: the robot walks as I want it to and takes images with the camera. The images are sent over UDP to the PC, where I receive them (I have verified this by saving the incoming images to disk).

    The camera returns 640 x 480 px images in the YUV442 colorspace. I am sending the images with lossy compression (JPEG), because I am trying to get the best possible FPS on the PC. I do the JPEG compression on the robot with the PIL library.

    My questions:

    1. Could somebody please give me some ideas on how to turn the incoming JPEG images into a live video stream? I understand that I will need a video encoder for that. Which video encoder do you recommend? FFmpeg or something else? I am very new to video streaming, so I want to know what is best for this task. I’d prefer to write this in Python, so a video encoder or library with a Python API would be ideal. But I guess if the library has a good command-line API, it doesn’t have to be in Python.

    2. What is the best FPS I could get out of this, given the 1769 KB/s average wifi transfer rate and the dimensions of the images? Should I use a different compression than JPEG?

    3. I will be happy to see any code examples. Links to articles explaining how to do this would be fine, too.
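
    On question 2, a quick back-of-the-envelope bound: divide the available bandwidth by the compressed frame size. This ignores UDP/IP overhead, wifi variance, and CPU limits, and the per-frame JPEG sizes below are illustrative assumptions, not measurements; only the 1769 KB/s figure comes from the question. A sketch in Python:

    ```python
    # Bandwidth-limited FPS ceiling: bandwidth / average frame size.
    BANDWIDTH_KBS = 1769  # average wifi transfer rate from the question, KB/s

    def max_fps(frame_kb, bandwidth_kbs=BANDWIDTH_KBS):
        """Upper bound on frames per second for a given JPEG size in KB."""
        return bandwidth_kbs / frame_kb

    # A 640x480 JPEG typically lands somewhere around 20-60 KB
    # depending on quality (assumed range for illustration).
    for frame_kb in (20, 40, 60):
        print("%d KB/frame -> ~%.0f fps ceiling" % (frame_kb, max_fps(frame_kb)))
    ```

    The real figure will be lower once packet loss and the robot-side compression time are accounted for, but it shows why reducing JPEG quality (or resolution) raises the achievable FPS.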

    Some code samples. Here is how I am sending JPEG images from the robot to the PC (shortened, simplified snippet). This runs on the robot:

    # lots of code here

    from socket import socket, AF_INET, SOCK_DGRAM
    import StringIO            # Python 2, as used with the classic PIL API
    from PIL import Image

    UDPSock = socket(AF_INET, SOCK_DGRAM)

    while 1:
        image = camProxy.getImageLocal(nameId)
        size = (image[0], image[1])        # width, height
        data = image[6]                    # raw pixel bytes
        im = Image.fromstring("YCbCr", size, data)
        s = StringIO.StringIO()
        im.save(s, "JPEG")                 # compress in memory

        UDPSock.sendto(s.getvalue(), addr)

        camProxy.releaseImage(nameId)

    UDPSock.close()

    # lots of code here

    And here is how I am receiving the images. This runs on the PC:

    # lots of code here

    from socket import socket, AF_INET, SOCK_DGRAM

    UDPSock = socket(AF_INET, SOCK_DGRAM)
    UDPSock.bind(addr)

    while 1:
        data, addr = UDPSock.recvfrom(buf)
        # here I need to create a stream from the data,
        # which contains a JPEG image

    UDPSock.close()

    # lots of code here

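
    One hedged sketch for question 1: an MJPEG byte stream is nothing more than JPEG images concatenated back to back, so the receiver can validate each UDP payload and hand the concatenation to an encoder such as ffmpeg reading MJPEG from stdin (`-f mjpeg -i -`). The ffmpeg invocation and the output URL in the comment are illustrative assumptions, not part of the original code:

    ```python
    import subprocess

    SOI, EOI = b"\xff\xd8", b"\xff\xd9"  # JPEG start/end-of-image markers

    def is_jpeg(frame):
        """Cheap sanity check before forwarding a UDP payload."""
        return frame.startswith(SOI) and frame.endswith(EOI)

    def mjpeg_stream(frames):
        """Concatenate JPEG frames: that is all an MJPEG byte stream is."""
        return b"".join(f for f in frames if is_jpeg(f))

    # Hypothetical usage: pipe the MJPEG bytes into ffmpeg, which can then
    # re-encode and publish them (the output URL is up to you):
    # encoder = subprocess.Popen(
    #     ["ffmpeg", "-f", "mjpeg", "-i", "-",
    #      "-c:v", "libx264", "-f", "flv", "rtmp://localhost/live/robot"],
    #     stdin=subprocess.PIPE)
    # encoder.stdin.write(mjpeg_stream(received_frames))
    ```

    From there, any encoder that accepts MJPEG input can produce H.264, WebM, or a live RTMP/HLS feed; the pure-Python part above does not depend on ffmpeg being installed.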
  • Video stuttering using ffmpeg for RTSP->RTMP and nginx for HLS

    21 February 2018, by Charlton Peoples

    Here is my nginx.conf:

    #user  nobody;
    worker_processes  1;
    #error_log  logs/error.log;
    #error_log  logs/error.log  notice;
    #error_log  logs/error.log  info;

    #pid        logs/nginx.pid;


    events {
       worker_connections  1024;
    }


    http {
       include       mime.types;
       default_type application/octet-stream;
       sendfile        on;
       keepalive_timeout  65;
       server {
           listen       80;
           server_name  localhost;
           location / {
               root   html;
               index  index.html index.htm;
               #autoindex on;
           }


           # path to HLS application service
              location /hls {
              add_header Cache-Control no-cache;

              add_header 'Access-Control-Allow-Origin' '*' always;
              add_header 'Access-Control-Expose-Headers' 'Content-Length';

              if ($request_method = 'OPTIONS') {
                   add_header 'Access-Control-Allow-Origin' '*';
                   add_header 'Access-Control-Max-Age' 1728000;
                   add_header 'Content-Type' 'text/plain charset=UTF-8';
                   add_header 'Content-Length' 0;
                   return 204;
           }
               types {
                   #application/dash+xml mpd;
                   application/vnd.apple.mpegurl m3u8;
                   video/mp2t ts;
               }
               root /usr/local/nginx/html;

           }
           # redirect server error pages to the static page /50x.html
           error_page   500 502 503 504  /50x.html;
           location = /50x.html {
               root   html;
           }
       }

    }
       # HTTPS server
       #
       #server {
       #    listen       443 ssl;
       #    server_name  localhost;

       #    ssl_certificate      cert.pem;
       #    ssl_certificate_key  cert.key;

       #    ssl_session_cache    shared:SSL:1m;
       #    ssl_session_timeout  5m;

       #    ssl_ciphers  HIGH:!aNULL:!MD5;
       #    ssl_prefer_server_ciphers  on;

       #    location / {
       #        root   html;
       #        index  index.html index.htm;
       #    }
       #}




    rtmp {
       server {
           listen 1935; # Listen on standard RTMP port
           chunk_size 4000;

           application hls {
               live on;
               # Turn on HLS
               hls on;
    #            hls_type event;
               hls_path /usr/local/nginx/html/hls/;
               hls_fragment 3;
              hls_playlist_length 60;

               # disable consuming the stream from nginx as rtmp
    #            deny play all;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS -video_size 2048x1536 -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -f flv rtmp://localhost:1935/hls/onsyte;
           #exec_static /usr/bin/ffmpeg -i rtsp://IP_ADDRESS:10555 -f flv -r 25 -s 640x480 -an rtmp://localhost:1935/hls/littles
           #exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10555 -video_size 1920x1080 -c:v libx264 -preset veryfast -maxrate 8M -bufsize 9000k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 0 -start_number 1 -hls_allow_cache 0 -threads 4 -loglevel warning -f flv rtmp://localhost:1935/hls/littles;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10555 -video_size 1280x720 -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -f flv rtmp://localhost:1935/hls/littles;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10556 -video_size 1280x720 -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -f flv rtmp://localhost:1935/hls/biggies;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10557 -video_size 1280x720 -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -f flv rtmp://localhost:1935/hls/dog;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10558 -video_size 1280x720 -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -f flv rtmp://localhost:1935/hls/room;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10559 -video_size 1280x720 -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -f flv rtmp://localhost:1935/hls/play;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10560 -video_size 1280x720 -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -f flv rtmp://localhost:1935/hls/gentles;
           exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10555 -vcodec libx264 -vprofile baseline -g 10 -s 1280x720 -acodec libfaac -ar 44100 -ac 1 -f flv rtmp://localhost:1935/hls/littles;
          }

       }
    }
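
    An editorial aside on the exec_static lines above: the -hls_time, -hls_list_size, -start_number, and -hls_allow_cache flags belong to ffmpeg's own HLS muxer and have no effect when the output format is -f flv; here the HLS segmenting is done by nginx-rtmp itself (hls_fragment, hls_playlist_length). Likewise -movflags +faststart is an MP4 muxer option, and -video_size placed as an output option does not appear to scale an RTSP input (explicit scaling would use -s or a scale filter). A trimmed line, sketched under those assumptions and keeping the IP_ADDRESS placeholder as-is, might look like:

    ```
    exec_static /usr/bin/ffmpeg -rtsp_transport tcp -i rtsp://IP_ADDRESS:10555
            -c:v libx264 -crf 20 -preset veryfast -maxrate 3M -bufsize 3968k
            -r 10 -g 20 -tune zerolatency -s 1280x720
            -loglevel warning -f flv rtmp://localhost:1935/hls/littles;
    ```

    Dropping ignored flags won't by itself fix stuttering, but it narrows the configuration down to the options that actually take effect.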

    Here is my player HTML:

     

     
    <script src="https://unpkg.com/video.js/dist/video.js"></script>
    <script src="https://unpkg.com/videojs-contrib-hls/dist/videojs-contrib-hls.js"></script>
    <script>
    </script>

    The server has 8 CPUs and 8 GB of RAM. What is causing the stuttering? Video plays for no more than 10 seconds, then freezes for 3-20 seconds. The first stream (2048x1536) works fine and never stutters; the rest of the streams are the issue. (I am replacing the paid ipcamlive service with this setup, and I know it has to be a configuration setting, because with both streaming services running concurrently mine stutters while ipcamlive does not.)
    Please let me know if I need to provide more information!

  • Display ffmpeg frames on OpenGL texture

    21 March 2018, by naki

    I am using Dranger's tutorial01 (ffmpeg) to decode a video and get the frames. I want to use OpenGL to display the video.

    http://dranger.com/ffmpeg/tutorial01.html

    The main function is as follows:

    int main (int argc, char** argv) {
    // opengl stuff
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA);
    glutInitWindowSize(800, 600);
    glutCreateWindow("Hello GL");

    glutReshapeFunc(changeViewport);
    glutDisplayFunc(render);

    GLenum err = glewInit();
    if(GLEW_OK !=err){
       fprintf(stderr, "GLEW error");
       return 1;
    }

    glClear(GL_COLOR_BUFFER_BIT);


    glEnable(GL_TEXTURE_2D);
    GLuint texture;
    glGenTextures(1, &texture); //Make room for our texture
    glBindTexture(GL_TEXTURE_2D, texture);

    //ffmpeg stuff

    AVFormatContext *pFormatCtx = NULL;
    int             i, videoStream;
    AVCodecContext  *pCodecCtx = NULL;
    AVCodec         *pCodec = NULL;
    AVFrame         *pFrame = NULL;
    AVFrame         *pFrameRGB = NULL;
    AVPacket        packet;
    int             frameFinished;
    int             numBytes;
    uint8_t         *buffer = NULL;

    AVDictionary    *optionsDict = NULL;


    if(argc < 2) {
    printf("Please provide a movie file\n");
    return -1;
    }
    // Register all formats and codecs

    av_register_all();

    // Open video file
    if(avformat_open_input(&pFormatCtx, argv[1], NULL, NULL)!=0)
      return -1; // Couldn't open file

    // Retrieve stream information

    if(avformat_find_stream_info(pFormatCtx, NULL)<0)
    return -1; // Couldn't find stream information

    // Dump information about file onto standard error
    av_dump_format(pFormatCtx, 0, argv[1], 0);

    // Find the first video stream

    videoStream=-1;
    for(i=0; i<pFormatCtx->nb_streams; i++)
    if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
     videoStream=i;
     break;
    }
    if(videoStream==-1)
    return -1; // Didn't find a video stream

    // Get a pointer to the codec context for the video stream
    pCodecCtx=pFormatCtx->streams[videoStream]->codec;

    // Find the decoder for the video stream
    pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
    if(pCodec==NULL) {
      fprintf(stderr, "Unsupported codec!\n");
      return -1; // Codec not found
    }
    // Open codec
    if(avcodec_open2(pCodecCtx, pCodec, &optionsDict)<0)
      return -1; // Could not open codec

    // Allocate video frame
    pFrame=av_frame_alloc();

    // Allocate an AVFrame structure
    pFrameRGB=av_frame_alloc();
    if(pFrameRGB==NULL)
    return -1;

    // Determine required buffer size and allocate buffer
    numBytes=avpicture_get_size(PIX_FMT_RGB24, pCodecCtx->width,
                 pCodecCtx->height);
    buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));

    struct SwsContext      *sws_ctx = sws_getContext(pCodecCtx->width,
              pCodecCtx->height, pCodecCtx->pix_fmt, 800,
              600, PIX_FMT_RGB24, SWS_BICUBIC, NULL,
              NULL, NULL);


    // Assign appropriate parts of buffer to image planes in pFrameRGB
    // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
    // of AVPicture
    avpicture_fill((AVPicture *)pFrameRGB, buffer, PIX_FMT_RGB24,
        pCodecCtx->width, pCodecCtx->height);

    // Read frames and save first five frames to disk
    i=0;
    while(av_read_frame(pFormatCtx, &packet)>=0) {


    // Is this a packet from the video stream?
    if(packet.stream_index==videoStream) {
     // Decode video frame
     avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished,
              &packet);

     // Did we get a video frame?
     if(frameFinished) {
    // Convert the image from its native format to RGB
     /*  sws_scale
       (
           sws_ctx,
           (uint8_t const * const *)pFrame->data,
           pFrame->linesize,
           0,
           pCodecCtx->height,
           pFrameRGB->data,
           pFrameRGB->linesize
       );
      */
    sws_scale(sws_ctx, pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pFrameRGB->data, pFrameRGB->linesize);
     // additional opengl
       glBindTexture(GL_TEXTURE_2D, texture);

           //gluBuild2DMipmaps(GL_TEXTURE_2D, 3, pCodecCtx->width, pCodecCtx->height, GL_RGB, GL_UNSIGNED_INT, pFrameRGB->data[0]);
      // glTexSubImage2D(GL_TEXTURE_2D, 0, 0,0, 840, 460, GL_RGB, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);

           glTexImage2D(GL_TEXTURE_2D,                //Always GL_TEXTURE_2D
               0,                            //0 for now
               GL_RGB,                       //Format OpenGL uses for image
               pCodecCtx->width, pCodecCtx->height,  //Width and height
               0,                            //The border of the image
               GL_RGB, //GL_RGB, because pixels are stored in RGB format
               GL_UNSIGNED_BYTE, //GL_UNSIGNED_BYTE, because pixels are stored
                               //as unsigned numbers
               pFrameRGB->data[0]);               //The actual pixel data
     // additional opengl end  

    // Save the frame to disk
    if(++i<=5)
     SaveFrame(pFrameRGB, pCodecCtx->width, pCodecCtx->height,
           i);
     }
    }

    glColor3f(1,1,1);
    glBindTexture(GL_TEXTURE_2D, texture);
    glBegin(GL_QUADS);
       glTexCoord2f(0,1);
       glVertex3f(0,0,0);

       glTexCoord2f(1,1);
       glVertex3f(pCodecCtx->width,0,0);

       glTexCoord2f(1,0);
       glVertex3f(pCodecCtx->width, pCodecCtx->height,0);

       glTexCoord2f(0,0);
       glVertex3f(0,pCodecCtx->height,0);

    glEnd();
    // Free the packet that was allocated by av_read_frame
    av_free_packet(&packet);
    }


     // Free the RGB image
    av_free(buffer);
    av_free(pFrameRGB);

    // Free the YUV frame
    av_free(pFrame);

    // Close the codec
    avcodec_close(pCodecCtx);

    // Close the video file
    avformat_close_input(&amp;pFormatCtx);

    return 0;
    }

    Unfortunately I could not find my solution here:

    ffmpeg video to opengl texture

    The program compiles but does not show any video on the texture; just an empty OpenGL window is created.
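
    One structural issue stands out in the snippet above: all decoding and drawing happens inside main, glutMainLoop() is never called, and nothing ever presents the rendered frame, so GLUT has no chance to display anything. A C-style sketch of the usual callback shape (decode_next_frame is a hypothetical helper wrapping the av_read_frame/decode/sws_scale steps from main; texture, pCodecCtx, and pFrameRGB are assumed to be the globals main sets up):

    ```c
    /* Sketch only: move per-frame work into GLUT callbacks. */
    static void idle(void) {
        if (decode_next_frame())      /* hypothetical helper, see lead-in */
            glutPostRedisplay();      /* schedule a call to render() */
    }

    static void render(void) {
        glClear(GL_COLOR_BUFFER_BIT);
        glBindTexture(GL_TEXTURE_2D, texture);
        /* Re-upload the latest frame; after the first glTexImage2D
         * allocation, glTexSubImage2D is the cheaper per-frame path. */
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                        pCodecCtx->width, pCodecCtx->height,
                        GL_RGB, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
        /* ... draw the textured quad exactly as in the snippet above ... */
        glutSwapBuffers();            /* nothing appears without a present */
    }

    /* In main, after the ffmpeg setup:
     *   glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);   // double buffering
     *   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
     *   // ^ without this, the default mipmapped min filter leaves the
     *   //   texture incomplete and the quad renders untextured
     *   glutDisplayFunc(render);
     *   glutIdleFunc(idle);
     *   glutMainLoop();              // must be the last call; never returns
     */
    ```

    The vertex coordinates in pixel units would also need a matching orthographic projection, but the missing event loop and the incomplete texture are the most likely reasons the window stays empty.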