
Other articles (90)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    To get a working installation, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications will also be required (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • User profiles

    12 April 2011

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only if the visitor is logged in to the site.
    The user can edit their profile from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)

On other sites (16210)

  • FFMPEG "listfile.txt : Invalid data found when processing input" concat error

    18 July 2019, by Léandre

    I have a txt file listing all the mp4 files I want to concatenate with ffmpeg. I read the FFmpeg documentation but I don't see what I did wrong.

    Here is my txt file:

    file '/home/leandre/PycharmProjects/video_test/1.mp4'
    file '/home/leandre/PycharmProjects/video_test/2.mp4'
    file '/home/leandre/PycharmProjects/video_test/3.mp4'
    file '/home/leandre/PycharmProjects/video_test/4.mp4'
    file '/home/leandre/PycharmProjects/video_test/5.mp4'
    file '/home/leandre/PycharmProjects/video_test/6.mp4'
    file '/home/leandre/PycharmProjects/video_test/7.mp4'
    file '/home/leandre/PycharmProjects/video_test/8.mp4'
    file '/home/leandre/PycharmProjects/video_test/9.mp4'
    file '/home/leandre/PycharmProjects/video_test/10.mp4'

    And this is my ffmpeg command: ffmpeg -f concat -safe 0 -i listfile.txt -c copy out_concat.mp4

    And this is the error: listfile.txt: Invalid data found when processing input
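
    For reference, here is a minimal sketch in Python of the same concat run (assuming ffmpeg is on the PATH and the ten MP4s exist at the listed paths). One common cause of this exact error is a list file saved with a UTF-8 BOM or in a non-plain-text encoding, so the sketch rewrites listfile.txt as plain ASCII before calling ffmpeg:

     import subprocess

     paths = ["/home/leandre/PycharmProjects/video_test/%d.mp4" % i for i in range(1, 11)]

     # The concat demuxer expects each line to read  file '<path>'  in a plain-text
     # file; a UTF-8 BOM or stray characters at the start commonly trigger
     # "Invalid data found when processing input", so write plain ASCII.
     with open("listfile.txt", "w", encoding="ascii", newline="\n") as f:
         for p in paths:
             f.write("file '%s'\n" % p)

     # Same command as above.
     subprocess.run(
         ["ffmpeg", "-f", "concat", "-safe", "0", "-i", "listfile.txt",
          "-c", "copy", "out_concat.mp4"],
         check=True,
     )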

  • gstreamer: Internal data error in appsink "pull-sample" mode

    9 May 2018, by Amir Raza

    I am getting an internal data error in appsink.
    My application reads .yuv data, encodes it, and writes it to a buffer.

    I managed to write it to a file, but when I changed the code to write into a buffer I get this error.
    It is only able to write a single packet (188 bytes).

    Output of the program:

    (ConsoleApplication6.exe:14432): GStreamer-WARNING **: Failed to load plugin 'C:\gstreamer\1.0\x86_64\lib\gstreamer-1.0\libgstopenh264.dll': 'C:\gstreamer\1.0\x86_64\lib\gstreamer-1.0\libgstopenh264.dll': The specified procedure could not be found.
       pipeline:  filesrc location=Transformers1080p.yuv blocksize=4147200 ! videoparse  width=1920 height=1080 framerate=60/1 ! videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=60/1 !  x264enc ! mpegtsmux ! queue !  appsink name = sink
       Now playing: Transformers1080p.yuv
       Running...

        on_new_sample_from_sink

        sample got of size = 188
       Error: Internal data stream error.
       Returned, stopping playback
       Deleting pipeline

    My code:

    #define _CRT_SECURE_NO_WARNINGS 1
    //#pragma warning(disable:4996)
     #include <gst/gst.h>
     #include <gst/audio/audio.h>
     #include <gst/app/gstappsrc.h>
     #include <gst/base/gstpushsrc.h>
     #include <gst/app/gstappsink.h>
     #include <gst/video/video.h>
     #include <gst/video/gstvideometa.h>
     #include <gst/video/video-overlay-composition.h>

     /* standard headers used by the code below (printf/fwrite/fopen/sprintf) */
     #include <stdio.h>
     #include <stdlib.h>

     #include <string>
     #include <iostream>

    using namespace std;

    GstElement *SinkBuff;
    char *out_file_path;
    FILE *out_file;

    //gst-launch-1.0.exe -v filesrc location=Transformers1080p.yuv blocksize=4147200 !  
    //videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=60/1 !  
    //openh264enc ! mpegtsmux ! filesink location=final.ts


    static gboolean bus_call(GstBus     *bus, GstMessage *msg, gpointer    data)
    {
           GMainLoop *loop = (GMainLoop *)data;

           switch (GST_MESSAGE_TYPE(msg))
           {
           case GST_MESSAGE_EOS:
                   g_print("End of stream\n");
                   g_main_loop_quit(loop);
                   break;

           case GST_MESSAGE_ERROR:
             {
                   gchar  *debug;
                   GError *error;

                   gst_message_parse_error(msg, &error, &debug);
                   g_free(debug);

                   g_printerr("Error: %s\n", error->message);
                   g_error_free(error);

                   g_main_loop_quit(loop);
                   break;
             }
           default:
                   break;
       }
           return TRUE;
    }

    /* called when the appsink notifies us that there is a new buffer ready for
    * processing */
     static GstFlowReturn on_new_sample_from_sink(GstElement *elt, void *ptr)
     {
            GstBuffer *buffer;
            GstMapInfo map = { 0 };
            GstSample *sample;

            printf("\n on_new_sample_from_sink \n ");
            /* get the sample from appsink */
            g_signal_emit_by_name(SinkBuff, "pull-sample", &sample);
            if (sample)
            {
                   buffer = gst_sample_get_buffer(sample);
                   gst_buffer_map(buffer, &map, GST_MAP_READ);

                   printf("\n sample got of size = %" G_GSIZE_FORMAT " \n", map.size);
                   /* write the whole mapped sample (map.size bytes) to the output file */
                   fwrite(map.data, 1, map.size, out_file);

                   gst_buffer_unmap(buffer, &map);
                   gst_sample_unref(sample);
            }
            return GST_FLOW_OK;
     }


    int main(int   argc, char *argv[])
    {
           GMainLoop *loop;
           int width, height;

           GstElement *pipeline;
           GError *error = NULL;
           GstBus *bus;
           char pipeline_desc[1024];
           out_file = fopen("output.ts", "wb");


           /* Initialisation */
            gst_init(&argc, &argv);

           // Create gstreamer loop
           loop = g_main_loop_new(NULL, FALSE);

           sprintf(
                   pipeline_desc,
                   " filesrc location=Transformers1080p.yuv blocksize=4147200 !"
                   " videoparse  width=1920 height=1080 framerate=60/1 !"
                   " videoconvert ! video/x-raw,format=I420,width=1920,height=1080,framerate=60/1 ! "
                   //" x264enc ! mpegtsmux ! filesink location=final.ts");
                   " x264enc ! mpegtsmux ! queue !  appsink name = sink");


           printf("pipeline: %s\n", pipeline_desc);

           /* Create gstreamer elements */
            pipeline = gst_parse_launch(pipeline_desc, &error);

           /* TODO: Handle recoverable errors. */

           if (!pipeline) {
                   g_printerr("Pipeline could not be created. Exiting.\n");
                   return -1;
           }

           /* get sink */
           SinkBuff = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
           g_object_set(G_OBJECT(SinkBuff), "emit-signals", TRUE, "sync", FALSE, NULL);
           g_signal_connect(SinkBuff, "new-sample", G_CALLBACK(on_new_sample_from_sink), NULL);


           /* Set up the pipeline */
           /* we add a message handler */
           bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
           gst_bus_add_watch(bus, bus_call, loop);
           gst_object_unref(bus);

           /* Set the pipeline to "playing" state*/
           g_print("Now playing: Transformers1080p.yuv \n");
           gst_element_set_state(pipeline, GST_STATE_PLAYING);

           /* Iterate */
           g_print("Running...\n");
           g_main_loop_run(loop);

           /* Out of the main loop, clean up nicely */
           g_print("Returned, stopping playback\n");
           gst_element_set_state(pipeline, GST_STATE_NULL);

           g_print("Deleting pipeline\n");
           gst_object_unref(GST_OBJECT(pipeline));
           fclose(out_file);
           g_main_loop_unref(loop);


           return 0;
    }

  • Create a video with timestamp from multiple images with "picture taken date/time" in the metadata with ffmpeg or similar?

    16 March 2018, by m4D_guY

    I have two time-lapse videos with a rate of 1 fps. The camera took one image every minute. Unfortunately, the camera was not set to burn/print the time and date onto every image. I am trying to burn the time and date into the video afterwards.

    I decoded the two .avi files with ffmpeg into 7000 single images each and wrote an R script that renamed the files to their "creation" date (the time and date the pictures were taken). Then I used exiftool to write that information "into" the files, into their EXIF data or metadata or whatever this is called.

    The final images in the folder look like this:

    2018-03-12 17_36_40.png

    2018-03-12 17_35_40.png

    2018-03-12 17_34_40.png

    ...

    Is it possible to create a video from these images again with ffmpeg or similar, with a "timestamp" in the video, so that you can see a time and date stamp while watching?
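
    One way to approach this, sketched below in Python under a few assumptions (ffmpeg on the PATH and built with the drawtext filter; the PNGs sit in a hypothetical frames/ folder and are named exactly like the examples above): burn each file's timestamp onto its frame with drawtext, write the stamped frames out under sequential names, then encode them as a 1 fps video. Depending on the build, drawtext may also need an explicit fontfile= option.

     import pathlib
     import subprocess

     src_dir = pathlib.Path("frames")      # hypothetical folder holding the renamed PNGs
     out_dir = pathlib.Path("stamped")
     out_dir.mkdir(exist_ok=True)

     # Names like "2018-03-12 17_36_40.png" sort chronologically, so sorted()
     # already yields the right frame order.
     for idx, png in enumerate(sorted(src_dir.glob("*.png"))):
         stamp = png.stem.replace("_", ":")        # -> "2018-03-12 17:36:40"
         text = stamp.replace(":", r"\:")          # ':' must be escaped inside drawtext
         drawtext = ("drawtext=text='%s':x=10:y=10:fontsize=36:"
                     "fontcolor=white:box=1:boxcolor=black@0.5" % text)
         # Burn the timestamp onto this frame and store it under a sequential name.
         subprocess.run(
             ["ffmpeg", "-y", "-i", str(png), "-vf", drawtext,
              str(out_dir / ("frame_%05d.png" % idx))],
             check=True,
         )

     # Re-assemble the stamped frames into a 1 fps video.
     subprocess.run(
         ["ffmpeg", "-y", "-framerate", "1", "-i", str(out_dir / "frame_%05d.png"),
          "-c:v", "libx264", "-pix_fmt", "yuv420p", "timelapse_stamped.mp4"],
         check=True,
     )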