
Other articles (44)

  • The SPIPmotion processing queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)

  • Videos

    21 April 2011, by

    Like "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 <video> tag.
    One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, to name one) and that each browser natively supports only certain video formats.
    Its main advantage is native video support in the browser, which avoids the use of Flash and (...)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

On other sites (7152)

  • Android ffmpeg command shows java.io.IOException: Error running exec()

    13 April 2015, by Jamal

    I would like to use the ffmpeg binary executable in my Android project. For this purpose I used a prebuilt ffmpeg Android binary from this link (https://github.com/hiteshsondhi88/ffmpeg-android/releases/download/v0.3.3/prebuilt-binaries.zip).

    As per the procedure, I have to place the executable file into the /data/data/com.example.rampedsample directory, where com.example.rampedsample is my project's package name. I couldn't find this location on my device, as it is unrooted, so I pasted the executable into the Android emulator's com.example.rampedsample directory using the DDMS perspective.

    In my Activity I used the code below:

     try {
         Process p = Runtime.getRuntime().exec("/data/data/com.example.rampedsample/ffmpeg "
                 + Environment.getExternalStorageDirectory() + "/Movies/ramp_video.mp4"
                 + " -map 0:v -codec copy "
                 + Environment.getExternalStorageDirectory() + "/Movies/ramp_video2.mp4");
     } catch (IOException e) {
         e.printStackTrace();
     }

    AndroidManifest permission:

    The error:

    04-13 16:59:55.314: W/System.err(11387): java.io.IOException: Error running exec(). Command: [/data/data/com.example.rampedsample/ffmpeg, /mnt/sdcard/Movies/ramp_video.mp4, -map, 0:v, -codec, copy, /mnt/sdcard/Movies/ramp_video2.mp4] Working Directory: null Environment: null
    04-13 16:59:55.314: W/System.err(11387):    at java.lang.ProcessManager.exec(ProcessManager.java:211)
    04-13 16:59:55.355: W/System.err(11387):    at java.lang.Runtime.exec(Runtime.java:168)
    04-13 16:59:55.355: W/System.err(11387):    at java.lang.Runtime.exec(Runtime.java:241)
    04-13 16:59:55.355: W/System.err(11387):    at java.lang.Runtime.exec(Runtime.java:184)
    04-13 16:59:55.355: W/System.err(11387):    at com.example.rampedsample.MainActivity.onCreate(MainActivity.java:18)
    04-13 16:59:55.355: W/System.err(11387):    at android.app.Activity.performCreate(Activity.java:5008)
    04-13 16:59:55.355: W/System.err(11387):    at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1079)
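    A common cause of this exact `Error running exec()` is that the pushed binary lacks the execute bit: copying a file through DDMS does not mark it executable. A minimal sketch of that likely fix follows — the paths come from the question, but the class and method names are hypothetical, and whether permissions were really lost on your device is an assumption:

```java
import java.io.File;
import java.io.IOException;

// Sketch: restore the execute bit before calling exec(). Files pushed via
// DDMS are typically not executable, and Runtime.exec() then fails with
// "Error running exec()".
public class FfmpegExec {

    // Pass the arguments as an array so paths with spaces survive intact.
    // Note: ffmpeg expects -i before the input file, which the original
    // command string omitted.
    public static String[] buildCommand(String binary, String input, String output) {
        return new String[] { binary, "-i", input, "-map", "0:v", "-codec", "copy", output };
    }

    public static int run(File binary, String input, String output)
            throws IOException, InterruptedException {
        binary.setExecutable(true, false); // restore the execute bit lost in the copy
        Process p = Runtime.getRuntime().exec(buildCommand(binary.getPath(), input, output));
        return p.waitFor(); // 0 means ffmpeg exited successfully
    }
}
```

    On a stock unrooted device, note that /data/data/<package> is only writable by the app itself, so the chmod-equivalent call has to run from inside the app, as above.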
  • How do I automate running a batch command with 3 arguments in C# (MVC 3)?

    13 January 2014, by kheya

    I have this command in a process.bat file.
    The file takes 3 arguments: %1 = input file path, %2 = output path, %3 = output file name.

    This is how I call it at the command prompt:

    C:\apps\xyz>process.bat "c:\files\uploads" "c:\files\output" "123"

    This creates 2 files using FFMPEG: c:\files\output\abc.mp4 and c:\files\output\123.jpg.
    Things work fine if I run it at the command prompt.

    @echo off
    set w=480
    set h=320
     for %%a in ("%1\*.avi") do (
         ffmpeg -i "%%a" -c:v libx264 -movflags +faststart -preset slow -crf 22 -b:v 500k -vf "scale=480:trunc(ow/a/2)*2" -threads 0 -c:a libfdk_aac -b:a 128k "%2\%%~na.mp4" -vf select="not(mod(n\,10))" -r 1 -t 1 -ss 3 -s sqcif "%2\%3.jpg"
     )

    But I need to automate this process, so I want a scheduled job or some other process that handles files periodically, or whenever there are files to process.

    I was thinking of creating a C# console app that runs every x minutes.
    The console app would pass the 3 parameters to the bat file and run it.
    But I am having a nightmare with the console app: it just doesn't work.
    I never see the files generated, nor any error.

    What would be the best practice to implement this automation on Windows (C#, MVC 3)?

    Here is what I tried, which never worked:

    public static string RunBatchFile(string fullPathToBatch, string args) {
        using (var proc = new Process {
            StartInfo =
            {
                // A .bat file is not an executable: launch it through
                // cmd.exe so it runs with UseShellExecute = false.
                FileName = "cmd.exe",
                Arguments = "/c \"" + fullPathToBatch + "\" " + args,
                UseShellExecute = false,
                CreateNoWindow = true,
                RedirectStandardOutput = false,
                // Capture stderr (where ffmpeg logs) so a failing run
                // reports something instead of silently doing nothing.
                RedirectStandardError = true
            }
        })
        {
            try {
                proc.Start();
                string stdErr = proc.StandardError.ReadToEnd();
                proc.WaitForExit();
                if (proc.ExitCode != 0)
                    return "Batch exited with code " + proc.ExitCode + ": " + stdErr;
            } catch (Win32Exception e) {
                if (e.NativeErrorCode == 2)
                    return "File not found exception";
                else if (e.NativeErrorCode == 5)
                    return "Access Denied Exception";
            }
        }

        return "OK";
    }
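    The "run every x minutes and hand three parameters to the .bat" idea can be sketched as follows. This is Java rather than C# (the question targets C#, but the shape — explicit cmd.exe /c invocation, a fixed-rate timer, and not swallowing the process output — carries over directly); the class name and the 5-minute interval are made up, while the paths are the ones from the question:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch of a periodic batch runner. Invoking cmd.exe /c explicitly is
// what makes a .bat runnable without the shell, and inheritIO() surfaces
// ffmpeg's output instead of hiding it.
public class BatchScheduler {

    // Pass each argument separately: this avoids the quoting bugs that
    // come with concatenating one big command string.
    public static String[] buildCommand(String batch, String in, String out, String name) {
        return new String[] { "cmd.exe", "/c", batch, in, out, name };
    }

    public static void main(String[] args) {
        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        timer.scheduleAtFixedRate(() -> {
            try {
                Process p = new ProcessBuilder(
                        buildCommand("C:\\apps\\xyz\\process.bat",
                                "c:\\files\\uploads", "c:\\files\\output", "123"))
                        .inheritIO()           // show the batch/ffmpeg output
                        .start();
                p.waitFor();
            } catch (Exception e) {
                e.printStackTrace();           // log failures instead of losing them
            }
        }, 0, 5, TimeUnit.MINUTES);
    }
}
```

    On Windows, the Task Scheduler (or a Windows Service) is the more conventional home for this kind of job than a hand-rolled always-running console app, since it handles restarts and logging for you.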
  • Is it possible to send a temporary slate (image or video) into a running Azure Live Event RTMP stream?

    15 November 2020, by Brian Frisch

    I'm currently building a video streaming app which leverages Azure Media Services Live Events.

    It consists of:

      1. a mobile app that can stream live video;
      2. a web client that plays the live event video;
      3. a producer screen with controls to start and stop the web client's access to the video;
      4. a server that handles various operations around the entire system.

    It's working very well, but I would like to add a feature that would let the producer add some elegance to the experience. I'm therefore trying to work out how the producer could switch the incoming source of the stream to a pre-recorded video, or even a still image, at any point during the recording, and then switch back to live video. A kill-switch of sorts: it would cover waiting time if there are technical difficulties on set, and it could also be used for pre-/post-roll branding slates when introing and outroing a video event. I would like this source switch to be embedded in the video stream itself (so that it could also end up in the final video product if I need it in an archive for later playback).

    I'm trying to do it in a way where the producer can set a timestamp for when the video override should come in, and when it should stop. Then I want my server to respond to these timestamps and send the instructions over RTMP to the Azure Live Event. Is it possible to send such an instruction ("Hey, play this video bit / show this image in the stream for x seconds") in the RTMP protocol? I've tried to figure it out, and I've read about SCTE-35 markers and such, but I have not been able to find any examples of how to do it, so I'm a bit stuck.

    My plan B is to make it possible to stream an image from the mobile application that already handles the live video stream, but I'm initially targeting an architecture where the mobile app is unaware of anything other than live streaming, and this override switch should preferably be handled by the server, which is a Firebase Functions setup.

    If you can see other ways of doing it, I'm all ears.

    I've already tried to build a ffmpeg method that listens to updates to the producer-set state, and then streams an image to the same RTMP-url that the video goes to from the mobile app. But it only works when the live video isn't already streaming - it seems like I cannot take over a RTMP-stream when it's already running.