
Media (91)

Other articles (73)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database, named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
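
    As a rough illustration of the structure described above (a sketch only; the remaining fields are truncated in the excerpt, and the real table is created by SPIPmotion inside SPIP's own database):

    import sqlite3

    # Illustrative schema mirroring only the fields named in the excerpt.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE spip_spipmotion_attentes (
            id_spipmotion_attente INTEGER PRIMARY KEY,  -- unique id of the task to process
            id_document           INTEGER,              -- id of the original document to encode
            id_objet              INTEGER,              -- id of the object the encoded document is attached to
            objet                 TEXT                  -- type of that object
        )
    """)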

  • Contribute to its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
    At the moment MediaSPIP is only available in French and (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

On other sites (8188)

  • ffmpeg won't execute properly in google app engine standard nodejs

    3 September 2019, by tommyc38

    I have tried for three full days to get GAE (standard - nodejs) to run a simple video transcoder from MOV to MP4 using ffmpeg. I have tried using fluent-ffmpeg, kicking off a child process (e.g. spawn), and nothing works. As soon as it hits the call to the executable, it always errors. I have confirmed ffmpeg is installed and have even tried ffmpeg-static. Moreover, I have it working on my local machine with no problems (using all of the aforementioned approaches).

    I have also tried logging the errors, but nothing is really all that helpful. I can see it working through any installed package, including ffmpeg (a system package).

    Below is the pseudocode; step three is where the problem occurs. A short sketch of these steps, written in Python for illustration, follows the list.

    1. Send file name to GAE endpoint
    2. Download the file from google cloud storage to a temp file
    3. Transcode using ffmpeg
    4. Upload temp file to google cloud storage
    5. Remove old google cloud storage file
    6. Remove temp file
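
    A minimal sketch of the steps above (in Python purely for illustration; the question itself targets Node.js, and the bucket/file names are placeholders), assuming the google-cloud-storage client and an ffmpeg binary available in the runtime:

    import os
    import subprocess
    import tempfile

    from google.cloud import storage


    def transcode(bucket_name, src_name, dst_name):
        bucket = storage.Client().bucket(bucket_name)

        # Steps 1-2: download the original to a temp file (on GAE standard this is /tmp, which is RAM-backed).
        src_path = os.path.join(tempfile.gettempdir(), "input.mov")
        dst_path = os.path.join(tempfile.gettempdir(), "output.mp4")
        bucket.blob(src_name).download_to_filename(src_path)

        # Step 3: transcode MOV to MP4 with the ffmpeg binary.
        subprocess.run(["ffmpeg", "-y", "-i", src_path, dst_path], check=True)

        # Steps 4-5: upload the result and delete the old object.
        bucket.blob(dst_name).upload_from_filename(dst_path)
        bucket.blob(src_name).delete()

        # Step 6: remove the temp files.
        os.remove(src_path)
        os.remove(dst_path)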

    The file I am using to test is 6 MB: a 5-second video I took on my iPhone. Thank you in advance.

    UPDATE: I successfully deployed the exact same code to the Node Flex environment and everything works great. I wasn't able to get any errors in the standard environment that pointed me where to look, but my guess is that it has something to do with how the file I pipe into FFMPEG is stored on GAE Node Standard. The docs say it's a virtual file system that uses RAM. I'd love to hear if anybody has managed to get this working in the standard environment.

  • How to allow a worker to run an ffmpeg command on Heroku for my Python/Django app?

    10 March 2013, by GetItDone

    I've been stuck trying to figure this out for weeks. I previously asked a similar question found here, but I never got any replies. I really cannot find any good documentation anywhere. All I need to do is use a worker (I don't care which; I have both django-celery and rq installed) to convert a file to FLV when it is uploaded from a form. I was able to get this done easily locally, but after over a week I haven't been able to get it to work no matter what I have tried. I tried adding a tasks.py file for celery, or a worker.py file for rq, and I have no idea what else (if anything) needs to be done, such as in my settings.py or Procfile. My Procfile looks like this:

    web: gunicorn lftv.wsgi -b 0.0.0.0:$PORT
    celeryd: celery -A tasks worker --loglevel=info
    worker: python worker.py

    My requirements.txt, showing what I have installed, looks like this:

    Django==1.4.3
    Logbook==0.4.1
    amqp==1.0.6
    anyjson==0.3.3
    billiard==2.7.3.19
    boto==2.6.0
    celery==3.0.13
    celery-with-redis==3.0
    distribute==0.6.31
    dj-database-url==0.2.1
    django-celery==3.0.11
    django-s3-folder-storage==0.1
    django-storages==1.1.6
    gunicorn==0.16.1
    kombu==2.5.4
    pil==1.1.7
    psycopg2==2.4.5
    python-dateutil==1.5
    pytz==2012j
    redis==2.7.2
    requests==1.1.0
    rq==0.3.2
    six==1.2.0
    times==0.6

    The only relevant parts of my settings.py are as follows:

    BROKER_BACKEND = 'django'
    BROKER_URL = #For this I copy/pasted the code from my redistogo add-on from heroku. Not sure if correct
    BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 1800}
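
    If it helps, a common pattern here (a sketch, not taken from the question) is to read the broker URL from the environment variable that the Redis To Go add-on sets, instead of pasting the value in:

    import os

    # Redis To Go on Heroku publishes its connection string as REDISTOGO_URL;
    # fall back to a local Redis instance for development.
    BROKER_URL = os.environ.get('REDISTOGO_URL', 'redis://localhost:6379/0')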

    Without trying to take up too much more space, my tasks.py looks like this:

    import subprocess

    from celery import task  # celery 3.x task decorator (missing from the original snippet)


    @task
    def ffmpeg_conversion(input_file):
        # The view passes a full command string, so run it through the shell.
        converted_file = subprocess.call(input_file, shell=True)
        return converted_file

    I use S3 to store my static and media files, and the upload works (files are added to my bucket); however, no matter what I try, the conversion never runs. Is there a good tutorial for absolute beginners? I followed the Heroku Redis tutorial, the Celery docs, the rq docs, and whatever else I could find, and got the examples to work, but the worker will not execute the command from my view. For example, one of the many things I tried:

    ...
    ffmpeg = "ffmpeg -i %s -acodec mp3 -ar 22050 -f flv -s 320x240 %s" % (sourcefile, targetfile)
    ffmpegresult = ffmpeg_conversion.delay(ffmpeg)
    ...

    or using rq

    ...
    q = Queue(connection=conn)
    result = q.enqueue(ffmpeg_conversion, ffmpeg)
    ...
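
    For the rq route, the worker.py that the Procfile launches is usually the small boilerplate from the rq documentation; a sketch, again assuming Redis To Go provides REDISTOGO_URL:

    import os

    import redis
    from rq import Connection, Queue, Worker

    # Queues this worker listens on.
    listen = ['default']

    # Redis To Go exposes its URL as REDISTOGO_URL; fall back to a local Redis.
    redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
    conn = redis.from_url(redis_url)

    if __name__ == '__main__':
        with Connection(conn):
            Worker(list(map(Queue, listen))).work()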

    It seems like it should be simple; however, I am completely self-taught and have never deployed a project before, and there just doesn't seem to be any good documentation or tutorial for what I am trying to do. I can't judge whether I am completely off base and missing something significant, or relatively close to getting this to work. I really do appreciate any input; this is driving me nuts. Thanks in advance.

  • Streaming webm with ffmpeg/ffserver

    20 October 2014, by Mediocre Gopher

    I’m attempting to cast my desktop screen to an ffserver and stream it as a webm. I’m using the following ffserver configuration:

    <Feed feed1.ffm>            # This is the input feed where FFmpeg will send
      File ./feed1.ffm            # video stream.
      FileMaxSize 1G              # Maximum file size for buffering video
      ACL allow 127.0.0.1
      ACL allow localhost
    </Feed>

    <stream>              # Output stream URL definition
      Feed feed1.ffm              # Feed from which to receive video
      Format webm

      # Audio settings
      AudioCodec vorbis
      AudioBitRate 64             # Audio bitrate

      # Video settings
      VideoCodec libvpx
      VideoSize 720x576           # Video resolution
      VideoFrameRate 25           # Video FPS

      AVOptionVideo cpu-used 10
      AVOptionVideo qmin 10
      AVOptionVideo qmax 42
      AVOptionVideo quality good
      AVOptionAudio flags +global_header
      PreRoll 15
      StartSendOnKey
      VideoBitRate 400            # Video bitrate
    </stream>

    And the following command on my desktop:

    ffmpeg -f x11grab -r 25 -s 1280x800 -i :0.0 -f alsa -i pulse http://127.0.0.1:8090/feed1.ffm

    ffmpeg is version 2.4.2 with libvpx enabled (the latest on Arch). On the client side I get the error:

    [libvpx @ 0x20a21a0] CQ level 0 must be between minimum and maximum quantizer value (10-42)

    As far as I can tell from calling ffmpeg -h full, there’s no way of setting cq-level, and setting qmin to 0 doesn’t work (it ends up as 3 for some reason; I guess ffmpeg enforces a minimum).

    This configuration seems to have worked for others on the internet, but I can’t see how, if cq-level defaults to 0. If anyone has any ideas I’d really appreciate it.
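
    One thing worth trying (an assumption on my part, not something stated above): libvpx exposes its constant-quality level through the crf codec option, so it can be moved into the allowed range through the same AVOptionVideo mechanism the configuration already uses for qmin and qmax, e.g.:

    AVOptionVideo crf 10        # keep the CQ level inside the 10-42 qmin/qmax window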