
Media (2)


Other articles (13)

  • Supporting all media types

    13 April 2011, by

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Enhancing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and skeletons (squelettes). Skeletons define how information is laid out on the page, defining a specific use of the platform, while themes define the overall graphic design.
    Anyone can propose a new graphic theme or skeleton and make it available to the community.

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

On other sites (3029)

  • reduce bitrate using ffmpeg

    27 May 2017, by WantIt

    I have this ffmpeg command to reduce the bitrate:

    ffmpeg -y -i input.mp4 -b:v 1024k output_1024k.mp4

    where we want to take the input file and just reduce its bitrate (to 1024k). It works on the command line.
    Now, I'd like to reduce the bitrate via Java. I decided to use bytedeco/javacv's FFmpegFrameGrabber and FFmpegFrameRecorder to perform the same function. Below is my code:

    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("input.mp4");
    grabber.start();

    FrameRecorder recorder = new FFmpegFrameRecorder("out.mp4", grabber.getAudioChannels());
    recorder.setSampleRate(256);
    recorder.start();

    Frame frame;
    while ((frame = grabber.grabFrame()) != null) {
        recorder.record(frame);
    }
    recorder.stop();
    grabber.stop();

    but it throws the following error:

    org.bytedeco.javacv.FrameRecorder$Exception: avcodec_open2() error -22: Could not open audio codec.
       at org.bytedeco.javacv.FFmpegFrameRecorder.startUnsafe(FFmpegFrameRecorder.java:732)
       at org.bytedeco.javacv.FFmpegFrameRecorder.start(FFmpegFrameRecorder.java:351)
       at com.goxhere.api.monte.resources.VideoResource.uploadFile(VideoResource.java:337)
       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
       at java.lang.reflect.Method.invoke(Method.java:498)
       at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:74)
       at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:144)
       at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:161)
       at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:247)
       at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:99)
       at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:388)
       at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:346)
       at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:102)
       at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:337)
       at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
       at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
       at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
       at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
       at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
       at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:280)
       at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:316)
       at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1084)
       at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:418)
       at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:372)
       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:389)
       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:342)
       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:229)
       at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)
       at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
       at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
       at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)
       at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)
       at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198)
       at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96)
       at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:478)
       at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
       at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:80)
       at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:624)
       at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
       at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342)
       at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:799)
       at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
       at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:861)
       at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1455)
       at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
       at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
       at java.lang.Thread.run(Thread.java:748)

    Basically the error says I have a problem with the audio codec, but when I run the ffmpeg command line directly, it reduces the bitrate just fine. Are we missing some setup in the Java code? Thank you.
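
    As a hedged, untested sketch (not part of the original post): the avcodec_open2() error -22 (EINVAL) is consistent with starting the recorder with a 256 Hz sample rate and no video parameters. Copying the source settings from the grabber and only lowering the video bitrate is one way the setup might look; the method names are from the bytedeco/javacv FrameGrabber/FrameRecorder API, but the chosen values are assumptions:

    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;

    // Hedged sketch, not the original poster's code: reuse the source stream's
    // parameters and only lower the video bitrate to roughly 1024k.
    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("input.mp4");
    grabber.start();

    FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(
            "out.mp4",
            grabber.getImageWidth(),
            grabber.getImageHeight(),
            grabber.getAudioChannels());
    recorder.setFormat("mp4");
    recorder.setFrameRate(grabber.getFrameRate());
    recorder.setSampleRate(grabber.getSampleRate());    // e.g. 44100, not 256
    recorder.setAudioBitrate(grabber.getAudioBitrate());
    recorder.setVideoBitrate(1024 * 1000);               // target video bitrate in bit/s
    recorder.start();

    Frame frame;
    while ((frame = grabber.grabFrame()) != null) {
        recorder.record(frame);
    }
    recorder.stop();
    grabber.stop();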

  • avfilter/vf_dnn_processing : add a generic filter for image processing with dnn networks

    31 October 2019, by Guo, Yejun
    avfilter/vf_dnn_processing : add a generic filter for image processing with dnn networks
    

    This filter accepts all the dnn networks which do image processing.
    Currently, frames in the rgb24 and bgr24 formats are supported. Other
    formats such as gray and YUV will be supported next. The dnn network
    can accept data in float32 or uint8 format, and it can change the
    frame size.

    The following is a python script that halves the value of the first
    channel of each pixel. It demos how to set up and execute a dnn model
    with python+tensorflow. It also generates the .pb file which will be
    used by ffmpeg.

    import tensorflow as tf
    import numpy as np
    import imageio
    in_img = imageio.imread('in.bmp')
    in_img = in_img.astype(np.float32)/255.0
    in_data = in_img[np.newaxis, :]
    filter_data = np.array([0.5, 0, 0, 0, 1., 0, 0, 0, 1.]).reshape(1,1,3,3).astype(np.float32)
    filter = tf.Variable(filter_data)
    x = tf.placeholder(tf.float32, shape=[1, None, None, 3], name='dnn_in')
    y = tf.nn.conv2d(x, filter, strides=[1, 1, 1, 1], padding='VALID', name='dnn_out')
    sess=tf.Session()
    sess.run(tf.global_variables_initializer())
    output = sess.run(y, feed_dict={x: in_data})
    graph_def = tf.graph_util.convert_variables_to_constants(sess, sess.graph_def, ['dnn_out'])
    tf.train.write_graph(graph_def, '.', 'halve_first_channel.pb', as_text=False)
    output = output * 255.0
    output = output.astype(np.uint8)
    imageio.imsave("out.bmp", np.squeeze(output))

    To do the same thing with ffmpeg:
    - generate halve_first_channel.pb with the above script
    - generate halve_first_channel.model with tools/python/convert.py
    - try the following commands:
    ./ffmpeg -i input.jpg -vf dnn_processing=model=halve_first_channel.model:input=dnn_in:output=dnn_out:fmt=rgb24:dnn_backend=native -y out.native.png
    ./ffmpeg -i input.jpg -vf dnn_processing=model=halve_first_channel.pb:input=dnn_in:output=dnn_out:fmt=rgb24:dnn_backend=tensorflow -y out.tf.png

    Signed-off-by: Guo, Yejun <yejun.guo@intel.com>
    Signed-off-by: Pedro Arthur <bygrandao@gmail.com>

    • [DH] configure
    • [DH] doc/filters.texi
    • [DH] libavfilter/Makefile
    • [DH] libavfilter/allfilters.c
    • [DH] libavfilter/vf_dnn_processing.c
  • iPhone video conversion to Android supported format using FFMPEG

    21 January 2016, by Muhammad Umar

    I have an iPhone video, created as follows:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'vid.mp4':
     Metadata:
       major_brand     : qt  
       minor_version   : 0
       compatible_brands: qt  
       creation_time   : 2016-01-20 09:21:08
       make            : Apple
       make-eng        : Apple
       encoder         : 8.2
       encoder-eng     : 8.2
       date            : 2016-01-20T14:20:57+0500
       date-eng        : 2016-01-20T14:20:57+0500
       model           : iPhone 5
       model-eng       : iPhone 5
     Duration: 00:00:10.80, start: 0.000000, bitrate: 765 kb/s
       Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 62 kb/s (default)
       Metadata:
         creation_time   : 2016-01-20 09:21:08
         handler_name    : Core Media Data Handler
       Stream #0:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, bt709), 320x320, 696 kb/s, 30 fps, 30 tbr, 600 tbn, 1200 tbc (default)
       Metadata:
         creation_time   : 2016-01-20 09:21:08
         handler_name    : Core Media Data Handler
         encoder         : H.264

    I want to convert it into an Android-supported format so that all iPhones and as many Android devices as possible can read it.

    However, if I convert to the mpeg4 format using

    ffmpeg -i vid.mp4 -s 320x320 -vcodec mpeg4 -acodec aac -strict -2 -ac 2 -ar 44100 -ab 128k output.mp4

    it really degrades the quality.

    How can I set up ffmpeg to produce quality as close as possible to the original, in a format that plays on both Android and iPhone?
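
    As a hedged suggestion rather than a definitive answer: re-encoding with libx264 in constant-quality (CRF) mode, a baseline profile for older Android devices, and AAC audio is a common approach. The flags below are standard ffmpeg options, but the exact values (crf 20, level 3.0) are assumptions to tune:

    ffmpeg -i vid.mp4 -c:v libx264 -profile:v baseline -level 3.0 -crf 20 -preset medium -c:a aac -b:a 128k -ar 44100 -movflags +faststart output.mp4

    Lowering the CRF value raises quality (and file size); on older ffmpeg builds the AAC encoder may still require -strict -2 as in the original command.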