
Media (1)
-
Revolution of Open-source and film making towards open film making
6 October 2011
Updated: July 2013
Language: English
Type: Text
Other articles (107)
-
Request to create a channel
12 March 2010. Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel: the first at registration time, the second after registration, by filling in a request form.
Both methods ask for the same information and work in roughly the same way: the prospective user must fill in a series of form fields that first of all give the administrators information about (...)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
The SPIPmotion queue
28 November 2010. A queue stored in the database.
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
On other sites (12256)
-
What is an easy way to call an FFmpeg executable compiled for Android from Java code?
30 May 2017, by dentex
I have compiled FFmpeg for Android to suit my needs in terms of codecs, muxers, etc.
Now I have an executable that, from what I understand, should be placed in my project dir under /external//data/data//app_opt. What I have inside app_opt now is:
.
├── bin
│ └── ffmpeg
├── include
│ ├── libavcodec
│ │ ├── avcodec.h
│ │ ├── avfft.h
│ │ ├── dxva2.h
│ │ ├── vaapi.h
│ │ ├── vda.h
│ │ ├── vdpau.h
│ │ ├── version.h
│ │ └── xvmc.h
│ ├── libavdevice
│ │ └── avdevice.h
│ ├── libavfilter
│ │ ├── asrc_abuffer.h
│ │ ├── avcodec.h
│ │ ├── avfiltergraph.h
│ │ ├── avfilter.h
│ │ ├── buffersink.h
│ │ ├── buffersrc.h
│ │ ├── version.h
│ │ └── vsrc_buffer.h
│ ├── libavformat
│ │ ├── avformat.h
│ │ ├── avio.h
│ │ └── version.h
│ ├── libavutil
│ │ ├── adler32.h
│ │ ├── aes.h
│ │ ├── attributes.h
│ │ ├── audioconvert.h
│ │ ├── audio_fifo.h
│ │ ├── avassert.h
│ │ ├── avconfig.h
│ │ ├── avstring.h
│ │ ├── avutil.h
│ │ ├── base64.h
│ │ ├── bprint.h
│ │ ├── bswap.h
│ │ ├── common.h
│ │ ├── cpu.h
│ │ ├── crc.h
│ │ ├── dict.h
│ │ ├── error.h
│ │ ├── eval.h
│ │ ├── fifo.h
│ │ ├── file.h
│ │ ├── imgutils.h
│ │ ├── intfloat.h
│ │ ├── intfloat_readwrite.h
│ │ ├── intreadwrite.h
│ │ ├── lfg.h
│ │ ├── log.h
│ │ ├── lzo.h
│ │ ├── mathematics.h
│ │ ├── md5.h
│ │ ├── mem.h
│ │ ├── opt.h
│ │ ├── parseutils.h
│ │ ├── pixdesc.h
│ │ ├── pixfmt.h
│ │ ├── random_seed.h
│ │ ├── rational.h
│ │ ├── samplefmt.h
│ │ ├── sha.h
│ │ ├── timecode.h
│ │ └── timestamp.h
│ ├── libpostproc
│ │ └── postprocess.h
│ ├── libswresample
│ │ └── swresample.h
│ └── libswscale
│ └── swscale.h
├── lib
│ ├── libavcodec.a
│ ├── libavdevice.a
│ ├── libavfilter.a
│ ├── libavformat.a
│ ├── libavutil.a
│ ├── libpostproc.a
│ ├── libswresample.a
│ ├── libswscale.a
│ └── pkgconfig
│ ├── libavcodec.pc
│ ├── libavdevice.pc
│ ├── libavfilter.pc
│ ├── libavformat.pc
│ ├── libavutil.pc
│ ├── libpostproc.pc
│ ├── libswresample.pc
│ └── libswscale.pc
└── share
└── ffmpeg
├── examples
│ ├── decoding_encoding.c
│ ├── filtering_audio.c
│ ├── filtering_video.c
│ ├── Makefile
│ ├── metadata.c
│ └── muxing.c
├── ffprobe.xsd
├── libvpx-1080p50_60.ffpreset
├── libvpx-1080p.ffpreset
├── libvpx-360p.ffpreset
├── libvpx-720p50_60.ffpreset
├── libvpx-720p.ffpreset
├── libx264-ipod320.ffpreset
└── libx264-ipod640.ffpreset
Do I need to take just the ffmpeg binary under bin and place it in my project’s /res/raw dir? And what is the easiest way to call ffmpeg and feed it a command string? I compiled FFmpeg with limited decoders and demuxers, because I only need audio extraction.
See: How can I get FFmpeg to locate installed libraries when --sysroot is pointing to another directory?
I would run it in the background and notify the user in the notification bar on completion.
I know that similar questions already exist here on SO, but they are a bit vague or confusing, at least for me. I realize that at this point I lack the competence (my app is, in fact, a jigsaw of Java code snippets from the Net that happen to work together).
I’d appreciate some guidance.
Thanks.
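A minimal sketch of one common approach to this kind of question, for reference only: ship the ffmpeg binary in res/raw (or assets), copy it at runtime into the app’s private files directory, mark it executable, and launch it with ProcessBuilder, passing each argument as a separate element so no shell quoting is needed. The class and method names below (FfmpegRunner, installBinary, run) are illustrative rather than taken from the questioner’s project, and the snippet assumes an API level with try-with-resources support.

import java.io.BufferedReader;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

import android.content.Context;
import android.util.Log;

public class FfmpegRunner {

    // Copy the ffmpeg binary shipped as a raw resource into the app's private
    // files directory and mark it executable; resources cannot be run in place.
    public static File installBinary(Context context, int rawResId) throws IOException {
        File target = new File(context.getFilesDir(), "ffmpeg");
        if (!target.exists()) {
            try (InputStream in = context.getResources().openRawResource(rawResId);
                 FileOutputStream out = new FileOutputStream(target)) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
            target.setExecutable(true); // equivalent of "chmod u+x ffmpeg"
        }
        return target;
    }

    // Run ffmpeg with the given arguments and return its exit code.
    // Each argument is a separate array element, so no shell quoting is involved.
    public static int run(File ffmpeg, String... args) throws IOException, InterruptedException {
        String[] command = new String[args.length + 1];
        command[0] = ffmpeg.getAbsolutePath();
        System.arraycopy(args, 0, command, 1, args.length);

        Process process = new ProcessBuilder(command)
                .redirectErrorStream(true) // merge stderr into stdout
                .start();

        // Drain the output so the process cannot block on a full pipe buffer.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                Log.d("ffmpeg", line);
            }
        }
        return process.waitFor();
    }
}

For the audio-only use case described, a call might look like run(bin, "-y", "-i", inputPath, "-vn", "-acodec", "copy", outputPath), where the paths are whatever the app chooses. Running this from a background thread or service keeps the UI responsive, and the point after waitFor() returns is where a status-bar notification about completion can be posted.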
-
avcodec/hevc_sei: fix amount of bits skipped when reading picture timing SEI message
6 May 2017, by James Almer
The code was skipping the entire reported SEI message size regardless of
the amount of bits read.
While in theory safe for NALUs where the picture timing SEI message is alone
or at the end, as we're using the checked bitstream reader, it isn't in any
other situation, where every SEI message in the NALU after the picture
timing one would potentially fail to parse.
Change the function name to one more in line with the rest of the file, and
remove the bogus "Skipped SEI" debug message while at it.
Reviewed-by: Michael Niedermayer <michael@niedermayer.cc>
Signed-off-by: James Almer <jamrial@gmail.com>
-
How do I bind a specific wlan interface to an ffmpeg/ffplay call without modifying the source ?
5 mai 2017, par FalimondI have two wifi USB adapters connected to a Raspberry Pi running Raspbian jessie with their respective wlan0 and wlan1 interfaces set up. I can successfully use
wpa_supplicant
to connect to two individual, identical devices from which I can individually access a UDP video stream usingffplay
. Now I need to simultaneously bring up both UDP streams. This would be easy if the access URLs were different, but they are not and I cannot change the IP addresses, requiring another solution.Unless I am mistaken, there is no way to specify an interface in the call to
ffplay/ffmpeg
. I have looked through the FFmpeg source relevant to the UDP protocol and know that I can specify an interface in the appropriatesetsockopt
calls in libavformat/udp.c, but modifying the source in this way is rather involved and I’d like to avoid it if possible.I looked into adding namespaces using
ip
but this doesn’t seem like it will work because I can only bind the wireless hardware device phy0, not the separate wlan interfaces associated with phy0.Are there alternate means of dealing with such a situation where the UDP streams’ URLs are the same or am I stuck with modifying the FFmpeg source code ?