
Media (33)
-
Stereo master soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Audio
-
#7 Ambience
16 October 2011
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#4 Emo Creates
15 October 2011
Updated: February 2013
Language: English
Type: Audio
Other articles (40)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows media playback on major mobile platforms with the above (...)
-
From upload to the final video [standalone version]
31 January 2010
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information about the file's audio and video streams; generation of a thumbnail by extracting a (...) (see the command-line sketch after this article list)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used instead.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
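As a rough illustration of the two extra actions described in the SPIPMotion article above (retrieving the technical information of the audio and video streams, and generating a thumbnail), here is a minimal command-line sketch. It is not SPIPMotion's actual implementation; the file names, the 3-second seek point and the 320-pixel thumbnail width are placeholder choices:

# Retrieve the technical information of the audio and video streams as JSON
ffprobe -v error -show_format -show_streams -of json source_video.mp4 > source_info.json

# Generate a thumbnail by extracting a single frame a few seconds into the video
ffmpeg -ss 3 -i source_video.mp4 -frames:v 1 -vf "scale=320:-1" thumbnail.jpg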
On other sites (5901)
-
Circular Overlay of Video on Another Video using FFMPEG in Android
4 November 2018, by kdblue
I am trying to overlay a second video (inside a circle) on top of a first video. I tried, but I am getting glitches in the output video!
My command is:
command = "-i " + this.video1Path.getPath() + " -i " + this.video2Path.getPath() + " -filter_complex [1]trim=end_frame=1,geq=lum_expr='st(3,pow(X-(W/2),2)+pow(Y-(H/2),2));if(lte(ld(3),"
+ (this.mZoomLayout.getZoomedWidth()/2) + "*" + (this.mZoomLayout.getZoomedWidth()/2) + "),255,0)':128:128,format=gray,loop=-1:1,setpts=N/FRAME_RATE/TB[mask];[1][mask]alphamerge,format=rgba,lutrgb=a=if(gte(val\\,16)\\,val)[cutout];[0][cutout]overlay="
+ this.mZoomLayout.getCircleX() + ":" + this.mZoomLayout.getCircleY() + ":enable='between(t,0," + this.videoTwoDuration + ") -c:v libx264 -crf 24 -preset ultrafast " + videoPath.getPath();Example : i want like this
But I am getting glitches in the video:
I am using this FFmpeg Android library: https://github.com/bravobit/FFmpeg-Android
Note: I tried this link but it never worked:
https://stackoverflow.com/questions/42518592/circular-movie-overlay-in-ffmpeg (getting glitches)
FFmpeg log:
11-04 19:56:37.505 28420-28420/app.kdblue.com.ffmpegdemo E/ffmpeg Success: ffmpeg version n4.0-39-gda39990 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 4.9.x (GCC) 20150123 (prerelease)
configuration: --target-os=linux --cross-prefix=/root/bravobit/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/root/bravobit/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-ffprobe --enable-libopus --enable-libvorbis --enable-libfdk-aac --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-libvpx --enable-libass --enable-yasm --enable-pthreads --disable-debug --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-linux-perf --disable-doc --disable-shared --enable-static --enable-runtime-cpudetect --enable-nonfree --enable-network --enable-avresample --enable-avformat --enable-avcodec --enable-indev=lavfi --enable-hwaccels --enable-ffmpeg --enable-zlib --enable-gpl --enable-small --enable-nonfree --pkg-config=pkg-config --pkg-config-flags=--static --prefix=/root/bravobit/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/root/bravobit/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/root/bravobit/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-cxxflags=
libavutil 56. 14.100 / 56. 14.100
libavcodec 58. 18.100 / 58. 18.100
libavformat 58. 12.100 / 58. 12.100
libavdevice 58. 3.100 / 58. 3.100
libavfilter 7. 16.100 / 7. 16.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 1.100 / 5. 1.100
libswresample 3. 1.100 / 3. 1.100
libpostproc 55. 1.100 / 55. 1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/MixVideos/video1_1541341464579.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: isommp42
creation_time : 2018-11-04T14:24:29.000000Z
com.android.version: 8.1.0
com.android.manufacturer: OnePlus
com.android.model: ONE A2003
Duration: 00:00:04.15, start: 0.000000, bitrate: 9983 kb/s
Stream #0:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080, 9978 kb/s, SAR 1:1 DAR 16:9, 29.39 fps, 30 tbr, 90k tbn, 180k tbc (default)
Metadata:
rotate : 90
creation_time : 2018-11-04T14:24:29.000000Z
handler_name : VideoHandle
Side data:
displaymatrix: rotation of -90.00 degrees
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/MixVideos/video2_1541341478507.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: isommp42
creation_time : 2018-11-04T14:24:42.000000Z
com.android.version: 8.1.0
com.android.manufacturer: OnePlus
com.android.model: ONE A2003
Duration: 00:00:02.62, start: 0.000000, bitrate: 9833 kb/s
Stream #1:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080, 9824 kb/s, SAR 1:1 DAR 16:9, 29.73 fps, 30 tbr, 90k tbn, 180k tbc (default)
Metadata:
rotate : 90
creation_time : 2018-11-04T14:24:42.000000Z
handler_name : VideoHandle
Side data:
displaymatrix: rotation of -90.00 degrees
Stream mapping:
Stream #0:0 (h264) -> overlay:main
Stream #1:0 (h264) -> trim
Stream #1:0 (h264) -> alphamerge:main
overlay -> Stream #0:0 (libx264)
Press [q] to stop, [?] for help
[swscaler @ 0xed8a7000] No accelerated colorspace conversion found from yuv420p to rgba.
[libx264 @ 0xf2016c00] using SAR=1/1
[libx264 @ 0xf2016c00] using cpu capabilities: ARMv6 NEON
[libx264 @ 0xf2016c00] profile Constrained Baseline, level 4.0
[libx264 @ 0xf2016c00] 264 - core 152 r2851M ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=12 lo
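For comparison, a somewhat simpler filtergraph can build the circular cutout directly with a geq alpha expression instead of looping a single-frame mask and merging it with alphamerge. This is only a sketch under assumptions, not a verified fix for the glitches: the input paths and the overlay position 200:300 are placeholders; the circle radius is taken as half the inset's height; eof_action=pass simply lets the main video continue once the shorter inset ends; and geq is slow, so a pre-rendered PNG mask may be faster on a phone:

ffmpeg -i main.mp4 -i inset.mp4 -filter_complex \
"[1:v]format=yuva444p,\
geq=lum='p(X,Y)':a='if(lte(pow(X-W/2,2)+pow(Y-H/2,2),pow(H/2,2)),255,0)'[circ];\
[0:v][circ]overlay=x=200:y=300:eof_action=pass" \
-c:v libx264 -crf 24 -preset ultrafast out.mp4

If the command is run through the Android wrapper from the question rather than a shell, the double quotes and backslash line continuations would be dropped and the whole filtergraph passed as a single argument without spaces.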
-
Need help: Can I get ffmpeg to burn in the source timecode of my file?
6 November 2018, by Myles
I have a .mov file that contains original source timecode metadata, but I can't figure out a way to get ffmpeg to burn the original timecode into the picture.
If I open the original file in QuickTime Player, it displays the true timecode on the far left:
I can also see that ffprobe picks up the metadata when I run the following:
Command:
ffprobe -i test.mov -show_streams
Abbreviated result:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test.mov':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands: qt
creation_time : 2018-11-05T14:20:51.000000Z
timecode : 09:59:53:00
Duration: 00:16:37.64, start: 0.000000, bitrate: 1680 kb/s
So I can see that ffprobe is able to determine the start timecode of the file from its metadata. The question is: how do I pass that information into an ffmpeg command, so that the timecode seen by ffprobe is what gets used when I convert the file for timecode burn-in?
An example of a standard burnt-in timecode command would be this:
ffmpeg -i test.mov -vcodec libx264 -cmp 22 -vf
"drawtext=fontfile=DroidSansMono.ttf : timecode='09\:59\:53\:00' : r=25 :
x=(w-tw)/2 : y=h-(2*lh) : fontcolor=white : box=1 : boxcolor=0x00000099"
-y test_bitc.mov
The only problem there is that I've had to put the timecode in manually. I want the command to use the existing timecode metadata as the timecode input value, so the same command can be used on multiple files.
Does anyone know how to do this?
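One way to avoid hard-coding the timecode is to read the tag with ffprobe first and substitute it into drawtext. The following is a shell sketch rather than a single self-contained ffmpeg command: the file names are placeholders, r=25 is assumed to match the source frame rate, and if the timecode tag lives on a stream rather than on the container, stream_tags=timecode would be needed instead of format_tags=timecode:

# Read the container-level timecode tag that ffprobe reports
TC=$(ffprobe -v error -show_entries format_tags=timecode \
  -of default=noprint_wrappers=1:nokey=1 test.mov)

# Colons inside a drawtext option value must be escaped
TC_ESC=$(printf '%s' "$TC" | sed 's/:/\\:/g')

ffmpeg -i test.mov -vcodec libx264 -cmp 22 \
  -vf "drawtext=fontfile=DroidSansMono.ttf:timecode='$TC_ESC':r=25:x=(w-tw)/2:y=h-(2*lh):fontcolor=white:box=1:boxcolor=0x00000099" \
  -y test_bitc.mov

The frame rate could be read the same way (from ffprobe's r_frame_rate field) instead of hard-coding r=25.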
-
Matomo’s new story: our stronger vision for the future
31 October 2018, by Matthieu Aubry — Community
Over the past year, the team here at Matomo have been working on a very exciting project we’d love to share with you.
It’s to do with the impact we hope for Matomo to have.
As you all know, the world changes at too fast a pace. New technologies, new phones, new everything in the blink of an eye. That’s not what will be happening here.
Instead, we’d like to believe it’s a refresh. Taking stock of how far we’ve come, what we’ve achieved so far, and how far we still have to go.
So we’re rebranding.
The rebrand
Like a caterpillar emerging from a cocoon, we hope to be a reborn analytics butterfly.
As a result of some careful planning and reflection we’ll be updating our logo, website and reasserting our voice.
It’s our chance to look at ourselves in a new light. We are a mighty analytics platform and it should be known we’re comparable to the likes of Google Analytics 360.
Along with the refresh of imagery, we listened to your feedback about the confusion between our two identities, so we’re also taking this opportunity to unite both the business brand of Innocraft with the community brand Matomo into one website.
It makes it easier for people from all walks of life, either as individuals or in large companies, to see us as being able to get down to business with a powerful analytics tool, as well as think on behalf of our community.
We’re the same, but with slight changes in our appearance and a stronger vision for the future.
How far we’ve come …
When we started out, it was about building a community around a movement. From the beginning we were concerned about data ownership, privacy and all things that came with that.
With the help of our community and contributors, we turned Matomo (formerly Piwik) into the trusted #1 open source analytics tool it is today. We’re committed to our community. But we also need to do more.
We’ve been niche and happy staying small, but now we need to take action and start shouting far and wide about what we do.
We once said we need: “To create, as a community, the leading international open source digital analytics platform, that gives every user full control of their data.”
We believe we’ve done that, so we’ll take it one step further.
A web analytics revolution has begun …
Begun ?
The line signifies a new beginning.
This is us standing up and reasserting our voice.
Our new chapter.
The rebrand is our chance to show that, yes, the world is changing, but when it comes to privacy, there are matters meant to be sacred. Privacy is a human right.
What makes it worse in this ever-changing landscape, with data breaches and stolen information, is that losing control of our data is scary. We have a right to know what’s going on with our information, and this must start with us.
We know we need to champion this cause for privacy and data ownership.
We came together as a community and built something powerful, a free open-source analytics platform, that kept the integrity of the people using it.
It’s important for us now to feel more empowered to believe in our right to privacy, information and our ability to act independently of large corporations.
The time is here for us to speak up and take back control.
Once more, we need to come together to build something even more powerful, a safer online society.
Join us.
Sincerely,
Matthieu Aubry on behalf of the Matomo team