
Other articles (44)
-
No talk of markets, cloud, etc.
10 April 2011
The vocabulary used on this site tries to avoid any reference to the buzzwords that flourish so freely across web 2.0 and in the companies that make a living from it.
You are therefore invited to avoid terms such as "Brand", "Cloud", "Market", etc.
Our motivation is above all to create a simple tool, accessible to everyone, that encourages the sharing of creations on the Internet and lets authors keep as much autonomy as possible.
No "Gold or Premium contract" is therefore planned, no (...)
-
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed to extract the data needed for search-engine indexing, and is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
-
Improving the basic version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the ergonomics of multiple-selection fields. See the two images below for a comparison.
All you need to do is activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)
On other sites (6960)
-
texture rendering issue on iOS using OpenGL ES in Unity project
28 March 2016, by Time1ess
I’m working on a project, part of which is streaming video to my iPhone. Currently I use my laptop to create the video stream to my iPhone with ffmpeg. The shell command for the stream is below :
ffmpeg \
-f avfoundation -i "1" -s 1280*720 -r 29.97 \
-c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
-f mpegts udp://192.168.1.102:6666
With this, I successfully create my video stream.
In Unity, I want to decode the video stream to create a texture. Since I’m new to both ffmpeg and Unity, I went through some tutorials for each and followed them to create my link library. Some of the code is below (ask me if more is needed) :
In my library :
buffer alloc :
uint8_t *buffer;
int buffer_size;
buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);
buffer = (uint8_t *) av_malloc(buffer_size*sizeof(uint8_t));
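/* attach the allocated RGBA buffer to pFrameRGB's data / linesize fields */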
avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
VIEW_WIDTH, VIEW_HEIGHT);
getContext :
is->sws_ctx = sws_getContext
(
is->video_st->codec->width,
is->video_st->codec->height,
is->video_st->codec->pix_fmt,
VIEW_WIDTH,
VIEW_HEIGHT,
AV_PIX_FMT_RGBA,
SWS_BILINEAR,
NULL,
NULL,
NULL
);
sws_scale :
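/* scale/convert the decoded frame (codec pix_fmt and size) into pFrameRGB as RGBA at VIEW_WIDTH x VIEW_HEIGHT */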
sws_scale(
is->sws_ctx,
(uint8_t const * const *)pFrame->data,
pFrame->linesize,
0,
is->video_st->codec->height,
pFrameRGB->data,
pFrameRGB->linesize
);
texture render :
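/* invoked by Unity via GL.IssuePluginEvent (see the Update function below); uploads the converted RGBA frame into the native texture */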
static void UNITY_INTERFACE_API OnRenderEvent(int texID)
{
GLuint gltex = (GLuint)(size_t)(texID);
glBindTexture(GL_TEXTURE_2D, gltex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
glGetError();
return;
}
extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
{
return OnRenderEvent;
}
In Unity :
texture created :
private Texture2D texture;
private int texID;
texture = new Texture2D (width, height, TextureFormat.RGBA32, false);
texture.filterMode = FilterMode.Point;
texture.Apply ();
GetComponent<Renderer> ().material.mainTexture = texture;
texID = texture.GetNativeTexturePtr ().ToInt32();
update func :
void Update ()
{
GL.IssuePluginEvent(GetRenderEventFunc(), texID);
}
Video stream info :
Input #0, mpegts, from 'udp://0.0.0.0:6666':
Duration: N/A, start: 2.534467, bitrate: N/A
Program 1
Metadata:
service_name : Service01
service_provider: FFmpeg
Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
Leaving other details aside, my library works fine in the Unity simulator, but when I compiled all my libraries for arm64 and used the Xcode project that Unity created to build my app and ran it, I couldn’t get any texture rendered on my iPhone. I checked my network and I’m sure the data was reaching my iPhone; the debug log showed me that frames were being decoded successfully and that the OnRenderEvent function was being called. I’m confused and tried to find an answer on Stack Overflow, but as a beginner I couldn’t find one, so I’m asking for your help.
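One detail worth checking in the plugin code above : OnRenderEvent calls glGetError() but discards the result, so a GL failure on the device stays silent. A minimal sketch of the same upload that keeps the error code (LOG_ERROR is a hypothetical logging macro, not part of the original code) :
GLenum err;
glBindTexture(GL_TEXTURE_2D, gltex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
err = glGetError();
if (err != GL_NO_ERROR)
{
    /* err identifies the failing call, e.g. GL_INVALID_OPERATION or
       GL_INVALID_VALUE; LOG_ERROR stands in for whatever logging the plugin uses */
    LOG_ERROR("glTexSubImage2D failed: 0x%x", err);
}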
FYI :
Unity 5.3.2f1 Personal
Xcode 7.2.1
iOS 9.2.1
ffmpeg 3.0
-
Decoder return of av_find_best_stream vs. avcodec_find_decoder
7 October 2016, by Jason C
The docs for libav’s av_find_best_stream function (libav 11.7, Windows, i686, GPL) specify a parameter that can be used to receive a pointer to an appropriate AVCodec :
decoder_ret - if non-NULL, returns the decoder for the selected stream
There is also the avcodec_find_decoder function, which can find an AVCodec given an ID.
However, the official demuxing + decoding example uses av_find_best_stream to find a stream, but chooses to use avcodec_find_decoder to find the codec in lieu of av_find_best_stream’s codec return parameter :
ret = av_find_best_stream(fmt_ctx, type, -1, -1, NULL, 0);
...
stream_index = ret;
st = fmt_ctx->streams[stream_index];
...
/* find decoder for the stream */
dec = avcodec_find_decoder(st->codecpar->codec_id);
As opposed to something like :
ret = av_find_best_stream(fmt_ctx, type, -1, -1, &dec, 0);
My question is pretty straightforward : is there a difference between using av_find_best_stream’s return parameter vs. using avcodec_find_decoder to find the AVCodec ?
The reason I ask is that the example chose avcodec_find_decoder rather than the seemingly more convenient return parameter, and I can’t tell whether it did so for a specific reason. The documentation itself is a little spotty and disjointed, so it’s hard to tell whether things like this are done for an important reason or not. I can’t tell if the example is implying that it "should" be done that way, or if the example author did it for some more arbitrary personal reason.
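For reference, here is a minimal sketch (not taken from the question or from the FFmpeg example) of what relying on the decoder_ret parameter could look like, assuming the same fmt_ctx and type as above :
AVCodec *dec = NULL;
int ret = av_find_best_stream(fmt_ctx, type, -1, -1, &dec, 0);
if (ret < 0)
    return ret; /* AVERROR_STREAM_NOT_FOUND or AVERROR_DECODER_NOT_FOUND */
int stream_index = ret; /* av_find_best_stream returns the selected stream index */
AVStream *st = fmt_ctx->streams[stream_index];
/* dec should now be the decoder that avcodec_find_decoder(st->codecpar->codec_id)
   would return for this stream */
-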
Senior Software Engineer for Enterprise Analytics Platform
28 January 2016, by Matthieu Aubry — Jobs
We’re looking for a lead developer to work on the Piwik Analytics core platform software. We have some exciting challenges to solve and need you !
You’ll be working with both fellow employees and our open-source community. Piwik PRO staff lives in New Zealand, Europe (Poland, Germany) and in the U.S. We do the vast majority of our collaboration online.
We are a small, flexible team, so when you come aboard, you will play an integral part in engineering. As a leader you’ll help us to prioritise work and grow our community. You’ll help to create a welcoming environment for new contributors and set an example with your development practices and communications skills. You will be working closely with our CTO to build a future for Piwik.
Key Responsibilities
- Strong competency coding in PHP and JavaScript.
- Scaling existing backend system to handle ever increasing amounts of traffic and new product requirements.
- Outstanding communication and collaboration skills.
- Drive development and documentation of internal and external APIs (Piwik is an open platform).
- Help make our development practices better and reduce friction from idea to deployment.
- Mentor junior engineers and set the stage for personal growth.
Minimum qualifications
- 5+ years of experience in product development, security, usable interface design.
- 5+ years experience building successful production software systems.
- Strong competency in PHP5 and JavaScript application development.
- Skill at writing tests and reviewing code.
- Strong analytical skills.
Location
- Remote work position !
- or you can join us in our office based in Wellington, New Zealand or in Wrocław, Poland.
Benefits
- Competitive salary.
- Equity in Piwik PRO.
- Remote work is possible.
- Yearly meetup with the whole team abroad.
- Be part of a successful open source company and community.
- In our Wellington (NZ) and Wroclaw (PL) offices : snacks, coffee, nap room, Table football, Ping pong…
- Regular events.
- Great team of people.
- Exciting projects.
Learn more
Learn more about what it’s like to work on Piwik in our blog post
About Piwik
At Piwik and Piwik PRO we develop the leading open source web analytics platform, used by more than one million websites worldwide. Our vision is to help the world liberate their analytics data by building the best open alternative to Google Analytics.
The Piwik platform collects, stores and processes a lot of information : hundreds of millions of data points each month. We create intuitive, simple and beautiful reports that delight our users.
About Piwik PRO company
At Piwik PRO we’re solving hard problems with simple solutions that make our users and customers happy. We practise agile methodology, test-driven development and fast release cycles. Our backend is mostly built in modern PHP with a bit of Python. We use MySQL/MariaDB and Redis as data stores. Our frontend is built in JavaScript using AngularJS and jQuery. Our tools include GitHub, Travis CI, PhpStorm and Slack.
As a Lead Software Developer for Piwik PRO, you will be writing open source code that will run on more than 200,000 servers and be used in 200+ countries and 50 languages !
Apply online
To apply for this position, please Apply online here. We look forward to receiving your applications !