
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (82)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
MediaSPIP version 0.1 Beta
16 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all of the software dependencies on the server.
If you want to use this archive for an installation in "farm" mode, you will also need to make other modifications (...) -
User profiles
12 April 2011, by
Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is automatically created when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
The user can edit their profile from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)
On other sites (10768)
-
Thread safety of FFmpeg when using av_lockmgr_register
12 August 2013, by Stocastico
My application uses FFmpeg to read video streams. So far, I have ensured thread safety by defining my own global lock and tracking down all the methods inside the FFmpeg libraries that are not thread safe.
This makes the code a bit messy, so while looking for better ideas I found this answer, but apparently I couldn't make use of the suggestions.
I tried testing it in my own environment, but I always get a critical heap error. Here's the test code:
class TestReader
{
public:
TestReader( std::string sVid )
{
m_sVid = sVid;
m_cVidPtr.reset( new VideoReader() );
}
~TestReader()
{}
void operator() ()
{
readVideoThread();
}
private:
int readVideoThread()
{
m_cVidPtr->init( m_sVid.c_str() );
MPEGFrame::pointer cFramePtr;
for ( int i=0; i< 500; i++ )
{
cFramePtr = m_cVidPtr->getNextFrame();
}
return 0;
}
boost::shared_ptr<VideoReader> m_cVidPtr;
std::string m_sVid;
};
/*****************************************************************************/
int lockMgrCallback(void** cMutex, enum AVLockOp op)
{
if (nullptr == cMutex)
return -1;
switch(op)
{
case AV_LOCK_CREATE:
{
*cMutex = nullptr;
boost::mutex* m = new boost::mutex();
*cMutex = static_cast<void*>(m);
break;
}
case AV_LOCK_OBTAIN:
{
boost::mutex* m = static_cast<boost::mutex*>(*cMutex);
m->lock();
break;
}
case AV_LOCK_RELEASE:
{
boost::mutex * m = static_cast<boost::mutex*>(*cMutex);
m->unlock();
break;
}
case AV_LOCK_DESTROY:
{
boost::mutex * m = static_cast<boost::mutex*>(*cMutex);
delete m;
break;
}
default:
break;
}
return 0;
}
int testFFmpegMultiThread( std::string sVideo )
{
if ( ::av_lockmgr_register( &lockMgrCallback ) )
{
std::cout << "Could not initialize lock manager!" << std::endl;
return -1;
}
TestReader c1(sVideo);
TestReader c2(sVideo);
boost::thread t1( c1 );
boost::thread t2( c2 );
t1.join();
t2.join();
return 0;
}
The classes VideoReader and MPEGFrame are just wrappers and have always worked perfectly in single-threaded scenarios, or in multi-threaded scenarios managed using my own global lock.
Am I missing something obvious? Can anybody point me to some working code? Thanks in advance. -
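For reference, here is a minimal, self-contained sketch of the same lock-manager pattern, written against std::mutex from C++11 rather than boost::mutex (that substitution, and the names lockMgr and main, are my own and not part of the question). The essential points are that the callback is registered once before any other FFmpeg call, and that every AVLockOp case leaves *mtx in a consistent state. This assumes an FFmpeg version that still provides av_lockmgr_register; later releases deprecated and eventually removed this API.

// Hypothetical sketch of an FFmpeg lock-manager callback using std::mutex (C++11).
// Error handling is deliberately minimal.
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <mutex>

static int lockMgr(void **mtx, enum AVLockOp op)
{
    switch (op)
    {
    case AV_LOCK_CREATE:   // allocate a mutex and hand it back to FFmpeg
        *mtx = new std::mutex();
        return 0;
    case AV_LOCK_OBTAIN:   // FFmpeg asks to take the lock
        static_cast<std::mutex *>(*mtx)->lock();
        return 0;
    case AV_LOCK_RELEASE:  // FFmpeg releases the lock
        static_cast<std::mutex *>(*mtx)->unlock();
        return 0;
    case AV_LOCK_DESTROY:  // free the mutex created in AV_LOCK_CREATE
        delete static_cast<std::mutex *>(*mtx);
        *mtx = nullptr;
        return 0;
    }
    return 1;              // non-zero tells FFmpeg the operation failed
}

int main()
{
    // Register once, before any threads touch FFmpeg.
    if (av_lockmgr_register(&lockMgr) != 0)
        return 1;
    // ... start and join the reader threads here ...
    av_lockmgr_register(nullptr);  // passing NULL unregisters the lock manager
    return 0;
}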
Elacarte Presto Tablets
14 March 2013, by Multimedia Mike — General
I visited an Applebee’s restaurant this past weekend. The first thing I spied was a family at a table with what looked like a 7-inch tablet. It’s not an uncommon sight. However, as I moved through the restaurant, I noticed that every single table was equipped with such a tablet. It looked like this:
For a computer nerd like me, you could probably guess that I was far more interested in this gadget than in the cuisine. The thing said “Presto” on the front and “Elacarte” on the back. Putting this together, we get the website of Elacarte, the purveyors of this restaurant tablet technology. Months after the iPad was released in 2010, I remember stories about high-end restaurants showing their wine lists via iPads. This tablet goes well beyond that.
How was it? Well, confusing, mostly. The hostess told us we could order through the tablet or through her. Since we already knew what we wanted, she just manually took our order and presumably entered it into the system. So, right away, the question is: do we order through a human or through a computer? Or a combination? Do we have to use the tablet if we don’t want to?
Hardware
When picking up the tablet, it’s hard not to notice that it is very heavy. At first, I suspected that it was deliberately weighted down as some minor attempt at an anti-theft measure. But then I remembered what I know about power budgets of phones and tablets– powering the screen accounts for much of the battery usage. I realized that this device needs to drive the screen for about 14 continuous hours each day. I.e., the weight must come from a massive battery.
The screen is good. It’s a capacitive touchscreen, so nice and responsive. When I first spied the device, I felt certain it would be a resistive touchscreen (which is more accurately called a touch-and-press-down screen). There is an AC adapter on the side of the tablet. This is the only interface to the device:
That looks to me like an internal SATA connector (different from an eSATA connector). Foolishly, I didn’t have a SATA cable on me so I couldn’t verify.
User Interface
The interface options are: Order, Games, Neighborhood, and Pay. One big benefit of accessing the menu through the Order option is that each menu item can have a picture. For people who order more by picture than text description, this is useful. Rather, it would be, if more items had pictures. I’m not sure there were more pictures than seen in the print menu.
For Games, there were a variety of party games. The interface clearly stated that we got to play 2 free games. This implied to me that further games cost money. We tried one game briefly and the food came.
2 more options: Neighborhood– I know I dug into this option, but I forget what it was. Maybe it discussed local attractions. Finally, Pay. This thing has an integrated credit card reader. There is no integrated printer, though, so if you want a printed receipt, you will have to request one from a human.
Experience
So we ordered through a human since we didn’t feel like being thrust into this new paradigm when we just wanted lunch. The staff was obviously amenable to that. However, I got a chance to ask them a lot of questions about the particulars. Apparently, they have had this system for about 5 months. It was confirmed that the tablets do, in fact, have gargantuan batteries that have to last through the restaurant’s entire business hours. Do they need to be charged every night? Yes, they do. But how? The staff described several large charging blocks with many cables sprouting out. Reportedly, some units still don’t make it through the entire day.
When it was time to pay, I pressed the Pay button on the interface. The bill I saw had nothing in common with what we ordered (actually, it was cheaper, so perhaps I should have just accepted it). But I pointed it out to a human and they said that this happens sometimes. So they manually printed my bill. There was a dollar charge for the game that was supposed to be free. I pointed this out and they removed it. It’s minor, I know, but it’s still worth trying to work out these bugs.
One of the staff also described how a restaurant doesn’t need to employ as many people thanks to the tablet. She gave a nervous, awkward, self-conscious laugh when she said this. All I could think of was this Dilbert comic strip in which the boss realizes that his smartphone could perform certain key functions previously handled by his assistant.
Not A New Idea
Some people might think this is a totally new concept. It’s not. I was immediately reminded of my university days in Boulder, Colorado, USA, circa 1997. The local Taco Bell and Arby’s restaurants both had touchscreen ordering kiosks. Step up, interact with the (probably resistive) touchscreen, get a number, and step to the counter to change money, get your food, and probably clarify your order because there is only so much that can be handled through a touchscreen.
What I also remember is when they tore out those ordering kiosks, also circa 1997. I don’t know the exact reason. Maybe people didn’t like them. Maybe there were maintenance costs that made them not worth the hassle.
Then there are the widespread self-checkout lanes in grocery stores. Personally, I like those, though I know many don’t. However, this restaurant tablet thing hasn’t won me over yet. What’s the difference? Perhaps it’s that automated lanes at grocery stores require zero external assistance– at least, if you do everything correctly. Personally, I work well with these lanes because I can pretty much guess the constraints of the system and I am careful not to confuse the computer in any way. Until they deploy serving droids, or at least food conveyors, there still needs to be some human interaction, and I think the division between the human and computer roles is unintuitive in the restaurant case.
I don’t really care to return to the same restaurant. I’ll likely avoid any other restaurant that has these tablets. For some reason, I think I’m probably supposed to be the ideal consumer of this concept. But the idea will probably perform all right anyway. Elacarte’s website has plenty of graphs demonstrating that deploying these tablets is extremely profitable.
-
Decoding by libjpeg -> Encoding by x264, strange artefacts on frames
15 May 2013, by mmmaaak
I have a collection of JPEGs that must be decoded with libjpeg and then encoded with x264 (the encoded packets are then streamed via RTMP).
The code I use for decoding:
struct my_error_mgr
{
struct jpeg_error_mgr pub;
jmp_buf setjmp_buffer;
};
typedef my_error_mgr *my_error_ptr;
METHODDEF(void) my_error_exit (j_common_ptr cinfo)
{
my_error_ptr myerr = (my_error_ptr) cinfo->err;
(*cinfo->err->output_message) (cinfo);
longjmp(myerr->setjmp_buffer, 1);
}
void init_source(j_decompress_ptr ptr)
{
Q_UNUSED(ptr)
}
boolean fill_input_buffer(j_decompress_ptr ptr)
{
Q_UNUSED(ptr)
return TRUE;
}
void term_source(j_decompress_ptr ptr)
{
Q_UNUSED(ptr)
}
void skip_input_data(j_decompress_ptr ptr, long num_bytes)
{
if(num_bytes>0)
{
ptr->src->next_input_byte+=(size_t)num_bytes;
ptr->src->bytes_in_buffer-=(size_t)num_bytes;
}
}
EtherDecoder::EtherDecoder(QObject *parent):
QObject(parent)
{
}
void EtherDecoder::dataBlockReady(QByteArray data)
{
jpeg_decompress_struct decompressInfo;
jpeg_create_decompress(&decompressInfo);
my_error_mgr err;
decompressInfo.do_fancy_upsampling = FALSE;
decompressInfo.src = (jpeg_source_mgr *) (*decompressInfo.mem->alloc_small) ((j_common_ptr) &decompressInfo, JPOOL_PERMANENT, sizeof(jpeg_source_mgr));
decompressInfo.err = jpeg_std_error(&err.pub);
err.pub.error_exit = my_error_exit;
if (setjmp(err.setjmp_buffer))
{
jpeg_destroy_decompress(&decompressInfo);
return;
}
decompressInfo.src->init_source = init_source;
decompressInfo.src->resync_to_restart = jpeg_resync_to_restart;
decompressInfo.src->fill_input_buffer = fill_input_buffer;
decompressInfo.src->skip_input_data = skip_input_data;
decompressInfo.src->term_source = term_source;
decompressInfo.src->next_input_byte = reinterpret_cast<const JOCTET *>(data.data());
decompressInfo.src->bytes_in_buffer = data.size();
jpeg_read_header(&decompressInfo, TRUE);
jpeg_start_decompress(&decompressInfo);
int size = 0;
int n_samples = 0;
char *samples = new char[5242880];
char *reserv = samples;
while (decompressInfo.output_scanline < decompressInfo.output_height)
{
n_samples = jpeg_read_scanlines(&decompressInfo, (JSAMPARRAY) &samples, 1);
samples += n_samples * decompressInfo.image_width * decompressInfo.num_components;
size += n_samples * decompressInfo.image_width * decompressInfo.num_components;
}
jpeg_finish_decompress(&decompressInfo);
QByteArray output(reserv, size);
emit frameReady(output, decompressInfo.output_width, decompressInfo.output_height);
jpeg_destroy_decompress(&decompressInfo);
delete[] reserv;
}
When I emit the frameReady signal, I send the data to the encoder. The method where I initialize the encoder looks like this:
bool EtherEncoder::initEncoder(unsigned int width, unsigned int height)
{
x264_param_t param;
x264_param_default_preset(&param, "veryfast", "zerolatency");
param.i_width=width;
param.i_height=height;
param.i_frame_total=0;
param.i_csp=X264_CSP_I420;
param.i_timebase_num=1;
param.i_timebase_den=96000;
param.b_annexb=true;
param.b_repeat_headers=false;
x264_param_apply_fastfirstpass(&param);
x264_param_apply_profile(&param, "baseline");
_context=x264_encoder_open(&param);
if(!_context)
return false;
int nal_count;
x264_nal_t *nals;
if(x264_encoder_headers(_context, &nals, &nal_count)<0)
{
x264_encoder_close(_context);
_context=0;
return false;
}
_extradata=QByteArray();
_width=width;
_height=height;
if(nal_count>0)
{
_extradata=QByteArray(
(const char *)nals[0].p_payload,
nals[nal_count-1].p_payload+nals[nal_count-1].i_payload-nals[0].p_payload);
}
return true;
}
And the encoding method:
void EtherEncoder::onFrameReady(QByteArray data, int width, int height)
{
while(data.size()>0)
{
if(!_context && initEncoder(width, height))
{
_timestampDelta=realTimestamp();
}
if(_context)
{
x264_picture_t pic;
x264_picture_init(&pic);
pic.i_type=X264_TYPE_AUTO;
pic.i_pts=_timestampDelta*96000;
pic.img.i_csp=X264_CSP_I420;
pic.img.i_plane=3;
int planeSize = width*height;
uint8_t *p = (uint8_t*)data.data();
pic.img.plane[0]=p;
p+=planeSize;
pic.img.plane[1]=p;
p+=planeSize/4;
pic.img.plane[2]=p;
pic.img.i_stride[0]=width;
pic.img.i_stride[1]=width/2;
pic.img.i_stride[2]=width/2;
if(_forceKeyFrame)
{
pic.i_type=X264_TYPE_I;
_forceKeyFrame=false;
}
int nal_count;
x264_nal_t *nals;
int rc=x264_encoder_encode(_context, &nals, &nal_count, &pic, &pic);
if(rc>0)
{
_mutex.lock();
_packets.push_back(
Packet(
QByteArray(
(const char *)nals[0].p_payload, nals[nal_count-1].p_payload+nals[nal_count-1].i_payload-nals[0].p_payload),
_timestampDelta/96.0,
_timestampDelta/96.0,
pic.b_keyframe));
_timestampDelta+=40;
data.clear();
_mutex.unlock();
emit onPacketReady();
}
}
}
}
Decoding and encoding proceed without errors, and at the end I get a valid video stream, but it seems that in one of these steps I pass invalid data to the decoder or encoder. I only get a quarter of the image (the top-left part, as far as I understand), and it has wrong colors and some color stripes. Maybe I set invalid strides and planes when encoding a frame, or maybe the way I feed data to the libjpeg decoder is incorrect. Please ask questions about my code and I'll try to explain. It's melting my brain. Thank you.
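One plausible reading of the symptom: libjpeg normally returns packed RGB scanlines (unless out_color_space is changed), while the encoding code above treats the buffer as planar I420, which would fit a quarter-sized image with wrong colors. Below is a minimal sketch, not from the original post, of converting one packed RGB24 frame to I420 with libswscale before handing the planes to x264; the function name and buffer handling are hypothetical, and error checking is omitted.

// Hypothetical helper: convert one packed RGB24 frame to planar I420 (YUV420P)
// with libswscale, so the planes and strides match what x264 expects.
extern "C" {
#include <libswscale/swscale.h>
#include <libavutil/pixfmt.h>
}
#include <vector>
#include <cstdint>

static std::vector<uint8_t> rgb24ToI420(const uint8_t *rgb, int width, int height)
{
    std::vector<uint8_t> yuv(width * height * 3 / 2);      // Y plane + U/4 + V/4

    SwsContext *sws = sws_getContext(width, height, AV_PIX_FMT_RGB24,
                                     width, height, AV_PIX_FMT_YUV420P,
                                     SWS_BILINEAR, nullptr, nullptr, nullptr);

    const uint8_t *srcPlanes[1] = { rgb };
    int srcStrides[1] = { width * 3 };                      // packed RGB24 stride

    uint8_t *dstPlanes[3] = {
        yuv.data(),                                         // Y plane
        yuv.data() + width * height,                        // U plane
        yuv.data() + width * height * 5 / 4                 // V plane
    };
    int dstStrides[3] = { width, width / 2, width / 2 };

    sws_scale(sws, srcPlanes, srcStrides, 0, height, dstPlanes, dstStrides);
    sws_freeContext(sws);
    return yuv;
}

A buffer produced this way would line up with the plane pointers and strides that onFrameReady already assumes.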