2012-10-01 08:49:34

by Glauber Costa

Subject: Re: [PATCH v3 04/13] kmem accounting basic infrastructure

On 09/30/2012 02:37 PM, Tejun Heo wrote:
> Hello, James.
>
> On Sun, Sep 30, 2012 at 09:56:28AM +0100, James Bottomley wrote:
>> The beancounter approach originally used by OpenVZ does exactly this.
>> There are two specific problems, though, firstly you can't count
>> references in generic code, so now you have to extend the cgroup
>> tentacles into every object, an invasiveness which people didn't really
>> like.
>
> Yeah, it will need some hooks. For dentry and inode, I think it would
> be pretty well isolated tho. Wasn't it?
>

We would still need something for the stack, for open files, and for
everything else that becomes a potential problem. We would then end up
with 35 different knobs instead of one. One of the perceived advantages
of this approach is that it condenses as much data into a single knob
as possible, reducing complexity and excessive flexibility.
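
As a rough illustration of the single-knob idea, here is a minimal
userspace sketch (plain C, not kernel code; all names here are made up
for illustration): every kernel-memory consumer charges the same
per-group counter, so userspace tunes one limit instead of one knob per
object type.

/*
 * Hypothetical sketch of "single knob" kmem accounting.
 * Not the actual memcg interface.
 */
#include <stdbool.h>
#include <stdio.h>

struct kmem_group {
	long usage;	/* bytes of kernel memory currently charged */
	long limit;	/* the single knob userspace configures */
};

/* Any consumer -- stack pages, struct file, dentry -- calls the same hook. */
static bool kmem_charge(struct kmem_group *g, long bytes)
{
	if (g->usage + bytes > g->limit)
		return false;	/* over the one limit: fail the allocation */
	g->usage += bytes;
	return true;
}

static void kmem_uncharge(struct kmem_group *g, long bytes)
{
	g->usage -= bytes;
}

int main(void)
{
	struct kmem_group g = { .usage = 0, .limit = 1 << 20 };

	/* A stack allocation and a dentry allocation hit the same counter. */
	printf("stack charge ok: %d\n", kmem_charge(&g, 16384));
	printf("dentry charge ok: %d\n", kmem_charge(&g, 192));
	printf("usage now: %ld bytes\n", g.usage);
	kmem_uncharge(&g, 192);
	return 0;
}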


2012-10-03 22:59:41

by Tejun Heo

Subject: Re: [PATCH v3 04/13] kmem accounting basic infrastructure

Hello, Glauber.

On Mon, Oct 01, 2012 at 12:46:02PM +0400, Glauber Costa wrote:
> > Yeah, it will need some hooks. For dentry and inode, I think it would
> > be pretty well isolated tho. Wasn't it?
>
> We would still need something for the stack, for open files, and for
> everything else that becomes a potential problem. We would then end up
> with 35 different knobs instead of one. One of the perceived advantages
> of this approach is that it condenses as much data into a single knob
> as possible, reducing complexity and excessive flexibility.

Oh, I didn't mean to use object-specific counting for all of them.
Most resources don't have such a common misaccounting problem. I mean,
for the stack, it doesn't exist by definition (other than cgroup
migration). There's no reason to use anything other than first-use
kmem-based accounting for them. My point was that for particularly
problematic ones like dentry/inode, it might be better to treat them
differently.
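
To make the contrast concrete, here is a rough userspace sketch of
first-use accounting (illustrative names only, not the memcg API): the
group whose task first allocates an object is charged and stays
charged, which is exactly why shared caches like dentry/inode can end
up attributed to the "wrong" group and may deserve different treatment.

/*
 * Hypothetical sketch of first-use kmem accounting.
 * Not the actual memcg implementation.
 */
#include <stdio.h>

struct group {
	const char *name;
	long usage;
};

struct object {
	struct group *owner;	/* fixed at first allocation */
	long size;
};

/* First-use policy: charge whoever allocates, never re-charge later. */
static void alloc_object(struct object *obj, struct group *allocator, long size)
{
	obj->owner = allocator;
	obj->size = size;
	allocator->usage += size;
}

int main(void)
{
	struct group a = { "A", 0 }, b = { "B", 0 };
	struct object dentry;

	/* Group A faults the dentry in first, so A pays for it... */
	alloc_object(&dentry, &a, 192);

	/* ...even if group B is the one that keeps using the dentry later. */
	printf("A usage: %ld, B usage: %ld (owner: %s)\n",
	       a.usage, b.usage, dentry.owner->name);
	return 0;
}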

Thanks.

--
tejun