2003-03-08 18:40:31

by Ruth Forester

Subject: OLS2003 Performance BOF Proposals

Everyone,

I would very much appreciate comments (even one-liners) on any
community interest in these two OLS Performance BoF sessions. I
believe the topics are dissimilar and relevant enough to justify both:
------------------------------------------------------------------------
PROPOSAL FOR LINUX BENCHMARK AUTOMATION
This BOF will include a discussion of Linux benchmark automation. We
will discuss the features needed to provide an effective benchmark
automation process for Linux. This will include defining the
configuration, input files, benchmark execution, output files, etc. We
will also discuss the types of benchmarks that are tailored for rapid
execution and results analysis, for maximum development impact.
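
To give a feel for what I mean by automation, here is a minimal
harness sketch in Python (the config format, file names, and example
command are placeholders, not an existing tool):

#!/usr/bin/env python
# Rough sketch of a benchmark automation harness: read a small
# config file, run the benchmark command, capture its output, and
# record a timed result. File names and keys are placeholders.
import json
import subprocess
import time

def load_config(path):
    # Assumed config format: one "key = value" pair per line,
    # '#' starts a comment, e.g.  command = dbench 32
    config = {}
    for line in open(path):
        line = line.split('#', 1)[0].strip()
        if '=' in line:
            key, value = line.split('=', 1)
            config[key.strip()] = value.strip()
    return config

def run_benchmark(config):
    # Time a single benchmark run and keep its raw output.
    start = time.time()
    proc = subprocess.run(config['command'].split(),
                          capture_output=True, text=True)
    return {'command': config['command'],
            'elapsed_sec': time.time() - start,
            'returncode': proc.returncode,
            'stdout': proc.stdout}

if __name__ == '__main__':
    cfg = load_config('bench.conf')        # placeholder input file
    result = run_benchmark(cfg)
    with open('result.json', 'w') as out:  # placeholder output file
        json.dump(result, out, indent=2)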

PROPOSAL FOR LINUX PERFORMANCE
Linux changes occur very quickly in the open source community, so
there is a strong need to collect and share performance data and
analysis just as quickly. However, there may be instances where
good-quality performance data collection and analysis take longer
than the short turnaround required for maximum impact on newly
released patches. We plan to discuss the most effective methodology
for influencing Linux performance in this rapidly changing open
source environment.
------------------------------------------------------------------------
Please reply soon so we can quickly submit these to OLS.
Thanks for your (speedy) replies!

ruth
Ruth Forester, Linux Performance LTC
[email protected]
notes: [email protected]
IBM Linux Technology Center
Beaverton, Oregon


2003-03-11 01:02:10

by Craig Thomas

Subject: Re: OLS2003 Performance BOF Proposals

I think the two could be combined into one. The first seems to outline
how to run a benchmark (multiple runs, std dev, variance, etc.) and a
list of micro benchmarks. The second seems to handle the larger
performance tests, such as large database loads, very long-running
tests, etc.
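
To be concrete about the first part, this is the kind of multiple-run
summary I mean (a Python sketch; the run times are made-up numbers,
not measurements):

# Summarize N repeated benchmark runs with mean, sample variance,
# and standard deviation. The run times below are made-up numbers.
import statistics

runs = [142.1, 139.8, 141.5, 140.2, 143.0]  # seconds per run

mean = statistics.mean(runs)
var = statistics.variance(runs)   # sample variance
sd = statistics.stdev(runs)       # sample standard deviation

print("mean %.2fs  variance %.3f  stdev %.3fs (%.2f%% of mean)"
      % (mean, var, sd, 100.0 * sd / mean))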

I wonder if you could include a discussion of monitoring tools for
performance data collection as part of the first BOF? We have found
that on 2.5, the tools report different numbers when comparing output
from sysstat, vmstat, ziostat, and iostat.
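
For example, a quick way to eyeball the discrepancies is to grab one
sampling interval from each tool back to back (a sketch; it assumes
the tools are installed, and "1 2" means a 1-second interval with two
reports, where the second report reflects activity during the
interval):

# Capture one sampling interval from vmstat and iostat back to
# back so their numbers can be compared by eye.
import subprocess

for cmd in (['vmstat', '1', '2'], ['iostat', '1', '2']):
    out = subprocess.run(cmd, capture_output=True, text=True).stdout
    print('=== %s ===' % cmd[0])
    print(out)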


On Sat, 2003-03-08 at 10:51, Ruth Forester wrote:
> Everyone,
>
>
> I would very much appreciate comments (even one-liners) on any
> community interest in these two OLS Performance BoF sessions. I
> believe the topics are dissimilar and relevant enough to justify both:
>
> ----------------------------------------------------------------------
>
> PROPOSAL FOR LINUX BENCHMARK AUTOMATION
>
> This BOF will include a discussion of Linux benchmark automation. We
> will discuss the features needed to provide an effective benchmark
> automation process for Linux. This will include defining the
> configuration, input files, benchmark execution, output files, etc.
> We will also discuss the types of benchmarks that are tailored for
> rapid execution and results analysis, for maximum development impact.
>
>
> PROPOSAL FOR LINUX PERFORMANCE
>
> Linux changes occur very quickly in the open source community, so
> there is a strong need to collect and share performance data and
> analysis just as quickly. However, there may be instances where
> good-quality performance data collection and analysis take longer
> than the short turnaround required for maximum impact on newly
> released patches. We plan to discuss the most effective methodology
> for influencing Linux performance in this rapidly changing open
> source environment.
>
> --------------------------------------------------------
>
> Please reply soon so we can quickly submit these to OLS.
>
> Thanks for your (speedy) replies!
>
>
> ruth
>
> Ruth Forester, Linux Performance LTC
>
> [email protected]
>
> notes: [email protected]
>
> IBM Linux Technology Center
>
> Beaverton, Oregon
--
Craig Thomas <[email protected]>
OSDL