Christoph Hellwig wrote:
> On Wed, Jun 04, 2003 at 01:58:02AM +0100, P. Benie wrote:
>
>>- if (down_interruptible(&tty->atomic_write)) {
>>- return -ERESTARTSYS;
>>+ if (file->f_flags & O_NONBLOCK) {
>>+ if (down_trylock(&tty->atomic_write))
>>+ return -EAGAIN;
>>+ }
>>+ else {
>
>
> The else should be on the same line as the closing brace, else
> the patch looks fine.
I am in general agreement with those who feel we should have a common
standard for code formatting. There are particular places where it's
VERY important to maximize consistency and readability, such as function
headers.
But when do standards turn into nitpicks?
I personally always write else as you suggest, "} else {", but the way
the other fellow did it does not in any way hurt readability for me.
Yes, it does irritate me sometimes when people put the braces and else
on three different lines, but mostly because it reduces the amount of
code I can see at one time. But even then, it doesn't make it any less
readable to me.
I can see patches getting rejected because they violate function header
standards. That would make sense to me. But if the above patch were to
be rejected on the basis of the "else", I would be hard pressed to see
that as a valid justification.
Perhaps it would be good to have an explanation for the relative
importance of placing braces and else on the same line as compared to
other formatting standards.
On Wed, Jun 04, 2003 at 11:21:41AM -0400, Timothy Miller wrote:
>
> Perhaps it would be good to have an explanation for the relative
> importance of placing braces and else on the same line as compared to
> other formatting standards.
Please read Documentation/CodingStyle.
If you want more justification, read my OLS 2001 paper.
Hope this helps,
greg k-h
On Fri, 6 Jun 2003, Greg KH wrote:
> On Wed, Jun 04, 2003 at 11:21:41AM -0400, Timothy Miller wrote:
> >
> > Perhaps it would be good to have an explanation for the relative
> > importance of placing braces and else on the same line as compared to
> > other formatting standards.
>
> Please read Documentation/CodingStyle.
>
> If you want more justification, read my OLS 2001 paper.
I think you mean 2002. Good reading.
--
Alex Goddard
[email protected]
Greg KH wrote:
> On Wed, Jun 04, 2003 at 11:21:41AM -0400, Timothy Miller wrote:
>
>>Perhaps it would be good to have an explanation for the relative
>>importance of placing braces and else on the same line as compared to
>>other formatting standards.
>
>
> Please read Documentation/CodingStyle.
>
> If you want more justification, read my OLS 2001 paper.
>
Well, the coding style you propose isn't exactly the way I do things
(although it's pretty close). For instance, I'm not accustomed to using
a single tab character for indent. Since I have more experience with X11
internals than with Linux, you can see where my four-space-indent habit
comes from. No matter. I can easily adapt.
One thing I wanted to mention, however, is that your tongue-in-cheek
style doesn't help you. Coding style is something that needs to be
taken seriously when you're setting standards.
On Mon, 9 June 2003 12:24:35 -0400, Timothy Miller wrote:
>
> One thing I wanted to mention, however, is that your tongue-in-cheek
> style doesn't help you. Coding style is something that needs to be
> taken seriously when you're setting standards.
Coding style is secondary. It doesn't affect the compiled code. It's
that simple.
In the case of the kernel, there is quite a bit of horrible coding
style. But a working device driver for some hardware is always better
than no working device driver for that hardware, and if enforcing the
coding style more strictly results in scaring away some driver writers,
the style clearly loses.
Jörn
--
They laughed at Galileo. They laughed at Copernicus. They laughed at
Columbus. But remember, they also laughed at Bozo the Clown.
-- unknown
On Mon, 9 Jun 2003, Jörn Engel wrote:
> In the case of the kernel, there is quite a bit of horrible coding
> style. But a working device driver for some hardware is always better
> than no working device driver for that hardware, and if enforcing the
> coding style more strictly results in scaring away some driver writers,
> the style clearly loses.
There's no such thing as "horrible coding style", since coding style is
strictly personal. Whoever tries to convince you that one style is
better than another is simply wrong. Every reason they give you to
justify one style can be countered with opposite reasons. The only
horrible coding style is not respecting coding standards when you work
inside a project. Following the project's standard is a form of respect
for the other people working inside it; it gives the project code a
more professional look and lowers the fatigue of reading it. Jumping
among 24 different coding styles does not usually help this. I do not
believe professional developers can be scared away by a coding style,
if it is the coding style adopted by the project in which they have to
work.
- Davide
Davide Libenzi wrote:
> On Mon, 9 Jun 2003, Jörn Engel wrote:
>
>
>>In the case of the kernel, there is quite a bit of horrible coding
>>style. But a working device driver for some hardware is always better
>>than no working device driver for that hardware, and if enforcing the
>>coding style more strictly results in scaring away some driver writers,
>>the style clearly loses.
>
>
> There's no such thing as "horrible coding style", since coding style is
> strictly personal. Whoever tries to convince you that one style is
> better than another is simply wrong. Every reason they give you to
> justify one style can be countered with opposite reasons. The only
> horrible coding style is not respecting coding standards when you work
> inside a project.
I beg to differ: http://www0.us.ioccc.org/2001/anonymous.c ;)
Eli
--------------------. "If it ain't broke now,
Eli Carter \ it will be soon." -- crypto-gram
eli.carter(a)inet.com `-------------------------------------------------
On Mon, 9 Jun 2003, Eli Carter wrote:
> Davide Libenzi wrote:
> > On Mon, 9 Jun 2003, Jörn Engel wrote:
> >
> >
> >>In the case of the kernel, there is quite a bit of horrible coding
> >>style. But a working device driver for some hardware is always better
> >>than no working device driver for that hardware, and if enforcing the
> >>coding style more strictly results in scaring away some driver writers,
> >>the style clearly loses.
> >
> >
> > There's no such thing as "horrible coding style", since coding style is
> > strictly personal. Whoever tries to convince you that one style is
> > better than another is simply wrong. Every reason they give you to
> > justify one style can be countered with opposite reasons. The only
> > horrible coding style is not respecting coding standards when you work
> > inside a project.
>
> I beg to differ: http://www0.us.ioccc.org/2001/anonymous.c ;)
>
> Eli
Last I looked, we had a good example in the Buslogic SCSI driver.
However, just in case it's been changed, I submit herewith an
example of real code written by a "professional".
//
// This is an example of the kind of 'C' code that is being written
// by so-called experts. It is unreadable, illogical, but it works.
// I wish I was kidding! This is the junk I see being written right
// now by so-called professional programmers!
// Richard B. Johnson [email protected]
//
//
#include<stdio.h>
#define SuccessfulReturnValue 0
typedef int MainReturnType;
typedef int DefaultCounterType;
typedef void NothingWeCareAbout;
typedef const char StringThatIsntGoingToBeModified;
typedef char StringThatCanBeModified;
MainReturnType main(NothingWeCareAbout);
StringThatIsntGoingToBeModified MessageToBeWrittenToTheScreen[]={0x48,0x64,0x6e,0x6f,0x6b,0x25,0x71,0x68,0x7a,0x65,0x6e,0x2a,0x0c};
MainReturnType main(){
StringThatCanBeModified LocalStringBuffer[sizeof(MessageToBeWrittenToTheScreen)];
DefaultCounterType CharacterCounter;
for(CharacterCounter=0;CharacterCounter<sizeof(MessageToBeWrittenToTheScreen);CharacterCounter++)
LocalStringBuffer[CharacterCounter]=MessageToBeWrittenToTheScreen[CharacterCounter]^CharacterCounter;
puts(LocalStringBuffer);
return SuccessfulReturnValue;
}
Cheers,
Dick Johnson
Penguin : Linux version 2.4.20 on an i686 machine (797.90 BogoMips).
Why is the government concerned about the lunatic fringe? Think about it.
On Mon, 9 Jun 2003, Richard B. Johnson wrote:
> Last I looked, we had a good example in the Buslogic SCSI driver.
> However, just in case it's been changed, I submit herewith an
> example of real code written by a "professional".
You know why the code you reported is *wrong* (besides how it
technically does things)? Mixing lower and upper case, using long
variable and function names, etc. are simply a matter of personal
taste, and you cannot say that such code is "absolutely" wrong. The
code is damn wrong because it violates about 25 sections of the
project's defined CodingStyle document; that's why it is wrong.
- Davide
On Mon, 9 June 2003 11:07:32 -0700, Davide Libenzi wrote:
>
> You know why the code you reported is *wrong* (besides how it
> technically does things)? Mixing lower and upper case, using long
> variable and function names, etc. are simply a matter of personal
> taste, and you cannot say that such code is "absolutely" wrong. The
> code is damn wrong because it violates about 25 sections of the
> project's defined CodingStyle document; that's why it is wrong.
Call it what you may. Whether some style violates more sections of the
CodingStyle than exist in written form or hurts the taste of 99% of
all developers ever having to touch it, my short form for that is "bad
style".
Point remains, there is a lot of "bad style" and inconsistency in the
kernel. But fixing all of it and keeping it fixed would mean a lot of
work and maybe a couple of device drivers less. For what gain?
Jörn
--
Measure. Don't tune for speed until you've measured, and even then
don't unless one part of the code overwhelms the rest.
-- Rob Pike
Jörn Engel wrote:
> On Mon, 9 June 2003 12:24:35 -0400, Timothy Miller wrote:
>
>>One thing I wanted to mention, however, is that your tongue-in-cheek
>>style doesn't help you. Coding style is something that needs to be
>>taken seriously when you're setting standards.
>
>
> Coding style is secondary. It doesn't affect the compiled code. It's
> that simple.
Agreed.
> In the case of the kernel, there is quite a bit of horrible coding
> style. But a working device driver for some hardware is always better
> than no working device driver for that hardware, and if enforcing the
> coding style more strictly results in scaring away some driver writers,
> the style clearly loses.
It is a trivial fact that all coding styles are completely arbitrary.
Yes, there may be many things which are chosen because they make the
most sense, but there are always numerous choices along the way, all of
which would be reasonable, that have to be reduced to one. Some
philosophers will tell you that all of reality is completely arbitrary
and made up; of course, they're referring to our perceptions and choices
more so than to, say, physics. Well, what exemplifies arbitrary reality
more than computer science? Every last drop of it was invented out of
whole cloth. So when you think about it, the C syntax itself is
arbitrary, and thus even more so are the coding styles.
But we have a practical goal in mind here. Not only does something have
to WORK (compile to working machine code), but our grandchildren, using
Linux 20.14.6 are going to have to be able to make sense out of what we
wrote. Were it not for the fact that Linux is a collaborative project,
we would not need these standards.
So, yes, while it may seem silly to do it "just because K&R did it that
way", it is nevertheless a reasonable (albeit arbitrary) choice to make.
Someone has to make the choice, enforce it, and make sure that
everyone understands it. If there is one style, then it will be easier
for new people to understand it once they have read the style guide.
Still, it IS nice to have someone produce justification for their
choices once in a while.
Davide Libenzi wrote:
> On Mon, 9 Jun 2003, Jörn Engel wrote:
>
>
>>In the case of the kernel, there is quite a bit of horrible coding
>>style. But a working device driver for some hardware is always better
>>than no working device driver for that hardware, and if enforcing the
>>coding style more strictly results in scaring away some driver writers,
>>the style clearly loses.
>
>
> There's no such thing as "horrible coding style", since coding style is
> strictly personal. Whoever tries to convince you that one style is
> better than another is simply wrong. Every reason they give you to
> justify one style can be countered with opposite reasons. The only
> horrible coding style is not respecting coding standards when you work
> inside a project. Following the project's standard is a form of respect
> for the other people working inside it; it gives the project code a
> more professional look and lowers the fatigue of reading it. Jumping
> among 24 different coding styles does not usually help this. I do not
> believe professional developers can be scared away by a coding style,
> if it is the coding style adopted by the project in which they have to
> work.
Oh, yes, there is most certainly "horrible coding style". When I was in
college, I met one CS student after another who really just did not
belong in CS, and you should have seen the code they wrote.
Imagine a 200 line program which is ALL inside of main(). There is no
indenting. Lines of code are broken in random places. Blank lines are
inserted randomly. The variable names chosen are a, b, c, d, e, etc.
It's impossible to tell which '{' is associated with which '}'.
It's been a while. I can't remember all of the violations of reason and
sanity I saw. I pity the grad students who were faced with grading
these monstrosities.
On Mon, 9 Jun 2003, Timothy Miller wrote:
> > There's no such thing as "horrible coding style", since coding style is
> > strictly personal. Whoever tries to convince you that one style is
> > better than another is simply wrong. Every reason they give you to
> > justify one style can be countered with opposite reasons. The only
> > horrible coding style is not respecting coding standards when you work
> > inside a project. Following the project's standard is a form of respect
> > for the other people working inside it; it gives the project code a
> > more professional look and lowers the fatigue of reading it. Jumping
> > among 24 different coding styles does not usually help this. I do not
> > believe professional developers can be scared away by a coding style,
> > if it is the coding style adopted by the project in which they have to
> > work.
>
> Oh, yes, there is most certainly "horrible coding style". When I was in
> college, I met one CS student after another who really just did not
> belong in CS, and you should have seen the code they wrote.
> On Mon, 9 June 2003 11:07:32 -0700, Davide Libenzi wrote:
> >
> > You know why the code you reported is *wrong* (besides how it
> > technically does things)? Mixing lower and upper case, using long
> > variable and function names, etc. are simply a matter of personal
> > taste, and you cannot say that such code is "absolutely" wrong. The
> > code is damn wrong because it violates about 25 sections of the
> > project's defined CodingStyle document; that's why it is wrong.
>
> Call it what you may. Whether some style violates more sections of the
> CodingStyle than exist in written form or hurts the taste of 99% of
> all developers ever having to touch it, my short form for that is "bad
> style".
>
> Point remains, there is a lot of "bad style" and inconsistency in the
> kernel. But fixing all of it and keeping it fixed would mean a lot of
> work and maybe a couple of device drivers less. For what gain?
If you try to define a bad/horrible "whatever" in an *absolute* way,
you need either absolutely unanimous consent, or you need to prove it
using a logical combination of already proven absolute concepts. Since
you are missing both of these requirements, you cannot say that
something is bad/wrong in an absolute way. You can say, though, that
something is wrong/bad when dropped inside a given context, and a
coding standard might work as an example. If you try to approach a
developer by saying that he has to use coding standard ABC because it
is better than his coding standard XYZ, you're just wrong, and you'll
have a hard time getting him to understand why he has to use the
suggested standard when coding inside project JKL. The coding standard
gives you the *rule* for defining something wrong when seen inside a
given context, since your personal judgement does not really matter
here.
- Davide
> If you try to define a bad/horrible "whatever" in an *absolute* way,
> you need either absolutely unanimous consent, or you need to prove it
> using a logical combination of already proven absolute concepts. Since
> you are missing both of these requirements, you cannot say that
> something is bad/wrong in an absolute way. You can say, though, that
> something is wrong/bad when dropped inside a given context, and a
> coding standard might work as an example. If you try to approach a
> developer by saying that he has to use coding standard ABC because it
> is better than his coding standard XYZ, you're just wrong, and you'll
> have a hard time getting him to understand why he has to use the
> suggested standard when coding inside project JKL. The coding standard
> gives you the *rule* for defining something wrong when seen inside a
> given context, since your personal judgement does not really matter
> here.
>
> - Davide
This is just bad philosophy. You might as well argue that a canvas that's
been randomly pissed on is just as much art as the Mona Lisa. In fact, it's
a worse argument than that because coding styles aim at objective,
measurable goals. Why does consent matter? If some imbecile wants to argue
that it's good to write code that's hard to understand and debug, why should
we care what he has to say? The consent of people whose opinions are
nonsensical is of no value to people who are trying to create rules that
meet their objective requirements.
Coding styles aim at specific measurable goals. Code should be easy to
understand, extend, and debug. If someone argues code should be hard to
understand, maintain, and debug, we just ignore him. We don't care if he
agrees with us or not because his opinion is obviously (and objectively) of
no value.
We can measure, for different coding styles, how long it takes to find a
bug. We can measure how long it takes a new programmer to get to the point
that he can contribute to the existing code.
Coding styles are engineering rules. We can validate them based upon the
results they produce. Objective, measurable results.
DS
On Mon, 9 June 2003 11:58:43 -0700, Davide Libenzi wrote:
>
> If you try to define a bad/horrible "whatever" in an *absolute* way,
> you need either absolutely unanimous consent, or you need to prove it
> using a logical combination of already proven absolute concepts. Since
> you are missing both of these requirements, you cannot say that
> something is bad/wrong in an absolute way. You can say, though, that
> something is wrong/bad when dropped inside a given context, and a
> coding standard might work as an example. If you try to approach a
> developer by saying that he has to use coding standard ABC because it
> is better than his coding standard XYZ, you're just wrong, and you'll
> have a hard time getting him to understand why he has to use the
> suggested standard when coding inside project JKL. The coding standard
> gives you the *rule* for defining something wrong when seen inside a
> given context, since your personal judgement does not really matter
> here.
The definition in an absolute way is not the real problem. The real
problem is that good/bad coding style has more than just one
dimension. Trying to rate it in just one dimension will almost always
fail.
That said, this discussion appears to have zero impact on the kernel
itself, so it might be time to fade it out.
Jörn
--
Measure. Don't tune for speed until you've measured, and even then
don't unless one part of the code overwhelms the rest.
-- Rob Pike
On Mon, 9 June 2003 14:44:53 -0400, Timothy Miller wrote:
>
> But we have a practical goal in mind here. Not only does something have
> to WORK (compile to working machine code), but our grandchildren, using
> Linux 20.14.6 are going to have to be able to make sense out of what we
> wrote. Were it not for the fact that Linux is a collaborative project,
> we would not need these standards.
Nice picture. That implies that coding standards don't matter much for
device drivers for short-lived hardware like drivers/cdrom/, but matter
a lot more for core code like mm/.
All right, let's stop beating the grass while there is still a shadow
of the dead horse remaining.
Jörn
--
Write programs that do one thing and do it well. Write programs to work
together. Write programs to handle text streams, because that is a
universal interface.
-- Doug McIlroy
> There's no such a thing as "horrible coding style", since coding style is
> strictly personal.
Yeah, I think there is.
GetHandleToChangeLightBulbA
SetHandleToChangeLightBulbA
Need I say more?
On Mon, 9 Jun 2003, David Schwartz wrote:
> This is just bad philosophy.
Actually, that's logic/mathematics. Philosophy is on the other side. We
can say that mathematics/logic compare to philosophy as facts to bullshit.
> been randomly pissed on is just as much art as the Mona Lisa. In fact, it's
> a worse argument than that because coding styles aim at objective,
> measurable goals. Why does consent matter? If some imbecile wants to argue
> that it's good to write code that's hard to understand and debug, why should
> we care what he has to say? The consent of people whose opinions are
> nonsensical is of no value to people who are trying to create rules that
> meet their objective requirements.
A coding style is a very personal thing; you cannot say that XYZ's
coding style is wrong. Period. It's like saying that your taste in cars
is wrong because you picked an ABC over a JKL. If you say that XYZ's
coding style is wrong, without dropping it inside a specific context
(like an environment ruled by a coding standard, for example), you
are trying to give an absolute judgement of it. Absolute judgements need
either absolutely unanimous consent or they need to be proven using a set
of already proven absolute concepts. You can say that *for you* (for-you
== relative) this is right:
if (a == b) {
...
}
while this is wrong :
if( a == b )
{
...
}
I might agree with you; that makes two. But there will surely be someone
who personally prefers the latter, and that breaks the unanimous
consent. Are you able to prove, using a set of already proven absolute
concepts, that the former is right and the latter is wrong? The only way
you have to say that something personal like a coding style is
"wrong" is through a set of rules like a coding standard document. So, to
close the circle, a coding standard document (like the one we have),
rather than your very personal judgement, enables you to say that some
code is wrong. If you fail to understand this, you will have a hard time
gracefully convincing your developers why it is good to use the dictated
coding standard inside professional projects. "Your style sux" is not
generally well accepted by persons with serious attitude problems, like
developers.
- Davide
On Tue, 2003-06-10 at 10:55, Davide Libenzi wrote:
> Absolute judgements need either absolutely unanimous consent or they
> need to be proven using a set...
Of course unanimous consent doesn't make a judgement absolute. It just
makes it universally accepted (it might still be wrong or an opinion,
depending on the context).
Regards,
Nigel
On Monday 09 June 2003 13:55, Timothy Miller wrote:
> Davide Libenzi wrote:
>
> > There's no such thing as "horrible coding style", since coding style is
> > strictly personal. Whoever tries to convince you that one style is
> > better than another is simply wrong. Every reason they give you to
> > justify one style can be countered with opposite reasons. The only
> > horrible coding style is not respecting coding standards when you work
> > inside a project. Following the project's standard is a form of respect
> > for the other people working inside it; it gives the project code a
> > more professional look and lowers the fatigue of reading it. Jumping
> > among 24 different coding styles does not usually help this. I do not
> > believe professional developers can be scared away by a coding style,
> > if it is the coding style adopted by the project in which they have to
> > work.
>
> Oh, yes, there is most certainly "horrible coding style". When I was in
> college, I met one CS student after another who really just did not
> belong in CS, and you should have seen the code they wrote.
>
> Imagine a 200 line program which is ALL inside of main(). There is no
> indenting. Lines of code are broken in random places. Blank lines are
> inserted randomly. The variable names chosen are a, b, c, d, e, etc.
> It's impossible to tell which '{' is associated with which '}'.
>
> It's been a while. I can't remember all of the violations of reason and
> sanity I saw. I pity the grad students who were faced with grading
> these monstrosities.
Ummm, been there... Actually, after the first 20 it got easy. If I
couldn't read it, it got an "F" (whether it worked or not).
If it could be read with difficulty (and worked) it got a D
If it could be read and worked it got a C
If it could be read and was clear (and worked) it got a B
If it was short, clear, and worked it got an A
And I have met some of the idiots (including Piled Higher and Deeper
ones) who couldn't program their way through a "hello there" program.
On Monday 09 June 2003 13:58, Davide Libenzi wrote:
[snip]
>
> If you try to define a bad/horrible "whatever" in an *absolute* way,
> you need either absolutely unanimous consent, or you need to prove it
> using a logical combination of already proven absolute concepts. Since
> you are missing both of these requirements, you cannot say that
> something is bad/wrong in an absolute way. You can say, though, that
> something is wrong/bad when dropped inside a given context, and a
> coding standard might work as an example. If you try to approach a
> developer by saying that he has to use coding standard ABC because it
> is better than his coding standard XYZ, you're just wrong, and you'll
> have a hard time getting him to understand why he has to use the
> suggested standard when coding inside project JKL. The coding standard
> gives you the *rule* for defining something wrong when seen inside a
> given context, since your personal judgement does not really matter
> here.
The coding standards were written by people who said
"Do it this way because 'I' have to read it and understand it to be able to
maintain it."
Nuff said.
On Tue, 10 Jun 2003, Jesse Pollard wrote:
> On Monday 09 June 2003 13:58, Davide Libenzi wrote:
> [snip]
> >
> > If you try to define a bad/horrible "whatever" in an *absolute* way,
> > you need either absolutely unanimous consent, or you need to prove it
> > using a logical combination of already proven absolute concepts. Since
> > you are missing both of these requirements, you cannot say that
> > something is bad/wrong in an absolute way. You can say, though, that
> > something is wrong/bad when dropped inside a given context, and a
> > coding standard might work as an example. If you try to approach a
> > developer by saying that he has to use coding standard ABC because it
> > is better than his coding standard XYZ, you're just wrong, and you'll
> > have a hard time getting him to understand why he has to use the
> > suggested standard when coding inside project JKL. The coding standard
> > gives you the *rule* for defining something wrong when seen inside a
> > given context, since your personal judgement does not really matter
> > here.
>
> The coding standards were written by people who said
>
> "Do it this way because 'I' have to read it and understand it to be able to
> maintain it."
The whole sub-thread wasn't talking about democracy in coding styles ;)
- Davide