Wrong:
foo > 0
Right:
0 < foo
Why? Doesn’t this break the rule against Yoda conditions? Yes, it does, but this ordering is better.
Readability
It’s important that code is readable. To humans. Humans are much better at spatial and visual reasoning than abstract reasoning. It’s why this post has a BIG header that says READABILITY. The shape and size give a hint that it’s the BIG idea.
So how can we apply this to something as abstract as value comparison? The number line:
The number line reads small to big, left to right. So, when doing a comparison, we should also have our data go small to big when reading left to right.
Therefore instead of
foo > 0
we will do
0 < foo
That way, we can easily map the code’s logic onto a number line in our minds, see where 0 and foo sit relative to each other spatially, and therefore grasp their mathematical relation.
Another example is ranges:
Wrong:
foo > 0 && foo < 100
Right:
0 < foo && foo < 100
Hmmm, it makes sense from a certain mathematical perspective. But did you just make this up, or is it based on some study of human perception? Is there evidence that when programmers examine a comparison, they imagine a number line?

In my experience, what I look for in a comparison is two things: what is the variable I'm evaluating, and what is the value it is being compared to. Semantically, I want to know what the variable is first; it is the core piece of meaningful information, what the comparison is set up to evaluate. When I see 'foo', I mentally envision what this quantity represents. Then, when I read the value being compared to, I have a frame of reference: I can fit the value into the range of allowed values for the variable. So I like foo > 0.

For multiple conditions, I like foo > 0 and foo < 100 (but not foo < 100 and foo > 0; so maybe your number-line hypothesis works here: put the smaller value in the left-hand condition).