Back in our review of sets, I wrote:
I talked about the “first” and “second” element of an ordered pair. But the suggestion of an “order” to these elements can mislead. Really the important thing is just that we keep track of which element comes from which set, or which element comes from a single set playing the role of one side of the Cartesian product rather than the other. Instead of the “first” element and the “second” element, we could instead talk about the “west” element and the “east” element. And for ordered quadruples, about the “north,” “south,” “east” and “west” elements. If someone then took it in mind to ask whether the west element comes before or after the east element, or whether the north element came in between them, these questions wouldn’t have any established sense.
Similarly, if you take the ordered quadruple (a,b,c,d), we might call it “increasing” iff a ≤ b ≤ c ≤ d. But I could just as easily define another notion, call it “ascending,” which holds iff a ≤ d ≤ b ≤ c. There’s no sense in which one of these two notions is more intrinsically natural or less gruesome than the other.
With strings, on the other hand, the collection does have a more natural intrinsic ordering. It’s genuinely more natural to count the letter "b" as coming “between” the letters "a" and "c" in "abc" than it is to count "c" as coming “between” "a" and "b", because of the way the string "abc" is inherently structured.
Sometimes ordered pairs, triples, and so on — the general class of things I will call n-tuples or just tuples — are referred to as “ordered sets.” Avoid this usage; it will be too confusing when we look at a different notion of ordered set, in a few classes.
Now we are going to talk about a notion where the label “ordered” can be taken more seriously. We saw before that one special category of binary relations on a set (those that are reflexive, symmetric, and transitive) are distinguished with the name “equivalence relation.” Other special categories of binary relations on a set are distinguished with the name orders. But here there are a couple of different patterns.
First off, there is a contrast between a partial order and a total or linear order. The general idea here is that with a partial order, not every pair of elements in the relation’s domain needs to be comparable; with a total/linear order, every pair does (at least, if they’re distinct objects).
Unlike with functions, where an unqualified “function” means “total function,” with order relations an unqualified “order” usually means “partial order.” Since an order relation is always defined on some set (the relation’s domain), we can describe the set together with that order relation as a partially ordered set (poset for short).
(Just as the label “partial function” doesn’t require but only allows the function to be merely partial, that is non-total, so too the label “partial order” doesn’t require but only allows the order to be non-total.)
Our second contrast is between orders that are more like ≤ on the natural numbers and ⊆ on sets, on the one hand, and orders that are more like < on the natural numbers and ⊂ (the proper subset relation) on sets, on the other. The former are called weak or non-strict orders, and the latter are called strong or strict orders. When no qualifier is given, usually a non-strict order is meant.
There’s also a more specific usage of the label “weak order,” that only applies to some of these. So I will avoid that vocabulary, and use non-strict/strict instead.
Here are some definitions:
R is a non-strict partial order iff it’s transitive, reflexive, and anti-symmetric.
R is a strict partial order iff it’s transitive, irreflexive, and anti-symmetric. (Being irreflexive and anti-symmetric is equivalent to being asymmetric.)
An example of the former is the familiar ≤ relation on ℕ; an example of the latter is the familiar < relation on ℕ. But these examples also have special properties: note that for any two numbers x, y, either x ≤ y or y ≤ x. The notion of a partial order permits but does not require this.
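To make the defining properties concrete, here is a minimal Python sketch (the function names and the toy domain are mine, not part of the text) that checks them for a finite relation represented as a set of ordered pairs:

```python
def is_reflexive(rel, domain):
    return all((x, x) in rel for x in domain)

def is_irreflexive(rel, domain):
    return all((x, x) not in rel for x in domain)

def is_transitive(rel):
    return all((x, z) in rel
               for (x, y1) in rel for (y2, z) in rel if y1 == y2)

def is_antisymmetric(rel):
    # whenever both (x, y) and (y, x) are in the relation, x and y are identical
    return all(x == y for (x, y) in rel if (y, x) in rel)

domain = {0, 1, 2, 3}
leq = {(x, y) for x in domain for y in domain if x <= y}
lt  = {(x, y) for x in domain for y in domain if x < y}

# <= is reflexive, transitive, anti-symmetric: a non-strict partial order.
print(is_reflexive(leq, domain), is_transitive(leq), is_antisymmetric(leq))
# < is irreflexive, transitive, anti-symmetric: a strict partial order.
print(is_irreflexive(lt, domain), is_transitive(lt), is_antisymmetric(lt))
```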
Recall that back in our discussion of relations, I introduced these two notions:
R is weakly connected: for every x, y in its domain, x ≠ y ⊃ (xRy ∨ yRx)
R is strongly connected: for every x, y in its domain, (xRy ∨ yRx) (this is equivalent to being weakly connected plus reflexive)
When a non-strict partial order is weakly connected (and thus also strongly connected, since it’s reflexive), we say it’s (not merely a partial but also) a total/linear order. When a strict partial order is weakly connected, we also say it’s a total/linear order. (Total/linear orders are sometimes called “chains,” but that label is also used in different ways.)
The examples of the familiar relations ≤ and < on ℕ are not merely partial orders but in fact total or linear orders.
Some orders though are merely partial. Some of the elements in the domain they are defined on fail to be comparable by that order relation. Here are some examples.
Consider the set of all subsets of {"a","b"} (that is, ℘{"a","b"}). The relation ⊆ is a non-strict partial order on this set, but merely partial: the elements {"a"} and {"b"} aren’t comparable (neither is a subset of the other). ⊂ is a strict partial order on that same set.
Consider the relation that holds between two strings when the length of one is no greater than the other. The string "a" stands in this relation to itself and also to "ab". But it also stands in this relation to the string "b", and "b" also stands in it to "a", though these strings aren’t identical to each other. So this relation is not anti-symmetric, and won’t count as a partial order. In some respects it’s like a partial order though. Relations like this are sometimes called preorders or quasiorders (and are sometimes represented using ≲).
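To see those two verdicts concretely, here is a tiny Python check (just an illustration; the helper name no_longer_than is mine):

```python
# {"a"} and {"b"} are both members of ℘{"a","b"}, but neither is a subset of
# the other, so ⊆ is a partial order on that power set without being total.
a, b = frozenset({"a"}), frozenset({"b"})
print(a <= b or b <= a)        # False: the two singletons are incomparable

# "Is no longer than" on strings: "a" and "b" each stand in the relation to
# the other, yet they are distinct strings, so anti-symmetry fails. The
# relation is only a preorder (quasiorder), not a partial order.
def no_longer_than(x, y):
    return len(x) <= len(y)

print(no_longer_than("a", "b") and no_longer_than("b", "a") and "a" != "b")  # True
```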
If we want to order people by what their height is, so that Dash ⊏ Caleb would say that Dash is less tall than Caleb, then again we’re going to want to allow distinct elements to belong to the same position in the ordering. Dash is less tall than Caleb, but perhaps he’s equally tall as Lilly.
So for such an ordering, we would not want anti-symmetry. It would be only a preorder, not a partial order. There’s the further question of whether the order is connected/total, which in this case it presumably is. This would plausibly be a total preorder.
On the other hand, suppose we don’t order people by their height, but instead order people’s heights — where these are thought of as numbers or some other kind of abstract mathematical quantity. Presumably here we would want anti-symmetry. If Dash and Lilly are equally tall, then Dash’s height would be the very same quantity as Lilly’s height. Here we wouldn’t want to allow for the possibility of numerically distinct heights occupying the same position in the ordering.
So whether you say you’ve got a preorder or a partial order will depend on what it is you take yourself to be ordering. The underlying facts about who is how tall remain the same, but if we’re thinking of people ordered by their heights — it’s an ordering of people — the structure is going to look different than if we’re thinking of people’s heights themselves, as separate abstract quantities, being ordered.
For any non-strict “order” relation ≤, there are three possibilities: (a) its merely being a pre/quasiorder; (b) its also being a partial order, so there are no ties where distinct elements are ≤ each other; (c) its also being a total order, where no elements are incomparable (it never happens that x ≰ y ∧ y ≰ x).
For strict orders <, the first two of these possibilities collapse into a single option. But there is still a possibility that the order not be total.
If a text wants to talk about some arbitrary partial (or total) order, it will often represent it as ≤ or <. In those usages, these symbols aren’t interpreted to mean the familiar arithmetic relations of being less-than. If the first is used, the author clearly wants to talk about a non-strict order; if the second is used, they probably want to talk about a strict order, but they may be using this notation instead to talk about a non-strict order. You’ll have to check the context to determine which. It’s easier if we use different notation to represent an arbitrary order relation. Sometimes authors use ≼ (or ≺ for a strict order), or ⊑ (or ⊏ for a strict order). I’ll use the last of these, as it’s easier to see the difference from ≤ and <.
If an author has specified some non-strict order ⊑, the corresponding strict order ⊏ can be defined as:
x ⊏ y =def x ⊑ y & ~(y ⊑ x)
If you’re focusing on partial orders, this could equivalently be expressed as x ⊑ y & y ≠ x, but the definition given above can also be used for pre/quasiorders.
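Here is that definition as a small Python sketch (the names strictly and is_prefix are mine): given a non-strict order as a two-place predicate, the corresponding strict order just adds that the converse fails.

```python
def strictly(below):
    # x ⊏ y  iff  x ⊑ y and not (y ⊑ x)
    return lambda x, y: below(x, y) and not below(y, x)

def is_prefix(x, y):           # a non-strict partial order on strings
    return y.startswith(x)

proper_prefix = strictly(is_prefix)
print(proper_prefix("ab", "abc"))   # True
print(proper_prefix("ab", "ab"))    # False: nothing is strictly below itself
```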
If an author has specified some order ⊑, and then writes things like y ⊒ x, they’re using ⊒ to represent the inverse relation to ⊑: that is, y ⊒ x iff x ⊑ y. Similarly for ⊏ and ⊐.
If some binary relation ⊑ on a domain Α is a partial order, we need to be able to talk both about the domain and the relation (“the domain’s ordering”). Authors commonly call the pair (Α, ⊑) a partially ordered set or poset. Sometimes they’ll specify the strict order ⊏ rather than the corresponding non-strict order ⊑. For partial orders, it’s straightforward to convert between these.
Sometimes authors will just talk about the set Α as being a poset, leaving it implicit what the order in question is. It should be clear from context what order they have in mind, but do remember that sets can usually be ordered in multiple ways. We’ll look at examples in a moment.
Sometimes authors will just say “⊑ is a partial order,” leaving it implicit what its domain is. That also should be clear from context. (Some authors equate relations not just with their graph, but with a pair of their domain and their graph, so that given a relation you can always “extract” what its domain is.)
The elements of ℕ have the familiar (total/linear) ordering:
Example 1: 0 < 1 < 2 < 3 < 4 < 5 < ...
But we can also order them in other ways. The examples we’ll consider here are all also total/linear orders; later we’ll consider more examples of orderings that are merely partial. Here’s another linear ordering of ℕ:
Example 2: 0 ⊏ 2 ⊏ 4 ⊏ ... ⊏ 1 ⊏ 3 ⊏ 5 ⊏ ...
Here’s how this is supposed to be understood. There is an “infinitely ascending” sequence of even numbers 0, 2, 4, .... What I mean by this doesn’t have to do with the numerical values of the numbers in the sequence (though in this example, those also ascend). What I mean is that there is a number that comes immediately after 4 in the ordering, and then another that comes immediately after that, and so on forever. And then after all those numbers (that is, ordered higher by the relation than them) come the odd numbers 1, 3, 5, ..., and so on, again infinitely ascending. All of those odd numbers are ordered higher than all the evens.
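One way to see that this really is a single, well-defined ordering of ℕ is to write the comparison rule out explicitly. Here is a sketch (the function name is mine): evens come before odds, and within the evens, or within the odds, numbers are compared by the familiar <.

```python
def before(x, y):
    """x ⊏ y in the ordering 0 ⊏ 2 ⊏ 4 ⊏ ... ⊏ 1 ⊏ 3 ⊏ 5 ⊏ ..."""
    if x % 2 != y % 2:
        return x % 2 == 0      # every even is ordered below every odd
    return x < y               # within a block, use the familiar <

print(before(4, 100))    # True: both even, and 4 < 100
print(before(1000, 1))   # True: any even is below any odd
print(before(3, 1))      # False: both odd, and 3 is not < 1
```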
Or the elements of ℕ could be ordered like this:
Example 3: 0 ⊏ 2 ⊏ 3 ⊏ 4 ⊏ 5 ⊏ ... ⊏ 1
Here just one number is ordered higher than all the rest. It comes after an infinitely ascending sequence of all the other numbers.
Sometimes we’ll talk about orderings “with the same structure.” This can be made precise with the notion of an “isomorphism” — which we’re not going to explain until a later class. But the intuitive idea should be clear. When we’re talking about an ordering “structure,” we don’t care about the identities of the particular elements in the ordering. Think of them as just anonymous dots. Any other order that has corresponding elements in the same relative positions would count as having the same structure.
So although this next example is a different order relation on ℕ than the one in Example 3, it does share the same structure:
Example 4: 0 ⊏ 1 ⊏ 3 ⊏ 4 ⊏ 5 ⊏ ... ⊏ 2
This ordering of the set of strings {empty,"z","a","aa","aaa","aaaa",...} would also have that same structure:
Example 5: empty ⊏ "a" ⊏ "aa" ⊏ "aaa" ⊏ "aaaa" ⊏ ... ⊏ "z"
On the other hand, this ordering with two numbers higher than all the others would have a different structure:
Example 6: 0 ⊏ 3 ⊏ 4 ⊏ 5 ⊏ ... ⊏ 1 ⊏ 2
Here are three finite orderings, all with the same structure:
Example 7:
0 ⊏ 1 ⊏ 2 ⊏ 3
6 ⊏ 4 ⊏ 2 ⊏ 0
"a" ⊏ "b" ⊏ "c" ⊏ "d"
Here are three more orderings to consider (each with a different structure than the other examples):
Example 8: ... ⊏ 6 ⊏ 4 ⊏ 2 ⊏ 0 ⊏ 1 ⊏ 3 ⊏ 5 ⊏ ...
In Example 8, all the evens are ordered first, with numbers that are smaller by the familiar ordering < ordered higher, then still higher in the ordering are all the odds, with numbers that are larger by the familiar ordering < ordered higher.
Example 9: 0 ⊏ ... ⊏ 6 ⊏ 4 ⊏ 2 ⊏ ... ⊏ 5 ⊏ 3 ⊏ 1
Example 9 is like Example 2, in that all the odds are ordered higher than all the evens. But now all the positive numbers are ordered with numbers that are smaller by the familiar ordering < ordered higher.
Example 10: 2 ⊏ 4 ⊏ 8 ⊏ 16 ⊏ ... ⊏ 3 ⊏ 9 ⊏ 27 ⊏ ... ⊏ 5 ⊏ 25 ⊏ 125 ⊏ ... ⊏ 7 ⊏ 49 ⊏ ...
Example 10 is an ordering of a proper subset of ℕ: those numbers that can be represented as pᵏ for some prime p and some positive k ∈ ℕ. In this ordering, all the numbers in the domain divisible without remainder by 2 are ordered lowest, then come the numbers divisible without remainder by 3, and then those divisible by 5, then by 7, and so on for all primes p.
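A comparison rule for Example 10 can be sketched the same way (helper names mine; inputs are assumed to be prime powers pᵏ with k ≥ 1): compare first by the prime base, then by the exponent.

```python
def base_and_exponent(n):
    """For n = p**k with p prime and k >= 1, return the pair (p, k)."""
    assert n >= 2
    p = 2
    while n % p != 0:
        p += 1
    k = 0
    while n % p == 0:
        n //= p
        k += 1
    assert n == 1, "input was not a prime power"
    return p, k

def before_ex10(x, y):
    """x ⊏ y in Example 10: smaller prime base first, then smaller exponent."""
    return base_and_exponent(x) < base_and_exponent(y)

print(before_ex10(16, 3))    # True: every power of 2 comes before every power of 3
print(before_ex10(27, 9))    # False: 27 = 3**3 comes after 9 = 3**2
print(before_ex10(7, 125))   # False: powers of 5 come before powers of 7
```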
When authors talk of order-types like ω, or ω+1, or ω^ω, and so on, they are referring to some of these ordering structures. The three finite orderings in Example 7 have an ordering structure called “order type 4.” The familiar ordering structure in Example 1 is called “order type ω.” The structure in Examples 3, 4, and 5 is called “order type ω+1.” The structure in Example 6 is called “order type ω+2.” The structure in Example 2 is called “order type ω+ω.” The structure in Example 10 is called “order type ω².” There are many more of these.
Some of our examples (8 and 9) don’t have names in this scheme, because their structure differs in an important way from the rest.
Notice how in those orderings, there are regions where you can “infinitely descend” in the ordering, in the sense that you can keep finding a next lower element without ever getting to an element that has no next lower element. (By “descending” and “lower” here we mean with respect to the ⊏ order in question, not with respect to the familiar ordering <.)
Whereas with, say, Example 2, if you descend from any odd, you have to eventually get to 1, which has no next lower element.
In other words, for any x ⊏ 1, there is a y such that x ⊏ y ⊏ 1.
If you descend from any even, you have to eventually get to 0, which has no lower element at all.
We can put this by saying that whereas Examples 8 and 9 have infinitely descending chains, the others do not. (Most of them do have infinitely ascending chains.)
We’ll learn some vocabulary and apparatus for talking about this difference below.
The naming scheme for order types that we were considering is used only for the cases without infinitely descending chains. (So not Examples 8 and 9.)
In all of these examples, I illustrated the order types using (subsets of) ℕ, or using strings from a finite alphabet. There are some order types that those sets cannot exemplify with any order relation, because they don’t have enough elements.
That is, those are order types that can only be exemplified by sets with uncountable cardinality.
We won’t be talking about those order types in this class; you may do so if you get into serious set theory.
Many authors won’t just use ω and the like as names for order types, but will talk about them as being specific ordinal numbers and/or sets. When they do that, they are working with some reduction or definition of numbers in terms of pure sets, and the set they call, for example, ω+1, will have elements that, using their definition of "<", have what we’re calling the order type ω+1. They will count it as a number, but it won’t be an element of ℕ (so for them, "<" is defined on a superset of ℕ).
We won’t need to talk about reducing/defining numbers in terms of pure sets, or ordinal numbers in the sense just described. You will do so if you get into serious set theory.
When a binary relation is dense, and is also a strict partial order (and so is asymmetric), it follows that “between” any two distinct elements in the ordering there will be more elements (at least one more). The familiar arithmetical relation < has this property for the domain of the rationals (and also for the reals); but not for the domain of natural numbers (nor for the integers).
When we defined “density” for arbitrary relations, we allowed that the “in between” element might be identical to one of the endpoints. When the term “density” is applied to orders, usually theorists are talking about strict (irreflexive) orders, so this can’t happen.
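For instance, with < on the rationals, the midpoint of two distinct elements always witnesses density. A quick check using Python's exact fractions (just an illustration):

```python
from fractions import Fraction

x, y = Fraction(1, 3), Fraction(2, 5)
mid = (x + y) / 2        # the midpoint of two rationals is itself rational
print(x < mid < y)       # True: it lies strictly between them
```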
When we have a partially ordered set (that is, a domain with a partial order defined on it), we sometimes talk about “maximal” or “greatest” elements of the set when so-ordered. These aren’t the same, and don’t always exist. For example, there is no maximal or greatest element of ℕ ordered by the familiar relation ≤. But some sets have maximal elements, and they may have more than one. For example, if we use ⊑ to express the relation “is a prefix of,” and we consider the set of strings {"a","b","ab","aba","abb"}, then all three of "b", "aba", and "abb" are maximal with respect to that ordering, because there are no elements of the set (other than themselves) that they are prefixes of.
Officially:
m is a maximal element of partial order ⊑ on set Α =def ∀a ∈ Α (m ⊑ a ⊃ a ⊑ m)
Since partial orders are anti-symmetric, the consequent could also be expressed as a = m.
The idea of a greatest element is more specific. That has to be such that everything in the set stands in the ⊑ relation to it. Officially:
g is a greatest element of partial order ⊑ on set Α =def ∀a ∈ Α (a ⊑ g)
In our previous example with “is a prefix of,” there is no greatest element.
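Here is a sketch in Python (the helper names are mine) that computes the maximal elements and the greatest element, if any, of a finite set under a given non-strict order, applied to the prefix example:

```python
def maximal_elements(elems, below):
    # m is maximal iff: whenever m ⊑ a, also a ⊑ m.
    return [m for m in elems
            if all(below(a, m) for a in elems if below(m, a))]

def greatest_element(elems, below):
    # g is greatest iff a ⊑ g for every a; return None if there is no such g.
    for g in elems:
        if all(below(a, g) for a in elems):
            return g
    return None

def is_prefix(x, y):
    return y.startswith(x)

strings = ["a", "b", "ab", "aba", "abb"]
print(maximal_elements(strings, is_prefix))   # ['b', 'aba', 'abb']
print(greatest_element(strings, is_prefix))   # None: no greatest element
```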
A set can have at most one greatest element (wrt a given partial order ⊑), and any greatest element will be maximal (wrt that order). A set can have a single maximal element but no greatest element: consider the set {"b","a","aa","aaa",...} ordered by “is a prefix of.”
If an order is total/linear, any maximal element must be greatest.
The notions of “minimal” and “least” elements are defined analogously.
The definitions of maximal and greatest can also be used for pre/quasiorders. With these, a greatest element need not be unique. For example, a set of people may have more than one member who is greatest in height (that is, at least as tall as everyone else). With partial orders, there cannot be ties of that sort. The anti-symmetry of a partial order means that any x, y where x ⊑ y and y ⊑ x must be identical. So with partial orders, if there is a greatest element, it must be unique; and it will be the only maximal element. But as we saw a moment ago with the ordering of {"b","a","aa","aaa",...} by “is a prefix of,” a unique maximal element needn’t be greatest.
Every pre/quasi- or partial ordering of a finite domain must have some maximal and some minimal elements. (Some elements may be both.) These need not be unique, because some elements of the domain may be incomparable. But if the order is weakly-connected (so total/linear), the maximal element(s) will be greatest and the minimal element(s) will be least.
With infinite domains, there may be no maximal elements, or many, or just one which nonetheless isn’t greatest (as we saw), or some maximal element(s) which are greatest. As we said, in the case of partial orders (which have to be anti-symmetric), only one element can fit that last description; but for pre/quasiorders there may be several.
When the greatest and/or least elements of a poset exist, they are sometimes called top or ⊤ or one, and bottom or ⊥ or zero. The symbols ⊤ and ⊥ are also used in other ways, and of course the words “zero” and “one” are commonly used to refer to familiar elements of ℕ, which need not be the bottom or top of particular partial orderings they belong to.
Suppose we have a poset (Α, ⊑). The rest of this page will discuss what might be true about the relation ⊑ for elements of various subsets Β ⊆ Α.
One interesting notion is when a poset (Α, ⊑) is such that every non-empty subset of Α (it needn’t be a finite subset) has at least one minimal element with respect to ⊑. This is called a well-founded ordering. (The notion where you require maximal elements instead is called “converse” or “upwards” well-founded, or “Noetherian.” It’s much less often discussed.) One example of this is the natural numbers ℕ, ordered by the familiar <. Indeed, all of Examples 1 through 7 given above, and also Example 10, have this property.
Here are two examples of well-founded orderings that are merely partial:
Example 11: The natural numbers ordered by the divisibility relation {(n,x) | x mod n = 0}. Note that the ordering specified here differs from the one in Example 10, not just because here we are ordering numbers like 1 and 6 that Example 10 does not, but also because in Example 10 it was stipulated that 2 ⊏ 3, whereas in the current example 2 and 3 are incomparable.
"a"
and "b"
, ordered by the relation of being a substring of.
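As a quick illustration of Example 11 (the helper names are mine), divisibility leaves 2 and 3 incomparable, and even a subset with no least element still has minimal elements:

```python
def divides(n, x):
    return x % n == 0          # n ⊑ x in the divisibility ordering

print(divides(2, 3), divides(3, 2))    # False False: 2 and 3 are incomparable

def minimal_elements(B, below):
    # Members of B with nothing else in B strictly below them.
    return [m for m in B if not any(below(a, m) and a != m for a in B)]

print(minimal_elements([4, 6, 12], divides))   # [4, 6]: two minimal elements, no least one
```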
Some posets that aren’t well-founded orderings are Examples 8 and 9 above. Also:
The integers (or the rationals, or the reals), ordered by the familiar <, because we can form subsets of these which have no minimal element.
The positive rationals, ordered by the familiar <; even if you also included 0 in the domain, although it would be minimal, the set would still have subsets with no minimal element.
When you have a well-founded ordering that’s also a total/linear order (as in Examples 1 through 7 and 10, but not in Examples 11 and 12), it’s said that the relation well-orders its domain. These are the ordering structures where the names “order type ω” and so on are applied.
Here’s what makes these notions interesting.
One usually learns how to do mathematical inductions in cases where we have a domain ordered like we do in Example 1. First, we prove that the base case 0 has some property, and also we prove that whenever some arbitrary element k has the property, the next element (or elements) in the order has the property too. We initially do this with natural numbers, but at some point we realize we can do it with orderings of other domains too, including cases like Example 12 (there we could have several “next elements”). What if we had a domain where the natural ordering for proving things was more like Example 2, repeated here:
0 ⊏ 2 ⊏ 4 ⊏ ... ⊏ 1 ⊏ 3 ⊏ 5 ⊏ ...
There’s a difficulty here, in that 1 is not a “next element in the order” for any element. (For any x ⊏ 1, there is a y such that x ⊏ y ⊏ 1.) We could use the same proof strategy as before if we were able to include 1 as an additional base case and directly prove that it has the property of interest, when we prove that 0 does. But sometimes we might not be able to easily give such a direct proof for 1. There’s another strategy we could use in these cases. Instead of proving that whenever some element k has the property, the next element(s) in the ordering have it too, we prove something of this form (for all k in the domain):
If ∀j ⊏ k (element j has the property of interest), then element k has the property of interest.
If we proved that, we wouldn’t even need to give direct proofs that the minimal elements in the ordering, like 0, have the property of interest, because they satisfy the antecedent of this conditional trivially. So proving the conditional would be enough to establish that they have the property. It should be intuitive that if the conditional is true for the property and type of ordering we’re considering, then element 1 will have the property, and so too every element in the ordering. (We won’t prove this rigorously, but that can be done.)
The strategy of proving a conditional of that form, where ⊏ is a well-founded ordering on the domain, is called well-founded or strong or transfinite induction.
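The same shape of reasoning also underwrites definitions by recursion on a well-founded order. Here is an illustrative Python sketch (not from the text; the names are mine) over the divisibility ordering of Example 11: the value at n is fixed by the values at everything strictly below n, and the minimal element 1 needs no separate base case because nothing lies strictly below it.

```python
from functools import lru_cache

def strictly_below(n):
    """Proper divisors of n: the elements strictly below n under divisibility."""
    return [d for d in range(1, n) if n % d == 0]

@lru_cache(maxsize=None)
def height(n):
    """Length of the longest divisibility chain ending at n."""
    return 1 + max((height(d) for d in strictly_below(n)), default=0)

print(height(1), height(6), height(12))   # 1 3 4  (e.g. 1 ⊏ 2 ⊏ 4 ⊏ 12)
```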
Cases where the ordering is not merely well-founded but also total/linear (and so a well-ordering of the domain) have a further interesting property: any subset with upper bounds will have a least upper bound (these notions to be explained below). An application of this is to give a recipe that always specifies a unique “next element” in the ordering (except for the greatest element, if there is such). The recipe is this. You start with some (non-greatest) element k of the domain. Take the set of all elements ⊐ k in the ordering. Since the ordering is total/linear and k is not greatest, this set will be non-empty. Since the order is well-founded, this set must have minimal elements. Since the order is total/linear (and anti-symmetric), there will be just one minimal element and it will be least. This is the “next element” in the ordering.
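For a finite domain the recipe can be written out directly. A sketch (function names mine), using a finite restriction of Example 2 as the linear well-order:

```python
def next_element(k, domain, before):
    """The least element of everything strictly above k, assuming `before`
    linearly well-orders the finite `domain`; None if k is greatest."""
    above = [x for x in domain if before(k, x)]
    if not above:
        return None
    return next(m for m in above if all(m == x or before(m, x) for x in above))

def before(x, y):              # Example 2: evens (by <) first, then odds (by <)
    if x % 2 != y % 2:
        return x % 2 == 0
    return x < y

dom = range(10)                # the restriction to {0, ..., 9}
print(next_element(8, dom, before))   # 1: after the last even comes the first odd
print(next_element(3, dom, before))   # 5
print(next_element(9, dom, before))   # None: 9 is greatest in this restriction
```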
Some orderings that aren’t well-founded and total/linear are also ones where we can always find a “next element,” such as Example 8 (not well-founded), or Example 2 if we make the evens and odds incomparable (no longer total/linear). But this general recipe for specifying the next element can’t be used in such cases.
Another interesting notion is when a poset (Α, ⊑) is such that some Β ⊆ Α has upper bounds. These will be elements u ∈ Α (they don’t necessarily have to be ∈ Β) such that ∀b ∈ Β (b ⊑ u). So this is like the notion of a greatest element of Β, except that we are no longer requiring that it be an element of Β. It just needs to be an element of the (perhaps larger) domain that ⊑ is defined on.
The notion of an upper bound makes sense for pre/quasiorders, too, not just partial orders. In that case, the set of upper bounds may contain more than one element that is least with respect to the order, and this would complicate our discussion. Henceforth, we’ll focus just on partial orders, where any set can have at most one least element.
For any such Β, Α, and ⊑, there are four possibilities:
(a) Β may have no upper bounds;
(b) Β may have some upper bounds, but none of them are in Β and none is least with respect to the order ⊑;
(c) Β has a least upper bound, but it’s not an element of Β;
(d) Β has an upper bound that is an element of Β, and is also Β’s greatest element and least upper bound.
Whenever Β has a greatest element, it will be a least upper bound for Β. But there are also cases (a), (b), and (c), where there are no upper bounds that are elements of Β.
An example of case (a) is where our poset is (ℕ, ≤), and the subset Β is the evens. There is no element of ℕ such that every even number is ⊑ it.
An example of case (b) is where our poset is {"a","b","ab","ba"} ordered by the relation of being a substring, and the subset Β is {"a","b"}. This has two upper bounds — "ab" and "ba" — but they are not comparable by our ordering, so neither is least.
Another example of case (b), this time using a total/linear ordering, is the poset (the rationals, ≤), where the subset Β is {x | x² < 2}. The subset has no greatest element: for any rational less than √2, there’s another larger one that’s still less than √2, and so still ∈ Β. But there are plenty of rationals larger than any element of Β, and so which are upper bounds on Β. For example, 3/2. But there is no least element of these upper bounds: for any rational greater than √2, there’s another smaller one that’s still greater than √2.
An example of case (c) could use the same poset as in the previous example, but this time let the subset Β be {x | x < 1}. This Β does have a least upper bound in our poset, namely 1, but 1 ∉ Β.
The least upper bound of a subset Β is also called Β’s join or supremum. It can be symbolized as lub(Β) or sup(Β) or ∨Β. When Β consists of two elements {b₁,b₂}, it can also be written as a binary operator: b₁ ∨ b₂. In all of these notations it will be left implicit what the order relation ⊑ and superset domain Α are. Keep in mind that depending on the ordering and subset chosen, a join/lub may not exist.
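Here is a sketch (helper names mine) that searches a finite domain for the least upper bound of a subset, illustrating case (b) above, where upper bounds exist but no least one does:

```python
def upper_bounds(B, A, below):
    return [u for u in A if all(below(b, u) for b in B)]

def least_upper_bound(B, A, below):
    # The upper bound that is below every other upper bound, if there is one.
    ubs = upper_bounds(B, A, below)
    for u in ubs:
        if all(below(u, v) for v in ubs):
            return u
    return None

def is_substring(x, y):
    return x in y

A = ["a", "b", "ab", "ba"]
print(upper_bounds(["a", "b"], A, is_substring))         # ['ab', 'ba']
print(least_upper_bound(["a", "b"], A, is_substring))    # None: neither bound is least
print(least_upper_bound(["a", "ab"], A, is_substring))   # 'ab': here the lub exists
```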
The notions of a lower bound, and of a greatest lower bound, are defined analogously. The greatest lower bound of a subset Β is also called Β’s meet or infimum, and can be symbolized as glb(Β) or inf(Β) or ∧Β. When Β consists of two elements {b₁,b₂}, it can also be written as a binary operator: b₁ ∧ b₂.