21973
\section{Subset Relation is Antisymmetric} Tags: Subsets \begin{theorem} The relation "is a subset of" is antisymmetric: :$\paren {R \subseteq S} \land \paren {S \subseteq R} \iff R = S$ \end{theorem} \begin{proof} This is a direct statement of the definition of set equality: :$R = S := \paren {R \subseteq S} \land \paren {S \subseteq R}$ {{qed}} \end{proof}
21974
\section{Subset Relation is Compatible with Subset Product} Tags: Compatible Relations, Subset Products \begin{theorem} Let $\struct {S, \circ}$ be a magma. Let $\powerset S$ be the power set of $S$. Let $\circ_\PP$ be the operation induced on $\powerset S$ by $\circ$. Then the subset relation $\subseteq$ is compatible with $\circ_\PP$. \end{theorem} \begin{proof} Let $A, B, C \in \powerset S$. Let $A \subseteq B$. Let $x \in A \circ_\PP C$. Then for some $a \in A$ and some $c \in C$: :$x = a \circ c$ Since $A \subseteq B$, $a \in B$. Thus $x \in B \circ_\PP C$. Since this holds for all $x \in A \circ_\PP C$: :$A \circ_\PP C \subseteq B \circ_\PP C$ The same argument shows that: :$C \circ_\PP A \subseteq C \circ_\PP B$ {{qed}} Category:Compatible Relations Category:Subset Products \end{proof}
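As a concrete illustration of this compatibility, take for instance the magma $\struct {\Z, +}$, and let $A = \set 1$, $B = \set {1, 2}$ and $C = \set {10}$, so that $A \subseteq B$. Then: :$A \circ_\PP C = \set {11} \subseteq \set {11, 12} = B \circ_\PP C$ and similarly $C \circ_\PP A = \set {11} \subseteq \set {11, 12} = C \circ_\PP B$.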
21975
\section{Subset Relation is Compatible with Subset Product/Corollary 1} Tags: Compatible Relations, Subset Products \begin{theorem} Let $\struct {S, \circ}$ be a magma. Let $\powerset S$ be the power set of $S$. Let $\circ_\PP$ be the operation induced on $\powerset S$ by $\circ$. Let $A, B, C, D \in \powerset S$. Let $A \subseteq B$ and $C \subseteq D$. Then: :$A \circ_\PP C \subseteq B \circ_\PP D$ \end{theorem} \begin{proof} By Subset Relation is Compatible with Subset Product, $\subseteq$ is compatible with $\circ_\PP$. By Subset Relation is Transitive, $\subseteq$ is transitive. Thus the theorem holds by Operating on Transitive Relationships Compatible with Operation. {{qed}} Category:Subset Products Category:Compatible Relations \end{proof}
21976
\section{Subset Relation is Compatible with Subset Product/Corollary 2} Tags: Subset Products, Subset Product \begin{theorem} Let $\struct {S, \circ}$ be a magma. Let $A, B \in \powerset S$, the power set of $S$. {{improve|No need to use power set. $A \subseteq B \subseteq S$ sufficient.}} Let $A \subseteq B$. Let $x \in S$. Then: :$x \circ A \subseteq x \circ B$ :$A \circ x \subseteq B \circ x$ \end{theorem} \begin{proof} This follows from Subset Relation is Compatible with Subset Product and the definition of the subset product with a singleton. {{qed}} Category:Subset Products \end{proof}
21977
\section{Subset Relation is Ordering} Tags: Order Theory, Orderings, Subsets, Subset \begin{theorem} Let $S$ be a set. Let $\powerset S$ be the power set of $S$. Let $\mathbb S \subseteq \powerset S$ be any subset of $\powerset S$, that is, an arbitrary set of subsets of $S$. Then $\subseteq$ is an ordering on $\mathbb S$. In other words, let $\struct {\mathbb S, \subseteq}$ be the relational structure defined on $\mathbb S$ by the relation $\subseteq$. Then $\struct {\mathbb S, \subseteq}$ is an ordered set. \end{theorem} \begin{proof} To establish that $\subseteq$ is an ordering, we need to show that it is reflexive, antisymmetric and transitive. So, checking in turn each of the criteria for an ordering: \end{proof}
21978
\section{Subset Relation is Ordering/General Result} Tags: Order Theory, Subsets, Subset \begin{theorem} Let $\mathbb S$ be a set of sets or class. Then $\subseteq$ is an ordering on $\mathbb S$. In other words, let $\struct {\mathbb S, \subseteq}$ be the relational structure defined on $\mathbb S$ by the relation $\subseteq$. Then $\struct {\mathbb S, \subseteq}$ is an ordered set. \end{theorem} \begin{proof} To establish that $\subseteq$ is an ordering, we need to show that it is reflexive, antisymmetric and transitive. So, checking in turn each of the criteria for an ordering: \end{proof}
21979
\section{Subset Relation on Power Set is Partial Ordering} Tags: Power Set, Orderings, Partial Orderings, Subsets, Order Theory, Subset \begin{theorem} Let $S$ be a set. Let $\powerset S$ be the power set of $S$. Let $\struct {\powerset S, \subseteq}$ be the relational structure defined on $\powerset S$ by the relation $\subseteq$. Then $\struct {\powerset S, \subseteq}$ is an ordered set. The ordering $\subseteq$ is partial {{iff}} $S$ is neither empty nor a singleton; otherwise it is total. \end{theorem} \begin{proof} From Subset Relation is Ordering, we have that $\subseteq$ is an ordering on any set of subsets of a given set. Suppose $S$ is neither a singleton nor the empty set. Then $\exists a, b \in S$ such that $a \ne b$. Then $\set a \in \powerset S$ and $\set b \in \powerset S$. However, $\set a \nsubseteq \set b$ and $\set b \nsubseteq \set a$. So by definition, $\subseteq$ is a partial ordering. Now suppose $S = \O$. Then $\powerset S = \set \O$ and, by Empty Set is Subset of All Sets, $\O \subseteq \O$. Hence, trivially, $\subseteq$ is a total ordering on $\powerset S$. Now suppose $S$ is a singleton: let $S = \set a$. Then $\powerset S = \set {\O, \set a}$. So there are only two elements of $\powerset S$, and we see that $\O \subseteq \set a$ from Empty Set is Subset of All Sets. So, trivially again, $\subseteq$ is a total ordering on $\powerset S$. {{qed}} \end{proof}
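For instance, taking $S = \set {a, b}$ with $a \ne b$ gives $\powerset S = \set {\O, \set a, \set b, \set {a, b} }$, in which $\set a \nsubseteq \set b$ and $\set b \nsubseteq \set a$, so $\subseteq$ is only a partial ordering; whereas taking $S = \set a$ gives the chain $\O \subseteq \set a$, on which $\subseteq$ is total.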
21980
\section{Subset and Image Admit Infima and Mapping is Increasing implies Infimum of Image Succeeds Mapping at Infimum} Tags: Order Theory \begin{theorem} Let $\struct {S, \preceq}$ and $\struct {T, \precsim}$ be ordered sets. Let $f: S \to T$ be an increasing mapping. Let $D \subseteq S$ be such that: :$D$ admits an infimum in $S$ and $f \sqbrk D$ admits an infimum in $T$. Then: :$\map f {\inf D} \precsim \map \inf {f \sqbrk D}$ \end{theorem} \begin{proof} By definition of infimum: :$\inf D$ is a lower bound for $D$. By Increasing Mapping Preserves Lower Bounds: :$\map f {\inf D}$ is a lower bound for $f \sqbrk D$. Thus by definition of infimum: :$\map f {\inf D} \precsim \map \inf {f \sqbrk D}$ {{qed}} \end{proof}
21981
\section{Subset and Image Admit Suprema and Mapping is Increasing implies Supremum of Image Precedes Mapping at Supremum} Tags: Order Theory \begin{theorem} Let $\struct {S, \preceq}$, $\struct {T, \precsim}$ be ordered sets. Let $f: S \to T$ be an increasing mapping. Let $D \subseteq S$ be such that: :$D$ admits a supremum in $S$ and $f \sqbrk D$ admits a supremum in $T$. Then: :$\map \sup {f \sqbrk D} \precsim \map f {\sup D}$ \end{theorem} \begin{proof} By definition of supremum: :$\sup D$ is an upper bound for $D$. By Increasing Mapping Preserves Upper Bounds: :$\map f {\sup D}$ is an upper bound for $f \sqbrk D$. Thus by definition of supremum: :$\map \sup {f \sqbrk D} \precsim \map f {\sup D}$ {{qed}} \end{proof}
21982
\section{Subset equals Image of Preimage implies Surjection} Tags: Subset equals Image of Preimage implies Surjection, Surjections \begin{theorem} Let $f: S \to T$ be a mapping. Let: :$\forall B \subseteq T: B = \paren {f \circ f^{-1} } \sqbrk B$ where: :$f \sqbrk B$ denotes the image of $B$ under $f$ :$f^{-1} \sqbrk B$ denotes the preimage of $B$ under $f$. Then $f$ is a surjection. \end{theorem} \begin{proof} By hypothesis: :$\forall B \in \powerset T: B = \paren {f \circ f^{-1} } \sqbrk B$ In particular, it holds for $T$ itself. Hence: {{begin-eqn}} {{eqn | l = T | r = \paren {f \circ f^{-1} } \sqbrk T | c = }} {{eqn | r = f \sqbrk {f^{-1} \sqbrk T} | c = Definition of Composition of Mappings }} {{eqn | o = \subseteq | r = f \sqbrk S | c = Image of Subset is Subset of Image: Corollary 2 }} {{eqn | r = \Img f | c = Definition of Image of Mapping }} {{eqn | o = \subseteq | r = T | c = Image is Subset of Codomain: Corollary 1 }} {{end-eqn}} So: :$T \subseteq \Img f \subseteq T$ and so by definition of set equality: :$\Img f = T$ So, by definition, $f$ is a surjection. {{qed}} \end{proof}
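The hypothesis genuinely fails for non-surjective mappings: for instance, let $f: \set 0 \to \set {0, 1}$ be given by $\map f 0 = 0$. Taking $B = \set 1$ gives $f^{-1} \sqbrk B = \O$ and hence: :$\paren {f \circ f^{-1} } \sqbrk B = f \sqbrk \O = \O \ne B$ consistent with the fact that $f$ is not a surjection.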
21983
\section{Subset equals Preimage of Image implies Injection} Tags: Subsets, Injections, Subset equals Preimage of Image implies Injection \begin{theorem} Let $f: S \to T$ be a mapping. Let $f^\to: \powerset S \to \powerset T$ be the direct image mapping of $f$. Similarly, let $f^\gets: \powerset T \to \powerset S$ be the inverse image mapping of $f$. Let: :$\forall A \in \powerset S: A = \map {\paren {f^\gets \circ f^\to} } A$ Then $f$ is an injection. \end{theorem} \begin{proof} By hypothesis: :$\forall A \in \powerset S: A = \map {\paren {f^\gets \circ f^\to} } A$ In particular, it holds for all subsets of $S$ which are singletons. Now, consider any $x, y \in S$. We have: {{begin-eqn}} {{eqn | l = \map f x | r = \map f y | c = }} {{eqn | ll= \implies | l = \set {\map f x} | r = \set {\map f y} | c = }} {{eqn | ll= \implies | l = \map {f^\to} {\set x} | r = \map {f^\to} {\set y} | c = Definition of Direct Image Mapping }} {{eqn | ll= \implies | l = \set x | r = \map {f^\gets} {\map {f^\to} {\set x} } | c = by hypothesis: $A = \map {\paren {f^\gets \circ f^\to} } A$ }} {{eqn | r = \map {f^\gets} {\map {f^\to} {\set y} } | c = }} {{eqn | r = \set y | c = by hypothesis: $A = \map {\paren {f^\gets \circ f^\to} } A$ }} {{eqn | ll= \implies | l = x | r = y | c = }} {{end-eqn}} So $f$ is an injection. {{qed}} \end{proof}
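The hypothesis genuinely fails for non-injective mappings: for instance, let $f: \set {0, 1} \to \set 2$ be the constant mapping $\map f 0 = \map f 1 = 2$. Taking $A = \set 0$ gives: :$\map {\paren {f^\gets \circ f^\to} } A = \map {f^\gets} {\set 2} = \set {0, 1} \ne A$ consistent with the fact that $f$ is not an injection.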
21984
\section{Subset has 2 Conjugates then Normal Subgroup} Tags: Conjugacy, Normal Subgroups \begin{theorem} Let $G$ be a group. Let $S$ be a subset of $G$. Let $S$ have exactly two conjugates in $G$. Then $G$ has a proper non-trivial normal subgroup. \end{theorem} \begin{proof} {{MissingLinks}} Consider the centralizer $\map {C_G} S$ of $S$ in $G$. From Centralizer of Group Subset is Subgroup, $\map {C_G} S$ is a subgroup of $G$. If $\map {C_G} S = G$, then $S$ has no conjugate but itself. {{explain|Link to that result.}} So, in order for $S$ to have exactly two conjugates in $G$, it is necessary for $\map {C_G} S$ to be a proper subgroup. Let $e$ be the identity of $G$. If $\map {C_G} S = \set e$, then for there to be exactly two conjugates of $S$: :$\forall a, b \in G \setminus \set e, a \ne b: \forall x \in S: b x b^{-1} = a x a^{-1}$ But: {{begin-eqn}} {{eqn | l = b x b^{-1} | r = a x a^{-1} | c = }} {{eqn | ll= \leadsto | l = \paren {a^{-1} b} x b^{-1} | r = x a^{-1} | c = }} {{eqn | ll= \leadsto | l = \paren {a^{-1} b} x \paren {a^{-1} b}^{-1} | r = x | c = }} {{eqn | ll= \leadsto | l = a^{-1} b | o = \in | r = \map {C_G} S | c = }} {{end-eqn}} As $\map {C_G} S = \set e$, this forces $a^{-1} b = e$, that is $a = b$, contradicting $a \ne b$. Thus $\map {C_G} S$ is a nontrivial proper subgroup of $G$. We have that there are exactly $2$ conjugates of $S$. These are in one-to-one correspondence with the cosets of $\map {C_G} S$. Thus the index $\index G {\map {C_G} S}$ of the centralizer is: :$\index G {\map {C_G} S} = 2$ From Subgroup of Index 2 is Normal: :$\map {C_G} S$ is a proper nontrivial normal subgroup of $G$. {{qed}} \end{proof}
21985
\section{Subset implies Cardinal Inequality} Tags: Minimal Infinite Successor Set, Cardinals \begin{theorem} Let $S$ and $T$ be sets such that $S \subseteq T$. Furthermore, let: :$T \sim \card T$ where $\card T$ denotes the cardinality of $T$. Then: :$\card S \le \card T$ \end{theorem} \begin{proof} For the proof: :the ordering relation $\le$ for ordinals and :the subset relation $\subseteq$ shall be used interchangeably. Let $f: T \to \card T$ be a bijection. It follows that $f \restriction_S : S \to \card T$ is an injection. The image of $S$ under $f$ is a subset of $\card T$ and thus is a subset of an ordinal. By Unique Isomorphism between Ordinal Subset and Unique Ordinal, there is a unique mapping $\phi$ and a unique ordinal $x$ such that $\phi: x \to f \sqbrk S$ is an order isomorphism. It follows that $S \sim x$ by the definition of order isomorphism. Furthermore, $\phi$ is a strictly increasing mapping from ordinals to ordinals. {{begin-eqn}} {{eqn | l = y | o = \in | r = x | c = }} {{eqn | ll= \leadsto | l = y | o = \le | r = \map \phi y | c = Strictly Increasing Ordinal Mapping Inequality }} {{eqn | o = \in | r = f \sqbrk S | c = Definition of $\phi$ }} {{eqn | o = \subseteq | r = \card T | c = Image Preserves Subsets }} {{eqn | ll= \leadsto | l = y | o = \in | r = \card T | c = Cardinal Number is Ordinal }} {{end-eqn}} Therefore, $y \in x \implies y \in \card T$ and $x \le \card T$ by the definition of subset. But $\card S \le x$ by Cardinal Number Less than Ordinal. So $\card S \le \card T$ by the fact that Subset Relation is Transitive. {{qed}} \end{proof}
21986
\section{Subset is Compatible with Ordinal Addition} Tags: Ordinal Arithmetic \begin{theorem} Let $x, y, z$ be ordinals. Then: : $(1): x \le y \implies \left({z + x}\right) \le \left({z + y}\right)$ : $(2): x \le y \implies \left({x + z}\right) \le \left({y + z}\right)$ \end{theorem} \begin{proof} The result follows from Subset is Left Compatible with Ordinal Addition and Subset is Right Compatible with Ordinal Addition. {{qed}} Category:Ordinal Arithmetic \end{proof}
21987
\section{Subset is Compatible with Ordinal Multiplication} Tags: Ordinal Arithmetic \begin{theorem} Let $x, y, z$ be ordinals. Then: : $(1): x \le y \implies \left({z \cdot x}\right) \le \left({z \cdot y}\right)$ : $(2): x \le y \implies \left({x \cdot z}\right) \le \left({y \cdot z}\right)$ \end{theorem} \begin{proof} The result follows from Subset is Left Compatible with Ordinal Multiplication and Subset is Right Compatible with Ordinal Multiplication. {{qed}} Category:Ordinal Arithmetic \end{proof}
21988
\section{Subset is Left Compatible with Ordinal Addition} Tags: Ordinal Arithmetic \begin{theorem} Let $x, y, z$ be ordinals. Then: :$x \le y \implies \paren {z + x} \le \paren {z + y}$ \end{theorem} \begin{proof} The result follows from Membership is Left Compatible with Ordinal Addition. {{qed}} Category:Ordinal Arithmetic \end{proof}
21989
\section{Subset is Left Compatible with Ordinal Multiplication} Tags: Ordinal Arithmetic \begin{theorem} Let $x, y, z$ be ordinals. Then: :$x \le y \implies \left({z \cdot x}\right) \le \left({z \cdot y}\right)$ \end{theorem} \begin{proof} The result follows from Membership is Left Compatible with Ordinal Multiplication. {{qed}} Category:Ordinal Arithmetic \end{proof}
21990
\section{Subset is Right Compatible with Ordinal Addition} Tags: Ordinal Arithmetic \begin{theorem} Let $x, y, z$ be ordinals. Then: :$x \le y \implies \paren {x + z} \le \paren {y + z}$ \end{theorem} \begin{proof} The proof proceeds by transfinite induction on $z$. \end{proof}
21991
\section{Subset is Right Compatible with Ordinal Exponentiation} Tags: Ordinal Arithmetic \begin{theorem} Let $x, y, z$ be ordinals. Then: :$x \le y \implies x^z \le y^z$ \end{theorem} \begin{proof} The proof shall proceed by Transfinite Induction on $z$. \end{proof}
21992
\section{Subset is Right Compatible with Ordinal Multiplication} Tags: Ordinal Arithmetic \begin{theorem} Let $x, y, z$ be ordinals. Then: :$x \le y \implies \paren {x \cdot z} \le \paren {y \cdot z}$ \end{theorem} \begin{proof} The proof shall proceed by Transfinite Induction on $z$. \end{proof}
21993
\section{Subset not necessarily Submagma} Tags: Magmas, Abstract Algebra \begin{theorem} Let $\struct {S, \circ}$ be a magma. Let $T \subseteq S$. Then it is not necessarily the case that: : $\struct {T, \circ} \subseteq \struct {S, \circ}$ That is, it does not always follow that $\struct {T, \circ}$ is a submagma of $\struct {S, \circ}$. \end{theorem} \begin{proof} Let $\struct {\Z, -}$ be the magma which is the set of integers under the operation of subtraction. We have that the natural numbers $\N$ are a subset of the integers. Consider $\struct {\N, -}$, the natural numbers under subtraction. We have that Natural Number Subtraction is not Closed. For example: : $1 - 2 = -1 \notin \N$ Thus $\struct {\N, -}$ is not closed. So $\struct {\N, -}$ is not a submagma of $\struct {\Z, -}$. Hence it is not true to write $\struct {\N, -} \subseteq \struct {\Z, -}$, despite the fact that $\N \subseteq \Z$. {{qed}} \end{proof}
21994
\section{Subset of Abelian Group Generated by Product of Element with Inverse Element is Subgroup} Tags: Abelian Groups, Subset of Abelian Group Generated by Product of Element with Inverse Element is Subgroup \begin{theorem} Let $\struct {G, \circ}$ be an abelian group. Let $S \subset G$ be a non-empty subset of $G$ such that $\struct {S, \circ}$ is closed. Let $H$ be the set defined as: :$H := \set {x \circ y^{-1}: x, y \in S}$ Then $\struct {H, \circ}$ is a subgroup of $\struct {G, \circ}$. \end{theorem} \begin{proof} Let $x \in S$. Then: :$x \circ x^{-1} \in H$ and so $H \ne \O$. Now let $a, b \in H$. Then: :$a = x_a \circ y_a^{-1}$ and: :$b = x_b \circ y_b^{-1}$ for some $x_a, y_a, x_b, y_b \in S$. Thus: {{begin-eqn}} {{eqn | l = a \circ b^{-1} | r = \paren {x_a \circ y_a^{-1} } \circ \paren {x_b \circ y_b^{-1} }^{-1} | c = }} {{eqn | r = \paren {x_a \circ y_a^{-1} } \circ \paren {\paren {y_b^{-1} }^{-1} \circ x_b^{-1} } | c = Inverse of Group Product }} {{eqn | r = \paren {x_a \circ y_a^{-1} } \circ \paren {y_b \circ x_b^{-1} } | c = Inverse of Group Inverse }} {{eqn | r = x_a \circ \paren {y_a^{-1} \circ y_b} \circ x_b^{-1} | c = {{GroupAxiom|1}} }} {{eqn | r = x_a \circ \paren {y_b \circ y_a^{-1} } \circ x_b^{-1} | c = {{Defof|Abelian Group}} }} {{eqn | r = \paren {x_a \circ y_b} \circ \paren {y_a^{-1} \circ x_b^{-1} } | c = {{GroupAxiom|1}} }} {{eqn | r = \paren {x_a \circ y_b} \circ \paren {x_b \circ y_a}^{-1} | c = Inverse of Group Product }} {{end-eqn}} As $\struct {S, \circ}$ is closed, both $x_a \circ y_b \in S$ and $x_b \circ y_a \in S$. Thus $a \circ b^{-1}$ is in the form $x \circ y^{-1}$ for $x, y \in S$. Thus $a \circ b^{-1} \in H$. Hence by the One-Step Subgroup Test, $H$ is a subgroup of $G$. {{qed}} \end{proof}
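As a simple illustration, take the abelian group $\struct {\Z, +}$ and let $S = \Z_{>0}$, which is non-empty and closed under $+$. Then: :$H = \set {x + \paren {-y}: x, y \in \Z_{>0} } = \set {x - y: x, y \in \Z_{>0} } = \Z$ for example $0 = 1 - 1$, $3 = 4 - 1$ and $-3 = 1 - 4$, and $\Z$ is indeed a subgroup of $\Z$.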
21995
\section{Subset of Bounded Above Set is Bounded Above} Tags: Boundedness \begin{theorem} Let $A$ and $B$ be sets of real numbers such that $A \subseteq B$. Let $B$ be bounded above. Then $A$ is also bounded above. \end{theorem} \begin{proof} Let $B$ be bounded above. Then by definition $B$ has an upper bound $U$. Hence: :$\forall x \in B: x \le U$ But by definition of subset: :$\forall x \in A: x \in B$ That is: :$\forall x \in A: x \le U$ Hence, by definition, $A$ is bounded above by $U$. {{qed}} \end{proof}
21996
\section{Subset of Bounded Below Set is Bounded Below} Tags: Boundedness \begin{theorem} Let $A$ and $B$ be sets of real numbers such that $A \subseteq B$. Let $B$ be bounded below. Then $A$ is also bounded below. \end{theorem} \begin{proof} Let $B$ be bounded below. Then by definition $B$ has a lower bound $L$. Hence: :$\forall x \in B: x \ge L$ But by definition of subset: :$\forall x \in A: x \in B$ That is: :$\forall x \in A: x \ge L$ Hence, by definition, $A$ is bounded below by $L$. {{qed}} \end{proof}
21997
\section{Subset of Cartesian Product} Tags: Cartesian Product, Axiomatic Set Theory \begin{theorem} Let $S$ be a set of ordered pairs. Then $S$ is a subset of the cartesian product of two sets. \end{theorem} \begin{proof} Let $S$ be a set of ordered pairs. Let $x \in S$. Then $x = \left\{{\left\{{a}\right\}, \left\{{a, b}\right\}}\right\}$ for some $a$ and $b$, as defined in Kuratowski Formalization of Ordered Pair. Since the elements of $S$ are sets, we can form the union $\mathbb S = \bigcup S$ of the sets in $S$. Since $x \in S$ it follows that the elements of $x$ are elements of $\mathbb S$. Since $\left\{{a, b}\right\} \in x$ it follows that $\left\{{a, b}\right\} \in \mathbb S$. Now we can form the union $\mathbb S' = \bigcup \mathbb S$ of the sets in $\mathbb S$. Since $\left\{{a, b}\right\} \in \mathbb S$ it follows that both $a$ and $b$ are elements of $\mathbb S' = \bigcup \bigcup S$. Thus from the Kuratowski Formalization of Ordered Pair we have that $S$ is a subset of some $A \times B$. We can at this stage take both $A$ and $B$ as being equal to $\bigcup \bigcup S$. Finally, the axiom of specification is applied to construct the sets: :$A = \left\{{a: \exists b: \left({a, b}\right) \in S}\right\}$ and :$B = \left\{{b: \exists a: \left({a, b}\right) \in S}\right\}$ $A$ and $B$ are seen to be the first and second projections respectively of $S$. {{qed}} \end{proof}
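As an illustration of the construction, take $S = \set {\tuple {1, 2}, \tuple {3, 4} }$, that is $S = \left\{{\left\{{\left\{{1}\right\}, \left\{{1, 2}\right\}}\right\}, \left\{{\left\{{3}\right\}, \left\{{3, 4}\right\}}\right\}}\right\}$ in Kuratowski form. Then $\bigcup S = \left\{{\left\{{1}\right\}, \left\{{1, 2}\right\}, \left\{{3}\right\}, \left\{{3, 4}\right\}}\right\}$ and $\bigcup \bigcup S = \left\{{1, 2, 3, 4}\right\}$, while the axiom of specification yields $A = \left\{{1, 3}\right\}$ and $B = \left\{{2, 4}\right\}$, with $S \subseteq A \times B$.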
21998
\section{Subset of Cartesian Product not necessarily Cartesian Product of Subsets} Tags: Cartesian Product \begin{theorem} Let $A$ and $B$ be sets. Let $A$ and $B$ both have at least two distinct elements. Then there exists $W \subseteq A \times B$ such that $W$ is not the cartesian product of a subset of $A$ and a subset of $B$. \end{theorem} \begin{proof} Let $a, b \in A$ and $c, d \in B$ be such that $a \ne b$ and $c \ne d$: such elements exist as $A$ and $B$ each have at least two distinct elements. Let: :$W = \set {\tuple {a, c}, \tuple {a, d}, \tuple {b, d} }$ Then $W \subseteq A \times B$. Suppose $W = X \times Y$ such that $X \subseteq A, Y \subseteq B$. Then $a, b \in X$ and $c, d \in Y$. But then $X \times Y$ also contains $\tuple {b, c}$, which is not in $W$ as $b \ne a$ and $c \ne d$. Hence the result. {{qed}} \end{proof}
21999
\section{Subset of Codomain is Superset of Image of Preimage} Tags: Induced Mappings, Composite Mappings, Mapping Theory, Subset of Codomain is Superset of Image of Preimage, Preimages under Mappings \begin{theorem} Let $f: S \to T$ be a mapping. Then: :$B \subseteq T \implies \paren {f \circ f^{-1} } \sqbrk B \subseteq B$ where: :$f \sqbrk B$ denotes the image of $B$ under $f$ :$f^{-1}$ denotes the inverse of $f$ :$f \circ f^{-1}$ denotes composition of $f$ and $f^{-1}$. This can be expressed in the language and notation of direct image mappings and inverse image mappings as: :$\forall B \in \powerset T: \map {\paren {f^\to \circ f^\gets} } B \subseteq B$ \end{theorem} \begin{proof} From Image of Preimage of Mapping: :$B \subseteq T \implies \paren {f \circ f^{-1} } \sqbrk B = B \cap f \sqbrk S$ The result follows from Intersection is Subset. {{qed}} \end{proof}
22000
\section{Subset of Countable Set is Countable} Tags: Countable Sets, Subsets, Subset \begin{theorem} A subset of a countable set is countable. \end{theorem} \begin{proof} Let $S$ be a countable set. Let $T \subseteq S$. By definition, there exists an injection $f: S \to \N$. Let $i: T \to S$ be the inclusion mapping. We have that $i$ is an injection. Because the composite of injections is an injection, it follows that $f \circ i: T \to \N$ is an injection. Hence, $T$ is countable. {{qed}} \end{proof}
22001
\section{Subset of Countably Infinite Set is Countable} Tags: Set Theory, Infinite Sets, Subsets, Countable Sets, Subset \begin{theorem} Every subset of a countably infinite set is countable. \end{theorem} \begin{proof} Let $S = \set {a_0, a_1, a_2, \ldots}$ be countably infinite. Let $T \subseteq S$, and write $T = \set {a_{n_0}, a_{n_1}, a_{n_2}, \ldots}$, where $a_{n_0}, a_{n_1}, a_{n_2}, \ldots$ are the elements of $S$ also in $T$. If the set of numbers $\set {n_0, n_1, n_2, \ldots}$ has a largest number, then $T$ is finite. Otherwise, consider the bijection $i \leftrightarrow n_i$. This leads to the bijection $i \leftrightarrow a_{n_i}$. This latter bijection is the required one-to-one correspondence between the elements of $T$ and those of $\N$, showing that $T$ is indeed countable. {{finish|formal justification}} \end{proof}
22002
\section{Subset of Domain is Subset of Preimage of Image} Tags: Subset of Domain is Subset of Preimage of Image, Induced Mappings, Mappings, Composite Mappings, Mapping Theory, Preimages under Mappings \begin{theorem} Let $f: S \to T$ be a mapping. Then: :$A \subseteq S \implies A \subseteq \paren {f^{-1} \circ f} \sqbrk A$ where: :$f \sqbrk A$ denotes the image of $A$ under $f$ :$f^{-1} \sqbrk A$ denotes the preimage of $A$ under $f$ :$f^{-1} \circ f$ denotes composition of $f^{-1}$ and $f$. This can be expressed in the language and notation of direct image mappings and inverse image mappings as: :$\forall A \in \powerset S: A \subseteq \map {\paren {f^\gets \circ f^\to} } A$ \end{theorem} \begin{proof} A mapping is by definition a left-total relation. Therefore Preimage of Image under Left-Total Relation is Superset applies: :$A \subseteq S \implies A \subseteq \paren {\RR^{-1} \circ \RR} \sqbrk A$ where $\RR$ is a relation. Hence: :$A \subseteq S \implies A \subseteq \paren {f^{-1} \circ f} \sqbrk A$ {{qed}} \end{proof}
22003
\section{Subset of Domain is Subset of Preimage of Image/Equality does Not Necessarily Hold} Tags: Subset of Domain is Subset of Preimage of Image \begin{theorem} Let $f: S \to T$ be a mapping. From Subset of Domain is Subset of Preimage of Image: :$A \subseteq S \implies A \subseteq \paren {f^{-1} \circ f} \sqbrk A$ where: :$f \sqbrk A$ denotes the image of $A$ under $f$ :$f^{-1} \sqbrk A$ denotes the preimage of $A$ under $f$ :$f^{-1} \circ f$ denotes composition of $f^{-1}$ and $f$. It is not necessarily the case that: :$A \subseteq S \implies A = \paren {f^{-1} \circ f} \sqbrk A$ \end{theorem} \begin{proof} Proof by Counterexample: Let: :$S = \set {0, 1}$ :$T = \set 2$ Let $f: S \to T$ be defined as: :$\map f 0 = 2$ :$\map f 1 = 2$ Let $A \subseteq S$ be defined as: :$A = \set 0$ Then we have: :$f \sqbrk A = \set 2$ but: :$\paren {f^{-1} \circ f} \sqbrk A = f^{-1} \sqbrk {\set 2} = \set {0, 1}$ That is: :$A \subseteq \paren {f^{-1} \circ f} \sqbrk A$ but it is not the case that: :$A = \paren {f^{-1} \circ f} \sqbrk A$ {{qed}} \end{proof}
22004
\section{Subset of Empty Set} Tags: Set Theory, Empty Set \begin{theorem} Let $A$ be a class. Then: :$A$ is a subset of the empty set $\O$ {{iff}}: :$A$ is equal to the empty set: :$A \subseteq \O \iff A = \O$ \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = A = \O | o = \leadsto | r = A \subseteq \O | c = {{Defof|Set Equality|index = 2}} }} {{end-eqn}} Conversely: {{begin-eqn}} {{eqn | l = A \subseteq \O | o = \leadsto | r = A \subseteq \O \land \O \subseteq A | c = Empty Set is Subset of All Sets }} {{eqn | o = \leadsto | r = A = \O | c = {{Defof|Set Equality|index = 2}} }} {{end-eqn}} {{qed}} Category:Empty Set \end{proof}
22005
\section{Subset of Empty Set iff Empty} Tags: Empty Set \begin{theorem} Let $S$ be a set. Let $\O$ denote the empty set. Then $S \subseteq \O$ {{iff}} $S = \O$. \end{theorem} \begin{proof} Suppose $S \subseteq \O$. {{AimForCont}} there exists $x \in S$. Then since $S \subseteq \O$, it follows that $x \in \O$, contradicting the definition of the empty set. Hence $S$ has no elements, that is: :$S = \O$ Conversely, suppose $S = \O$. Then $S \subseteq \O$ from Empty Set is Subset of All Sets. {{qed}} Category:Empty Set \end{proof}
22006
\section{Subset of Euclidean Plane whose Product of Coordinates are Greater Than or Equal to 1 is Closed} Tags: Euclidean Space, Closed Sets, Real Number Plane with Euclidean Topology \begin{theorem} Let $\struct {\R^2, \tau_d}$ be the real number plane with the usual (Euclidean) topology. Let $A \subseteq \R^2$ be the set of all points defined as: :$A := \set {\tuple {x, y} \in \R^2: x y \ge 1}$ Then $A$ is a closed set in $\struct {\R^2, \tau_d}$. \end{theorem} \begin{proof} By definition, $\tau_d$ is the topology induced by the Euclidean metric $d$. Consider the complement of $A$ in $\R^2$: :$A' := \R^2 \setminus A$ Thus: :$A' = \set {\tuple {x, y} \in \R^2: x y < 1}$ Let $a = \tuple {x_a, y_a} \in A'$, so that $x_a y_a < 1$. The mapping $\tuple {x, y} \mapsto x y$ is continuous on $\R^2$. Hence there exists $\epsilon \in \R_{>0}$ such that: :$\forall \tuple {x, y} \in \map {B_\epsilon} a: x y < 1$ that is, such that the open $\epsilon$-ball of $a$ in $\R^2$ lies entirely in $A'$. As $a$ is arbitrary, it follows that every such $a$ has an open $\epsilon$-ball in $\R^2$ which lies entirely in $A'$. Thus, by definition, $A'$ is open in $\R^2$. So, also by definition, $A$ is closed in $\R^2$. {{qed}} \end{proof}
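An explicit choice of $\epsilon$ which works in the above argument is, for instance: :$\epsilon = \min \set {1, \dfrac {1 - x_a y_a} {\size {x_a} + \size {y_a} + 1} }$ Indeed, for $\tuple {x, y} \in \map {B_\epsilon} a$ we have $\size {x - x_a} < \epsilon \le 1$ and $\size {y - y_a} < \epsilon$, so that: :$\size {x y - x_a y_a} \le \size x \size {y - y_a} + \size {y_a} \size {x - x_a} < \paren {\size {x_a} + 1} \epsilon + \size {y_a} \epsilon \le 1 - x_a y_a$ and hence $x y < 1$.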
22007
\section{Subset of Excluded Point Space is not Dense-in-itself} Tags: Excluded Point Topology, Denseness \begin{theorem} Let $T = \struct {S, \tau_{\bar p} }$ be an excluded point space such that $S$ is not a singleton. Let $H \subseteq S$. Then $H$ is not dense-in-itself. \end{theorem} \begin{proof} From Limit Points in Excluded Point Space, the only limit point of $H$ is $p$. So by definition, all points of $H$ other than $p$ are isolated in $H$. So if $H \ne \set p$, $H$ contains at least one point which is isolated in $H$. As for $p$ itself, from Singleton Point is Isolated we have that $p$ is itself isolated in $\set p$. So if $H = \set p$, $H$ also contains one point which is isolated in $H$. Hence the result, by definition of dense-in-itself. {{qed}} \end{proof}
22008
\section{Subset of Finite Set is Finite} Tags: Set Theory, Subsets, Analysis, Finite Sets, Subset \begin{theorem} Let $X$ be a finite set. If $Y$ is a subset of $X$, then $Y$ is also finite. \end{theorem} \begin{proof} From the definition, $X$ is finite {{iff}} $\exists n \in \N$ such that there exists a bijection: :$f: X \leftrightarrow \N_n$ where $\N_n$ is the set of all elements of $\N$ less than $n$, that is: :$\N_n = \set {0, 1, 2, \ldots, n - 1}$ The case in which $X$ is empty is trivial. We begin proving the following particular case: :''If $X$ is finite and $a \in X$, then $X \setminus \set a$ is also finite.'' From Bijection between Specific Elements there exists a bijection $f: \N_n \to X$ which satisfies $\map f {n - 1} = a$. Next we prove the general case by induction. \end{proof}
22009
\section{Subset of Hilbert Sequence Space with Non-Empty Interior is not Compact} Tags: Hilbert Sequence Space \begin{theorem} Let $A$ be the set of all real sequences $\sequence {x_i}$ such that the series $\ds \sum_{i \mathop \ge 0} x_i^2$ is convergent. Let $\ell^2 = \struct {A, d_2}$ be the Hilbert sequence space on $\R$. Let $H$ be a subset of $\ell^2$ whose interior is non-empty. Then $H$ is not compact in $\ell^2$. \end{theorem} \begin{proof} Let $x \in H^\circ$, where $H^\circ$ denotes the interior of $H$. By definition, $H^\circ$ is an open set of $\ell^2$ containing $x$. Again by definition, $H$ is a neighborhood of $x$. But from Point in Hilbert Sequence Space has no Compact Neighborhood, $x$ has no compact neighborhood in $\ell^2$. Thus $H$ cannot be compact in $\ell^2$. {{qed}} \end{proof}
22010
\section{Subset of Indiscrete Space is Compact} Tags: Compact Spaces, Indiscrete Topology \begin{theorem} Let $T = \struct {S, \set {\O, S} }$ be an indiscrete topological space. Let $H \subseteq S$. Then $H$ is compact in $T$. \end{theorem} \begin{proof} The subspace $T_H = \struct {H, \set {\O, S \cap H} }$ is trivially also an indiscrete space. The only open cover of $T_H$ is $\set H$ itself. The only subcover of $H$ is, trivially, also $\set H$, which is finite. So $H$ is (trivially) compact in $T$. {{qed}} \end{proof}
22011
\section{Subset of Indiscrete Space is Compact and Sequentially Compact} Tags: Sequentially Compact Spaces, Compact Spaces, Indiscrete Topology \begin{theorem} Let $T = \struct {S, \set {\O, S} }$ be an indiscrete topological space. Let $H \subseteq S$. Then $H$ is both compact and sequentially compact in $T$. \end{theorem} \begin{proof} The subspace $T_H = \struct {H, \set {\O, S \cap H} }$ is trivially also an indiscrete space. The only open cover of $T_H$ is $\set H$ itself. The only subcover of $H$ is, trivially, also $\set H$, which is finite. So $H$ is (trivially) compact in $T$. From Convergent Sequences in Indiscrete Space, every sequence in $T$ converges to every point of $S$. So every infinite sequence has a subsequence which converges to every point in $S$. Hence $H$ is (trivially) sequentially compact in $T$. {{qed}} \end{proof}
22012
\section{Subset of Indiscrete Space is Dense-in-itself} Tags: Denseness, Indiscrete Topology \begin{theorem} Let $T = \struct {S, \set {\O, S} }$ be an indiscrete topological space. Let $H \subseteq S$ be a subset of $S$ containing more than one point. Then $H$ is dense-in-itself. \end{theorem} \begin{proof} Let $x \in H$. Then as $H$ is not a singleton, $\exists y \in H: y \ne x$. Every neighborhood of $x$ contains $y$, as the only non-empty open set of $T$ is $S$, which contains both $x$ and $y$. Hence $x$ is not isolated by definition. As $x$ is arbitrary, all points in $H$ are similarly not isolated. Hence the subset $H$ is dense-in-itself by definition. {{qed}} \end{proof}
22013
\section{Subset of Indiscrete Space is Everywhere Dense} Tags: Denseness, Indiscrete Topology \begin{theorem} Let $T = \struct {S, \set {\O, S} }$ be an indiscrete topological space. Let $H \subseteq S$ such that $H \ne \O$. Then $H$ is everywhere dense. \end{theorem} \begin{proof} From Limit Points of Indiscrete Space, every point of $T$ is a limit point of $H$. Hence $H$ is everywhere dense by definition. {{qed}} \end{proof}
22014
\section{Subset of Indiscrete Space is Sequentially Compact} Tags: Sequentially Compact Spaces, Indiscrete Topology \begin{theorem} Let $T = \struct {S, \set {\O, S} }$ be an indiscrete topological space. Let $H \subseteq S$. Then $H$ is sequentially compact in $T$. \end{theorem} \begin{proof} From Sequence in Indiscrete Space converges to Every Point, every sequence in $T$ converges to every point of $S$. So every infinite sequence has a subsequence which converges to every point in $S$. Hence $H$ is (trivially) sequentially compact in $T$. {{qed}} \end{proof}
22015
\section{Subset of Linear Code with Even Weight Codewords} Tags: Linear Codes \begin{theorem} Let $C$ be a linear code. Let $C^+$ be the subset of $C$ consisting of all the codewords of $C$ which have even weight. Then $C^+$ is a subgroup of $C$ such that either $C^+ = C$ or such that $\order {C^+} = \dfrac {\order C} 2$. \end{theorem} \begin{proof} Note that the zero codeword is in $C^+$ as it has a weight of $0$ which is even. Let $c$ and $d$ be of even weight, and let $k$ be the number of ordinates in which $c$ and $d$ are both non-zero. Let $\map w c$ denote the weight of $c$. Then: {{begin-eqn}} {{eqn | l = \map w {c + d} | r = \map w c - k + \map w d - k | c = }} {{eqn | r = \map w c + \map w d - 2 k | c = }} {{end-eqn}} which is even. Since the negative of a vector $\mathbf v$ over $\Z_2$ equals $\mathbf v$ itself, the inverse of each $c \in C^+$ is $c$ itself, which is in $C^+$. It follows from the Two-Step Subgroup Test that $C^+$ is a subgroup of $C$. Let $C \ne C^+$. Then $C$ contains a codeword $c$ of odd weight. Let $C^-$ denote the subset of $C$ consisting of all the codewords of $C$ which have odd weight. Adding $c$ to each codeword of $C^+$ gives distinct codewords of odd weight, so: :$\order {C^-} \ge \order {C^+}$ Similarly, adding $c$ to each codeword of $C^-$ gives distinct codewords of even weight, so: :$\order {C^-} \le \order {C^+}$ As $C = C^+ \cup C^-$ it follows that: :$\order C = 2 \order {C^+}$ Hence the result. {{qed}} \end{proof}
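For instance, take $C = \Z_2^3$, the linear code consisting of all $8$ words of length $3$. Then $C^+ = \set {000, 110, 101, 011}$, so that $\order {C^+} = 4 = \dfrac {\order C} 2$. The weight computation can be checked on $c = 110$ and $d = 011$: they are both non-zero in $k = 1$ ordinate, and indeed $\map w {c + d} = \map w {101} = 2 = 2 + 2 - 2 \times 1$.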
22016
\section{Subset of Linearly Independent Set is Linearly Independent} Tags: Unitary Modules, Linear Algebra, Modules \begin{theorem} A subset of a linearly independent set is also linearly independent. \end{theorem} \begin{proof} Let $G$ be a unitary $R$-module. Then $\sequence {a_n}$ is a linearly independent sequence {{iff}} $\set {a_1, a_2, \ldots, a_n}$ is a linearly independent set of $G$. So suppose that $\set {a_1, a_2, \ldots, a_n}$ is a linearly independent set of $G$. Then clearly $\sequence {a_n}$ is a linearly independent sequence of $G$. Conversely, let $\sequence {a_n}$ be a linearly independent sequence of $G$. Let $\sequence {b_m}$ be a sequence of distinct terms of $\set {a_1, a_2, \ldots, a_n}$. Let $\sequence {\mu_m}$ be a sequence of scalars such that $\ds \sum_{j \mathop = 1}^m \mu_j b_j = 0$. For each $k \in \closedint 1 n$, let: :$\lambda_k = \begin{cases} \mu_j & : j \text { is the unique index such that } a_k = b_j \\ 0 & : a_k \notin \set {b_1, b_2, \ldots, b_m} \end{cases}$ Then: :$\ds 0 = \sum_{j \mathop = 1}^m \mu_j b_j = \sum_{k \mathop = 1}^n \lambda_k a_k$ Thus, as $\sequence {a_n}$ is a linearly independent sequence: :$\forall k \in \closedint 1 n: \lambda_k = 0$ As $\set {\mu_1, \ldots, \mu_m} \subseteq \set {\lambda_1, \ldots, \lambda_n}$, it follows that: :$\forall j \in \closedint 1 m: \mu_j = 0$ and so $\sequence {b_m}$ has been shown to be a linearly independent sequence. Hence the result. {{Qed}} \end{proof}
22017
\section{Subset of Linearly Ordered Space which is Order-Complete and Closed but not Compact} Tags: Examples of Topologies, Order Topology \begin{theorem} Let $X = \hointr 0 1 \cup \openint 2 3 \cup \set 4$. Let $\preceq$ be the ordering on $X$ induced by the usual ordering of the real numbers. Let $\tau$ be the $\preceq$ order topology on $X$. Let $Y = \hointr 0 1 \cup \set 4$. Let $\tau'$ be the $\tau$-relative subspace topology on $Y$. Then: :$\struct {Y, \preceq}$ is a complete lattice :$Y$ is closed in $X$ but: :$\struct {Y, \tau'}$ is not compact. \end{theorem} \begin{proof} First it is demonstrated that $\struct {Y, \preceq}$ is a complete lattice. Let $\phi: Y \to \closedint 0 1$ be defined as: :$\map \phi y = \begin{cases} y & : y \in \hointr 0 1 \\ 1 & : y = 4 \end{cases}$ Then $\phi$ is an order isomorphism. {{explain|The above needs to be proved.}} {{qed|lemma}} We have that $\closedint 0 1$ is a complete lattice. {{explain|This is probably around here somewhere.}} Next it is shown that $Y$ is closed in $X$. Let $x \in X \setminus Y$. Then: :$x \in \openint 2 3$ Thus: :$x \in \openint {\dfrac {x + 2} 2} {\dfrac {x + 3} 2} \subseteq X \setminus Y$ where $\openint {\dfrac {x + 2} 2} {\dfrac {x + 3} 2} \in \tau$. Since the complement of $Y$ is thus open in $X$, $Y$ is closed in $X$. Finally it is shown that $\struct {Y, \tau'}$ is not compact. Let: :$\AA = \set {x^\preceq: x \in \hointr 0 1} \cup \set {\paren {\dfrac 5 2}^\succeq}$ where: :$x^\preceq$ denotes the lower closure of $x$ :$x^\succeq$ denotes the upper closure of $x$ Then $\AA$ is an open cover of $Y$ with no finite subcover. {{explain|Prove the above statement.}} {{qed}} {{finish|The remaining parts of this proof need to be completed.}} Category:Order Topology \end{proof}
22018
\section{Subset of Meager Set is Meager Set} Tags: Meager Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space. Let $A$ be meager in $T$. Let $B \subseteq A$. Then $B$ is meager in $T$. \end{theorem} \begin{proof} Since $A$ is meager in $T$: :there exists a countable collection of sets $\set {U_n: n \in \N}$ nowhere dense in $T$ such that $\ds A = \bigcup_{n \in \N} U_n$. Then, we have: {{begin-eqn}} {{eqn | l = B | r = A \cap B | c = Intersection with Subset is Subset }} {{eqn | r = \paren {\bigcup_{n \in \N} U_n} \cap B }} {{eqn | r = \bigcup_{n \in \N} \paren {U_n \cap B} | c = Intersection Distributes over Union }} {{end-eqn}} From Intersection is Subset: :$U_n \cap B \subseteq U_n$ From Subset of Nowhere Dense Subset is Nowhere Dense: :$U_n \cap B$ is nowhere dense in $T$. Then, we see that: :$B$ can be written as the union of nowhere dense sets in $T$. That is: :$B$ is meager in $T$. {{qed}} Category:Meager Spaces \end{proof}
22019
\section{Subset of Metric Space is Subset of its Closure} Tags: Set Closures \begin{theorem} Let $M = \struct {A, d}$ be a metric space. Let $H \subseteq A$ be a subset of $A$. Then: :$H \subseteq H^-$ where $H^-$ denotes the closure of $H$. \end{theorem} \begin{proof} By definition of closure: :$H^- = H' \cup H^i$ where: :$H'$ denotes the set of limit points of $H$ :$H^i$ denotes the set of isolated points of $H$. Let $a \in H$. If $a$ is a limit point of $H$ then $a \in H'$. Suppose $a \notin H'$. Then by definition of limit point: :$\neg \forall \epsilon \in \R_{>0}: \set {x \in H: 0 < \map d {x, a} < \epsilon} \ne \O$ That is: :$\exists \epsilon \in \R_{>0}: \set {x \in H: 0 < \map d {x, a} < \epsilon} = \O$ and so as $a \in H$ and $\map d {a, a} = 0$: :$\exists \epsilon \in \R_{>0}: \set {x \in H: \map d {x, a} < \epsilon} = \set a$ Thus, by definition, $a$ is an isolated point of $H$. So $a \in H'$ or $a \in H^i$. That is, by definition of set union and of closure: :$a \in H' \cup H^i = H^-$ As this holds for all $a \in H$, it follows by definition of subset that: :$H \subseteq H^-$ {{qed}} \end{proof}
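For instance, in the real number line with the usual metric, let $H = \openint 0 1 \cup \set 2$. Every point of $\openint 0 1$ is a limit point of $H$, while $2$ is an isolated point of $H$, and indeed: :$H \subseteq \closedint 0 1 \cup \set 2 = H^-$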
22020
\section{Subset of Module Containing Identity is Linearly Dependent} Tags: Linear Algebra, Module Theory, Modules \begin{theorem} Let $G$ be a group whose identity is $e$. Let $\struct {R, +, \circ}$ be a ring whose zero is $0_R$. Let $\struct {G, +_G, \circ}_R$ be an $R$-module. Let $H \subseteq G$ such that $e \in H$. Then $H$ is a linearly dependent set. \end{theorem} \begin{proof} From Scalar Product with Identity, $\forall \lambda: \lambda \circ e = e$. Let $H \subseteq G$ such that $e \in H$. Consider any sequence $\sequence {a_k}_{1 \mathop \le k \mathop \le n}$ in $H$ which includes $e$. So, let $a_j = e$ for some $j \in \closedint 1 n$. Let $c \in R$ such that $c \ne 0_R$. Consider the sequence $\sequence {\lambda_k}_{1 \mathop \le k \mathop \le n}$ of elements of $R$ defined as: :$\lambda_k = \begin{cases} 0_R & : k \ne j \\ c & : k = j \end{cases}$ Then: {{begin-eqn}} {{eqn | l = \sum_{k \mathop = 1}^n \lambda_k \circ a_k | r = \lambda_1 \circ a_1 + \lambda_2 \circ a_2 + \cdots + \lambda_j \circ a_j + \cdots + \lambda_n \circ a_n | c = }} {{eqn | r = 0_R \circ a_1 + 0_R \circ a_2 + \cdots + c \circ e + \cdots + 0_R \circ a_n | c = }} {{eqn | r = e + e + \cdots + e + \cdots + e | c = }} {{eqn | r = e | c = }} {{end-eqn}} Thus there exists a sequence $\sequence {\lambda_k}_{1 \mathop \le k \mathop \le n}$ in which not all $\lambda_k = 0_R$ such that: :$\ds \sum_{k \mathop = 1}^n \lambda_k \circ a_k = e$ Hence the result. {{qed}} \end{proof}
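For instance, consider $\R^2$ as an $\R$-module and let $H = \set {\tuple {0, 0}, \tuple {1, 0} }$, which contains the identity $\tuple {0, 0}$. Then: :$5 \tuple {0, 0} + 0 \tuple {1, 0} = \tuple {0, 0}$ is a linear combination equal to the identity in which not all the scalars are zero, so $H$ is linearly dependent.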
22021
\section{Subset of Natural Numbers is Cofinal iff Infinite} Tags: Natural Numbers, Order Theory \begin{theorem} Consider the ordered set $\struct {\N, \le}$, where $\le$ is the usual ordering on the natural numbers. Let $S \subseteq \N$. Then $S$ is cofinal {{iff}} it is infinite. \end{theorem} \begin{proof} From Rule of Transposition, we may replace the ''only if'' statement by its contrapositive. Therefore, the following suffices: \end{proof}
22022
\section{Subset of Natural Numbers under Max Operation is Monoid} Tags: Examples of Monoids, Max Operation, Monoid Examples, Max and Min Operations, Natural Numbers, Monoids \begin{theorem} Let $S \subseteq \N$ be a subset of the natural numbers $\N$. Let $\struct {S, \max}$ denote the algebraic structure formed from $S$ and the max operation. Then $\struct {S, \max}$ is a monoid. Its identity element is the smallest element of $S$. \end{theorem} \begin{proof} By the Well-Ordering Principle, $\N$ is a well-ordered set. By definition, every subset of a well-ordered set is also well-ordered. Thus $S$ is a well-ordered set. The result follows from Max Operation on Woset is Monoid. {{qed}} \end{proof}
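For instance, take $S = \set {3, 5, 8}$. Then $\struct {S, \max}$ is closed, as $\map \max {x, y}$ is always one of $x$ and $y$, and its identity is the smallest element $3$: :$\map \max {3, 3} = 3, \quad \map \max {3, 5} = 5, \quad \map \max {3, 8} = 8$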
22023
\section{Subset of Naturals is Finite iff Bounded} Tags: Set Theory, Natural Numbers, Subsets, Subset \begin{theorem} Let $X$ be a subset of the natural numbers $\N$. Then $X$ is finite {{iff}} it is bounded. \end{theorem} \begin{proof} A subset of the natural numbers is also a subset of the real numbers $\R$. By definition, a bounded subset of $\R$ is bounded below in $\R$ and bounded above in $\R$. By the Well-Ordering Principle, $X$ is bounded below. Thus $X$ is bounded {{iff}} $X$ is bounded above. That is, {{iff}}: :$\exists p \in \N: \forall x \in X: x \le p$ \end{proof}
22024
\section{Subset of Normed Vector Space is Everywhere Dense iff Closure is Normed Vector Space/Necessary Condition} Tags: Set Closures, Normed Vector Spaces, Denseness \begin{theorem} Let $\struct {X, \norm {\, \cdot \,} }$ be a normed vector space. Let $D \subseteq X$ be a subset of $X$. Let $D^-$ be the closure of $D$. Let $D$ be dense. Then: :$D^- = X$ \end{theorem} \begin{proof} Let $x \in X \setminus D$. Suppose $D$ is dense in $X$. Then: :$\forall n \in \N : \exists d_n \in D : d_n \in \map {B_{\frac 1 n}} x$ where $\map {B_{\frac 1 n}} x$ is an open ball. This yields a sequence $\sequence {d_n}_{n \mathop \in \N}$ in $D$ such that: :$\forall n \in \N : \norm {x - d_n} < \frac 1 n$ Hence, $x$ is a limit point of $D$. In other words, $x \in D^-$. We have just shown that: :$x \in X \setminus D \implies x \in D^-$ Hence: :$X \setminus D \subseteq D^-$ By definition of closure: :$D \subseteq D^-$ Therefore: {{begin-eqn}} {{eqn | l = X | r = D \cup \paren {X \setminus D} }} {{eqn | o = \subseteq | r = D^- }} {{eqn | o = \subseteq | r = X }} {{end-eqn}} Thus: :$X = D^-$ {{qed}} \end{proof}
22025
\section{Subset of Normed Vector Space is Everywhere Dense iff Closure is Normed Vector Space/Sufficient Condition} Tags: Set Closures, Normed Vector Spaces, Denseness \begin{theorem} Let $\struct {X, \norm {\, \cdot \,} }$ be a normed vector space. Let $D \subseteq X$ be a subset of $X$. Let $D^-$ be the closure of $D$. Let $D^- = X$. Then $D$ is dense. \end{theorem} \begin{proof} Let $X = D^-$. We have to show that for every $x \in X$ and every $\epsilon \in \R_{> 0}$, the open ball $\map {B_\epsilon} x$ contains an element of $D$. We have that $X = D \cup \paren {X \setminus D}$. Suppose $x \in X \setminus D$. Then $x \in D^- \setminus D$. Hence, $x$ is a limit point of $D$. Therefore, there is a sequence $\sequence {d_n}_{n \mathop \in \N}$ in $D$ which converges to $x$. Thus: :$\forall \epsilon \in \R_{> 0} : \exists N \in \N : \norm {x - d_N} < \epsilon$ In other words: :$d_N \in \map {B_\epsilon} x$ Therefore: :$d_N \in D \cap \map {B_\epsilon} x$ Suppose $x \in D$. Let $\epsilon > 0$. Then $x \in \map {B_\epsilon} x \cap D$. From both parts and the definition of everywhere dense we conclude that $D$ is dense in $X$. {{qed}} \end{proof}
22026
\section{Subset of Nowhere Dense Subset is Nowhere Dense} Tags: Denseness \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space. Let $A \subseteq S$ be nowhere dense in $T$. Let $B \subseteq A$. Then $B$ is nowhere dense in $T$. \end{theorem} \begin{proof} {{AimForCont}} it is not the case that $B$ is nowhere dense in $T$. Then by definition of nowhere dense: :$B^-$ contains some open set of $T$ which is non-empty. From Set Closure Preserves Set Inclusion, we have: :$B^- \subseteq A^-$ So: :$A^-$ contains some open set of $T$ which is non-empty. So $A$ is not nowhere dense in $T$. This contradicts our assertion that $A$ is nowhere dense in $T$. The result follows by Proof by Contradiction. {{qed}} Category:Denseness \end{proof}
22027
\section{Subset of Open Reciprocal-N Balls forms Neighborhood Basis in Real Number Line} Tags: Real Number Line with Euclidean Metric, Real Intervals, Real Number Space \begin{theorem} Let $\R$ denote the real number line with the usual (Euclidean) metric. Let $a \in \R$ be a point in $\R$. Let $k \in \Z$ be some fixed integer. Let $\BB_a$ be defined as: :$\BB_a := \set {\map {B_\epsilon} a: \epsilon \in \set {\dfrac 1 n: n \in \N, n > k} }$ that is, the set of all open $\epsilon$-balls of $a$ for $\epsilon$ which are reciprocals of integers greater than $k$. Then $\BB_a$ is a basis for the neighborhood system of $a$. \end{theorem} \begin{proof} Let $N$ be a neighborhood of $a$ in $\R$. Then by definition: :$\exists \epsilon' \in \R_{>0}: \map {B_{\epsilon'} } a \subseteq N$ where $\map {B_{\epsilon'} } a$ is the open $\epsilon'$-ball at $a$. From Open Ball in Real Number Line is Open Interval: :$\map {B_{\epsilon'} } a = \openint {a - \epsilon'} {a + \epsilon'}$ From Between two Real Numbers exists Rational Number: :$\exists \epsilon'' \in \Q: 0 < \epsilon'' < \epsilon'$ Let $\epsilon''$ be expressed in canonical form: :$\epsilon'' = \dfrac p q$ Let $\epsilon''' = \dfrac 1 q$. Then $\epsilon''' \le \epsilon'' < \epsilon'$. If $q \le k$, let $\epsilon = \dfrac 1 {k + 1}$. Otherwise, let $\epsilon = \epsilon'''$. Then: :$\openint {a - \epsilon} {a + \epsilon} \subseteq \openint {a - \epsilon'} {a + \epsilon'}$ From Open Real Interval is Open Ball: :$\map {B_\epsilon} a = \openint {a - \epsilon} {a + \epsilon}$ is the open $\epsilon$-ball at $a$. By its method of construction: :$\map {B_\epsilon} a \in \set {\map {B_\epsilon} a: \epsilon \in \set {\dfrac 1 n: n \in \N, n > k} }$ From Subset Relation is Transitive: :$\openint {a - \epsilon} {a + \epsilon} \subseteq N$ From Open Ball is Neighborhood of all Points Inside, $\openint {a - \epsilon} {a + \epsilon}$ is a neighborhood of $a$ in $\R$. Hence the result by definition of basis for the neighborhood system of $a$. {{qed}} \end{proof}
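As a worked instance of the above construction, let $a = 0$, $k = 4$ and $N = \openint {-\dfrac 3 {10} } {\dfrac 3 {10} }$, so that $\epsilon' = \dfrac 3 {10}$. One may take $\epsilon'' = \dfrac 2 7 < \epsilon'$, giving $\epsilon''' = \dfrac 1 7$; since $q = 7 > k$, we set $\epsilon = \dfrac 1 7$, and indeed $\map {B_{1/7} } 0 = \openint {-\dfrac 1 7} {\dfrac 1 7} \subseteq N$ with $\dfrac 1 7 \in \set {\dfrac 1 n: n \in \N, n > 4}$.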
22028
\section{Subset of Ordinal implies Cardinal Inequality} Tags: Cardinals \begin{theorem} Let $S$ be a set. Let $x$ be an ordinal such that $S \subseteq x$. Then: :$\card S \le \card x$ where $\card S$ denotes the cardinality of $S$. \end{theorem} \begin{proof} Since $x$ is an ordinal, it follows that $x \sim \card x$ by Ordinal Number Equivalent to Cardinal Number. This satisfies the hypothesis for Subset implies Cardinal Inequality. Therefore: :$\card S \le \card x$ {{qed}} \end{proof}
22029
\section{Subset of Ordinals has Minimal Element} Tags: Ordinals, Class Theory \begin{theorem} Let $A$ be an ordinal (we shall allow $A$ to be a proper class). Let $B$ be a nonempty subset of $A$. Then $B$ has an $\Epsilon$-minimal element. {{explain|$\Epsilon$}} That is: :$\exists x \in B: B \cap x = \O$ \end{theorem} \begin{proof} We have that $\Epsilon$ creates a well-ordering on any ordinal. Also, the initial segments of $x$ are sets. Therefore from Proper Well-Ordering Determines Smallest Elements: :$B$ has an $\Epsilon$-minimal element. {{qed}} \end{proof}
22030
\section{Subset of Particular Point Space is either Open or Closed} Tags: Open Sets, Particular Point Topology, Closed Sets, Clopen Sets \begin{theorem} Let $T = \struct {S, \tau_p}$ be a particular point space. Let $H \subseteq S$ be any subset of $T$. Then $H$ is either open or closed in $T$. The only sets which are both closed and open in $T$ are $S$ and $\O$. \end{theorem} \begin{proof} Let $H \subseteq S$. There are two cases to consider: :$p \in H$ :$p \notin H$ If $p \in H$ then by definition of a particular point topology, $H$ is open. If $p \notin H$, then $p \in \relcomp S H$, where $\relcomp S H$ is the relative complement of $H$ in $S$. So $\relcomp S H$ is open by definition of a particular point topology. From the definition of a closed set, $\relcomp S {\relcomp S H}$ is closed in $T$. From Relative Complement of Relative Complement we have that $\relcomp S {\relcomp S H} = H$ and so $H$ is closed in $T$. Now suppose $H \subseteq T$ is both closed and open in $T$. From Open and Closed Sets in Topological Space, if $H = \O$ or $H = S$ then $H$ is both closed and open in $T$. So, suppose $H \ne \O$ and $H \ne S$. As $H \subseteq T$ is both closed and open in $T$, $\relcomp S H$ is also both closed and open in $T$. From Boundary is Intersection of Closure with Closure of Complement we have: :$\partial H = H^- \cap \paren {\relcomp S H}^-$ where $H^-$ is the closure of $H$. By Closure of Open Set of Particular Point Space we have that $H^- = S$, and of course $\paren {\relcomp S H}^- = S$. So: :$\partial H = H^- \cap \paren {\relcomp S H}^- = S$ However, from Set Clopen iff Boundary is Empty we have that $\partial H = \O$. Hence if $H$ is both closed and open in $T$, then $H = \O$ or $H = S$. {{qed}} Category:Open Sets Category:Closed Sets Category:Clopen Sets Category:Particular Point Topology \end{proof}
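For instance, let $S = \set {a, b, p}$ with the particular point topology $\tau_p$. The subset $\set {a, p}$ contains $p$ and so is open (but not closed, as its complement $\set b$ does not contain $p$ and so is not open), while the subset $\set a$ does not contain $p$ and so is not open, but its complement $\set {b, p}$ is open, making $\set a$ closed.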
22031
\section{Subset of Preimage under Relation is Preimage of Subset} Tags: Preimages under Relations, Subsets, Subset, Relation Theory, Mappings, Mapping Theory, Relations \begin{theorem} Let $\RR \subseteq S \times T$ be a relation. Let $X \subseteq S, Y \subseteq T$. Then: :$X \subseteq \RR^{-1} \sqbrk Y \iff \RR \sqbrk X \subseteq Y$ In the language of direct image mappings, this can be written: :$X \subseteq \map {\RR^\gets} Y \iff \map {\RR^\to} X \subseteq Y$ \end{theorem} \begin{proof} As $\RR$ is a relation, then so is its inverse $\RR^{-1}$. Let $\RR \sqbrk X \subseteq Y$. Thus: {{begin-eqn}} {{eqn | l = \RR \sqbrk X | o = \subseteq | r = Y | c = }} {{eqn | ll= \leadsto | l = \RR^{-1} \sqbrk {\RR \sqbrk X} | o = \subseteq | r = \RR^{-1} \sqbrk Y | c = Image of Subset under Relation is Subset of Image: Corollary 1 }} {{eqn | ll= \leadsto | l = X | o = \subseteq | r = \RR^{-1} \sqbrk Y | c = Image of Preimage under Relation is Subset, Subset Relation is Transitive }} {{end-eqn}} So: :$\RR \sqbrk X \subseteq Y \implies X \subseteq \RR^{-1} \sqbrk Y$ {{qed|lemma}} Now let $X \subseteq \RR^{-1} \sqbrk Y$. The same argument applies: {{begin-eqn}} {{eqn | l = X | o = \subseteq | r = \RR^{-1} \sqbrk Y | c = }} {{eqn | ll= \leadsto | l = \RR \sqbrk X | o = \subseteq | r = \RR \sqbrk {\RR^{-1} \sqbrk Y} | c = Image of Subset under Relation is Subset of Image }} {{eqn | ll= \leadsto | l = \RR \sqbrk X | o = \subseteq | r = Y | c = Image of Preimage under Relation is Subset, Subset Relation is Transitive }} {{end-eqn}} So: :$X \subseteq \RR^{-1} \sqbrk Y \implies \RR \sqbrk X \subseteq Y$ {{qed|lemma}} Thus we have: :$X \subseteq \RR^{-1} \sqbrk Y \implies \RR \sqbrk X \subseteq Y$ :$\RR \sqbrk X \subseteq Y \implies X \subseteq \RR^{-1} \sqbrk Y$ Hence the result. {{qed}} Category:Subsets Category:Preimages under Relations \end{proof}
22032
\section{Subset of Real Numbers is Interval iff Connected} Tags: Connected Spaces, Analysis, Real Intervals, Topology, Connectedness \begin{theorem} Let the real number line $\R$ be considered as a topological space. Let $S$ be a subspace of $\R$. Then $S$ is connected {{iff}} $S$ is an interval of $\R$. That is, the only subspaces of $\R$ that are connected are intervals. \end{theorem} \begin{proof} From Rule of Transposition, we may replace the ''only if'' statement by its contrapositive. Therefore, the following suffices: \end{proof}
22033
\section{Subset of Satisfiable Set is Satisfiable} Tags: Formal Semantics \begin{theorem} Let $\LL$ be a logical language. Let $\mathscr M$ be a formal semantics for $\LL$. Let $\FF$ be an $\mathscr M$-satisfiable set of formulas from $\LL$. Let $\FF'$ be a subset of $\FF$. Then $\FF'$ is also $\mathscr M$-satisfiable. \end{theorem} \begin{proof} Since $\FF$ is $\mathscr M$-satisfiable, there exists some model $\MM$ of $\FF$: :$\MM \models_{\mathscr M} \FF$ Thus for every $\psi \in \FF$: :$\MM \models_{\mathscr M} \psi$ Now, for every $\psi$ in $\FF'$: :$\psi \in \FF$ by definition of subset. Hence: :$\forall \psi \in \FF': \MM \models_{\mathscr M} \psi$ that is, $\MM$ is a model of $\FF'$. Hence $\FF'$ is $\mathscr M$-satisfiable. {{qed}} Category:Formal Semantics \end{proof}
22034
\section{Subset of Set Difference iff Disjoint Set} Tags: Disjoint Sets, Set Difference \begin{theorem} Let $S, T$ be sets. Let $A \subseteq S$ Then: :$A \cap T = \O \iff A \subseteq S \setminus T$ where: :$A \cap T$ denotes set intersection :$\O$ denotes the empty set :$S \setminus T$ denotes set difference. \end{theorem} \begin{proof} We have: {{begin-eqn}} {{eqn | l = A \cap \paren {S \setminus T} | r = \paren {A \cap S} \setminus T | c = Intersection with Set Difference is Set Difference with Intersection }} {{eqn | r = A \setminus T | c = Intersection with Subset is Subset }} {{end-eqn}} Then: {{begin-eqn}} {{eqn | l = A | o = \subseteq | r = S \setminus T | c = }} {{eqn | ll= \leadstoandfrom | l = A | r = A \cap \paren {S \setminus T} | c = Intersection with Subset is Subset }} {{eqn | ll= \leadstoandfrom | l = A | r = A \setminus T | c = As $A \cap \paren {S \setminus T} = A \setminus T$ }} {{eqn | ll = \leadstoandfrom | l = A \cap T | r = \O | c = Set Difference with Disjoint Set }} {{end-eqn}} {{qed}} Category:Set Difference Category:Disjoint Sets \end{proof}
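For instance, let $S = \set {1, 2, 3, 4}$ and $T = \set {3, 4}$, so that $S \setminus T = \set {1, 2}$. The subset $A = \set {1, 2}$ satisfies both $A \cap T = \O$ and $A \subseteq S \setminus T$, while the subset $A = \set {2, 3}$ satisfies neither.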
22035
\section{Subset of Set is Coarser than Set} Tags: Preorder Theory \begin{theorem} Let $\left({S, \preceq}\right)$ be a preordered set. Let $A, B$ be subsets of $S$ such that: :$A \subseteq B$ Then $A$ is coarser than $B$. \end{theorem} \begin{proof} Let $x \in A$. By definition of subset: :$x \in B$ By definition of reflexivity: :$x \preceq x$ Thus: :$\exists y \in B: y \preceq x$ Hence $A$ is coarser than $B$ by definition. {{qed}} \end{proof}
22036
\section{Subset of Standard Discrete Metric Space is Neighborhood of Each Point} Tags: Discrete Metrics, Neighborhoods \begin{theorem} Let $M = \struct {A, d}$ be a metric space where $d$ is the standard discrete metric. Let $S \subseteq A$. Let $a \in S$. Then $S$ is a neighborhood of $a$. That is, every subset of $A$ is a neighborhood of each of its points. \end{theorem} \begin{proof} Let $S \subseteq A$. Let $a \in S$. From Neighborhoods in Standard Discrete Metric Space, $\set a$ is a neighborhood of $a$. As $a \in S$ it follows from Singleton of Element is Subset that $\set a \subseteq S$. The result follows from Superset of Neighborhood in Metric Space is Neighborhood. {{qed}} \end{proof}
22037
\section{Subset of Standard Discrete Metric Space is Open} Tags: Discrete Metrics, Metric Spaces \begin{theorem} Let $M = \struct {A, d}$ be a standard discrete metric space. Let $S \subseteq A$ be a subset of $A$. Then $S$ is an open set of $M$. \end{theorem} \begin{proof} From the definition of standard discrete metric: :$\forall x, y \in A: \map d {x, y} = \begin {cases} 0 & : x = y \\ 1 & : x \ne y \end {cases}$ Let $\epsilon \in \R_{>0}$ be such that $0 < \epsilon \le 1$. Let $x \in S$. Let $\map {B_\epsilon} x$ be the open $\epsilon$-ball of $x$. Then by definition of $\epsilon$ and $d$: :$\map {B_\epsilon} x = \set x$ Thus: :$\forall x \in S: \map {B_\epsilon} x \subseteq S$ Hence the result by definition of open set. {{qed}} \end{proof}
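For instance, let $A = \set {1, 2, 3}$ with the standard discrete metric and let $S = \set {1, 2}$. With $\epsilon = \dfrac 1 2$ we have $\map {B_\epsilon} 1 = \set 1 \subseteq S$ and $\map {B_\epsilon} 2 = \set 2 \subseteq S$, so $S$ is open.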
22038
\section{Subset of Subset Product} Tags: Subset Products, Abstract Algebra \begin{theorem} Let $\struct {S, \circ}$ be a magma. Let $\powerset S$ be the power set of $S$. Let $X, Y, Z \in \powerset S$. Then: :$X \subseteq Y \implies \paren {X \circ Z} \subseteq \paren {Y \circ Z}$ :$X \subseteq Y \implies \paren {Z \circ X} \subseteq \paren {Z \circ Y}$ where $X \circ Z$ etc. denotes subset product. \end{theorem} \begin{proof} Let $x \in X, z \in Z$. Then: :$x \circ z \in X \circ Z$ and $z \circ x \in Z \circ X$ Now: :$Y \circ Z = \set {y \circ z: y \in Y, z \in Z}$ :$Z \circ Y = \set {z \circ y: y \in Y, z \in Z}$ But by the definition of a subset: :$x \in X \implies x \in Y$ Thus: :$x \circ z \in Y \circ Z$ and $z \circ x \in Z \circ Y$ and the result follows. {{Qed}} \end{proof}
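As a concrete illustration (the magma and subsets chosen here are purely for the example), take $\struct {\Z, +}$, so that $\circ$ is $+$, with $X = \set 1$, $Y = \set {1, 2}$ and $Z = \set {10}$. Then $X \subseteq Y$ and: :$X \circ Z = \set {11} \subseteq \set {11, 12} = Y \circ Z$ :$Z \circ X = \set {11} \subseteq \set {11, 12} = Z \circ Y$ in accordance with the theorem.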
22039
\section{Subset of Toset is Toset} Tags: Total Orderings \begin{theorem} Let $\left({S, \preceq}\right)$ be a totally ordered set. Let $T \subseteq S$. Then $\left({T, \preceq \restriction_T}\right)$ is also a totally ordered set. In the above, $\preceq \restriction_T$ denotes the restriction of $\preceq$ to $T$. \end{theorem} \begin{proof} As $\left({S, \preceq}\right)$ is a totally ordered set, the relation $\preceq$ is a total ordering, and is by definition: * reflexive * antisymmetric * transitive * connected From Properties of Restriction of Relation, a restriction of a relation which has all those properties inherits them all. Thus $\preceq \restriction_T$ is also: * reflexive * antisymmetric * transitive * connected and so is also a total ordering. Hence the result, by definition of totally ordered set. {{Qed}} \end{proof}
22040
\section{Subset of Well-Founded Relation is Well-Founded} Tags: Well-Founded Relations \begin{theorem} Let $\struct {S, \RR}$ be a relational structure. Let $\RR$ be a well-founded relation on $S$. Let $\QQ$ be a subset of $\RR$. Then $\QQ$ is also a well-founded relation on $S$. \end{theorem} \begin{proof} {{AimForCont}} $\QQ$ is not a well-founded relation on $S$. By Infinite Sequence Property of Well-Founded Relation there exists an infinite sequence $\sequence {x_n}$ in $S$ such that: :$\forall n \in \N: \tuple {x_{n + 1}, x_n} \in \QQ \text { and } x_{n + 1} \ne x_n$ But because $\QQ \subseteq \RR$: :$\forall n \in \N: \tuple {x_{n + 1}, x_n} \in \QQ \implies \tuple {x_{n + 1}, x_n} \in \RR$ Hence: :$\forall n \in \N: \tuple {x_{n + 1}, x_n} \in \RR \text { and } x_{n + 1} \ne x_n$ Hence by Infinite Sequence Property of Well-Founded Relation it follows that $\RR$ is not a well-founded relation on $S$. The result follows by Proof by Contradiction. {{Qed}} \end{proof}
22041
\section{Subset of Well-Ordered Set is Well-Ordered} Tags: Well-Orderings, Subset of Well-Ordered Set is Well-Ordered, Orderings \begin{theorem} Let $\struct {S, \preceq}$ be a well-ordered set. Let $T \subseteq S$. Let $\preceq'$ be the restriction of $\preceq$ to $T$. Then the relational structure $\struct {T, \preceq'}$ is a well-ordered set. \end{theorem} \begin{proof} Let $\struct {S, \preceq}$ be a well-ordered set. Let $T \subseteq S$. Let $X \subseteq T$ be non-empty. By Subset Relation is Transitive, $X \subseteq S$. By the definition of a well-ordered set, $X$ has a smallest element under $\preceq$. As $X \subseteq T$, this is also the smallest element of $X$ under the restriction $\preceq'$. Thus every non-empty subset of $T$ has a smallest element under $\preceq'$. It follows by definition that $\struct {T, \preceq'}$ is well-ordered. Hence the result. {{Qed}} \end{proof}
22042
\section{Subsets in Increasing Union} Tags: Subsets, Set Union, Unions, Union, Subset \begin{theorem} Let $S_0, S_1, S_2, \ldots, S_i, \ldots$ be a nested sequence of sets, that is: :$S_0 \subseteq S_1 \subseteq S_2 \subseteq \ldots \subseteq S_i \subseteq \ldots$ Let $S$ be the increasing union of $S_0, S_1, S_2, \ldots, S_i, \ldots$: :$\ds S = \bigcup_{i \mathop \in \N} S_i$ Then: :$\forall s \in S: \exists k \in \N: \forall j \ge k: s \in S_j$ \end{theorem} \begin{proof} Let $s \in S$. By definition of set union: :$\exists k \in \N: s \in S_k$ Let $j \ge k$. Then by as many applications as necessary of Subset Relation is Transitive: :$S_k \subseteq S_j$ Hence by definition of subset: :$s \in S_j$ As $s \in S$ and $j \ge k$ were arbitrary, it follows that: :$\forall s \in S: \exists k \in \N: \forall j \ge k: s \in S_j$ {{qed}} \end{proof}
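As a concrete illustration (the particular sequence here is assumed only for the example), let $S_i = \set {0, 1, \ldots, i}$ for each $i \in \N$, so that $S = \N$. For $s = 5 \in S$ one may take $k = 5$: then $s \in S_j$ for every $j \ge 5$, as the theorem asserts.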
22043
\section{Subsets of Disjoint Sets are Disjoint} Tags: Disjoint Sets, Subsets, Subset \begin{theorem} Let $S$ and $T$ be disjoint sets. Let $S' \subseteq S$ and $T' \subseteq T$. Then $S'$ and $T'$ are disjoint. \end{theorem} \begin{proof} Let $S \cap T = \O$. Let $S' \subseteq S$ and $T' \subseteq T$. {{AimForCont}} $S' \cap T' \ne \O$. Then: {{begin-eqn}} {{eqn | l = \exists x | o = \in | r = S' \cap T' | c = }} {{eqn | ll= \leadsto | l = x | o = \in | r = S' | c = {{Defof|Set Intersection}} }} {{eqn | lo= \land | l = x | o = \in | r = T' | c = }} {{eqn | ll= \leadsto | l = x | o = \in | r = S | c = {{Defof|Subset}} }} {{eqn | lo= \land | l = x | o = \in | r = T | c = }} {{eqn | ll= \leadsto | l = x | o = \in | r = S \cap T | c = {{Defof|Set Intersection}} }} {{eqn | ll= \leadsto | l = S \cap T | o = \ne | r = \O | c = {{Defof|Set Intersection}} }} {{end-eqn}} From this contradiction: :$S' \cap T' = \O$ Hence the result by definition of disjoint sets. {{qed}} \end{proof}
22044
\section{Subsets of Equidecomposable Subsets are Equidecomposable} Tags: Topology \begin{theorem} Let $A, B \subseteq \R^n$ be equidecomposable. Let $S \subseteq A$. Then there exists $T \subseteq B$ such that $S$ and $T$ are equidecomposable. \end{theorem} \begin{proof} Let $X_1, \dots, X_m$ be a decomposition of $A, B$ together with isometries $\mu_1, \ldots, \mu_m, \nu_1, \ldots, \nu_m: \R^n \to \R^n$ such that: :$\ds A = \bigcup_{i \mathop = 1}^m \map {\mu_i} {X_i}$ and :$\ds B = \bigcup_{i \mathop = 1}^m \map {\nu_i} {X_i}$ Define: :$Y_i = \mu_i^{-1} \paren {S \cap \map {\mu_i} {X_i} }$ Then: {{begin-eqn}} {{eqn | l = \bigcup_{i \mathop = 1}^m \map {\mu_i} {Y_i} | r = \bigcup_{i \mathop = 1}^m \paren {S \cap \map {\mu_i} {X_i} } | c = }} {{eqn | r = S \cap \bigcup_{i \mathop = 1}^m \map {\mu_i} {X_i} | c = }} {{eqn | r = S \cap A | c = }} {{eqn | r = S | c = }} {{end-eqn}} and so $\sequence {Y_i}_{i \mathop = 1}^m$ forms a decomposition of $S$. But for each $i$: :$\paren {S \cap \map {\mu_i} {X_i} } \subseteq \map {\mu_i} {X_i}$ and so: {{begin-eqn}} {{eqn | l = Y_i | r = \map {\mu_i^{-1} } {S \cap \map {\mu_i} {X_i} } | c = }} {{eqn | o = \subseteq | r = \map {\mu_i^{-1} } {\map {\mu_i} {X_i} } | c = }} {{eqn | r = X_i | c = }} {{end-eqn}} Hence: :$\map {\nu_i} {Y_i} \subseteq \map {\nu_i} {X_i}$ and so: {{begin-eqn}} {{eqn | l = \bigcup_{i \mathop = 1}^m \map {\nu_i} {Y_i} | o = \subseteq | r = \bigcup_{i \mathop = 1}^m \map {\nu_i} {X_i} | c = }} {{eqn | r = B | c = }} {{end-eqn}} Define: :$\ds \bigcup_{i \mathop = 1}^m \map {\nu_i} {Y_i} = T$ Hence the result. {{qed}} Category:Topology \end{proof}
22045
\section{Subspace Topology is Initial Topology with respect to Inclusion Mapping} Tags: Inclusion Mappings, Initial Topology, Topological Subspaces \begin{theorem} Let $\struct {X, \tau}$ be a topological space. Let $Y$ be a non-empty subset of $X$. Let $\iota: Y \to X$ be the inclusion mapping. Let $\tau_Y$ be the initial topology on $Y$ with respect to $\iota$. Then $\struct {Y, \tau_Y}$ is a topological subspace of $\struct {X, \tau}$. That is: :$\tau_Y = \set {U \cap Y: U \in \tau}$ \end{theorem} \begin{proof} By Initial Topology with respect to Mapping equals Set of Preimages, it follows that: :$\tau_Y = \set {\iota^{-1} \sqbrk U: U \in \tau}$ From Preimage of Subset under Inclusion Mapping, we have: :$\forall S \subseteq X: \iota^{-1} \sqbrk S = S \cap Y$ Hence the result. {{qed}} Category:Topological Subspaces Category:Inclusion Mappings Category:Initial Topology \end{proof}
22046
\section{Subspace of Complete Metric Space is Closed iff Complete} Tags: Metric Subspaces, Complete Metric Spaces, Metric Spaces \begin{theorem} Let $\struct {M, d}$ be a complete metric space. Let $\struct {S, d}$ be a subspace of $\struct {M, d}$. Then $S$ is closed {{iff}} $S$ is complete. \end{theorem} \begin{proof} This will be proved by demonstrating the contrapositive: :$S$ is not complete {{iff}} $S$ is not closed. \end{proof}
22047
\section{Subspace of Either-Or Space less Zero is not Lindelöf} Tags: Either-Or Topology, Lindelöf Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be the either-or space. Let $H = S \setminus \set 0$ be the set $S$ without zero. Then the topological subspace $T_H = \struct {H, \tau_H}$ is not a Lindelöf space. \end{theorem} \begin{proof} By definition of topological subspace, $U \subseteq H$ is open in $T_H$ {{iff}}: :$(1): \quad \set 0 \nsubseteq U$ or: :$(2): \quad \openint {-1} 1 \subseteq U$ But for all $U \subseteq H$, condition $(1)$ holds as $0 \notin H$. So $T_H$ is by definition a discrete space. As $T_H$ is uncountable, it follows from Uncountable Discrete Space is not Lindelöf that $T_H$ is not a Lindelöf space. Hence the result. {{qed}} \end{proof}
22048
\section{Subspace of Finite Complement Topology is Compact} Tags: Compact Spaces, Finite Complement Topology, Compactness \begin{theorem} Let $T = \struct {S, \tau}$ be a finite complement topology on an infinite set $S$. Then every topological subspace of $T$, including $T$ itself, is a compact space. \end{theorem} \begin{proof} Let $T_H = \struct {H, \tau_H}$ be a subspace of $T$. Let $\CC$ be an open cover of $T_H$. Let $U \in \CC$ be any non-empty set in $\CC$. By definition of the finite complement topology and of the subspace topology, $U$ covers all but a finite number of points of $T_H$. So for each of those finitely many points we pick an element of $\CC$ which covers it. Together with $U$, this gives a finite subcover of $T_H$. So by definition $T_H$ is a compact space. {{qed}} \end{proof}
22049
\section{Subspace of Metric Space is Metric Space} Tags: Metric Subspaces, Metric Spaces \begin{theorem} Let $M = \struct {A, d}$ be a metric space. Let $H \subseteq A$. Let $d_H: H \times H \to \R$ be the restriction $d \restriction_{H \times H}$ of $d$ to $H$. Let $\struct {H, d_H}$ be a metric subspace of $\struct {A, d}$. Then $d_H$ is a metric on $H$. \end{theorem} \begin{proof} By definition of restriction: :$\forall x, y \in H: \map {d_H} {x, y} = \map d {x, y}$ As $d$ is a metric, the metric space axioms are all fulfilled by all $x, y \in A$ under $d$. As $H \subseteq A$, by definition of subset, all $x, y \in H$ are also elements of $A$. Therefore the metric space axioms are all fulfilled by all $x, y \in H$ under $d_H$. {{qed}} \end{proof}
22050
\section{Subspace of Noetherian Space is Noetherian} Tags: Noetherian Spaces \begin{theorem} Let $X$ be a Noetherian topological space. Let $Y \subseteq X$ be a subspace. Then $Y$ is Noetherian. \end{theorem} \begin{proof} Let $Y_1 \subseteq Y_2 \subseteq \ldots$ be an ascending chain of open sets in $Y$. {{explain|If we are going to use the term "ascending chain", then we need to have a page that defines exactly that.}} By definition of subspace topology, there exist open sets $X_1, X_2, \ldots$ of $X$ such that: :$X_i \cap Y = Y_i$ for all $i$. For each $i$, let $Z_i := \ds \bigcup_{j \mathop = 1}^i X_j$. Each $Z_i$ is open in $X$, being a finite union of open sets. Since the $Y_j$ form an ascending chain: :$Z_i \cap Y = \ds \bigcup_{j \mathop = 1}^i \paren {X_j \cap Y} = \ds \bigcup_{j \mathop = 1}^i Y_j = Y_i$ and: :$Z_1 \subseteq Z_2 \subseteq \ldots$ Since $X$ is Noetherian, every ascending chain of open sets is eventually constant. Hence the chain of the $Z_i$ is eventually constant. Then $Y_i = Z_i \cap Y$ is eventually constant. Hence $Y$ is also Noetherian. {{qed}} \end{proof}
22051
\section{Subspace of Product Space is Homeomorphic to Factor Space} Tags: Homeomorphisms, Subspace of Product Space Homeomorphic to Factor Space, Topological Subspaces, Subspace of Product Space is Homeomorphic to Factor Space, Product Spaces \begin{theorem} Let $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$ be a family of topological spaces where $I$ is an arbitrary index set. Let $\ds \struct {X, \tau} = \prod_{i \mathop \in I} \struct {X_i, \tau_i}$ be the product space of $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$. Suppose that $X$ is non-empty. Then for each $i \in I$ there is a subspace $Y_i \subseteq X$ which is homeomorphic to $\struct {X_i, \tau_i}$. Specifically, for any $z \in X$, let: :$Y_i = \set {x \in X: \forall j \in I \setminus \set i: x_j = z_j}$ and let $\upsilon_i$ be the subspace topology of $Y_i$ relative to $\tau$. Then $\struct {Y_i, \upsilon_i}$ is homeomorphic to $\struct {X_i, \tau_i}$, where the homeomorphism is the restriction of the projection $\pr_i$ to $Y_i$. \end{theorem} \begin{proof} Take any factor space $X_i$, and $z_j \in X_j$ where $j \ne i$. Then $\displaystyle Y_i = X_i \times \prod_{i\ne j\in I} \{z_j\} \subseteq X$ is a subspace of $(X,\tau)$. Even more, from: * Projection from Product Topology is Continuous * Projection from Product Topology is Open we have that $\operatorname{pr}_i \restriction_{Y_i}: Y_i\to X_i$ is a homeomorphism because it is open, continuous and bijective. {{qed}} Category:Product Spaces Category:Topological Subspaces Category:Homeomorphisms \end{proof}
22052
\section{Subspace of Product Space is Homeomorphic to Factor Space/Product with Singleton} Tags: Subspace of Product Space Homeomorphic to Factor Space, Product Spaces, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $T_1$ and $T_2$ be non-empty topological spaces. Let $b \in T_2$. Let $T_1 \times T_2$ be the product space of $T_1$ and $T_2$. Let $T_2 \times T_1$ be the product space of $T_2$ and $T_1$. Then: :$T_1$ is homeomorphic to the subspace $T_1 \times \set b$ of $T_1 \times T_2$ :$T_1$ is homeomorphic to the subspace $\set b \times T_1$ of $T_2 \times T_1$ \end{theorem} \begin{proof} The conclusions are symmetrical. {{WLOG}}, therefore, it will be shown that $T_1$ is homeomorphic to the subspace $T_1 \times \left\{{b}\right\}$ of $T_1 \times T_2$. Let $f: T_1 \to T_1 \times \set{b}$ be defined as: :$\map f x = \tuple {x, b}$ \end{proof}
22053
\section{Subspace of Product Space is Homeomorphic to Factor Space/Proof 1/Lemma 1} Tags: Subspace of Product Space Homeomorphic to Factor Space, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $\family {X_i}_{i \mathop \in I}$ be a family of sets where $I$ is an arbitrary index set. Let $\ds X = \prod_{i \mathop \in I} X_i$ be the Cartesian product of $\family {X_i}_{i \mathop \in I}$. Let $z \in X$. Let $i \in I$. Let $Y_i = \set {x \in X: \forall j \in I \setminus \set i: x_j = z_j}$. For all $j \in I$ let: ::$Z_j = \begin{cases} X_i & i = j \\ \set{z_j} & j \ne i \end{cases}$ Then: :$Y_i = \prod_{j \mathop \in I} Z_j$ \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | r = x \in Y_i | o = }} {{eqn | ll = \leadstoandfrom | q = \forall j \in I | l = x_j | r = \begin {cases} z_j & j \ne i \\ x_i \in X_i & i = j \end {cases} | c = Definition of $Y_i$ }} {{eqn | ll = \leadstoandfrom | q = \forall j \in I | l = x_j | o = \in | r = Z_j | c = Definition of $Z_j$ for all $j \in I$ }} {{eqn | ll = \leadstoandfrom | l = x | o = \in | r = \prod_{j \mathop \in I} Z_j | c = {{Defof|Cartesian Product}} }} {{end-eqn}} The result follows by definition of set equality. {{qed}} Category:Subspace of Product Space is Homeomorphic to Factor Space \end{proof}
22054
\section{Subspace of Product Space is Homeomorphic to Factor Space/Proof 1/Lemma 2} Tags: Subspace of Product Space Homeomorphic to Factor Space, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $\family {X_i}_{i \mathop \in I}$ be a family of sets where $I$ is an arbitrary index set. Let $\ds X = \prod_{i \mathop \in I} X_i$ be the Cartesian product of $\family {X_i}_{i \mathop \in I}$. Let $z \in X$. Let $i \in I$. Let $\pr_i : X \to X_i$ be the $i$th-projection from $X$. For all $j \in I$ let: :$Z_j = \begin{cases} X_i & i = j \\ \set{z_j} & j \ne i \end{cases}$ Let $Y_i = \prod_{j \mathop \in I} Z_j$. Let $p_i : Y_i \to X_i$ be the $i$th-projection from $Y_i$. Then: :$\pr_i {\restriction_{Y_i} } = p_i$ \end{theorem} \begin{proof} For all $y \in Y_i$: {{begin-eqn}} {{eqn | l = \map {\pr_i {\restriction_{Y_i} } } y | r = \map {\pr_i} y | c = {{Defof|Restriction of Mapping}}: $\pr_i {\restriction_{Y_i} } : Y_i \to X_i$ }} {{eqn | r = y_i | c = {{Defof|Projection}}: $\pr_i: X \to X_i$ }} {{eqn | r = \map {p_i} y | c = {{Defof|Projection}}: $p_i: Y_i \to X_i$ }} {{end-eqn}} By Equality of Mappings: :$\pr_i {\restriction_{Y_i} } = p_i$ {{qed}} Category:Subspace of Product Space is Homeomorphic to Factor Space \end{proof}
22055
\section{Subspace of Product Space is Homeomorphic to Factor Space/Proof 2/Continuous Mapping} Tags: Subspace of Product Space Homeomorphic to Factor Space, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$ be a family of topological spaces where $I$ is an arbitrary index set. Let $\ds \struct {X, \tau} = \prod_{i \mathop \in I} \struct {X_i, \tau_i}$ be the product space of $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$. Let $z \in X$. Let $i \in I$. Let $Y_i = \set {x \in X: \forall j \in I \setminus \set i: x_j = z_j}$. Let $\upsilon_i$ be the subspace topology of $Y_i$ relative to $\tau$. Let $p_i = \pr_i {\restriction_{Y_i}}$, where $\pr_i$ is the projection from $X$ to $X_i$. Then: :$p_i$ is continuous. \end{theorem} \begin{proof} Let $V \in \tau_i$. Let $\ds U = \prod_{j \mathop \in I} U_j$ where: :$U_j = \begin{cases} X_j & j \ne i \\ V & j = i \end{cases}$ From Natural Basis of Product Topology, $U$ is an element of the natural basis. By definition of the product topology $\tau$ on the product space $\struct {X, \tau}$, the natural basis is a basis for the product topology. It follows that: :$U$ is open in $\struct {X, \tau}$ Let $x \in Y_i$. Now: {{begin-eqn}} {{eqn | l = x | o = \in | r = \map {p_i^\gets} V }} {{eqn | ll= \leadstoandfrom | l = \map {p_i} x | o = \in | r = V | c = {{Defof|Inverse Image Mapping of Mapping}}: $p_i^\gets$ }} {{eqn | ll= \leadstoandfrom | l = \map {\pr_i} x | o = \in | r = V | c = {{Defof|Restriction of Mapping}} $p_i$ }} {{eqn | ll= \leadstoandfrom | l = x | o = \in | r = U | c = Definition of $U$ }} {{eqn | ll= \leadstoandfrom | l = x | o = \in | r = U \cap Y_i | c = as $x \in Y_i$ }} {{end-eqn}} By set equality: :$\map {p_i^\gets} V = U \cap Y_i$ By definition of the subspace topology on $Y_i$: :$\map {p_i^\gets} V \in \upsilon_i$ It follows that $p_i$ is continuous by definition. {{qed}} Category:Subspace of Product Space is Homeomorphic to Factor Space \end{proof}
22056
\section{Subspace of Product Space is Homeomorphic to Factor Space/Proof 2/Injection} Tags: Subspace of Product Space Homeomorphic to Factor Space, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $\family {X_i}_{i \mathop \in I}$ be a family of sets where $I$ is an arbitrary index set. Let $\ds X = \prod_{i \mathop \in I} X_i$ be the Cartesian product of $\family {X_i}_{i \mathop \in I}$. Let $z \in X$. Let $i \in I$. Let $Y_i = \set {x \in X: \forall j \in I \setminus \set i: x_j = z_j}$. Let $p_i = \pr_i {\restriction_{Y_i}}$, where $\pr_i$ is the projection from $X$ to $X_i$. Then: :$p_i$ is an injection. \end{theorem} \begin{proof} Note that by the definitions of restriction and projection: :$\forall y \in Y_i: \map {p_i} y = y_i$ Let $x, y \in Y_i$. Then for all $j \in I \setminus \set i$: :$x_j = z_j = y_j$ Let $\map {p_i} x = \map {p_i} y$. Then: :$x_i = y_i$ Thus: :$x = y$ It follows that $p_i$ is an injection by definition. {{qed}} Category:Subspace of Product Space is Homeomorphic to Factor Space \end{proof}
22057
\section{Subspace of Product Space is Homeomorphic to Factor Space/Proof 2/Lemma 1} Tags: Subspace of Product Space Homeomorphic to Factor Space, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$ be a family of topological spaces where $I$ is an arbitrary index set. Let $\ds \struct {X, \tau} = \prod_{i \mathop \in I} \struct {X_i, \tau_i}$ be the product space of $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$. For all $k \in I$, let $\pr_k$ denote the projection from $X$ to $X_k$. Let $z \in X$. Let $i, k \in I$. Let $Y_i = \set {x \in X: \forall j \in I \setminus \set i: x_j = z_j}$. Let $p_i = \pr_i {\restriction_{Y_i} }$ be the restriction of $\pr_i$ to $Y_i$. Let $V_k \in \tau_k$. Let $\map {\pr_k^\gets } {V_k} \cap Y_i \ne \O$. Then: :$\map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i}$ is open in $\struct{X_i, \tau_i}$ \end{theorem} \begin{proof} We have that $p_i$ is a bijection from the lemmas: :$p_i$ is an injection :$p_i$ is a surjection Let $x \in X_i$. Then: {{begin-eqn}} {{eqn | l = x | o = \in | r = \map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i} }} {{eqn | ll= \leadstoandfrom | l = \map {p_i^{-1} } x | o = \in | r = \map {\pr_k^\gets} {V_k} \cap Y_i | c = {{Defof|Direct Image Mapping of Mapping}} }} {{eqn | ll= \leadstoandfrom | l = \map {p_i^{-1} } x | o = \in | r = \map {\pr_k^\gets} {V_k} | c = as $\map {p_i^{-1} } x \in Y_i$ }} {{eqn | ll= \leadstoandfrom | l = \map {\pr_k} {\map {p_i^{-1} } x} | o = \in | r = V_k | c = {{Defof|Direct Image Mapping of Mapping}} }} {{end-eqn}} By definition of $p_i$: :$\map {p_i^{-1} } x = y$ where: :$\forall j \in I : y_j = \begin {cases} z_j & j \ne i \\ x & j = i \end {cases}$ Hence: :$\map {\pr_k} {\map {p_i^{-1} } x} = \begin {cases} z_k & k \ne i \\ x & k = i \end {cases}$ Suppose $k = i$. Then: :$x \in \map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i} \iff x \in V_i$ so that: :$\map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i} = V_i \in \tau_i$ Suppose $k \ne i$. By hypothesis $\map {\pr_k^\gets} {V_k} \cap Y_i \ne \O$, so there exists $w \in \map {\pr_k^\gets} {V_k} \cap Y_i$. By definition of $Y_i$: :$w_k = z_k$ and by definition of $\pr_k^\gets$: :$w_k \in V_k$ so that $z_k \in V_k$. Hence for all $x \in X_i$: :$x \in \map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i}$ so that: :$\map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i} = X_i \in \tau_i$ In either case $\map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i}$ is open in $\struct {X_i, \tau_i}$. {{qed}} Category:Subspace of Product Space is Homeomorphic to Factor Space \end{proof}
22058
\section{Subspace of Product Space is Homeomorphic to Factor Space/Proof 2/Open Mapping} Tags: Subspace of Product Space Homeomorphic to Factor Space, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$ be a family of topological spaces where $I$ is an arbitrary index set. Let $\ds \struct {X, \tau} = \prod_{i \mathop \in I} \struct {X_i, \tau_i}$ be the product space of $\family {\struct {X_i, \tau_i} }_{i \mathop \in I}$. Let $z \in X$. Let $i \in I$. Let $Y_i = \set {x \in X: \forall j \in I \setminus \set i: x_j = z_j}$. Let $\upsilon_i$ be the subspace topology of $Y_i$ relative to $\tau$. Let $p_i = \pr_i {\restriction_{Y_i} }$, where $\pr_i$ is the projection from $X$ to $X_i$. Then: :$p_i$ is an open mapping. \end{theorem} \begin{proof} Let $U \in \upsilon_i$. Let $x \in \map {p_i^\to} U$. Then by definition of the direct image mapping: :$\exists y \in U : x = \map {p_i} y$ By the definition of the subspace topology: :$\exists U' \in \tau: U = U' \cap Y_i$ For all $k \in I$ let $\pr_k$ denote the projection from $X$ to $X_k$. By definition of the natural basis of the product topology $\tau$: :there exists a finite subset $J$ of $I$ and: :for each $k \in J$, there exists a $V_k \in \tau_k$ such that: :$\ds y \in \bigcap_{k \mathop \in J} \map {\pr_k^\gets} {V_k} \subseteq U'$ Then: :$\ds y \in \paren {\bigcap_{k \mathop \in J} \map{\pr_k^\gets} {V_k} } \cap Y_i \subseteq U' \cap Y_i = U$ By definition of direct image mapping: :$\ds x = \map {p_i} y \in \map {p_i^\to} {\paren {\bigcap_{k \mathop \in J} \map {\pr_k^\gets} {V_k} } \cap Y_i} \subseteq \map {p_i^\to} U$ Recall that $p_i$ is an injection. Then: {{begin-eqn}} {{eqn | l = \map {p_i^\to} {\paren {\bigcap_{k \mathop \in J} \map {\pr_k^\gets} {V_k} } \cap Y_i} | r = \map {p_i^\to} {\bigcap_{k \mathop \in J} \paren {\map {\pr_k^\gets} {V_k} \cap Y_i} } | c = Intersection Distributes over Intersection }} {{eqn | r = \bigcap_{k \mathop \in J} \map {p_i^\to} {\map {\pr_k^\gets } {V_k} \cap Y_i} | c = Image of Intersection under Injection }} {{end-eqn}} Let $k \in J$. From: :$\ds y \in \paren {\bigcap_{k \mathop \in J} \map {\pr_k^\gets} {V_k} } \cap Y_i$ it follows that: :$\map {\pr_k^\gets} {V_k} \cap Y_i \ne \O$ Hence from Lemma 1: :$\map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i}$ is open in $\struct {X_i, \tau_i}$ As $J$ is finite, it follows by definition of topology that: :$\ds \bigcap_{k \mathop \in J} \map {p_i^\to} {\map {\pr_k^\gets} {V_k} \cap Y_i}$ is open in $\struct {X_i, \tau_i}$ Thus $x$ is contained in an open set of $\struct {X_i, \tau_i}$ which is a subset of $\map {p_i^\to} U$. As $x$ was arbitrary, it follows that $\map {p_i^\to} U$ is open in $\struct {X_i, \tau_i}$. It follows that $p_i$ is an open mapping by definition. {{qed}} Category:Subspace of Product Space is Homeomorphic to Factor Space \end{proof}
22059
\section{Subspace of Product Space is Homeomorphic to Factor Space/Proof 2/Surjection} Tags: Subspace of Product Space Homeomorphic to Factor Space, Subspace of Product Space is Homeomorphic to Factor Space \begin{theorem} Let $\family {X_i}_{i \mathop \in I}$ be a family of sets where $I$ is an arbitrary index set. Let $\ds X = \prod_{i \mathop \in I} X_i$ be the Cartesian product of $\family {X_i}_{i \mathop \in I}$. Let $z \in X$. Let $i \in I$. Let $Y_i = \set {x \in X: \forall j \in I \setminus \set i: x_j = z_j}$. Let $p_i = \pr_i {\restriction_{Y_i} }$, where $\pr_i$ is the projection from $X$ to $X_i$. Then: :$p_i$ is a surjection. \end{theorem} \begin{proof} Note that by the definitions of restriction and projection: :$\forall y \in Y_i: \map {p_i} y = y_i$ Let $x \in X_i$. Let $y \in Y_i$ be defined by: :$\forall j \in I: y_j = \begin{cases} z_j & j \ne i \\ x & j = i \end{cases}$ Then: :$\map {p_i} y = y_i = x$ It follows that $p_i$ is a surjection by definition. {{qed}} Category:Subspace of Product Space is Homeomorphic to Factor Space \end{proof}
22060
\section{Subspace of Real Continuous Functions} Tags: Linear Algebra, Vector Subspaces, Analysis \begin{theorem} Let $\mathbb J = \set {x \in \R: a \le x \le b}$ be a closed interval of the real number line $\R$. Let $\map \CC {\mathbb J}$ be the set of all continuous real functions on $\mathbb J$. Then $\struct {\map \CC {\mathbb J}, +, \times}_\R$ is a subspace of the $\R$-vector space $\struct {\R^{\mathbb J}, +, \times}_\R$. \end{theorem} \begin{proof} By definition, $\map \CC {\mathbb J} \subseteq \R^{\mathbb J}$. Let $f, g \in \map \CC {\mathbb J}$. By Two-Step Vector Subspace Test, it needs to be shown that: :$(1): \quad f + g \in \map \CC {\mathbb J}$ :$(2): \quad \lambda f \in \map \CC {\mathbb J}$ for any $\lambda \in \R$ $(1)$ follows by Sum Rule for Continuous Real Functions. $(2)$ follows by Multiple Rule for Continuous Real Functions. Hence $\struct {\map \CC {\mathbb J}, +, \times}_\R$ is a subspace of the $\R$-vector space $\struct {\R^{\mathbb J}, +, \times}_\R$. {{qed}} \end{proof}
22061
\section{Subspace of Real Differentiable Functions} Tags: Differential Calculus, Differentiation, Vector Subspaces \begin{theorem} Let $\mathbb J$ be an open interval of the real number line $\R$. Let $\map \DD {\mathbb J}$ be the set of all differentiable real functions on $\mathbb J$. Then $\struct {\map \DD {\mathbb J}, +, \times}_\R$ is a subspace of the $\R$-vector space $\struct {\R^{\mathbb J}, +, \times}_\R$. \end{theorem} \begin{proof} Note that by definition, $\map \DD {\mathbb J} \subseteq \R^{\mathbb J}$. Let $f, g \in \map \DD {\mathbb J}$. Let $\lambda \in \R$. From Linear Combination of Derivatives, we have that: :$f + \lambda g$ is differentiable on $\mathbb J$. That is: :$f + \lambda g \in \map \DD {\mathbb J}$ So, by One-Step Vector Subspace Test: :$\struct {\map \DD {\mathbb J}, +, \times}_\R$ is a subspace of $\R^{\mathbb J}$. {{qed}} \end{proof}
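As a concrete illustration (the particular functions are chosen only for the example), take $\mathbb J = \R$, $\map f x = x^2$, $\map g x = \sin x$ and $\lambda = 3$. Both $f$ and $g$ are differentiable on $\R$, and $f + 3 g: x \mapsto x^2 + 3 \sin x$ is differentiable on $\R$ with derivative $x \mapsto 2 x + 3 \cos x$, so $f + 3 g \in \map \DD \R$, in accordance with the closure property used above.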
22062
\section{Subspace of Real Functions of Differentiability Class} Tags: Vector Subspaces, Analysis \begin{theorem} Let $\mathbb J = \set {x \in \R: a < x < b}$ be an open interval of the real number line $\R$. Let $\map {\CC^m} {\mathbb J}$ be the set of all continuous real functions on $\mathbb J$ in differentiability class $m$. Then $\struct {\map {\CC^m} {\mathbb J}, +, \times}_\R$ is a subspace of the $\R$-vector space $\struct {\R^{\mathbb J}, +, \times}_\R$. \end{theorem} \begin{proof} Note that by definition, $\map {\CC^m} {\mathbb J} \subseteq \R^{\mathbb J}$. Let $f, g \in \map {\CC^m} {\mathbb J}$. Let $\lambda \in \R$. Applying Linear Combination of Derivatives $m$ times we have: :$f + \lambda g$ is $m$-times differentiable on $\mathbb J$ with $m$th derivative $f^{\paren m} + \lambda g^{\paren m}$. Since both $f$ and $g$ are of differentiability class $m$: :$f^{\paren m}$ and $g^{\paren m}$ are continuous on $\mathbb J$. From Combined Sum Rule for Continuous Real Functions: :$f^{\paren m} + \lambda g^{\paren m} = \paren {f + \lambda g}^{\paren m}$ is continuous on $\mathbb J$. So: :$f + \lambda g \in \map {\CC^m} {\mathbb J}$ Therefore, by One-Step Vector Subspace Test: :$\struct {\map {\CC^m} {\mathbb J}, +, \times}_\R$ is a subspace of $\struct {\R^{\mathbb J}, +, \times}_\R$. {{qed}} \end{proof}
22063
\section{Subspace of Riemann Integrable Functions} Tags: Vector Subspaces, Analysis \begin{theorem} Let $\mathbb J = \set {x \in \R: a \le x \le b}$ be a closed interval of the real number line $\R$. Let $\map \RR {\mathbb J}$ be the set of all Riemann integrable functions on $\mathbb J$. Then $\struct {\map \RR {\mathbb J}, +, \times}_\R$ is a subspace of the $\R$-vector space $\struct {\R^{\mathbb J}, +, \times}_\R$. \end{theorem} \begin{proof} Note that by definition, $\map \RR {\mathbb J} \subseteq \R^{\mathbb J}$. Let $f, g \in \map \RR {\mathbb J}$. Let $\lambda \in \R$. By Linear Combination of Definite Integrals: :$f + \lambda g$ is Riemann integrable on $\mathbb J$. That is: :$f + \lambda g \in \map \RR {\mathbb J}$ So by One-Step Vector Subspace Test: :$\struct {\map \RR {\mathbb J}, +, \times}_\R$ is a subspace of $\struct {\R^{\mathbb J}, +, \times}_\R$. {{qed}} \end{proof}
22064
\section{Subspace of Subspace is Subspace} Tags: Topological Subspaces \begin{theorem} Let $T = \struct{S, \tau}$ be a topological space. Let $H \subseteq S$ and $\tau_H$ be the subspace topology on $H$. Let $K \subseteq H$. Then the subspace topology on $K$ induced by $\tau$ equals the subspace topology on $K$ induced by $\tau_H$. \end{theorem} \begin{proof} Let $\tau_K$ be the subspace topology on $K$ induced by $\tau$. Let $\tau'_K$ be the subspace topology on $K$ induced by $\tau_H$. Then: {{begin-eqn}} {{eqn | l = V \in \tau'_K | o = \leadstoandfrom | r = \exists U' \in \tau_H : V = U' \cap K | c = {{Defof|Subspace Topology}} $\tau'_K$ }} {{eqn | o = \leadstoandfrom | r = \exists U \in \tau : V = \paren {U \cap H} \cap K | c = {{Defof|Subspace Topology}} $\tau_H$ }} {{eqn | o = \leadstoandfrom | r = \exists U \in \tau : V = U \cap \paren {H \cap K} | c = Intersection is Associative }} {{eqn | o = \leadstoandfrom | r = \exists U \in \tau : V = U \cap K | c = Intersection with Subset is Subset }} {{eqn | o = \leadstoandfrom | r = V \in \tau_K | c = {{Defof|Subspace Topology}} $\tau_K$ }} {{end-eqn}} {{qed}} Category:Topological Subspaces \end{proof}
22065
\section{Subspaces of Dimension 2 Real Vector Space} Tags: Subspaces of Dimension 2 Real Vector Space, Linear Algebra \begin{theorem} Take the $\R$-vector space $\left({\R^2, +, \times}\right)_\R$. Let $S$ be a subspace of $\left({\R^2, +, \times}\right)_\R$. Then $S$ is one of: : $(1): \quad \left({\R^2, +, \times}\right)_\R$ : $(2): \quad \left\{{0}\right\}$ : $(3): \quad$ A line through the origin. \end{theorem} \begin{proof} * Let $S$ be a non-zero subspace of $\left({\R^2, +, \times}\right)_\R$. Then $S$ contains a non-zero vector $\left({\alpha_1, \alpha_2}\right)$. Hence $S$ also contains $\left\{{\lambda \times \left({\alpha_1, \alpha_2}\right), \lambda \in \R}\right\}$. From Equation of a Straight Line, this set may be described as a line through the origin. * Suppose $S$ also contains a non-zero vector $\left({\beta_1, \beta_2}\right)$ which is not on that line. Then $\alpha_1 \times \beta_2 - \alpha_2 \times \beta_1 \ne 0$. Otherwise $\left({\beta_1, \beta_2}\right)$ would be $\zeta \times \left({\alpha_1, \alpha_2}\right)$, where either $\zeta = \beta_1 / \alpha_1$ or $\zeta = \beta_2 / \alpha_2$ according to whether $\alpha_1 \ne 0$ or $\alpha_2 \ne 0$. But then $S = \left({\R^2, +, \times}\right)_\R$. Because, if $\left({\gamma_1, \gamma_2}\right)$ is any vector at all, then: : $\left({\gamma_1, \gamma_2}\right) = \lambda \times \left({\alpha_1, \alpha_2}\right) + \mu \times \left({\beta_1, \beta_2}\right)$ where $\lambda = \dfrac {\gamma_1 \times \beta_2 - \gamma_2 \times \beta_1} {\alpha_1 \times \beta_2 - \alpha_2 \times \beta_1}, \mu = \dfrac {\alpha_1 \times \gamma_2 - \alpha_2 \times \gamma_1} {\alpha_1 \times \beta_2 - \alpha_2 \times \beta_1}$ which we get by solving the simultaneous equations: {{begin-eqn}} {{eqn | l=\alpha_1 \times \lambda + \beta_1 \times \mu | r=\gamma_1 | c= }} {{eqn | l=\alpha_2 \times \lambda + \beta_2 \times \mu | r=\gamma_2 | c= }} {{end-eqn}} The result follows. {{qed}} \end{proof}
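As a numerical check of the expressions for $\lambda$ and $\mu$ (the vectors here are chosen purely for illustration), take $\left({\alpha_1, \alpha_2}\right) = \left({1, 2}\right)$, $\left({\beta_1, \beta_2}\right) = \left({3, 4}\right)$ and $\left({\gamma_1, \gamma_2}\right) = \left({5, 6}\right)$. Then $\alpha_1 \times \beta_2 - \alpha_2 \times \beta_1 = -2$, so that: :$\lambda = \dfrac {5 \times 4 - 6 \times 3} {-2} = -1, \quad \mu = \dfrac {1 \times 6 - 2 \times 5} {-2} = 2$ and indeed: :$\left({-1}\right) \times \left({1, 2}\right) + 2 \times \left({3, 4}\right) = \left({5, 6}\right)$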
22066
\section{Substitution Instance of Term is Term} Tags: Predicate Logic \begin{theorem} Let $\beta, \tau$ be terms of predicate logic. Let $x \in \operatorname {VAR}$ be a variable. Let $\map \beta {x \gets \tau}$ be the substitution instance of $\beta$ substituting $\tau$ for $x$. Then $\map \beta {x \gets \tau}$ is a term. \end{theorem} \begin{proof} Proceed by the Principle of Structural Induction on the definition of term, applied to $\beta$. If $\beta = y$ for some variable $y$, then: :$\map \beta {x \gets \tau} = \begin{cases} \tau & : \text {if $y = x$} \\ y &: \text {otherwise} \end{cases}$ In either case, $\map \beta {x \gets \tau}$ is a term. If $\beta = \map f {\tau_1, \ldots, \tau_n}$ and the induction hypothesis holds for $\tau_1, \ldots, \tau_n$, then: :$\map \beta {x \gets \tau} = \map f {\map {\tau_1} {x \gets \tau}, \ldots, \map {\tau_n} {x \gets \tau} }$ By the induction hypothesis, each $\map {\tau_i} {x \gets \tau}$ is a term. Hence so is $\map \beta {x \gets \tau}$. The result follows by the Principle of Structural Induction. {{qed}} \end{proof}
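As a concrete illustration (the signature is assumed only for the example), let $f$ be a binary function symbol and $g$ a unary function symbol, and take: :$\beta = \map f {x, \map g y}, \quad \tau = \map g z$ Then: :$\map \beta {x \gets \tau} = \map f {\map g z, \map g y}$ which is again a term, as the theorem guarantees.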
22067
\section{Substitution Instance of WFF is WFF} Tags: Predicate Logic \begin{theorem} Let $\mathbf A$ be a WFF of predicate logic. Let $\tau$ be a term of predicate logic. Let $x \in \mathrm{VAR}$ be a variable. Let $\mathbf A \left({x \gets \tau}\right)$ be the substitution instance of $\mathbf A$ substituting $\tau$ for $x$. Then $\mathbf A \left({x \gets \tau}\right)$ is a WFF. \end{theorem} \begin{proof} Proceed by the Principle of Structural Induction on the bottom-up specification of predicate logic, applied to $\mathbf A$. If $\mathbf A = p \left({\tau_1, \ldots, \tau_n}\right)$, then: :$\mathbf A \left({x \gets \tau}\right) = p \left({\tau_1 \left({x \gets \tau}\right), \ldots, \tau_n \left({x \gets \tau}\right)}\right)$ where $\tau_i \left({x \gets \tau}\right)$ is the substitution instance of $\tau_i$. By Substitution Instance of Term is Term, each such $\tau_i \left({x \gets \tau}\right)$ is again a term. It follows that $\mathbf A \left({x \gets \tau}\right)$ is again a WFF. If $\mathbf A = \neg \mathbf B$ and the induction hypothesis applies to $\mathbf B$, then: :$\mathbf A \left({x \gets \tau}\right) = \neg \mathbf B \left({x \gets \tau}\right)$ and it follows from the induction hypothesis that $\mathbf A \left({x \gets \tau}\right)$ is a WFF of predicate logic. If $\mathbf A = \mathbf B \circ \mathbf B'$ for $\circ$ one of $\land, \lor, \implies, \iff$ and the induction hypothesis applies to $\mathbf B, \mathbf B'$: :$\mathbf A \left({x \gets \tau}\right) = \mathbf B \left({x \gets \tau}\right) \circ \mathbf B' \left({x \gets \tau}\right)$ and it follows from the induction hypothesis that $\mathbf A \left({x \gets \tau}\right)$ is a WFF of predicate logic. If $\mathbf A = \exists x: \mathbf B$, and the induction hypothesis applies to $\mathbf B$: :$\mathbf A \left({x \gets \tau}\right) = \exists x: \mathbf B \left({x \gets \tau}\right)$ and it follows from the induction hypothesis that $\mathbf A \left({x \gets \tau}\right)$ is a WFF of predicate logic. If $\mathbf A = \forall x : \mathbf B$, and the induction hypothesis applies to $\mathbf B$: :$\mathbf A \left({x \gets \tau}\right) = \forall x: \mathbf B \left({x \gets \tau}\right)$ and it follows from the induction hypothesis that $\mathbf A \left({x \gets \tau}\right)$ is a WFF of predicate logic. The result follows by the Principle of Structural Induction. {{qed}} \end{proof}
22068
\section{Substitution Rule for Matrices} Tags: Matrix Algebra \begin{theorem} Let $\mathbf A$ be a square matrix of order $n$. Then: :$(1): \quad \ds \sum_{j \mathop = 1}^n \delta_{i j} a_{j k} = a_{i k}$ :$(2): \quad \ds \sum_{j \mathop = 1}^n \delta_{i j} a_{k j} = a_{k i}$ where: :$\delta_{i j}$ is the Kronecker delta :$a_{j k}$ is element $\tuple {j, k}$ of $\mathbf A$. \end{theorem} \begin{proof} By definition of Kronecker delta: :$\delta_{i j} = \begin {cases} 1 & : i = j \\ 0 & : i \ne j \end {cases}$ Thus: :$\delta_{i j} a_{j k} = \begin {cases} a_{i k} & : i = j \\ 0 & : i \ne j \end {cases}$ and: :$\delta_{i j} a_{k j} = \begin {cases} a_{k i} & : i = j \\ 0 & : i \ne j \end {cases}$ from which the result follows. {{qed}} \end{proof}
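As a concrete check of $(1)$ (with $n = 2$, $i = 1$, $k = 2$, and generic entries): :$\ds \sum_{j \mathop = 1}^2 \delta_{1 j} a_{j 2} = \delta_{1 1} a_{1 2} + \delta_{1 2} a_{2 2} = 1 \cdot a_{1 2} + 0 \cdot a_{2 2} = a_{1 2}$ which is indeed $a_{i k}$ for $i = 1$ and $k = 2$.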
22069
\section{Substitution Theorem for Terms} Tags: Predicate Logic, Named Theorems \begin{theorem} Let $\beta, \tau$ be terms. Let $x \in \mathrm {VAR}$ be a variable. Let $\map \beta {x \gets \tau}$ be the substitution instance of $\beta$ substituting $\tau$ for $x$. Let $\AA$ be a structure for predicate logic. Let $\sigma$ be an assignment for $\beta$ and $\tau$. Suppose that: :$\map {\operatorname{val}_\AA} \tau \sqbrk \sigma = a$ where $\map {\operatorname{val}_\AA} \tau \sqbrk \sigma$ is the value of $\tau$ under $\sigma$. Then: :$\map {\operatorname{val}_\AA} {\map \beta {x \gets \tau} } \sqbrk \sigma = \map {\operatorname{val}_\AA} \beta \sqbrk {\sigma + \paren {x / a} }$ where $\sigma + \paren {x / a}$ is the extension of $\sigma$ by mapping $x$ to $a$. \end{theorem} \begin{proof} Proceed by the Principle of Structural Induction on the definition of term, applied to $\beta$. If $\beta = y$ for some variable $y$, then: :$\map \beta {x \gets \tau} = \begin {cases} \tau & : \text{if $y = x$} \\ y & : \text {otherwise} \end {cases}$ In the first case: {{begin-eqn}} {{eqn | l = \map {\operatorname{val}_\AA} {\map \beta {x \gets \tau} } \sqbrk \sigma | r = \map {\operatorname{val}_\AA} \tau \sqbrk \sigma }} {{eqn | r = a | c = Definition of $a$ }} {{eqn | r = \map {\paren {\sigma + \paren {x / a} } } x | c = {{Defof|Extension of Assignment}} }} {{eqn | r = \map {\operatorname{val}_\AA} \beta \sqbrk {\sigma + \paren {x / a} } | c = {{Defof|Value of Term under Assignment|Value under $\sigma + \paren {x / a}$}} }} {{end-eqn}} In the second case: {{begin-eqn}} {{eqn | l = \map {\operatorname{val}_\AA} {\map \beta {x \gets \tau} } \sqbrk \sigma | r = \map {\operatorname{val}_\AA} y \sqbrk \sigma }} {{eqn | r = \map \sigma y | c = {{Defof|Value of Term under Assignment|Value under $\sigma$}} }} {{eqn | r = \map {\paren {\sigma + \paren {x / a} } } y | c = {{Defof|Extension of Assignment}} }} {{eqn | r = \map {\operatorname{val}_\AA} \beta \sqbrk {\sigma + \paren {x / a} } | c = {{Defof|Value of Term under Assignment|Value under $\sigma + \paren {x / a}$}} }} {{end-eqn}} as desired. If $\beta = \map f {\tau_1, \ldots, \tau_n}$ and the induction hypothesis holds for $\tau_1, \ldots, \tau_n$, then: :$\map \beta {x \gets \tau} = \map f {\map {\tau_1} {x \gets \tau}, \ldots, \map {\tau_n} {x \gets \tau} }$ Now: {{begin-eqn}} {{eqn | l = \map {\operatorname{val}_\AA} {\map \beta {x \gets \tau} } \sqbrk \sigma | r = \map {\operatorname{val}_\AA} {\map f {\map {\tau_1} {x \gets \tau}, \ldots, \map {\tau_n} {x \gets \tau} } } \sqbrk \sigma }} {{eqn | r = \map {f_\AA} {\map {\operatorname{val}_\AA} {\map {\tau_1} {x \gets \tau} } \sqbrk \sigma, \ldots, \map {\operatorname{val}_\AA} {\map {\tau_n} {x \gets \tau} } \sqbrk \sigma} | c = {{Defof|Value of Term under Assignment|Value under $\sigma$}} }} {{eqn | r = \map {f_\AA} {\map {\operatorname{val}_\AA} {\tau_1} \sqbrk {\sigma + \paren {x / a} }, \ldots, \map {\operatorname{val}_\AA} {\tau_n} \sqbrk {\sigma + \paren {x / a} } } | c = Induction Hypothesis }} {{eqn | r = \map {\operatorname{val}_\AA} \beta \sqbrk {\sigma + \paren {x / a} } | c = {{Defof|Value of Term under Assignment|Value under $\sigma + \paren {x / a}$}} }} {{end-eqn}} as desired. The result follows by the Principle of Structural Induction. {{qed}} \end{proof}
22070
\section{Substitution Theorem for Well-Formed Formulas} Tags: Predicate Logic, Named Theorems \begin{theorem} Let $\mathbf A$ be a WFF of predicate logic. Let $x \in \mathrm{VAR}$ be a variable. Let $\tau$ be a term of predicate logic which is freely substitutable for $x$ in $\mathbf A$. Let $\map {\mathbf A} {x \gets \tau}$ be the substitution instance of $\mathbf A$ substituting $\tau$ for $x$. Let $\AA$ be a structure for predicate logic. Let $\sigma$ be an assignment for $\mathbf A$ and $\tau$. Suppose that: :$\map {\operatorname{val}_\AA} \tau \sqbrk \sigma = a$ where $\map {\operatorname{val}_\AA} \tau \sqbrk \sigma$ is the value of $\tau$ under $\sigma$. Then: :$\map {\operatorname{val}_\AA} {\map {\mathbf A} {x \gets \tau} } \sqbrk \sigma = \map {\operatorname{val}_\AA} {\mathbf A} \sqbrk {\sigma + \paren {x / a} }$ where $\sigma + \paren {x / a}$ is the extension of $\sigma$ by mapping $x$ to $a$. \end{theorem} \begin{proof} Proceed by the Principle of Structural Induction on the bottom-up specification of predicate logic, applied to $\mathbf A$. If $\mathbf A = \map p {\tau_1, \ldots, \tau_n}$, then: :$\map {\mathbf A} {x \gets \tau} = \map p {\map {\tau_1} {x \gets \tau}, \ldots, \map {\tau_n} {x \gets \tau} }$ where $\map {\tau_i} {x \gets \tau}$ is the substitution instance of $\tau_i$. Now: {{begin-eqn}} {{eqn|l = \map {\operatorname{val}_\AA} {\map {\mathbf A} {x \gets \tau} } \sqbrk \sigma |r = \map {\operatorname{val}_\AA} {\map p {\map {\tau_1} {x \gets \tau}, \ldots, \map {\tau_n} {x \gets \tau} } } \sqbrk \sigma }} {{eqn|r = \map {p_\AA} {\map {\operatorname{val}_\AA} {\map {\tau_1} {x \gets \tau} } \sqbrk \sigma, \ldots, \map {\operatorname{val}_\AA} {\map {\tau_n} {x \gets \tau} } \sqbrk \sigma} |c = {{Defof|Value of Formula under Assignment|Value under $\sigma$}} }} {{eqn|r = \map {p_\AA} {\map {\operatorname{val}_\AA} {\tau_1} \sqbrk {\sigma + \paren {x / a} }, \ldots, \map {\operatorname{val}_\AA} {\tau_n} \sqbrk {\sigma + \paren {x / a} } } |c = Substitution Theorem for Terms }} {{eqn|r = \map {\operatorname{val}_\AA} {\mathbf A} \sqbrk {\sigma + \paren {x / a} } |c = {{Defof|Value of Formula under Assignment|Value under $\sigma + \paren {x / a}$}} }} {{end-eqn}} Suppose $\mathbf A = \neg \mathbf B$ and the induction hypothesis applies to $\mathbf B$. Then since $\tau$ is also free for $x$ in $\mathbf B$: {{begin-eqn}} {{eqn|l = \map {\operatorname{val}_\AA} {\map {\mathbf A} {x \gets \tau} } \sqbrk \sigma |r = \map {f^\neg} {\map {\operatorname{val}_\AA} {\mathbf B} \sqbrk \sigma} |c = {{Defof|Value of Formula under Assignment|Value under $\sigma$}} }} {{eqn|r = \map {f^\neg} {\map {\operatorname{val}_\AA} {\mathbf B} \sqbrk {\sigma + \paren {x / a} } } |c = Induction Hypothesis }} {{eqn|r = \map {\operatorname{val}_\AA} {\mathbf A} \sqbrk {\sigma + \paren {x / a} } |c = {{Defof|Value of Formula under Assignment|Value under $\sigma + \paren {x / a}$}} }} {{end-eqn}} Suppose $\mathbf A = \mathbf B \circ \mathbf B'$ for $\circ$ one of $\land, \lor, \implies, \iff$ and the induction hypothesis applies to $\mathbf B, \mathbf B'$. 
Then since $\tau$ is also free for $x$ in $\mathbf B$ and $\mathbf B'$: {{begin-eqn}} {{eqn|l = \map {\operatorname{val}_\AA} {\map {\mathbf A} {x \gets \tau} } \sqbrk \sigma |r = \map {f^\circ} {\map {\operatorname{val}_\AA} {\map {\mathbf B} {x \gets \tau} } \sqbrk \sigma, \map {\operatorname{val}_\AA} {\map {\mathbf B'} {x \gets \tau} } \sqbrk \sigma} |c = {{Defof|Value of Formula under Assignment|Value under $\sigma$}} }} {{eqn|r = \map {f^\circ} {\map {\operatorname{val}_\AA} {\mathbf B} \sqbrk {\sigma + \paren {x / a} }, \map {\operatorname{val}_\AA} {\mathbf B'} \sqbrk {\sigma + \paren {x / a} } } |c = Induction Hypothesis }} {{eqn|r = \map {\operatorname{val}_\AA} {\mathbf A} \sqbrk {\sigma + \paren {x / a} } |c = {{Defof|Value of Formula under Assignment|Value under $\sigma + \paren {x / a}$}} }} {{end-eqn}} Suppose $\mathbf A = \exists y: \mathbf B$ or $\mathbf A = \forall y: \mathbf B$, and the induction hypothesis applies to $\mathbf B$. Because $\tau$ is free for $x$ in $\mathbf A$, it must be that either $x$ does not occur freely in $\mathbf A$, or $y$ does not occur in $\tau$. In the first case: :$\map {\mathbf A} {x \gets \tau} = \mathbf A$ and by Value of Formula under Assignment Determined by Free Variables: :$\map {\operatorname{val}_\AA} {\mathbf A} \sqbrk \sigma = \map {\operatorname{val}_\AA} {\mathbf A} \sqbrk {\sigma + \paren {x / a} }$ Now consider the case where $y$ does not occur in $\tau$. From the definition of value under $\sigma$, $\map {\operatorname{val}_\AA} {\map {\mathbf A} {x \gets \tau} } \sqbrk \sigma$ is determined by: :$\map {\operatorname{val}_\AA} {\map {\mathbf B} {x \gets \tau} } \sqbrk {\sigma + \paren {y / a'} }$ where $a'$ ranges over $\AA$. Now from Value of Term under Assignment Determined by Variables, since $y$ does not occur in $\tau$: :$\map {\operatorname{val}_\AA} \tau \sqbrk {\sigma + \paren {y / a'} } = \map {\operatorname{val}_\AA} \tau \sqbrk \sigma = a$ for all $a'$. Hence the induction hypothesis also applies to the assignment $\sigma + \paren {y / a'}$. Thus, for all $a'$: {{begin-eqn}} {{eqn|l = \map {\operatorname{val}_\AA} {\map {\mathbf B} {x \gets \tau} } \sqbrk {\sigma + \paren {y / a'} } |r = \map {\operatorname{val}_\AA} {\mathbf B} \sqbrk {\sigma + \paren {y / a'} + \paren {x / a} } |c = Induction Hypothesis }} {{eqn|r = \map {\operatorname{val}_\AA} {\mathbf B} \sqbrk {\sigma + \paren {x / a} + \paren {y / a'} } |c = {{Defof|Extension of Assignment}} }} {{end-eqn}} from which we infer: :$\map {\operatorname{val}_\AA} {\map {\mathbf A} {x \gets \tau} } \sqbrk \sigma = \map {\operatorname{val}_\AA} {\mathbf A} \sqbrk {\sigma + \paren {x / a} }$ as desired. The result follows by the Principle of Structural Induction. {{qed}} \end{proof}
22071
\section{Substitution for Equivalent Subformula is Equivalent} Tags: Propositional Logic, Boolean Interpretations \begin{theorem} Let $\mathbf B$ a WFF of propositional logic. Let $\mathbf A, \mathbf A'$ be equivalent WFFs. Let $\mathbf A$ be a subformula of $\mathbf B$. Let $\mathbf B' = \map {\mathbf B} {\mathbf A \,//\, \mathbf A'}$ be the substitution of $\mathbf A'$ for $\mathbf A$ in $\mathbf B$. Then $\mathbf B$ and $\mathbf B'$ are equivalent. \end{theorem} \begin{proof} Let $v$ be an arbitrary boolean interpretation. Then $\map v {\mathbf A} = \map v {\mathbf A'}$. It is to be shown that $\map v {\mathbf B} = \map v {\mathbf B'}$. We proceed by induction. Let $\map n {\mathbf B}$ be the number of WFFs $\mathbf C$ such that: :$\mathbf A$ is a subformula of $\mathbf C$, and $\mathbf C$ is a subformula of $\mathbf B$. Note that $\map n {\mathbf B} \ne 0$, for $\mathbf C = \mathbf A$ is a valid choice. Suppose now that $\map n {\mathbf B} = 1$. Because we have the valid choices $\mathbf C = \mathbf A$ and $\mathbf C = \mathbf B$, it follows that these choices must be identical, i.e. $\mathbf A = \mathbf B$. Hence $\mathbf B' = \mathbf A'$, and so: :$\map v {\mathbf B} = \map v {\mathbf B'}$ Suppose now that the assertion is true for all $\mathbf B$ with $\map n {\mathbf B} \le n$. Let $\map n {\mathbf B} = n + 1$. Suppose $\mathbf B = \neg \mathbf B_1$. Then obviously $\map n {\mathbf B_1} = n$, so by hypothesis: :$\map v {\mathbf B_1} = \map v {\map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'} }$ Also, by definition of substitution: :$\mathbf B' = \neg \map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'}$ Now, by definition of boolean interpretation: {{begin-eqn}} {{eqn | l = \map v {\mathbf B} | r = \map {f^\neg} {\map v {\mathbf B_1} } }} {{eqn | r = \map {f^\neg} {\map v {\map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'} } } }} {{eqn | r = \map v {\neg \map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'} } }} {{eqn | r = \map v {\mathbf B'} }} {{end-eqn}} Suppose now that $\mathbf B = \mathbf B_1 \mathbin{\mathsf B} \mathbf B_2$ for a binary connective $\mathsf B$. Then $\map n {\mathbf B_1}, \map n {\mathbf B_2} \le n$, so: :$\map v {\mathbf B_1} = \map v {\map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'} }$ :$\map v {\mathbf B_2} = \map v {\map {\mathbf B_2} {\mathbf A \,//\, \mathbf A'} }$ This follows by either the induction hypothesis, or when $\mathbf A$ is not a subformula of $\mathbf B_1$ or $\mathbf B_2$, is entirely trivial, considering the substitution does not change anything. Also, by definition of substitution: :$\mathbf B' = \map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'} \mathbin {\mathsf B} \map {\mathbf B_2} {\mathbf A \,//\, \mathbf A'}$ Now, by definition of boolean interpretation: {{begin-eqn}} {{eqn | l = \map v {\mathbf B} | r = \map {f^{\mathsf B} } {\map v {\mathbf B_1}, \map v {\mathbf B_2} } }} {{eqn | r = \map {f^{\mathsf B} } {\map v {\map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'} }, \map v {\map {\mathbf B_2} {\mathbf A \,//\, \mathbf A'} } } }} {{eqn | r = \map v {\map {\mathbf B_1} {\mathbf A \,//\, \mathbf A'} \mathbin {\mathsf B} \map {\mathbf B_2} {\mathbf A \,//\, \mathbf A'} } }} {{eqn | r = \map v {\mathbf B'} }} {{end-eqn}} By definition of the language of propositional logic, $\mathbf B$ must have either of the above forms. Hence the result, from the Second Principle of Mathematical Induction. {{qed}} \end{proof}
22072
\section{Substitution in Big-O Estimate/General Result} Tags: Asymptotic Notation \begin{theorem} Let $X$ and $Y$ be topological spaces. Let $V$ be a normed vector space over $\R$ or $\C$ with norm $\norm {\,\cdot\,}$. Let $x_0 \in X$ and $y_0 \in Y$. Let $f: X \to Y$ be a function with $\map f {x_0} = y_0$ that is continuous at $x_0$. Let $g, h: Y \to V$ be functions. Suppose $\map g y = \map O {\map h y}$ as $y \to y_0$, where $O$ denotes big-O notation. Then $\map {\paren {g \circ f} } x = \map O {\map {\paren {h \circ f} } x}$ as $x \to x_0$. \end{theorem} \begin{proof} Because $g = \map O h$, there exist a neighborhood $N$ of $y_0$ and a real number $c$ such that: :$\norm {\map g y} \le c \cdot \norm {\map h y}$ for all $y \in N$. By definition of continuity, there exists a neighborhood $U$ of $x_0$ with $\map f U \subseteq N$. For $x \in U$, we have: :$\norm {\map g {\map f x} } \le c \cdot \norm {\map h {\map f x} }$ Thus $g \circ f = \map O {h \circ f}$ as $x \to x_0$. {{qed}} Category:Asymptotic Notation \end{proof}
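As a concrete illustration (the particular spaces and functions are chosen only for the example), take $X = Y = V = \R$, $x_0 = y_0 = 0$, $\map f x = \sin x$, $\map g y = y^2 + y^3$ and $\map h y = y^2$. Then $\map g y = \map O {\map h y}$ as $y \to 0$, and $f$ is continuous at $0$ with $\map f 0 = 0$, so the theorem gives: :$\sin^2 x + \sin^3 x = \map O {\sin^2 x}$ as $x \to 0$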