22473
\section{Surjection from Finite Set to Itself is Permutation} Tags: Permutation Theory, Surjections \begin{theorem} Let $S$ be a finite set. Let $f: S \to S$ be a surjection. Then $f$ is a permutation. \end{theorem} \begin{proof} From Surjection iff Right Inverse, $f$ has a right inverse $g: S \to S$. From Right Inverse Mapping is Injection, $g$ is an injection. From Injection from Finite Set to Itself is Permutation, $g$ is a permutation and so a bijection. As $f \circ g = I_S$, it follows that $f = g^{-1}$. From Inverse of Bijection is Bijection, $f$ is also a bijection. Thus as $f$ is a bijection from $S$ to itself, it is by definition a permutation. {{qed}} \end{proof}
22474
\section{Surjection from Natural Numbers iff Countable/Corollary 1} Tags: Countably Infinite, Natural Numbers, Countable Sets, Surjections \begin{theorem} Let $T$ be a countably infinite set. Let $S$ be a non-empty set. Then $S$ is countable {{iff}} there exists a surjection $f: T \to S$. \end{theorem} \begin{proof} Let $g: T \to \N$ be a bijection from $T$ to $\N$. By Inverse of Bijection is Bijection, $g^{-1}: \N \to T$ is a bijection from $\N$ to $T$. First, let $S$ be countable. By Surjection from Natural Numbers iff Countable, there exists a surjection $h: \N \to S$. By Composite of Surjections is Surjection, $h \circ g: T \to S$ is a surjection. Conversely, let $f: T \to S$ be a surjection. By Composite of Surjections is Surjection, $f \circ g^{-1}: \N \to S$ is a surjection. By Surjection from Natural Numbers iff Countable, $S$ is countable. {{qed}} \end{proof}
22475
\section{Surjection from Natural Numbers iff Countable/Corollary 2} Tags: Countable Sets, Surjections \begin{theorem} Let $T$ be a countably infinite set. Let $S$ be an uncountable set. Let $f:T \to S$ be a mapping. Then $f$ is not a surjection. \end{theorem} \begin{proof} By Corollary 1 no mapping from $T$ to $S$ is a surjection. {{qed}} Category:Countable Sets Category:Surjections \end{proof}
22476
\section{Surjection if Composite is Surjection} Tags: Surjections, Mappings, Composite Mappings \begin{theorem} Let $f: S_1 \to S_2$ and $g: S_2 \to S_3$ be mappings such that $g \circ f$ is a surjection. Then $g$ is a surjection. \end{theorem} \begin{proof} Let $g \circ f$ be surjective. Fix $z \in S_3$. Now find an $x \in S_1: \map {g \circ f} x = z$. The surjectivity of $g \circ f$ guarantees this can be done. Now find a $y \in S_2: \map f x = y$. $f$ is a mapping and therefore a left-total relation, which guarantees this too can be done. It follows that: {{begin-eqn}} {{eqn | l = \map g y | r = \map g {\map f x} }} {{eqn | r = \map {g \circ f} x | c = {{Defof|Composition of Mappings}} }} {{eqn | r = z | c = Choice of $x$ }} {{end-eqn}} {{qed}} \end{proof}
22477
\section{Surjection iff Right Inverse/Non-Uniqueness} Tags: Surjections, Surjection iff Right Inverse \begin{theorem} Let $S$ and $T$ be sets such that $S \ne \O$. Let $f: S \to T$ be a surjection. A right inverse of $f$ is in general not unique. Uniqueness occurs {{iff}} $f$ is a bijection. \end{theorem} \begin{proof} If $f$ is not an injection then: :$\exists y \in T: \exists x_1, x_2 \in S: \map f {x_1} = y = \map f {x_2}$ Hence we have more than one choice in $\map {f^{-1} } {\set y}$ for how to map $\map g y$. That is, $\map g y$ is not unique. This does not happen {{iff}} $f$ is an injection. Hence the result. {{qed}} \end{proof}
22478
\section{Surjection that Preserves Inner Product is Linear} Tags: Linear Transformations on Hilbert Spaces, Hilbert Spaces \begin{theorem} Let $H, K$ be Hilbert spaces, and denote by ${\innerprod \cdot \cdot}_H$ and ${\innerprod \cdot \cdot}_K$ their respective inner products. Let $U: H \to K$ be a surjection such that: :$\forall g, h \in H: {\innerprod g h}_H = {\innerprod {Ug} {Uh} }_K$ Then $U$ is a linear map, and hence an isomorphism. \end{theorem} \begin{proof} Let $x, y \in H$. Let $\alpha \in \GF$. By surjectivity of $U$, choose $z \in H$ such that $Uz = \map U {\alpha x + y} - \paren { \alpha Ux + Uy }$. Then: {{begin-eqn}} {{eqn | l = {\innerprod {Uz} {Uz} }_K | r = {\innerprod {\map U {\alpha x + y} - \paren{\alpha Ux + Uy } } {Uz} }_K }} {{eqn | r = {\innerprod {\map U {\alpha x + y} } {Uz} }_K - \paren{ \alpha {\innerprod {Ux} {Uz} }_K + {\innerprod {Uy} {Uz} }_K} | c = by linearity in the first coordinate }} {{eqn | r = {\innerprod {\alpha x + y} z }_H - \paren{ \alpha {\innerprod x z }_H + {\innerprod y z }_H} | c = as $U$ preserves the inner product }} {{eqn | r = 0 | c = by linearity in the first coordinate }} {{end-eqn}} By positivity, $Uz = {\bf 0}_K$. Hence: :$\map U {\alpha x + y} = \alpha U x + U y$ Thus, $U$ is linear. {{qed}} \end{proof}
22479
\section{Surjective Field Homomorphism is Field Isomorphism} Tags: Field Homomorphisms, Field Isomorphisms \begin{theorem} Let $E$ and $F$ be fields. Let $\phi: E \to F$ be a (field) homomorphism. Let $\phi$ be a surjection. Then $\phi$ is an isomorphism. \end{theorem} \begin{proof} As asserted, let $\phi$ be a surjection. From Field Homomorphism is either Trivial or Injection, $\phi$ is either an injection or the trivial homomorphism. If $\phi$ is an injection, then, by definition, $\phi$ is a bijection. Hence, again by definition, $\phi$ is an isomorphism. If $\phi$ is not an injection, then $\phi$ is the trivial homomorphism. But from Field Contains at least 2 Elements, $\Img \phi$ cannot in that case be a field. Hence if $\phi$ is not an injection, then $\phi$ cannot be a surjection. Hence the result. {{qed}} \end{proof}
22480
\section{Surjective Monotone Function is Continuous} Tags: Monotone Real Functions, Real Analysis, Continuity \begin{theorem} Let $X$ be an open set of $\R$. Let $Y$ be a real interval. Let $f: X \to Y$ be a surjective monotone real function. Then $f$ is continuous on $X$. \end{theorem} \begin{proof} {{Tidy}} {{MissingLinks}} {{WLOG}}, let $f$ be increasing. Let $c \in X$. From Limit of Monotone Real Function: Corollary, the one-sided limits of monotone functions exist: {{begin-eqn}} {{eqn | l = L^-_c | m = \lim_{x \mathop \to c^-} \map f x | mo= = | r = \sup_{x \mathop < c} \map f x }} {{eqn | l = L^+_c | m = \lim_{x \mathop \to c^+} \map f x | mo= = | r = \inf_{x \mathop > c} \map f x }} {{end-eqn}} and satisfy: :$L^-_c, L^+_c \in Y$ :$L^-_c \le \map f c \le L^+_c$ Suppose that $\ds L = \lim_{x \mathop \to c} \map f x$ exists. From Limit iff Limits from Left and Right: :$L = L^-_c$ This leads to: :$L \le \map f c$ Similarly: :$L = L^+_c$ which leads to: :$L \ge \map f c$ Hence: :$\ds \lim_{x \mathop \to c} \map f x = \map f c$ proving continuity at $c$. {{AimForCont}} $\ds \lim_{x \mathop \to c} \map f x$ does not exist. Then from Discontinuity of Monotonic Function is Jump Discontinuity, $f$ has a jump discontinuity at $c$, so that $L^-_c < L^+_c$. From Real Numbers are Densely Ordered: :$L^-_c < y < L^+_c$ for some $y \in Y$, which can be chosen so that $y \ne \map f c$. By surjectivity, $y = \map f a$ for some $a \in X$. Hence: :$L^-_c < \map f a < L^+_c$ As $y \ne \map f c$ we have $a \ne c$. If $a < c$ then $\map f a \le L^-_c$. This contradicts the previous inequality. If $a > c$ then $\map f a \ge L^+_c$, which is a similar contradiction. Hence $\ds \lim_{x \mathop \to c} \map f x$ exists after all, and so $f$ is continuous at $c$. As $c \in X$ was arbitrary, $f$ is continuous on $X$. {{qed}} Category:Monotone Real Functions Category:Real Analysis Category:Continuity \end{proof}
22481
\section{Surjective Restriction of Real Exponential Function} Tags: Exponential Function, Surjections \begin{theorem} Let $\exp: \R \to \R$ be the exponential function: :$\map \exp x = e^x$ Then the restriction of the codomain of $\exp$ to the strictly positive real numbers: :$\exp: \R \to \R_{>0}$ is a surjective restriction. Hence: :$\exp: \R \to \R_{>0}$ is a bijection. \end{theorem} \begin{proof} We have Exponential on Real Numbers is Injection. Let $y \in \R_{> 0}$. Then $\exists x \in \R: x = \map \ln y$. From Exponential of Natural Logarithm: :$\exp x = \map \exp {\map \ln y} = y$ and so $\exp: \R \to \R_{>0}$ is a surjection. Hence the result. {{qed}} \end{proof}
22482
\section{Sylow Subgroup is Hall Subgroup} Tags: Group Theory, Hall Subgroups, Sylow p-Subgroups \begin{theorem} Let $G$ be a group. Let $H$ be a Sylow $p$-subgroup of $G$. Then $H$ is a Hall subgroup of $G$. \end{theorem} \begin{proof} Let $p$ be prime. Let $G$ be a finite group such that $\order G = k p^n$ where $p \nmid k$. By definition, a Sylow $p$-subgroup $H$ of $G$ is a subgroup of $G$ of order $p^n$. By Lagrange's Theorem, the index of $H$ in $G$ is given by: :$\index G H = \dfrac {\order G} {\order H}$ So in this case: :$\index G H = \dfrac {k p^n} {p^n} = k$ As $p \nmid k$ it follows from Prime not Divisor implies Coprime that $k \perp p$. The result follows from the definition of Hall subgroup. {{qed}} Category:Sylow p-Subgroups Category:Hall Subgroups \end{proof}
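For illustration: let $\order G = 12 = 2^2 \times 3$ and let $H$ be a Sylow $2$-subgroup of $G$, so that $\order H = 4$. Then $\index G H = \dfrac {12} 4 = 3$, and $\gcd \set {4, 3} = 1$, so the order and index of $H$ are indeed coprime, as the definition of Hall subgroup requires.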
22483
\section{Sylow p-Subgroup is Unique iff Normal} Tags: Normal Subgroups, Sylow p-Subgroups \begin{theorem} A group $G$ has exactly one Sylow $p$-subgroup $P$ {{iff}} $P$ is normal. \end{theorem} \begin{proof} If $G$ has precisely one Sylow $p$-subgroup, it must be normal from Unique Subgroup of a Given Order is Normal. Suppose a Sylow $p$-subgroup $P$ is normal. Then it equals its conjugates. Thus, by the Third Sylow Theorem, there can be only one such Sylow $p$-subgroup. {{qed}} \end{proof}
22484
\section{Sylow p-Subgroups of Group of Order 2p} Tags: Groups of Order 2p, Sylow p-Subgroups of Group of Order 2p, Groups of Order 2 p, Sylow p-Subgroups \begin{theorem} Let $p$ be an odd prime. Let $G$ be a group of order $2 p$. Then $G$ has exactly one Sylow $p$-subgroup. This Sylow $p$-subgroup is normal. \end{theorem} \begin{proof} Let $n_p$ denote the number of Sylow $p$-subgroups of $G$. From the Fourth Sylow Theorem: :$n_p \equiv 1 \pmod p$ From the Fifth Sylow Theorem: :$n_p \divides 2 p$ that is: :$n_p \in \set {1, 2, p, 2 p}$ But $p$ and $2 p$ are congruent to $0$ modulo $p$. So: :$n_p \notin \set {p, 2 p}$ Also we have that $p > 2$. Hence: :$2 \not \equiv 1 \pmod p$ and so it must be that $n_p = 1$. It follows from Sylow p-Subgroup is Unique iff Normal that this Sylow $p$-subgroup is normal. {{qed}} \end{proof}
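For instance, let $p = 5$, so that $\order G = 10$. Then $n_5 \divides 10$ and $n_5 \equiv 1 \pmod 5$, and of the divisors $1, 2, 5, 10$ of $10$ only $1$ is congruent to $1$ modulo $5$. Hence $n_5 = 1$.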
22485
\section{Symmetric Bilinear Form can be Diagonalized} Tags: Bilinear Forms \begin{theorem} Let $\mathbb K$ be a field. Let $V$ be a vector space over $\mathbb K$ of finite dimension $n>0$. Let $f$ be a symmetric bilinear form on $V$. Then there exists an ordered basis for which the relative matrix of $f$ is diagonal. \end{theorem} \begin{proof} {{ProofWanted}} Category:Bilinear Forms \end{proof}
22486
\section{Symmetric Bilinear Form is Reflexive} Tags: Bilinear Forms \begin{theorem} Let $\mathbb K$ be a field. Let $V$ be a vector space over $\mathbb K$. Let $b$ be a bilinear form on $V$. Let $b$ be symmetric. Then $b$ is reflexive. \end{theorem} \begin{proof} Let $\left({v, w}\right) \in V \times V$ with $b \left({v, w}\right) = 0$. Because $b$ is symmetric, $b \left({w, v}\right) = 0$. Because $\left({v, w}\right)$ was arbitrary, $b$ is reflexive. {{qed}} \end{proof}
22487
\section{Symmetric Closure of Ordering may not be Transitive} Tags: Orderings, Closure Operators, Symmetric Closures, Transitive Relations, Symmetric Closure \begin{theorem} Let $\struct {S, \preceq}$ be an ordered set. Let $\preceq^\leftrightarrow$ be the symmetric closure of $\preceq$. Then it is not necessarily the case that $\preceq^\leftrightarrow$ is transitive. \end{theorem} \begin{proof} Proof by Counterexample: Let $S = \set {a, b, c}$ where $a$, $b$, and $c$ are distinct. Let: :${\preceq} = \set {\tuple {a, a}, \tuple {b, b}, \tuple {c, c}, \tuple {a, c}, \tuple {b, c} }$ Then $\preceq$ is an ordering, but $\preceq^\leftrightarrow$ is not transitive, as follows: $\preceq$ is reflexive because it contains the diagonal relation on $S$. That $\preceq$ is transitive and antisymmetric can be verified by inspecting all ordered pairs of its elements. Thus $\preceq$ is an ordering. Now consider $\preceq^\leftrightarrow$, the symmetric closure of $\preceq$: :${\preceq^\leftrightarrow} = {\preceq} \cup {\preceq}^{-1} = \set{\tuple {a, a}, \tuple {b, b}, \tuple {c, c}, \tuple {a, c}, \tuple {c, a}, \tuple {b, c}, \tuple {c, b} }$ by inspection. Now $\tuple {a, c} \in {\preceq^\leftrightarrow}$ and $\tuple {c, b} \in {\preceq^\leftrightarrow}$, but $\tuple {a, b} \notin {\preceq^\leftrightarrow}$. Thus $\preceq^\leftrightarrow$ is not transitive. {{qed}} Category:Orderings Category:Symmetric Closures Category:Transitive Relations
22488
\section{Symmetric Closure of Relation Compatible with Operation is Compatible} Tags: Compatible Relations, Symmetric Closures \begin{theorem} Let $\struct {S, \circ}$ be a magma. Let $\RR$ be a relation compatible with $\circ$. Let $\RR^\leftrightarrow$ be the symmetric closure of $\RR$. Then $\RR^\leftrightarrow$ is compatible with $\circ$. \end{theorem} \begin{proof} By the definition of symmetric closure: :$\RR^\leftrightarrow = \RR \cup \RR^{-1}$. Here $\RR^{-1}$ is the inverse of $\RR$. By Inverse of Relation Compatible with Operation is Compatible, $\RR^{-1}$ is compatible with $\circ$. Thus by Union of Relations Compatible with Operation is Compatible: :$\RR^\leftrightarrow = \RR \cup \RR^{-1}$ is compatible with $\circ$. {{qed}} Category:Compatible Relations Category:Symmetric Closures \end{proof}
22489
\section{Symmetric Closure of Symmetric Relation} Tags: Definitions: Relation Theory, Symmetric Relations, Symmetric Closures, Relation Theory \begin{theorem} Let $\RR$ be a symmetric relation on a set $S$. Let $\RR^\leftrightarrow$ be the symmetric closure of $\RR$. Then: :$\RR = \RR^\leftrightarrow$ \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \RR^\leftrightarrow | r = \RR \cup \RR^{-1} | c = {{Defof|Symmetric Closure}} }} {{eqn | r = \RR \cup \RR | c = Inverse of Symmetric Relation is Symmetric }} {{eqn | r = \RR | c = Union is Idempotent }} {{end-eqn}} {{qed}} Category:Symmetric Relations Category:Symmetric Closures \end{proof}
22490
\section{Symmetric Difference is Subset of Union of Symmetric Differences} Tags: Set Union, Symmetric Difference, Union \begin{theorem} Let $R, S, T$ be sets. Then: :$R \symdif S \subseteq \paren {R \symdif T} \cup \paren {S \symdif T}$ where $R \symdif S$ denotes the symmetric difference between $R$ and $S$. \end{theorem} \begin{proof} From the definition of symmetric difference, we have: :$R \symdif S = \paren {R \setminus S} \cup \paren {S \setminus R}$ Then from Set Difference is Subset of Union of Differences, we have: :$R \setminus S \subseteq \paren {R \setminus T} \cup \paren {T \setminus S}$ :$S \setminus R \subseteq \paren {S \setminus T} \cup \paren {T \setminus R}$ Thus: {{begin-eqn}} {{eqn | l = \paren {R \setminus S} \cup \paren {S \setminus R} | o = \subseteq | r = \paren {R \setminus T} \cup \paren {T \setminus S} \cup \paren {S \setminus T} \cup \paren {T \setminus R} | c = Set Union Preserves Subsets }} {{eqn | r = \paren {R \setminus T} \cup \paren {T \setminus R} \cup \paren {S \setminus T} \cup \paren {T \setminus S} | c = Union is Commutative }} {{eqn | r = \paren {R \symdif T} \cup \paren {S \symdif T} | c = {{Defof|Symmetric Difference}} }} {{end-eqn}} {{qed}} Category:Symmetric Difference Category:Set Union \end{proof}
22491
\section{Symmetric Difference of Unions} Tags: Set Union, Symmetric Difference, Union \begin{theorem} Let $R$, $S$ and $T$ be sets. Then: :$\paren {R \cup T} \symdif \paren {S \cup T} = \paren {R \symdif S} \setminus T$ where: :$\symdif$ denotes the symmetric difference :$\setminus$ denotes set difference :$\cup$ denotes set union \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \paren {R \cup T} \symdif \paren {S \cup T} | r = \paren {\paren {R \cup T} \setminus \paren {S \cup T} } \cup \paren {\paren {S \cup T} \setminus \paren {R \cup T} } | c = {{Defof|Symmetric Difference|index = 1}} }} {{eqn | r = \paren {\paren {R \setminus S} \setminus T} \cup \paren {\paren {S \setminus R} \setminus T} | c = Set Difference with Union }} {{eqn | r = \paren {\paren {R \setminus S} \cup \paren {S \setminus R} } \setminus T | c = Set Difference is Right Distributive over Union }} {{eqn | r = \paren {R \symdif S} \setminus T | c = {{Defof|Symmetric Difference|index = 1}} }} {{end-eqn}} {{Qed}} Category:Symmetric Difference Category:Set Union \end{proof}
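As a concrete check, take $R = \set 1$, $S = \set 2$ and $T = \set 3$: then $\paren {R \cup T} \symdif \paren {S \cup T} = \set {1, 3} \symdif \set {2, 3} = \set {1, 2}$, while $\paren {R \symdif S} \setminus T = \set {1, 2} \setminus \set 3 = \set {1, 2}$, in agreement with the theorem.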
22492
\section{Symmetric Difference of Unions is Subset of Union of Symmetric Differences} Tags: Set Union, Symmetric Difference, Union \begin{theorem} Let $I$ be an indexing set. Let $S_\alpha, T_\alpha$ be sets, for all $\alpha \in I$. Then: :$\ds \bigcup_{\alpha \mathop \in I} S_\alpha \symdif \bigcup_{\alpha \mathop \in I} T_\alpha \subseteq \bigcup_{\alpha \mathop \in I} \paren {S_\alpha \symdif T_\alpha}$ where $S \symdif T$ is the symmetric difference between $S$ and $T$. \end{theorem} \begin{proof} From Difference of Unions is Subset of Union of Differences, we have: :$\ds \bigcup_{\alpha \mathop \in I} S_\alpha \setminus \bigcup_{\alpha \mathop \in I} T_\alpha \subseteq \bigcup_{\alpha \mathop \in I} \paren {S_\alpha \setminus T_\alpha}$ :$\ds \bigcup_{\alpha \mathop \in I} T_\alpha \setminus \bigcup_{\alpha \mathop \in I} S_\alpha \subseteq \bigcup_{\alpha \mathop \in I} \paren {T_\alpha \setminus S_\alpha}$ where $\setminus$ denotes set difference. Thus we have: {{begin-eqn}} {{eqn | l = \bigcup_{\alpha \mathop \in I} S_\alpha \symdif \bigcup_{\alpha \mathop \in I} T_\alpha | r = \paren {\bigcup_{\alpha \mathop \in I} S_\alpha \setminus \bigcup_{\alpha \mathop \in I} T_\alpha} \cup \paren {\bigcup_{\alpha \mathop \in I} T_\alpha \setminus \bigcup_{\alpha \mathop \in I} S_\alpha} | c = {{Defof|Symmetric Difference}} }} {{eqn | o = \subseteq | r = \bigcup_{\alpha \mathop \in I} \paren {S_\alpha \setminus T_\alpha} \cup \bigcup_{\alpha \mathop \in I} \paren {T_\alpha \setminus S_\alpha} | c = Difference of Unions is Subset of Union of Differences }} {{eqn | r = \bigcup_{\alpha \mathop \in I} \paren {\paren {S_\alpha \setminus T_\alpha} \cup \paren {T_\alpha \setminus S_\alpha} } | c = Union is Associative and Union is Commutative }} {{eqn | r = \bigcup_{\alpha \mathop \in I} \paren {S_\alpha \symdif T_\alpha} | c = {{Defof|Symmetric Difference}} }} {{end-eqn}} {{qed}} Category:Set Union Category:Symmetric Difference \end{proof}
22493
\section{Symmetric Difference on Power Set forms Abelian Group} Tags: Abelian Groups, Symmetric Difference, Power Set, Group Theory \begin{theorem} Let $S$ be a set such that $\O \subset S$ (that is, $S$ is non-empty). Let $A \symdif B$ be defined as the symmetric difference between $A$ and $B$. Let $\powerset S$ denote the power set of $S$. Then the algebraic structure $\struct {\powerset S, \symdif}$ is an abelian group. \end{theorem} \begin{proof} From Power Set is Closed under Symmetric Difference, we have that $\struct {\powerset S, \symdif}$ is closed. The result follows directly from Set System Closed under Symmetric Difference is Abelian Group. {{Qed}} \end{proof}
22494
\section{Symmetric Difference with Intersection forms Boolean Ring} Tags: Set Intersection, Power Set, Symmetric Difference, Idempotent Rings \begin{theorem} Let $S$ be a set. Let: :$\symdif$ denote the symmetric difference operation :$\cap$ denote the set intersection operation :$\powerset S$ denote the power set of $S$. Then $\struct {\powerset S, \symdif, \cap}$ is a Boolean ring. \end{theorem} \begin{proof} From Symmetric Difference with Intersection forms Ring: :$\struct {\powerset S, \symdif, \cap}$ is a commutative ring with unity. From Intersection is Idempotent, $\cap$ is an idempotent operation on $S$. Hence the result by definition of Boolean ring. {{qed}} \end{proof}
22495
\section{Symmetric Difference with Intersection forms Ring} Tags: Symmetric Difference, Power Set, Intersection, Set Intersection, Examples of Rings with Unity, Examples of Commutative and Unitary Rings, Symmetric Difference with Intersection forms Ring, Commutative Algebra, Rings, Set Difference \begin{theorem} Let $S$ be a set. Let: :$\symdif$ denote the symmetric difference operation :$\cap$ denote the set intersection operation :$\powerset S$ denote the power set of $S$. Then $\struct {\powerset S, \symdif, \cap}$ is a commutative ring with unity, in which the unity is $S$. This ring is not an integral domain. \end{theorem} \begin{proof} From Symmetric Difference on Power Set forms Abelian Group, $\struct {\powerset S, \symdif}$ is an abelian group, in which $\O$ is the identity and each element is self-inverse. From Power Set with Intersection is a Monoid, we know that $\struct {\powerset S, \cap}$ is a commutative monoid whose identity is $S$. We have that Intersection Distributes over Symmetric Difference. Thus $\struct {\powerset S, \symdif, \cap}$ is a commutative ring with a unity which is $S$. Next we find that $\forall A \in \powerset S: A \cap \O = \O = \O \cap A$. Thus $\O$ is indeed the zero. As set intersection is not cancellable, it follows that $\struct {\powerset S, \symdif, \cap}$ is not an integral domain. {{qed}} \end{proof}
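To illustrate the failure of the integral domain property: take $S = \set {1, 2}$, $A = \set 1$ and $B = \set 2$. Then $A \cap B = \O$, which is the zero of $\struct {\powerset S, \symdif, \cap}$, although neither $A$ nor $B$ is $\O$. Thus $A$ and $B$ are zero divisors.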
22496
\section{Symmetric Difference with Self is Empty Set} Tags: Symmetric Difference, Empty Set \begin{theorem} The symmetric difference of a set with itself is the empty set: :$S \symdif S = \O$ \end{theorem} \begin{proof} This follows directly from Symmetric Difference of Equal Sets: :$S \symdif T = \O \iff S = T$ substituting $S$ for $T$. {{Qed}} \end{proof}
22497
\section{Symmetric Difference with Union does not form Ring} Tags: Set Union, Symmetric Difference, Power Set, Examples of Rings \begin{theorem} Let $S$ be a set. Let: :$\symdif$ denote the symmetric difference operation :$\cup$ denote the set union operation :$\powerset S$ denote the power set of $S$. Then $\struct {\powerset S, \symdif, \cup}$ does not form a ring. \end{theorem} \begin{proof} For $\struct {\powerset S, \symdif, \cup}$ to be a ring, it is a necessary condition that $\cup$ be distributive over $\symdif$. Also, the identity element for set union and symmetric difference must be different. However: :$(1): \quad$ the identity for union and symmetric difference is $\O$ for both operations :$(2): \quad$ set union is not distributive over symmetric difference: From Symmetric Difference of Unions: :$\paren {R \cup T} \symdif \paren {S \cup T} = \paren {R \symdif S} \setminus T$ which in general is not equal to $\paren {R \symdif S} \cup T$. The result follows. {{qed}} \end{proof}
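A concrete instance of the failure of distributivity: take subsets $A = \set 1$, $B = \set 2$ and $C = \set 3$ of $\set {1, 2, 3}$. Then $\paren {A \symdif B} \cup C = \set {1, 2, 3}$, while $\paren {A \cup C} \symdif \paren {B \cup C} = \set {1, 3} \symdif \set {2, 3} = \set {1, 2}$.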
22498
\section{Symmetric Group has Non-Normal Subgroup} Tags: Symmetric Groups, Symmetry Group, Normal Subgroups, Symmetric Group, Symmetry Groups \begin{theorem} Let $S_n$ be the (full) symmetric group on $n$ elements, where $n \ge 3$. Then $S_n$ contains at least one subgroup which is not normal. \end{theorem} \begin{proof} Let $S_n$ act on a set $S$ of $n$ elements. Let $e$ be the identity of $S_n$, by definition the identity mapping $I_S$ on $S$. As $S$ has at least three elements, three of them can be arbitrarily selected and called $a$, $b$ and $c$. Let $\rho$ be a transposition of $S_n$, transposing elements $a$ and $b$. $\rho$ can be described in cycle notation as $\paren {a \ b}$. From Transposition is Self-Inverse it follows that $\set {e, \rho}$ is a subgroup of $S_n$. Let $\pi$ be the permutation on $S$ described in cycle notation as $\paren {a \ b \ c}$. By inspection it is found that $\pi^{-1} = \paren {a \ c \ b}$. Then we have: {{begin-eqn}} {{eqn | l = \pi^{-1} \rho \pi | r = \paren {a \ c \ b} \paren {a \ b} \paren {a \ b \ c} | c = }} {{eqn | r = \paren {a \ c} | c = by evaluation }} {{eqn | o = \notin | r = \set {e, \rho} | c = }} {{end-eqn}} So, by definition, $\set {e, \rho}$ is not a normal subgroup. {{qed}} \end{proof}
22499
\section{Symmetric Group is Generated by Transposition and n-Cycle} Tags: Examples of Generators of Groups, Symmetric Groups \begin{theorem} Let $n \in \Z: n > 1$. Let $S_n$ denote the symmetric group on $n$ letters. Then the set of cyclic permutations: :$\set {\begin {pmatrix} 1 & 2 \end{pmatrix}, \begin {pmatrix} 1 & 2 & \cdots & n \end{pmatrix} }$ is a generator for $S_n$. \end{theorem} \begin{proof} Denote: :$s = \begin {pmatrix} 1 & 2 \end{pmatrix}$ :$r = \begin {pmatrix} 1 & 2 & \cdots & n \end{pmatrix}$ By Cycle Decomposition of Conjugate: :$r s r^{-1} = r \begin {pmatrix} 1 & 2 \end{pmatrix} r^{-1} = \begin {pmatrix} \map r 1 & \map r 2 \end{pmatrix} = \begin {pmatrix} 2 & 3 \end{pmatrix}$ By repeatedly using Cycle Decomposition of Conjugate: :$r^2 s r^{-2} = \begin {pmatrix} 3 & 4 \end{pmatrix}$ :$r^3 s r^{-3} = \begin {pmatrix} 4 & 5 \end{pmatrix}$ :$\cdots$ :$r^{n - 2} s r^{-\paren {n - 2} } = \begin {pmatrix} n - 1 & n \end{pmatrix}$ The result then follows from Transpositions of Adjacent Elements generate Symmetric Group. {{qed}} \end{proof}
22500
\section{Symmetric Group is Group} Tags: Symmetric Group is Group, Symmetric Groups, Symmetric Group, Group Examples \begin{theorem} Let $S$ be a set. Let $\map \Gamma S$ denote the set of all permutations on $S$. Then $\struct {\map \Gamma S, \circ}$, the symmetric group on $S$, forms a group. \end{theorem} \begin{proof} By definition, a permutation on $S$ is a bijection from $S$ to itself. From Composite of Bijections is Bijection, the composite of two permutations on $S$ is again a permutation on $S$. Thus $\struct {\map \Gamma S, \circ}$ is closed. From Composition of Mappings is Associative, $\circ$ is associative. The identity mapping $I_S$ on $S$ is a bijection and hence a permutation on $S$, and for every permutation $f$ on $S$: :$f \circ I_S = f = I_S \circ f$ Thus $I_S$ is the identity element of $\struct {\map \Gamma S, \circ}$. From Inverse of Bijection is Bijection, each permutation $f$ on $S$ has an inverse $f^{-1}$ which is again a permutation on $S$, and: :$f \circ f^{-1} = I_S = f^{-1} \circ f$ Thus every element of $\map \Gamma S$ has an inverse in $\map \Gamma S$. All the group axioms are thus seen to be fulfilled, and so $\struct {\map \Gamma S, \circ}$ is a group. {{qed}} \end{proof}
22501
\section{Symmetric Group is Subgroup of Monoid of Self-Maps} Tags: Permutation Theory, Symmetric Group, Group of Permutations, Symmetric Groups \begin{theorem} Let $S$ be a set. Let $S^S$ be the set of all mappings from $S$ to itself. Let $\struct {\Gamma \paren S, \circ}$ denote the symmetric group on $S$. Let $\struct {S^S, \circ}$ be the monoid of self-maps under composition of mappings. Then $\struct {\Gamma \paren S, \circ}$ is a subgroup of $\struct {S^S, \circ}$. \end{theorem} \begin{proof} By Symmetric Group is Group, $\struct {\Gamma \paren S, \circ}$ is a group. Let $\phi \in \Gamma \paren S$ be a permutation on $S$. As a permutation is a self-map, it follows that $\phi \in S^S$. Thus by definition $\Gamma \paren S$ is a subset of $S^S$. So by definition, $\Gamma \paren S$ is a subgroup of $\struct {S^S, \circ}$. {{qed}} Category:Symmetric Groups Category:Permutation Theory \end{proof}
22502
\section{Symmetric Group on 3 Letters is Isomorphic to Dihedral Group D3} Tags: Dihedral Group D3, Examples of Group Isomorphisms, Symmetric Group on 3 Letters \begin{theorem} Let $S_3$ denote the Symmetric Group on 3 Letters. Let $D_3$ denote the dihedral group $D_3$. Then $S_3$ is isomorphic to $D_3$. \end{theorem} \begin{proof} Consider $S_3$ as presented by its Cayley table: {{:Symmetric Group on 3 Letters/Cayley Table}} Consider $D_3$ as presented by its group presentation: {{:Group Presentation of Dihedral Group D3}} and its Cayley table: {{:Dihedral Group D3/Cayley Table}} Let $\phi: S_3 \to D_3$ be specified as: {{begin-eqn}} {{eqn | l = \map \phi {1 2 3} | r = a | c = }} {{eqn | l = \map \phi {2 3} | r = b | c = }} {{end-eqn}} Then by inspection, we see: {{begin-eqn}} {{eqn | l = \map \phi {1 3 2} | r = a^2 | c = }} {{eqn | l = \map \phi {1 3} | r = a b | c = }} {{eqn | l = \map \phi {1 2} | r = a^2 b | c = }} {{end-eqn}} and the result follows. {{qed}} \end{proof}
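This assignment can be checked against the defining relations: in $S_3$ we have $\paren {1 \ 2 \ 3}^3 = e$, $\paren {2 \ 3}^2 = e$ and $\paren {2 \ 3} \paren {1 \ 2 \ 3} \paren {2 \ 3} = \paren {1 \ 3 \ 2} = \paren {1 \ 2 \ 3}^{-1}$, which (taking the standard presentation of $D_3$ with generators $a, b$ and relations $a^3 = b^2 = e$ and $b a b = a^{-1}$) are exactly the relations satisfied by $a$ and $b$. This is consistent with $\phi$ being the isomorphism read off from the Cayley tables above.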
22503
\section{Symmetric Group on Greater than 4 Letters is Not Solvable} Tags: Symmetric Groups \begin{theorem} Let $n \in \N$ such that $n > 4$. Let $S_n$ denote the symmetric group on $n$ letters. Then $S_n$ is not a solvable group. \end{theorem} \begin{proof} As stated, let $n > 4$ in the below. Recall the definition of solvable group: :A finite group $G$ is a '''solvable group''' {{iff}} it has a composition series in which each factor is a cyclic group. :A '''composition series for $G$''' is a normal series for $G$ which has no proper refinement. :A '''normal series''' for $G$ is a sequence of (normal) subgroups of $G$: ::$\set e = G_0 \lhd G_1 \lhd \cdots \lhd G_n = G$ :where $G_{i - 1} \lhd G_i$ denotes that $G_{i - 1}$ is a proper normal subgroup of $G_i$. Consider the alternating group on $n$ letters $A_n$. From Alternating Group is Normal Subgroup of Symmetric Group, $A_n$ is a proper normal subgroup of $S_n$. But from Normal Subgroup of Symmetric Group on More than 4 Letters is Alternating Group, the ''only'' nontrivial proper normal subgroup of $S_n$ is $A_n$. From Alternating Group is Simple except on 4 Letters, $A_n$ is simple. That is, $A_n$ has no nontrivial proper normal subgroup. It follows that the only composition series for $S_n$ is: :$\set e \lhd A_n \lhd S_n$ From Alternating Group on More than 3 Letters is not Abelian, $A_n$ is not an abelian group. From Cyclic Group is Abelian, $A_n$ is not cyclic. Thus we have demonstrated that the only composition series for $S_n$ contains a factor which is not cyclic. Hence the result. {{qed}} \end{proof}
22504
\section{Symmetric Group on n Letters is Isomorphic to Symmetric Group} Tags: Symmetric Groups \begin{theorem} The symmetric group on $n$ letters $\struct {S_n, \circ}$ is isomorphic to the symmetric group on the $n$ elements of any set $T$ whose cardinality is $n$. That is: :$\forall T \subseteq \mathbb U, \card T = n: \struct {S_n, \circ} \cong \struct {\Gamma \paren T, \circ}$ \end{theorem} \begin{proof} The fact that $\struct {S_n, \circ}$ is a group is a direct implementation of the result Symmetric Group is Group. By definition of cardinality, as $\card T = n$ we can find a bijection between $T$ and $\N_n$. From Number of Permutations, it is immediate that $\order {\paren {\Gamma \paren T, \circ} } = n! = \order {\struct {S_n, \circ} }$. Again, we can find a bijection $\phi$ between $\struct {\Gamma \paren T, \circ}$ and $\struct {S_n, \circ}$. The result follows directly from the Transplanting Theorem. {{qed}} \end{proof}
22505
\section{Symmetric Groups of Same Order are Isomorphic} Tags: Symmetric Groups of Same Order are Isomorphic, Symmetric Groups, Group Isomorphisms \begin{theorem} Let $n \in \Z_{>0}$ be a (strictly) positive integer. Let $T_1$ and $T_2$ be sets whose cardinalities $\card {T_1}$ and $\card {T_2}$ are both $n$. Let $\struct {\map \Gamma {T_1}, \circ}$ and $\struct {\map \Gamma {T_2}, \circ}$ be the symmetric groups on $T_1$ and $T_2$ respectively. Then $\struct {\map \Gamma {T_1}, \circ}$ and $\struct {\map \Gamma {T_2}, \circ}$ are isomorphic. \end{theorem} \begin{proof} Consider the symmetric group on $n$ letters $S_n$. From Symmetric Group on n Letters is Isomorphic to Symmetric Group we have that: :$\struct {\Gamma \paren {T_1}, \circ}$ is isomorphic to $S_n$ :$\struct {\Gamma \paren {T_2}, \circ}$ is isomorphic to $S_n$ and hence from Isomorphism is Equivalence Relation: :$\struct {\Gamma \paren {T_1}, \circ}$ is isomorphic to $\struct {\Gamma \paren {T_2}, \circ}$. {{qed}} \end{proof}
22506
\section{Symmetric Preordering is Equivalence Relation} Tags: Preorder Theory, Equivalence Relations, Symmetric Relations \begin{theorem} Let $\RR \subseteq S \times S$ be a preordering on a set $S$. Let $\RR$ also be symmetric. Then $\RR$ is an equivalence relation on $S$. \end{theorem} \begin{proof} By definition, a preordering on $S$ is a relation on $S$ which is: :$(1): \quad$ reflexive and: :$(2): \quad$ transitive. Thus $\RR$ is a relation on $S$ which is reflexive, transitive and symmetric. Thus by definition $\RR$ is an equivalence relation on $S$. {{qed}} \end{proof}
22507
\section{Symmetric Transitive and Serial Relation is Reflexive} Tags: Reflexive Relations, Equivalence Relations, Symmetric Relations, Serial Relations, Transitive Relations, Relations \begin{theorem} Let $\RR$ be a relation which is: :symmetric :transitive :serial. Then $\RR$ is reflexive. Thus such a relation is an equivalence. \end{theorem} \begin{proof} Let $S$ be a set on which $\RR$ is a relation which is symmetric, transitive and serial. As $\RR$ is symmetric: :$x \mathrel \RR y \implies y \mathrel \RR x$ As $\RR$ is transitive: :$x \mathrel \RR y \land y \mathrel \RR x \implies x \mathrel \RR x$ As $\RR$ is serial: :$\forall x \in S: \exists y \in S: x \mathrel \RR y$ Let $x \in S$. Then {{begin-eqn}} {{eqn | q = \exists y \in S | l = \tuple {x, y} | o = \in | r = \RR | c = as $\RR$ is serial }} {{eqn | ll= \leadsto | l = \tuple {y, x} | o = \in | r = \RR | c = as $\RR$ is symmetric }} {{eqn | ll= \leadsto | l = \tuple {x, x} | o = \in | r = \RR | c = as $\RR$ is transitive }} {{end-eqn}} Thus: :$\forall x: x \mathrel \RR x$ and by definition $\RR$ is reflexive. It follows by definition that such a relation is an equivalence relation. {{qed}} \end{proof}
22508
\section{Symmetric and Antisymmetric Relation is Transitive} Tags: Transitive Relations, Symmetric Relations \begin{theorem} Let $S$ be a set. Let $\RR \subseteq S \times S$ be a relation in $S$ which is both symmetric and antisymmetric. Then $\RR$ is transitive. \end{theorem} \begin{proof} Let $\tuple {x, y}, \tuple {y, z} \in \RR$. By Relation is Symmetric and Antisymmetric iff Coreflexive: :$x = y, y = z$ and so trivially: :$\tuple {x, z} = \tuple {x, x} \in \RR$ Thus $\RR$ is transitive. {{qed}} \end{proof}
22509
\section{Symmetry Group is Group} Tags: Symmetry Groups, Group Examples \begin{theorem} Let $P$ be a geometric figure. Let $S_P$ be the set of all symmetries of $P$. Let $\circ$ denote composition of mappings. The symmetry group $\struct {S_P, \circ}$ is indeed a group. \end{theorem} \begin{proof} By definition, a symmetry mapping is a bijection, and hence a permutation. From Symmetric Group is Group, the set of all permutations on $P$ forms the symmetric group $\struct {\map \Gamma P, \circ}$ on $P$. Thus $S_P$ is a subset of $\map \Gamma P$. Let $A$ and $B$ be symmetry mappings on $P$. From Composition of Symmetries is Symmetry, $A \circ B$ is also a symmetry mapping on $P$. Also, by the definition of symmetry, $A^{-1}$ is also a symmetry mapping on $P$. Thus we have: :$A, B \in S_P \implies A \circ B \in S_P$ :$A \in S_P \implies A^{-1} \in S_P$ The result follows from the Two-Step Subgroup Test. {{qed}} \end{proof}
22510
\section{Symmetry Group of Equilateral Triangle is Group} Tags: Symmetric Group, Symmetry Group of Equilateral Triangle, Symmetry Groups, Group Examples \begin{theorem} The symmetry group of the equilateral triangle is a group. \end{theorem} \begin{proof} Let us refer to this group as $D_3$. Taking the group axioms in turn: \end{proof}
22511
\section{Symmetry Group of Line Segment is Group} Tags: Symmetry Group of Line Segment \begin{theorem} The symmetry group of the line segment is a group. \end{theorem} \begin{proof} Let us refer to this group as $D_1$. Taking the group axioms in turn: \end{proof}
22512
\section{Symmetry Group of Rectangle is Klein Four-Group} Tags: Klein Four-Group, Symmetry Group of Rectangle \begin{theorem} The symmetry group of the rectangle is the Klein $4$-group. \end{theorem} \begin{proof} Comparing the Cayley tables of the symmetry group of the rectangle with the Klein $4$-group the isomorphism can be seen: {{:Symmetry Group of Rectangle/Cayley Table}} {{:Klein Four-Group/Cayley Table}} Thus the required isomorphism $\phi$ can be set up as: {{begin-eqn}} {{eqn | l = \map \phi e | r = e }} {{eqn | l = \map \phi r | r = a }} {{eqn | l = \map \phi h | r = b }} {{eqn | l = \map \phi v | r = c }} {{end-eqn}} {{qed}} \end{proof}
22513
\section{Symmetry Group of Square is Group} Tags: Symmetry Groups: Examples, Examples of Symmetry Groups, Groups of Order 8, Symmetry Group of Square, Symmetry Group Examples, Symmetry Groups \begin{theorem} The symmetry group of the square is a non-abelian group. \end{theorem} \begin{proof} Let us refer to this group as $D_4$. Taking the group axioms in turn: \end{proof}
22514
\section{Symmetry Rule for Binomial Coefficients} Tags: Discrete Mathematics, Symmetry Rule for Binomial Coefficients, Binomial Coefficients \begin{theorem} Let $n \in \Z_{>0}, k \in \Z$. Then: :$\dbinom n k = \dbinom n {n - k}$ where $\dbinom n k$ is a binomial coefficient. \end{theorem} \begin{proof} Follows directly from the definition, as follows. If $k < 0$ then $n - k > n$. Similarly, if $k > n$, then $n - k < 0$. In both cases $\displaystyle \binom n k = \binom n {n - k} = 0$. Let $0 \le k \le n$. {{begin-eqn}} {{eqn | l=\binom n k | r=\frac {n!} {k! \ \left({n - k}\right)!} | c= }} {{eqn | r=\frac {n!} {\left({n - k}\right)! \ k!} | c= }} {{eqn | r=\frac {n!} {\left({n - k}\right)! \ \left ({n - \left({n - k}\right)}\right)!} | c= }} {{eqn | r=\binom n {n - k} | c= }} {{end-eqn}} {{qed}} \end{proof}
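For example, with $n = 7$ and $k = 2$: :$\dbinom 7 2 = \dfrac {7!} {2! \, 5!} = 21 = \dfrac {7!} {5! \, 2!} = \dbinom 7 5$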
22515
\section{Symmetry Rule for Binomial Coefficients/Complex Numbers} Tags: Symmetry Rule for Binomial Coefficients \begin{theorem} For all $z, w \in \C$ such that it is not the case that $z$ is a negative integer and $w$ an integer: :$\dbinom z w = \dbinom z {z - w}$ where $\dbinom z w$ is a binomial coefficient. \end{theorem} \begin{proof} From the definition of the binomial coefficient: :$\dbinom z w := \ds \lim_{\zeta \mathop \to z} \lim_{\omega \mathop \to w} \dfrac {\map \Gamma {\zeta + 1} } {\map \Gamma {\omega + 1} \map \Gamma {\zeta - \omega + 1} }$ where $\Gamma$ denotes the Gamma function. {{begin-eqn}} {{eqn | l = \dbinom z w | r = \lim_{\zeta \mathop \to z} \lim_{\omega \mathop \to w} \dfrac {\map \Gamma {\zeta + 1} } {\map \Gamma {\omega + 1} \map \Gamma {\zeta - \omega + 1} } | c = }} {{eqn | r = \lim_{\zeta \mathop \to z} \lim_{\omega \mathop \to w} \dfrac {\map \Gamma {\zeta + 1} } {\map \Gamma {\zeta - \omega + 1} \map \Gamma {\omega + 1} } | c = }} {{eqn | r = \lim_{\zeta \mathop \to z} \lim_{\omega \mathop \to w} \dfrac {\map \Gamma {\zeta + 1} } {\map \Gamma {\zeta - \omega + 1} \map \Gamma {\zeta - \paren {\zeta - \omega} + 1} } | c = }} {{eqn | r = \dbinom z {z - w} | c = }} {{end-eqn}} {{qed}} \end{proof}
22516
\section{Symmetry Rule for Gaussian Binomial Coefficients} Tags: Symmetry Rule for Binomial Coefficients, Gaussian Binomial Coefficients \begin{theorem} Let $q \in \R_{\ne 1}, n \in \Z_{>0}, k \in \Z$. Then: :$\dbinom n k_q = \dbinom n {n - k}_q$ where $\dbinom n k_q$ is a Gaussian binomial coefficient. \end{theorem} \begin{proof} If $k < 0$ then $n - k > n$. Similarly, if $k > n$, then $n - k < 0$. In both cases: :$\dbinom n k_q = \dbinom n {n - k}_q = 0$ Let $0 \le k \le n$. Consider the case $k \le \dfrac n 2$. Then $k \le n - k$. {{begin-eqn}} {{eqn | l = \binom n {n - k}_q | r = \prod_{j \mathop = 0}^{\paren {n - k} - 1} \frac {1 - q^{n - j} } {1 - q^{j + 1} } | c = {{Defof|Gaussian Binomial Coefficient}} }} {{eqn | r = \paren {\frac {1 - q^{n - 0} } {1 - q^{0 + 1} } } \paren {\frac {1 - q^{n - 1} } {1 - q^{1 + 1} } } \paren {\frac {1 - q^{n - 2} } {1 - q^{2 + 1} } } \cdots \paren {\frac {1 - q^{n - \paren {\paren {n - k} - 1} } } {1 - q^{\paren {\paren {n - k} - 1} + 1} } } | c = }} {{eqn | r = \paren {\frac {1 - q^{n - 0} } {1 - q^{0 + 1} } } \paren {\frac {1 - q^{n - 1} } {1 - q^{1 + 1} } } \paren {\frac {1 - q^{n - 2} } {1 - q^{2 + 1} } } \cdots \paren {\frac {1 - q^{n - \paren {k - 1} } } {1 - q^{\paren {k - 1} + 1} } } \paren {\frac {1 - q^{n - k} } {1 - q^{k + 1} } } \cdots \paren {\frac {1 - q^{k + 1} } {1 - q^{n - k} } } | c = }} {{eqn | r = \paren {\frac {1 - q^{n - 0} } {1 - q^{0 + 1} } } \paren {\frac {1 - q^{n - 1} } {1 - q^{1 + 1} } } \paren {\frac {1 - q^{n - 2} } {1 - q^{2 + 1} } } \cdots \paren {\frac {1 - q^{n - \paren {k - 1} } } {1 - q^{\paren {k - 1} + 1} } } | c = The tail cancels out }} {{eqn | r = \binom n k_q | c = {{Defof|Gaussian Binomial Coefficient}} }} {{end-eqn}} The case $k \ge \dfrac n 2$ can be done by observing: :$n - k \le \dfrac n 2$ and hence by the above: :$\dbinom n k_q = \dbinom n {n - \paren {n - k} }_q = \dbinom n {n - k}_q$ {{qed}} \end{proof}
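For example, with $n = 3$: :$\dbinom 3 1_q = \dfrac {1 - q^3} {1 - q} = 1 + q + q^2$ and: :$\dbinom 3 2_q = \dfrac {\paren {1 - q^3} \paren {1 - q^2} } {\paren {1 - q} \paren {1 - q^2} } = 1 + q + q^2$ so that $\dbinom 3 1_q = \dbinom 3 2_q$, as the theorem asserts.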
22517
\section{Symmetry in Space Implies Conservation of Momentum} Tags: Laws of Conservation \begin{theorem} The total derivative of the action $S_{12}$ from states $1$ to $2$ with respect to position is equal to the difference in momentum from states $1$ to $2$: :$\dfrac {\d S_{12} } {\d x} = p_2 - p_1$ {{MissingLinks|Although we do have a page Definition:State, it refers to a concept in game theory and not physics.}} \end{theorem} \begin{proof} From the definition of generalized momentum and the Euler-Lagrange Equations: {{begin-eqn}} {{eqn | l = 0 | r = \frac \d {\d t} \frac {\partial \LL} {\partial \dot x} - \frac {\partial \LL} {\partial x} | c = }} {{eqn | r = \dot p - \frac {\partial \LL} {\partial x} | c = }} {{eqn | ll= \leadsto | l = \dot p | r = \frac {\partial \LL} {\partial x} | c = }} {{end-eqn}} Therefore, via the definition of action, Definite Integral of Partial Derivative and the Fundamental Theorem of Calculus: {{begin-eqn}} {{eqn | l = \frac {\d S_{12} } {\d x} | r = \frac \d {\d x} \int_{t_1}^{t_2} \LL \rd t | c = }} {{eqn | r = \int_{t_1}^{t_2} \frac {\partial \LL} {\partial x} \rd t | c = }} {{eqn | r = \int_{t_1}^{t_2} \dot p \rd t | c = }} {{eqn | r = p_2 - p_1 | c = }} {{end-eqn}} {{qed}} Category:Laws of Conservation \end{proof}
22518
\section{Symmetry of Bernoulli Polynomial} Tags: Bernoulli Polynomials \begin{theorem} Let $\map {B_n} x$ denote the $n$th Bernoulli polynomial. Then: :$\map {B_n} {1 - x} = \paren {-1}^n \map {B_n} x$ \end{theorem} \begin{proof} Let $\map G {t, x}$ denote the Generating Function of Bernoulli Polynomials: :$\map G {t, x} = \dfrac {t e^{t x} } {e^t - 1}$ Then: {{begin-eqn}} {{eqn | l = \map G {t, 1 - x} | r = \frac {t e^{t \paren {1 - x} } } {e^t - 1} }} {{eqn | r = \frac {t e^{t - t x} } {e^t - 1} }} {{eqn | r = \frac {t e^{-t x} } {1 - e^{-t} } }} {{eqn | r = \frac {-t e^{-t x} } {e^{-t} - 1} }} {{eqn | r = \map G {-t, x} }} {{end-eqn}} Thus: {{begin-eqn}} {{eqn | l = \map G {t, 1 - x} | r = \map G {-t, x} }} {{eqn | ll= \leadsto | l = \sum_{k \mathop = 0}^\infty \frac {\map {B_k} {1 - x} } {k!} t^k | r = \sum_{k \mathop = 0}^\infty \frac {\paren {-1}^k \map {B_k} x} {k!} t^k }} {{eqn | ll= \leadsto | l = \map {B_k} {1 - x} | r = \paren {-1}^k \map {B_k} x }} {{end-eqn}} {{qed}} Category:Bernoulli Polynomials \end{proof}
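For example, $\map {B_2} x = x^2 - x + \dfrac 1 6$ and $\map {B_3} x = x^3 - \dfrac 3 2 x^2 + \dfrac 1 2 x$, and direct expansion gives: :$\map {B_2} {1 - x} = \paren {1 - x}^2 - \paren {1 - x} + \dfrac 1 6 = x^2 - x + \dfrac 1 6 = \map {B_2} x$ :$\map {B_3} {1 - x} = -x^3 + \dfrac 3 2 x^2 - \dfrac 1 2 x = -\map {B_3} x$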
22519
\section{Symmetry of Relations is Symmetric} Tags: Relations, Symmetric Relations \begin{theorem} Let $\RR$ be a relation on $S$ which is symmetric. Then: :$\tuple {x, y} \in \RR \iff \tuple {y, x} \in \RR$. \end{theorem} \begin{proof} Let $\RR$ be symmetric. {{begin-eqn}} {{eqn | l = \tuple {x, y} \in \RR | o = \implies | r = \tuple {y, x} \in \RR | c = {{Defof|Symmetric Relation}} }} {{eqn | l = \tuple {y, x} \in \RR | o = \implies | r = \tuple {x, y} \in \RR | c = {{Defof|Symmetric Relation}} }} {{eqn | ll= \leadsto | l = \leftparen {\tuple {x, y} \in \RR} | o = \iff | r = \rightparen {\tuple {y, x} \in \RR} | c = {{Defof|Biconditional}} }} {{end-eqn}} {{qed}} Category:Symmetric Relations \end{proof}
22520
\section{Syndrome is Zero iff Vector is Codeword} Tags: Linear Codes \begin{theorem} Let $C$ be a linear $\tuple {n, k}$-code whose master code is $\map V {n, p}$. Let $G$ be a (standard) generator matrix for $C$. Let $P$ be a standard parity check matrix for $C$. Let $w \in \map V {n, p}$. Then the syndrome of $w$ is zero {{iff}} $w$ is a codeword of $C$. \end{theorem} \begin{proof} Let $G = \paren {\begin{array} {c|c} \mathbf I & \mathbf A \end{array} }$. Let $c \in \map V {n, p}$. Then, by definition of $G$, $c$ is a codeword of $C$ {{iff}} $c$ is of the form $u G$, where $u \in \map V {k, p}$. Thus $c \in C$ {{iff}}: {{begin-eqn}} {{eqn | l = c | r = u G | c = }} {{eqn | r = u \paren {\begin{array} {c {{!}} c} \mathbf I & \mathbf A \end{array} } | c = }} {{eqn | r = \paren {\begin{array} {c {{!}} c} u & v \end{array} } | c = }} {{end-eqn}} where: :$v = u \mathbf A$ :$\paren {\begin{array} {c|c} u & v \end{array} }$ denotes the $1 \times n$ matrix formed from the $k$ elements of $u$ and the $n - k$ elements of $v$. Let $w \in \map V {n, p}$. $w$ can be expressed in the form: :$w = \paren {\begin{array} {c|c} u_1 & v_1 \end{array} }$ where $u_1 \in \map V {k, p}$. The syndrome of $w$ is then calculated as: {{begin-eqn}} {{eqn | l = \map S w | r = \paren {\begin{array} {c {{!}} c} -\mathbf A^\intercal & \mathbf I \end{array} } w^\intercal | c = }} {{eqn | r = \paren {\begin{array} {c {{!}} c} -\mathbf A^\intercal & \mathbf I \end{array} } \paren {\begin{array} {c {{!}} c} u_1^\intercal & v_1^\intercal \end{array} } | c = }} {{eqn | r = -\mathbf A^\intercal u_1^\intercal + v_1^\intercal | c = }} {{end-eqn}} It follows that the syndrome of $w$ is zero {{iff}} $w$ is the concatenation of $u_1$ and $v_1$, where: :$v_1^\intercal = \mathbf A^\intercal u_1^\intercal = \paren {u_1 \mathbf A}^\intercal$ Thus the syndrome of $w$ is zero {{iff}} $w$ is a codeword of $C$. {{qed}} \end{proof}
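As a small illustration, consider the $\tuple {3, 2}$-code with master code $\map V {3, 2}$ and standard generator matrix $G = \begin {pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end {pmatrix}$, so that $\mathbf A = \begin {pmatrix} 1 \\ 1 \end {pmatrix}$ and, over $\Z_2$, the standard parity check matrix is $P = \paren {\begin{array} {c|c} -\mathbf A^\intercal & \mathbf I \end{array} } = \begin {pmatrix} 1 & 1 & 1 \end {pmatrix}$. The codewords are $000$, $011$, $101$ and $110$. For $w = 101$ the syndrome is $1 + 0 + 1 = 0$ and $w$ is a codeword, while for $w = 100$ the syndrome is $1 + 0 + 0 = 1 \ne 0$ and $w$ is not a codeword.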
22521
\section{Synthetic Basis formed from Synthetic Sub-Basis} Tags: Sub-Bases, Topology, Topological Bases \begin{theorem} Let $X$ be a set. Let $\SS$ be a synthetic sub-basis on $X$. Define: :$\ds \BB = \set {\bigcap \FF: \FF \subseteq \SS, \text{$\FF$ is finite} }$ Then $\BB$ is a synthetic basis on $X$. \end{theorem} \begin{proof} We consider $X$ as the universe. Thus, in accordance with Intersection of Empty Set, we take the convention that: :$\ds \bigcap \O = X \in \BB$ By Set is Subset of Union: General Result, it follows that: :$\ds X \subseteq \bigcup \BB$ That is, axiom $(\text B 1)$ for a synthetic basis is satisfied. We have that $\BB \subseteq \powerset X$. Let $B_1, B_2 \in \BB$. Then there exist finite $\FF_1, \FF_2 \subseteq \SS$ such that: :$\ds B_1 = \bigcap \FF_1$ :$\ds B_2 = \bigcap \FF_2$ It follows that: :$\ds B_1 \cap B_2 = \bigcap \paren {\FF_1 \cup \FF_2}$ {{explain|proof that $\ds \bigcap_{i \mathop \in I} \bigcap \mathbb S_i$ is equal to $\ds \bigcap \bigcup_{i \mathop \in I} \mathbb S_i$}} By Union is Smallest Superset, $\FF_1 \cup \FF_2 \subseteq \SS$. We have that $\FF_1 \cup \FF_2$ is finite. Hence $B_1 \cap B_2 \in \BB$, so it follows by definition that axiom $(\text B 2)$ for a synthetic basis is satisfied. {{qed}} \end{proof}
22522
\section{System of Simultaneous Equations may have Multiple Solutions} Tags: Simultaneous Equations \begin{theorem} Let $S$ be a system of simultaneous equations. Then it is possible that $S$ may have a solution set which contains more than one element. \end{theorem} \begin{proof} Consider this system of simultaneous linear equations: {{begin-eqn}} {{eqn | n = 1 | l = x_1 - 2 x_2 + x_3 | r = 1 }} {{eqn | n = 2 | l = 2 x_1 - x_2 + x_3 | r = 2 }} {{end-eqn}} From its evaluation it has the following solutions: {{begin-eqn}} {{eqn | l = x_1 | r = 1 - \dfrac t 3 }} {{eqn | l = x_2 | r = \dfrac t 3 }} {{eqn | l = x_3 | r = t }} {{end-eqn}} where $t$ is any number. Hence there are as many solutions as the cardinality of the domain of $t$. {{qed}} \end{proof}
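Substituting back confirms this: for every value of $t$: :$\paren {1 - \dfrac t 3} - 2 \paren {\dfrac t 3} + t = 1$ :$2 \paren {1 - \dfrac t 3} - \dfrac t 3 + t = 2$ so each choice of $t$ yields a solution, and distinct values of $t$ yield distinct solutions.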
22523
\section{System of Simultaneous Equations may have No Solution} Tags: Simultaneous Equations \begin{theorem} Let $S$ be a system of simultaneous equations. Then it is possible that $S$ may have a solution set which is empty. \end{theorem} \begin{proof} Consider this system of simultaneous linear equations: {{begin-eqn}} {{eqn | n = 1 | l = x_1 + x_2 | r = 2 }} {{eqn | n = 2 | l = 2 x_1 + 2 x_2 | r = 3 }} {{end-eqn}} From its evaluation it is seen to have no solutions. Hence the result. {{qed}} \end{proof}
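Indeed, multiplying $(1)$ by $2$ gives: :$2 x_1 + 2 x_2 = 4$ which is incompatible with $(2)$, so no values of $x_1$ and $x_2$ can satisfy both equations.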
22524
\section{System of Simultaneous Equations may have Unique Solution} Tags: Simultaneous Equations \begin{theorem} Let $S$ be a system of simultaneous equations. Then it is possible that $S$ may have a solution set which is a singleton. \end{theorem} \begin{proof} Consider this system of simultaneous linear equations: {{begin-eqn}} {{eqn | n = 1 | l = x_1 - 2 x_2 + x_3 | r = 1 }} {{eqn | n = 2 | l = 2 x_1 - x_2 + x_3 | r = 2 }} {{eqn | n = 3 | l = 4 x_1 + x_2 - x_3 | r = 1 }} {{end-eqn}} From its evaluation it has the following unique solution: {{begin-eqn}} {{eqn | l = x_1 | r = \dfrac 1 2 }} {{eqn | l = x_2 | r = \dfrac 1 2 }} {{eqn | l = x_3 | r = \dfrac 3 2 }} {{end-eqn}} Hence the result. {{qed}} \end{proof}
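Substituting back confirms this: :$\dfrac 1 2 - 2 \paren {\dfrac 1 2} + \dfrac 3 2 = 1$ :$2 \paren {\dfrac 1 2} - \dfrac 1 2 + \dfrac 3 2 = 2$ :$4 \paren {\dfrac 1 2} + \dfrac 1 2 - \dfrac 3 2 = 1$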
22525
\section{Szpilrajn Extension Theorem} Tags: Order Theory \begin{theorem} Let $\struct {S, \prec}$ be a strictly ordered set. {{Disambiguate|Definition:Strictly Ordered Set}} Then there is a strict total ordering on $S$ of which $\prec$ is a subset. \end{theorem} \begin{proof} {{proof wanted}} {{Namedfor|Edward Szpilrajn|cat = Marczewski}} Category:Order Theory \end{proof}
22526
\section{T0 Property is Hereditary} Tags: Topological Subspaces, T0 Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space which is a $T_0$ (Kolmogorov) space. Let $T_H = \struct {H, \tau_H}$, where $\O \subset H \subseteq S$, be a subspace of $T$. Then $T_H$ is a $T_0$ (Kolmogorov) space. \end{theorem} \begin{proof} Let $T$ be a $T_0$ (Kolmogorov) space. That is: :$\forall x, y \in S$ such that $x \ne y$, either: ::$\exists U \in \tau: x \in U, y \notin U$ :or: ::$\exists U \in \tau: y \in U, x \notin U$ We have that the set $\tau_H$ is defined as: :$\tau_H := \set {U \cap H: U \in \tau}$ Let $x, y \in H$ such that $x \ne y$. Then as $x, y \in S$ we have that: :$\exists U \in \tau: x \in U, y \notin U$ or: :$\exists U \in \tau: y \in U, x \notin U$ Then either: :$U \cap H \in \tau_H: x \in U \cap H, y \notin U \cap H$ or: :$U \cap H \in \tau_H: y \in U \cap H, x \notin U \cap H$ and so the $T_0$ axiom is satisfied. {{qed}} \end{proof}
22527
\section{T0 Space is Preserved under Closed Bijection} Tags: Separation Axioms, T0 Spaces \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a closed bijection. If $T_A$ is a $T_0$ (Kolmogorov) space, then so is $T_B$. \end{theorem} \begin{proof} Let $T_A$ be a $T_0$ (Kolmogorov) space. By Bijection is Open iff Closed, $\phi$ is an open bijection. By definition: :$\forall x, y \in S_A$ such that $x \ne y$, either: ::$\exists U \in \tau_A: x \in U, y \notin U$ :or: ::$\exists U \in \tau_A: y \in U, x \notin U$ Suppose that: :$\exists x, y \in S_B: x \ne y: \forall V \in \tau_B: x, y \in V \lor x, y \notin V$ That is: :$\exists x, y \in S_B: x \ne y: \forall V \in \tau_B: \set {x, y} \subseteq V \lor \set {x, y} \cap V = \O$ From Image of Subset under Relation is Subset of Image: Corollary 3 it follows that: :$\forall V \in \tau_B: \phi^{-1} \sqbrk {\set {x, y} } \subseteq \phi^{-1} \sqbrk V \lor \phi^{-1} \sqbrk {\set {x, y} } \cap \phi^{-1} \sqbrk V = \O$ that is: :$\forall V \in \tau_B: \map {\phi^{-1} } x, \map {\phi^{-1} } y \in \phi^{-1} \sqbrk V \lor \map {\phi^{-1} } x, \map {\phi^{-1} } y \notin \phi^{-1} \sqbrk V$ As $\phi$ is an open bijection, every $U \in \tau_A$ is of the form $\phi^{-1} \sqbrk V$ for some $V \in \tau_B$, namely $V = \phi \sqbrk U$. Thus: :$\forall U \in \tau_A: \map {\phi^{-1} } x, \map {\phi^{-1} } y \in U \lor \map {\phi^{-1} } x, \map {\phi^{-1} } y \notin U$ As $\phi$ is a bijection and $x \ne y$, also $\map {\phi^{-1} } x \ne \map {\phi^{-1} } y$. This contradicts the condition that $T_A$ is a $T_0$ (Kolmogorov) space. It follows that $T_B$ must also be a $T_0$ (Kolmogorov) space. {{qed}} \end{proof}
22528
\section{T0 Space is Preserved under Homeomorphism} Tags: Separation Axioms, Homeomorphisms, T0 Spaces \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a homeomorphism. If $T_A$ is a $T_0$ (Kolmogorov) space, then so is $T_B$. \end{theorem} \begin{proof} By definition of homeomorphism, $\phi$ is a closed continuous bijection. The result follows from $T_0$ (Kolmogorov) Space is Preserved under Closed Bijection. {{qed}} \end{proof}
22529
\section{T1/2 Space is T0 Space} Tags: T1: 2 Spaces, T0 Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a $T_{\frac 1 2}$ topological space. Then $T$ is a $T_0$ space. \end{theorem} \begin{proof} By Characterization of T0 Space by Closures of Singletons it suffices to prove that :$\forall x, y \in S: x \ne y \implies x \notin \set y^- \lor y \notin \set x^-$ where $\set y^-$ denotes the closure of $\set y$. Let $x, y$ be points of $T$ such that: :$x \ne y$ {{AimForCont}}: :$x \in \set y^- \land y \in \set x^-$ We will prove that: :$x \notin \set x'$ where $\set x'$ denotes the derivative of $\set x$. {{AimForCont}}: :$x \in \set x'$ As $S$ is open by definition of topological space, by Characterization of Derivative by Open Sets: :$\exists z \in S: z \in \set x \cap S \land z \ne x$ By definition of intersection: :$z \in \set x$ By definition of singleton, $z = x$, which contradicts $z \ne x$. Thus $x \notin \set x'$. We will prove that: :$(1): \quad \lnot \forall G \in \tau: y \in G \implies \set x \cap G \ne \O$ {{AimForCont}}: :$\forall G \in \tau: y \in G \implies \set x \cap G \ne \O$ As a sublemma we will show that: :$\forall U \in \tau: y \in U \implies \exists r \in S: r \in \set x \cap U \land y \ne r$ Let $U \in \tau$ such that: :$y \in U$ Then by assumption: :$\set x \cap U \ne \O$ By definition of empty set: :$\exists z: z \in \set x \cap U$ By definition of intersection: :$z \in \set x$ Then by definition of singleton: :$z = x \ne y$ Thus: :$\exists r \in S: r \in \set x \cap U \land y \ne r$ Then by Characterization of Derivative by Open Sets: :$y \in \set x'$ By definition of relative complement: :$y \notin \relcomp S {\set x'} \land x \in \relcomp S {\set x'}$ By definition of $T_{\frac 1 2}$ space: :$\set x'$ is closed By definition of closed set: :$\relcomp S {\set x'}$ is open By $x \in \set y^-$ and by Condition for Point being in Closure: :$\set y \cap \relcomp S {\set x'} \ne \O$ Then by definition of empty set: :$\exists z: z \in \set y \cap \relcomp S {\set x'}$ By definition of intersection: :$z \in \set y \land z \in \relcomp S {\set x'}$ By definition of singleton: :$z = y$ Thus $z \in \relcomp S {\set x'}$ contradicts $y \notin \relcomp S {\set x'}$. This ends the proof of $(1)$. Then by $(1)$ and Condition for Point being in Closure: :$y \notin \set x^-$ This contradicts $y \in \set x^-$. {{qed}} \end{proof}
22530
\section{T1 Property is Hereditary} Tags: T1 Spaces, Topological Subspaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space which is a $T_1$ (Fréchet) space. Let $T_H = \struct {H, \tau_H}$, where $\O \subset H \subseteq S$, be a subspace of $T$. Then $T_H$ is a $T_1$ (Fréchet) space. \end{theorem} \begin{proof} Let $T$ be a $T_1$ (Fréchet) space. That is: :$\forall x, y \in S$ such that $x \ne y$, both: ::$\exists U \in \tau: x \in U, y \notin U$ :and: ::$\exists U \in \tau: y \in U, x \notin U$ We have that the set $\tau_H$ is defined as: :$\tau_H := \set {U \cap H: U \in \tau}$ Let $x, y \in H$ such that $x \ne y$. Then as $x, y \in S$: :$\exists U \in \tau: x \in U, y \notin U$ and: :$\exists U \in \tau: y \in U, x \notin U$ Then both: :$U \cap H \in \tau_H: x \in U \cap H, y \notin U \cap H$ and: :$U \cap H \in \tau_H: y \in U \cap H, x \notin U \cap H$ and so the $T_1$ axiom is satisfied. {{qed}} \end{proof}
22531
\section{T1 Space is Preserved under Closed Bijection} Tags: T1 Spaces, Separation Axioms \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a closed bijection. If $T_A$ is a $T_1$ (Fréchet) space, then so is $T_B$. \end{theorem} \begin{proof} Let $T_A$ be a $T_1$ (Fréchet) space. By definition, all points in $T_A$ are closed. Let $a \in S_A$. Then $\set a$ is a closed set. As $\phi$ is a closed mapping it follows directly that $\phi \sqbrk {\set a}$ is closed. As $\phi$ is a bijection it follows that every point in $S_B$ is the image under $\phi$ of a single point in $S_A$. Hence every point in $S_B$ is closed. That is, $T_B$ is a $T_1$ (Fréchet) space. {{qed}} \end{proof}
22532
\section{T1 Space is Preserved under Homeomorphism} Tags: T1 Spaces, Separation Axioms, Homeomorphisms \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a homeomorphism. If $T_A$ is a $T_1$ (Fréchet) space, then so is $T_B$. \end{theorem} \begin{proof} By definition of homeomorphism, $\phi$ is a closed continuous bijection. The result follows from $T_1$ (Fréchet) Space is Preserved under Closed Bijection. {{qed}} \end{proof}
22533
\section{T1 Space is T0 Space} Tags: T1 Spaces, Separation Axioms, T0 Spaces \begin{theorem} Let $\struct {S, \tau}$ be a Fréchet ($T_1$) space. Then $\struct {S, \tau}$ is also a Kolmogorov ($T_0$) space. \end{theorem} \begin{proof} Let $\struct {S, \tau}$ be a $T_1$ space. Let $x, y \in S: x \ne y$. From the definition of $T_1$ space: :'''Both''' ::$\exists U \in \tau: x \in U, y \notin U$ :'''and''': ::$\exists V \in \tau: y \in V, x \notin V$ From the Rule of Simplification: : $\exists U \in \tau: x \in U, y \notin U$ From the Rule of Addition: :'''Either''' ::$\exists U \in \tau: x \in U, y \notin U$ :'''or''': ::$\exists V \in \tau: y \in V, x \notin V$ which is precisely the definition of a Kolmogorov ($T_0$) space. {{qed}} \end{proof}
22534
\section{T1 Space is T1/2 Space} Tags: T1 Spaces, T1: 2 Spaces \begin{theorem} Let $T$ be a $T_1$ topological space. Then $T$ is a $T_{\frac 1 2}$ space. \end{theorem} \begin{proof} By Closure of Derivative is Derivative in T1 Space: :$\forall A \subseteq T: \left({A'}\right)^- = A'$ where :$A'$ denotes the derivative of $A$ :$\left({A'}\right)^-$ denotes the closure of $A'$ Then by Topological Closure is Closed: :$\forall A \subseteq T: A'$ is closed Thus by definition: :$T$ is a $T_{\frac 1 2}$ space {{qed}} \end{proof}
22535
\section{T2 Property is Hereditary} Tags: Hausdorff Spaces, Topological Subspaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space which is a $T_2$ (Hausdorff) space. Let $T_H = \struct {H, \tau_H}$, where $\O \subset H \subseteq S$, be a subspace of $T$. Then $T_H$ is a $T_2$ (Hausdorff) space. That is, the property of being a $T_2$ (Hausdorff) space is hereditary. \end{theorem} \begin{proof} Let $T = \struct {S, \tau}$ be a $T_2$ (Hausdorff) space. Then: :$\forall x, y \in S, x \ne y: \exists U, V \in \tau: x \in U, y \in V: U \cap V = \O$ That is, for any two distinct elements $x, y \in S$ there exist disjoint open sets $U, V \in \tau$ containing $x$ and $y$ respectively. We have that the set $\tau_H$ is defined as: :$\tau_H := \set {U \cap H: U \in \tau}$ Let $x, y \in H$ such that $x \ne y$. Then as $x, y \in S$ we have that: :$\exists U, V \in \tau: x \in U, y \in V, U \cap V = \O$ As $x, y \in H$ we have that: :$x \in U \cap H, y \in V \cap H: \paren {U \cap H} \cap \paren {V \cap H} = \O$ and so the $T_2$ axiom is satisfied in $H$. {{qed}} \end{proof}
22536
\section{T2 Space is Preserved under Closed Bijection} Tags: Hausdorff Spaces, Separation Axioms \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a closed bijection. If $T_A$ is a $T_2$ (Hausdorff) space, then so is $T_B$. \end{theorem} \begin{proof} Let $T_A$ be a $T_2$ (Hausdorff) space. Then: :$\forall x, y \in S_A, x \ne y: \exists U_A, V_A \in \tau_A: x \in U_A, y \in V_A: U_A \cap V_A = \O$ Suppose that $T_B$ is not Hausdorff. Then: :$\exists a, b \in S_B: a \ne b: \forall U_B, V_B \in \tau_B: a \in U_B, b \in V_B \implies U_B \cap V_B \ne \O$ That is, there exists at least one pair of distinct points $a$ and $b$ such that no open set containing $a$ is disjoint from any open set containing $b$. From the bijective nature of $\phi$, let $x = \map {\phi^{-1} } a$ and $y = \map {\phi^{-1} } b$, so that $x \ne y$. As $T_A$ is a $T_2$ (Hausdorff) space: :$\exists U_A, V_A \in \tau_A: x \in U_A, y \in V_A, U_A \cap V_A = \O$ By Bijection is Open iff Closed, $\phi$ is an open bijection. Hence $\phi \sqbrk {U_A}$ and $\phi \sqbrk {V_A}$ are open in $T_B$, and they contain $a = \map \phi x$ and $b = \map \phi y$ respectively. Thus by the supposition on $a$ and $b$: :$\phi \sqbrk {U_A} \cap \phi \sqbrk {V_A} \ne \O$ But from Image of Intersection under Injection: :$\phi \sqbrk {U_A} \cap \phi \sqbrk {V_A} = \phi \sqbrk {U_A \cap V_A} = \phi \sqbrk \O = \O$ This contradiction shows that $T_B$ must after all be a $T_2$ (Hausdorff) space. {{qed}} \end{proof}
22537
\section{T2 Space is T1 Space} Tags: Hausdorff Spaces, T1 Spaces, Separation Axioms, Definitions: Separation Axioms \begin{theorem} Let $\struct {S, \tau}$ be a $T_2$ (Hausdorff) space. Then $\struct {S, \tau}$ is also a $T_1$ (Fréchet) space. \end{theorem} \begin{proof} From the definition of $T_2$ (Hausdorff) space: :$\forall x, y \in S: x \ne y: \exists U, V \in \tau: x \in U, y \in V: U \cap V = \O$ As $U \cap V = \O$ it follows from the definition of disjoint sets that: :$x \in U \implies x \notin V$ :$y \in V \implies y \notin U$ So if $x \in U, y \in V$ then: :$\exists U \in \tau: x \in U, y \notin U$ :$\exists V \in \tau: y \in V, x \notin V$ which is precisely the definition of a $T_1$ (Fréchet) space. {{qed}} \end{proof}
22538
\section{T3 1/2 Property is Hereditary} Tags: T3 1: 2 Spaces, Topological Subspaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space which is a $T_{3 \frac 1 2}$ space. Let $T_H = \struct {H, \tau_H}$, where $\O \subset H \subseteq S$, be a subspace of $T$. Then $T_H$ is a $T_{3 \frac 1 2}$ space. \end{theorem} \begin{proof} Let $T = \struct {S, \tau}$ be a $T_{3 \frac 1 2}$ space. Then: :For any closed set $F \subseteq S$ and any point $y \in S$ such that $y \notin F$, there exists an Urysohn function for $F$ and $\set y$. We have that the set $\tau_H$ is defined as: :$\tau_H := \set {U \cap H: U \in \tau}$ Let $F \subseteq H$ such that $F$ is closed in $H$. Let $y \in H$ such that $y \notin F$. From Closed Set in Topological Subspace, there exists a set $C$ which is closed in $T$ such that $F = C \cap H$. As $y \in H$ and $y \notin F$, it follows that $y \notin C$. Because $T$ is a $T_{3 \frac 1 2}$ space, we have that there exists an Urysohn function for $C$ and $\set y$: That is, there exists a continuous mapping $f: S \to \closedint 0 1$, where $\closedint 0 1$ is the closed unit interval, such that: :$f {\restriction_C} = 0, f {\restriction_{\set y} } = 1$ where $f {\restriction_C}$ denotes the restriction of $f$ to $C$. That is: :$\forall a \in C: \map f a = 0$ :$\forall b \in \set y: \map f b = 1$ From Continuity of Composite with Inclusion, as $f$ is continuous on $S$, $f {\restriction_H}$ is continuous on $H$. As $F \subseteq C$: :$\forall a \in F: \map f a = 0$ Thus $f {\restriction_H}$ is an Urysohn function for $F$ and $\set y$ in $H$. So the $T_{3 \frac 1 2}$ axiom is satisfied in $H$. {{qed}} \end{proof}
22539
\section{T3 1/2 Space is Preserved under Homeomorphism} Tags: T3 1: 2 Spaces, Separation Axioms, Homeomorphisms \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a homeomorphism. If $T_A$ is a $T_{3 \frac 1 2}$ space, then so is $T_B$. \end{theorem} \begin{proof} Let $F \subseteq S_B$ be closed, and let $y \in S_B$ such that $y \notin F$. Let $G = \phi^{-1} \sqbrk F$ and let $z = \map {\phi^{-1} } y$. As $\phi$ is a homeomorphism, it is a fortiori continuous. Thus by Continuity Defined from Closed Sets, $G$ is closed in $T_A$. As $\phi$ is a bijection and $y \notin F$, it follows that $z \notin G$. Since $T_A$ is a $T_{3 \frac 1 2}$ space, there exists an Urysohn function $f: S_A \to \closedint 0 1$ for $G$ and $\set z$. Define $g: S_B \to \closedint 0 1$ by: :$\map g x = \map f {\map {\phi^{-1} } x}$ As $\phi^{-1}$ is continuous, by Composite of Continuous Mappings is Continuous, $g$ is continuous. Also, for all $x \in F$: :$\map g x = 0$ as $\map {\phi^{-1} } x \in G$. Similarly, because $\map {\phi^{-1} } y = z$: :$\map g y = 1$ Hence $g$ is an Urysohn function for $F$ and $\set y$. Since $F$ and $y$ were arbitrary, it follows that $T_B$ is a $T_{3 \frac 1 2}$ space. {{qed}} \end{proof}
22540
\section{T3 1/2 Space is T3 Space} Tags: T3 1: 2 Spaces, T3 Spaces, Separation Axioms \begin{theorem} Let $T$ be a $T_{3 \frac 1 2}$ space. Then $T$ is also a $T_3$ space. \end{theorem} \begin{proof} Let $T = \struct {S, \tau}$ be a $T_{3 \frac 1 2}$ space. From the definition of $T_{3 \frac 1 2}$ space: :For any closed set $F \subseteq S$ and any point $y \in S$ such that $y \notin F$, there exists an Urysohn function for $F$ and $\set y$. Let $F \subseteq S$ be a closed set in $T$ and let $y \in \relcomp S F$. An Urysohn function for $F$ and $\set y$ is a continuous mapping $f: S \to \closedint 0 1$ where: :$\forall a \in F: \map f a = 0$ :$\map f y = 1$ Let: :$U = \set {x \in S: \map f x < \dfrac 1 2}$ :$V = \set {x \in S: \map f x > \dfrac 1 2}$ Then $U$ and $V$ are the preimages under $f$ of $\closedint 0 1 \cap \openint {-1} {\dfrac 1 2}$ and $\closedint 0 1 \cap \openint {\dfrac 1 2} 2$ respectively, both of which are open in $\closedint 0 1$. As $f$ is continuous, both $U$ and $V$ are open in $T$. As $\map f a = 0$ for all $a \in F$: :$F \subseteq U$ As $\map f y = 1$: :$y \in V$ Suppose $x \in U \cap V$. Then we would have: :$x \in U \implies \map f x < \dfrac 1 2$ :$x \in V \implies \map f x > \dfrac 1 2$ which is impossible. So $U \cap V = \O$. Thus we have: :$\forall F \subseteq S: \relcomp S F \in \tau, y \in \relcomp S F: \exists U, V \in \tau: F \subseteq U, y \in V: U \cap V = \O$ That is, for any closed set $F \subseteq S$ and any point $y \in S$ such that $y \notin F$ there exist disjoint open sets $U, V \in \tau$ such that $F \subseteq U$, $y \in V$. which is precisely the definition of a $T_3$ space. {{qed}} \end{proof}
22541
\section{T3 1/2 Space is not necessarily T2 Space} Tags: Hausdorff Spaces, T3 1: 2 Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a $T_{3 \frac 1 2}$ space. Then it is not necessarily the case that $T$ is a $T_2$ (Hausdorff) space. \end{theorem} \begin{proof} Proof by Counterexample: Let $S$ be a set and let $\PP$ be a partition on $S$ which is specifically not the (trivial) partition of singletons. Let $T = \struct {S, \tau}$ be the partition space whose basis is $\PP$. From Partition Topology is $T_{3 \frac 1 2}$, we have that $T$ is a $T_{3 \frac 1 2}$ space. From Partition Topology is not Hausdorff, $T$ is not a $T_2$ (Hausdorff) space. The result follows. {{qed}} \end{proof}
22542
\section{T3 Property is Hereditary} Tags: T3 Spaces, Topological Subspaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space which is a $T_3$ space. Let $T_H = \struct {H, \tau_H}$, where $\O \subset H \subseteq S$, be a subspace of $T$. Then $T_H$ is a $T_3$ space. \end{theorem} \begin{proof} Let $T = \struct {S, \tau}$ be a $T_3$ space. Then: :$\forall F \subseteq S: \relcomp S F \in \tau, y \in \relcomp S F: \exists U, V \in \tau: F \subseteq U, y \in V: U \cap V = \O$ That is, for any closed set $F \subseteq S$ and any point $y \in S$ such that $y \notin F$ there exist disjoint open sets $U, V \in \tau$ such that $F \subseteq U$, $y \in V$. We have that the set $\tau_H$ is defined as: :$\tau_H := \set {U \cap H: U \in \tau}$ Let $F \subseteq H$ such that $F$ is closed in $H$. Let $y \in H$ such that $y \notin F$. From Closed Set in Topological Subspace, there exists a set $C$ which is closed in $T$ such that $F = C \cap H$. As $y \in H$ and $y \notin F$, it follows that $y \notin C$. Because $T$ is a $T_3$ space, we have that: :$\exists U, V \in \tau: C \subseteq U, y \in V, U \cap V = \O$ As $F \subseteq H$ and $F \subseteq C \subseteq U$, while $y \in H$ and $y \in V$, we have that: :$F \subseteq U \cap H, y \in V \cap H: \paren {U \cap H} \cap \paren {V \cap H} = \O$ and so the $T_3$ axiom is satisfied in $H$. {{qed}} \end{proof}
22543
\section{T3 Space is Preserved under Homeomorphism} Tags: T3 Spaces, Separation Axioms, Homeomorphisms \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a homeomorphism. If $T_A$ is a $T_3$ space, then so is $T_B$. \end{theorem} \begin{proof} Suppose that $T_A$ is a $T_3$ space. Let $F$ be closed in $T_B$. Let $y \in S_B$ such that $y \notin F$. From Preimage of Intersection under Mapping it follows that $\phi^{-1} \sqbrk F$ and $\map {\phi^{-1}} y$ are disjoint. Also, as $\phi$ is a homeomorphism, it is a fortiori continuous. Thus by Continuity Defined from Closed Sets, $\phi^{-1} \sqbrk F$ is closed. Now as $T_A$ is a $T_3$ space, we find disjoint open sets $U_1$ containing $\phi^{-1} \sqbrk F$ and $U_2$ containing $\map {\phi^{-1}} y$. From Image of Subset is Subset of Image, we have $F = \phi \sqbrk {\phi^{-1} \sqbrk F} \subseteq \phi \sqbrk {U_1}$. Here, the first equality follows from Subset equals Image of Preimage iff Mapping is Surjection, as $\phi$ is a fortiori surjective, being a homeomorphism. Mutatis mutandis, we deduce also $\set y \subseteq \phi \sqbrk {U_2}$. From Image of Intersection under Injection it follows that $\phi \sqbrk {U_1}$ and $\phi \sqbrk {U_2}$ are disjoint. Since $\phi$ is a homeomorphism, they are also both open in $T_B$. Therewith, we have constructed two disjoint open sets in $T_B$, one containing $F$, and the other containing $y$. Hence $T_B$ is shown to be a $T_3$ space as well. {{qed}} \end{proof}
22544
\section{T3 Space is Semiregular} Tags: Regular Open Sets, T3 Spaces, Separation Axioms, Semiregular Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a $T_3$ space. Then $T$ is a semiregular space. \end{theorem} \begin{proof} Let $\BB = \set {B \subseteq S: B^{- \circ} = B}$. In other words, $\BB$ is the collection of regular open sets contained in $S$. It follows immediately from the definition of a basis that our theorem is proved if we can show that: :$\BB$ is a cover for $S$ :$\forall U, V \in \BB: \forall x \in U \cap V: \exists W \in \BB: x \in W \subseteq U \cap V$ First we show that $\BB$ is a cover for $S$. Since $S$ is open: :$S^\circ = S$ Since $S$ is closed: :$S^- = S$ Therefore: :$S^{- \circ} = S$ and so: :$S \in \BB$ From Set is Subset of Union: :$S \subseteq \bigcup \BB$ Thus $\BB$ covers $S$. {{qed|lemma}} Next we demonstrate the second condition for $\BB$ to be a basis. Let $U, V \in \BB$, $x \in U \cap V$. We will show that for some $W \in \BB$: :$x \in W \subseteq U \cap V$ By General Intersection Property of Topological Space, $U \cap V \in \tau$. Since $T$ is $T_3$, there is a closed neighborhood $N_x$ around $x$ that is contained in $U \cap V$: :$\exists N_x: \relcomp S {N_x} \in \tau: \exists Q \in \tau: x \in Q \subseteq N_x \subseteq \paren {U \cap V}$ Let $W := {N_x}^{- \circ}$. Then by Interior of Closure is Regular Open, $W^{- \circ} = W$ and $W \in \BB$. On the other hand, since $N_x$ is closed: :${N_x}^- = N_x$, and thus $W = {N_x}^\circ$ By definition of interior, we have $Q \subseteq W \subseteq N_x$. Therefore we have $x \in W \subseteq N_x \subseteq U \cap V$, as desired. It follows that $\BB$ is a basis for $\tau$. Hence the result by definition of semiregular space. {{qed}} \end{proof}
22545
\section{T4 Property Preserved in Closed Subspace} Tags: Topological Subspaces, Normal Spaces, Separation Axioms, Closed Sets, T4 Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space. Let $T_K$ be a subspace of $T$ such that $K$ is closed in $T$. If $T$ is a $T_4$ space then $T_K$ is also a $T_4$ space. That is, the property of being a $T_4$ space is weakly hereditary. \end{theorem} \begin{proof} Let $T = \struct {S, \tau}$ be a $T_4$ space. Then: :$\forall A, B \in \map \complement \tau, A \cap B = \O: \exists U, V \in \tau: A \subseteq U, B \subseteq V, U \cap V = \O$ That is, for any two disjoint closed sets $A, B \subseteq S$ there exist disjoint open sets $U, V \in \tau$ containing $A$ and $B$ respectively. We have that the set $\tau_K$ is defined as: :$\tau_K := \set {U \cap K: U \in \tau}$ where $K$ is closed in $T$. Let $A, B \subseteq K$ be closed in $K$ such that $A \cap B = \O$. From Closed Set in Topological Subspace, there exist sets $A', B'$ which are closed in $T$ such that $A = A' \cap K$ and $B = B' \cap K$. As $K$ is itself closed in $T$, it follows from Topology Defined by Closed Sets that $A$ and $B$, each being the intersection of two closed sets of $T$, are themselves closed in $T$. Because $T$ is a $T_4$ space, we have that: :$\exists U, V \in \tau: A \subseteq U, B \subseteq V, U \cap V = \O$ As $A, B \subseteq K$ we have that: :$A \subseteq U \cap K, B \subseteq V \cap K: \paren {U \cap K} \cap \paren {V \cap K} = \O$ From the definition of topological subspace, both $U \cap K$ and $V \cap K$ are open in $K$. Thus the $T_4$ axiom is satisfied in $K$. {{qed}} \end{proof}
22546
\section{T4 Property Preserved in Closed Subspace/Corollary} Tags: Closed Sets, Topological Subspaces, T4 Spaces, Normal Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space. Let $T_K$ be a subspace of $T$ such that $K$ is closed in $T$. If $T$ is a normal space then $T_K$ is also a normal space. That is, the property of being a normal space is weakly hereditary. \end{theorem} \begin{proof} From the definition, $T = \struct {S, \tau}$ is a normal space {{iff}}: :$\struct {S, \tau}$ is a $T_4$ space :$\struct {S, \tau}$ is a $T_1$ (Fréchet) space. From Separation Properties Preserved in Subspace, any subspace of a $T_1$ space is also a $T_1$ space. From T4 Property Preserved in Closed Subspace, any closed subspace of a $T_4$ space is also a $T_4$ space. Hence the result. {{qed}} \end{proof}
22547
\section{T4 Property is not Hereditary} Tags: T5 Spaces, Topological Subspaces, T4 Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space which is a $T_4$ space. Let $T_H = \struct {H, \tau_H}$, where $\O \subset H \subseteq S$, be a subspace of $T$. Then it does not necessarily follow that $T_H$ is a $T_4$ space. \end{theorem} \begin{proof} Let $T$ be the Tychonoff plank. Let $T'$ be the deleted Tychonoff plank. By definition, $T'$ is a subspace of $T$. From Tychonoff Plank is Normal, $T$ is a normal space, and so in particular a $T_4$ space. From Deleted Tychonoff Plank is Not Normal, $T'$ is not a normal space. But the Tychonoff plank is a Hausdorff space, and so a $T_1$ space, and from T1 Property is Hereditary so is $T'$. Hence, were $T'$ a $T_4$ space, it would also be a normal space. Thus $T'$ is not a $T_4$ space. Thus it is seen that the property of being a $T_4$ space is not inherited by a subspace. Hence the result. {{qed}} \end{proof}
22548
\section{T4 Space is Preserved under Homeomorphism} Tags: T4 Spaces, Separation Axioms, Homeomorphisms \begin{theorem} Let $T_A = \struct {S_A, \tau_A}$ and $T_B = \struct {S_B, \tau_B}$ be topological spaces. Let $\phi: T_A \to T_B$ be a homeomorphism. If $T_A$ is a $T_4$ space, then so is $T_B$. \end{theorem} \begin{proof} Suppose that $T_A$ is a $T_4$ space. Let $B_1$ and $B_2$ be disjoint closed sets in $T_B$. Denote by $A_1 := \phi^{-1} \sqbrk {B_1}$ and $A_2 := \phi^{-1} \sqbrk {B_2}$ the preimages of $B_1$ and $B_2$ under $\phi$, respectively. From Preimage of Intersection under Mapping it follows that $A_1$ and $A_2$ are disjoint. Also, as $\phi$ is a homeomorphism, it is a fortiori continuous. Thus Continuity Defined from Closed Sets applies to yield that both $A_1$ and $A_2$ are closed. Now as $T_A$ is a $T_4$ space, we find disjoint open sets $U_1$ containing $A_1$ and $U_2$ containing $A_2$. From Image of Subset is Subset of Image, we have $B_1 = \phi \sqbrk {\phi^{-1} \sqbrk {B_1} } = \phi \sqbrk {A_1} \subseteq \phi \sqbrk {U_1}$. Here, the first equality follows from Subset equals Image of Preimage iff Mapping is Surjection, as $\phi$ is a fortiori surjective, being a homeomorphism. Mutatis mutandis, we deduce also $B_2 \subseteq \phi \sqbrk {U_2}$. From Image of Intersection under Injection it follows that $\phi \sqbrk {U_1}$ and $\phi \sqbrk {U_2}$ are disjoint. Since $\phi$ is a homeomorphism, they are also both open in $T_B$. Therewith, we have constructed two disjoint open sets in $T_B$, one containing $B_1$, and the other containing $B_2$. Hence $T_B$ is shown to be a $T_4$ space as well. {{qed}} \end{proof}
22549
\section{T4 and T3 Space is T 3 1/2} Tags: T3 1: 2 Spaces, T3 Spaces, Separation Axioms, T4 Spaces \begin{theorem} Let $T = \struct {S, \tau}$ be: :a $T_4$ space and also: :a $T_3$ space. Then $T$ is also a $T_{3 \frac 1 2}$ space. \end{theorem} \begin{proof} Let $T = \struct {S, \tau}$ be a $T_4$ space which is also a $T_3$ space. From it being $T_3$: :$\forall F \subseteq S: \relcomp S F \in \tau, y \in \relcomp S F: \exists U, V \in \tau: F \subseteq U, y \in V: U \cap V = \O$ Consider this $U \in \tau$, which is disjoint from $\set y$. Then $\relcomp S U$ is a closed set which is disjoint from $F$ but such that $\set y \subseteq \relcomp S U$. As $T$ is a $T_4$ space, we have that from Urysohn's Lemma there exists an Urysohn function $f$ for $F$ and $\relcomp S U$. As $\set y \subseteq \relcomp S U$, this function $f$ is a Urysohn function for $F$ and $\set y$ as well. So: :For any closed set $F \subseteq S$ and any point $y \in S$ such that $y \notin F$, there exists an Urysohn function for $F$ and $\set y$. which is precisely the definition of a $T_{3 \frac 1 2}$ space. {{qed}} \end{proof}
22550
\section{T5 Property is Hereditary} Tags: T5 Spaces, Topological Subspaces \begin{theorem} Let $T = \struct {S, \tau}$ be a topological space which is a $T_5$ space. Let $T_H = \struct {H, \tau_H}$, where $\O \subset H \subseteq S$, be a subspace of $T$. Then $T_H$ is a $T_5$ space. \end{theorem} \begin{proof} Let $T = \struct {S, \tau}$ be a $T_5$ space. Then: :$\forall A, B \subseteq S, \map {\cl_S} A \cap B = A \cap \map {\cl_S} B = \O: \exists U, V \in \tau: A \subseteq U, B \subseteq V, U \cap V = \O$ where $\map {\cl_S} A$ denotes the closure of $A$ in $S$. That is: :For any two separated sets $A, B \subseteq S$ there exist disjoint open sets $U, V \in \tau$ containing $A$ and $B$ respectively. We have that the set $\tau_H$ is defined as: :$\tau_H := \set {U \cap H: U \in \tau}$ Let $A, B \subseteq H$ such that $\map {\cl_H} A \cap B = A \cap \map {\cl_H} B = \O$. That is, $A$ and $B$ are separated in $H$. Then: {{begin-eqn}} {{eqn | l = \map {\cl_S} A \cap B | r = \map {\cl_S} A \cap \paren {H \cap B} | c = as $B \subseteq H$ }} {{eqn | r = \paren {\map {\cl_S} A \cap H} \cap B | c = Intersection is Associative }} {{eqn | r = \map {\cl_H} A \cap B | c = Closure of Subset in Subspace }} {{eqn | r = \O | c = Assumption }} {{end-eqn}} Similarly: :$A \cap \map {\cl_S} B = \O$ So $A$ and $B$ are separated in $S$. Because $T$ is a $T_5$ space, we have that: :$\exists U, V \in \tau: A \subseteq U, B \subseteq V, U \cap V = \O$ It follows that: :$\exists U \cap H, V \cap H \in \tau_H : A \subseteq U \cap H, B \subseteq V \cap H, \paren {U \cap H} \cap \paren {V \cap H} = \O$ and so the $T_5$ axiom is satisfied in $H$. {{qed}} \end{proof}
22551
\section{T5 Space is T4 Space} Tags: T5 Spaces, Separation Axioms, T4 Spaces \begin{theorem} Let $\struct {S, \tau}$ be a $T_5$ space. Then $\struct {S, \tau}$ is also a $T_4$ space. \end{theorem} \begin{proof} Let $\struct {S, \tau}$ be a $T_5$ space. From the definition of $T_5$ space: :$\forall A, B \subseteq S, A^- \cap B = A \cap B^- = \O: \exists U, V \in \tau: A \subseteq U, B \subseteq V, U \cap V = \O$ where $A^-$ is the closure of $A$ in $T$. Let $C, D \subseteq S$ be disjoint sets which are closed in $T$. Thus $C, D \in \map \complement \tau$ from the definition of closed set. As $C$ and $D$ are closed sets, each equals its own closure: :$C^- = C, D^- = D$ and so from $C \cap D = \O$: :$C^- \cap D = C \cap D^- = \O$ Thus from the definition of $T_5$ space: :$\forall C, D \in \map \complement \tau, C \cap D = \O: \exists U, V \in \tau: C \subseteq U, D \subseteq V, U \cap V = \O$ which is precisely the definition of a $T_4$ space. {{qed}} \end{proof}
22552
\section{Tableau Confutation contains Finite Tableau Confutation} Tags: Propositional Tableaus \begin{theorem} Let $\mathbf H$ be a countable set of WFFs of propositional logic. Let $T$ be a tableau confutation of $\mathbf H$. Then there exists a finite rooted subtree of $T$ that is also a tableau confutation of $\mathbf H$. \end{theorem} \begin{proof} For each node $v \in T$, let $\map p v$ be the path from $v$ to $r_T$, the root of $T$. This path is unique by Path in Tree is Unique. Let $\VV$ be the subtree of $T$ consisting of those nodes $v$ of $T$ such that $\map p v$ is not contradictory. {{AimForCont}} that $\VV$ were infinite. Then by König's Tree Lemma, $\VV$ has an infinite branch $\Gamma$. Since $\VV \subseteq T$, it follows that $\Gamma$ is also a branch of $T$. However, by construction, it is impossible that $\Gamma$ is contradictory. This contradicts that $T$ is a tableau confutation. Hence $\VV$ is finite. Next, define a finite propositional tableau $T'$ by: :$v \in T' \iff \map \pi v \in \VV$ where $\map \pi v$ denotes the parent of $v$; that is, $T'$ is the rooted tree formed by $\VV$ and all its children. Then by construction, for each leaf node $v$ of $T'$, we have that $v \notin \VV$. That is, $\map p v$ is a contradictory branch of $T'$. By Leaf of Rooted Tree is on One Branch, every branch of $T'$ is contradictory. Hence $T'$ is a tableau confutation of $\mathbf H$, as desired. {{qed}} \end{proof}
22553
\section{Tableau Confutation implies Unsatisfiable} Tags: Propositional Tableaus \begin{theorem} Let $\mathbf H$ be a collection of WFFs of propositional logic. Suppose there exists a tableau confutation of $\mathbf H$. Then $\mathbf H$ is unsatisfiable for boolean interpretations. \end{theorem} \begin{proof} Let $\left({T, \mathbf H, \Phi}\right)$ be a tableau confutation of $\mathbf H$. Suppose that $v$ were a boolean interpretation model for $\mathbf H$, i.e.: :$v \models_{\mathrm{BI}} \mathbf H$ By Model of Root of Propositional Tableau is Model of Branch, it follows that: :$v \models_{\mathrm{BI}} \Phi \left[{\Gamma}\right]$ for some branch $\Gamma$ of $T$. Since $T$ is a tableau confutation, there is some WFF $\mathbf A$ such that: :$\mathbf A, \neg\mathbf A \in \Phi \left[{\Gamma}\right]$ Hence $v \models_{\mathrm{BI}} \mathbf A$, i.e.: :$v \left({\mathbf A}\right) = T$ But by the truth table for $\neg$, this means: :$v \left({\neg\mathbf A}\right) = F$ which contradicts that $v \models_{\mathrm{BI}} \neg\mathbf A$. Hence, no such boolean interpretation can exist. That is, $\mathbf H$ is unsatisfiable for boolean interpretations. {{qed}} \end{proof}
22554
\section{Tableau Confutation is Finished} Tags: Propositional Tableaus, Propositional Logic, Propositional Calculus \begin{theorem} Let $T$ be a tableau confutation. Then $T$ is a finished tableau. \end{theorem} \begin{proof} By definition of tableau confutation, every branch of $T$ is contradictory. The result follows by definition of finished propositional tableau. {{qed}} \end{proof}
22555
\section{Tableau Extension Lemma/General Statement/Proof 1} Tags: Propositional Tableaus \begin{theorem} Let $T$ be a finite propositional tableau. Let its hypothesis set $\mathbf H$ be finite. {{:Tableau Extension Lemma/General Statement}} \end{theorem} \begin{proof} Let $T_{\mathbf H'}$ be the finite propositional tableau obtained by replacing the hypothesis set $\mathbf H$ of $T$ with $\mathbf H \cup \mathbf H'$. By the Tableau Extension Lemma, $T_{\mathbf H'}$ has a finished extension $T'$. By definition of extension, $T_{\mathbf H'}$ is a rooted subtree of $T'$. But $T_{\mathbf H'}$ and $T$ are equal when considered as rooted trees. The result follows. {{qed}} Category:Propositional Tableaus \end{proof}
22556
\section{Tableau Extension Lemma/General Statement/Proof 2} Tags: Propositional Tableaus \begin{theorem} Let $T$ be a finite propositional tableau. Let its hypothesis set $\mathbf H$ be finite. {{:Tableau Extension Lemma/General Statement}} \end{theorem} \begin{proof} The proof uses induction on the number $n$ of elements of $\mathbf H$. Suppose we are given the result for the case $n = 1$, that is, when $\mathbf H$ is a singleton. Suppose also that we are given the result for all sets $\mathbf H'$ with $n$ elements. Now let $\mathbf H' = \left\{{\mathbf A_1, \ldots, \mathbf A_{n+1}}\right\}$ be a set with $n + 1$ elements. Let $T$ be a finite propositional tableau. By induction hypothesis, there is a finished finite propositional tableau $T'$ containing $T$ as a subgraph, and with root $\mathbf H \cup \left\{{\mathbf A_1, \ldots, \mathbf A_n}\right\}$. Now apply the case $n = 1$ to this resulting propositional tableau $T'$ and the set $\left\{{\mathbf A_{n+1}}\right\}$. This yields a finished finite propositional tableau $T''$ which: $(1):\quad$ has root $\mathbf H \cup \left\{{\mathbf A_1, \ldots, \mathbf A_n}\right\} \cup \left\{{\mathbf A_{n+1}}\right\} = \mathbf H \cup \mathbf H'$; $(2):\quad$ contains $T'$ as a subgraph. But then $T''$ also contains $T$ as a subgraph, proving the result for $\mathbf H'$. It thus only remains to take care of the base cases $n = 0$ and $n = 1$. First, the case $n = 0$. Let $T$ be a finite propositional tableau. To find the finite propositional tableau $T'$ with the desired properties, we use some of the tableau construction rules, starting with $T$. Let $t$ be any leaf node of $T$, and let $\Gamma_t$ be the branch from Leaf of Rooted Tree is on One Branch. Let $n \left({\Gamma_t}\right)$ be the number of non-basic WFFs that were not used to add any of the nodes of $\Gamma_t$ to $T$. It is seen that for any application of the tableau construction rules on $t$: :If $s$ is added by the rule, then $n \left({\Gamma_s}\right) \le n \left({\Gamma_t}\right)$. Moreover, it is seen that any rule reduces the total count $m \left({\Gamma_t}\right)$ of logical connectives occurring in these non-basic, unused WFFs along $\Gamma_t$. In conclusion: :If $s$ is added by a rule, then $m \left({\Gamma_s}\right) < m \left({\Gamma_t}\right)$ By the Method of Infinite Descent applied to $m \left({\Gamma_t}\right)$, only finitely many rules can be applied, starting from $t$. Since $T$ has only finitely many leaves and corresponding branches, only finitely many rules can be applied to $T$ in total. Let $T'$ be the finite propositional tableau resulting from applying all these possible rules. By construction of $T'$, it follows that every branch $\Gamma$ of $T'$ is either contradictory or finished. That is, $T'$ is finished. Finally, the last case, $n = 1$. Let $\mathbf A$ be a WFF of propositional logic. Let $T$ be a finite propositional tableau. First, using the case $n = 0$, extend $T$ to a finished finite propositional tableau $T'$. Again using the case $n = 0$, let $T_{\mathbf A}$ be a finished finite propositional tableau with root $\left\{{\mathbf A}\right\}$. Now add $\mathbf A$ to the root of $T'$. Then at every leaf $t$ of $T'$, $\mathbf A$ is the only WFF that is not used yet. As far as the rules for propositional tableaus are concerned, there is no difference between: :$t$ as a leaf of $T'$, and :the tableau consisting only of a root and with hypothesis set $\mathbf A$. 
Therefore, the rules allow us to "paste", as it were, the finished tableau $T_{\mathbf A}$ under every leaf $t$ of $T'$. Denote the resulting tableau by $T'_{\mathbf A}$. Then for any branch $\Gamma$ of $T'_{\mathbf A}$ and every non-basic WFF $\mathbf B$ along it: :$\mathbf B$ is on $T'$, or: :$\mathbf B$ is on a copy of $T_{\mathbf A}$. In either case, the finished nature of these tableaus implies that: :$\mathbf B$ is used at some node of $\Gamma$ Hence $\Gamma$ is contradictory or finished. In conclusion, $T'_{\mathbf A}$ is finished, and contains $T$ as a subgraph. The result follows from the Principle of Mathematical Induction. {{qed}} Category:Propositional Tableaus \end{proof}
22557
\section{Tail of Convergent Sequence} Tags: Convergence, Sequences \begin{theorem} Let $\left\langle{a_n}\right\rangle$ be a real sequence. Let $N \in \N$ be a natural number. Let $a \in \R$ be a real number. Then: :$a_n \to a$ {{iff}}: :$a_{n + N} \to a$ \end{theorem} \begin{proof} {{ProofWanted}} Category:Convergence Category:Sequences \end{proof}
22558
\section{Tail of Convergent Series tends to Zero} Tags: Series \begin{theorem} Let $\sequence {a_n}_{n \mathop \ge 1}$ be a sequence of real numbers. Let $\ds \sum_{n \mathop = 1}^\infty a_n$ be a convergent series. Let $N \in \N_{\ge 1}$ be a natural number. Let $\ds \sum_{n \mathop = N}^\infty a_n$ be the tail of the series $\ds \sum_{n \mathop = 1}^\infty a_n$. Then: :$\ds \sum_{n \mathop = N}^\infty a_n$ is convergent :$\ds \sum_{n \mathop = N}^\infty a_n \to 0$ as $N \to \infty$. That is, the tail of a convergent series tends to zero. \end{theorem} \begin{proof} Let $\sequence {s_n}$ be the sequence of partial sums of $\ds \sum_{n \mathop = 1}^\infty a_n$. Let $\sequence {s'_n}$ be the sequence of partial sums of $\ds \sum_{n \mathop = N}^\infty a_n$. It will be shown that $\sequence {s'_n}$ fulfils the Cauchy criterion. That is: :$\forall \epsilon \in \R_{>0}: \exists M: \forall m, n > M: \size {s'_n - s'_m} < \epsilon$ Let $\epsilon \in \R_{>0}$ be a strictly positive real number. As $\sequence {s_n}$ is convergent, it conforms to the Cauchy criterion by Convergent Sequence is Cauchy Sequence. Thus: :$\exists M: \forall m, n > M: \size {s_n - s_m} < \epsilon$ Now: {{begin-eqn}} {{eqn | l = s_n | r = \sum_{k \mathop = 1}^n a_k | c = }} {{eqn | r = \sum_{k \mathop = 1}^{N - 1} a_k + \sum_{k \mathop = N}^n a_k | c = Indexed Summation over Adjacent Intervals }} {{eqn | r = s_{N - 1} + s'_n }} {{end-eqn}} and similarly: :$s_m = s_{N - 1} + s'_m$ Thus: :$s'_n = s_n - s_{N - 1}$ and: :$s'_m = s_m - s_{N - 1}$ So for all $m, n > M$: {{begin-eqn}} {{eqn | l = \size {s'_n - s'_m} | r = \size {\paren {s_n - s_{N - 1} } - \paren {s_m - s_{N - 1} } } | c = }} {{eqn | r = \size {s_n - s_m} | c = }} {{eqn | o = < | r = \epsilon | c = }} {{end-eqn}} So $\ds \sum_{n \mathop = N}^\infty a_n$ fulfils the Cauchy criterion. As every Cauchy sequence of real numbers is convergent, it follows that it is convergent. Now it is shown that $\ds \sum_{n \mathop = N}^\infty a_n \to 0$ as $N \to \infty$. We have that $\sequence {s_n}$ is convergent. Let its limit be $l$. Thus we have: :$\ds l = \sum_{n \mathop = 1}^\infty a_n = s_{N - 1} + \sum_{n \mathop = N}^\infty a_n$ So: :$\ds \sum_{n \mathop = N}^\infty a_n = l - s_{N - 1}$ But $s_{N - 1} \to l$ as $N - 1 \to \infty$. The result follows. {{qed}} \end{proof}
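As an informal illustration of this result (a check, not part of the formal proof above), consider the convergent geometric series $\ds \sum_{n \mathop = 1}^\infty \frac 1 {2^n} = 1$. Its tail from $N$ onwards can be computed directly: :$\ds \sum_{n \mathop = N}^\infty \frac 1 {2^n} = \frac {2^{-N} } {1 - \frac 1 2} = \frac 1 {2^{N - 1} }$ which is convergent and tends to $0$ as $N \to \infty$, as the theorem asserts.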
22559
\section{Tamref's Last Theorem} Tags: Number Theory \begin{theorem} The Diophantine equation: :$n^x + n^y = n^z$ has exactly one form of solutions in integers: :$2^x + 2^x = 2^{x + 1}$ for all $x \in \Z$. \end{theorem} \begin{proof} Since $n^z = n^x + n^y$ is greater than both $n^x$ and $n^y$, it follows that $z > x$ and $z > y$. {{WLOG}} assume that $x \le y < z$. {{begin-eqn}} {{eqn | l = n^x + n^y | r = n^z }} {{eqn | ll = \leadsto | l = 1 + n^{y - x} | r = n^{z - x} | c = Divide both sides by $n^x$ }} {{eqn | ll = \leadsto | l = 1 | r = n^{y - x} \paren {n^{z - y} - 1} }} {{end-eqn}} Since both $n^{y - x}$ and $n^{z - y} - 1$ are positive integers, both are equal to $1$. This gives: :$y = x$ and $n^{z - y} = 2$ which gives the integer solution: :$n = 2$, $z - y = 1$ Thus the solutions are: :$\tuple {n, x, y, z} = \tuple {2, x, x, x + 1}, x \in \Z$ {{qed}} \end{proof}
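As an illustrative instance of this family of solutions (a numerical check, not an additional result), take $n = 2$ and $x = 3$: :$2^3 + 2^3 = 8 + 8 = 16 = 2^4$ and similarly for a negative exponent, $x = -1$: :$2^{-1} + 2^{-1} = \dfrac 1 2 + \dfrac 1 2 = 1 = 2^0$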
22560
\section{Tangent Exponential Formulation/Formulation 1} Tags: Tangent Exponential Formulation, Tangent Function \begin{theorem} Let $z$ be a complex number. Let $\tan z$ denote the tangent function and $i$ denote the imaginary unit: $i^2 = -1$. Then: :$\tan z = i \dfrac {1 - e^{2 i z} } {1 + e^{2 i z} }$ \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \tan z | r = \frac {\sin z} {\cos z} | c = {{Defof|Complex Tangent Function}} }} {{eqn | r = \frac {\frac 1 2 i \paren {e^{-i z} - e^{i z} } } {\frac 1 2 \paren {e^{-i z} + e^{i z} } } | c = Sine Exponential Formulation and Cosine Exponential Formulation }} {{eqn | r = i \frac {e^{-i z} - e^{i z} } {e^{-i z} + e^{i z} } }} {{eqn | r = i \frac {1 - e^{2 i z} } {1 + e^{2 i z} } | c = multiplying numerator and denominator by $e^{i z}$ }} {{end-eqn}} {{qed}} Category:Tangent Exponential Formulation \end{proof}
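As an informal check of this formulation (not part of the proof), let $z = \dfrac \pi 4$, so that $e^{2 i z} = e^{i \pi / 2} = i$. Then: :$i \dfrac {1 - i} {1 + i} = i \dfrac {\paren {1 - i}^2} {\paren {1 + i} \paren {1 - i} } = i \dfrac {-2 i} 2 = 1 = \tan \dfrac \pi 4$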
22561
\section{Tangent Function is Periodic on Reals} Tags: Tangent Function, Analysis \begin{theorem} The tangent function is periodic on the set of real numbers $\R$ with period $\pi$. This can be written: :$\tan x = \map \tan {x \bmod \pi}$ where $x \bmod \pi$ denotes the modulo operation. \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \map \tan {x + \pi} | r = \frac {\map \sin {x + \pi} } {\map \cos {x + \pi} } | c = {{Defof|Real Tangent Function}} }} {{eqn | r = \frac {-\sin x} {-\cos x} | c = Sine and Cosine are Periodic on Reals }} {{eqn | r = \tan x | c= }} {{end-eqn}} From Derivative of Tangent Function, we have that: :$\map {D_x} {\tan x} = \dfrac 1 {\cos^2 x}$ provided $\cos x \ne 0$. From Shape of Cosine Function, we have that $\cos > 0$ on the interval $\openint {-\dfrac \pi 2} {\dfrac \pi 2}$. From Derivative of Monotone Function, $\tan x$ is strictly increasing on that interval, and hence can not have a period of ''less'' than $\pi$. Hence the result. {{qed}} \end{proof}
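As a concrete illustration of this periodicity (not part of the proof), take $x = \dfrac {5 \pi} 4$. Then $x \bmod \pi = \dfrac \pi 4$ and: :$\tan \dfrac {5 \pi} 4 = \map \tan {\dfrac \pi 4 + \pi} = \tan \dfrac \pi 4 = 1$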
22562
\section{Tangent Inequality} Tags: Trigonometry, Tangent Function, Inequalities \begin{theorem} :$x < \tan x$ for all $x$ in the interval $\left({0 \,.\,.\, \dfrac {\pi} 2}\right)$. \end{theorem} \begin{proof} Let $f \left({x}\right) = \tan x - x$. By Derivative of Tangent Function, $f' \left({x}\right) = \sec^2 x - 1$. By Shape of Secant Function, $\sec^2 x > 1$ for $x \in \left({0 \,.\,.\, \dfrac {\pi} 2}\right)$. Hence $f' \left({x}\right) > 0$. From Derivative of Monotone Function, $f \left({x}\right)$ is strictly increasing in this interval. Since $f \left({0}\right) = 0$, it follows that $f \left({x}\right) > 0$ for all $x \in \left({0 \,.\,.\, \dfrac {\pi} 2}\right)$. {{qed}} Category:Tangent Function Category:Inequalities \end{proof}
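As an informal numerical check of this inequality (not part of the proof), take $x = \dfrac \pi 4$: :$\dfrac \pi 4 \approx 0.7854 < 1 = \tan \dfrac \pi 4$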
22563
\section{Tangent Line to Convex Graph} Tags: Analysis, Differential Calculus, Convex Real Functions, Tangents, Analytic Geometry \begin{theorem} Let $f$ be a real function that is: :continuous on some closed interval $\closedint a b$ :differentiable and convex on the open interval $\openint a b$. Then all the tangent lines to $f$ are below the graph of $f$. {{explain|"below"}} \end{theorem} \begin{proof} :500px Let $\TT$ be the tangent line to $f$ at some point $\tuple {c, \map f c}$, $c \in \openint a b$. Let the gradient of $\TT$ be $m$. Let $\tuple {x_1, y_1}$ be an arbitrary point on $\TT$. From the point-slope form of a straight line: {{begin-eqn}} {{eqn | l = y - y_1 | r = m \paren {x - x_1} | c = }} {{eqn | ll= \leadsto | l = y | r = m \paren {x - x_1} + y_1 | c = }} {{end-eqn}} For $\TT$: :$y = \map \TT x$ :$y_1 = \map f c$ :$m = \map {f'} c$ :$x = x$ :$x_1 = c$ so: :$\map \TT x = \map {f'} c \paren {x - c} + \map f c$ Consider the graph of $f$ to the right of $\tuple {c, \map f c}$, that is, any $x$ in $\openint c b$. Let $d$ be the directed vertical distance from $\TT$ to the graph of $f$. That is, if $f$ is above $\TT$ then $d > 0$. If $f$ is below $\TT$, then $d < 0$. (From the diagram, it is apparent that $\TT$ is below $f$, but we shall prove it analytically.) $d$ can be evaluated by: {{begin-eqn}} {{eqn | l = d | r = \map f x - \map \TT x | c = }} {{eqn | r = \map f x - \map {f'} c \paren {x - c} - \map f c | c = }} {{eqn | r = \map f x - \map f c - \map {f'} c \paren {x - c} | c = }} {{end-eqn}} By the Mean Value Theorem, there exists some constant $k$ in $\openint c b$ such that: {{begin-eqn}} {{eqn | l = \map {f'} k | r = \frac {\map f x - \map f c} {x - c} | c = }} {{eqn | ll= \leadsto | l = \map {f'} k \paren {x - c} | r = \map f x - \map f c | c = }} {{end-eqn}} Substitute this into the formula for $d$: {{begin-eqn}} {{eqn | l = d | r = \map {f'} k \paren {x - c} - \map {f'} c \paren {x - c} | c = }} {{eqn | r = \paren {\map {f'} k - \map {f'} c} \paren {x - c} | c = }} {{end-eqn}} Recall that $x$ lies in the interval $\openint c b$. So $x > c$, and the quantity $x - c$ is (strictly) positive. $k$ is also in the interval $\openint c b$ and so $k > c$. By construction, $f$ is convex. By the definition of convex: :$k > c \implies \map {f'} k > \map {f'} c$ which means that: :$\paren {\map {f'} k - \map {f'} c} > 0$ Then $d$ is the product of two (strictly) positive quantities and is itself (strictly) positive. Similarly, consider the graph of $f$ to the left of $\tuple {c, \map f c}$, that is, any $x$ in $\openint a c$. By the same process as above, we will have: :$d = \paren {\map {f'} k - \map {f'} c} \paren {x - c}$ This time, $x < c$ and the quantity $x - c$ is (strictly) negative. Further, $k < c$, and so by a similar argument as above: :$k < c \implies \map {f'} k < \map {f'} c$ and the quantity $\paren {\map {f'} k - \map {f'} c}$ is also (strictly) negative. Thus $d$ will be the product of two (strictly) negative quantities, and will again be (strictly) positive. {{qed}} \end{proof}
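As a concrete illustration of this theorem (not part of the proof above), take $\map f x = x^2$ on $\closedint {-2} 2$, which is convex, and consider the tangent line at $c = 1$, namely $\map \TT x = 2 x - 1$. For every $x$: :$d = \map f x - \map \TT x = x^2 - 2 x + 1 = \paren {x - 1}^2 \ge 0$ so the tangent line lies below the graph, touching it only at the point of tangency.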
22564
\section{Tangent Secant Theorem} Tags: Circles, Named Theorems, Tangent Secant Theorem, Tangents \begin{theorem} Let $D$ be a point outside a circle $ABC$. Let $DB$ be tangent to the circle $ABC$. Let $DA$ be a straight line which cuts the circle $ABC$ at $A$ and $C$. Then $DB^2 = AD \cdot DC$. {{:Euclid:Proposition/III/36}} \end{theorem} \begin{proof} Let $DA$ pass through the center $F$ of circle $ABC$. Join $FB$. From Radius at Right Angle to Tangent, $\angle FBD$ is a right angle. :320px We have that $F$ bisects $AC$ and that $CD$ is added to it. So we can apply Square of Sum less Square and see that: :$AD \cdot DC + FC^2 = FD^2$ But $FC = FB$ and so: :$AD \cdot DC + FB^2 = FD^2$ But from Pythagoras's Theorem we have that $FD^2 = FB^2 + DB^2$ and so: :$AD \cdot DC + FB^2 = FB^2 + DB^2$ from which it follows that: :$AD \cdot DC = DB^2$ which is what we wanted to show. {{qed|lemma}} Now let $DA$ be such that it does not pass through the center $E$ of circle $ABC$. Draw $EF$ perpendicular to $DA$ and draw $EB, EC, ED$. :320px From Radius at Right Angle to Tangent, $\angle EBD$ is a right angle. From Conditions for Diameter to be Perpendicular Bisector, $EF$ bisects $AC$. So $AF = FC$. So we can apply Square of Sum less Square and see that: :$AD \cdot DC + FC^2 = FD^2$ Let $FE^2$ be added to each: :$AD \cdot DC + FC^2 + FE^2 = FD^2 + FE^2$ Now $\angle DFE$ is a right angle and so by Pythagoras's Theorem we have: :$FD^2 + FE^2 = ED^2$ :$FC^2 + FE^2 = EC^2$ This gives us: :$AD \cdot DC + EC^2 = ED^2$ But $EC = EB$ as both are radii of the circle $ABC$. Next note that $\angle EBD$ is a right angle and so by Pythagoras's Theorem we have: :$ED^2 = EB^2 + DB^2$ which gives us: :$AD \cdot DC + EB^2 = EB^2 + DB^2$ from which it follows that: :$AD \cdot DC = DB^2$ which is what we wanted to show. {{qed}} {{Euclid Note|36|III|{{EuclidNoteConverse|prop = 37|title = Converse of Tangent Secant Theorem}}}} \end{proof}
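As a numerical illustration of this theorem (not part of Euclid's proof), let the circle $ABC$ have radius $3$ and let $D$ lie at distance $5$ from its center. When the secant $DA$ passes through the center, $DC = 5 - 3 = 2$ and $AD = 5 + 3 = 8$, while by Pythagoras's Theorem the tangent $DB$ satisfies $DB^2 = 5^2 - 3^2 = 16$. Indeed: :$AD \cdot DC = 8 \times 2 = 16 = DB^2$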
22565
\section{Tangent Space is Vector Space} Tags: \begin{theorem} Let $M$ be a smooth manifold of dimension $n \in \N$. Let $m \in M$ be a point. Let $\struct {U, \kappa}$ be a chart with $m \in U$. Let $T_m M$ be the tangent space at $m$. Then $T_m M$ is a real vector space of dimension $n$, spanned by the basis: :$\set {\valueat {\dfrac \partial {\partial \kappa^i} } m : i \in \set {1, \dotsc, n} }$ that is, the set of partial derivatives with respect to the $i$th coordinate function $\kappa^i$ evaluated at $m$. \end{theorem} \begin{proof} Let $V$ be an open neighborhood of $m$ with $V \subseteq U \subseteq M$. Let $\map {C^\infty} {V, \R}$ be the set of smooth mappings $f: V \to \R$. Let $X_m, Y_m \in T_m M$. Let $\lambda \in \R$. Then, by definition of tangent vector and Equivalence of Definitions of Tangent Vector: :$X_m, Y_m$ are linear transformations on $\map {C^\infty} {V, \R}$. Hence $\paren {X_m + \lambda Y_m}$ are also linear transformations. Therefore, it is enough to show that $X_m + \lambda Y_m$ satisfies the Leibniz law. Let $f, g \in \map {C^\infty} {V, \R}$. Then: {{begin-eqn}} {{eqn | l = \map {\paren {X_m + \lambda Y_m} } {f g} | r = \map {X_m} {f g} + \lambda \map {Y_m} {f g} | c = {{Defof|Linear Transformation}} }} {{eqn | r = \map {X_m} f \map g m + \map f m \map {X_m} g + \lambda \paren {\map {Y_m} f \map g m + \map f m \map {Y_m} g} | c = Leibniz law for $X_m, Y_m$ }} {{eqn | r = \map {\paren {X_m + \lambda Y_m} } f \map g m + \map f m \map {\paren {X_m + \lambda Y_m} } g | c = reordering summands }} {{end-eqn}} It follows that: :$X_m + \lambda Y_m \in T_m M$ Hence $T_m M$ is a real vector space. Again, by definition of tangent vector and Equivalence of Definitions of Tangent Vector: :for all $X_m \in T_m M$ there exists a smooth curve: ::$\gamma: I \subseteq \R \to M$ :where $\map \gamma 0 = m$ such that: {{begin-eqn}} {{eqn | l = \map {X_m} f | r = \valueat {\map {\frac {\map \d {f \circ \gamma} } {\d \tau} } \tau} {\tau \mathop = 0} }} {{eqn | r = \valueat {\map {\frac {\map \d {f \circ \kappa^{-1} \circ \kappa \circ \gamma} } {\d \tau} } \tau} {\tau \mathop = 0} | c = $f \circ \kappa^{-1} \circ \kappa = f$, as $\kappa$ is a homeomorphism, in particular a bijection. 
}} {{eqn | r = \sum_{i \mathop = 1}^n \valueat {\map {\frac {\map \partial {f \circ \kappa^{-1} } } {\partial \kappa^i} } {\map {\kappa \circ \gamma} \tau} \map {\frac {\map \d {\kappa^i \circ \gamma} } {\d \tau} } \tau} {\tau \mathop = 0} | c = Chain Rule for Real-Valued Functions }} {{eqn | r = \sum_{i \mathop = 1}^n \map {\frac {\map \d {\kappa^i \circ \gamma} } {\d \tau} } 0 \map {\frac {\map \partial {f \circ \kappa^{-1} } } {\partial \kappa^i} } {\map {\kappa \circ \gamma} 0} | c = rearranging }} {{eqn | r = \sum_{i \mathop = 1}^n \map {\frac {\map \d {\kappa^i \circ \gamma} } {\d \tau} } 0 \map {\frac {\map \partial {f \circ \kappa^{-1} } } {\partial \kappa^i} } {\map \kappa m} | c = as $m = \map \gamma 0$ }} {{end-eqn}} We define: :$X^i_m := \map {\dfrac {\map \d {\kappa^i \circ \gamma} } {\d \tau} } 0$ and as above: :$\valueat {\map {\dfrac \partial {\partial \kappa^i} } m} f := \map {\dfrac {\map \partial {f \circ \kappa^{-1} } } {\partial \kappa^i} } {\map \kappa m}$ Therefore: :$\ds \map {X_m} f = \map {\paren {\sum_{i \mathop = 1}^n X^i_m \valueat {\dfrac \partial {\partial \kappa^i} } m} } f$ {{iff}}: :$\ds X_m = \sum_{i \mathop = 1}^n X^i_m \valueat {\frac \partial {\partial \kappa^i} } m$ Hence: :$\set {\valueat {\dfrac \partial {\partial \kappa^i} } m: i \in \set {1, \dotsc, n} }$ forms a basis. Hence, by definition of dimension of vector space: :$\dim T_m M = n = \dim M$ This completes the proof. {{qed}} \end{proof}
22566
\section{Tangent in terms of Secant} Tags: Trigonometric Functions, Tangent Function, Secant Function \begin{theorem} Let $x$ be a real number such that $\cos x \ne 0$. Then: {{begin-eqn}} {{eqn | l = \tan x | r = +\sqrt {\sec^2 x - 1} | c = if there exists an integer $n$ such that $n \pi < x < \paren {n + \dfrac 1 2} \pi$ }} {{eqn | l = \tan x | r = -\sqrt {\sec^2 x - 1} | c = if there exists an integer $n$ such that $\paren {n + \dfrac 1 2} \pi < x < \paren {n + 1} \pi$ }} {{end-eqn}} where $\tan$ denotes the real tangent function and $\sec$ denotes the real secant function. \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \sec^2 x - \tan^2 x | r = 1 | c = Difference of Squares of Secant and Tangent }} {{eqn | ll= \leadsto | l = \tan^2 x | r = \sec^2 x - 1 }} {{eqn | ll= \leadsto | l = \tan x | r = \pm \sqrt {\sec^2 x - 1} }} {{end-eqn}} Also, from Sign of Tangent: :If there exists an integer $n$ such that $n \pi < x < \paren {n + \dfrac 1 2} \pi$, then $\tan x > 0$. :If there exists an integer $n$ such that $\paren {n + \dfrac 1 2} \pi < x < \paren {n + 1} \pi$, then $\tan x < 0$. When $\cos x = 0$, $\tan x$ and $\sec x$ are undefined. {{qed}} \end{proof}
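As an informal check (not part of the proof), take $x = \dfrac \pi 3$, for which $n = 0$ satisfies $n \pi < x < \paren {n + \dfrac 1 2} \pi$. Then $\sec \dfrac \pi 3 = 2$ and: :$+\sqrt {\sec^2 \dfrac \pi 3 - 1} = \sqrt {4 - 1} = \sqrt 3 = \tan \dfrac \pi 3$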
22567
\section{Tangent is Reciprocal of Cotangent} Tags: Trigonometric Functions, Cotangent Function, Reciprocal, Tangent Function \begin{theorem} Let $\theta$ be an angle such that $\sin \theta \ne 0$ and $\cos \theta \ne 0$. Then: :$\tan \theta = \dfrac 1 {\cot \theta}$ where $\tan$ denotes the tangent function and $\cot$ denotes the cotangent function. \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \frac 1 {\tan \theta} | r = \cot \theta | c = Cotangent is Reciprocal of Tangent }} {{eqn | ll= \leadsto | l = \tan \theta | r = \frac 1 {\cot \theta} }} {{end-eqn}} $\tan \theta$ is not defined when $\cos \theta = 0$, and $\cot \theta$ is not defined when $\sin \theta = 0$. {{qed}} \end{proof}
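For example (an informal check, not part of the proof), let $\theta = \dfrac \pi 3$, for which $\sin \theta \ne 0$ and $\cos \theta \ne 0$: :$\tan \dfrac \pi 3 = \sqrt 3 = \dfrac 1 {1 / \sqrt 3} = \dfrac 1 {\cot \frac \pi 3}$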
22568
\section{Tangent is Sine divided by Cosine} Tags: Trigonometry, Sine Function, Tangent Function, Cosine Function \begin{theorem} Let $\theta$ be an angle such that $\cos \theta \ne 0$. Then: :$\tan \theta = \dfrac {\sin \theta} {\cos \theta}$ where $\tan$, $\sin$ and $\cos$ mean tangent, sine and cosine respectively. \end{theorem} \begin{proof} Let a point $P = \tuple {x, y}$ be placed in a cartesian plane with origin $O$ such that $OP$ forms an angle $\theta$ with the $x$-axis. Then: {{begin-eqn}} {{eqn | l = \frac {\sin \theta} {\cos \theta} | r = \frac {y / r} {x / r} | c = Sine of Angle in Cartesian Plane and Cosine of Angle in Cartesian Plane }} {{eqn | r = \frac y r \frac r x | c = }} {{eqn | r = \frac y x | c = }} {{eqn | r = \tan \theta | c = Tangent of Angle in Cartesian Plane }} {{end-eqn}} When $\cos \theta = 0$ the expression $\dfrac {\sin \theta} {\cos \theta}$ is not defined. {{qed}} \end{proof}
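For example (an informal check, not part of the proof), let $\theta = \dfrac \pi 6$: :$\dfrac {\sin \frac \pi 6} {\cos \frac \pi 6} = \dfrac {1 / 2} {\sqrt 3 / 2} = \dfrac 1 {\sqrt 3} = \tan \dfrac \pi 6$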
22569
\section{Tangent of 22.5 Degrees} Tags: Tangent Function \begin{theorem} :$\tan 22.5^\circ = \tan \dfrac \pi 8 = \sqrt 2 - 1$ where $\tan$ denotes tangent. \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \tan 22.5^\circ | r = \tan \frac {45^\circ} 2 | c = }} {{eqn | r = \frac {1 - \cos 45^\circ} {\sin 45^\circ} | c = Half Angle Formulas/Tangent/Corollary 2 }} {{eqn | r = \frac {1 - \frac {\sqrt 2} 2} {\frac {\sqrt 2} 2} | c = Cosine of 45 Degrees and Sine of 45 Degrees }} {{eqn | r = \sqrt 2 - 1 | c = multiplying top and bottom by $\sqrt 2$ }} {{end-eqn}} {{qed}} Category:Tangent Function \end{proof}
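As an informal cross-check of this value (not part of the proof), applying the double angle formula $\tan 2 \theta = \dfrac {2 \tan \theta} {1 - \tan^2 \theta}$ with $\tan 22.5^\circ = \sqrt 2 - 1$ gives: :$\dfrac {2 \paren {\sqrt 2 - 1} } {1 - \paren {\sqrt 2 - 1}^2} = \dfrac {2 \sqrt 2 - 2} {1 - \paren {3 - 2 \sqrt 2} } = \dfrac {2 \sqrt 2 - 2} {2 \sqrt 2 - 2} = 1 = \tan 45^\circ$ as expected.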
22570
\section{Tangent of 67.5 Degrees} Tags: Tangent Function \begin{theorem} :$\tan 67.5 \degrees = \tan \dfrac {3 \pi} 8 = \sqrt 2 + 1$ where $\tan$ denotes tangent. \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \tan 67.5 \degrees | r = \map \tan {45 \degrees + 22.5 \degrees} | c = }} {{eqn | r = \frac {\tan 45 \degrees + \tan 22.5 \degrees} {1 - \tan 45 \degrees \tan 22.5 \degrees} | c = Tangent of Sum }} {{eqn | r = \frac {1 + \paren {\sqrt 2 - 1} } {1 - 1 \times \paren {\sqrt 2 - 1} } | c = Tangent of $45 \degrees$ and Tangent of $22.5 \degrees$ }} {{eqn | r = \frac {\sqrt 2} {2 - \sqrt 2} | c = simplifying }} {{eqn | r = \frac {\sqrt 2 \paren {2 + \sqrt 2} } {\paren {2 - \sqrt 2} \paren {2 + \sqrt 2} } | c = multiplying top and bottom by $2 + \sqrt 2$ }} {{eqn | r = \frac {2 \sqrt 2 + 2} {4 - 2} | c = Difference of Two Squares }} {{eqn | r = \sqrt 2 + 1 | c = simplifying }} {{end-eqn}} {{qed}} Category:Tangent Function \end{proof}
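As an informal cross-check (not part of the proof), note that $67.5 \degrees$ and $22.5 \degrees$ are complementary angles, so by Tangent of Complement equals Cotangent their tangents are reciprocals of each other. Indeed: :$\paren {\sqrt 2 + 1} \paren {\sqrt 2 - 1} = 2 - 1 = 1$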
22571
\section{Tangent of Angle in Cartesian Plane} Tags: Trigonometry, Tangent Function, Analytic Geometry \begin{theorem} Let $P = \tuple {x, y}$ be a point in the cartesian plane whose origin is at $O$. Let $\theta$ be the angle between the $x$-axis and the line $OP$. Let $r$ be the length of $OP$. Then: :$\tan \theta = \dfrac y x$ where $\tan$ denotes the tangent of $\theta$. \end{theorem} \begin{proof} :500px Let a unit circle $C$ be drawn with its center at the origin $O$. Let $S$ be the foot of the perpendicular dropped from $P$ to the $x$-axis. Let a tangent line be drawn to $C$ parallel to $PS$ meeting $C$ at $R$. Let $Q$ be the point on $OP$ which intersects this tangent line. $\angle OSP = \angle ORQ$, as both are right angles. Both $\triangle OSP$ and $\triangle ORQ$ share angle $\theta$. By Triangles with Two Equal Angles are Similar it follows that $\triangle OSP$ and $\triangle ORQ$ are similar. Thus: {{begin-eqn}} {{eqn | l = \frac y x | r = \frac {SP} {OS} | c = }} {{eqn | r = \frac {RQ} {OR} | c = {{Defof|Similar Triangles}} }} {{eqn | r = RQ | c = $OR$ is a radius of the unit circle }} {{eqn | r = \tan \theta | c = {{Defof|Tangent Function|subdef = Definition from Circle}} }} {{end-eqn}} When $\theta$ is obtuse, the same argument holds, but both $x$ and $\tan \theta$ are negative. When $\theta = \dfrac \pi 2$ we have that $x = 0$. Then $OP$ is parallel to the tangent line at $R$ which it therefore does not meet. Thus when $\theta = \dfrac \pi 2$, it follows that $\tan \theta$ is not defined. Likewise $\dfrac y x$ is not defined when $x = 0$. Thus the relation holds for $\theta = \dfrac \pi 2$. When $\pi < \theta < 2 \pi$ the diagram can be reflected in the $x$-axis. In this case, $y$ is negative. Thus the relation continues to hold. When $\theta = 0$ and $\theta = \pi$ we have that $y = 0$ and $\tan \theta = 0 = \dfrac y x$. Hence the result. {{qed}} \end{proof}
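For example (an informal illustration, not part of the proof), let $P = \tuple {3, 4}$, so that $r = 5$. Then: :$\tan \theta = \dfrac y x = \dfrac 4 3$ which agrees with $\dfrac {\sin \theta} {\cos \theta} = \dfrac {4 / 5} {3 / 5} = \dfrac 4 3$.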
22572
\section{Tangent of Complement equals Cotangent} Tags: Cotangent Function, Tangent Function \begin{theorem} :$\map \tan {\dfrac \pi 2 - \theta} = \cot \theta$ for $\theta \ne n \pi$ where $\tan$ and $\cot$ are tangent and cotangent respectively. That is, the cotangent of an angle is the tangent of its complement. This relation is defined wherever $\sin \theta \ne 0$. \end{theorem} \begin{proof} {{begin-eqn}} {{eqn | l = \map \tan {\frac \pi 2 - \theta} | r = \frac {\map \sin {\frac \pi 2 - \theta} } {\map \cos {\frac \pi 2 - \theta} } | c = Tangent is Sine divided by Cosine }} {{eqn | r = \frac {\cos \theta} {\sin \theta} | c = Sine and Cosine of Complementary Angles }} {{eqn | r = \cot \theta | c = Cotangent is Cosine divided by Sine }} {{end-eqn}} The above is valid only where $\sin \theta \ne 0$, as otherwise $\dfrac {\cos \theta} {\sin \theta}$ is undefined. From Sine of Multiple of Pi it follows that this happens when $\theta \ne n \pi$. {{qed}} \end{proof}
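For example (an informal check, not part of the proof), let $\theta = \dfrac \pi 3$: :$\map \tan {\dfrac \pi 2 - \dfrac \pi 3} = \tan \dfrac \pi 6 = \dfrac 1 {\sqrt 3} = \dfrac {\cos \frac \pi 3} {\sin \frac \pi 3} = \cot \dfrac \pi 3$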