How do I prove the transitivity of a set of implications?
$$(1) P\rightarrow Q\tag{Hypothesis}$$ $$(2) Q\rightarrow R\tag{Hypothesis}$$ $$\quad\quad\underline{(3)\; P\quad }\tag{Assumption}$$ $$\quad \quad\quad |(4) Q \tag{1 and 3: Modus Ponens}$$ $$\quad\quad\quad |(5) R \tag{2 and 4: Modus Ponens}$$ $$(6) P \rightarrow R \tag{3 - 5: if P, then R}$$
$$\therefore ((P\rightarrow Q)\land (Q\rightarrow R))\rightarrow (P\to R)$$
(Note: step 6 is sometimes justified by "conditional introduction": if, by assuming $P$ (together with what you are given or have already established), you can derive $R$, then you have shown $P \rightarrow R$.)
Note: when I first learned propositional logic, once this result was established (proven), we could cite it as the "Hypothetical Syllogism" to justify steps in future proofs: $$P\to Q\\ \underline{Q\to R}\\ \therefore P\to R\quad$$
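For readers who like to machine-check such derivations, the conditional proof above can be rendered as a one-line term in Lean 4 (a sketch; the name `hyp_syll` is my own choice):

```lean
-- Hypothetical syllogism: from P → Q and Q → R, conclude P → R.
-- The lambda mirrors the conditional proof: assume p (step 3), apply
-- the two implications in turn (steps 4 and 5), and discharge the
-- assumption to obtain P → R (step 6).
theorem hyp_syll {P Q R : Prop} (h1 : P → Q) (h2 : Q → R) : P → R :=
  fun p => h2 (h1 p)
```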
Since the logic you are using is not quantified, and there are only three variables, you can prove the argument valid with an eight-row truth table covering all possible combinations of truth values of the three variables:
P Q R   Premise: P -> Q   Premise: Q -> R   Conclusion: P -> R
--------------------------------------------------------------
0 0 0          1                 1                   1
0 0 1          1                 1                   1
0 1 0          1                 0                   1
0 1 1          1                 1                   1
1 0 0          0                 1                   0
1 0 1          0                 1                   1
1 1 0          1                 0                   0
1 1 1          1                 1                   1
Now an argument is valid if the conclusion is true whenever all the premises are true. This means that in all the rows where both premises have a 1, we check whether the conclusion has a 1. By golly, this is the case. Therefore the argument is valid!
If we found a zero, that would be a counterexample which destroys the argument: a situation where the premises are all true, yet the conclusion is false.
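The table-checking procedure above is easy to automate. A quick sketch in Python (the names `implies` and `counterexamples` are my own) enumerates all eight assignments and collects any row where the premises hold but the conclusion fails:

```python
from itertools import product

def implies(a, b):
    """Truth table of the conditional: false only when a is true and b is false."""
    return (not a) or b

# Enumerate all 8 truth assignments for P, Q, R and collect every row
# where both premises are true but the conclusion P -> R is false.
counterexamples = [
    (p, q, r)
    for p, q, r in product([False, True], repeat=3)
    if implies(p, q) and implies(q, r) and not implies(p, r)
]
print(counterexamples)  # an empty list means the argument is valid
```

An empty result reproduces the conclusion drawn from the table: no row is a counterexample, so the argument is valid.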
In my table, the P -> Q column and its siblings are simply derived from the truth table for the conditional: P -> Q is false only when P is true and Q is false, and true for the other three combinations of truth values.
Denote by $+$ the logical *or* (disjunction) operator and by $\cdot$ the logical *and* (conjunction) operator.
We have $(A\Rightarrow B)=\bar A+B$ and $(B\Rightarrow C) =\bar B+C$
Assume $(\bar A + B)=1=(\bar B+C)$.
Then $$1=1\cdot 1=(\bar A + B)\cdot (\bar B+C) = \bar A\bar B + \bar A C + B\bar B + BC = \bar A\bar B + \bar A C + BC,$$ since $B\bar B=0$. By the assumption $\bar B + C=1$, the first two terms collapse: $\bar A\bar B + \bar A C = \bar A(\bar B + C) = \bar A$, so the sum equals $\bar A + BC$. By the distributive law, $\bar A + BC = (\bar A + B)\cdot(\bar A + C)$, and since $\bar A + B = 1$ by assumption, this reduces to $\bar A + C$. Hence $\bar A + C = 1$, i.e., $A\Rightarrow C$ holds.
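The two algebraic facts this derivation leans on can be brute-force checked over all eight assignments. A small Python sketch (variable names are my own; $0/1$ stand for false/true, `1 - a` for $\bar A$, `|` for $+$, `&` for $\cdot$):

```python
from itertools import product

# Check the distributive law  A' + BC = (A' + B)(A' + C)  over all 8 cases.
distributive_ok = all(
    ((1 - a) | (b & c)) == (((1 - a) | b) & ((1 - a) | c))
    for a, b, c in product([0, 1], repeat=3)
)

# Check that the product of the premises never exceeds A' + C,
# i.e. whenever both premises are 1, the conclusion is 1.
implication_ok = all(
    (((1 - a) | b) & ((1 - b) | c)) <= ((1 - a) | c)
    for a, b, c in product([0, 1], repeat=3)
)

print(distributive_ok, implication_ok)  # True True
```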