The transpose of a permutation matrix is its inverse.

A direct computation also works: $$(PP^T)_{ij} = \sum_{k=1}^n P_{ik} P^T_{kj} = \sum_{k=1}^n P_{ik} P_{jk}.$$ Most terms vanish: $P_{ik}$ is usually $0$, and so $P_{ik} P_{jk}$ is usually $0$. The only time $P_{ik}$ is nonzero is when it equals $1$, and in that case there is no other row $i' \neq i$ with $P_{i'k}$ nonzero ($i$ is the only row with a $1$ in column $k$), so the product $P_{ik}P_{jk}$ survives only when $j = i$; and when $j = i$, exactly one index $k$ contributes a $1$, since row $i$ contains exactly one $1$. In other words, $$\sum_{k=1}^n P_{ik} P_{jk} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{otherwise} \end{cases}$$ and this is exactly the formula for the entries of the identity matrix, so $$PP^T = I.$$
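This entrywise argument is easy to check numerically; a minimal NumPy sketch, where the particular permutation used is an arbitrary example:

```python
import numpy as np

# Permutation matrix for sigma: i -> sigma[i]; row i of P is e_{sigma[i]},
# so P[i, k] = 1 exactly when k == sigma[i].
sigma = [2, 0, 3, 1]
P = np.eye(4)[sigma]

# (P P^T)_{ij} = sum_k P_ik P_jk equals 1 iff i == j, i.e. P P^T = I.
assert np.array_equal(P @ P.T, np.eye(4))
```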


Another way to prove it is to observe that any permutation matrix is a product of elementary permutation matrices, where by elementary I mean one that swaps two entries. Since swapping rows $i$ and $j$ of the identity matrix gives the same result as swapping columns $i$ and $j$, such a matrix is symmetric, and it coincides with its own inverse (swapping the same two rows twice restores the identity). Then, writing $P=P_1\cdots P_k$ with $P_1,\ldots,P_k$ elementary, we have

$$ P^{-1} = (P_1\cdots P_k)^{-1}=P_k^{-1}\cdots P_1^{-1}=P_k\cdots P_1=P_k^t\cdots P_1^t = (P_1\cdots P_k)^t=P^t $$
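This chain of equalities can be checked on a concrete factorization; a sketch in NumPy, where `swap` is a hypothetical helper building an elementary swap matrix and the particular swaps are an arbitrary example:

```python
import numpy as np

def swap(n, i, j):
    """Elementary permutation matrix: identity with rows i and j exchanged."""
    S = np.eye(n)
    S[[i, j]] = S[[j, i]]
    return S

n = 4
P1, P2, P3 = swap(n, 0, 2), swap(n, 1, 3), swap(n, 2, 3)

# Each elementary swap is symmetric and its own inverse.
for S in (P1, P2, P3):
    assert np.array_equal(S, S.T)
    assert np.array_equal(S @ S, np.eye(n))

# For P = P1 P2 P3: P^{-1} = P3 P2 P1 = P3^t P2^t P1^t = (P1 P2 P3)^t = P^t.
P = P1 @ P2 @ P3
assert np.array_equal(P @ P.T, np.eye(n))
assert np.array_equal(P3 @ P2 @ P1, P.T)
```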


Using a little knowledge about orthogonal matrices, the following proof is pretty simple:

Since $v^tw=\sum_{i=1}^n v_iw_i$ for $v=(v_1,\ldots,v_n)$ and $w=(w_1,\ldots,w_n)$, we have $v^tv=1$ whenever $v$ is a column of $P$ (each column is a standard basis vector). On the other hand, $v^tw=0$ if $v$ and $w$ are two distinct columns of $P$, since their single nonzero entries sit in different rows. Therefore $(P^tP)_{i,j}=\delta_{i,j}$, and so $P^t=P^{-1}$.
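The orthonormality of the columns can be verified directly; a minimal NumPy sketch, again with an arbitrary example permutation:

```python
import numpy as np

# An arbitrary 5x5 permutation matrix (rows of I_5 reordered).
P = np.eye(5)[[3, 0, 4, 1, 2]]

# Columns are distinct standard basis vectors: v^t v = 1, v^t w = 0 for v != w.
for i in range(5):
    for j in range(5):
        assert P[:, i] @ P[:, j] == (1.0 if i == j else 0.0)

# Hence (P^t P)_{ij} = delta_{ij}, i.e. P^t = P^{-1}.
assert np.array_equal(P.T @ P, np.eye(5))
```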