Here is a proof that is very specific to the case of symmetric groups and is probably rather involved for a general fact like this. (In fact, the primary reason for typing this answer is to record the statement and the proof of Proposition 1 below.)
Define the (Young-)Jucys-Murphy elements $\{X_k\}_{k=1}^{n}$ as $$ X_k=(1~k)+\ldots+(k-1~k)=\sum_{i=1}^{k-1}(i~k)\in\mathbb{Z}[S_n]. $$ The statement in question is a direct consequence of Propositions 1 and 2 below.
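As a quick sanity check (not part of the argument), one can verify for small $n$ that the Jucys-Murphy elements commute pairwise, modelling $\mathbb{Z}[S_n]$ as integer-valued dictionaries over permutations; all names below are ad hoc and the encoding is only a sketch.

```python
from collections import Counter
from itertools import combinations

n = 4  # work in Z[S_4]

def transposition(i, j):
    """The transposition (i j), as a tuple of images on {0, ..., n-1}."""
    p = list(range(n))
    p[i], p[j] = p[j], p[i]
    return tuple(p)

def compose(p, q):
    """(p . q)(x) = p(q(x)); apply q first, then p."""
    return tuple(p[q[x]] for x in range(n))

def mul(a, b):
    """Product in Z[S_n]; elements are Counters {permutation: coefficient}."""
    out = Counter()
    for p, cp in a.items():
        for q, cq in b.items():
            out[compose(p, q)] += cp * cq
    return Counter({g: c for g, c in out.items() if c})

def X(k):
    """Jucys-Murphy element X_k = (1 k) + ... + (k-1 k), 1-indexed; empty for k = 1."""
    return Counter(transposition(i, k - 1) for i in range(k - 1))

for j, k in combinations(range(2, n + 1), 2):
    assert mul(X(j), X(k)) == mul(X(k), X(j))
print("X_2, ..., X_4 commute pairwise in Z[S_4]")
```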
Proposition 1 (Jucys, Murphy). The center of the group ring $\mathbb{Z}[S_n]$ coincides with the subring of $\mathbb{Z}[S_n]$ generated by $f(X_1,\ldots,X_n)$, where $f$ runs over $\mathbb{Z}[x_1,\ldots,x_n]^{S_n}$ (symmetric polynomials in $n$ variables with integer coefficients).
Proof (sketch). We outline the main ideas; the details can be found in T. Ceccherini-Silberstein, F. Scarabotti, F. Tolli, Representation Theory of the Symmetric Groups: The Okounkov-Vershik Approach, Character Formula, and Partition Algebras, Cambridge University Press (2010), Section 4.4.2 and Theorem 4.4.5.
The fact that any expression of the form $f(X_1,\ldots,X_n)$ with $f\in\mathbb{Z}[x_1,\ldots,x_n]^{S_n}$ is central follows from the identities $$ s_iX_js_i^{-1}=X_j~\text{for}~j\neq i,i+1, \\ s_i(X_{i}+X_{i+1})s_i^{-1}=X_{i}+X_{i+1},\quad s_i(X_{i}X_{i+1})s_i^{-1}=X_{i}X_{i+1}, $$ where $s_i=(i~i+1)$: any symmetric polynomial in $X_i,X_{i+1}$ (with the remaining variables fixed) is a polynomial in $X_i+X_{i+1}$ and $X_iX_{i+1}$, so every such $f(X_1,\ldots,X_n)$ commutes with each $s_i$, and the $s_i$ generate $S_n$. (Note that the individual $X_k$ are not central: $s_iX_{i+1}s_i^{-1}=X_i+s_i$.)
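These conjugation identities are easy to confirm by direct computation; here is a self-contained Python sketch doing so in $\mathbb{Z}[S_4]$ (the encoding of the group ring as Counters over image-tuples is an illustration, not from the source):

```python
from collections import Counter

n = 4

def transposition(i, j):
    p = list(range(n))
    p[i], p[j] = p[j], p[i]
    return tuple(p)

def compose(p, q):
    return tuple(p[q[x]] for x in range(n))

def mul(a, b):
    out = Counter()
    for p, cp in a.items():
        for q, cq in b.items():
            out[compose(p, q)] += cp * cq
    return Counter({g: c for g, c in out.items() if c})

def add(a, b):
    out = Counter(a)
    out.update(b)
    return Counter({g: c for g, c in out.items() if c})

def X(k):
    # X_k = (1 k) + ... + (k-1 k), 1-indexed; empty for k = 1
    return Counter(transposition(i, k - 1) for i in range(k - 1))

def conj(s, a):
    # s a s^{-1}, where s is an involution given as a one-term Counter
    return mul(mul(s, a), s)

for i in range(1, n):                                   # s_i = (i, i+1)
    s = Counter([transposition(i - 1, i)])
    for j in range(1, n + 1):
        if j not in (i, i + 1):
            assert conj(s, X(j)) == X(j)                # s_i X_j s_i^{-1} = X_j
    assert conj(s, add(X(i), X(i + 1))) == add(X(i), X(i + 1))
    assert conj(s, mul(X(i), X(i + 1))) == mul(X(i), X(i + 1))
    assert conj(s, X(i + 1)) == add(X(i), s)            # s_i X_{i+1} s_i^{-1} = X_i + s_i
print("conjugation identities hold in Z[S_4]")
```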
The harder part is to show that any element of the center of $\mathbb{Z}[S_n]$ can be represented as $f(X_1,\ldots,X_n)$ for $f\in\mathbb{Z}[x_1,\ldots,x_n]^{S_n}$.
The center of $\mathbb{Z}[S_n]$ is additively generated by the elements $$ C_{\mu}=\sum_{\sigma\in S_n,~\mathrm{ct}(\sigma)=\mu}\sigma, $$ where $\mu$ runs over partitions of $n$, and $\mathrm{ct}(\sigma)$ stands for the cycle type of permutation $\sigma$. In other words, for $\mu=(\mu_1,\ldots,\mu_k)\vdash n$ with $\mu_1\ge\ldots\ge\mu_k\ge 1$ the element $C_{\mu}$ is the sum of all permutations that are products of $k$ disjoint cycles of lengths $\mu_1,\ldots,\mu_k$, respectively.
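For small $n$ one can check directly that these class sums are central; below is a hypothetical brute-force verification in $\mathbb{Z}[S_4]$ (all function names are ad hoc):

```python
from collections import Counter
from itertools import permutations

n = 4

def cycle_type(p):
    """Cycle type of a permutation (tuple of images) as a decreasing partition of n."""
    seen, lens = set(), []
    for i in range(n):
        if i not in seen:
            length, j = 0, i
            while j not in seen:
                seen.add(j)
                j = p[j]
                length += 1
            lens.append(length)
    return tuple(sorted(lens, reverse=True))

def compose(p, q):
    return tuple(p[q[x]] for x in range(n))

def mul(a, b):
    out = Counter()
    for p, cp in a.items():
        for q, cq in b.items():
            out[compose(p, q)] += cp * cq
    return Counter({g: c for g, c in out.items() if c})

def C(mu):
    """Class sum C_mu: the sum of all permutations of cycle type mu."""
    return Counter(p for p in permutations(range(n)) if cycle_type(p) == tuple(mu))

for mu in [(4,), (3, 1), (2, 2), (2, 1, 1), (1, 1, 1, 1)]:
    c = C(mu)
    for g in permutations(range(n)):
        d = Counter([g])
        assert mul(d, c) == mul(c, d)   # C_mu commutes with every group element
print("every class sum C_mu is central in Z[S_4]")
```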
Let $m_{\lambda}(x_1,\ldots,x_n)$ denote the monomial symmetric polynomial in the variables $x_1,\ldots,x_n$ associated with a partition $\lambda$. The main observation is the following.
Claim. For any $\mu=(\mu_1,\ldots,\mu_k)\vdash n$ with $\mu_1\ge \ldots\ge\mu_k\ge 1$ we have the identity: $$ m_{\mu_1-1,\ldots,\mu_k-1}(X_1,\ldots,X_n)=C_{\mu}+\sum_{\nu}k_{\nu}C_{\nu},\quad k_{\nu}\in\mathbb{Z}, $$ where the summation runs over partitions $\nu\vdash n$ that have more parts equal to $1$ than $\mu$ does.
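As an illustration (not in the source), take $n=3$: for $\mu=(2,1)$ the left-hand side is $m_{(1,0)}(X_1,X_2,X_3)=X_1+X_2+X_3$, which is exactly $C_{(2,1)}$, and for $\mu=(3)$ one computes $m_{(2)}(X_1,X_2,X_3)=X_1^2+X_2^2+X_3^2=C_{(3)}+3\,C_{(1,1,1)}$, where $\nu=(1,1,1)$ indeed has more $1$-parts than $(3)$. A short script confirming this:

```python
from collections import Counter
from itertools import permutations

n = 3

def transposition(i, j):
    p = list(range(n))
    p[i], p[j] = p[j], p[i]
    return tuple(p)

def compose(p, q):
    return tuple(p[q[x]] for x in range(n))

def mul(a, b):
    out = Counter()
    for p, cp in a.items():
        for q, cq in b.items():
            out[compose(p, q)] += cp * cq
    return Counter({g: c for g, c in out.items() if c})

def add(*terms):
    out = Counter()
    for t in terms:
        out.update(t)
    return Counter({g: c for g, c in out.items() if c})

def scale(coef, a):
    return Counter({g: coef * v for g, v in a.items()})

def X(k):
    return Counter(transposition(i, k - 1) for i in range(k - 1))

def cycle_type(p):
    seen, lens = set(), []
    for i in range(n):
        if i not in seen:
            length, j = 0, i
            while j not in seen:
                seen.add(j)
                j = p[j]
                length += 1
            lens.append(length)
    return tuple(sorted(lens, reverse=True))

def C(mu):
    return Counter(p for p in permutations(range(n)) if cycle_type(p) == tuple(mu))

# mu = (2,1):  X_1 + X_2 + X_3 = C_{(2,1)}
assert add(X(1), X(2), X(3)) == C((2, 1))
# mu = (3):    X_1^2 + X_2^2 + X_3^2 = C_{(3)} + 3 C_{(1,1,1)}
lhs = add(mul(X(1), X(1)), mul(X(2), X(2)), mul(X(3), X(3)))
assert lhs == add(C((3,)), scale(3, C((1, 1, 1))))
print("claim verified for n = 3")
```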
The claim is proven by analysing the permutations that arise when the left-hand side is expanded. The proof of the proposition is then concluded by showing, by downward induction on the number of parts of $\mu$ equal to $1$, that $C_{\mu}=f(X_1,\ldots,X_n)$ for some $f\in\mathbb{Z}[x_1,\ldots,x_n]^{S_n}$. $\square$
Remark. If one does not care about integer coefficients and works over $\mathbb{Q}$ or $\mathbb{C}$, then one can induct on the number of fixed points, use the power sums $X_1^{r-1}+\ldots+X_{n}^{r-1}$ to generate the classes of $r$-cycles, and use the expansion $C_{\mu}=\frac{1}{|\operatorname{Stab}(\mu)|}C_{(\mu_1)}\cdots C_{(\mu_k)}+\ldots$ to generate the other conjugacy classes.
Proposition 2. The elements $\{X_k\}_{k=1}^{n}\subset\mathbb{Z}[S_n]$ can be simultaneously diagonalised in any finite-dimensional representation of $S_n$. Moreover, all eigenvalues of the corresponding operators are integers.
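One can watch this happen numerically. In the left regular representation of $S_3$, the matrices of $X_2$ and $X_3$ commute, are symmetric, and have integer spectra; the sketch below (using numpy, with ad hoc names) is only an illustration of the proposition for $n=3$:

```python
import numpy as np
from itertools import permutations

n = 3
elems = sorted(permutations(range(n)))        # the 6 elements of S_3
index = {g: i for i, g in enumerate(elems)}

def compose(p, q):
    return tuple(p[q[x]] for x in range(n))

def transposition(i, j):
    p = list(range(n))
    p[i], p[j] = p[j], p[i]
    return tuple(p)

def left_mult_matrix(g):
    """Matrix of h -> g*h in the left regular representation."""
    M = np.zeros((len(elems), len(elems)))
    for h in elems:
        M[index[compose(g, h)], index[h]] = 1.0
    return M

def X_matrix(k):
    """Matrix of the JM element X_k (1-indexed) in the regular representation."""
    return sum(left_mult_matrix(transposition(i, k - 1)) for i in range(k - 1))

X2, X3 = X_matrix(2), X_matrix(3)
assert np.allclose(X2 @ X3, X3 @ X2)          # the operators commute
for M in (X2, X3):
    ev = np.linalg.eigvalsh(M)                # symmetric, hence real spectrum
    assert np.allclose(ev, np.round(ev))      # all eigenvalues are integers
print("spectrum of X_3:", np.round(np.linalg.eigvalsh(X3)).astype(int))
```

The spectrum of $X_3$ here consists of the contents $\pm 1$ (each twice, from the two copies of the two-dimensional representation) together with $2$ and $-2$ (from the trivial and sign representations), in line with the remark below Proposition 2.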
A nice and rather elementary proof of the proposition above is given by Igor Makhlin in this answer.
Remark. In fact, as follows from the Okounkov-Vershik approach to the representation theory of $S_n$, in the irreducible representation of $S_n$ corresponding to a partition $\lambda\vdash n$ one can choose a basis $\{v_T\}$ parameterised by the standard tableaux $T\in\operatorname{Tab}(\lambda)$ of shape $\lambda$. Then $\{X_k\}_{k=1}^{n}$ act diagonally in this basis, and the eigenvalue of $X_k$ on $v_T$ equals the content $c_k(T)=\mathrm{col}_k(T)-\mathrm{row}_k(T)$ of the box of $T$ containing $k$.
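To make the remark concrete, here is a throwaway enumeration of the standard tableaux of a small shape together with their content vectors $(c_1(T),\ldots,c_n(T))$; the shape $(2,1)$ and all names are just illustrative choices:

```python
from itertools import permutations

def standard_tableaux(shape):
    """Brute-force the standard tableaux of a shape, yielding dicts {entry: (row, col)}."""
    boxes = [(r, c) for r, length in enumerate(shape) for c in range(length)]
    n = len(boxes)
    for values in permutations(range(1, n + 1)):
        T = dict(zip(boxes, values))
        rows_ok = all((r, c - 1) not in T or T[(r, c - 1)] < T[(r, c)] for (r, c) in boxes)
        cols_ok = all((r - 1, c) not in T or T[(r - 1, c)] < T[(r, c)] for (r, c) in boxes)
        if rows_ok and cols_ok:
            yield {v: box for box, v in T.items()}

def contents(T, n):
    """Content vector (c_1(T), ..., c_n(T)), where c_k = col - row of the box of k."""
    return tuple(T[k][1] - T[k][0] for k in range(1, n + 1))

shape = (2, 1)  # X_k acts on v_T with eigenvalue c_k(T)
for T in standard_tableaux(shape):
    print(contents(T, sum(shape)))
```

For shape $(2,1)$ this prints the two content vectors $(0,1,-1)$ and $(0,-1,1)$, matching the spectra of $X_2$ and $X_3$ on the two-dimensional irreducible representation of $S_3$.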