$\newcommand{\Si}{\Sigma}$Your first question: "Is my projection optimal (not necessarily unique)?" The answer to this is: of course not. Moreover, your projection is usually not even in $S$.
Indeed, as was finally clarified in your comments, we have \begin{equation*} S=S_1\cap S_2, \end{equation*} where \begin{equation*} S_1=\{x\in\{0,1\}^{n\times n}\colon x_{i,j}+x_{j,i}\le1\ \forall(i,j)\in[n]\times[n]\}, \end{equation*} \begin{equation*} S_2=\{ x\in \{0,1\}^{n\times n}\colon\sum_{j=1}^{n}x_{i,j}=1\ \forall i\in[n]\}, \end{equation*} and $[n]:= \{1,...,n\}$.
Let $Px$, $P_1x$, and $P_2x$ denote the projections of $x$ onto $S$, $S_1$, and $S_2$, respectively, so that your projection of $x$ is $P_{12}x:=P_2(P_1x)$.
So, your first question is this: Is it true for all $x\in[0,1]^{n\times n}$ that \begin{equation*} Px=P_{12}x\text{?} \tag{1}\label{1} \end{equation*}
The (nonlinear) projection $P_1x$ of $x\in[0,1]^{n\times n}$ is described quite simply: \begin{equation*} (P_1x)_{i,j}=x_{i,j}\,1(x_{i,j}>x_{j,i}) \end{equation*} for all $(i,j)\in[n]\times[n]$; here we exclude the "zero-probability" case when $x_{i,j}=x_{j,i}$ for some distinct $i$ and $j$ in $[n]$.
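Since the notebook below is reproduced only as an image, here is a short NumPy sketch of this $P_1$ (the function name is mine, and the code is 0-indexed):

```python
import numpy as np

def P1(x):
    """(P1 x)_{i,j} = x_{i,j} * 1(x_{i,j} > x_{j,i}): keep the larger entry
    of each pair (x_{i,j}, x_{j,i}) and zero out the smaller one.
    The diagonal is always zeroed, since x_{i,i} > x_{i,i} is false."""
    return np.where(x > x.T, x, 0.0)
```

For instance, `P1` applied to $\begin{pmatrix}0.5&0.7\\0.3&0.2\end{pmatrix}$ gives $\begin{pmatrix}0&0.7\\0&0\end{pmatrix}$.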
Note that the matrices $x\in S_2$ are in bijective correspondence with the $n$-tuples $s$ in the set $[n]^n$, where the correspondence is given by the formula \begin{equation*} x_{i,j}=X(s)_{i,j}:=1(j=s_i) \end{equation*} for all $(i,j)\in[n]\times[n]$, so that $s_i$ is the position of the (only) $1$ in the $i$th row of the matrix $x\in S_2$. The square of the Euclidean distance $d(x,X(s))$ from a matrix $x\in[0,1]^{n\times n}$ to the matrix $X(s)$ corresponding to an $n$-tuple $s\in[n]^n$ is \begin{equation*} \begin{aligned} d(x,X(s))^2&=\sum_{(i,j)\in[n]\times[n]}(x_{i,j}-X(s)_{i,j})^2 \\ &=\sum_{(i,j)\in[n]\times[n]}(x_{i,j}-1(j=s_i))^2 \\ &=\sum_{i\in[n]}\Big((x_{i,s_i}-1)^2-x_{i,s_i}^2+\sum_{j\in[n]}x_{i,j}^2\Big) \\ &=n+\sum_{(i,j)\in[n]\times[n]}x_{i,j}^2-2\sum_{i\in[n]}x_{i,s_i}. \end{aligned} \end{equation*}
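This identity is easy to check numerically; here is a small NumPy sketch (the helper `X`, the random matrix, and the sample tuples are mine, with 0-indexed tuples):

```python
import numpy as np

def X(s):
    """0/1 matrix X(s): row i has its single 1 in column s[i] (0-indexed)."""
    n = len(s)
    m = np.zeros((n, n))
    m[np.arange(n), s] = 1.0
    return m

# Check d(x, X(s))^2 = n + sum_{i,j} x_{i,j}^2 - 2 sum_i x_{i,s_i}
# on a few sample tuples s.
n = 3
rng = np.random.default_rng(0)
x = rng.random((n, n))
for s in [(0, 1, 2), (1, 2, 0), (2, 2, 2)]:
    lhs = np.sum((x - X(s)) ** 2)
    rhs = n + np.sum(x ** 2) - 2 * sum(x[i, s[i]] for i in range(n))
    assert np.isclose(lhs, rhs)
```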
So, projecting a matrix $x\in[0,1]^{n\times n}$ onto $S_2$ or onto $S$ is equivalent to maximizing \begin{equation*} \Si_x(s):=\sum_{i\in[n]}x_{i,s_i} \tag{2}\label{2} \end{equation*} over all $n$-tuples $s\in[n]^n$ corresponding to the matrices $X(s)$ in $S_2$ or $S$, respectively.
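For $S_2$, the maximization of \eqref{2} is over all of $[n]^n$ and hence decouples over the rows: take $s_i$ to be the position of the largest entry of row $i$. A NumPy sketch (function name mine, 0-indexed):

```python
import numpy as np

def P2(x):
    """Projection onto S2: maximize sum_i x[i, s_i] row by row,
    i.e., put each row's single 1 at that row's largest entry.
    (Ties are broken by argmax's first-index rule.)"""
    n = x.shape[0]
    y = np.zeros_like(x)
    y[np.arange(n), np.argmax(x, axis=1)] = 1.0
    return y
```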
The projections $P$, $P_1$, $P_2$ described above, and hence $P_{12}=P_2\circ P_1$, are implemented in a Mathematica notebook whose image is shown below. In the notebook, we take a (pseudo)random matrix $x\in[0,1]^{n\times n}$ (with $n=3$) and see that $P_{12}x\notin S$; so, $P_{12}x$ cannot be the same as the projection $Px$ of $x$ onto $S$. (In the notebook, we refer to the $n$-tuples $s\in[n]^n$ corresponding to matrices $X(s)\in S_1$ as good; clearly, $s\in[n]^n$ is good iff $s_{s_i}\ne i$ for all $i\in[n]$.)
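Since the notebook is reproduced only as an image, here is a Python sketch of the same kind of experiment; the particular matrix $x$ below is my own hand-picked example (not the notebook's random one), chosen so that the failure is deterministic:

```python
import numpy as np

def P1(x):  # zero out the smaller entry of each symmetric pair (and the diagonal)
    return np.where(x > x.T, x, 0.0)

def P2(x):  # each row's single 1 goes to that row's largest entry
    n = x.shape[0]
    y = np.zeros_like(x)
    y[np.arange(n), np.argmax(x, axis=1)] = 1.0
    return y

def in_S(y):  # binary, row sums 1, and y_{i,j} + y_{j,i} <= 1 for all i, j
    return (np.isin(y, (0.0, 1.0)).all()
            and (y.sum(axis=1) == 1).all()
            and (y + y.T <= 1).all())

# Row 0 loses every pairwise comparison, so row 0 of P1(x) is identically
# zero; P2 then puts that row's 1 on the diagonal (argmax of an all-zero
# row is 0), violating y_{0,0} + y_{0,0} <= 1.
x = np.array([[0.5, 0.1, 0.1],
              [0.9, 0.5, 0.7],
              [0.9, 0.2, 0.5]])
print(in_S(P2(P1(x))))  # False: P_{12} x is not even in S
```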
As for your second question, "Is there any algorithm to solve this problem?", I can say the following:
Asking multiple questions is not encouraged on MathOverflow. So, please consider moving this question to another post.
It was shown above that the subset $S$ of the set $\{0,1\}^{n\times n}$ of cardinality $2^{n^2}$ can be conveniently indexed by a subset of the set $[n]^n$ of the much smaller cardinality $n^n$.
Therefore, the procedure described in the Mathematica notebook should be feasible for $n\le10$.
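A brute-force Python sketch of that procedure (function name mine, tuples 0-indexed, so the good condition reads `s[s[i]] != i`):

```python
import itertools
import numpy as np

def project_onto_S(x):
    """Brute-force projection onto S: maximize sum_i x[i, s[i]] (formula (2))
    over the good tuples s in [n]^n, then return the 0/1 matrix X(s).
    At most n^n candidates, so this is feasible only for small n.
    (For n = 2 there are no good tuples -- S is empty -- and max() raises.)"""
    n = x.shape[0]
    best_s = max(
        (s for s in itertools.product(range(n), repeat=n)
         if all(s[s[i]] != i for i in range(n))),
        key=lambda s: sum(x[i, s[i]] for i in range(n)))
    y = np.zeros_like(x)
    y[np.arange(n), best_s] = 1.0
    return y
```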
For $n>10$, some "greedy"-type algorithms can probably give reasonable approximations to the desired projection.
Here is an image of the mentioned Mathematica notebook:

