@@ -213,8 +213,7 @@ discrepancy and this heuristic.
\label{rand_iter3}
\end{figure}

-%# TODO sa n'est pas evolutionnaire
-\subsection{Evolutionary heuristic: Simulated annealing and local search}
+\subsection{Local search with simulated annealing}
The second heuristic implemented is a randomized local search with
simulated annealing. This heuristic is inspired by the physical
process of annealing in metallurgy.
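For readers unfamiliar with the technique, a minimal sketch of such a search is given below. The `energy`, `neighbour`, temperature `t0`, cooling factor `alpha`, and iteration count are all generic placeholders here, not the report's actual choices (those are given in the flowchart): the point is only the Metropolis-style acceptance rule, which always accepts improvements and accepts worsening moves with probability $e^{-\Delta/t}$.

```python
import math
import random

def anneal(init, energy, neighbour, t0=1.0, alpha=0.99, iters=400):
    """Randomized local search with a simulated-annealing acceptance rule.

    All parameters are illustrative placeholders, not the report's settings.
    """
    state = init
    e = energy(state)
    best, best_e = state, e
    t = t0
    for _ in range(iters):
        cand = neighbour(state)           # random local move
        ce = energy(cand)
        # accept improvements always; accept worsenings with prob. exp(-delta/t)
        if ce <= e or random.random() < math.exp(-(ce - e) / t):
            state, e = cand, ce
            if e < best_e:
                best, best_e = state, e
        t *= alpha                        # geometric cooling (an assumption)
    return best, best_e
```

As the temperature decreases, the search degenerates into a plain greedy local search, which is why the cooling schedule matters in practice.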
@@ -239,7 +238,7 @@ The whole algorithm is described in the flowchart~\ref{flow_rec}.
\begin{figure}
\begin{mdframed}
\includegraphics[scale=0.4]{flow_recuit.pdf}
-\caption{Flowchart of the simulated annealing local search heuristic}
+\caption{Flowchart of the local search with simulated annealing heuristic}
\label{flow_rec}
\end{mdframed}
\end{figure}
@@ -297,7 +296,26 @@ rates for fully random search with 400 iterations.
\label{iter_sa}
\end{figure}

-\subsection{Genetic (5+5) search}
+\subsection{Genetic $(\mu+\lambda)$ search}
+
+The third heuristic implemented is the $(\mu+\lambda)$ genetic search.
+This heuristic is inspired by the evolution of species: a family
+of $\mu$ genes is maintained (generated randomly at the beginning),
+from which $\lambda$ new genes are derived. A gene is the set of parameters
+we are optimising, i.e. the permutations.
+Each new gene is derived either from one gene by applying a mutation
+(here a transposition in one of the permutations), or from two
+genes by applying a crossover, a blending of both genes (the
+algorithm is described in detail further on). The probability of
+applying a mutation is $c$, the third parameter of the algorithm
+alongside $\mu$ and $\lambda$. After that, only the $\mu$ best
+genes are kept, according to their fitness, and the process
+starts again.
+
+Because varying $\mu$ or $\lambda$ does not fundamentally change
+the algorithm, we fixed $\mu=\lambda=5$ once and for all,
+which seemed to be a good trade-off between the running time of
+each iteration and the size of the family.
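The generational loop just described can be sketched as follows. For simplicity, a gene is represented by a single permutation rather than the full set of permutations, `fitness` stands for the discrepancy-based score (lower is better), and the crossover is passed in as a function since the report's own blending algorithm is described later:

```python
import random

def transpose(perm):
    """Mutation: transposition of two random positions of a permutation."""
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def step(family, fitness, lam=5, c=0.5, crossover=None):
    """One (mu+lambda) generation: derive lam offspring, keep the mu best.

    A sketch only: here a gene is a single permutation, and `crossover`
    is an arbitrary two-parent blending function supplied by the caller.
    """
    mu = len(family)
    offspring = []
    for _ in range(lam):
        if crossover is not None and random.random() > c:
            a, b = random.sample(family, 2)   # blend two parent genes
            offspring.append(crossover(a, b))
        else:
            offspring.append(transpose(random.choice(family)))
    # selection: keep the mu best genes according to fitness (lower = better)
    return sorted(family + offspring, key=fitness)[:mu]
```

With $\mu=\lambda=5$ each generation evaluates five new genes, which is what makes the per-iteration cost comparable to the other heuristics.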

\begin{figure}
@@ -308,23 +326,6 @@ rates for fully random search with 400 iterations.
\end{mdframed}
\end{figure}

-\subsubsection{Dependence on the parameter p}
-First experiments were made to select the value for the crossover parameter
-p. Results are compiled in graphs~\ref{res_gen2},~\ref{res_gen2z},\ref{res_gen3}
-and~\ref{res_gen4}.
-Graph~\ref{res_gen2}, represents the results obtained
-in dimension 2 between 10 and 500 points. The curve obtained is, with no
-surprise again,
-the characteristic curve of the average evolution of the discrepancy we already
-saw with the previous experiments.
-The most interesting part of these results are concentrated --- once again ---
-between 80 and 160 points were the different curves splits.
-The graph~\ref{res_gen2z} is a zoom of~\ref{res_gen2} in this window, and
-graphs~\ref{res_gen3} and~\ref{res_gen4} are focused directly into it too.
-We remark that in dimension 2, the results are better for $p$ close to $0.5$
-whereas for dimension 3 and 4 the best results are obtained for $p$ closer to
-$0.1$.
-

\subsubsection{Algorithm of crossover}

We designed a crossover for permutations. The idea is simple: given two
@@ -381,32 +382,49 @@ permutations.
\end{algorithm}

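The report's own blending is given in the algorithm environment above (its listing falls outside this hunk). As a generic illustration of what a permutation crossover can look like, here is the standard order crossover (OX); this is not necessarily the scheme designed in the report:

```python
import random

def order_crossover(a, b):
    """Order crossover (OX), a standard permutation crossover.

    Illustrative only; not necessarily the report's blending algorithm.
    Keeps a random slice of parent a and fills the remaining slots with
    the other elements in the order they appear in parent b.
    """
    n = len(a)
    i, j = sorted(random.sample(range(n), 2))
    seg = a[i:j]                            # slice inherited from parent a
    fill = [x for x in b if x not in seg]   # b's order, minus the slice
    return fill[:i] + seg + fill[i:]
```

Any such scheme must output a valid permutation, i.e. contain each element exactly once, which is the main constraint that rules out naive coordinate-wise blending.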
+\subsubsection{Dependence on the parameter $c$}
+First experiments were made to select the value of the parameter
+$c$. Results are compiled in graphs~\ref{res_gen2},~\ref{res_gen2z},~\ref{res_gen3}
+and~\ref{res_gen4}.
+Graph~\ref{res_gen2} represents the results obtained
+in dimension 2 between 10 and 500 points. The curve obtained is, again
+unsurprisingly,
+the characteristic curve of the average evolution of the discrepancy we already
+saw in the previous experiments.
+The most interesting part of these results is concentrated --- once again ---
+between 80 and 160 points, where the different curves split.
+Graph~\ref{res_gen2z} is a zoom of~\ref{res_gen2} on this window, and
+graphs~\ref{res_gen3} and~\ref{res_gen4} focus on it as well.
+We remark that in dimension 2, the results are better for $c$ close to $0.5$,
+whereas for dimensions 3 and 4 the best results are obtained for $c$ closer to
+$0.1$.
+

\begin{figure}
\includegraphics[scale=0.3]{Results/res_gen_2.png}
-\caption{Dependence on parameter p: D=2}
+\caption{Dependence on parameter $c$: D=2}
\label{res_gen2}
\end{figure}

\begin{figure}
\includegraphics[scale=0.3]{Results/res_gen_2_zoom.png}
-\caption{Dependence on parameter p (zoom): D=2}
+\caption{Dependence on parameter $c$ (zoom): D=2}
\label{res_gen2z}
\end{figure}
\begin{figure}
\includegraphics[scale=0.3]{Results/res_gen_3_zoom.png}
-\caption{Dependence on parameter p: D=3}
+\caption{Dependence on parameter $c$: D=3}
\label{res_gen3}
\end{figure}

\begin{figure}
\includegraphics[scale=0.3]{Results/res_gen_4_zoom.png}
-\caption{Dependence on parameter p: D=4}
+\caption{Dependence on parameter $c$: D=4}
\label{res_gen4}
\end{figure}

Once again we investigated the stability
-of the algorithm with regards to the number of iterations. Once again we
+of the algorithm with regard to the number of iterations. Once again we
restricted the window between 80 and 180 points were curves are split.
The results are compiled in graph~\ref{gen_iter}.
An interesting phenomena can be observed: the error rates are getting really