espitau 8 years ago
parent
commit
001bccf77f
2 changed files with 95 additions and 49 deletions

+ 37 - 23
main.tex

@@ -48,19 +48,19 @@ Experiments were conducted on two machines:
 \end{itemize}
 
 \begin{figure}
-  \label{main_flow}
 \includegraphics[scale=0.6]{main_flow.pdf}
 \caption{Tool overview}
+  \label{main_flow}
 \end{figure}
 
 On these machines, some basic profiling has made clear that 
 the main bottleneck of the computations lies in the \emph{computation
-of the discrepency}. The chosen algorithm and implantation of this 
+of the discrepancy}. The chosen algorithm and implementation of this 
 cost function is the DEM-algorithm of \emph{Magnus Wahlstr\"om}.\medskip
 
 All the experiments have been conducted in dimensions 2, 3 and 4 
 --- with a fixed Halton basis 7, 13, 29, 3 ---. Some minor tests have
-been made in order to discuss the dependency of the discrepency and 
+been made in order to discuss the dependency of the discrepancy and 
 efficiency of the heuristics with regard to the values chosen for the
 prime base. The average results remain roughly identical when 
 changing these primes and taking them in the range [2, 100]. For such
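As a reading aid, a Halton point in the fixed prime bases is a vector of radical inverses; the helper names below (`radical_inverse`, `halton_point`) are hypothetical, a minimal sketch of the standard definition rather than the tool's actual code:

```python
def radical_inverse(n: int, base: int) -> float:
    """Van der Corput radical inverse: reflect the base-`base` digits
    of n about the radix point, giving a value in [0, 1)."""
    inv, denom = 0.0, 1.0
    while n > 0:
        n, digit = divmod(n, base)
        denom *= base
        inv += digit / denom
    return inv

def halton_point(n: int, bases=(7, 13, 29)) -> tuple:
    """n-th Halton point for a tuple of prime bases such as (7, 13, 29)."""
    return tuple(radical_inverse(n, b) for b in bases)

# First few two-dimensional points with bases (7, 13):
pts = [halton_point(n, (7, 13)) for n in range(5)]
```

The heuristics discussed below search over permuted (scrambled) variants of this construction for sets of minimal discrepancy.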
@@ -78,15 +78,15 @@ extremal values are also given in order to construct error bands graphs.
 
 The conduct of one experiment is described in the 
 flowchart~\ref{insight_flow}. The number of iterations of the heuristic is 
-I and the number of full restart is N. Th efunction Heuristic() correspond to
-a single step of the chosen heursitic. We now present an in-depth view of
+I and the number of full restarts is N. The function Heuristic() corresponds to
+a single step of the chosen heuristic. We now present an in-depth view of
 the implemented heuristics.
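The structure just described (N full restarts of I heuristic iterations, averaged, with extremes kept for the error bands) can be sketched as follows; `heuristic_step`, `discrepancy`, and the starting permutation are hypothetical stand-ins, not the tool's code:

```python
import random

def run_experiment(heuristic_step, discrepancy, I, N):
    """N full restarts of I heuristic iterations each; returns the mean of
    the best values plus the extremes used for the error-band graphs."""
    bests = []
    for _ in range(N):                       # full restarts
        state = list(range(16))              # hypothetical starting permutation
        random.shuffle(state)
        best = discrepancy(state)
        for _ in range(I):                   # iterations of Heuristic()
            state = heuristic_step(state)
            best = min(best, discrepancy(state))
        bests.append(best)
    mean = sum(bests) / len(bests)
    return mean, min(bests), max(bests)      # average and error-band extremes
```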
 
 \begin{figure}
  \begin{mdframed}
-  \label{insight_flow}
 \includegraphics[scale=0.4]{insight.pdf}
 \caption{Flowchart of a single experiment}
+\label{insight_flow}
 \end{mdframed}
 \end{figure}
 
@@ -94,10 +94,10 @@ the implemented heuristics.
 \section{Heuristics developed}
 
 \subsection{Fully random search (Test case)}
- The first heuristic implemented is rge random search. We generates
+ The first heuristic implemented is the random search. We generate
  random sets of Halton points and iteratively select the best set with 
  regard to its discrepancy. The process is wrapped up in the 
- flowchart~\ref{rand_flow}. In order to generate at each step a random 
+ flowchart~\ref{random_flow}. In order to generate at each step a random 
  permutation, we transform it directly from the previous one.
   More precisely, the permutation is a singleton object which has a method 
   random, built on the Knuth--Fisher--Yates shuffle. This algorithm allows
@@ -105,9 +105,9 @@ the implemented heuristics.
   this fact and detail the algorithm in the following section.
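The shuffle mentioned here is the textbook Knuth--Fisher--Yates algorithm; a minimal sketch in plain Python (rather than the singleton object of the implementation):

```python
import random

def fisher_yates(perm):
    """Knuth-Fisher-Yates shuffle: permutes `perm` in place, uniformly at
    random, in O(len(perm)) time, so each random state can be derived
    directly from the previous one."""
    for i in range(len(perm) - 1, 0, -1):
        j = random.randint(0, i)             # uniform over 0..i inclusive
        perm[i], perm[j] = perm[j], perm[i]
    return perm

p = fisher_yates(list(range(10)))
```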
 \begin{figure}
  \begin{mdframed}
-  \label{rand_flow}
 \includegraphics[scale=0.4]{flow_rand.pdf}
 \caption{Flowchart of the random search}
+  \label{random_flow}
 \end{mdframed}
 \end{figure}
 
@@ -155,43 +155,57 @@ the naive implementation.
 
 
 \subsubsection{Results and stability}
-
 We first want to analyze the dependence of the results on the number of 
 iterations of the heuristic, in order to discuss its stability. 
-The results are compiled in the figures~\ref{rand_iter2}\ref{rand_iter3}.
+The results are compiled in figures~\ref{rand_iter2} and~\ref{rand_iter3}.
 As expected from a fully random search, the error bands are very large for 
 a low number of iterations ($15\%$ of the value for 200 iterations) and tend
-to shrink with a bigger number of iterations (arround $5\%$ for 1600 iterations).
+to shrink as the number of iterations grows (around $5\%$ for 1600 iterations).
 The average results are quite stable: they decrease progressively with 
-the growing number of iterations, but seems to get to a limits after 1000 iterations. This value acts as a threshold for the interesting number of iterations.
+the growing number of iterations, but seem to reach a limit after 1000 
+iterations. This value acts as a threshold for the useful number of iterations.
 As such, experiments can be conducted with \emph{only} 1000 iterations, 
-without alterating too much the quality of the set with regards to its
-discrepency and this heuristic.
+without altering too much the quality of the set this heuristic produces 
+with regard to its discrepancy.
 
 \begin{figure}
-  \label{rand_iter2}
 \includegraphics[scale=0.3]{Results/random_iter.png}
-\caption{Dependence on iterations number: D=2}
+\caption{Dependence on iterations, dimension 2}
+\label{rand_iter2}
 \end{figure}
 \begin{figure}
-  \label{rand_iter3}
 \includegraphics[scale=0.3]{Results/random_iter_3.png}
-\caption{Dependence on iterations number: D=3}
+\caption{Dependence on iterations, dimension 3}
+\label{rand_iter3}
 \end{figure}
 
-
-
 \subsection{Evolutionary heuristic: Simulated annealing and local search}
+The second heuristic implemented is a randomized local search with 
+simulated annealing. This heuristic is inspired by the physical 
+process of annealing in metallurgy.
+Simulated annealing interprets the physical slow cooling as a 
+slow decrease in the probability of accepting worse solutions as it 
+explores the solution space. 
+More precisely, the neighbours are here the permutations which can be obtained
+by applying exactly one transposition to the current permutation.
+The selection phase depends on the current temperature:
+after applying a random transposition to the current permutation, either
+the discrepancy of the corresponding Halton set decreases and the 
+evolution is kept, or it does not but is still kept with 
+probability $e^{\frac{\delta}{T}}$, where $\delta$ is the difference
+between the old and new discrepancy, and $T$ the current temperature.
+The whole algorithm is described in the flowchart~\ref{flow_rec}.
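The acceptance rule just described can be sketched as a single Metropolis step; `discrepancy` is a stand-in for the DEM cost function, and with $\delta$ taken as old minus new, a worse move ($\delta < 0$) is kept with probability $e^{\delta/T} < 1$:

```python
import math, random

def anneal_step(perm, discrepancy, T):
    """One annealing move: apply a random transposition; keep it if the
    discrepancy decreases, otherwise keep it with probability e^(delta/T),
    where delta = old - new discrepancy (negative for a worse move)."""
    old = discrepancy(perm)
    i, j = random.sample(range(len(perm)), 2)
    perm[i], perm[j] = perm[j], perm[i]      # random transposition
    delta = old - discrepancy(perm)
    if delta < 0 and random.random() >= math.exp(delta / T):
        perm[i], perm[j] = perm[j], perm[i]  # reject: undo the transposition
    return perm
```

As T decreases toward zero, the rule degenerates into a plain local search that only accepts improvements.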
+
 \begin{figure}
  \begin{mdframed}
-  \label{rand_flow}
 \includegraphics[scale=0.4]{flow_recuit.pdf}
 \caption{Flowchart of the simulated annealing local search heuristic}
+\label{flow_rec}
 \end{mdframed}
 \end{figure}
 
 \subsubsection{Dependence on the temperature}
-
+First experiments were made to select the best initial temperature.
 \begin{figure}
   \label{rand_flow}
 \includegraphics[scale=0.3]{Results/resu_temp3.png}

+ 58 - 26
main.tex.bak

@@ -33,18 +33,18 @@
 
 \section{General architecture of the tool}
 
-The testing tool is aimed to be modular: it is made of independant blocks that
+The testing tool is aimed to be modular: it is made of independent blocks that
 are interfaced through a scheduler. More precisely a master wrapper is written
 in Python that calls a first layer which performs the chosen heuristic. This
-layer is written in C++ for performences. The given discrepancy algorithm 
+layer is written in C++ for performance. The given discrepancy algorithm 
 --- written in C --- is called when the evaluation of a state is needed.
 The wrapper dispatches the computations on the multi-core architecture of 
 modern computers\footnote{for us, between 2 and 4 physical cores and 4 or 8 
 virtual cores}. This basic architecture is described in figure~\ref{main_flow}.
 Experiments were conducted on two machines: 
 \begin{itemize}
-  \item 2.4 GHz Intel Dual Core i5 hyperthreaded to 2.8GHz, 8 Go 1600 MHz DDR3.
-  \item 2.8 GHz Intel Quad Core i7 hyperthreaded to 3.1GHz, 8 Go 1600 MHz DDR3.
+  \item 2.4 GHz Intel Dual Core i5 hyper-threaded to 2.8GHz, 8 Go 1600 MHz DDR3.
+  \item 2.8 GHz Intel Quad Core i7 hyper-threaded to 3.1GHz, 8 Go 1600 MHz DDR3.
 \end{itemize}
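The dispatch described above might be organized as in this sketch, using Python's standard `multiprocessing` pool; `run_heuristic` is a hypothetical stand-in for the call into the C++ layer, not the wrapper's actual code:

```python
from multiprocessing import Pool

def run_heuristic(seed: int) -> float:
    """Hypothetical stand-in for one full heuristic run in the C++ layer;
    returns the best discrepancy found (placeholder value here)."""
    return float(seed % 7)

def dispatch(n_runs: int, n_cores: int = 4) -> list:
    """Spread independent restarts over the available cores, as the Python
    master wrapper does over the 2-4 physical cores above."""
    with Pool(processes=n_cores) as pool:
        return pool.map(run_heuristic, range(n_runs))
```

Since the restarts are independent, the speed-up is essentially linear in the number of physical cores.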
 
 \begin{figure}
@@ -53,24 +53,50 @@ Experiments were conducted on two machines:
 \caption{Tool overview}
 \end{figure}
 
+On these machines, some basic profiling has made clear that 
+the main bottleneck of the computations lies in the \emph{computation
+of the discrepancy}. The chosen algorithm and implementation of this 
+cost function is the DEM-algorithm of \emph{Magnus Wahlstr\"om}.\medskip
 
-\subsection{Algorithmics insights}
+All the experiments have been conducted in dimensions 2, 3 and 4 
+--- with a fixed Halton basis 7, 13, 29, 3 ---. Some minor tests have
+been made in order to discuss the dependency of the discrepancy and 
+efficiency of the heuristics with regard to the values chosen for the
+prime base. The average results remain roughly identical when 
+changing these primes and taking them in the range [2, 100]. For such
+a reason we decided to pursue the full computations with a fixed 
+basis.
+
+
+\subsection{Algorithmic insights}
+
+To perform an experiment we wrap a 
+loop around the main algorithm which calls the chosen heuristic multiple 
+times in order to smooth the results and obtain more exploitable data.
+An arithmetic mean is then performed on the resulting values. In addition
+extremal values are also given in order to construct error bands graphs.
+
+The conduct of one experiment is described in the 
+flowchart~\ref{insight_flow}. The number of iterations of the heuristic is 
+I and the number of full restarts is N. The function Heuristic() corresponds to
+a single step of the chosen heuristic. We now present an in-depth view of
+the implemented heuristics.
 
 \begin{figure}
  \begin{mdframed}
-  \label{rand_flow}
-\includegraphics[scale=0.4]{flow_rand_2.pdf}
-\caption{Flowchart of the experiement for dependence on iterations}
+  \label{insight_flow}
+\includegraphics[scale=0.4]{insight.pdf}
+\caption{Flowchart of a single experiment}
 \end{mdframed}
 \end{figure}
 
 
-\section{Heuristics devlopped}
+\section{Heuristics developed}
 
 \subsection{Fully random search (Test case)}
  The first heuristic implemented is the random search. We generate
  random sets of Halton points and select the best set with regard to its
- discrepancy iteratively. The process is wrapped up in the following 
+ discrepancy iteratively. The process is wrapped up in the 
  flowchart~\ref{rand_flow}. In order to generate at each step a random 
  permutation, we transform it directly from the previous one.
   More precisely, the permutation is a singleton object which has a method 
@@ -130,32 +156,32 @@ the naive implementation.
 
 \subsubsection{Results and stability}
 
-We first want to analyse the dependence of the results on the number of 
-iterations of the algorithm. To perform such an experiment we made up a 
-wrapper above the main algorithm which calls the random search on 
-different number of iterations. To smooth up the results and obtain
-more exploitable datas, we also perform an 
-arithmetic mean of fifteen searches for each experiments. The flowchart of
-the conducts of this experiment is desibed in the figure:
+We first want to analyze the dependence of the results on the number of 
+iterations of the heuristic, in order to discuss its stability. 
+The results are compiled in figures~\ref{rand_iter2} and~\ref{rand_iter3}.
+As expected from a fully random search, the error bands are very large for 
+a low number of iterations ($15\%$ of the value for 200 iterations) and tend
+to shrink as the number of iterations grows (around $5\%$ for 1600 iterations).
+The average results are quite stable: they decrease progressively with 
+the growing number of iterations, but seem to reach a limit after 1000 
+iterations. This value acts as a threshold for the useful number of iterations.
+As such, experiments can be conducted with \emph{only} 1000 iterations, 
+without altering too much the quality of the set this heuristic produces 
+with regard to its discrepancy.
 
-The results are compiled in the figure
 \begin{figure}
-  \label{rand_flow}
+  \label{rand_iter2}
 \includegraphics[scale=0.3]{Results/random_iter.png}
 \caption{Dependence on iterations number: D=2}
 \end{figure}
-
-The sae experiment has been conducted on dimension 4 -- with Halton basis
-7, 13, 29, 3 ---:
 \begin{figure}
-  \label{rand_flow}
+  \label{rand_iter3}
 \includegraphics[scale=0.3]{Results/random_iter_3.png}
 \caption{Dependence on iterations number: D=3}
 \end{figure}
 
 
 
-\subsection{Evolutionary heuristic: Simmulated annealing and local search}
+\subsection{Evolutionary heuristic: Simulated annealing and local search}
 \begin{figure}
  \begin{mdframed}
   \label{rand_flow}
@@ -164,6 +190,8 @@ The sae experiment has been conducted on dimension 4 -- with Halton basis
 \end{mdframed}
 \end{figure}
 
+\subsubsection{Dependence on the temperature}
+
 \begin{figure}
   \label{rand_flow}
 \includegraphics[scale=0.3]{Results/resu_temp3.png}
@@ -181,6 +209,8 @@ The sae experiment has been conducted on dimension 4 -- with Halton basis
 \caption{Dependence on iterations number: D=3}
 \end{figure}
 
+\subsubsection{Stability with regards to the number of iterations}
+
 \begin{figure}
   \label{rand_flow}
 \includegraphics[scale=0.3]{Results/sa_iter.png}
@@ -188,11 +218,13 @@ The sae experiment has been conducted on dimension 4 -- with Halton basis
 \end{figure}
 
 \subsection{Genetic (5+5) search}
+
+
 \begin{figure}
  \begin{mdframed}
   \label{rand_flow}
-\includegraphics[scale=0.4]{flow_recuit.pdf}
-\caption{Flowchart of the simulated annealing local search heuristic}
+\includegraphics[scale=0.4]{crossover_flow.pdf}
+\caption{Flowchart of the crossover algorithm.}
 \end{mdframed}
 \end{figure}