% ------------------------------------------------------------------------
\documentclass{llncs}
\input{prelude}
\begin{document}
\title{Star Discrepancies for Generalized Halton Points}
% \titlerunning{} % abbreviated title (for running head)
% also used for the TOC unless
% \toctitle is used
\author{Thomas Espitau\inst{1} \and Olivier Marty\inst{1}}
%
% \authorrunning{} % abbreviated author list (for running head)
%
%%%% list of authors for the TOC (use if author list has to be modified)
% \tocauthor{}
%
\institute{
$\mbox{}^1$ ENS Cachan \qquad
}
\maketitle
\makeatletter
\makeatother
\begin{abstract}
\end{abstract}
\section{Introduction}
\section{General architecture of the toolset}
Big pipeline + pool + blabla\medskip
A good design would be modular: the heuristics are ``blocks'', and the
tool invokes a block (possibly on several cores) whenever it wants to
run a test.
(It is simple, but it will make a nice diagram!)
\section{Heuristics developed}
\subsection{Fully random search (test case)}
The first heuristic implemented is the fully random search. We iteratively
generate random sets of Halton points and keep the set with the lowest
discrepancy seen so far. The process is summarized in the flowchart of
Figure~\ref{rand_flow}. In order to generate a random permutation at each
step, we derive it directly from the previous one.
More precisely, the permutation is a singleton object with a method
\texttt{random}, built on the Knuth--Fisher--Yates shuffle. This algorithm
generates a uniformly chosen permutation at each step. We recall
this fact and detail the algorithm in the following section.
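As a sketch of this loop (with hypothetical names, and a toy objective
standing in for the actual discrepancy computation), the search can be
written as:

```python
import random

def random_search(score, n, steps, seed=0):
    """Keep the best permutation found over `steps` uniform reshuffles.
    `score` stands in for the star discrepancy of the generalized Halton
    set induced by the permutation (lower is better)."""
    rng = random.Random(seed)
    perm = list(range(n))
    best, best_score = perm[:], score(perm)
    for _ in range(steps):
        rng.shuffle(perm)           # fresh uniform permutation (Fisher-Yates)
        s = score(perm)
        if s < best_score:          # remember the best set seen so far
            best, best_score = perm[:], s
    return best, best_score

# Toy objective: number of fixed points (NOT a discrepancy; illustration only).
best, s = random_search(lambda p: sum(i == v for i, v in enumerate(p)), 8, 100)
```

Reshuffling the previous permutation in place, rather than rebuilding one
from scratch, matches the singleton-object design described above.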
\begin{figure}
\begin{mdframed}
\includegraphics[scale=0.4]{flow_rand.pdf}
\caption{Flowchart of the random search}
\label{rand_flow}
\end{mdframed}
\end{figure}
\subsubsection{The Knuth--Fisher--Yates shuffle}
The Fisher–Yates shuffle is an algorithm for generating a random permutation
of a finite set. It is unbiased: every
permutation is equally likely. We present here the Durstenfeld variant of
the algorithm, presented by Knuth in \emph{The Art of Computer Programming}.
Its time complexity is $O(n)$, compared to $O(n^2)$ for
the naive implementation.
\begin{algorithm}[H]
\SetAlgoLined
\SetKwFunction{Rand}{Rand}
\SetKwFunction{Swap}{Swap}
\KwData{A table T[1..n]}
\KwResult{Same table T, shuffled}
\For{$i\leftarrow 1$ \KwTo $n-1$}{
$j \leftarrow$ \Rand{$[0,n-i]$}\;
\Swap{$T[i], T[i+j]$}\;
}
\caption{KFY algorithm}
\end{algorithm}
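The pseudocode corresponds to the following in-place implementation (a
sketch; the \texttt{rng} parameter is our addition, for reproducibility):

```python
import random

def kfy_shuffle(t, rng=random):
    """Durstenfeld variant of the Fisher-Yates shuffle: in-place, O(n)."""
    n = len(t)
    for i in range(n - 1):
        j = rng.randint(i, n - 1)   # uniform over positions i..n-1, inclusive
        t[i], t[j] = t[j], t[i]     # j == i (no move) must be allowed
    return t
```

Allowing the swap target to equal $i$ at each step is what makes every
permutation equally likely; drawing it strictly after $i$ would produce
only cyclic permutations (Sattolo's variant).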
\begin{lemma}
The permutation produced by KFY is unbiased.
\end{lemma}
\begin{proof}
Consider the set $\{1,\ldots, n\}$ as the vertices of a random graph
constructed as the trace of the execution of the algorithm:
an edge $(i,j)$ exists in the graph if and only if the swap of $T[i]$ and
$T[j]$ has been executed. This graph encodes the permutation represented by
$T$. To be able to encode any permutation, the graph must be connected
--- so that any pair of positions can be exchanged.
Since by construction every vertex is incident to an edge, and since there
are exactly $n-1$ edges, we can conclude directly that any permutation can
be reached by the algorithm. Since the probability of producing a fixed graph
of $n-1$ edges in which every vertex has degree at least one is $n!^{-1}$, the
algorithm is thus unbiased.
\end{proof}
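The probability count can be made explicit: for the shuffle to be unbiased,
the swap target at step $i$ must be drawn uniformly among the $n-i+1$
positions $i, i+1, \ldots, n$ (position $i$ itself included), so that a fixed
execution trace --- and hence a fixed output permutation --- occurs with
probability
\[
\prod_{i=1}^{n-1} \frac{1}{n-i+1} \;=\; \frac{1}{n\,(n-1)\cdots 2}
\;=\; \frac{1}{n!}.
\]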
\subsubsection{Results and stability}
\subsection{Local Search}
Blabla+Pipeline
\subsection{Evolutionary heuristics}
Blabla+Pipeline
\subsection{Genetic heuristics}
Blabla+Pipeline
\section{Results}
\section{Conclusion}
\end{document}