As an implementation of a genetic algorithm, I use simple-genetic-algorithm, which is, ahem, simple to use compared to some other packages. I've modified it, though, to use MonadRandom instead of threading the RandomGen through explicitly, so if you want to run the code, make sure to take the version from the MonadRandom branch of my fork until these changes are released by the maintainer on Hackage.
To use a genetic algorithm, we need two things: a representation of the structure we want to evolve that is suitable for the genetic transformations, and the implementation of these transformations (crossover and mutation) themselves.
For the first task, we convert the network to simple vectors:
-- | Convert a network to plain vector structures
toVectors :: (Floating (Vector a), Container Vector a, Floating a) =>
    Network a -> [([Vector a], Vector a)]
toVectors = map (tovs . layerToShowable) . layers
  where
    tovs (ShowableLayer ws bs) = (toRows ws, bs)
-- | Convert back to a network, given a layer definition and vector structures
fromVectors :: (Floating (Vector a), Container Vector a, Floating a) =>
    LayerDefinition a -> [([Vector a], Vector a)] -> Network a
fromVectors ld = Network . map ((\sl -> showableToLayer (sl, ld)) . frvs)
  where
    frvs (vs, v) = ShowableLayer (fromRows vs) v
Note that we need a LayerDefinition to rebuild the network from the vectors. Currently every layer of the network uses the same definition, and the algorithm does NOT evolve this structure, only the weights and biases.
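A minimal sketch of what such a single definition could look like, assuming the network library's LayerDefinition takes a neuron type, a neuron count and a connectivity function; the values below are placeholders, not the ones actually used in this series:
-- Placeholder layer definition: sigmoid neurons, 4 per layer, fully connected.
-- The constructor shape is an assumption; adapt it to whatever buildNetwork expects.
layerDef :: LayerDefinition Float
layerDef = LayerDefinition sigmoidNeuron 4 connectFully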
We're going to keep that information along with some Worlds that we use for fitness testing:
-- | Store the network information as vectors
data NetworkData = NetworkData [([Vector Float], Vector Float)] [World]
    deriving (Show, Read, Eq, Ord)
Then we can implement the Chromosome class. Crossover takes the average of all weights and biases of the two parents; mutation changes one value randomly. Other implementations could of course be devised.
-- | Chromosome instance
instance Chromosome NetworkData where
    -- Take average
    crossover (NetworkData xs ws) (NetworkData ys _) =
        return [NetworkData (Prelude.zipWith zipW xs ys) ws]
      where
        zipW (vs1, v1) (vs2, v2) = (Prelude.zipWith zipW1 vs1 vs2, zipW1 v1 v2)
        zipW1 = V.zipWith (\x y -> (x + y) / 2)
    -- Mutate one weight randomly
    mutation (NetworkData xs ws) = do
        xs' <- randomChange r1 xs
        return $ NetworkData xs' ws
      where
        randomChange _ [] = return []
        randomChange f xs2 = do
            idx <- getRandomR (0, length xs2 - 1)
            mapM (\(i, x) -> if i == idx then f x else return x) $ zip [0..] xs2
        r1 (vs, v) = do
            (v3:vs2) <- randomChange r2 (v:vs)
            return (vs2, v3)
        r2 v2 = do
            idx2 <- getRandomR (0, V.length v2 - 1)
            dx <- getRandomR (-20, 20)
            return $ v2 V.// [(idx2, dx)]
    -- Calculate fitness
    fitness (NetworkData xs ws) = sum (map (fitWorld xs) ws) / fromIntegral (length ws)
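To make the crossover concrete, here is what the averaging does on a single made-up weight row (hypothetical numbers, not taken from an actual run):
-- Hypothetical illustration of the crossover averaging:
--   parent 1 row: [1.0, 2.0, -4.0]
--   parent 2 row: [3.0, 6.0,  0.0]
--   child row:    [2.0, 4.0, -2.0]   -- element-wise (x + y) / 2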
For the fitness function we calculate the fitness for each given world and average it. I'm not trying to be too clever with that code.
For each world we run the food-searching algorithm from each corner and evaluate how far we end up from the target and, if we reached it, how long it took. So networks that find the food always rank higher than those that don't, and among the finders the quicker ones rank higher still.
-- | Calculate fitness on a given world
fitWorld :: [([Vector Float], Vector Float)] -> World -> Double
fitWorld dat w = sum (map fitCorner $ corners $ wSize w) / 4
  where
    fitCorner pos =
        let network    = fromVectors layerDef dat
            poss       = algSteps w (neuralAlg w network) 50 pos
            endSmell   = currentSmell w $ last poss
            possLength = length poss
        in fitFormula w endSmell possLength (distance pos $ wFood w)
-- | Fitness formula
fitFormula :: World -> Int -> Int -> Int -> Double
fitFormula w endSmell possLength dist = case fromIntegral endSmell / fromIntegral (wSmell w) of
    1 -> 2 + (fromIntegral dist / fromIntegral possLength)
    n -> n
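A quick sanity check of the formula, with hypothetical numbers and a world whose wSmell is 10:
-- Worked examples of fitFormula (hypothetical numbers, wSmell w == 10):
--   food reached (endSmell == 10) in 20 steps, starting distance 8:
--     fitFormula w 10 20 8  ==  2 + 8/20  ==  2.4
--   food reached with path length equal to the starting distance:
--     fitFormula w 10 8 8   ==  2 + 8/8   ==  3.0   (the maxFit defined below)
--   food not reached, stalled at smell level 6:
--     fitFormula w 6 20 8   ==  6/10      ==  0.6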
Then we just need a stop condition: either a maximum number of generations is reached, or we hit the maximum possible fitness (shortest path found from all corners).
-- | Maximum possible fitness
maxFit :: Double
maxFit = fitFormula (World undefined undefined 10 undefined) 10 10 10
-- | Stop function
stopf :: NetworkData -> Int -> IO Bool
stopf nd gen = return $ gen > 300 || fitness nd == maxFit
And some code to generate random network data:
-- | Build random network data
buildNetworkData :: (Monad m, RandomGen g) => [World] -> RandT g m NetworkData
buildNetworkData ws = do
    g <- getSplit
    let n = buildNetwork g
    return $ NetworkData (toVectors n) ws
We can then evolve our population of networks, say on two worlds, w and w2:
runGAIO 64 0.1 (buildNetworkData [w,w2]) stopf
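A minimal sketch of how the evolved result could then be used; this is not from the post's code, and it assumes that runGAIO returns the fittest chromosome in IO and that Position is whatever position type the earlier posts in this series use:
-- Rough usage sketch (assumptions: runGAIO returns the fittest chromosome;
-- startPos is any cell of world w other than the trained corners).
evolveAndReplay :: World -> World -> Position -> IO ()
evolveAndReplay w w2 startPos = do
    NetworkData dat _ <- runGAIO 64 0.1 (buildNetworkData [w, w2]) stopf
    let network = fromVectors layerDef dat
    -- replay the food search with the evolved network, printing each position
    mapM_ print $ algSteps w (neuralAlg w network) 50 startPos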
And after a couple of minutes we get a network that can find the food from a starting point other than the tested corners:
Iteration:1
##########
#........#
#........#
#........#
#...X....#
#........#
#........#
#........#
#.......@#
#........#
#........#
##########
Iteration:2
##########
#........#
#........#
#........#
#...X....#
#........#
#........#
#......@.#
#........#
#........#
#........#
##########
Iteration:3
##########
#........#
#........#
#........#
#...X....#
#........#
#.....@..#
#........#
#........#
#........#
#........#
##########
Iteration:4
##########
#........#
#........#
#........#
#...X....#
#....@...#
#........#
#........#
#........#
#........#
#........#
##########
Iteration:5
##########
#........#
#........#
#........#
#...@....#
#........#
#........#
#........#
#........#
#........#
#........#
##########
Yummy!
Interested in all this? I recommend the ai-junkie tutorial, then! My code is of course on github. Feedback and suggestions welcome!
4 comments:
You might find my creatur library of interest. Among other things, it provides a default genome for Haskell data types, and an embedded domain-specific language for mating two Haskell values.
Package: http://hackage.haskell.org/package/creatur
Tutorial: https://github.com/mhwombat/creatur-examples/raw/master/Tutorial.pdf
Paper: http://amydebuitleir.eu/wp/wp-content/uploads/2015/01/debuitleir-functional-approach-sex-2014.pdf
Amy de Buitléir, Michael Russell, and Mark Daly. Wains: A pattern-seeking artificial life species. Artificial Life, (18)4:399–423, 2012.
Evolution strategies could be more suitable for certain types of neural network:
https://forum.processing.org/two/discussion/23422/a-simple-way-to-optimize
I also had this to say recently:
https://groups.google.com/forum/#!topic/artificial-general-intelligence/4aKEE0gGGoA