Sciencemadness Discussion Board

Very good explanation of evolution with simulations

woelen - 19-11-2021 at 14:51

I accidentally came across a very nice video about evolution and how it works. Someone made a simulation and shows the actual workings of some basic driving forces behind evolution. This really provides insight into how evolution works, from a scientific point of view.

It takes almost 1 hour, but in my opinion it really is worth the hour of watching!

Oxy - 20-11-2021 at 13:36

Very nice simulation and video.
A few years ago I made an ant colony simulation for fun, just to see if I could easily simulate the way they look for food and how they use the pheromone traces left by others to follow or avoid a given track. But there were no neural networks and no genetic algorithms.
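The core bookkeeping of such a pheromone simulation (deposit on visited cells, evaporate a little each tick, bias movement toward stronger trails) can be sketched in a few lines. The grid size, deposit amount, and evaporation rate below are arbitrary assumptions for illustration, not Oxy's actual code:

```python
import random

W, H = 20, 20
EVAPORATION = 0.95   # fraction of pheromone surviving each tick
DEPOSIT = 1.0        # pheromone left on each visited cell

pheromone = [[0.0] * W for _ in range(H)]

def step(x, y):
    """Move to a neighbouring cell, weighted by pheromone strength."""
    neighbours = [(x + dx, y + dy)
                  for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                  if 0 <= x + dx < W and 0 <= y + dy < H]
    # small constant added so ants still explore unmarked cells
    weights = [pheromone[ny][nx] + 0.1 for nx, ny in neighbours]
    return random.choices(neighbours, weights=weights)[0]

def tick(ants):
    """Advance every ant one step and evaporate the trails."""
    global pheromone
    for i, (x, y) in enumerate(ants):
        pheromone[y][x] += DEPOSIT    # leave a trace on the current cell
        ants[i] = step(x, y)
    # evaporation keeps old trails from dominating forever
    pheromone = [[p * EVAPORATION for p in row] for row in pheromone]
```

Following-vs-avoiding behaviour then comes down to the sign of the weighting; inverting the weights makes ants avoid marked tracks instead.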

This reminds me of a funny situation from my university days. I had a colleague who left the chemistry department and switched to biology because there was too much mathematics for him. Unfortunately he quickly found that among the subjects and exams to pass were evolutionary biology and genetics, both with rather complex mathematical foundations. He didn't study biology for much longer :D

[Edited on 20-11-2021 by Oxy]

Bedlasky - 20-11-2021 at 19:05

This is an awesome video! I was a little bit afraid at the start because this video includes some genetics, and I never understood genetics and biochemical processes. But I gave it a shot and I am glad I watched it.

Speaking of evolution, a month ago I found this channel. He has nice videos about the evolution of certain groups of animals.

mayko - 21-11-2021 at 11:38

Cool video, thanks for sharing! I’ve always been interested in alife, evolutionary programming, and the like. Many years ago I did a project using (much simpler!) simulations to explore game theory: what happens when players face one another repeatedly in the prisoner’s dilemma? Can they learn to cooperate? I observed semi-stable populations, some of which consisted of individuals always cooperating on the surface, though when closely examined they seemed to retain some capacity to defect that wasn’t used except perhaps against defecting mutants. Mostly I’m proud of my genome encoding scheme: I had a k-bit memory register, followed by a 2**k-bit rule table. The memory string comes preprogrammed as part of the genome, but the opponent’s moves displace it one game at a time, until it just remembers the last k plays. On its turn, the player converts the memory to a binary string, and then looks up the rule in the corresponding location to learn how to respond.

Attachment: GeneticAlgorithmsAndGameTheory.pdf (61kB)
This file has been downloaded 130 times
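The memory-register encoding described above is compact enough to sketch directly. This is a guess at the general shape, not mayko's actual code; the memory length k, payoff matrix, and round count are assumptions:

```python
import random

K = 3  # memory length (hypothetical choice; the post leaves k unspecified)

def random_genome(k=K):
    # k bits of preloaded memory followed by a 2**k-bit rule table
    return [random.randint(0, 1) for _ in range(k + 2**k)]

class Player:
    def __init__(self, genome, k=K):
        self.k = k
        self.memory = list(genome[:k])   # preprogrammed initial memory
        self.rules = genome[k:]          # lookup table indexed by memory

    def move(self):
        # interpret the memory as a binary number, use it as a rule-table index
        index = int("".join(map(str, self.memory)), 2)
        return self.rules[index]         # 1 = cooperate, 0 = defect

    def observe(self, opponent_move):
        # the opponent's move displaces the oldest memory bit
        self.memory = self.memory[1:] + [opponent_move]

def play(p1, p2, rounds=10):
    """Iterated prisoner's dilemma; returns both players' total scores."""
    # standard (cooperate, defect) payoff matrix, an assumption here
    payoff = {(1, 1): (3, 3), (1, 0): (0, 5), (0, 1): (5, 0), (0, 0): (1, 1)}
    s1 = s2 = 0
    for _ in range(rounds):
        m1, m2 = p1.move(), p2.move()
        a, b = payoff[(m1, m2)]
        s1, s2 = s1 + a, s2 + b
        p1.observe(m2)
        p2.observe(m1)
    return s1, s2
```

With k = 3 the rule table has only 8 entries, so the whole strategy space is just 2**11 genomes, yet it already contains tit-for-tat-like strategies and the surface-cooperators described above.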

There’s a couple of details in the explanation that could use some more clarity or complication:

One minor thing is that in the ‘conditions for evolution’ section, recombination (the remixing of parental genomes) gets rolled into the definition of inheritance. Recombination might accelerate adaptation but it’s not required for it; asexual organisms can still evolve! (Why recombination/sex exists at all is actually controversial; all else being equal, asexuality would be about twice as fit as sexual reproduction, sometimes called the “2-fold cost of males”. In spite of this, asexuality is remarkably rare. Bdelloid rotifers used to be one of the few examples of purely asexual animals… but a couple were observed getting it on this one time back in 2015, and now they’ve disgraced the family name forever, smdh.)

Another subtlety is that what this person calls evolution is more precisely adaptive evolution. It makes sense to emphasize adaptation when explaining natural selection and genetic algorithms, but it’s not the whole story, especially when we return to meatspace biology. There is also neutral evolution, in which variation that isn’t selected drifts back and forth stochastically. This is important for a few reasons; one is that selection isn’t all-powerful and its efficiency depends upon population size. In small populations, advantageous genes aren’t guaranteed reproductive success, nor deleterious genes failure. Another counterintuitive reason is that the origins of complex structures are sometimes better explained by neutral processes than by adaptive ones. A broader problem is that it can be very easy for biologists to get so fixated on adaptation that they reflexively assume that any feature that exists must have been selected for, to the point that they begin spinning evidence-free “just-so stories” to explain why it’s adaptive. A classic counterexample is the sound that the heart makes as it pumps. It’s certainly advantageous as a diagnostic sign in a medical emergency, but it would be absurd to say that it is an adaptation that had been selected for. Stephen Jay Gould & Richard Lewontin famously compared this approach to a guy who concludes that the bridge of his nose exists in order to hold up his glasses.

Gould, S. J., & Lewontin, R. C. (1979). The spandrels of San Marco and the Panglossian paradigm: a critique of the adaptationist programme. Proceedings of the Royal Society of London. Series B. Biological Sciences, 205(1161), 581–598.

Attachment: Society - 2015 - The spandrels of San Marco and the Panglossian paradigm a critique of the adaptationist programme.pdf (3.3MB)
This file has been downloaded 146 times

The only real issue I had was the circular characterization of natural selection. Evolution is sometimes unfairly criticized as tautological and thus philosophically invalid: “Who survives? The fit. Who are the fit? The ones who survive.” There’s a few problems with this; critically, it ignores that fitness can be independently assessed by reference to the environment, and the expectation of survival within it. “Whatever reproduces, reproduces” is similarly problematic IMHO, since it disconnects reproduction from the fitness function that defines the expectation of reproduction.

A popular way to talk about fitness and natural selection is the “fitness landscape”, in which the space of genomes is flattened into a plane and the fitness is represented as the height of the landscape above, with deleterious genomes in valleys and adaptive ones on hills. In this conception, natural selection acts as a force to push populations uphill towards (local) fitness maxima. I know it’s already an hour long but I feel like expanding on this could have helped with another idea that gets shortened to “explicit” vs. “implicit” programming. Evolution simulations are sometimes criticized as not generating novelty, since the answer (the fitness maximum) is implicit in the question (the fitness function). An infamous example came from Richard Dawkins, in which sequences of random characters were evolved, and selected based on their similarity to a target string. After some generations, nonsense strings evolved to match the target exactly… but haven’t we just smuggled the answer to the artificial organisms, by hiding it in the fitness function?
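Dawkins’ string-evolution example (the “weasel program”) is short enough to reconstruct from the description above. The target phrase is Dawkins’ original; the mutation rate and offspring count are assumptions, since his exact parameters varied:

```python
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"      # Dawkins' original target phrase
ALPHABET = string.ascii_uppercase + " "

def fitness(candidate):
    # fitness is simply the number of characters matching the target —
    # the "answer smuggled into the fitness function"
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent, rate=0.05):
    # each character has a small chance of being replaced at random
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

def evolve(offspring=100):
    """Evolve a random string toward TARGET; returns generations needed."""
    parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    generation = 0
    while parent != TARGET:
        # each generation: many mutated copies, keep the fittest child
        children = [mutate(parent) for _ in range(offspring)]
        parent = max(children, key=fitness)
        generation += 1
    return generation
```

Because enough children survive mutation unchanged each generation, fitness almost never regresses, and the nonsense string converges on the target in a modest number of generations rather than the astronomically long wait pure random typing would need.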

The answer is, yes, that’s the point: natural selection moves information about the environment into the genome. This only seems like a criticism because Dawkins’ example was a toy model meant to illustrate natural selection with a trivial definition of fitness. With more complicated environments, it becomes a little like pointing at a camel and saying “that’s JUST the solution to a quantum wave equation.” Also against the idea of trivial-but-covert programming is the fact that evolution simulations sometimes find solutions which work, but it’s not obvious to the programmers why; the video points out that even simple genomes are difficult or impossible to understand when examined. They are also known to find unanticipated solutions, either by exploiting facts about their environment that the programmers didn’t/couldn’t know, or because the environment given didn’t quite specify the solution the programmers wanted. There was a recent publication collecting examples of “The Surprising Creativity of Digital Evolution”, except in many cases it’s more like Smartassery. Some of my favorites:

In a graduate-level AI class at UT Austin in 1997 taught by Risto Miikkulainen, the capstone project was a five-in-a-row Tic Tac Toe competition played on an infinitely large board. […] However, it had a clever mechanism for encoding its desired move that allowed for a broad range of coordinate values (by using units with an exponential activation function). A byproduct of this encoding was that it enabled the system to request non-existent moves very, very far away in the tic-tac-toe board. Evolution discovered that making such a move right away led to a lot of wins. The reason turned out to be that the other players dynamically expanded the board representation to include the location of the far-away move—and crashed because they ran out of memory, forfeiting the match!


Cully et al. (2015) [46] presented an algorithm that enables damaged robots to successfully adapt to damage in under two minutes. The chosen robot had six legs, and evolution’s task was to discover how to walk with broken legs or motors (Fig. 5). To do so, ahead of the test, the researchers coupled digital evolution with a robot simulator, to first learn a wide diversity of walking strategies. Once damaged, the robot would then use the intuitions gained from simulated evolution to quickly learn from test trials in the real world, zeroing in on a strategy that remained viable given the robot’s damage. [...] Naturally, the team thought it impossible for evolution to solve the case where all six feet touch the ground 0% of the time, but to their surprise, it did. Scratching their heads, they viewed the video: it showed a robot that flipped onto its back and happily walked on its elbows, with its feet in the air!


In Thompson’s experiment, an EA evolved the connectivity of a reconfigurable Field Programmable Gate Array (FPGA) chip, with the aim of producing circuits that could distinguish between a high-frequency and a lower-frequency square wave signal. After 5,000 generations of evolution, a perfect solution was found that could discriminate between the waveforms. This was a hoped-for result, and not truly surprising in itself. However, upon investigation, the evolved circuits turned out to be extremely unconventional. The circuit had evolved to work only in the specific temperature conditions in the lab, and exploited manufacturing peculiarities of the particular FPGA chip used for evolution. Furthermore, when attempting to analyze the solution, Thompson disabled all circuit elements that were not part of the main powered circuit, assuming that disconnected elements would have no effect on behavior. However, he discovered that performance degraded after such pruning! Evolution had learned to leverage some type of subtle electromagnetic coupling, something a human designer would not have considered (or perhaps even have known how to leverage).


Lehman, J., Clune, J., Misevic, D., Adami, C., Beaulieu, J., Bentley, P. J., Bernard, S., Belson, G., Bryson, D. M., Cheney, N., Cully, A., Doncieux, S., Dyer, F. C., Ellefsen, K. O., Feldt, R., Fischer, S., Forrest, S., Frénoy, A., Gagné, C., … Yosinski, J. (2018). The Surprising Creativity of Digital Evolution: A Collection of Anecdotes from the Evolutionary Computation and Artificial Life Research Communities.

Tatjam - 21-11-2021 at 12:14

I would like to share this channel which does very interesting life simulations too:

Not so academic but certainly fun to play with. I like the fact that the organisms can actually have a shape and are not simply pixels, but I kind of wish the developer had gone into more detail with the brains of the organisms, as the behaviors of the creatures are relatively simplistic!