As an engineer I can see that evolution is a fact... it is a tool. It can easily be demonstrated on a computer, and it can be demonstrated with bacteria and viruses, which can build up immunity to drugs thanks to their rapid growth rate. The math behind it is simple... you must roll the dice.
For example, those that roll the right number reproduce; those that roll the wrong number don't. If there is a mechanism that influences the dice roll, then eventually that trait will be selected. The trait is selected by the environment, which kills off those who roll the wrong number and spares those who roll the right one. Inside a computer there are a number of algorithms for generating pseudo-random numbers... essentially algorithms that produce entropy uncorrelated with anything else. Even better are electronic circuits that produce white noise by passing a current through a resistor, rather like dropping balls down a peg board to produce a random path; that noise can be digitally sampled and fed into an algorithm, and the resulting numbers are entirely uncorrelated. There is no evolution without a source of uncorrelated entropy.
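To make that concrete, here is a minimal sketch in Python of selection driven by a dice roll. Everything in it is made up for illustration: each individual carries a single number, its probability of rolling the "right number", the environment kills off anyone whose roll fails, and survivors reproduce with a small mutation. The two ingredients from the text are both present: an uncorrelated entropy source (the random module) and an environment that filters the outcomes.

```python
import random

POP_SIZE = 100
GENERATIONS = 20

# Each individual is just a bias: its probability of rolling the "right number".
population = [random.random() for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    # Roll the dice: an individual survives only if its biased roll succeeds.
    survivors = [bias for bias in population if random.random() < bias]
    if not survivors:
        survivors = [random.random()]  # restock if everyone rolled wrong
    # Survivors reproduce; each child inherits a survivor's bias plus a mutation.
    population = [
        min(1.0, max(0.0, random.choice(survivors) + random.gauss(0, 0.05)))
        for _ in range(POP_SIZE)
    ]
    print(f"gen {gen:2d}: mean bias toward the right roll = "
          f"{sum(population) / POP_SIZE:.2f}")
```

Run it and the mean bias climbs generation after generation, not because any individual learns anything, but because the environment keeps deleting the wrong rolls.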
For example: our children. How many consecutive kids come out looking exactly the same? Why is it that each kid from the same two parents has a different genetic code? The machinery of the cell is remarkably good at producing copy after copy of exactly the same code... identical twins prove that. But somehow each sperm, each egg, each combination thereof is different every time. Imagine a person in whom every cell had an entirely different code... that is essentially the case with the sperm and the egg. Somewhere there is a source of entropy causing different outcomes, so the selection always has something... different to work with. Evolution cannot occur without that entropy.
What is entropy? In science and engineering, the word took a twist when Claude E. Shannon wrote a paper called "A Mathematical Theory of Communication". Shannon's entropy is a little different from the entropy taught in chemistry and physics, yet also very similar. Basically, Shannon entropy is a measure of information, and Shannon showed that to get the most out of a communications channel you have to encode the data to maximize its entropy... make the information look like pure noise. Entropy on our side of the fence involves probability. An unknown. To maximize entropy, the events or outcomes need to be equiprobable and uncertain; if the dice are loaded or biased, then each roll conveys less information. This understanding has been built into communication systems and computer algorithms over the last century to maximize the transfer of information: take the information and compact it down into unrecognizable trash, then, on the other side of the channel, unfold the unrecognizable trash back into its original state with an algorithm that is perfectly symmetric to the one that made it. While there is hopefully little actual rolling of the dice in the process, since channel noise can corrupt the communication, the data is made to look as if someone had rolled the dice. The algorithms are deterministic, but the encoder is designed to produce data that appears stochastic, and the decoder takes the data that appeared stochastic and puts it back into its original form.
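For the curious, Shannon's measure is H = -Σ p·log2(p), summed over the probabilities of the symbols. Here is a small Python sketch of my own, using the standard-library zlib compressor, showing exactly the effect described above: a redundant, low-entropy string goes in, and what comes out sits near the 8-bits-per-byte ceiling... statistically, it looks like dice rolls.

```python
import math
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A long string of digits: highly redundant, far from equiprobable bytes.
data = "".join(str(i) for i in range(100_000)).encode()

print(f"raw:        {shannon_entropy(data):.2f} bits/byte")                  # ~3.3
print(f"compressed: {shannon_entropy(zlib.compress(data)):.2f} bits/byte")   # ~8.0

# And the decoder is perfectly symmetric: the "trash" unfolds back exactly.
assert zlib.decompress(zlib.compress(data)) == data
```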
"The data was made to look as if someone had rolled the dice." Read that again. I was not talking about evolution, I was talking about a communications channel... but my oh my. A communications channel from who? Who hides their communication so that it looks as if it could have been due to a roll of the dice? Well, we do... inside cell phones, computer communications, satellite communications... but a person doesn't see it because hopefully the receiver has turned the entropy back into a legible <cough> language. The brain has further learned to encode and decode the language into legible <cough> concepts.
Back to evolution... the seed of evolution is a roll of the dice. Uncorrelated entropy can be fed back into a system to seed changes, and each change is then tested for strength against the environment. It works, but it is slow and painful. It is one way of cracking a code, so to speak. It is one way for a mouse to find its way through the maze to the cheese: just kill off every mouse that doesn't succeed, and eventually a mouse will allegedly turn up that finds it. It is like trying to crack a password by trying random permutations until one works.
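Taken literally, that password analogy is only a few lines of Python. The secret here is hypothetical and the environment answers only yes or no, live or die; on average it takes 26^4, roughly 457,000 rolls of the dice to hit a four-letter secret, and the cost grows exponentially with every added letter. Slow and painful indeed.

```python
import random
import string

SECRET = "gene"  # a hypothetical 4-letter password the environment accepts
attempts = 0

while True:
    attempts += 1
    guess = "".join(random.choices(string.ascii_lowercase, k=len(SECRET)))
    if guess == SECRET:  # the environment answers only yes or no: live or die
        break

print(f"cracked {SECRET!r} after {attempts:,} blind attempts")
```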
There is a limit to what evolution can do... a limit to what can be accomplished by feeding uncorrelated stochastic information back into trial and error. The limit is this: evolution requires that the information about what can succeed, about what will live, already exists in the environment and is available to test against. If a genetic code cannot be put to the test, then evolution is dead. For example, when growing bacteria or viruses, if they cannot be put to the test against a drug in a host, then the little buggers can NOT evolve to overcome the drug. Denied the ability to try millions of attempts at the secret code for coexistence in the face of a toxic drug, the little buggers can't learn anything about that code through trial and error. Evolution is like a hacker, bent on breaking a code that will give it power. The doctor wants no survivors: either take the full dose of a drug, killing off the whole lot of hackers so that no bacteria survive to evolve, or don't take the drug at all and deny them the chance to learn. Similarly, if a computer hacker enters the wrong code several times, the gates can be closed, denying the hacker the ability to evolve an answer.
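Here is the same sketch as before, with the doctor's (or the sysadmin's) countermeasure bolted on: close the gate after three wrong tries. The numbers are again made up, but the arithmetic is the point... with the feedback loop capped, the probability of blind success collapses to 1 - (1 - 26^-4)^3, about seven in a million, and the "evolution" of an answer is effectively dead.

```python
import random
import string

SECRET = "gene"      # same hypothetical secret as before
MAX_ATTEMPTS = 3     # the gate closes after three wrong entries

def attacker_gets_in() -> bool:
    for _ in range(MAX_ATTEMPTS):
        guess = "".join(random.choices(string.ascii_lowercase, k=len(SECRET)))
        if guess == SECRET:
            return True
    return False  # locked out: no more trials, no more trial and error

trials = 1_000_000
wins = sum(attacker_gets_in() for _ in range(trials))
print(f"{wins} break-ins out of {trials:,} lockout-limited attackers")
```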
Anyone disagree?