This content originally appeared on Bits and Pieces - Medium and was authored by Fernando Doglio
A Comprehensive Guide to Implementing Genetic Algorithms in Your Code

Sometimes the solution to a problem can depend on so many different and independent factors that writing an algorithm that makes sense of them all is almost impossible.
Understanding when to change one parameter instead of another can take a lot of time and research effort.
Instead, genetic algorithms provide a way for you to find the best solution to a problem “naturally” without worrying about the parameters and how they interact with each other.
It sounds a little bit like magic, but it’s actually quite interesting how they work, so let’s look at what these types of algorithms are and how they can help you find solutions to your problems.
What exactly are genetic algorithms?
So genetic algorithms are a way to solve problems by imitating (to a degree) how cellular organisms reproduce.
You see, when a dude and a dudette like each other to the point that they’re ready to have a child, they will essentially “mix their DNA”: half of the dude’s DNA is combined with half of the dudette’s. Their genes mix, creating new, potentially never-before-seen combinations (at least new within their family tree), and every once in a while, some of those genes will mutate and create even newer combinations (that’s where mutant powers come from!).
How do we translate all of that into a problem-solving algorithm?
“Easy!” (notice the quotes)
Potentially, what we want to achieve is the following:
1. For a given problem, encode a solution in the form of a strand of DNA where every gene is one of the parameters of the solution.
2. Create a “population” of random solutions.
3. Evaluate these random solutions and assign points to them (we’ll call this value “fitness”). The closer they are to yielding the desired output, the better their fitness will be (your logic will determine whether that means a higher or a lower value).
4. Sort them by points and discard the lower half; we don’t want them, they’re “too wrong”.
5. Now grab the remaining half of the population and mix its members with each other. To mix two solutions, you take half the genes of one and combine them with half the genes of the other. This yields a brand new potential solution (we’ll call them “offspring”).
6. Go back to step 3, where you’ll have the new offspring to evaluate and sort.
Additionally, if during step 5 you add some randomness and “mutate” some of the values before mixing both solutions, you might find a new emerging pattern that wasn’t even in your random population to begin with.
This process will, ideally, evolve the population into a set of solutions that are very close to the desired output, hopefully with one or more of the offspring providing the right combination of parameters. So that’s the “theory” of how it’s done; let’s take a look at some practical scenarios.
Using genetic algorithms
There are many different scenarios where genetic algorithms might come in handy.
The key is to find a problem that would require a lot of iterations to solve correctly if you were to follow some other “traditional” method.
One classic example here is the so-called “Traveling salesman problem”. Imagine being a traveling salesman with several houses you have to visit every day. They’re all spread throughout the city, so you have to figure out the route that will let you visit them all in the shortest time.
You could take the backtracking approach and try every potential route, pruning routes that are already worse than the current best solution. But that would require a lot of logic and (potentially) deep recursion, which is something we all know JavaScript isn’t great at.
You could also generate every single route and then calculate the time it would take you to go through each one. This would be the so-called “brute force” approach. It’s effective, but it can potentially take A LONG time if the number of houses is big enough, since the number of possible routes grows factorially with the number of stops.
Instead, we can take the brute force approach and improve it with a genetic algorithm. Let’s generate a number of random solutions, and go through the process described above. We can let nature figure out the best approach.
Implementing the solution
Let’s assume we’re dealing with a scenario where there are 10 cities, and we can, of course, travel from any of them directly to any other.
We’ll use the distance between them to understand how good a solution is; after all, the lower the total traveled distance, the better it’ll be for us.
Here is our setup code, including the total number of iterations we’ll be running (after all, these types of problems don’t really end unless you tell them when to finish), the size of our population (i.e., how many different solutions we’ll be dealing with), and the probabilities of crossing over and mutating genes.
Notice the distance matrix: the 0s along the diagonal are there because that’s the distance from each city to itself.
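The original snippet isn’t reproduced here, but a minimal sketch of that setup might look like the following. All names and values are assumptions (the article’s repo has its own constants and a hand-written distance matrix); here the distances are generated randomly just to have something runnable:

```javascript
// Hypothetical setup values — they're assumptions, not the article's
// exact numbers; tune them freely.
const NUM_CITIES = 10;
const POPULATION_SIZE = 100;
const MAX_ITERATIONS = 500;
const CROSSOVER_PROBABILITY = 0.9;
const MUTATION_PROBABILITY = 0.01;

// Symmetric distance matrix; the diagonal stays 0 because that's the
// distance from each city to itself.
const distances = Array.from({ length: NUM_CITIES }, () =>
  new Array(NUM_CITIES).fill(0)
);
for (let i = 0; i < NUM_CITIES; i++) {
  for (let j = i + 1; j < NUM_CITIES; j++) {
    const d = 1 + Math.floor(Math.random() * 30); // distances between 1 and 30
    distances[i][j] = d;
    distances[j][i] = d;
  }
}
```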
Let’s now create our initial random population of potential solutions:
Nothing really interesting to see here: we’re just creating a list of numbers from 0 to NUM_CITIES - 1 and then shuffling their order. We do this POPULATION_SIZE times.
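In code, that could be sketched like this (function names are my own, not necessarily the repo’s); a Fisher–Yates shuffle keeps every ordering of the cities equally likely:

```javascript
// Assumed constants matching the setup described earlier.
const NUM_CITIES = 10;
const POPULATION_SIZE = 100;

// Fisher–Yates shuffle: every permutation is equally likely.
function shuffle(route) {
  for (let i = route.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [route[i], route[j]] = [route[j], route[i]];
  }
  return route;
}

// POPULATION_SIZE random routes, each a shuffled list of city indices.
function createPopulation() {
  return Array.from({ length: POPULATION_SIZE }, () =>
    shuffle(Array.from({ length: NUM_CITIES }, (_, i) => i))
  );
}

const population = createPopulation();
```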
We’ll then run the genetic algorithm MAX_ITERATIONS times, modifying the population; then we’ll sort it by fitness (i.e., total traveled distance) and keep the first one.
That logic translates to this code:
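A sketch of that loop is below, with a fitness function that totals the traveled distance. I’m assuming the route closes back on the starting city (the article doesn’t show that detail), and a no-op stands in for the real evolve step so the skeleton runs on its own:

```javascript
// Total traveled distance of a route, counting the return leg to the
// starting city (an assumption); lower is better.
function fitness(route, distances) {
  let total = 0;
  for (let i = 0; i < route.length; i++) {
    total += distances[route[i]][route[(i + 1) % route.length]];
  }
  return total;
}

// Main loop skeleton: evolve, sort by fitness, keep the best at index 0.
function run(population, distances, maxIterations, evolve) {
  for (let i = 0; i < maxIterations; i++) {
    population = evolve(population, distances);
    population.sort((a, b) => fitness(a, distances) - fitness(b, distances));
  }
  return population[0]; // the fittest (shortest) route found
}

// Tiny 4-city example; the identity function stands in for the real
// evolve step described in the article.
const distances = [
  [0, 1, 4, 2],
  [1, 0, 3, 5],
  [4, 3, 0, 1],
  [2, 5, 1, 0],
];
const best = run([[0, 1, 2, 3], [0, 2, 1, 3]], distances, 10, (p) => p);
```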
We are, of course, missing some parts. Let’s take a look at what the evolve function looks like:
This function selects the solutions for the next generation based on probability (instead of the sort-and-kill method I described earlier). Then it’ll generate a new population based on chance: if chance wants it, we’ll mix two solutions together and keep their offspring.
And then we’ll call the mutation function, which will “mutate” the genes of our solution if the random gods allow for it (we’ll look at that function in a minute).
Once the next generation is complete, we’ll return it.
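The structure of that evolve step could be sketched like this. The selection, crossover, and mutation helpers are injected as parameters here (my choice, to keep the focus on the overall shape rather than on any one helper):

```javascript
// Sketch of the evolve step: select parents, breed pairs with some
// probability, mutate every resulting route, return the next generation.
function evolve(population, { select, crossover, mutate, crossoverProbability }) {
  const parents = select(population);
  const next = [];
  for (let i = 0; i < parents.length; i += 2) {
    const a = parents[i];
    const b = parents[(i + 1) % parents.length];
    if (Math.random() < crossoverProbability) {
      // Chance decided the parents breed: keep both offspring.
      next.push(mutate(crossover(a, b)), mutate(crossover(b, a)));
    } else {
      // Otherwise the parents survive (copied, then possibly mutated).
      next.push(mutate(a.slice()), mutate(b.slice()));
    }
  }
  return next.slice(0, population.length); // keep the population size stable
}

// Exercising the skeleton with trivial stand-in helpers:
const demo = evolve([[0, 1, 2], [2, 1, 0]], {
  select: (p) => p,
  crossover: (a, b) => a.slice(),
  mutate: (r) => r,
  crossoverProbability: 1,
});
```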
Let’s look at the next most complex function, the selection function:
This is a long function, but let me break it up for you:
- First we calculate the fitness of the entire population. As you’ll see in a minute, the fitness of a solution is the total traveled distance. That means a lower fitness is better for our use case.
- Then we normalize the fitness values by summing all the fitnesses and calculating the percentage of that total each solution covers.
- We then go over our population of solutions and, based on Math.random and the probabilities calculated in the previous step, select two candidate individuals, keeping the fitter of the two. Since we’re doing this POPULATION_SIZE times, we’ll return an array of potentially repeated values, but that’s not a problem, because chance will also help us breed them together to find new ones.
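Put together, that selection logic might look like the sketch below (names are assumptions). One thing worth noticing: with distance as fitness, roulette-wheel probabilities alone would actually favor longer routes; it’s the pairwise “keep the fitter of the two” comparison that pushes the population toward shorter ones:

```javascript
// Sketch of the selection step: roulette-wheel pick of two candidates
// per slot, keeping the fitter (lower total distance) of each pair.
function select(population, fitnessOf) {
  const fitnesses = population.map(fitnessOf);
  const total = fitnesses.reduce((sum, f) => sum + f, 0);
  // Normalized share of the total fitness for each solution.
  const probabilities = fitnesses.map((f) => f / total);

  function pickOne() {
    let r = Math.random();
    for (let i = 0; i < population.length; i++) {
      r -= probabilities[i];
      if (r <= 0) return i;
    }
    return population.length - 1; // guard against floating-point drift
  }

  const selected = [];
  for (let i = 0; i < population.length; i++) {
    const a = pickOne();
    const b = pickOne();
    // Lower fitness (shorter route) wins the head-to-head comparison.
    selected.push(fitnesses[a] <= fitnesses[b] ? population[a] : population[b]);
  }
  return selected; // repeats are fine; breeding will add variety
}

// Small demo with a fake fitness function (just the first gene plus one).
const demo = select([[0], [1], [2]], (route) => route[0] + 1);
```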
Which takes me to the crossover function. This function will take two parent solutions and it’ll create an offspring by picking a random crossover point and mixing the values:
As you can see, there are some considerations to be had when the mixture of both paths isn’t a new, complete path: we have to fill in the missing cities to create a full solution.
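One common way to implement that (a sketch; the repo’s version may differ) is to copy the first parent’s genes up to a random cut point and then fill in the remaining cities from the other parent, in their original order, skipping duplicates:

```javascript
// Sketch of the crossover: copy parentA's genes up to a random cut point,
// then fill the rest with parentB's cities in their original order,
// skipping duplicates so the offspring is a complete, valid route.
function crossover(parentA, parentB) {
  const cut = 1 + Math.floor(Math.random() * (parentA.length - 1));
  const offspring = parentA.slice(0, cut);
  for (const city of parentB) {
    // Fill in only the cities that aren't already part of the offspring.
    if (!offspring.includes(city)) offspring.push(city);
  }
  return offspring;
}

const child = crossover([0, 1, 2, 3, 4], [4, 3, 2, 1, 0]);
```

Because duplicates are skipped, the offspring is always a valid permutation of the cities, no matter where the cut lands.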
Finally, going back to the evolve function, after we breed our solutions we also call the mutation function, which has a very low probability of adding a mutation; in our case, that means swapping cities within a solution:
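A sketch of such a mutation function, assuming a MUTATION_PROBABILITY constant like the one from the setup:

```javascript
// Sketch of the mutation step: with a (deliberately low) probability,
// swap two random cities in the route. MUTATION_PROBABILITY is assumed.
const MUTATION_PROBABILITY = 0.01;

function mutate(route) {
  if (Math.random() < MUTATION_PROBABILITY) {
    const i = Math.floor(Math.random() * route.length);
    const j = Math.floor(Math.random() * route.length);
    [route[i], route[j]] = [route[j], route[i]];
  }
  return route; // still a valid permutation: swaps never duplicate cities
}

const route = mutate([0, 1, 2, 3]);
```

Swapping (rather than overwriting) a gene is what keeps a mutated route valid for this problem: every city still appears exactly once.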
There are two reasons why you’ll want to add mutation into your algorithm:
- To add potential solutions that were never covered as part of the initial random population or by the breeding process
- To break potential stagnation points. Without mutation, the algorithm can reach a point where it can’t find any better solutions; through mutation we add noise, and this noise can help new solutions emerge that were not possible before.
Usually you’ll want to keep the mutation rates low, because otherwise the random gods will take over your algorithm and the results will never be good. However, playing around with that number can help you find better results faster, depending on what you’re trying to achieve.
Did you like what you read? Consider subscribing to my FREE newsletter where I share my two decades’ worth of wisdom in the IT industry with everyone. Join “The Rambling of an old developer”!
Testing the algorithm
Once we have all of our code ready, along with the initial setup values shown above, we have to run it.
The result should come almost instantly because we’re not really doing a lot of computation.
In my case, the result I get is the following:
Meaning that the shortest route it could find was from 0 -> 1 -> 3 -> 4 -> 2 -> 7 -> 6 -> 9 -> 8 -> 5
Using our initial matrix, if you add up the distances between each consecutive pair of cities in that route, you’ll get 100, which is the fitness of that solution.
Your results might vary; one of the characteristics of these algorithms is that they don’t always arrive at the same conclusion, especially if you don’t give them enough time to get there.
Which takes me to a final note about the initial setup values: those are example values. If you’re trying this code out, feel free to change them up, add more iterations, or decrease the population; experiment and see what results you get.
Genetic algorithms are a very interesting way of solving very complex problems by letting the solution evolve on its own. Sadly, they’re not a fit for every problem, but if you know how to implement your own, you’ll eventually run into a problem that would benefit from them.
For instance, I implemented a theme generator using genetic algorithms to evolve the best theme based on user preferences. You can check out the full tutorial over here.
By the way, the full source code used for this example can be found on this GitHub repo, so feel free to use it for whatever you need.
Have you used genetic algorithms in the past? What did you use them for?
JavaScript Meets Genetics: How to Create Genetic Algorithms was originally published in Bits and Pieces on Medium, where people are continuing the conversation by highlighting and responding to this story.

Fernando Doglio | Sciencx (2023-01-20T08:02:30+00:00) JavaScript Meets Genetics: How to Create Genetic Algorithms. Retrieved from https://www.scien.cx/2023/01/20/javascript-meets-genetics-how-to-create-genetic-algorithms/