AlgoRythm
11h

TL;DR: I'm reading papers and doing computer science like I could never afford to in college.

I am beginning my scientific arc.

Over the past few days, I have been working on implementing my own evolutionary algorithms.

I've been doing a combination of "experimentation" and (probably less than I should) actual research.

My Mark 1 was just a proof of concept that set up the data structures correctly. Mark 2 generalized those data structures and actually implemented some natural selection, but the selection scheme was something I made up myself, so I'm only getting mediocre results.
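
To give a concrete idea of what "some natural selection" can look like, here's a minimal sketch of a real-coded GA with tournament selection. This is not the actual Mark 2 code; the fitness function, names, and parameters are purely illustrative.

```python
# A minimal, illustrative GA with tournament selection; not the actual Mark 2 code.
import random

POP_SIZE, GENOME_LEN, GENERATIONS = 50, 10, 100
MUTATION_RATE, TOURNAMENT_K = 0.1, 3

def fitness(genome):
    # Toy objective (sphere function): lower is better.
    return sum(x * x for x in genome)

def tournament_select(population):
    # Pick K individuals at random and keep the fittest: one simple selection scheme.
    return min(random.sample(population, TOURNAMENT_K), key=fitness)

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent with equal probability.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(genome):
    # Occasionally nudge a gene with a small Gaussian step.
    return [g + random.gauss(0, 0.1) if random.random() < MUTATION_RATE else g
            for g in genome]

population = [[random.uniform(-5, 5) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # Build the next generation entirely from selected, crossed-over, mutated children.
    population = [mutate(crossover(tournament_select(population), tournament_select(population)))
                  for _ in range(POP_SIZE)]

print("best fitness:", round(fitness(min(population, key=fitness)), 4))
```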

Next step: I have two papers lined up to read on EAs. Mark 3 might not implement them exactly, but I hope to beat the performance of Mark 2.

I'm encouraged by the fact that these research papers try TONS of different things, and I'm really only on my first prototype (since Mark 1 didn't have any selection implemented, only randomness).

Follow along if interested:

https://github.com/AlgoRythm-Dylan/...

Comments
  • 0
    I don't have a clue about any of this, but I like the idea!
  • 2
    Evolutionary algorithms aren't quite the black box they seem: their inner workings are simple and their structures are regular.
    No, the hard part here is to fine-tune the parameters, and to run it all fast enough that you can make many, many experiments. Non-deterministic, randomized approaches are a bitch: you can get very different results from the same data set, so you need to try a billion different parameter combinations.
    You might want to try parallelism and grid search (rough sketch at the bottom of the thread).
  • 0
    @JsonBoa Parameter-sweep a grid: just randomize where you start for each parameter and use some sort of bitmask for the combination of all selected parameters.

    Doesn't get you partial activation, of course.

    Hey AlgoRythm, what papers are you reading?
  • 0
    @JsonBoa Because I'm loyal to my roots and humans didn't come from any parallelized grid search, boy 😡😡😡

    @wisecrack I'm starting with these two:

    Improvements of Real Coded Genetic Algorithms Based on Differential Operators Preventing the Premature Convergence

    A Comparative Study of Common Nature-Inspired Algorithms for Continuous Function Optimization
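
Below is a rough sketch of the parallelism + grid search idea from the comments above. Nothing here is from the actual repo: run_ga() is a hypothetical stand-in for one full GA run, and the parameter grid is made up for illustration.

```python
# Sketch: grid search over GA hyperparameters, one run per worker process.
from itertools import product
from multiprocessing import Pool
import random

def run_ga(params):
    """Hypothetical stand-in for one full GA run; returns (params, best_fitness)."""
    pop_size, mutation_rate, seed = params
    random.seed(seed)
    # Placeholder result so the sketch runs on its own; a real version would
    # evolve a population with these settings and report its best fitness.
    best_fitness = min(random.random() * mutation_rate for _ in range(pop_size))
    return params, best_fitness

if __name__ == "__main__":
    # Every combination of population size, mutation rate, and random seed.
    grid = list(product([20, 50, 100],       # population sizes
                        [0.01, 0.05, 0.1],   # mutation rates
                        range(5)))           # 5 seeds per combo, since runs are noisy
    with Pool() as pool:                     # one GA run per worker process
        results = pool.map(run_ga, grid)
    # Report the five best-scoring parameter combinations.
    for params, score in sorted(results, key=lambda r: r[1])[:5]:
        print(params, score)
```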