Harmony Search Algorithm in C++

This article continues my exploration of algorithms inspired by the beauty of nature and physical phenomena.
Harmony search (HS) is a meta-heuristic search algorithm that mimics the improvisation process musicians use to find a pleasing harmony.

When improvising, a group of musicians adjusts the pitches of their instruments in order to create a beautiful harmony (the optimal state). In nature, harmony is characterized by a particular relationship between many sound waves of different frequencies.
The quality of an improvised harmony is judged aesthetically. The musicians keep refining their performance (the objective function) in order to sharpen their artistic sense and find the best harmony.

You can find the implementation of this algorithm (in C++) on my GitHub.

Harmony Search Algorithm for Solving Optimization Problems

Let’s first idealize the improvisation process used by a musician in order to better understand Harmony Search. A musician who is improvising has three options: (1) play a well-known melody exactly from memory, (2) play something similar to that melody while adjusting the pitch slightly, or (3) compose new or random notes.

Using the harmony memory (HM) is essential because it guarantees that good harmonies are considered when new solution vectors are created. The HS method incorporates a parameter HMCR ∈ [0, 1], known as the harmony memory considering rate. If this rate is too low, only a few elite harmonies are selected and the algorithm converges slowly. If it is very high (close to 1), the harmony memory is used almost exclusively, other pitches are not fully explored, and good solutions may be missed. In practice, HMCR is usually set between 0.7 and 0.95.
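To make this concrete, here is a minimal sketch (not the exact code from my repository) of how the HMCR decision can look in C++; harmonyMemory, lower, upper, and rand01 are hypothetical names:

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Hypothetical helper: uniform random number in [0, 1).
double rand01(std::mt19937 &gen) {
    return std::uniform_real_distribution<double>(0.0, 1.0)(gen);
}

// Choose the j-th decision variable of a new harmony:
// with probability HMCR take it from a randomly chosen harmony in memory,
// otherwise draw a completely random value from the allowed range.
double pickVariable(const std::vector<std::vector<double>> &harmonyMemory,
                    std::size_t j, double lower, double upper,
                    double hmcr, std::mt19937 &gen) {
    if (rand01(gen) < hmcr) {
        std::size_t row = std::uniform_int_distribution<std::size_t>(
            0, harmonyMemory.size() - 1)(gen);
        return harmonyMemory[row][j];                 // memory consideration
    }
    return lower + (upper - lower) * rand01(gen);     // random selection
}
```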

Pitch adjustment changes the frequency of a pitch taken from memory. Linear adjustment is typically used, although nonlinear adjustment is also possible in theory. Thus, we have

x_new = x_old + BW · (rand − 0.5),

where the BW parameter is the bandwidth of generation and the term (rand − 0.5) produces a random number in [-0.5, 0.5].
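In the full algorithm this adjustment is applied with some probability (the pitch adjusting rate, usually called PAR) to values taken from memory. A minimal sketch of the adjustment step, with bw, lower, and upper as assumed parameter names:

```cpp
#include <random>

// Pitch adjustment: shift a value taken from memory by a small random amount
// proportional to the bandwidth bw. The term (r - 0.5) lies in [-0.5, 0.5].
double adjustPitch(double value, double bw, double lower, double upper,
                   std::mt19937 &gen) {
    double r = std::uniform_real_distribution<double>(0.0, 1.0)(gen);
    double adjusted = value + bw * (r - 0.5);
    // keep the adjusted value inside the feasible range
    if (adjusted < lower) adjusted = lower;
    if (adjusted > upper) adjusted = upper;
    return adjusted;
}
```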

By performing the above steps, we compose a new harmony (a set of instrument frequencies; in the optimization setting, a new candidate vector of decision variables whose quality is measured by the fitness function). We can then evaluate the objective function for this new harmony and decide whether it improves the algorithm's current state, i.e. whether it brings us closer to the final solution.
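Putting the pieces together, one iteration of the search can be sketched roughly as follows (hypothetical names again; improvise stands for the memory consideration and pitch adjustment above, objective for the function being minimized):

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <iterator>
#include <vector>

// One HS iteration: improvise a new harmony, evaluate it, and replace the
// worst harmony in memory if the new one is better (minimization).
void harmonySearchStep(
    std::vector<std::vector<double>> &harmonyMemory,
    std::vector<double> &fitness,
    const std::function<std::vector<double>()> &improvise,
    const std::function<double(const std::vector<double> &)> &objective) {
    std::vector<double> candidate = improvise();
    double value = objective(candidate);

    // index of the worst (largest objective value) harmony currently stored
    std::size_t worst = std::distance(
        fitness.begin(), std::max_element(fitness.begin(), fitness.end()));

    if (value < fitness[worst]) {
        harmonyMemory[worst] = candidate;
        fitness[worst] = value;
    }
}
```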

I am genuinely impressed by how simple the algorithm is to implement and, at the same time, by how well it performs when solving optimization problems.

Tests

In this article, I used HS to solve two optimization problems. The first one is an ordinary quadratic function that I used for my own debugging purposes (see GitHub).
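For concreteness, the debugging objective can be as simple as a sphere-like quadratic (the exact function in the repository may differ; this is just an illustrative stand-in):

```cpp
#include <vector>

// Simple quadratic test objective: f(x) = sum_i x_i^2, minimized at the origin.
double quadratic(const std::vector<double> &x) {
    double sum = 0.0;
    for (double xi : x) sum += xi * xi;
    return sum;
}
```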

The second is a benchmark optimization problem: the design of a tension/compression spring, where the objective is to minimize the spring's weight. There are three design variables: the wire diameter (d), the mean coil diameter (D), and the number of active coils (N), so this is a three-dimensional problem (each variable is treated as a dimension).

(a) Schematic of the spring; (b) stress distribution evaluated at the optimum design; and (c) displacement distribution evaluated at the optimum design (from the article I referred to)

The optimization problem can be formulated as follows:
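The full formulation appears as an image in the article I referred to; for reference, the commonly used form of this benchmark (which may differ in minor constants from the exact version I solved) minimizes the spring weight subject to constraints on shear stress, surge frequency, and deflection:

\begin{aligned}
\text{minimize } \; & f(d, D, N) = (N + 2)\,D\,d^2 \\
\text{subject to } \; & g_1 = 1 - \frac{D^3 N}{71785\,d^4} \le 0, \\
& g_2 = \frac{4D^2 - dD}{12566\,(D d^3 - d^4)} + \frac{1}{5108\,d^2} - 1 \le 0, \\
& g_3 = 1 - \frac{140.45\,d}{D^2 N} \le 0, \\
& g_4 = \frac{d + D}{1.5} - 1 \le 0.
\end{aligned}

Typical variable bounds used in the literature are 0.05 ≤ d ≤ 2, 0.25 ≤ D ≤ 1.3, and 2 ≤ N ≤ 15.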

The existing constraints were included in the objective function as penalties, which I modeled as a quadratic loss function (see here and here); a minimal sketch of this idea is shown below.
The implemented method solves the problem, i.e. it computes the minimum weight of the spring. The solution I obtained was almost identical to the one reported in the article that inspired this work.
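As a sketch of the penalty idea (not necessarily the exact code in the repository): a quadratic penalty adds the squared violation of each constraint, scaled by a large factor, to the objective, so that only violated constraints are punished. Here g holds the values g_i(x) of constraints of the form g_i(x) ≤ 0:

```cpp
#include <algorithm>
#include <vector>

// Quadratic (exterior) penalty: for each constraint g_i(x) <= 0, add
// rho * max(0, g_i(x))^2 to the objective, so only violated constraints
// are penalized and the penalty grows quadratically with the violation.
double penalizedObjective(double objectiveValue,
                          const std::vector<double> &g,
                          double rho = 1.0e6) {
    double penalty = 0.0;
    for (double gi : g) {
        double violation = std::max(0.0, gi);
        penalty += violation * violation;
    }
    return objectiveValue + rho * penalty;
}
```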

Result of the optimization problem. Spring design (by author)

Thank you for reading.
