**THEORETICAL BACKGROUND**

**2.1 The HS Algorithm**

HS (Geem et al., 2001) is a relatively new population-based metaheuristic optimization algorithm that imitates the music improvisation process, where musicians improvise their instruments' pitches in search of a perfect state of harmony. It has been successfully tailored to various scientific and engineering applications such as music composition (Geem and Choi, 2007), sudoku puzzle solving (Geem, 2007a), tour planning (Geem, Tseng and Park, 2005), web page clustering (Forsati et al., 2008; Mahdavi and Abolhassani, 2009), structural design (Lee and Geem, 2004; Geem, 2009a), water network design (Geem, 2009c), vehicle routing (Geem, Lee and Park, 2005), dam scheduling (Geem, 2007b), ground water modeling (Ayvaz, 2009, 2007), soil stability analysis (Cheng et al., 2008), ecological conservation (Geem and Williams, 2008), energy system dispatch (Vasebi et al., 2007), heat exchanger design (Fesanghary et al., 2009), transportation energy modeling (Ceylan et al., 2008), satellite heat pipe design (Geem and Hwangbo, 2006), medical physics (Panchal, 2009), timetabling (Al-Betar et al., 2008; Al-Betar, Khader and Liao, 2010), RNA structure prediction (Mohsen et al., 2010), etc. For further information on these applications, see (Ingram and Zhang, 2009) and the references therein. HS is a very successful metaheuristic algorithm that can explore the search space of a given problem in a parallel optimization environment, where each solution (harmony) vector is generated by intelligently exploring and exploiting the search space (Geem, 2009c).

As mentioned, HS mimics the improvisation process of musicians in an intelligent way, as can be seen in Figure 2.1. The analogy between improvisation and optimization is as follows (Geem, 2010):

1. Each musician corresponds to each decision variable.

2. Musical instrument’s pitch range corresponds to the decision variable’s value range.

3. Musical harmony at a certain time corresponds to the solution vector at a certain iteration.

4. Audience’s aesthetics corresponds to the objective function.

Just as musical harmony is improved time after time, the solution vector is improved iteration by iteration. In general, HS has five steps, described in (Geem, Tseng and Park, 2005) as follows:

Step 1. Initialize the Optimization Problem and HS Parameters

The optimization problem is defined as follows:

$$\text{minimize/maximize } f(a), \quad \text{subject to } a_i \in A_i, \; i = 1, 2, \ldots, N \qquad (2.1)$$

Figure 2.1: Analogy between Improvisation and Optimization, obtained from (Geem, 2010)

where $f(a)$ is an objective function; $a$ is the set of decision variables $(a_i)$; $A_i$ is the set of possible values for each decision variable, $a_i^{L} \le A_i \le a_i^{U}$; $a_i^{L}$ and $a_i^{U}$ are the lower and upper bounds of each decision variable $(a_i)$, respectively; and $N$ is the number of decision variables.
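As a concrete illustration of this problem definition, the sketch below sets up a small minimization problem in Python. The sphere function and the bounds are illustrative stand-ins, not taken from the text:

```python
def f(a):
    """Illustrative objective to minimize: the sphere function sum(a_i^2)."""
    return sum(x * x for x in a)

N = 3                    # number of decision variables
lower = [-10.0] * N      # lower bounds, a_i^L
upper = [10.0] * N       # upper bounds, a_i^U
```

Any real-valued function of the decision vector could stand in for `f` here.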

Then, the parameters of the HS are initialized. These parameters are:

(a) Harmony Memory Size (HMS) (i.e., number of solution vectors in harmony memory);

(b) Harmony Memory Considering Rate (HMCR), where HMCR ∈ [0, 1];

(c) Pitch Adjusting Rate (PAR), where PAR ∈ [0, 1];

(d) Stopping criterion (i.e., the number of improvisations (NI)).

More explanation of these parameters is provided in the next steps.

Step 2. Initialize Harmony Memory

The harmony memory (HM) is a matrix of solutions of size HMS, where each HM vector represents one solution, as shown in Eq. 2.2. In this step, the solutions are randomly constructed and, optionally, stored in HM in ascending order of their objective function values, such that $f(a^{1}) \le f(a^{2}) \le \cdots \le f(a^{HMS})$, where $f(a^{j})$ is the objective function value of the $j$-th harmony memory vector.

$$
HM =
\begin{bmatrix}
a_1^{1} & a_2^{1} & \cdots & a_N^{1} & f(a^{1}) \\
a_1^{2} & a_2^{2} & \cdots & a_N^{2} & f(a^{2}) \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
a_1^{HMS} & a_2^{HMS} & \cdots & a_N^{HMS} & f(a^{HMS})
\end{bmatrix}
\qquad (2.2)
$$
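This initialization step can be sketched in Python as follows. The function name and the representation of HM as a list of (vector, fitness) pairs are choices made for this sketch, not prescribed by the algorithm:

```python
import random

def initialize_hm(f, lower, upper, hms):
    """Randomly construct HMS solution vectors and sort them so that
    f(a^1) <= f(a^2) <= ... <= f(a^HMS), as in Eq. 2.2."""
    hm = []
    for _ in range(hms):
        # Draw each decision variable uniformly from its possible range.
        a = [random.uniform(lo, up) for lo, up in zip(lower, upper)]
        hm.append((a, f(a)))
    hm.sort(key=lambda pair: pair[1])  # ascending objective value
    return hm
```

Storing the fitness alongside each vector avoids re-evaluating `f` when HM is updated later.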

Step 3. Improvise New Harmony

This step is the essence of the HS algorithm and the cornerstone on which it is built. In it, HS generates (improvises) a new harmony vector, $a' = (a'_1, a'_2, a'_3, \ldots, a'_N)$, based on three operators: memory consideration, pitch adjustment, and random consideration. In memory consideration, the values of the new harmony vector are randomly inherited from the historical values stored in HM with a probability of HMCR. Thus, the value of decision variable $(a'_1)$ is chosen from $(a_1^{1}, a_1^{2}, a_1^{3}, \ldots, a_1^{HMS})$ stored in HM, the next decision variable $(a'_2)$ is chosen from $(a_2^{1}, a_2^{2}, a_2^{3}, \ldots, a_2^{HMS})$, and the remaining decision variables, $(a'_3, a'_4, a'_5, \ldots, a'_N)$, are chosen consecutively in the same manner, each with probability HMCR ∈ [0, 1].

The usage of HM is similar to the step where the musician uses his or her memory to generate a good tune. This cumulative step ensures that good harmonies are considered as the elements of new harmony vectors.

Alternatively, when a decision variable value is not chosen from HM according to the HMCR probability test, it is randomly chosen from its possible range, $a'_i \in A_i$. This case is referred to as random consideration (with a probability of (1 − HMCR)); it increases the diversity of the solutions and drives the system to explore various diverse solutions so that global optimality can be attained.

The following equation summarizes these two operators, i.e., memory consideration and random consideration:

$$
a'_i \leftarrow
\begin{cases}
a'_i \in \{a_i^{1}, a_i^{2}, a_i^{3}, \ldots, a_i^{HMS}\} & \text{w.p. } HMCR \\
a'_i \in A_i & \text{w.p. } (1 - HMCR)
\end{cases}
\qquad (2.3)
$$

For example, an HMCR of 0.95 indicates that the HS algorithm will choose the decision variable value from the historically stored values in HM with a 95% probability, or from the entire possible range with a 5% probability. Therefore, if a generated random number $w \in [0, 1]$ equals 0.78, then the value of the decision variable is picked from HM, since 0.78 is less than the HMCR (0.95).
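The choice in Eq. 2.3 can be sketched as a small Python helper. The function name and the (vector, fitness) representation of HM are assumptions of this sketch:

```python
import random

def pick_value(hm, i, lower, upper, hmcr):
    """Choose the i-th decision variable of the new harmony:
    inherited from HM with probability HMCR (memory consideration),
    else drawn uniformly from the range (random consideration)."""
    if random.random() <= hmcr:
        vector, _ = random.choice(hm)   # pick a stored harmony at random
        return vector[i]                # inherit its i-th component
    return random.uniform(lower[i], upper[i])
```

Calling this once per variable index builds the new harmony vector component by component.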

Furthermore, an additional search for good solutions in the search space is achieved by tuning each decision variable of the new harmony vector, $a' = (a'_1, a'_2, a'_3, \ldots, a'_N)$, that was inherited from HM, using the PAR operator. Each such decision variable is considered for tuning with probability PAR ∈ [0, 1], as in Eq. 2.4.

$$
a'_i \leftarrow
\begin{cases}
\text{adjusting pitch} & \text{w.p. } PAR \\
\text{doing nothing} & \text{w.p. } (1 - PAR)
\end{cases}
\qquad (2.4)
$$

If a generated random number $rnd \in [0, 1]$ falls within the probability of PAR, then the new decision variable $(a'_i)$ is adjusted based on the following equation:

$$a'_i = a'_i \pm rand() \times bw \qquad (2.5)$$

Here, $bw$ is an arbitrary distance bandwidth (renamed fret width in the updated version of HS (Geem, 2010)), used to improve the performance of HS, and $rand()$ is a function that generates a random number ∈ [0, 1]. In effect, $bw$ determines the amount of movement or change applied to the components of the new vector. The value of $bw$ depends on the optimization problem itself, i.e., whether it is continuous or discrete.

In general, the way that the parameter (PAR) modifies the components of the new harmony vector is an analogy to the musicians’ behaviors when they slightly change their tone frequencies in order to get better harmonies. Consequently, it explores more solutions in the search space and improves the searching abilities.
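Eqs. 2.4 and 2.5 together can be sketched as one small Python function; the function name is an assumption of this sketch, and the symmetric `uniform(-1, 1)` draw is one simple way to realize the ± sign in Eq. 2.5:

```python
import random

def pitch_adjust(value, par, bw):
    """With probability PAR, shift an inherited value by a random amount
    within +/- bw (Eqs. 2.4-2.5); otherwise leave it unchanged."""
    if random.random() <= par:
        # uniform(-1, 1) * bw covers both the + and - cases of Eq. 2.5
        return value + random.uniform(-1.0, 1.0) * bw
    return value
```

In a full implementation the adjusted value would also be clamped back into $A_i$ if it leaves the feasible range.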

All of these operators are illustrated in the pseudocode in Fig. 2.2.

Step 4. Update the Harmony Memory

To update HM with the newly generated vector $a' = (a'_1, a'_2, a'_3, \ldots, a'_N)$, the objective function value $f(a')$ is calculated for the new harmony vector. If this value is better than that of the worst harmony vector stored in HM, then the worst harmony vector is replaced by the new vector; otherwise, the new vector is ignored.

$$a' \in HM \;\wedge\; a^{worst} \notin HM \qquad (2.6)$$

However, to preserve the diversity of harmonies in HM, other harmonies (e.g., the least similar ones) can be considered for replacement. Also, a maximum number of identical harmonies in HM can be enforced in order to prevent premature convergence of HM.
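The basic replace-the-worst rule of Eq. 2.6 can be sketched as follows, again assuming HM is kept as a list of (vector, fitness) pairs sorted in ascending order of objective value (a representation chosen for this sketch):

```python
def update_hm(hm, new_vector, new_value):
    """Replace the worst harmony if the new vector is better (Eq. 2.6),
    keeping HM sorted in ascending order of objective value."""
    if new_value < hm[-1][1]:          # better than the current worst?
        hm[-1] = (new_vector, new_value)
        hm.sort(key=lambda pair: pair[1])
        return True
    return False                       # new vector is ignored
```

Keeping HM sorted makes both the worst harmony (`hm[-1]`) and the best (`hm[0]`) available in constant time.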

Step 5. Check the Stopping Criterion

The iterative process in Steps 3 and 4 terminates when the maximum number of improvisations (NI) is reached. Finally, the best vector in HM is selected and considered the best solution to the problem under investigation.

HS Algorithm
begin
    Define fitness function f(a), a = (a_1, a_2, ..., a_N)^T
    Define HMCR, PAR, HMS, NI, bw
    HM ← GenerateInitialPopulation()
    min = minimum possible value
    max = maximum possible value
    while (iter ≤ NI) do
        while (i ≤ N) do
            if (rand ∈ (0, 1) ≤ HMCR) then
                randLoc = 1 + floor(rand ∈ (0, 1) × HMS)
                a'_i = a_i^{randLoc}
                if (rand ∈ (0, 1) ≤ PAR) then
                    adjust the value of a'_i by:
                    a'_i = a'_i ± rand ∈ (0, 1) × bw
                end if
            else
                choose a random value:
                a'_i = min + rand ∈ (0, 1) × (max − min)
            end if
        end while
        if (f(a') ≤ f(a^{worst})) then
            a' ∈ HM, a^{worst} ∉ HM
        end if
    end while
    best = find the current best solution
end

Figure 2.2: Pseudo code of the HS algorithm.
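The five steps above can be gathered into a compact, runnable Python sketch of HS for a minimization problem. The function name, the (vector, fitness) representation of HM, and the parameter defaults are all illustrative assumptions of this sketch, not values prescribed by the algorithm:

```python
import random

def harmony_search(f, lower, upper, hms=10, hmcr=0.95, par=0.3,
                   bw=0.1, ni=10000):
    """Minimize f over box bounds [lower, upper] with basic HS.
    Returns the best (vector, fitness) pair found."""
    n = len(lower)
    # Step 2: initialize HM with random solutions, sorted by fitness.
    hm = []
    for _ in range(hms):
        a = [random.uniform(lo, up) for lo, up in zip(lower, upper)]
        hm.append((a, f(a)))
    hm.sort(key=lambda pair: pair[1])
    for _ in range(ni):                       # Step 5: stop after NI rounds
        new = []
        for i in range(n):                    # Step 3: improvise a'
            if random.random() <= hmcr:       # memory consideration
                value = random.choice(hm)[0][i]
                if random.random() <= par:    # pitch adjustment (Eq. 2.5)
                    value += random.uniform(-1.0, 1.0) * bw
                    value = min(max(value, lower[i]), upper[i])
            else:                             # random consideration
                value = random.uniform(lower[i], upper[i])
            new.append(value)
        new_fit = f(new)
        if new_fit < hm[-1][1]:               # Step 4: replace the worst
            hm[-1] = (new, new_fit)
            hm.sort(key=lambda pair: pair[1])
    return hm[0]                              # best harmony found
```

For example, `harmony_search(lambda a: sum(x * x for x in a), [-10.0, -10.0], [10.0, 10.0])` drives the sphere function close to its minimum at the origin.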

2.1.1 HS Characteristics

The HS algorithm has several characteristics that make it one of the most important metaheuristic algorithms (Geem et al., 2001). These distinguish it from other metaheuristics: (1) the generation of a new vector after considering all existing vectors, rather than considering only two vectors (parents) as in GA; (2) the independent consideration of each decision variable in an HM vector; (3) the consideration of continuous decision variable values without any loss of precision; (4) no requirement for decimal-binary conversions or a fixed number (2^n) of decision variable values as in GA; and (5) no need for starting values of the decision variables or for complex derivatives as in gradient-based methods.

The other important strengths of HS are its improvisation operators (memory consideration, pitch adjustment, and random consideration), which play a major role in achieving the desired balance between the two major extremes of any optimization algorithm: intensification and diversification (Yang, 2009a). Essentially, both pitch adjustment and random consideration are the key components for achieving the desired diversification in HS. In random consideration, the new vector's components are generated at random, with the same level of efficiency as in other algorithms that employ randomization; this property allows HS to explore regions of the search space that may not have been visited before. Pitch adjustment, in turn, enhances the diversification ability of HS by tuning a new vector's component within a given bandwidth: a small random amount is added to or subtracted from an existing component stored in HM. This operator is a fine-tuning process of local solutions that ensures good local solutions are retained while adding new room for exploring new solutions. Furthermore, the pitch adjustment operator can also be considered a mechanism to support the intensification of HS through control of the PAR probability. Intensification in the HS algorithm is represented by the third operator, memory consideration. A high harmony acceptance rate (HMCR) means that good solutions from the history/memory are more likely to be selected or inherited, which is equivalent to a certain degree of elitism.

Obviously, if the acceptance rate is too low, solutions will converge more slowly.

Finally, the structure of the HS algorithm is relatively simple. This makes it very flexible and easy to combine with other metaheuristic algorithms, as can be seen in Section 3.1.