Scalarizing multi-objective optimizations

I already introduced various coding examples implementing multi-objective optimization. In those examples I implemented different strategies for finding a multi-objective optimum. One of these strategies was based on scalarizing multiple objectives into a single objective function, using weights for each individual objective function. In this article I want to provide a more comprehensive explanation of different kinds of scalarizing strategies.

In multi-objective optimization problems one faces competing objectives. There is no single standard method for solving multi-objective optimization problems. One popular approach, however, is scalarizing.

Overview of popular techniques

As I already mentioned, scalarizing techniques transform a multi-objective optimization problem so that only a single objective function has to be optimized. There are many techniques for doing so. In this article I present three of them:

  1. Linear scalarization
  2. Epsilon-constraint method
  3. Sen's programming

Scalarizing multiple objectives with linear scalarization

This method applies weighting factors to combine the competing objective functions into a single objective function. Analysts have to set and adjust the weight values, as each weight represents the relative importance of its respective objective.

I best describe this approach with a mathematical formulation of the linearly scalarized objective function, as posted below:

\min _{x\in X}\sum _{i=1}^{k}w_{i}f_{i}(x),
scalarizing with linear scalarization

Normally, I like to set the sum of the weights to equal 100%. This makes it easier to see the relative weighting of the respective objectives. Please see my article on multi-objective optimization with PuLP in Python for a demonstration of this approach.
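
To make this concrete, here is a minimal sketch in Python with PuLP. The two objective functions, their coefficients, the weights and the constraints are hypothetical values I chose purely for illustration; the point is only how the weighted sum is built:

import pulp

# two hypothetical minimization objectives over two decision variables
prob = pulp.LpProblem("linear_scalarization", pulp.LpMinimize)

x1 = pulp.LpVariable("x1", lowBound=0)
x2 = pulp.LpVariable("x2", lowBound=0)

f1 = 2 * x1 + 3 * x2  # objective 1
f2 = 4 * x1 + 1 * x2  # objective 2

# weights chosen by the analyst, summing to 100%
w1, w2 = 0.7, 0.3

# scalarized single objective
prob += w1 * f1 + w2 * f2

# joint constraints of the original problem
prob += x1 + x2 >= 8
prob += x1 >= 2

prob.solve()
print(pulp.LpStatus[prob.status])
print("x1 =", x1.value(), "x2 =", x2.value())
print("f1 =", pulp.value(f1), "f2 =", pulp.value(f2))

Re-solving with different weight combinations yields different trade-offs between f1 and f2.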

Scalarizing multiple objectives with epsilon-constraint method

The epsilon-constraint method converts the problem into a single-objective problem by keeping only one function as the objective function. The other functions are modelled as constraints, each bounded by a value epsilon that reflects how far it may deviate from its individual optimum.

Again, I best describe this approach by posting the mathematical notation below:

\begin{array}{ll}\min & f_{j}(x)\\ \text{s.t. } & x\in X\\ & f_{i}(x)\leq \epsilon_{i}\text{ for }i\in\{1,\ldots,k\}\setminus\{j\},\end{array}
scalarizing with epsilon-constraint method

When I apply this approach I usually solve for each objective separately first. This reveals the individual optimum of each objective. Next, I construct the epsilon-constraint version of the problem, in which each constrained objective must come within a defined extent of its individual optimum.
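
The sketch below illustrates this workflow in PuLP, reusing the hypothetical two-objective data from the linear scalarization example above. Objective f1 is kept as the objective function, while f2 is first solved individually and then constrained to stay within 20% of its individual optimum (the 20% is an arbitrary assumption):

import pulp

x1 = pulp.LpVariable("x1", lowBound=0)
x2 = pulp.LpVariable("x2", lowBound=0)

f1 = 2 * x1 + 3 * x2  # kept as the objective function
f2 = 4 * x1 + 1 * x2  # converted into a constraint

# step 1: solve for f2 alone to reveal its individual optimum
sub = pulp.LpProblem("individual_optimum_f2", pulp.LpMinimize)
sub += f2
sub += x1 + x2 >= 8
sub += x1 >= 2
sub.solve()
f2_opt = pulp.value(sub.objective)

# step 2: epsilon-constraint version; f2 may exceed its optimum by at most 20%
epsilon = 1.2 * f2_opt

prob = pulp.LpProblem("epsilon_constraint", pulp.LpMinimize)
prob += f1
prob += x1 + x2 >= 8
prob += x1 >= 2
prob += f2 <= epsilon  # the epsilon-constraint on objective 2
prob.solve()

print(pulp.LpStatus[prob.status], "x1 =", x1.value(), "x2 =", x2.value())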

I published an article on this blog demonstrating the implementation of this approach using PuLP in Python for multi-objective optimization.

Scalarizing multiple objectives with Sen's programming

This approach normalizes each objective function by dividing it by its absolute individual optimum before summing all objectives into a single joint objective function. I illustrate this approach using the mathematical notation below. Please note that in this example objectives 1 to r are maximization objectives, while objectives r+1 to s are minimization objectives, and W_j denotes the individual optimum of objective j.

\begin{array}{ll}\max & \sum_{j=1}^{r}\frac{Z_{j}}{|W_{j}|}-\sum_{j=r+1}^{s}\frac{Z_{j}}{|W_{j}|}\\ \text{s.t. } & AX=b\\ & X\geq 0,\end{array}
scalarizing with Sen's programming
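
Below is a minimal PuLP sketch of this normalization. The objectives Z1 (maximization) and Z2 (minimization), their coefficients and the feasible region are again hypothetical assumptions chosen only to demonstrate the mechanics:

import pulp

x1 = pulp.LpVariable("x1", lowBound=0)
x2 = pulp.LpVariable("x2", lowBound=0)

Z1 = 3 * x1 + 2 * x2  # maximization objective
Z2 = 4 * x1 + 1 * x2  # minimization objective

def add_feasible_region(prob):
    # shared constraint set of the original problem
    prob += x1 + x2 <= 10
    prob += x1 <= 6
    prob += x1 + x2 >= 3

# step 1: individual optima W1 and W2
p1 = pulp.LpProblem("individual_optimum_Z1", pulp.LpMaximize)
p1 += Z1
add_feasible_region(p1)
p1.solve()
W1 = abs(pulp.value(p1.objective))

p2 = pulp.LpProblem("individual_optimum_Z2", pulp.LpMinimize)
p2 += Z2
add_feasible_region(p2)
p2.solve()
W2 = abs(pulp.value(p2.objective))

# step 2: joint objective, each objective normalized by its absolute optimum
sen = pulp.LpProblem("sens_programming", pulp.LpMaximize)
sen += Z1 / W1 - Z2 / W2
add_feasible_region(sen)
sen.solve()

print(pulp.LpStatus[sen.status], "x1 =", x1.value(), "x2 =", x2.value())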

Other popular approaches for scalarizing

Many supply chain analysts also apply a scalarizing technique presented by Wierzbicki, which you can read more about here: https://www.sciencedirect.com/science/article/pii/0270025582900380?via%3Dihub

Coding examples for (multi-objective) optimization

I have contributed a series of blog posts covering linear optimization in Python and R that will get you started. Two of these posts cover multi-objective optimization, namely linear scalarization and the epsilon-constraint method.
