
A Bayesian approach to find Pareto optima in multiobjective programming problems using Sequential Monte Carlo algorithms

In this paper we consider a new approach to multicriteria decision making problems. Such problems are usually cast into a Pareto framework in which the objective functions are aggregated into a single one using certain weights. The problem is embedded into a statistical framework by adopting a posterior distribution over both the decision variables and the Pareto weights. This embedding dates back to [25], but in this work we operationalize the concept further. We propose a Metropolis-Hastings algorithm and a Sequential Monte Carlo (SMC) algorithm to trace out the entire Pareto frontier and/or find the global optimum of the problem. We apply the new techniques to a multicriteria portfolio decision making problem proposed in [37] and to a test problem proposed by [27]. The good performance of the new techniques suggests that SMC and other algorithms, such as the classical Metropolis-Hastings algorithm, can be used profitably in the context of multicriteria decision making problems to trace out the Pareto frontier and/or find a global optimum. Most importantly, SMC can be considered an off-the-shelf technique for solving arbitrary multicriteria decision making problems routinely and efficiently.
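The core idea described above — turning a weighted aggregation of objectives into a posterior density and sampling both the decision variable and the Pareto weight — can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a hedged toy example using a random-walk Metropolis-Hastings sampler on the classic bi-objective test problem f1(x) = x², f2(x) = (x − 2)², whose Pareto set is the interval [0, 2]. The target density exp(−β · [w·f1 + (1−w)·f2]) and the inverse-temperature parameter β are assumptions chosen for illustration; the weight w is refreshed with an independent Uniform(0, 1) proposal so the chain visits the whole frontier.

```python
import math
import random


def f1(x):
    """First objective (to be minimized)."""
    return x * x


def f2(x):
    """Second objective (to be minimized)."""
    return (x - 2.0) ** 2


def log_target(x, w, beta=8.0):
    """Log posterior density, proportional to minus the weighted
    aggregate objective; beta is an illustrative sharpness parameter."""
    return -beta * (w * f1(x) + (1.0 - w) * f2(x))


def metropolis_pareto(n_iter=20000, step=0.3, seed=0):
    """Random-walk Metropolis over (x, w).

    x moves by a Gaussian random walk; w is proposed independently
    from Uniform(0, 1), which is a symmetric proposal, so the
    acceptance ratio reduces to the target ratio. Accepted samples
    concentrate near the Pareto set of (f1, f2), here [0, 2].
    """
    rng = random.Random(seed)
    x, w = 1.0, 0.5
    samples = []
    for _ in range(n_iter):
        x_new = x + rng.gauss(0.0, step)
        w_new = rng.random()
        log_alpha = log_target(x_new, w_new) - log_target(x, w)
        if math.log(rng.random()) < log_alpha:
            x, w = x_new, w_new
        samples.append((x, w))
    return samples
```

Plotting (f1(x), f2(x)) for the sampled x values traces out an approximation of the Pareto frontier; an SMC version of the same idea would propagate a whole population of (x, w) particles through a tempering schedule on β instead of running a single chain.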

Keywords: decision making; Monte Carlo; Sequential Monte Carlo; Bayesian approach; multiobjective programming

Journal Title: Omega
Year Published: 2018

