Python has become the programming language of choice for research and industry projects related to data science, machine learning, and deep learning. Since optimization is an inherent part of these research fields, more optimization-related frameworks have arisen in the past few years. Only a few of them support the optimization of multiple conflicting objectives at a time, and even those do not provide comprehensive tools for a complete multi-objective optimization task. To address this issue, we have developed pymoo, a multi-objective optimization framework in Python. We provide a guide to getting started with our framework by demonstrating the implementation of an exemplary constrained multi-objective optimization scenario. Moreover, we give a high-level overview of the architecture of pymoo to show its capabilities, followed by an explanation of each module and its corresponding sub-modules. The implementations in our framework are customizable, and algorithms can be modified or extended by supplying custom operators. Moreover, a variety of single-, multi-, and many-objective test problems are provided, and gradients can be retrieved by automatic differentiation out of the box. Also, pymoo addresses practical needs, such as the parallelization of function evaluations, methods to visualize low- and high-dimensional spaces, and tools for multi-criteria decision making. For more information about pymoo, readers are encouraged to visit: https://pymoo.org.
- genetic algorithm,
- multi-objective optimization,
- task analysis,
- data visualization,
- Evolutionary computation
Optimization plays an essential role in many scientific areas, such as engineering, data analytics, and deep learning. These fields are fast-growing, and their concepts are employed for various purposes, for instance, gaining insights from large data sets or fitting accurate prediction models. Whenever an algorithm has to handle a significantly large amount of data, an efficient implementation in a suitable programming language is important. Python has become the programming language of choice for the above-mentioned research areas over the last few years, not only because it is easy to use but also because of its strong community support. Python is a high-level, cross-platform, and interpreted programming language that focuses on code readability. A large number of high-quality libraries are available, and support for any kind of scientific computation is ensured. These characteristics make Python an appropriate tool for many research and industry projects where the investigations can be rather complex.
A fundamental principle of research is to ensure the reproducibility of studies and to provide access to materials used in the research, whenever possible. In computer science, this translates to a sketch of an algorithm and the implementation itself. However, the implementation of optimization algorithms can be challenging, and benchmarking in particular is time-consuming. Having access to either a good collection of different source codes or a comprehensive library is time-saving and avoids an error-prone implementation from scratch.
To address this need for multi-objective optimization in Python, we introduce pymoo. The goal of our framework is not only to provide state-of-the-art optimization algorithms but also to cover different aspects related to the optimization process itself. We have implemented single-, multi-, and many-objective test problems which can be used as a test-bed for algorithms. In addition to the objective and constraint values of test problems, gradient information can be retrieved through automatic differentiation. Moreover, a parallelized evaluation of solutions can be implemented through vectorized computations, multi-threaded execution, and distributed computing. Further, pymoo provides implementations of performance indicators to measure the quality of results obtained by a multi-objective optimization algorithm. Tools for an explorative analysis through visualization of lower- and higher-dimensional data are available, and multi-criteria decision-making methods guide the selection of a single solution from a solution set based on preferences. Our framework is designed to be extendable through its modular implementation. For instance, a genetic algorithm is assembled in a plug-and-play manner by making use of specific sub-modules, such as initial sampling, mating selection, crossover, mutation, and survival selection. Each sub-module takes care of one aspect independently and, therefore, variants of algorithms can be initiated by passing different combinations of sub-modules. This concept allows end-users to incorporate domain knowledge through custom implementations. For example, in an evolutionary algorithm, a biased initial sampling module created with the knowledge of domain experts can guide the initial search. Furthermore, we would like to mention that our framework is well-documented, with a large number of available code snippets. We created a starter's guide for users to become familiar with our framework and to demonstrate its capabilities.
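The plug-and-play assembly described above can be illustrated with a minimal, framework-agnostic sketch. All operator names and signatures below are our own illustrations, not pymoo's actual API; the algorithm only orchestrates sub-modules, so swapping any of them (for example, a biased sampling) yields a different variant:

```python
import random

random.seed(1)

# Each operator below is an independent, swappable sub-module.
def float_random_sampling(n, xl, xu):
    # Initial sampling: n one-dimensional individuals in [xl, xu].
    return [[random.uniform(xl, xu)] for _ in range(n)]

def tournament_selection(pop, fitness):
    # Mating selection: binary tournament, lower fitness wins.
    a, b = random.sample(range(len(pop)), 2)
    return pop[a] if fitness(pop[a]) < fitness(pop[b]) else pop[b]

def blend_crossover(p1, p2):
    # Crossover: random convex combination of the two parents.
    w = random.random()
    return [w * u + (1 - w) * v for u, v in zip(p1, p2)]

def gaussian_mutation(x, sigma=0.1):
    # Mutation: small Gaussian perturbation of each variable.
    return [v + random.gauss(0, sigma) for v in x]

def elitist_survival(pop, fitness, n):
    # Survival selection: keep the n best individuals.
    return sorted(pop, key=fitness)[:n]

def genetic_algorithm(fitness, sampling, selection, crossover, mutation,
                      survival, pop_size=20, n_gen=50, xl=-5.0, xu=5.0):
    pop = sampling(pop_size, xl, xu)
    for _ in range(n_gen):
        offspring = [mutation(crossover(selection(pop, fitness),
                                        selection(pop, fitness)))
                     for _ in range(pop_size)]
        pop = survival(pop + offspring, fitness, pop_size)
    return pop

# Minimize f(x) = x^2 on [-5, 5].
final_pop = genetic_algorithm(lambda x: x[0] ** 2, float_random_sampling,
                              tournament_selection, blend_crossover,
                              gaussian_mutation, elitist_survival)
print(final_pop[0])  # best individual, close to [0.0]
```

Replacing `float_random_sampling` with a domain-informed sampling function is exactly the kind of customization described above.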
As an example, the guide shows the optimization results of a bi-objective optimization problem with two constraints. An extract from the guide will be presented in this paper. Moreover, we provide an explanation of each algorithm and source code to run it on a suitable optimization problem in our software documentation. Additionally, we show a definition of test problems and provide a plot of their fitness landscapes. The framework documentation is built using Sphinx, and the correctness of modules is ensured by automatic unit testing. Most algorithms have been developed in collaboration with the second author and have been benchmarked extensively against the original implementations. In the remainder of this paper, we first present related existing optimization frameworks in Python and in other programming languages. Then, we provide a guide to getting started with pymoo in Section III, which covers the most important steps of our proposed framework. In Section IV, we illustrate the framework architecture and the corresponding modules, such as problems, algorithms, and related analytics.
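To make the kind of problem discussed above concrete, here is a hypothetical constrained bi-objective evaluation function together with a common constraint-violation measure. The problem itself is made up for illustration and is not the exact example from the pymoo guide:

```python
# Hypothetical constrained bi-objective problem: minimize f1 and f2
# subject to g1 <= 0 and g2 <= 0 (illustrative, not pymoo's example).
def evaluate(x1, x2):
    f1 = x1 ** 2 + x2 ** 2            # objective 1
    f2 = (x1 - 1) ** 2 + x2 ** 2      # objective 2
    g1 = 0.25 - x1                    # feasible when x1 >= 0.25
    g2 = x2 - 0.75                    # feasible when x2 <= 0.75
    return (f1, f2), (g1, g2)

def constraint_violation(g):
    # Sum of positive constraint values; 0.0 means feasible.
    return sum(max(0.0, v) for v in g)

F, G = evaluate(0.5, 0.5)
print(F, constraint_violation(G))  # (0.5, 0.5) 0.0 -> feasible point
```

Formulating constraints so that values less than or equal to zero mean "satisfied" is the usual convention in evolutionary multi-objective optimization.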
Over the last decades, various optimization frameworks in diverse programming languages have been developed. However, some of them only partially cover multi-objective optimization. In general, the choice of a suitable framework for an optimization task is a multi-objective problem itself. Moreover, some criteria are rather subjective, for instance, the usability and extendibility of a framework; therefore, the assessment of such criteria, as well as the decision-making process, differs from user to user. For example, one might have decided on a programming language first, either because of personal preference or a project constraint, and then search for a suitable framework. One might give more importance to the overall features of a framework, for example, parallelization or visualization, than to the programming language itself. An overview of some existing multi-objective optimization frameworks in Python is listed in Table 1, each of which is described in the following. Recently, the well-known multi-objective optimization framework jMetal, developed in Java, has been ported to a Python version, namely jMetalPy. The authors aim to further extend it and to make use of the full feature set of Python, for instance, data analysis and data visualization. In addition to traditional optimization algorithms, jMetalPy also offers methods for dynamic optimization. Moreover, the post-analysis of performance metrics of an experiment with several independent runs is automated. Parallel Global Multiobjective Optimizer, PyGMO, is an optimization library for the easy distribution of massive optimization tasks over multiple CPUs. It uses the generalized island-model paradigm for the coarse-grained parallelization of optimization algorithms and, therefore, allows users to develop asynchronous and distributed algorithms. Platypus is a multi-objective optimization framework that offers implementations of state-of-the-art algorithms.
It enables users to create an experiment with various algorithms and provides post-analysis methods based on metrics and visualization. Distributed Evolutionary Algorithms in Python (DEAP) is a novel evolutionary computation framework for rapid prototyping and testing of ideas. Even though DEAP does not focus on multi-objective optimization, multi-objective algorithms can be developed thanks to the modularity and extendibility of the framework. Moreover, parallelization and load-balancing tasks are supported out of the box. Inspyred is a framework for creating bio-inspired computational intelligence algorithms in Python; it is not focused on multi-objective algorithms directly, but on evolutionary computation in general. However, an example for NSGA-II is provided, and other multi-objective algorithms can be implemented through the modular implementation of the framework. If the search for frameworks is not limited to Python, other popular frameworks should be considered: PlatEMO in Matlab, the MOEA Framework and jMetal in Java, and jMetalCpp and PaGMO in C++. Of course, this is not an exhaustive list, and readers may search for other available options.

III. GETTING STARTED

In the following, we provide a starter's guide for pymoo. It covers the most important steps in an optimization scenario, starting with the installation of the framework, defining an optimization problem, and the optimization procedure itself.
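Algorithms such as NSGA-II, mentioned above, are built around the notion of Pareto dominance. As a minimal, framework-agnostic sketch (the function names are ours, not pymoo's API), filtering a set of objective vectors down to its non-dominated subset looks like this:

```python
def dominates(a, b):
    # For minimization: a dominates b if it is no worse in every
    # objective and strictly better in at least one.
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated(points):
    # Keep only the points that no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points)]

front = non_dominated([(1, 4), (2, 2), (4, 1), (3, 3), (2, 5)])
print(sorted(front))  # [(1, 4), (2, 2), (4, 1)]
```

Here (3, 3) is dominated by (2, 2) and (2, 5) by (1, 4), so only the genuine trade-off points survive.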
This paper has introduced pymoo, a multi-objective optimization framework in Python. We have walked through our framework, beginning with the installation up to the optimization of a constrained bi-objective optimization problem. Moreover, we have presented the overall architecture of the framework, consisting of three core modules: Problems, Optimization, and Analytics. Each module has been described in depth and illustrative examples have been provided. We have shown that our framework covers various aspects of multi-objective optimization, including the visualization of high-dimensional spaces and multi-criteria decision making to finally select a solution out of the obtained solution set. One feature distinguishing our framework from other existing ones is that we have provided options for various key aspects of a multi-objective optimization task: standard evolutionary operators for optimization, standard performance metrics for evaluating a run, standard visualization techniques for showcasing obtained trade-off solutions, and a few approaches for decision making. Most of these implementations were originally suggested and developed by the second author and his collaborators over more than 25 years. Hence, we consider the implementations of all such ideas to be authentic and error-free, and the results from the proposed framework should stand as benchmark results of the implemented procedures. However, the framework can certainly be extended to make it more comprehensive, and we are constantly adding new capabilities based on practical lessons learned from our collaboration with industries. In the future, we plan to implement more optimization algorithms and test problems to provide more choices to end-users. Also, we aim to implement some methods from the classical literature on single-objective optimization which can also be used for multi-objective optimization through decomposition or embedded as a local search.
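The decomposition idea mentioned above can be sketched in a few lines: a weighted-sum scalarization turns a bi-objective problem into a family of single-objective ones, each solvable by any classical single-objective method. The problem, weights, and grid search below are illustrative, not pymoo code:

```python
def weighted_sum(f, w):
    # Scalarize a multi-objective function f into a single objective.
    return lambda x: sum(wi * fi for wi, fi in zip(w, f(x)))

# Illustrative bi-objective problem: f1 = x^2, f2 = (x - 2)^2, x in [0, 2].
f = lambda x: (x ** 2, (x - 2) ** 2)

# Each weight vector yields one trade-off solution, found here by a
# plain single-objective grid search over x.
grid = [i / 100 for i in range(201)]
solutions = [min(grid, key=weighted_sum(f, (w1, 1.0 - w1)))
             for w1 in (0.1, 0.3, 0.5, 0.7, 0.9)]
print(solutions)  # [1.8, 1.4, 1.0, 0.6, 0.2]
```

Sweeping the weights traces out a spread of Pareto-optimal solutions, which is the essence of decomposition-based multi-objective optimization.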
So far, we have provided a few basic performance metrics. We plan to extend this by creating a module that runs a list of algorithms on test problems automatically and provides statistics of different performance indicators. Furthermore, we would like to mention that any kind of contribution is more than welcome. We see our framework as a collaborative collection from and for the multi-objective optimization community. By adding a method or algorithm to pymoo, the community can benefit from a growing comprehensive framework, and it can help researchers to advertise their methods. Interested researchers are welcome to contact the authors. In general, different kinds of contributions are possible, and more information can be found online. Moreover, even though we try to keep our framework as bug-free as possible, in case of exceptions during execution or doubts about correctness, please contact us directly or use our issue tracker.
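As one example of such a basic performance metric, a simplified generational distance can be sketched as follows. This is an illustrative reimplementation, not pymoo's own indicator, and it omits the p-norm generalization used in the literature:

```python
import math

def generational_distance(obtained, reference):
    # Simplified generational distance: mean Euclidean distance from
    # each obtained point to its nearest point on the reference front.
    def nearest(p):
        return min(math.dist(p, r) for r in reference)
    return sum(nearest(p) for p in obtained) / len(obtained)

# A tiny known Pareto front serves as the reference set.
ref = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
print(generational_distance([(0.0, 1.0), (1.0, 0.0)], ref))  # 0.0
print(round(generational_distance([(0.6, 0.6)], ref), 4))    # 0.1414
```

A value of zero means every obtained solution lies exactly on the reference front; larger values indicate poorer convergence.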
The Kavian Scientific Research Association (KSRA) is a non-profit research organization founded in December 2013 to provide research and educational services. The members of the community had formed a virtual group on the Viber social network, and the core of the association was formed from these members as founders. These individuals, led by Professor Siavosh Kaviani, decided to launch a scientific and research association with an emphasis on education.
The KSRA research association, as a non-profit research firm, is committed to providing research services in the field of knowledge. The main beneficiaries of this association are public and private knowledge-based companies, students, researchers, professors, universities, and industrial and semi-industrial centers around the world.
Our main services are based on education for all people around the world. We want to integrate research and education, because we believe education is a fundamental human right; therefore, our services concentrate on inclusive education.
The KSRA team partners with under-served local communities around the world to improve access to, and the quality of, knowledge-based education, to amplify and augment learning programs where they exist, and to create new opportunities for e-learning where traditional education systems are lacking or non-existent.
FULL Paper PDF file: pymoo: Multi-Objective Optimization in Python
"Pymoo: Multi-Objective Optimization in Python," in IEEE Access, vol. 8, pp. 89497-89509, 2020.