When searching for a topic to write about in this blog, I decided to look for a formal definition of Constraint Programming, a technique regularly used in Decide projects. In Wikipedia, Constraint Programming is defined as a programming paradigm in which relations between variables are expressed in the form of constraints.
The particularity of this paradigm is that, instead of defining the way to reach a solution, it simply defines the properties the solution must have, without specifying how to arrive at it. In this regard, it is clearly a declarative paradigm.
While it is true that the use of this paradigm in business involves more than simply defining the variables and constraints involved in a problem, this definition serves as a starting point to explain what we mean at Decide when we talk about Optimization.
Constraints (context)
All the problems we encounter in our everyday lives, for which we must make decisions, can be formulated using this paradigm. Any decision we make will have to be applicable in a specific context, defined by a set of constraints, that is, rules that the solution must fulfil to make sense. Every time we make a decision, we need to know the context (constraints) in which this decision will be applied so we can ensure the decision is viable.
Thus, when you want to move your car from one spot to another, you have to follow, at the very least, the laws of physics and, as you are a responsible citizen, you will naturally want to abide by traffic laws. A valid path is one that meets all the constraints imposed by the physical context and by traffic laws. Other possible routes, those that cross rivers without bridges, run through buildings or make illegal manoeuvres, are of no interest.
In a manufacturing problem, constraints will be defined by the capacity of production lines, the time required to carry out each task, precedence relationships between tasks, etc. A viable solution is a sequencing of production tasks that takes into account all these constraints.
When assigning shifts, the constraints will be defined by the labour agreement, work requirements and individual agreements with each employee, etc. In this case, a solution will consist of allocating shifts to people over a period of time, respecting all these constraints.
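The shift-assignment example above can be made concrete with a few lines of code. The sketch below is a deliberately tiny, brute-force illustration: the employees, shifts and the two constraints (full coverage, and one individual agreement) are all invented for the example, and real instances are far too large for this naive approach.

```python
from itertools import product

# Hypothetical toy instance: assign one shift to each of three employees
# for a single day, subject to two illustrative constraints.
EMPLOYEES = ["Ana", "Ben", "Cam"]
SHIFTS = ["morning", "evening", "night"]

def is_viable(assignment):
    """An assignment is viable only if it satisfies every constraint."""
    # Coverage constraint: every shift must be staffed.
    if set(assignment.values()) != set(SHIFTS):
        return False
    # Individual agreement (made-up rule): Ana never works nights.
    if assignment["Ana"] == "night":
        return False
    return True

def viable_assignments():
    # Enumerate all candidate assignments and keep the viable ones.
    for combo in product(SHIFTS, repeat=len(EMPLOYEES)):
        assignment = dict(zip(EMPLOYEES, combo))
        if is_viable(assignment):
            yield assignment

solutions = list(viable_assignments())
print(len(solutions))  # 4 viable assignments out of 27 candidates
```

Note that the code only states what a viable solution looks like (`is_viable`), not how to construct one: that is the declarative flavour of the paradigm.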
Criteria of ‘goodness’ (intention)
While it is true that there are cases where it is sufficient to accept a solution that is simply viable, this is not the norm: generally we want to make the best decision from the different viable options.
But what makes one solution ‘better’ than another? Are there any objective criteria to help us make this decision? Actually, no. Behind the judgement that determines that one viable solution is better than another, there is always an intention. Perhaps our intention is to arrive at our destination as soon as possible, or to take the shortest route, or even to choose the most scenic one. And, depending on the criteria we use to evaluate the different routes, the outcome of this judgement will be different.
We always have a criterion which allows us to evaluate solutions. We either prefer to manufacture as quickly as possible, or as cheaply as possible. We prefer a shift schedule that minimises the number of employees, or one that minimises the financial costs of allocation, or a mixture of both (multi-objective). We always have a criterion.
Thus, all decision-making problems are actually a mix between a constraint-satisfaction problem (searching for viable solutions) and an optimization problem (searching for good solutions).
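This two-ingredient view, viability first, then a criterion of ‘goodness’, fits in a few lines of code. The route data below is entirely invented for illustration; the point is only that the same set of viable options yields a different ‘best’ solution under each criterion.

```python
# Toy illustration: three viable routes ranked under different
# 'goodness' criteria. All numbers are invented for the example.
routes = [
    {"name": "motorway",   "km": 42, "minutes": 30, "scenery": 2},
    {"name": "coastal",    "km": 55, "minutes": 50, "scenery": 9},
    {"name": "back roads", "km": 38, "minutes": 45, "scenery": 6},
]

shortest  = min(routes, key=lambda r: r["km"])       # intention: least distance
fastest   = min(routes, key=lambda r: r["minutes"])  # intention: least time
prettiest = max(routes, key=lambda r: r["scenery"])  # intention: best views

print(shortest["name"], fastest["name"], prettiest["name"])
# back roads motorway coastal
```

Each `key` function encodes an intention; changing the intention changes the winner without touching the set of viable routes.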
It is a changing world
A reasonable (and infallible) approach to finding these solutions would be to enumerate all feasible solutions, evaluate each of them and choose the best one, in accordance with the criteria of ‘goodness’ we set.
But the human mind is not good at enumerating things. It does have other virtues: it is creative and able to invent rules or guidelines based on experience. And this is the asset that is usually used for solving such decision-making problems: planning experts use their experience, translated into planning guidelines and rules, to make ‘good’ decisions in their constrained systems.
It would be unfair to argue that this technique does not work. The entire evolution of humankind has been a result of using this technique. Over time, humans have gained experience; we have established rules for solving the problems we have encountered, and therefore managed to avert these problems in order to be able to focus on the next ones. It is obvious that this strategy has worked.
However, in the past 50 years decision-making techniques have evolved more than in the previous millennia. Moreover, in a changing world, there is no time to leave decision-making to experience. The rules that were ‘good’ 10 years ago are no longer ‘good’ now; and the rules that are ‘good’ now will not be ‘good’ in 5 years’ time.
This difference between the pace at which changes occur and the pace at which they are assimilated (they become ‘experience’) means that ultimately outdated solutions end up being applied to new problems; that is to say, often general rules are used which once worked but now no longer work as well.
What do I mean when I talk about Optimization?
If we want to be competitive, if we want to adapt our decision-making processes to changing conditions, we need to approach things differently.
This is when Optimization and Artificial Intelligence techniques start to come into play and make sense. Computers have no problem enumerating solutions or performing millions of calculations in one second. This ‘brute force’ approach has an important advantage: it does not depend on prior knowledge or on experience. The technique of enumerating possible solutions to a problem and choosing the best one only requires knowledge of the problem’s structure: its constraints and its criteria of ‘goodness’. If the problem changes, as long as there is information on what has changed (the constraints or criteria), the technique works exactly the same.
While it is true that most problems have too many viable solutions, too many even for a computer, in recent decades the scientific community has worked very hard, both from a theoretical and a practical standpoint, to address this complexity.
They have created a series of techniques, programming paradigms, and theoretical and practical results that position computers as serious competitors when making decisions. And this is what I mean when I talk about optimization: the application of software, mathematical or statistical techniques in automated decision-making. This is the work that Decide has been conducting in this area since it was founded.
Constraint Programming is one of the techniques we use in particular, but there are others. Linear Programming techniques (Integer and Mixed-Integer Linear Programming) are able to solve many of our customers’ problems. Stochastic Programming, for example, deals with the appearance of uncertainty in optimization problems. Metaheuristic techniques, on the other hand, look at the way nature solves its problems in order to adapt these solutions so that they can be used to solve business problems. There are countless techniques (Non-Linear Programming, Dynamic Programming, etc.) that science has given us to enable computers to solve human problems.
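To give a flavour of the metaheuristic family mentioned above, here is a minimal sketch of greedy hill climbing on a toy one-dimensional problem. The objective function and the search bounds are invented for the example; real metaheuristics (tabu search, simulated annealing, genetic algorithms) use far richer neighbourhoods and escape mechanisms.

```python
# A minimal sketch of one metaheuristic, greedy hill climbing, on a
# toy problem: minimise an invented cost over the integers 0..20.
def cost(x):
    return (x - 7) ** 2  # invented objective: the best value is x = 7

def hill_climb(start, lower=0, upper=20):
    """Repeatedly move to the best neighbour until no neighbour improves."""
    current = start
    while True:
        neighbours = [n for n in (current - 1, current + 1)
                      if lower <= n <= upper]
        best = min(neighbours, key=cost)
        if cost(best) >= cost(current):
            return current  # local optimum reached
        current = best

print(hill_climb(0))  # 7
```

On this convex toy objective, hill climbing always reaches the global optimum; on realistic problems it can get stuck in local optima, which is precisely what the more sophisticated metaheuristics are designed to mitigate.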
In later posts, we will have a closer look at each of these techniques, and introduce case studies we have encountered in our projects, showing how each of these techniques has been applied and giving details of the implementation procedures.