This article is based on a talk given by Dr Roger Main of BP's Technology Centre to the Mathematical Programming Study Group.

Introduction

Planning and scheduling the activities of oil refineries is one of the best-established applications of Mathematical Programming (MP). Despite this, the practice of applying MP models to oil refineries is far from straightforward.

Oil refineries consist of a number of process units which turn crude oil into intermediate components, and then blend these components together to make finished products such as petrol or heating oil. Crude oil is a mixture of a vast range of hydrocarbon molecules. A crude distillation unit separates this mixture into components whose boiling points lie within certain ranges, e.g. 30°–80°, 80°–130°, 130°–170°, etc. Other process units refine these intermediate streams further and chemically alter the hydrocarbon molecules, for instance by splitting them or removing sulphur. The resulting components have a wide range of physical properties: density, viscosity, octane, sulphur content, etc. On their own these components would not be suitable for commercial use, but blended together in various ways they form the products which we know as petrol, diesel, heating oil, etc.

Traditional Model of a Refinery

The traditional way of representing the activities in a refinery is as follows. For each process unit, a number of distinct "modes of operation" are selected; for a crude distillation unit, for example, these might correspond to different sets of distillation cut points. Many different combinations of cut points are possible: for a single crude at one refinery, 62 modes of operation were identified. Multiply this by the number of crudes processed, add in the representation of further processing and blending, and you can see that the result is a very large model. Blending can be represented either by fixed recipes or by blending to meet quality specifications.
In most models, blending is done to a quality specification.

Problems with the Traditional Approach

Although it has been used for many years, this traditional approach does not represent reality particularly well. At any time, a process unit converts a single feed stream (which may be a blend of input materials, e.g. crudes) into a small number of output materials. Typically one or two operations can be carried out per week, i.e. changing the feed or changing the processing. By contrast, the traditional model assumes that there is no limit to the number of distinct feeds or modes of operation which can be processed.

The traditional model also produces myriad distinct intermediate components and is able to use these separately in blending. This is unrealistic: in practice only a small number of tanks is available for each intermediate component, so the model's individual "streams" must be "pooled" and the pooled components used for blending. Unfortunately, the quality of such pooled components cannot be represented in a linear model, because a pool's quality is the flow-weighted average of the qualities of its inputs, and the product of flow and quality variables is nonlinear.

These two types of extra freedom, the multiplicity of process operations and the failure to pool intermediate components, give rise to over-optimization: the model has greater flexibility than exists in practice, and it exploits this to produce solutions which are "too good".

As well as these fundamental weaknesses there are other problems. If you run a process twice as hard you do not get twice as much material. You can go some way towards representing this by having different modes of operation for different severities or throughputs, but this is not a complete solution.

However, one thing which might be expected to cause severe problems does not in practice do so. While only a few qualities blend linearly, it is possible to use linear constraints to control a very large range of qualities by means of some standard "tricks of the trade". The most important of these are quality blending indices. These are functions (e.g.
logarithms) of the measured quality of components which themselves blend linearly. Even this approach falters when faced with novel qualities such as those now being used in the USA to control vehicle emissions: the quality "mg of benzene emitted per mile", for example, is itself defined as a highly nonlinear function of many qualities which are measured directly.

The Nonlinear Approach

The limitations of the traditional model have led to the development of alternative nonlinear models, which use Successive Linear Programming (SLP) to represent the refinery more directly. The traditional model has many streams for each intermediate component but assumes that the qualities of each of these are fixed; the nonlinear model has a single stream whose qualities can vary. This means that the model uses the average quality of each intermediate component throughout the period; similarly, average process conditions and an average mix of crudes are used.

Although the nonlinear approach has addressed one of the limitations of the traditional model, a fundamental problem remains: a structure of time periods must be imposed on the problem, and within each time period in the model everything happens continuously. In practice, operations in refineries occur in batches, but the batches are asynchronous across different process units and blenders, which makes it impossible to represent them directly in a model. The result is that the nonlinear approach leads to under-optimization. Refinery schedulers have more freedom than the model: they use the small number of tanks for each intermediate component to produce variants of the component with slightly different qualities, and then use these separately in the blending.

Hierarchy of Models

Planning and scheduling the activities of a refinery is not a single problem but many. There is a hierarchy of problems, from long-term planning (1–5 years) through monthly planning down to scheduling the day's activities.
Different levels of detail are used, and what is fixed in one model is a decision variable in another. Some of these aspects are shown in the table.
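To make the blending-index trick described earlier concrete, here is a minimal sketch of blending to a quality specification as an LP. All of the figures are invented for illustration (they are not from the talk), and the assumption that the logarithm of viscosity blends linearly by mass is a deliberate simplification of the index formulae used in practice.

```python
# Sketch of blending to a quality specification using a log-based
# viscosity blending index.  Component data, costs, and specs are
# illustrative assumptions, not real refinery figures.
import math
from scipy.optimize import linprog

# Three components: cost per tonne, sulphur wt%, viscosity in cSt.
costs     = [200.0, 180.0, 150.0]
sulphur   = [0.01, 0.04, 0.09]    # assumed to blend linearly by mass
viscosity = [2.0, 8.0, 40.0]      # blends via a log index (assumption)

demand      = 100.0   # tonnes of finished product required
max_sulphur = 0.05    # wt% specification
max_visc    = 12.0    # cSt specification

# Blending index: ln(viscosity) is taken to blend linearly, so the
# nonlinear viscosity spec becomes a linear constraint on the index.
visc_index = [math.log(v) for v in viscosity]

A_ub = [sulphur, visc_index]                       # quality specs
b_ub = [max_sulphur * demand, math.log(max_visc) * demand]
A_eq = [[1.0, 1.0, 1.0]]                           # material balance
b_eq = [demand]

res = linprog(costs, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3)
print("blend (tonnes):", res.x, "cost:", res.fun)
```

With these numbers the sulphur specification is the binding constraint and the cheapest component is rationed accordingly; every constraint, including the viscosity limit, remains linear in the decision variables, which is exactly why the index trick keeps the traditional model an LP.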
Horses for Courses

We can now see that both the traditional model and the nonlinear model have their uses at different places in the hierarchy of models. The traditional approach is over-optimistic, being very selective and producing unrealistic schedules; but, being linear, it always produces the guaranteed optimum and provides useful sensitivity analysis. This makes it valuable in the longer-term models, where the data are in any case uncertain and the model's aim is to provide guidance to decision makers. The nonlinear approach tends towards under-optimization, averaging everything. Its recommendations can actually be implemented, which makes it attractive for short-term scheduling. But things do change with time, and so the refinery schedulers still have work to do to disaggregate the recommendations and improve on them.

Concluding Remarks

Real refineries are highly complex. Building a model, whether traditional or nonlinear, is a large undertaking involving man-years of effort, even when a dedicated refinery modelling package is used. The problem lies in obtaining the data and deciding how to use them. Models must reflect the available data and be designed to assist the planners and schedulers in doing their jobs. For medium- and long-term models, forecast data must inevitably be used, for instance for prices and demand. Such forecasts are by definition unreliable. Modellers need to keep this in mind and remember that there is a limit to what they can achieve.

Related articles include Planning and Scheduling in Oil Refineries and Modelling Oil Refineries using Linear Programming. To find other articles, refer to the MP in Action page.
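The SLP idea described under "The Nonlinear Approach" can be illustrated with a minimal "freeze the quality, solve an LP, update the quality" recursion on Haverly's classic pooling problem. The problem data below are Haverly's well-known textbook figures, not anything from the talk, and the naive recursion shown is only a sketch of one SLP variant (practical implementations add step bounds and damping).

```python
# Successive Linear Programming (SLP) sketch on Haverly's classic
# pooling problem -- an illustrative toy, not BP's actual model.
# A pool receives crudes A (3% S, $6) and B (1% S, $16); its output
# and a direct stream C (2% S, $10) blend into products X (<= 2.5% S,
# $9, max 100) and Y (<= 1.5% S, $15, max 200).
from scipy.optimize import linprog

def solve_fixed_quality(q):
    """Solve the blending LP with the pool sulphur quality frozen at q."""
    # Variables: a, b (pool feeds), px, py (pool to X/Y), cx, cy (C to X/Y)
    c = [6, 16, -9, -15, 1, -5]          # negative of profit
    A_eq = [[1, 1, -1, -1, 0, 0]]        # pool material balance
    b_eq = [0]
    A_ub = [
        [0, 0, 1, 0, 1, 0],              # demand for X <= 100
        [0, 0, 0, 1, 0, 1],              # demand for Y <= 200
        [0, 0, q - 2.5, 0, -0.5, 0],     # sulphur spec on X (q frozen)
        [0, 0, 0, q - 1.5, 0, 0.5],      # sulphur spec on Y (q frozen)
    ]
    b_ub = [100, 200, 0, 0]
    return linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                   bounds=[(0, None)] * 6)

q = 1.0                                   # initial guess: pool = pure B
for _ in range(20):
    res = solve_fixed_quality(q)
    a, b = res.x[0], res.x[1]
    # Recompute the pool quality implied by the chosen feed mix.
    q_new = (3 * a + 1 * b) / (a + b) if a + b > 1e-9 else q
    if abs(q_new - q) < 1e-6:
        break
    q = q_new

print(f"pool quality {q:.2f}% S, profit ${-res.fun:.0f}")
```

From this starting guess the recursion settles on a well-known local solution of this problem (profit $100), whereas the global optimum is $400, obtained by running the pool on crude B alone. This illustrates the trade-off drawn above: the nonlinear approach carries no guarantee of a global optimum, unlike the purely linear traditional model.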