
Process Modeling, Simulation and Control for Ch...

The aim of this chapter is to show the reader how to develop a dynamic simulation for a distillation tower and its control from first principles. The different classes of dynamic distillation models and various approaches to solving these models will be presented. The author hopes to dispel the myth that modelling and simulation of distillation dynamics must be difficult and complex.

Dynamic modelling and simulation have proven to be insightful and productive process engineering tools. They can be used to design a distillation process that will produce quality products in the most economic fashion possible, even under undesirable process disturbances. Working dynamic models provide a process engineering tool with a long and useful life.

Dynamic simulation can be used early in a project to aid in the process and control system design. It ensures that the process is operable and can meet product specifications when the process varies from steady-state design. Later in the project the simulation can be used to complete the detailed control system design and solve plantwide operability problems. After the project, the same simulation is useful for training. Years later, as product and economic conditions change, the simulation can be used for plant improvement programs.

A dynamic model is needed to study and design composition controls. To do this we will develop a sufficiently rigorous tray-by-tray model with nonideal vapor-liquid stage equilibrium. Proportional-integral feedback controllers will control product compositions or tray temperatures. Vapor flow and pressure dynamics can often be assumed negligible. I will discuss how systems with vapor hydraulic pressure dynamics can be modelled and simulated. My approach is based on fundamental process engineering principles: only the relationships necessary to solve the problem should be modelled. Most importantly, these are models that any chemical engineer can easily simulate. They are suitable for a small personal computer, in whatever programming language you prefer.
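The control structure described above can be sketched in a few lines. The fragment below is a minimal illustration, not the chapter's model: the column is collapsed to a single first-order composition lag, and every gain, time constant, and tuning value is hypothetical.

```python
# Minimal sketch of a PI composition-control loop (illustrative only).
# The column is reduced to one first-order lag; all numbers are hypothetical.

def simulate(t_end=100.0, dt=0.05, setpoint=0.98):
    Kc, tau_i = 20.0, 5.0       # PI tuning (hypothetical)
    Kp, tau_p = 0.05, 8.0       # process gain and time constant (hypothetical)
    x = 0.95                    # distillate mole fraction, initial steady state
    reflux_bias = 1.0           # steady-state reflux flow
    integral = 0.0
    t = 0.0
    while t < t_end:
        error = setpoint - x
        integral += error * dt
        # PI control law: manipulate reflux to drive composition to setpoint
        reflux = reflux_bias + Kc * (error + integral / tau_i)
        # first-order composition response to reflux changes (explicit Euler)
        dxdt = (Kp * (reflux - reflux_bias) - (x - 0.95)) / tau_p
        x += dxdt * dt
        t += dt
    return x

final = simulate()  # composition settles near the 0.98 setpoint
```

A full tray-by-tray model would replace the one-line process approximation with a component balance on every stage, but the PI law itself would be unchanged.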

Modeling and Simulation for Chemical Engineers: Theory and Practice begins with an introduction to the terminology of process modeling and simulation. Chapters 2 and 3 cover fundamental and constitutive relations, while Chapter 4 on model formulation builds on these relations. Chapters 5 and 6 introduce the advanced techniques of model transformation and simplification. Chapter 7 deals with model simulation, and the final chapter reviews important mathematical concepts.

Figure 9.9 shows the properties for the first and second Assign steps in the TreatmentRooms_Processing add-on process. The basic pattern of marking an entity with the simulation time at one point in the model, then recording the interval between the marked time and a later time in a tally statistic, is quite common whenever a time interval is the performance metric of interest. While Simio and other simulation packages automatically track some intervals (e.g., the interval between when an entity is created and when it is destroyed), no package can know which intervals will be important for a specific problem. As such, you need to know how to tell the model to track these types of user-defined statistics yourself.
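Outside of Simio, the same mark-and-tally pattern is easy to reproduce by hand. The sketch below is a plain-Python illustration with hypothetical arrival and service rates: each entity is stamped with the simulation time at one point (the "Assign" step) and the elapsed interval is recorded in a user-defined tally at another (the "Tally" step).

```python
# Hand-rolled mark-and-tally sketch (not Simio); rates are hypothetical.
import random

random.seed(42)

tally = []                               # user-defined statistic: time in treatment
clock = 0.0
for _ in range(1000):
    clock += random.expovariate(1.0)     # next entity arrives (mean 1.0 apart)
    mark = clock                         # "Assign": stamp the entity with the time
    service = random.expovariate(0.5)    # treatment duration (mean 2.0)
    done = mark + service
    tally.append(done - mark)            # "Tally": record the elapsed interval

mean_time = sum(tally) / len(tally)      # the performance metric of interest
```

The tally's mean converges toward the mean service time (2.0 here), which is exactly the kind of statistic the Assign/Tally pair in the figure is set up to collect.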

Modeling and simulation are especially valuable for testing conditions that might be difficult to reproduce with hardware prototypes alone, especially in the early phase of the design process when hardware may not be available. Iterating between modeling and simulation can improve the quality of the system design early, thereby reducing the number of errors found later in the design process.

Common representations for system models include block diagrams, schematics, and state diagrams. Using these representations you can model mechatronic systems, control software, signal processing algorithms, and communications systems. To learn more about modeling and simulation with block diagrams, see Simulink.

This is the first complete introduction to process control that fully integrates software tools, enabling professionals and students to master critical techniques hands-on through computer simulations based on the popular MATLAB environment. Process Control: Modeling, Design, and Simulation teaches the field's most important techniques, behaviors, and control problems through practical examples, supplemented by extensive exercises, with detailed derivations, relevant software files, and additional techniques available on a companion Web site.

Bequette walks step by step through the development of control instrumentation diagrams for an entire chemical process, reviewing common control strategies for individual unit operations, then discussing strategies for integrated systems. The book also includes 16 learning modules demonstrating how to use MATLAB and SIMULINK to solve several key control problems, ranging from robustness analyses to biochemical reactors, biomedical problems to multivariable control.

B. WAYNE BEQUETTE is Professor of Chemical Engineering at Rensselaer Polytechnic Institute. His teaching and research interests are in the areas of process systems and control engineering for biomedical systems, pharmaceuticals, chromatography, and complex chemical processes. He is Associate Editor of Automatica, a journal of the International Federation of Automatic Control, and General Chair for the 2003 American Control Conference. He is the author of Process Dynamics: Modeling, Analysis, and Simulation (Prentice Hall).

The techniques we describe are based on the "stable fluids" method of Stam 1999. However, while Stam's simulations used a CPU implementation, we choose to implement ours on graphics hardware because GPUs are well suited to the type of computations required by fluid simulation. The simulation we describe is performed on a grid of cells. Programmable GPUs are optimized for performing computations on pixels, which we can consider to be a grid of cells. GPUs achieve high performance through parallelism: they are capable of processing multiple vertices and pixels simultaneously. They are also optimized to perform multiple texture lookups per cycle. Because our simulation grids are stored in textures, this speed and parallelism is just what we need.

A CPU implementation of the simulation performs the steps in the algorithm by looping, using a pair of nested loops to iterate over each cell in the grid. At each cell, the same computation is performed. GPUs do not have the capability to perform this inner loop over each texel in a texture. However, the fragment pipeline is designed to perform identical computations at each fragment. To the programmer, it appears as if there is a processor for each fragment, and that all fragments are updated simultaneously. In the parlance of parallel programming, this model is known as single instruction, multiple data (SIMD) computation. Thus, the GPU analog of computation inside nested loops over an array is a fragment program applied in SIMD fashion to each fragment.
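The contrast can be made concrete with a small example. The function below is a CPU-style version of one diffusion (Jacobi) step over a grid of cells, written with the pair of nested loops described above; the grid size, stencil, and diffusion coefficient are illustrative, not taken from the text. On the GPU, the two loops disappear and the loop body alone becomes the fragment program, applied in SIMD fashion to every texel at once.

```python
# CPU version of one diffusion step: nested loops, identical computation
# at every cell. On a GPU, only the loop body would remain, running as a
# fragment program over every texel simultaneously.

N = 8
grid = [[0.0] * N for _ in range(N)]
grid[N // 2][N // 2] = 1.0              # a spot of "density" in the center

def jacobi_step(src, alpha=0.2):
    dst = [[0.0] * N for _ in range(N)]
    for i in range(1, N - 1):           # the nested loops a GPU does not need
        for j in range(1, N - 1):
            # the per-cell stencil: this body is the "fragment program"
            dst[i][j] = src[i][j] + alpha * (
                src[i - 1][j] + src[i + 1][j]
                + src[i][j - 1] + src[i][j + 1]
                - 4.0 * src[i][j]
            )
    return dst

grid = jacobi_step(grid)                # density spreads to the four neighbors
```

Note that the stencil conserves the total density: the center cell loses exactly what its four neighbors gain, which is a quick sanity check for either implementation.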

Computer simulation is the process of mathematical modelling, performed on a computer, which is designed to predict the behaviour of, or the outcome of, a real-world or physical system. The reliability of some mathematical models can be determined by comparing their results to the real-world outcomes they aim to predict. Computer simulations have become a useful tool for the mathematical modeling of many natural systems in physics (computational physics), astrophysics, climatology, chemistry, biology and manufacturing, as well as human systems in economics, psychology, social science, health care and engineering. Simulation of a system is represented as the running of the system's model. It can be used to explore and gain new insights into new technology and to estimate the performance of systems too complex for analytical solutions.[1]

A computer model is the algorithms and equations used to capture the behavior of the system being modeled. By contrast, computer simulation is the actual running of the program that contains these equations or algorithms. Simulation, therefore, is the process of running a model. Thus one would not "build a simulation"; instead, one would "build a model (or a simulator)", and then either "run the model" or equivalently "run a simulation".

Computer simulation developed hand-in-hand with the rapid growth of the computer, following its first large-scale deployment during the Manhattan Project in World War II to model the process of nuclear detonation; that first simulation modeled 12 hard spheres using a Monte Carlo algorithm. Computer simulation is often used as an adjunct to, or substitute for, modeling systems for which simple closed-form analytic solutions are not possible. There are many types of computer simulations; their common feature is the attempt to generate a sample of representative scenarios for a model in which a complete enumeration of all possible states would be prohibitive or impossible.[7]
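A toy example (not from the source) makes that common feature concrete: rather than enumerating all 6^10, or roughly 60 million, outcomes of ten dice, a Monte Carlo run estimates the probability that their total exceeds 40 from a modest sample of representative scenarios.

```python
# Monte Carlo sampling in place of complete enumeration (illustrative).
import random

random.seed(0)

def monte_carlo_estimate(trials=100_000):
    hits = 0
    for _ in range(trials):
        # one representative scenario: roll ten dice
        total = sum(random.randint(1, 6) for _ in range(10))
        if total > 40:
            hits += 1
    return hits / trials

p = monte_carlo_estimate()   # estimate of P(sum of ten dice > 40)
```

One hundred thousand samples pin the probability down to within a fraction of a percent, at a tiny fraction of the cost of visiting all 60 million states; that trade-off is the essence of the Monte Carlo approach described above.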

The process control and monitoring laboratories contain a number of real-time instrumented experiments for online model-based control and fault diagnosis. The specific experiments include emulsion polymerization, complex quadruple-tank level control and other systems. All of these units are equipped with state-of-the-art control hardware and software systems.

One of the most rapidly growing aspects of research within the department is process modeling. Research efforts include computer control and modeling of biochemical reactors, development and modeling of novel separations processes, modeling of transport in living systems, modeling and simulation of polymer processes, and elucidation and modeling of reaction pathways. To support research in chemical engineering analysis, the department maintains its own computer laboratory. Numerous microcomputers are in use in our research laboratories for both data acquisition and modeling; several Beowulf clusters of high-performance PCs have recently been built, and the department also makes extensive use of University and national computing facilities.
