Queuing theory is the mathematical study of waiting lines, or queues. It applies to problems such as scheduling, resource allocation, and traffic flow, and is commonly used in:

- Operations research
- Industrial engineering
- Network design
- Computer architecture

You can explore queuing theory by modeling, measuring, and analyzing the arrival times, wait times, and service times of queuing systems.
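As a minimal sketch of that workflow (in Python rather than the MATLAB products listed below, with arrival rate λ = 0.8 and service rate μ = 1.0 assumed purely for illustration), the following simulates a single-server M/M/1 queue and compares the measured mean time in system with the closed-form value 1/(μ − λ):

```python
import random

def simulate_mm1(lam, mu, num_customers, seed=0):
    """Simulate an M/M/1 queue and return the mean time in system.

    Interarrival times are exponential with rate lam (Poisson arrivals),
    service times are exponential with rate mu; one server, FIFO order.
    """
    rng = random.Random(seed)
    t_arrival = 0.0           # arrival time of the current customer
    t_free = 0.0              # time at which the server next becomes free
    total_time_in_system = 0.0
    for _ in range(num_customers):
        t_arrival += rng.expovariate(lam)
        start = max(t_arrival, t_free)         # wait if the server is busy
        t_free = start + rng.expovariate(mu)   # departure time
        total_time_in_system += t_free - t_arrival
    return total_time_in_system / num_customers

lam, mu = 0.8, 1.0    # assumed rates; utilization rho = lam/mu = 0.8
w_sim = simulate_mm1(lam, mu, 200_000)
w_theory = 1.0 / (mu - lam)   # closed-form mean time in system for M/M/1
print(f"simulated W = {w_sim:.3f}, theoretical W = {w_theory:.3f}")
```

With enough customers the simulated mean converges on the theoretical value of 5.0 for these rates; shorter runs show the high variance typical of heavily loaded queues.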

For details, see MATLAB®, Statistics Toolbox™, and SimEvents®.

- M/M/1 Queuing Theory (Example)
- M/D/1 Queuing System (Example)
- G/G/1 Queuing System and Little's Law (Example)
- Generating Entities as a Markov-Modulated Poisson Process (Example)
- Modeling Variable Size Buffers Using Queues (Example)
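The first three examples above rest on standard closed-form results. A short sketch (rates λ = 0.8 and μ = 1.0 are assumed, not taken from this page) of the M/M/1 steady-state formulas, the M/D/1 queue wait from the Pollaczek-Khinchine formula, and the Little's Law identity L = λW:

```python
def mm1_metrics(lam, mu):
    """Closed-form M/M/1 steady-state metrics (requires rho = lam/mu < 1)."""
    rho = lam / mu
    assert rho < 1, "queue is unstable"
    L = rho / (1 - rho)           # mean number in system
    W = 1 / (mu - lam)            # mean time in system
    Wq = rho / (mu - lam)         # mean wait in queue
    Lq = rho**2 / (1 - rho)       # mean number waiting in queue
    return L, W, Wq, Lq

def md1_wq(lam, mu):
    """Mean queue wait for M/D/1 (deterministic service),
    via the Pollaczek-Khinchine formula with E[S^2] = 1/mu^2."""
    rho = lam / mu
    return rho / (2 * mu * (1 - rho))

lam, mu = 0.8, 1.0    # assumed rates for illustration
L, W, Wq, Lq = mm1_metrics(lam, mu)
print(f"M/M/1: L = {L:.2f}, W = {W:.2f}")   # Little's Law: L equals lam * W
print(f"M/D/1: Wq = {md1_wq(lam, mu):.2f}") # half the M/M/1 queue wait
```

Deterministic service removes the service-time variance, which is why the M/D/1 queue wait is exactly half the M/M/1 value at the same utilization.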

- SimEvents (Documentation)
- Basic Queues and Servers in SimEvents (Documentation)
- Modeling Queues and Servers in SimEvents (Documentation)
- Routing Techniques in SimEvents (Documentation)
- Statistical Tools for Discrete-Event Simulation in SimEvents (Documentation)
- Statistics Toolbox (Documentation)
- Resource Allocation Optimization with Mixed Integer Optimization (Documentation)

*See also*: *discrete event simulation*, *SimEvents*, *Statistics Toolbox*, *genetic algorithm*, *random number*, *system design and simulation*