This book provides an accessible introduction to the mathematical methods of quantum optics. Starting from first principles, it shows how a given system of atoms and a field is modelled mathematically. The method of eigenfunction expansion and the Lie algebraic method for solving equations are outlined, and classes of equations that are exactly solvable analytically are identified. The text also discusses consequences of the Lie algebraic properties of Hamiltonians, such as the classification of their states as coherent, classical, or non-classical based on the generalized uncertainty relation and the concept of quasi-probability distributions.
A unified approach is developed for determining the dynamics of two-level and three-level atoms interacting with combinations of quantized fields under certain conditions. Simple methods for solving a variety of linear and nonlinear dissipative master equations are also given.