Since 1757, when Roger Joseph Boscovich addressed the fundamental mathematical problem of determining the parameters that best fit observational equations, a large number of estimation methods have been proposed and developed for linear regression. Four of the most commonly used are least absolute deviations, least squares, trimmed least squares, and M-regression. Each of these methods has its own competitive edge, but none is good for all purposes. This book focuses on the construction of adaptive combinations of several pairs of these estimation methods. The purpose of adaptive methods is to help users make an objective choice between two estimators and to combine their desirable properties. With this single objective in mind, the book describes in detail the theory, methods, and algorithms for combining several pairs of estimation methods. It will be of interest to those who wish to perform regression analyses beyond the least squares method, to researchers in robust statistics, and to graduate students who wish to learn asymptotic theory for linear models. In addition to a review of least squares, ridge, least absolute deviations, and the M-, L-, and GM-regressions, the book covers four new estimators: least absolute deviations with least squares regression, least absolute deviations with M-regression, least absolute deviations with trimmed least squares, and least squares with trimmed least squares regression. The methods presented are illustrated with numerical examples based on real data. The S-PLUS computer programs for the procedures presented are available for data analysts working on applications in industry, economics, and the experimental sciences.
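To see why no single estimator is good for all purposes, consider the two oldest methods named above. The sketch below (a minimal illustration in Python/NumPy, not code from the book; the function names and the IRLS approximation of least absolute deviations are this example's own choices) fits a line by least squares and by least absolute deviations to data containing one gross outlier:

```python
import numpy as np

def least_squares_fit(x, y):
    """Ordinary least squares: minimizes the sum of squared residuals."""
    Xd = np.column_stack([np.ones(len(x)), x])       # add intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta                                       # (intercept, slope)

def least_absolute_deviations_fit(x, y, iters=100, eps=1e-8):
    """LAD via iteratively reweighted least squares, a standard approximation:
    each pass solves a weighted LS problem with weights ~ 1/|residual|."""
    Xd = np.column_stack([np.ones(len(x)), x])
    beta = np.linalg.lstsq(Xd, y, rcond=None)[0]      # LS start
    for _ in range(iters):
        r = np.abs(y - Xd @ beta)
        w = 1.0 / np.maximum(r, eps)                  # clamp to avoid 1/0
        beta = np.linalg.solve(Xd.T @ (w[:, None] * Xd), Xd.T @ (w * y))
    return beta

# Data on the line y = 2x + 1, with the last response grossly corrupted.
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y[9] = 100.0                                          # single outlier

b_ls = least_squares_fit(x, y)
b_lad = least_absolute_deviations_fit(x, y)
print("LS  slope:", b_ls[1])   # pulled far from 2 by the outlier
print("LAD slope:", b_lad[1])  # stays near the true slope 2
```

The squared-error criterion lets one aberrant observation dominate the fit, while the absolute-deviation criterion largely ignores it; on clean Gaussian data the ranking reverses, least squares being the efficient choice. Adaptive procedures of the kind this book develops aim to choose between, or blend, such pairs based on the data at hand.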