Analytical methods for evaluating the performance of digital transmission systems are often difficult to develop. As a result, digital computer simulations based on Monte Carlo methods are commonly used to obtain realistic performance estimates for such systems. Unfortunately, conventional Monte Carlo methods often require a large number of simulation runs to produce accurate estimates. Variance reduction techniques known as importance sampling methods have been employed successfully in the communications and statistics literature to reduce substantially the computational burden of brute-force Monte Carlo. In this paper, our objective is to illustrate the use of large deviations theory as a powerful tool for designing computationally efficient and flexible importance sampling schemes. As an application, we consider the simulation of fiber-optic transmission systems.
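To make the idea concrete, the following is a minimal sketch (not taken from the paper) of importance sampling for a rare-event probability. It estimates the Gaussian tail probability P(X > a) for X ~ N(0,1), a stand-in for an error probability, both by brute-force Monte Carlo and by sampling from an exponentially tilted density N(a,1), the mean shift that large deviations theory identifies as asymptotically optimal for this problem. The function names and parameter choices are illustrative assumptions, not the authors' scheme.

```python
import math
import random

def crude_mc(a, n, rng):
    # Brute-force Monte Carlo: count how often X ~ N(0,1) exceeds a.
    # For a = 4 the event has probability ~3e-5, so most runs see no hits.
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > a)
    return hits / n

def tilted_is(a, n, rng):
    # Importance sampling with the large-deviations mean shift:
    # draw Y ~ N(a, 1), so the rare event {Y > a} is hit about half
    # the time, and reweight each hit by the likelihood ratio
    # phi(y) / phi(y - a) = exp(a**2 / 2 - a * y).
    total = 0.0
    for _ in range(n):
        y = rng.gauss(a, 1.0)
        if y > a:
            total += math.exp(a * a / 2.0 - a * y)
    return total / n

if __name__ == "__main__":
    rng = random.Random(1)
    a, n = 4.0, 100_000
    exact = 0.5 * math.erfc(a / math.sqrt(2.0))  # P(N(0,1) > 4), about 3.17e-5
    print("crude MC :", crude_mc(a, n, rng))
    print("tilted IS:", tilted_is(a, n, rng))
    print("exact    :", exact)
```

With the same sample budget, the tilted estimator typically lands within a few percent of the exact value, while the crude estimator's relative error is enormous because it sees only a handful of hits; this gap is exactly the computational burden that importance sampling is designed to remove.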