Abstract
Thin films in optical coatings for laser applications must have low optical absorption to ensure a high laser damage threshold [1]. The heat generated by even very low but finite absorption (A ≤ 10⁻⁶) of the laser pulse energy must be dissipated quickly and efficiently to avoid thin-film damage by local overheating (melting, evaporation). The thermal conductivity of the films toward the substrate therefore needs to be as high as possible. Lateral thermal conductivity, by comparison, is relatively unimportant: heat is generated over the full area of the laser spot (several hundred micrometers or even several millimeters across), so the distance from an individual point-like heat source within the spot to the cooler, non-irradiated film is large compared to its distance to the substrate, which is an efficient heat sink. Despite the considerable influence of absorption and thermal conductivity on damage behavior, few reliable data are available. Early measurements on evaporated thin films showed thermal conductivities one to two orders of magnitude lower than those of the same bulk material [2-4].
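To make this geometric argument concrete, the sketch below compares characteristic heat-diffusion times through the film and laterally across the laser spot, using the one-dimensional estimate t ≈ L²/D. All material values and dimensions are assumed, order-of-magnitude numbers chosen only for illustration; they are not taken from the paper.

```python
# Illustrative sketch (not from the paper): compare characteristic
# heat-diffusion times through the film vs. laterally across the spot.
# All parameter values below are assumptions typical of an evaporated
# oxide film, chosen only to show the orders of magnitude involved.

k   = 0.5      # W/(m K)  -- assumed film conductivity, ~1-2 orders of
               #             magnitude below bulk, as the abstract notes
rho = 3000.0   # kg/m^3   -- assumed film density
c   = 700.0    # J/(kg K) -- assumed specific heat
D   = k / (rho * c)        # thermal diffusivity, m^2/s

d_film = 200e-9   # m -- assumed film thickness (distance to substrate)
r_spot = 300e-6   # m -- assumed spot radius (distance to cool film)

# Characteristic time to diffuse a distance L is t ~ L^2 / D.
t_vertical = d_film**2 / D   # heat reaching the substrate
t_lateral  = r_spot**2 / D   # heat reaching the non-irradiated film

print(f"diffusivity D      = {D:.2e} m^2/s")
print(f"t (to substrate)   = {t_vertical:.2e} s")
print(f"t (across spot)    = {t_lateral:.2e} s")
print(f"lateral / vertical = {t_lateral / t_vertical:.1e}")
```

With these assumed numbers, the lateral diffusion time exceeds the vertical one by the squared ratio of spot radius to film thickness, roughly six orders of magnitude, which is why heat flow toward the substrate dominates the cooling of the irradiated film.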
© 1992 Optical Society of America