Ultraviolet radiation (UVR) has been associated with various health outcomes, including skin cancers, vitamin D insufficiency, and multiple sclerosis. UVR exposure has been difficult to measure, traditionally relying on subject recall. We investigated trends in satellite-derived UVB from 1978 to 2014 within the continental United States (US) to inform UVR exposure assessment and to determine the potential magnitude of misclassification bias introduced by ignoring these trends. Monthly UVB data remotely sensed from several NASA satellites were used to investigate changes over time using linear regression with a harmonic function. Linear regression models fit to local geographic areas were combined to make inferences across the entire study area using a global field significance test. Temporal trends were examined across all years and separately for each satellite type because of documented differences in UVB estimation. UVB increased from 1978 to 2014 in 48% of local tests. The largest increase was found in Western Nevada (0.145 kJ/m² per five-year increment), a total 30-year increase of 0.87 kJ/m². Even this largest change represented only 17% of total ambient exposure in an average January and 2% in an average July in Western Nevada. The observed trends correspond to cumulative UVB changes of less than one month of exposure, which are not meaningful when estimating human exposure. These small trends should also be interpreted with caution, because uncertainty in the satellite parameter inputs (ozone and climatological factors) can shift derived satellite UVR by nearly 20% relative to ground-level measurements. If the observed trends hold, satellite-derived UVB data may reasonably estimate ambient UVB exposures even for outcomes with long latency periods that predate the satellite record.
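The trend model described above, linear regression with a harmonic function, can be sketched as follows. This is an illustrative reconstruction, not the study's actual code or data: the simulated series, noise level, and single annual harmonic pair are assumptions. The linear term captures the long-term trend while the sine/cosine pair absorbs the annual UVB cycle.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly UVB series, Jan 1978 through Dec 2014 (444 months)
n_months = 444
t = np.arange(n_months)

# Simulated ambient UVB (kJ/m^2): annual cycle + small positive trend + noise.
# The slope is chosen to mimic the largest reported trend, 0.145 kJ/m^2
# per five-year (60-month) increment.
true_slope = 0.145 / 60.0
uvb = (4.0
       + 3.0 * np.sin(2 * np.pi * t / 12 - np.pi / 2)  # seasonal cycle
       + true_slope * t                                 # long-term trend
       + rng.normal(0.0, 0.3, n_months))                # measurement noise

# Design matrix: intercept, linear time, and one annual harmonic pair
X = np.column_stack([
    np.ones(n_months),
    t,
    np.sin(2 * np.pi * t / 12),
    np.cos(2 * np.pi * t / 12),
])
beta, *_ = np.linalg.lstsq(X, uvb, rcond=None)

# Convert the per-month slope back to a per-five-year increment
slope_per_5yr = beta[1] * 60
print(f"estimated trend: {slope_per_5yr:.3f} kJ/m^2 per 5 years")
```

In the study, a model of this form would be fit separately to each local geographic area, with the collection of local slope tests then evaluated jointly by a global field significance test to guard against chance findings across many locations.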