Adding visual speech information (i.e. lip movements) to auditory speech information (i.e. voice) can enhance speech comprehension in younger and older adults while simultaneously reducing electrical brain responses, as measured by event-related potentials (ERPs). Thus, the brain seems to allocate fewer resources to speech comprehension when audio-visual (AV) speech information is available. This study examined whether the brain resources saved at the perceptual level during AV presentation allow younger and older adults to perform better on a working memory task, and whether older adults benefit to the same extent as younger adults. Twenty older adults and 23 younger adults completed an n-back working memory task (0-, 1-, 2-, 3-back) under visual-only (V-only), auditory-only (A-only), and AV conditions while ERPs were recorded. The results showed a decrease in reaction time across all memory loads and an improvement in accuracy for 2-back and 3-back during AV compared to the V-only and A-only conditions. In addition, ERP analysis from a sample of 12 younger and 12 older adults showed a smaller N1 amplitude for the older group during AV compared to A-only presentation. The attenuation of N1, however, did not correlate with the behavioural data, nor did it show a relationship with changes in either the latency or the amplitude of P3, an ERP component that reflects working memory processes. Thus, despite clear behavioural improvements on the working memory task during AV speech presentation, a more direct relationship between facilitation of sensory processing and working memory improvement was not identified.