On many contentious political topics, voters' beliefs fail to reach consensus—or continue to polarize—despite exposure to abundant information. Individuals interpret new evidence in ways that reinforce their existing beliefs, and sometimes hold those beliefs even more strongly when confronted with evidence to the contrary. Drawing on recent findings on working memory, this paper proposes a cognitively grounded deviation from Bayesian updating that accounts for confirmation bias, the backfire effect, and anchoring in a unified framework. When a message is informative on multiple dimensions, the order in which beliefs are updated leads to different posteriors—even under common priors and common information. In environments sufficiently rich in misinformation, sequential updaters exhibit confirmation bias regardless of the evidence they encounter. I apply the model to show how a principal can exploit confirmation bias through misinformation, and how voters' over-trust of incumbents can eliminate pandering in a political agency setting.
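The order-dependence of sequential updating can be illustrated with a minimal two-dimensional toy example. This is a sketch under assumptions of my own (binary state on each of two dimensions, independent priors, one signal informative about both, and a sequential updater who revises one marginal at a time and uses the already-revised marginal to interpret the signal on the next dimension); it is not the paper's model, only a demonstration that processing order alone can separate posteriors from a common prior and common information.

```python
from itertools import product

def bayes_joint(prior_a, prior_b, lik):
    """Full Bayesian update on the joint state; returns posterior marginals."""
    joint = {(a, b): prior_a[a] * prior_b[b] * lik[a][b]
             for a, b in product((0, 1), repeat=2)}
    z = sum(joint.values())
    post_a = [sum(joint[(a, b)] for b in (0, 1)) / z for a in (0, 1)]
    post_b = [sum(joint[(a, b)] for a in (0, 1)) / z for b in (0, 1)]
    return post_a, post_b

def update_one(prior_x, belief_other, lik_xo):
    """Revise one marginal, averaging the likelihood over the current
    belief about the other dimension."""
    w = [prior_x[x] * sum(belief_other[o] * lik_xo(x, o) for o in (0, 1))
         for x in (0, 1)]
    z = sum(w)
    return [wi / z for wi in w]

def sequential(prior_a, prior_b, lik, a_first=True):
    """Sequential updater: revises dimension 1, then uses the revised
    marginal when interpreting the same signal on dimension 2."""
    if a_first:
        post_a = update_one(prior_a, prior_b, lambda a, b: lik[a][b])
        post_b = update_one(prior_b, post_a, lambda b, a: lik[a][b])
    else:
        post_b = update_one(prior_b, prior_a, lambda b, a: lik[a][b])
        post_a = update_one(prior_a, post_b, lambda a, b: lik[a][b])
    return post_a, post_b

# Hypothetical numbers: flat priors and one signal s with likelihoods P(s | a, b).
prior_a = prior_b = [0.5, 0.5]
lik = [[0.9, 0.1],   # P(s | a=0, b=0), P(s | a=0, b=1)
       [0.2, 0.6]]   # P(s | a=1, b=0), P(s | a=1, b=1)

print("Bayesian joint:", bayes_joint(prior_a, prior_b, lik))
print("a processed first:", sequential(prior_a, prior_b, lik, a_first=True))
print("b processed first:", sequential(prior_a, prior_b, lik, a_first=False))
```

In this toy version, each order leaves the first-processed marginal at its Bayesian value but distorts the second, so the two orders end at different posteriors from the same prior and the same signal.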