Somewhere along the cortical hierarchy, behaviorally relevant information is distilled from raw sensory inputs. We examined how this transformation progresses across multiple levels of the hierarchy by comparing neural representations in visual, temporal, parietal, and frontal cortices of monkeys categorizing stimuli in three visual domains (shape, motion direction, color). Representations in visual areas MT and V4 were tightly linked to external sensory inputs. In contrast, prefrontal cortex (PFC) largely represented the abstracted behavioral relevance of stimuli (task rule, motion category, color category). Intermediate-level areas — posterior inferotemporal (PIT), lateral intraparietal (LIP), and frontal eye fields (FEF) — exhibited mixed representations. While the distribution of sensory information across areas aligned well with classical functional divisions — MT carried stronger motion information, whereas V4 and PIT carried stronger color and shape information — the distribution of categorical abstraction did not, suggesting that these areas may participate in different networks for stimulus-driven and cognitive functions. Paralleling these representational differences, the dimensionality of neural population activity decreased progressively from sensory to intermediate to frontal cortex. These results show how raw sensory representations are transformed into behaviorally relevant abstractions and suggest that the dimensionality of neural activity in higher cortical regions may be specific to the current task.
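The dimensionality comparison above can be illustrated with a hypothetical sketch (this is not the paper's analysis pipeline; the simulated data and the choice of metric are assumptions). One common measure of the dimensionality of population activity is the participation ratio, PR = (Σᵢ λᵢ)² / Σᵢ λᵢ², where λᵢ are the eigenvalues of the neuron-by-neuron covariance matrix; a population whose variance is spread across many latent dimensions yields a higher PR than one dominated by a few task-related dimensions.

```python
import numpy as np

def participation_ratio(activity):
    """Estimate dimensionality of a (n_trials, n_neurons) activity matrix."""
    cov = np.cov(activity, rowvar=False)   # neuron-by-neuron covariance
    eig = np.linalg.eigvalsh(cov)          # eigenvalue spectrum
    eig = np.clip(eig, 0.0, None)          # guard against tiny negative values
    return eig.sum() ** 2 / (eig ** 2).sum()

rng = np.random.default_rng(0)
n_trials, n_neurons = 500, 50

# Simulated "sensory-like" population: variance spread over many latent dims.
high_d = rng.normal(size=(n_trials, 10)) @ rng.normal(size=(10, n_neurons))
# Simulated "frontal-like" population: variance confined to few latent dims.
low_d = rng.normal(size=(n_trials, 2)) @ rng.normal(size=(2, n_neurons))

print("sensory-like PR:", participation_ratio(high_d))
print("frontal-like PR:", participation_ratio(low_d))
```

In this toy setting the "sensory-like" population produces a higher participation ratio than the "frontal-like" one, mirroring the reported decrease in dimensionality from sensory to frontal cortex; with real spiking data one would additionally need to account for trial counts, noise correlations, and estimation bias.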