Language is known to interact flexibly with non-verbal representations, but the processing mechanisms governing these interactions remain unclear. This article reviews general cognitive processes that operate across a range of tasks and stimulus types and argues that these processes may drive the interactions between language and cognition, whether those interactions occur cross-linguistically or within a single language. These general processes include goal-directed behaviour, reliance on context-relevant semantic knowledge and attunement to task demands. An overview of existing findings suggests that recourse to language in non-verbal or multi-modal tasks may depend on how well linguistic representations align with current task goals and demands. Progress in understanding these mechanisms requires theories that make specific processing predictions about how tasks and experimental contexts encourage or discourage access to linguistic knowledge. Systematic testing of alternative mechanisms is necessary to explain how and why linguistic information influences some cognitive tasks but not others.