Question from the Natural Language Processing - Fundamentals test

Tokenization is the process of splitting text into smaller units called tokens — typically words, groups of words (such as subword pieces or multi-word expressions), or individual characters.
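
For illustration, here is a minimal sketch of simple word-level tokenization using Python's standard `re` module; the function name `tokenize` and the regular expression are illustrative choices, not a specific library's API:

```python
import re

def tokenize(text: str) -> list[str]:
    """Split text into word tokens, keeping punctuation as separate tokens."""
    # \w+ matches runs of letters/digits/underscores (whole words);
    # [^\w\s] matches a single punctuation character so it becomes its own token.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Tokenization splits text into tokens."))
# ['Tokenization', 'splits', 'text', 'into', 'tokens', '.']
```

In practice, NLP toolkits and modern language models often go beyond whitespace or regex splitting and use subword tokenizers (e.g. BPE or WordPiece), which break rare words into smaller reusable pieces.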