Question from the Natural Language Processing - Fundamentals test

Tokenization is the process of splitting text into smaller units, called tokens, such as individual words or groups of words.
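As a rough illustration (not part of the test itself), a minimal word-level tokenizer can be sketched in Python using the standard library. The function name `tokenize` and the lowercase-and-split-on-non-word-characters strategy are illustrative choices, one common baseline among many:

```python
import re

def tokenize(text):
    # Minimal word tokenizer: lowercase the text, split on runs of
    # non-word characters, and drop any empty strings produced at
    # the edges of the split.
    return [tok for tok in re.split(r"\W+", text.lower()) if tok]

print(tokenize("Tokenization splits text into words."))
# → ['tokenization', 'splits', 'text', 'into', 'words']
```

Real NLP libraries use more sophisticated schemes (handling punctuation, contractions, or subword units), but the idea is the same: turn a string into a sequence of tokens.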

Easy

What is tokenization?

Author: Constantin