Split sentences into tokens in Python, with a code example. The standalone `import word_tokenize` in the original is invalid (word_tokenize is not a top-level module); the working imports are:

import nltk
from nltk import word_tokenize