tokenization
In text mining or full-text search, the process of splitting strings into meaningful units, whether at word boundaries or into morphemes or stems, so that related tokens can be grouped. For example, although "San Francisco" is two words, it could be treated as a single token.
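As a rough illustration of the definition above, the sketch below tokenizes a string at word boundaries while merging a known multi-word unit ("San Francisco") into a single token. The `MULTIWORD` table and the `tokenize` helper are hypothetical, not part of the source, and real tokenizers handle far more cases.

```python
import re

# Hypothetical table of multi-word units to treat as single tokens.
MULTIWORD = {"san francisco": "san_francisco"}

def tokenize(text):
    """Lowercase, merge known multi-word units, then split at word boundaries."""
    lowered = text.lower()
    for phrase, merged in MULTIWORD.items():
        lowered = lowered.replace(phrase, merged)
    # \w+ keeps runs of letters, digits, and underscores as one token
    return re.findall(r"\w+", lowered)

print(tokenize("I moved to San Francisco last year."))
# → ['i', 'moved', 'to', 'san_francisco', 'last', 'year']
```

Production systems typically also apply stemming or morphological analysis at this stage, so that "tokens", "token", and "tokenize" can be grouped under one unit.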
- Part of speech: noun
- Industry/Domain: Software
- Category: Operating systems
- Company: Microsoft
Author
- Maxiao