tokenization (Q15019): Difference between revisions

From LexBib
(Added qualifier: BabelNet Synset ID (P86): 13890904n)
(Created claim: skos:definition (P80): breaking a stream of text up into chunks for analysis or further processing)
Property / skos:definition: breaking a stream of text up into chunks for analysis or further processing
Property / skos:definition / rank: Normal rank

Revision as of 13:21, 14 August 2021

Language: English
Label: tokenization
Description: No description defined
Also known as: (none)

Statements

skos:definition (P80): breaking a stream of text up into chunks for analysis or further processing
    0 references
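As an illustration of the skos:definition above, the following is a minimal Python sketch of tokenization: breaking a stream of text up into chunks ("tokens") for analysis or further processing. The word/punctuation splitting rule is an illustrative assumption and is not part of the LexBib entry.

    # Minimal sketch, assuming Python 3: split a text stream into word and
    # punctuation chunks ("tokens"). The regex rule is illustrative only,
    # not drawn from the LexBib item.
    import re

    def tokenize(text: str) -> list[str]:
        # \w+ captures runs of word characters; [^\w\s] captures single
        # punctuation marks so they become separate tokens.
        return re.findall(r"\w+|[^\w\s]", text)

    print(tokenize("Tokenization breaks a stream of text up into chunks."))
    # ['Tokenization', 'breaks', 'a', 'stream', 'of', 'text', 'up', 'into', 'chunks', '.']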