3 votes — Compiler Design (ace-test-series, compiler-design, lexical-analysis, compiler-tokenization)
Asked by Tushar Shinde, Jan 24, 2016; edited Mar 6, 2019 by akash.dinkar12. 1.3k views.

Comments:
Pooja Palod (Jan 24, 2016): 10 tokens. [2 votes]
Tushar Shinde (Jan 24, 2016): 18 lexemes, and the token types are 3, viz. 'id', '.', and ','. [1 vote]
shivanisrivarshini (Jan 24, 2016): But aren't all the strings present inside " " tokens? There are 9 such strings. Am I wrong?
Tushar Shinde (Jan 24, 2016): Those " -- " lines are the syntax of the OUT statement and have nothing to do with tokens.
Vikram Bhat (Jan 24, 2016): You should mention what the regular expressions for the tokens are.
3 votes — Answer by shivanisrivarshini (Jan 24, 2016):
The tokens are |the|, |quick|, |brown|, |fox|, |jumps|, |over|, |the|, |lazy|, |dog|, |.| — so the total number of tokens is 10.

Comments:
shekhar chauhan (Jun 2, 2016): Can you explain how you found these tokens, and why you did not consider "(" and ")" as tokens? There is also a ";" at the end of every statement. Apart from this, which ones should be counted as tokens and which should not?
shivanisrivarshini (Jun 2, 2016): Here I considered only the output of those statements and counted its tokens; otherwise OUT, "(", etc. would also have to be counted.
Prateek kumar (Sep 2, 2016): Why are we not considering "," as a separate token in each OUT?
Brij gopal Dixit (May 31, 2019): Why are you not considering ',' as a token? I think there should be 18 tokens in the output.
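The original question text is not shown in this thread, so the following is only a sketch of the accepted answer's reading: the OUT statements print the sentence "the quick brown fox jumps over the lazy dog.", and we tokenize that printed output, treating a maximal run of letters as one word token and any other non-space character (here only the final '.') as a single-character token. The regex below is an assumption, not the token definition from the actual exam question (which, as Vikram Bhat notes, was never stated).

```python
import re

# Hypothetical reconstruction of the program's output (the question
# text itself is not available in this thread).
output = "the quick brown fox jumps over the lazy dog."

# Assumed token rules: a word is a maximal run of letters; any other
# non-whitespace character is its own single-character token.
tokens = re.findall(r"[A-Za-z]+|[^\sA-Za-z]", output)

print(tokens)       # 9 words plus the trailing '.'
print(len(tokens))  # 10, matching the accepted answer
```

Under these assumed rules the count is 10, agreeing with the answer above; under Tushar Shinde's reading (counting lexemes in the source program, including commas), the count would differ, which is exactly the disagreement in the comments.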