37 votes

The lexical analysis for a modern computer language such as Java needs the power of which one of the following machine models, in a necessary and sufficient sense?

(A) Finite state automata
(B) Deterministic pushdown automata
(C) Non-deterministic pushdown automata
(D) Turing machine

gatecse-2011 compiler-design lexical-analysis easy

asked Sep 29, 2014 by go_editor, edited Dec 4, 2017 by kenzou
Best answer (57 votes) — Answer: A

In a compiler, the lexical analyzer groups the character stream into lexemes and produces tokens as output for the parser. Tokens are specified by regular expressions, so a simple finite automaton is both necessary and sufficient.

answered Nov 23, 2014 by ankitrokdeonsns, edited Dec 4, 2017 by kenzou

Comments:

smartmeet: So where are Turing machines and PDAs used?

Angkit: An FA by itself handles everything at this level, so we need not go to a TM or PDA here.

rishu_darkshadow: A PDA is used during syntax analysis.

talha hashim:
Lexical analysis => DFA
Syntax analysis => PDA
Semantic analysis => Turing machine

rajinder singh: Why do we need a Turing machine in semantic analysis?

akash.dinkar12: @talha hashim Can you explain how a TM is used in semantic analysis?

Ashish Goyal: @akash.dinkar12 @rajinder singh We use a context-sensitive language in the semantic analysis phase, so a linear bounded automaton (a Turing machine restricted to linear space) is enough to implement it.

Ayush Upadhyaya: Tokens are recognized using a fixed set of rules called regular expressions. For each regular expression we can construct a DFA, so an FA is both necessary and sufficient; a PDA or TM would be more powerful than needed. The power of a PDA is needed in the syntax analysis phase.

rohith1001: @talha hashim For semantic analysis, an LBA (which accepts CSLs) would suffice. Semantic analysis checks for more meaningful properties, such as whether a variable is declared before its use, type checking, etc. Full Turing machines are mostly confined to theory. Correct me if I'm wrong.

Pvkarma: For lexical analysis we use finite automata.
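The accepted answer's point — tokens are defined by regular expressions, so finite-state memory suffices — can be illustrated with a tiny hand-written DFA. This is a minimal sketch, not any real compiler's lexer; the token names and the two-token language (identifiers and integer literals) are chosen only for illustration:

```python
def tokenize(source):
    """Split `source` into (kind, lexeme) tokens using a DFA.

    States: START (between tokens), IDENT (inside an identifier),
    NUM (inside an integer literal). The current state is the only
    memory used -- no stack, no tape -- which is exactly why a
    finite automaton is enough for lexical analysis.
    """
    tokens = []
    state, lexeme = "START", ""
    for ch in source + " ":           # trailing space flushes the last token
        if state == "START":
            if ch.isalpha() or ch == "_":
                state, lexeme = "IDENT", ch
            elif ch.isdigit():
                state, lexeme = "NUM", ch
            elif ch.isspace():
                pass                  # skip whitespace between tokens
            else:
                raise ValueError(f"unexpected character: {ch!r}")
        elif state == "IDENT":
            if ch.isalnum() or ch == "_":
                lexeme += ch          # stay in IDENT, extend the lexeme
            else:
                tokens.append(("IDENT", lexeme))
                state, lexeme = "START", ""
                if not ch.isspace():
                    raise ValueError(f"unexpected character: {ch!r}")
        elif state == "NUM":
            if ch.isdigit():
                lexeme += ch          # stay in NUM, extend the lexeme
            else:
                tokens.append(("NUM", lexeme))
                state, lexeme = "START", ""
                if not ch.isspace():
                    raise ValueError(f"unexpected character: {ch!r}")
    return tokens

print(tokenize("count 42 x1"))
# → [('IDENT', 'count'), ('NUM', '42'), ('IDENT', 'x1')]
```

By contrast, checking nested, balanced constructs (e.g. matching parentheses in expressions) needs a stack, which is why syntax analysis requires the power of a PDA.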
4 votes

During lexical analysis, the tokens are recognized by a finite automaton, so an FA is necessary and sufficient.

answered Nov 28, 2015 by vnc