+20 votes

The lexical analysis for a modern computer language such as Java needs the power of which one of the following machine models in a necessary and sufficient sense?

  1. Finite state automata
  2. Deterministic pushdown automata
  3. Non-deterministic pushdown automata
  4. Turing machine
asked in Compiler Design by Veteran (115k points)
edited by | 2.8k views

2 Answers

+34 votes
Best answer

Answer - A

In a compiler, the lexical analyzer groups the input character sequence into lexemes and produces tokens as output for the parser. Since tokens are specified by regular expressions, a simple finite automaton is sufficient for this phase.
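To make this concrete, here is a minimal lexer sketch in Python. Each token class is just a regular expression, so recognizing tokens needs nothing beyond a finite automaton (Python's regex engine here). The token names and patterns are illustrative, not taken from any real compiler.

```python
import re

# Each token class is a regular expression; order matters
# (earlier alternatives win on ties).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),          # whitespace: matched but discarded
]

# Combine all patterns into one master regex with named groups.
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(code):
    """Scan the input left to right, emitting (token_class, lexeme) pairs."""
    tokens = []
    for m in MASTER.finditer(code):
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
    return tokens

print(tokenize("x = y + 42"))
# → [('IDENT', 'x'), ('OP', '='), ('IDENT', 'y'), ('OP', '+'), ('NUMBER', '42')]
```

No stack or auxiliary memory is consumed between characters, which is exactly why finite-state power is both necessary and sufficient for lexing.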

answered by Loyal (8.8k points)
So where are Turing machines and PDAs used?
An FA by itself handles everything at this level, so we need not go to a TM or PDA here.
A PDA is used during syntax analysis.
Lexical analysis=> DFA

Syntax analysis=> PDA

Semantic analysis=> Turing machine
Why do we need a Turing machine in semantic analysis?

talha hashim

Can you explain how a TM is used in semantic analysis?


@akash.dinkar12 @rajinder singh 

We deal with a context-sensitive language in the semantic analysis phase, so we need a linear bounded automaton (a Turing machine whose tape is restricted to the input length) to recognize it.

Tokens are recognized using a fixed set of rules expressed as regular expressions. For each regular expression, we can construct a DFA. Hence an FA is both necessary and sufficient.

A PDA or TM would be more powerful than necessary.

Power of PDA is needed in the syntax analysis phase.
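To see why finite control alone is enough, here is a tiny hand-coded DFA recognizing identifiers of the form `[A-Za-z_][A-Za-z0-9_]*`. The state names and transition rules are an illustrative sketch; the point is that the machine carries only a current state between characters, with no stack or tape.

```python
def is_identifier(s):
    """DFA with states START -> IDENT -> (accept), REJECT is a dead state."""
    state = "START"
    for ch in s:
        if state == "START":
            # First character must be a letter or underscore.
            state = "IDENT" if (ch.isalpha() or ch == "_") else "REJECT"
        elif state == "IDENT":
            # Subsequent characters may also be digits.
            state = "IDENT" if (ch.isalnum() or ch == "_") else "REJECT"
        else:
            return False  # dead state: no way back
    return state == "IDENT"

print(is_identifier("foo_1"))   # → True
print(is_identifier("1foo"))    # → False
```

Contrast this with checking balanced parentheses (syntax analysis), which requires a counter of unbounded size and hence at least a pushdown automaton.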
+1 vote
During lexical analysis, the tokens are recognized by an FA, so an FA is necessary and sufficient.
answered by Active (2.8k points)
