+20 votes
2.3k views

The lexical analysis for a modern computer language such as Java needs the power of which one of the following machine models in a necessary and sufficient sense?

  1. Finite state automata
  2. Deterministic pushdown automata
  3. Non-deterministic pushdown automata
  4. Turing machine
asked in Compiler Design by Veteran (106k points)
edited by

2 Answers

+33 votes
Best answer

Answer - A

In a compiler, the lexical analyzer groups the input character stream into lexemes and produces tokens as output for the parser. Since tokens are specified by regular expressions, a finite automaton is both necessary and sufficient to recognize them.
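To illustrate, here is a minimal tokenizer sketch in Python. The token names and patterns are hypothetical, not Java's actual lexical grammar; the point is that each pattern (and their union) is a regular expression, so the whole scanner corresponds to a single finite automaton.

```python
import re

# Hypothetical token specification for a toy language (illustrative only).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]

# The alternation of regular expressions is itself regular, so one FA
# suffices to recognize every token class at once.
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Scan left to right, emitting (token_class, lexeme) pairs."""
    tokens = []
    for match in MASTER.finditer(text):
        kind = match.lastgroup
        if kind != "SKIP":  # whitespace is discarded, not passed to the parser
            tokens.append((kind, match.group()))
    return tokens
```

For example, `tokenize("x = y + 42")` yields `[("IDENT", "x"), ("OP", "="), ("IDENT", "y"), ("OP", "+"), ("NUMBER", "42")]`.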

answered by Loyal (9.1k points)
edited by
0
So where are Turing machines and PDAs used?
0
An FA fulfils everything at this level, so we need not go to a TM or PDA.
+4
A PDA is used during syntax analysis.
+1
Lexical analysis=> DFA

Syntax analysis=> PDA

Semantic analysis=> Turing machine
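As a quick illustration of why syntax analysis needs more than an FA: recognizing arbitrarily nested brackets requires counting unbounded depth, which is exactly the stack a PDA adds. A minimal sketch (the function name is illustrative):

```python
def balanced(s):
    """Check that brackets nest correctly. Unbounded nesting cannot be
    tracked by any finite number of states, but an explicit stack --
    the extra power of a PDA over an FA -- handles it directly."""
    stack = []
    pairs = {")": "(", "]": "[", "}": "{"}
    for ch in s:
        if ch in "([{":
            stack.append(ch)        # push every opener
        elif ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False        # mismatched or unexpected closer
    return not stack                # leftover openers mean imbalance
```

So `balanced("((a+b)*[c])")` is true, while `balanced("((")` and `balanced(")(")` are false.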
0
Why do we need a Turing machine in semantic analysis?
0

talha hashim

Can you explain how a TM is used in semantic analysis?

0

@akash.dinkar12 @rajinder singh 

The semantic analysis phase deals with context-sensitive properties of the language, so we need a linear bounded automaton (a Turing machine whose tape is restricted to the input length) to handle them.
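A concrete example of such a context-sensitive property is "every identifier must be declared before it is used": matching arbitrary identifiers across a program is beyond a context-free grammar. A minimal sketch, assuming a hypothetical pre-parsed input of `("decl", name)` and `("use", name)` pairs:

```python
def check_declared(statements):
    """Return the identifiers used before declaration.
    This declared-before-use check is a context-sensitive property,
    which is why semantic analysis needs more than a PDA."""
    declared = set()
    errors = []
    for kind, name in statements:
        if kind == "decl":
            declared.add(name)                 # record the declaration
        elif kind == "use" and name not in declared:
            errors.append(name)                # used before declared
    return errors
```

For instance, `check_declared([("decl", "x"), ("use", "x"), ("use", "y")])` reports `["y"]`.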

+1 vote
During lexical analysis, the tokens are recognized by a finite automaton, so an FA is necessary and sufficient.
answered by Active (2.7k points)