Consider 3 matrices:

A [100 × 200]

B [200 × 50]

C [50 × 30]

Suppose a computer takes:

1) 1 microsecond to multiply 2 numbers.

2) almost 0 seconds to perform an addition.

Find out how much time the computer will take to multiply the matrices in all possible ways.

Assume the matrix multiplication procedure is continuous, without any time delay.

Options are:

1) 0.5 seconds

2) 1.5 seconds

3) 2 seconds

4) 3 seconds

1 Answer

Best answer

There are two ways to multiply 3 matrices:

((AB)C)  and (A(BC))

Multiplying a (p × q) matrix by a (q × r) matrix requires p · q · r scalar multiplications.

Total number of multiplications in ((AB)C):

Multiplications in AB = 100 × 200 × 50 = 1,000,000

Let AB = D.

Multiplications in DC = 100 × 50 × 30 = 150,000

So, total number of multiplications in ((AB)C) = 1,000,000 + 150,000 = 1,150,000.

It takes 1.15 seconds.

---------------------------------------------------------------------------------------------------------------

Total number of multiplications in (A(BC)):

Multiplications in BC = 200 × 50 × 30 = 300,000

Let BC = E.

Multiplications in AE = 100 × 200 × 30 = 600,000

So, total number of multiplications in (A(BC)) = 300,000 + 600,000 = 900,000.

It takes 0.9 seconds.

So, the total time is 1.15 + 0.9 = 2.05 seconds, which is approximately 2 seconds (option 3).
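
A minimal sketch in Python (not part of the original answer) that reproduces the counts above; the helper name `mults` and the variable names are made up for illustration, and it assumes the stated cost of 1 microsecond per scalar multiplication with additions taking no time:

```python
# Multiplying a (p x q) matrix by a (q x r) matrix needs p*q*r scalar multiplications.
def mults(p, q, r):
    return p * q * r

# Dimensions: A is 100x200, B is 200x50, C is 50x30.

# ((AB)C): AB costs 100*200*50, then the 100x50 result times C costs 100*50*30.
cost_ab_c = mults(100, 200, 50) + mults(100, 50, 30)   # 1,150,000

# (A(BC)): BC costs 200*50*30, then A times the 200x30 result costs 100*200*30.
cost_a_bc = mults(200, 50, 30) + mults(100, 200, 30)   # 900,000

US = 1e-6  # 1 microsecond per scalar multiplication; additions take ~0 time
print(cost_ab_c * US)                 # 1.15 seconds
print(cost_a_bc * US)                 # 0.9 seconds
print((cost_ab_c + cost_a_bc) * US)   # 2.05 seconds in total
```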

