Every language that has a context-free grammar can be recognized in at most $O(n^{3})$ time for strings of length $n$. A simple way to do so, called the Cocke-Younger-Kasami (or CYK) algorithm, is based on dynamic programming. That is, given a string $a_{1}a_{2}\cdots a_{n}$, we construct an $n$-by-$n$ table $T$ such that $T_{ij}$ is the set of nonterminals that generate the substring $a_{i}a_{i+1}\cdots a_{j}$. If the underlying grammar is in CNF (see Exercise 4.4.8), then one table entry can be filled in in $O(n)$ time, provided we fill the entries in the proper order: lowest value of $j-i$ first. Write an algorithm that correctly fills in the entries of the table, and show that your algorithm takes $O(n^{3})$ time. Having filled in the table, how do you determine whether $a_{1}a_{2}\cdots a_{n}$ is in the language?
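One possible solution can be sketched as follows. This is a minimal Python recognizer, assuming the CNF grammar is encoded as two dictionaries (`unit_rules` for productions $A \to a$ and `binary_rules` for productions $A \to BC$; these names and the encoding are illustrative choices, not fixed by the exercise). The table is 0-indexed here, so `T[i][j]` corresponds to the substring $a_{i+1}\cdots a_{j+1}$ of the text's 1-indexed notation.

```python
from itertools import product

def cyk(word, unit_rules, binary_rules, start):
    """CYK recognizer for a grammar in Chomsky Normal Form.

    unit_rules:   maps a terminal a to the set of nonterminals A
                  with a production A -> a.
    binary_rules: maps a pair (B, C) of nonterminals to the set of
                  nonterminals A with a production A -> B C.
    T[i][j] is the set of nonterminals generating word[i..j]
    (0-indexed, inclusive).
    """
    n = len(word)
    if n == 0:
        return False  # a CNF grammar (no epsilon-rules) generates no empty string
    T = [[set() for _ in range(n)] for _ in range(n)]

    # Basis: entries with j - i = 0, i.e. substrings of length 1.
    for i, a in enumerate(word):
        T[i][i] = set(unit_rules.get(a, ()))

    # Induction: fill entries in order of increasing j - i.
    for span in range(1, n):               # span = j - i
        for i in range(n - span):
            j = i + span
            # Try every split point k: word[i..k] followed by word[k+1..j].
            for k in range(i, j):
                for B, C in product(T[i][k], T[k + 1][j]):
                    T[i][j] |= binary_rules.get((B, C), set())

    # The string is in the language iff the start symbol generates word[0..n-1].
    return start in T[0][n - 1]
```

The running time matches the bound asked for: there are $O(n^{2})$ table entries, and each entry $T_{ij}$ is filled by scanning the $O(n)$ split points $k$ between $i$ and $j$, giving $O(n^{3})$ overall (treating the grammar size as a constant). The final membership test is exactly the question's last step: check whether the start symbol appears in the entry for the whole string.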