## Control flow operations: conditionals and loops
When building complex models such as recurrent neural networks, you may need to control the flow of operations through conditionals and loops. In this section we introduce a number of commonly used control flow ops.
Let's assume you want to decide whether to multiply or add two given tensors based on a predicate. This can be implemented either with Python's built-in if statement or with the tf.cond function:
```python
import tensorflow as tf

a = tf.constant(1)
b = tf.constant(2)
p = tf.constant(True)

# In eager mode the tensor predicate can drive a plain Python if/else.
x = a + b if p else a * b
# Alternatively, with the explicit TF op:
# x = tf.cond(p, lambda: a + b, lambda: a * b)
print(x.numpy())
```
Since the predicate is True in this case, the output is the result of the addition: 3.
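The Python if works here because we are running eagerly, so the boolean value of p is known immediately. Inside a tf.function-compiled graph, a tensor predicate cannot resolve a Python branch at trace time; AutoGraph rewrites the if into tf.cond for you. A minimal sketch (the function name is just for illustration):

```python
import tensorflow as tf

@tf.function
def add_or_multiply(a, b, p):
    # AutoGraph rewrites this `if` on a tensor predicate into tf.cond,
    # tracing both branches into the graph.
    if p:
        x = a + b
    else:
        x = a * b
    return x

print(add_or_multiply(tf.constant(1), tf.constant(2), tf.constant(True)).numpy())  # 3
```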
Most of the time when using TensorFlow you work with large tensors and want to perform operations in batch. A related conditional operation is tf.where, which, like tf.cond, takes a predicate, but selects its output elementwise based on a boolean tensor:
```python
a = tf.constant([1, 1])
b = tf.constant([2, 2])
p = tf.constant([True, False])

# Picks a + b where p is True and a * b where it is False, elementwise.
x = tf.where(p, a + b, a * b)
print(x.numpy())
```
This will return [3, 2].
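As an aside, tf.where also has a single-argument form: given just a boolean tensor, it returns the coordinates of the True entries, which is handy for masking. A quick sketch:

```python
import tensorflow as tf

p = tf.constant([True, False, True])
# One row per True element, holding its index.
print(tf.where(p).numpy())  # [[0]
                            #  [2]]
```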
Another widely used control flow operation is tf.while_loop, which builds dynamic loops that operate on sequences of variable length. Let's see how we can generate the Fibonacci sequence with tf.while_loop:
```python
@tf.function
def fibonacci(n):
    a = tf.constant(1)
    b = tf.constant(1)
    # AutoGraph turns this loop into a tf.while_loop because n is a tensor.
    for i in range(2, n):
        a, b = b, a + b
    return b

n = tf.constant(5)
b = fibonacci(n)
print(b.numpy())
```
This will print 5. Note that tf.function automatically converts the given Python code to use tf.while_loop, so we don't need to interact with the low-level TF API directly.
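If you're curious what that conversion looks like, you can print the code AutoGraph generates. This is purely for inspection; the exact output is an implementation detail and varies across TensorFlow versions:

```python
# fibonacci is the tf.function defined above; python_function is the
# original, undecorated Python callable it wraps.
print(tf.autograph.to_code(fibonacci.python_function))
```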
Now imagine we want to keep the whole Fibonacci sequence. We can update the loop body to record the values as they are computed:
```python
@tf.function
def fibonacci(n):
    a = tf.constant(1)
    b = tf.constant(1)
    c = tf.constant([1, 1])
    for i in range(2, n):
        a, b = b, a + b
        # Append the new value, growing c by one element each iteration.
        c = tf.concat([c, [b]], 0)
    return c

n = tf.constant(5)
c = fibonacci(n)
print(c.numpy())
```
Now if you try running this, TensorFlow will complain that the shape of one of the loop variables is changing. One way to fix this is to specify "shape invariants", which the low-level tf.while_loop API supports:
```python
n = tf.constant(5)

def cond(i, a, b, c):
    return i < n

def body(i, a, b, c):
    a, b = b, a + b
    c = tf.concat([c, [b]], 0)
    return i + 1, a, b, c

# Declaring c's invariant as [None] tells TF its first dimension may
# change between iterations.
i, a, b, c = tf.while_loop(
    cond, body, (2, 1, 1, tf.constant([1, 1])),
    shape_invariants=(tf.TensorShape([]),
                      tf.TensorShape([]),
                      tf.TensorShape([]),
                      tf.TensorShape([None])))
print(c.numpy())
```
This is not only getting ugly, but it's also pretty inefficient: each tf.concat copies the entire array to append a single element, so we build a lot of intermediate tensors that are thrown away. TensorFlow has a better solution for this kind of growing array. Meet tf.TensorArray. Let's do the same thing, this time with tensor arrays:
```python
@tf.function
def fibonacci(n):
    a = tf.constant(1)
    b = tf.constant(1)
    # Preallocate an array of n int32 elements and fill it as we go.
    c = tf.TensorArray(tf.int32, n)
    c = c.write(0, a)
    c = c.write(1, b)
    for i in range(2, n):
        a, b = b, a + b
        c = c.write(i, b)
    # Pack the array's elements into a single tensor.
    return c.stack()

n = tf.constant(5)
c = fibonacci(n)
print(c.numpy())
```
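One more detail worth knowing: when the final length of the sequence is not known up front, a TensorArray can grow on demand via dynamic_size=True. A sketch under that assumption (fibonacci_until is a hypothetical name, not part of any API):

```python
@tf.function
def fibonacci_until(limit):
    # size=0 plus dynamic_size=True lets the array grow as we write to it.
    c = tf.TensorArray(tf.int32, size=0, dynamic_size=True)
    a, b = tf.constant(1), tf.constant(1)
    c = c.write(0, a)
    c = c.write(1, b)
    i = tf.constant(2)
    # AutoGraph turns this into a tf.while_loop since `limit` is a tensor.
    while b < limit:
        a, b = b, a + b
        c = c.write(i, b)
        i += 1
    return c.stack()

print(fibonacci_until(tf.constant(10)).numpy())  # [ 1  1  2  3  5  8 13]
```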
TensorFlow while loops and tensor arrays are essential tools for building complex recurrent neural networks. As an exercise, try implementing [beam search](https://en.wikipedia.org/wiki/Beam_search) using tf.while_loop. Can you make it more efficient with tensor arrays?