## TensorFlow.js basics

TensorFlow.js is a numerical library written from the ground up for the web. Like TensorFlow and PyTorch, it lets you create and manipulate high-dimensional tensors, it supports automatic differentiation, and it lets you train and run neural networks, all in your browser.

You should use TensorFlow.js if you want to play around with machine learning. Combining the richness of JavaScript and the web ecosystem for visualization with the power of TensorFlow makes TensorFlow.js a great educational tool for getting started with machine learning. It does come with drawbacks, though, the primary one being performance. Don't expect to train a massive [residual neural network](https://arxiv.org/abs/1512.03385) in your browser just yet, but you can run inference on relatively large networks with reasonable performance.

Getting started with TensorFlow.js is easy: grab the latest version of the TensorFlow.js library from your favorite CDN, include it in an HTML file, and you are good to go.

```html
<html>
  <head>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/tensorflow/2.7.0/tf.min.js"></script>
    <script>
      tf.tensor('Hello, World!').print();
    </script>
  </head>
</html>
```

In this series we skip the HTML snippet, so you'll see something like the following. You can even press the run button to see the result in action:

```jsx
tf.tensor('Hello, World!').print();
```

The first thing to learn about TensorFlow.js is the concept of tensors. Tensors are simply multidimensional arrays. A tensor can store a scalar value:

```jsx
let a = tf.tensor(3);
a.print();
```

or an array:

```jsx
let b = tf.tensor([1, 2]);
b.print();
```

a matrix:

```jsx
let c = tf.zeros([2, 2]);
c.print();
```

or any arbitrary dimensional tensor:

```jsx
let d = tf.randomUniform([2, 2, 2]);
console.log(d.shape);
```

Tensors can be used to perform algebraic operations efficiently.
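Since a tensor is essentially a nested array plus a shape, it can help to see how the two relate. Below is a small plain-JavaScript sketch (the `inferShape` helper is our own illustration, not part of the TensorFlow.js API) showing how a shape can be read off a nested array, mirroring the scalar, vector, and matrix examples above:

```javascript
// Infer the shape of a nested JavaScript array, similar to what
// tf.tensor does internally when you pass it a plain array.
function inferShape(arr) {
  const shape = [];
  let cur = arr;
  // Descend into the first element at each level, recording lengths.
  while (Array.isArray(cur)) {
    shape.push(cur.length);
    cur = cur[0];
  }
  return shape;
}

console.log(inferShape(3));                // [] — a scalar
console.log(inferShape([1, 2]));           // [2] — a vector
console.log(inferShape([[0, 0], [0, 0]])); // [2, 2] — a matrix
```

This is only a sketch; the real library also validates that the nested arrays are rectangular.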
One of the most commonly used operations in machine learning applications is matrix multiplication. Say you want to multiply two random matrices of size 2x3 and 3x3; this can be done with the matrix multiplication operation:

```jsx
let x = tf.randomNormal([2, 3]);
let y = tf.randomNormal([3, 3]);
let z = x.matMul(y);
z.print();
```

Similarly, to add two vectors, you can do:

```jsx
let x = tf.randomNormal([1, 2]);
let y = tf.randomNormal([1, 2]);
let z = x.add(y);
z.print();
```

Note that calling `print()` on a tensor displays its values, whereas passing the tensor to `console.log()` logs the tensor object itself rather than its contents:

```jsx
let x = tf.randomNormal([1, 2]);
x.print();
console.log(x);
```

### Automatic differentiation

One of the most interesting features of TensorFlow.js is its automatic differentiation functionality, which is very useful in optimization applications such as tuning the parameters of a neural network. Let's try to understand it with an example.

Say you have a composite function which is a chain of two functions: `g(u(x))`. To compute the derivative of `g` with respect to `x`, we can use the chain rule, which states that `dg/dx = dg/du * du/dx`. `tf.grad` can compute these derivatives for us.

We show this with an example. We define a function `u` that squares its input and a function `g` that negates its input. Chaining the two gives `g(u(x)) = -x*x`, so the derivative with respect to `x` is `-2x`. At `x = 1` the derivative equals `-2`, and at `x = -2` it equals `4`. Let's verify that:

```jsx
let u = x => x.square();
let g = x => x.neg();
let grad = tf.grad(x => g(u(x)));
grad(1.0).print();
grad(-2.0).print();
```

The result should be `-2` and `4`, as we expected.

### Curve fitting

To understand how powerful automatic differentiation can be, let's have a look at another example. Assume that we have samples from a curve (say `f(x) = 5x^2 + 3`) and we want to estimate `f(x)` based on these samples.
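Before fitting the curve, it is worth double-checking the `tf.grad` result above by hand. A central finite difference approximates the same derivative numerically; this quick plain-JavaScript check (no TensorFlow.js required, and `numGrad` is our own helper) should also give `-2` and `4`:

```javascript
// g(u(x)) = -(x * x), the same composite function as above.
const gu = x => -(x * x);

// Central finite difference: (f(x+h) - f(x-h)) / (2h) approximates df/dx.
const numGrad = (fn, x, h = 1e-5) => (fn(x + h) - fn(x - h)) / (2 * h);

console.log(numGrad(gu, 1.0));  // ≈ -2
console.log(numGrad(gu, -2.0)); // ≈ 4
```

Unlike the finite difference, which is only an approximation, `tf.grad` computes the derivative exactly via automatic differentiation.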
We define a parametric function `g(x, w) = w0 x^2 + w1 x + w2`, which is a function of the input `x` and latent parameters `w`. Our goal is then to find the latent parameters such that `g(x, w) ≈ f(x)`. This can be done by minimizing the following loss function: `L(w) = Σ (f(x) - g(x, w))^2`. Although there's a closed-form solution for this simple problem, we opt for a more general approach that can be applied to any differentiable function: stochastic gradient descent. We simply compute the average gradient of `L(w)` with respect to `w` over a set of sample points and move in the opposite direction. Here's how it can be done in TensorFlow.js:

```jsx
// Assuming we know that the desired function is a polynomial of 2nd degree, we
// allocate a vector of size 3 to hold the coefficients and initialize it with
// random noise.
let w = tf.variable(tf.randomNormal([3, 1], 0, 0.1));

// We use the Adam optimizer with learning rate set to 0.1 to minimize the loss.
let opt = tf.train.adam(0.1);

// We define yhat to be our estimate of y.
function model(x) {
  return tf.stack([x.square(), x, tf.onesLike(x)], 1).matMul(w).squeeze();
}

// The loss is defined to be the mean squared error distance between our
// estimate of y and its true value.
function computeLoss(y, yhat) {
  return tf.losses.meanSquaredError(y, yhat);
}

// Generate some training data based on the true function.
function generateData() {
  let x = tf.randomUniform([100]).mul(20).sub(10);
  let y = tf.tensor(5).mul(x.square()).add(3);
  return [x, y];
}

animate((i) => {
  let [x, y] = generateData();
  let loss = () => computeLoss(y, model(x));
  let lossVal = opt.minimize(loss, true);
  scalarSummary(lossVal, "Loss");
  if (i % 20 === 0) w.print();
}, null, 200);
```

With each step of gradient descent we get slightly closer to the solution, which should be `[5, 0, 3]`. (Note that the `animate` and `scalarSummary` functions are built-in functions on our platform to help with visualizations.
`animate` simply runs the given loop for a given number of iterations, and `scalarSummary` plots a given value over time.)

There you have it: our very first tutorial in TensorFlow.js. Subscribe to our newsletter [here](https://effectivemachinelearning.com) for more.
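For readers who want to see what `opt.minimize` is doing under the hood, here is a sketch of the same curve fit in plain JavaScript with hand-derived gradients. It is a simplified stand-in for the tutorial's code, not an equivalent: it samples `x` from `[-1, 1]` rather than `[-10, 10]` so that a fixed step size stays stable, and it uses vanilla gradient descent instead of Adam:

```javascript
// Fit g(x, w) = w0*x^2 + w1*x + w2 to samples of f(x) = 5*x^2 + 3
// using batch gradient descent with hand-derived gradients.
const f = x => 5 * x * x + 3;

let w = [0, 0, 0]; // coefficients for x^2, x, and the constant term
const lr = 0.1;

for (let step = 0; step < 5000; step++) {
  // Draw a fresh batch of 100 samples, like generateData() above.
  const xs = Array.from({ length: 100 }, () => Math.random() * 2 - 1);
  const grad = [0, 0, 0];
  for (const x of xs) {
    const err = (w[0] * x * x + w[1] * x + w[2]) - f(x);
    // Gradient of the squared error w.r.t. each coefficient,
    // averaged over the batch.
    grad[0] += (2 * err * x * x) / xs.length;
    grad[1] += (2 * err * x) / xs.length;
    grad[2] += (2 * err) / xs.length;
  }
  // Move against the gradient.
  for (let j = 0; j < 3; j++) w[j] -= lr * grad[j];
}

console.log(w); // approaches [5, 0, 3]
```

Adam adds per-parameter step-size adaptation and momentum on top of this basic loop, which is what lets the TensorFlow.js version cope with the much larger input range.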