Turing complete.

Welcome to Nextjournal!

This notebook will take you through a simple machine learning exercise to help acclimate you to the platform.

If you run into any problems along the way, don't be afraid to raise your hand and ask for help.

Creating a Notebook

This notebook was created using one of Nextjournal's templates. They are based on the languages the platform natively supports - R, Julia, Python, Clojure, and ClojureScript - or on specific machine learning libraries like TensorFlow. Navigate to https://nextjournal.com/ and select Add new notebook to peruse them yourself.

Editing a Notebook

Code and commentary can be added either by using the ➕ insert menu in the left gutter or the ➕ Add new content button at the bottom of the notebook. The list of options includes Code Cell: Python.

import tensorflow as tf
import tensorflow.contrib.eager as tfe

tfe.enable_eager_execution()  # run operations immediately instead of building a graph


Note that there was no need to install TensorFlow. The template's Python runtime uses the Nextjournal TensorFlow default environment. An environment is where your code runs; it includes everything necessary for proper execution, from the operating system to specific libraries. You will come to understand the unique power Nextjournal's reusable environments offer as you learn the platform. For now, it's enough to know that the default feature-rich environments will help you get up and running quickly.
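One way to convince yourself of what an environment provides is to inspect it from a code cell. A small sketch using only the Python standard library (the exact versions and paths you see will depend on the environment):

```python
import sys
import platform

# Report what the runtime environment provides
print("Python", platform.python_version())
print("Executable:", sys.executable)
```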

Running a Notebook

The next cell is our training loop for fitting a line, defined as y = Ax + B, from 2 points. The machine shall learn what A and B are.

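Since there are only two training points, the line is determined exactly, which gives us a quick sanity check on what gradient descent should converge to. A plain-Python sketch, separate from the TensorFlow cells:

```python
# The same two training points (x, y) used in the training loop
(x0, y0), (x1, y1) = (0.0, 0.0), (1.0, 3.0)

A_exact = (y1 - y0) / (x1 - x0)  # slope of the line through both points
B_exact = y0 - A_exact * x0      # intercept
print(A_exact, B_exact)  # 3.0 0.0
```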

# Batch of training data
x_values = tf.convert_to_tensor([0.0, 1.0])
y_values = tf.convert_to_tensor([0.0, 3.0])

# Model: a line with learnable slope A and intercept B
A = tfe.Variable(0.0)
B = tfe.Variable(0.0)

def predict_y_values(x_values):
    return A * x_values + B  # the definition of a line

# The training loop
for i in range(200):
    with tfe.GradientTape() as tape:  # the tape records operations so they can be differentiated
        predicted_y_values = predict_y_values(x_values)
        loss_value = tf.reduce_mean(tf.square(predicted_y_values - y_values))
        if i % 20 == 0:
            print("Loss at step {:03d}: {:.3f}".format(i, loss_value.numpy()))
    # The gradient tells us which direction to change each variable to reduce the loss
    gradient_A, gradient_B = tape.gradient(loss_value, [A, B])
    # Nudge the variables by a small step in the right direction
    A.assign_sub(gradient_A * 0.1)
    B.assign_sub(gradient_B * 0.1)
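To make the tape's role concrete, the same loop can be written in plain Python with the gradients of the mean-squared-error loss derived by hand (a sketch for intuition only; the GradientTape computes these derivatives automatically):

```python
# Pure-Python version of the training loop above, no TensorFlow needed
x_values = [0.0, 1.0]
y_values = [0.0, 3.0]
A, B = 0.0, 0.0
n = len(x_values)

for i in range(200):
    # Prediction error at each point: (A*x + B) - y
    errors = [A * x + B - y for x, y in zip(x_values, y_values)]
    # Hand-derived gradients of mean((A*x + B - y)^2):
    #   d(loss)/dA = mean(2 * error * x),  d(loss)/dB = mean(2 * error)
    grad_A = sum(2 * e * x for e, x in zip(errors, x_values)) / n
    grad_B = sum(2 * e for e in errors) / n
    # Nudge each parameter against its gradient
    A -= 0.1 * grad_A
    B -= 0.1 * grad_B

print(round(A, 2), round(B, 2))  # 3.0 0.0
```

After 200 steps the parameters have converged to the exact line through the two points.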

To run the notebook, simply press the play button in the top menu bar. The runner will boot and execute the code from top to bottom. To rerun an individual cell after changing it, hit the play button at the bottom of that cell.

Publishing a Notebook

Once you have made your changes, it's time to publish. Press the publish button in the top menu bar and give the notebook a memorable URL to share with colleagues. They can remix your work, or, if you have a private research plan, you can give them permission to edit the original.

