
TensorBoard can be used directly within notebook experiences such as Colab and Jupyter. This can be helpful for sharing results, integrating TensorBoard into existing workflows, and using TensorBoard without installing anything locally.

Start by installing TF 2.0 and loading the TensorBoard notebook extension:

For Jupyter users: If you've installed Jupyter and TensorBoard into the same virtualenv, then you should be good to go. If you use a more complicated setup, like a global Jupyter installation and kernels for different Conda/virtualenv environments, then you must ensure that the tensorboard binary is on your PATH inside the Jupyter notebook context. One way to do this is to modify the kernel_spec to prepend the environment's bin directory to PATH, as described here.

For Docker users: In case you are running a Docker image of the Jupyter Notebook server using TensorFlow's nightly, it is necessary to expose not only the notebook's port, but TensorBoard's port. Thus, run the container with the following command:

docker run -it -p 8888:8888 -p 6006:6006 \

where -p 6006 is the default port of TensorBoard. This will allocate a port for you to run one TensorBoard instance. To have concurrent instances, it is necessary to allocate more ports. Also, pass --bind_all to %tensorboard to expose the port outside the container.

# Load the TensorBoard notebook extension
%load_ext tensorboard

Import TensorFlow, datetime, and os:

import tensorflow as tf
import datetime, os

Download the FashionMNIST dataset and scale it:

fashion_mnist = tf.keras.datasets.fashion_mnist

(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
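
As a rough sketch of how the pieces above fit together, the cell below uses the os and datetime imports to build a timestamped run directory and then launches TensorBoard from the notebook. The "logs" directory name is an illustrative assumption rather than anything mandated by TensorBoard, and the --bind_all flag only matters in the Docker scenario described earlier.

import os, datetime

# Put each run in its own timestamped subdirectory so runs don't overwrite
# each other; "logs" is just an illustrative parent directory, not a name
# required by TensorBoard.
logdir = os.path.join("logs", datetime.datetime.now().strftime("%Y%m%d-%H%M%S"))
os.makedirs(logdir, exist_ok=True)

# Start TensorBoard inside the notebook, watching the parent directory so it
# picks up every run. --bind_all is only needed when TensorBoard has to be
# reachable from outside a container, as described above.
%tensorboard --logdir logs --bind_all

Outside a container, %tensorboard --logdir logs on its own is enough.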
