Jupyter - Under the Hood
07-15, 10:30–11:15 (Europe/Dublin), Liffey Hall 2

Jupyter Notebooks at their core are just JSON documents that contain all your code, Markdown text and outputs. Yet when you run a notebook, a lot happens under the hood: a session is started with the notebook server, an IPython kernel is launched, and a rich web UI communicates with the notebook server over Jupyter's REST APIs and with the IPython kernel over ZeroMQ sockets (carried to the browser via WebSockets). We will explore the Jupyter ecosystem (Jupyter, JupyterLab, JupyterHub) and see how this system comes together.
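
To make that concrete, here is a minimal sketch that loads a notebook file as plain JSON and walks its cells; the filename example.ipynb is just a placeholder for any notebook you have on disk.

    import json

    # "example.ipynb" is a placeholder; any notebook file works here.
    with open("example.ipynb") as f:
        nb = json.load(f)

    # A notebook is an ordinary JSON document: format metadata plus a list of cells.
    print(nb["nbformat"], nb["nbformat_minor"])
    for cell in nb["cells"]:
        # Every cell records its type (code or markdown) and its source text;
        # code cells additionally carry their outputs.
        print(cell["cell_type"], "".join(cell["source"])[:40])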


The architecture of all the offerings in the Jupyter project (the classic Jupyter Notebook, the newer JupyterLab IDE, and JupyterHub, the scalable multi-user environment) is completely distributed.
At its core, a front-end client such as a web browser or a Qt console talks to the notebook server through its many REST APIs (like the kernel API), and to the language kernel (in our case IPython) over ZMQ sockets, relayed through a WebSocket in the browser's case; this separation is what lets the Jupyter architecture scale so easily.
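
As a sketch of what those REST calls look like, the snippet below lists the running kernels through the /api/kernels endpoint. The base URL and token are assumptions for a locally running server; substitute the values your own server prints at startup.

    import requests

    # Assumed local server and token; use the URL and token that
    # `jupyter notebook` (or `jupyter lab`) prints when it starts.
    BASE = "http://localhost:8888"
    TOKEN = "<your-token>"

    resp = requests.get(
        f"{BASE}/api/kernels",
        headers={"Authorization": f"token {TOKEN}"},
    )
    resp.raise_for_status()

    for kernel in resp.json():
        # Each entry is a kernel model: its id, kernel name, and state.
        print(kernel["id"], kernel["name"], kernel["execution_state"])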
In this presentation, we look closely at these REST API calls and at the ZMQ socket traffic, using simple tools like the browser's network tab. We will also manipulate a notebook using simple code to get a full appreciation of these internals.
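
To give a taste of that kind of manipulation, here is a minimal sketch that uses the jupyter_client package to start an IPython kernel and execute code over its ZMQ channels directly, with no browser or notebook server involved; the kernel name "python3" assumes a standard IPython kernel installation.

    from jupyter_client import KernelManager

    # Start an IPython kernel and connect to its ZMQ channels directly,
    # bypassing the notebook server entirely.
    km = KernelManager(kernel_name="python3")
    km.start_kernel()
    kc = km.client()
    kc.start_channels()
    kc.wait_for_ready()

    # Send an execute_request over the shell channel...
    msg_id = kc.execute("print(21 * 2)")

    # ...then read the resulting messages off the IOPub channel until
    # the kernel reports it is idle again.
    while True:
        msg = kc.get_iopub_msg(timeout=5)
        if msg["parent_header"].get("msg_id") != msg_id:
            continue  # ignore messages from other requests
        if msg["msg_type"] == "stream":
            print("kernel replied:", msg["content"]["text"].strip())
        if (msg["msg_type"] == "status"
                and msg["content"]["execution_state"] == "idle"):
            break

    kc.stop_channels()
    km.shutdown_kernel()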


Expected audience expertise: Domain

None

Expected audience expertise: Python

Some

Abstract as a tweet

Let's explore the distributed architecture of the Jupyter project

Hi, I am Dhanshree. I have been developing with Python for over three years now, and I enjoy working with computers. I work on machine learning, backend development, cloud, and infrastructure.