Posts tagged jupyterhub
- 12 November 2021
As you probably know, we will rename/remove some unused/outdated Python modules; please see the details here. Since the Jupyterhub kernels are based on modules, the deprecated kernels will no longer be available as default kernels in Jupyter notebooks/labs.
Don't panic: if you have been working with those deprecated kernels and want to continue using them in your notebooks, please follow the steps below.
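As a sketch of what such a step can look like (the interpreter path and kernel name below are hypothetical, and this assumes ipykernel is installed in the deprecated environment), you can register an environment's own python as a personal kernel:

```python
import subprocess

def kernel_install_cmd(env_python, name, display_name):
    """Build the command that registers an environment's python as a user kernel."""
    return [env_python, "-m", "ipykernel", "install", "--user",
            "--name", name, "--display-name", display_name]

# Hypothetical path to the deprecated environment's interpreter:
cmd = kernel_install_cmd("/path/to/deprecated/env/bin/python",
                         "deprecated-env", "Python (deprecated env)")
# subprocess.run(cmd, check=True)  # run once; needs ipykernel in that environment
```

The kernel then shows up under its display name in the Jupyterhub kernel list.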
- 12 November 2021
For several years we have been offering Python environments on Mistral. Many of them are no longer updated and should not be used for new development. However, older scripts may rely on these environments and on the versions of their installed packages.
- 25 October 2021
The default location for R packages is not writable, so you cannot install new packages there. On demand, we install new packages system-wide for all users. However, it is possible to install packages in locations other than the root location; here are the steps:
create a directory in
- 10 June 2021
In this tutorial, I will describe the steps to get the
matlab_kernel working in Jupyterhub on Levante.
conda environment with python 3.9
- 06 May 2021
Do you want to create videos / animations with
ffmpeg from your
Jupyter notebook? You need
ffmpeg-python (conda) which requires
ffmpeg software on Mistral (module)
conda env with
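A minimal sketch, assuming the ffmpeg module is loaded and ffmpeg-python is installed in the conda env (the frame pattern and output name are placeholders):

```python
import ffmpeg  # the ffmpeg-python package, which wraps the ffmpeg binary

# Turn a sequence of numbered PNG frames into a video (paths are placeholders):
(
    ffmpeg
    .input("frames/frame_%04d.png", framerate=24)
    .output("animation.mp4", pix_fmt="yuv420p")
    .run()
)
```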
- 04 May 2021
We have seen in this blog post how to encapsulate a Jupyter notebook (server) in a Singularity container. In this tutorial, I am going to describe how you can run a Jupyter kernel in a container and make it available in Jupyter*.
A possible use case for this is to install a supported
and work with Jupyter notebooks (see GLIBC and the container-based workaround).
- 23 March 2021
We already provide a kernel for Julia based on the module julia/1.7.0.
In order to use it, you only need to install IJulia:
- 23 March 2021
I am just describing, off the cuff, what worked for me to connect my local Spyder instance to a remote node on Mistral that you can reach via SSH from your local machine.
This is just a draft tutorial that will be updated/optimized afterwards.
- 04 March 2021
Kernels are based on Python environments created with
virtualenv or other package managers. In some cases, the size of the
environment can grow tremendously depending on the installed packages.
The default location for Python files is the
$HOME directory, which can quickly fill your quota. To avoid this, we
suggest that you create/store Python files in other directories of the
filesystem on Mistral.
The following are two alternative locations where you can create your Python environment:
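For illustration, here is a small sketch using the standard-library venv module (the /work path is a placeholder for your own project directory):

```python
import venv
from pathlib import Path

def create_env(target_dir, with_pip=True):
    """Create a Python environment at target_dir instead of under $HOME."""
    target = Path(target_dir)
    venv.EnvBuilder(with_pip=with_pip).create(target)
    return target

# Placeholder path on a project filesystem:
# create_env("/work/<project>/<user>/envs/mykernel")
```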
- 16 February 2021
This is a follow-up on Kernels. In
some cases, the process of publishing new Python modules can take a long time.
In the meantime, you can create a
test kernel to use in
Jupyterhub. Creating new conda environments and using them as kernels
has already been described here. In
this example, we are not going to create a new conda env but only the
kernel configuration files.
In this tutorial, I will take the module python3/2021-01 as an example.
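As a sketch of the idea (the interpreter path is a placeholder; the kernel.json layout is the standard one expected by Jupyter):

```python
import json
from pathlib import Path

def write_kernelspec(kernel_dir, python_path, display_name):
    """Write a minimal kernel.json that points at an existing python interpreter."""
    kernel_dir = Path(kernel_dir)
    kernel_dir.mkdir(parents=True, exist_ok=True)
    spec = {
        "argv": [str(python_path), "-m", "ipykernel_launcher",
                 "-f", "{connection_file}"],
        "display_name": display_name,
        "language": "python",
    }
    (kernel_dir / "kernel.json").write_text(json.dumps(spec, indent=1))
    return spec

# Illustrative call: user kernels live under ~/.local/share/jupyter/kernels/<name>
# write_kernelspec(Path.home() / ".local/share/jupyter/kernels/python3-2021-01",
#                  "/path/to/python3/2021-01/bin/python",
#                  "Python 3 (2021-01 test)")
```

The interpreter behind python_path must have ipykernel installed, otherwise the kernel will fail to start.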
- 05 January 2021
Slurm config on Mistral has been updated to fix an issue related to memory use.
Prior to the update, some Slurm jobs continued consuming the available
memory (and even swap) of the allocated node, exceeding the allocated
memory (set in
srun). If this occurs, it also affects
- 16 November 2020
According to the official website,
Dask jobqueue can be used to
deploy Dask on job queuing systems like PBS, Slurm, MOAB, SGE,
LSF, and HTCondor. Since the queuing system on Mistral is Slurm, we are
going to show how to start a Dask cluster there. The idea is simple, as
described here. The difference is that the workers can be distributed
across multiple nodes from the same partition. Using Dask jobqueue will
run the Dask cluster as Slurm jobs.
In this case, Jupyterhub often plays the role of an interface, and Dask can use more resources than those allocated to your Jupyterhub session (profiles).
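A sketch of what this can look like, assuming dask-jobqueue is available in your environment; the partition name, core count, and memory are placeholders to adapt to your allocation:

```python
from dask.distributed import Client
from dask_jobqueue import SLURMCluster

# Placeholders: adapt partition, cores, and memory to your project
cluster = SLURMCluster(
    queue="compute",       # Slurm partition to submit worker jobs to
    cores=24,              # cores per worker job
    memory="64GB",         # memory per worker job
    walltime="01:00:00",
)
cluster.scale(jobs=2)      # submit two Slurm jobs; workers may land on different nodes
client = Client(cluster)   # connect the notebook to the cluster
```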
- 13 November 2020
Extensions bring additional interesting features to Jupyter*. Depending on the workflow in the notebook, users can install/enable extensions when required. Although it is easy to add extensions to both Jupyter Notebook and Lab, the process can sometimes be annoying depending on where Jupyter is served from.
In general, installing and enabling extensions on your laptop or via the
start-jupyter script is straightforward, especially when the
developers describe their extensions well. There should be no
restrictions or permission issues; just follow the instructions.
- 05 November 2020
Can’t use NCL (Python) as a kernel in Jupyter
This tutorial won’t work
- 07 October 2020
created your own conda env
- 05 October 2020
We introduced a new feature to the preset and advanced options form.
This is a nice feature, especially for the advanced options form, which
contains many fields. You can also reset the options to their initial
values by clicking on
reset. The form options are saved in the
client’s browser every 10 seconds and are not lost if:
the browser crashes
- 02 October 2020
- 25 September 2020
Each Jupyter notebook runs as a SLURM job on Mistral. By default,
stderr of the SLURM batch job that is spawned by
Jupyterhub is written to your
HOME directory on the HPC system. In
order to make it simple to locate the log file:
if you use the
preset options form: the log file is named
- 18 September 2020
There are multiple ways to create a Dask cluster; the following is only an example. Please consult the official documentation. The Dask library is installed and can be found in any of the Python 3 kernels in Jupyterhub. Of course, you can use your own Python environment.
The simplest way to create a Dask cluster is to use the distributed module:
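A minimal sketch with dask.distributed (a threaded LocalCluster keeps the example self-contained; on Mistral the cluster runs inside your notebook's Slurm allocation):

```python
from dask.distributed import Client, LocalCluster

# Start a small cluster; processes=False uses threads in the current process
cluster = LocalCluster(n_workers=2, threads_per_worker=2, processes=False)
client = Client(cluster)

# Quick check: run a task on the cluster
result = client.submit(lambda x: x + 1, 41).result()
print(result)  # 42

client.close()
cluster.close()
```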