Working with Files

When running Jupyter locally, your data files typically live on your local filesystem and can be read with the Python Standard Library, for example:

```python
with open('myfile.txt', 'r') as f:
    print(f.read())
```

In a cloud environment, your data files instead live in a cloud storage system such as Amazon S3 or GCP Cloud Storage, and access is usually via an API or client library.

In the first release of Europa Notebooks, we use Atlassian attachment storage in your account as the file storage layer.

To make working with files in Atlassian simple, we have introduced AtlasFS, which works just like the Python Standard Library. The file operation above, written with AtlasFS, becomes:

```python
from atlasfs import AtlasFS

fs = AtlasFS()
with fs.open('myfile.txt', 'r') as f:
    print(f.read())
```

Writing files is also similar to standard Python:

```python
with fs.open('output.txt', 'w') as f:
    f.write('Hello, World!')
```

Don’t forget to include the fs. prefix; otherwise the file will be saved locally on the notebook server and will be lost when the server is shut down.

The files available to the notebook are the datasets in the same project. In this first version the storage is a flat folder structure; support for sub-folders and for other file storage layers such as S3 and GCP Cloud Storage is on the roadmap.
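If you want a mental model of how the AtlasFS interface behaves, the sketch below is a hypothetical in-memory stand-in — it is not the real atlasfs library, just an illustration of the same open()-for-read, open()-for-write pattern over a flat file namespace:

```python
import io


class _WriteBuffer(io.StringIO):
    """Buffers writes and commits them to the backing store on close."""

    def __init__(self, store, name):
        super().__init__()
        self._store = store
        self._name = name

    def close(self):
        if not self.closed:
            # Persist the buffered contents before the buffer is discarded.
            self._store[self._name] = self.getvalue()
        super().close()


class FakeAtlasFS:
    """Illustrative stand-in for an AtlasFS-style filesystem (hypothetical)."""

    def __init__(self):
        self._files = {}  # flat namespace: filename -> contents

    def open(self, name, mode='r'):
        if mode == 'r':
            if name not in self._files:
                raise FileNotFoundError(name)
            return io.StringIO(self._files[name])
        if mode == 'w':
            return _WriteBuffer(self._files, name)
        raise ValueError(f'unsupported mode: {mode!r}')


fs = FakeAtlasFS()
with fs.open('output.txt', 'w') as f:
    f.write('Hello, World!')
with fs.open('output.txt') as f:
    print(f.read())  # prints: Hello, World!
```

The real AtlasFS talks to Atlassian attachment storage rather than a dict, but the calling code you write in a notebook looks the same.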

We hope this approach is more intuitive than trying to remember custom APIs.

The credentials to access your storage layer are provided automatically: a notebook runs on your own private server, which can only be started from your authenticated Atlassian session.