BMRC

Docs, scripts, and conda environments for working on the BMRC cluster at the University of Oxford.

  • envs/ — conda environment files.
  • slurm/ — example SLURM job scripts.

Using BMRC

Access

ssh <username>@cluster1.bmrc.ox.ac.uk   # or cluster2/3/4

The clusterX login nodes are for file management and job submission only; do not run processing on them.

To make access persistent (no repeated passwords), add the following to ~/.ssh/config:

Host linux
    HostName linux.ox.ac.uk
    User <oxford-sso-username>
    IdentityFile ~/.ssh/ox-linux

Host cluster1.bmrc
    HostName cluster1.bmrc.ox.ac.uk
    User <bmrc-username>
    ProxyJump linux
    ControlMaster auto
    ControlPath ~/.ssh/ssh-%r@%h:%p
    ControlPersist yes

Then ssh cluster1.bmrc (and rsync ... cluster1.bmrc:...) work without repeated password prompts.
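
The config above assumes an SSH key at ~/.ssh/ox-linux. If you have not created one yet, a minimal sketch (run on your local machine; the key name just needs to match the IdentityFile line):

ssh-keygen -t ed25519 -f ~/.ssh/ox-linux                                  # create the key pair
ssh-copy-id -i ~/.ssh/ox-linux.pub <oxford-sso-username>@linux.ox.ac.uk   # install the public key on the jump host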

Storage

Location                           Use
/users/<group>/<username>          Home directory — small quota (10 GB)
/well/<group>/users/<username>     Project/work storage — put data/code here
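
To check how much space you are using (a generic sketch; BMRC may also provide its own quota-reporting tools):

du -sh /users/<group>/<username>     # total size of your home directory
df -h /well/<group>                  # usage and free space on the group's project storage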

File transfer

From a local machine, use rsync:

# to BMRC
rsync -Phr <file> <username>@cluster1.bmrc.ox.ac.uk:/well/<group>/users/<username>/
# from BMRC
rsync -Phr <username>@cluster1.bmrc.ox.ac.uk:/path/to/file ./

Interactive jobs

Use screen so sessions survive disconnects:

screen                                   # start a session
srun -p short --pty bash                 # CPU node
srun -p gpu_interactive --gres gpu:1 --pty bash   # GPU node
# detach: Ctrl-A Ctrl-D ; list: screen -ls ; reattach: screen -r <id>

An interactive GPU node (compG017) with 2 GPUs is also directly accessible via ssh compG017. Check the GPUs with nvidia-smi. On that node, unset the proxy environment variables before accessing the network:

unset https_proxy http_proxy no_proxy HTTPS_PROXY HTTP_PROXY NO_PROXY

Batch jobs (SLURM)

Write a job.sh with SLURM directives and submit:

sbatch job.sh
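
A minimal job.sh sketch, assuming the short partition used above (the resource requests and the commands at the end are placeholders; see slurm/ for the repository's own examples):

#!/bin/bash
#SBATCH --job-name=example          # name shown in squeue
#SBATCH --partition=short           # partition/queue to submit to
#SBATCH --ntasks=1                  # a single task
#SBATCH --cpus-per-task=4           # CPU cores for that task
#SBATCH --mem=8G                    # memory for the job
#SBATCH --time=01:00:00             # wall-clock limit (HH:MM:SS)
#SBATCH --output=%x_%j.out          # log file (%x = job name, %j = job id)

module load Miniforge3              # make conda available
eval "$(conda shell.bash hook)"     # enable conda activate in the batch shell
conda activate <env-name>           # activate your environment
python <your-script>.py             # run the analysis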

Useful commands:

watch squeue -u <username>    # monitor your jobs
squeue -p <queue>             # queue status
scancel <job-id>              # cancel one job
scancel -u <username>         # cancel all your jobs

Modules / environment

module load Miniforge3
module load git

Add frequently used module load lines to ~/.bashrc for convenience.
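
To build one of the environments shipped in envs/, a minimal sketch (the file name is a placeholder; use whichever file in envs/ you need):

module load Miniforge3
conda env create -f envs/<environment>.yml    # create the environment from the file
conda activate <environment>                  # activate it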

Remote editing with VSCode

VSCode with the Remote - SSH extension gives you a local editor with access to a shell and the filesystem on BMRC.

One-time setup:

  1. Activate your Oxford Linux Shell Account via the university IT help portal.

  2. Install VSCode.

  3. Install the Remote - SSH extension from the Marketplace.

  4. Click the green remote indicator (bottom-left of VSCode) → Open SSH Configuration File… → pick ~/.ssh/config.

  5. Add a host entry:

    Host vscode-bmrc
        HostName cluster1.bmrc.ox.ac.uk
        ProxyJump <oxford-sso-username>@linux.ox.ac.uk
        User <bmrc-username>
        ForwardAgent yes

    On the Oxford VPN you can omit the ProxyJump line.

Connect:

Click the green remote indicator → Connect to Host… → vscode-bmrc. Enter your SSO password, then your BMRC password.

Tip: Set up SSH keys for linux.ox.ac.uk to avoid typing two passwords each time (standard ssh-copy-id / ~/.ssh/authorized_keys workflow).
