
...

No Format
 MOX: mmlsquota -j mygroup --block-size G gscratch

 IKT: mmlsquota -j hyak-mygroup --block-size G suppscr

Usage reporting is written hourly to a file named usage_report.txt at the root of each group's space. Reports can also be sent by e-mail weekly (Monday morning) and daily whenever your group is over its quota. If you are interested in e-mail reporting, please contact us.
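A quick way to check the latest report from a login node is simply to print the file. The path below assumes your group's directory is /gscratch/mygroup; substitute your own group name.

```shell
# Print the hourly usage report for your group.
# /gscratch/mygroup is a stand-in path -- use your group's actual directory.
REPORT=/gscratch/mygroup/usage_report.txt
if [ -f "$REPORT" ]; then
    cat "$REPORT"
else
    echo "usage report not found (are you on a Hyak login node?)"
fi
```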

...

No Format
MOX: mmlsquota --block-size G gscratch:home 

IKT: mmlsquota --block-size G suppscr:home 

Node Local Scratch

Each node in Hyak has a /scr filesystem, typically with ~100GB of space. These filesystems are local to each node and are not shared with any other Hyak nodes. They are available for cases where your application's performance is limited by the shared scratch. mpiBLAST, for example, partitions a reference database and distributes it among the nodes participating in a calculation to achieve better I/O scaling. Node-local scratch filesystems are not backed up, and they are cleared out when each job completes. If you would like to keep data on the local scratch disk of your nodes, create a directory named after your group and put your data there (if your UNIX group is hyak-mygroup, you would create /scr/mygroup).
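The staging pattern above can be sketched in a few lines of a job script. This is only a sketch, not an official procedure: on a real Hyak node SCR_ROOT would simply be /scr, and mygroup stands in for your group's name; the temp-directory fallback just lets the sketch run anywhere.

```shell
# Stage input data onto node-local scratch under a group-named directory.
SCR_ROOT=${SCR_ROOT:-$(mktemp -d)}     # would be /scr on a Hyak node (fallback is for demo only)
GROUP_DIR="$SCR_ROOT/mygroup"          # mygroup = your UNIX group's short name (assumption)
mkdir -p "$GROUP_DIR"
echo "input data" > "$GROUP_DIR/input.dat"   # stand-in for copying your real input files
ls "$GROUP_DIR"                              # prints "input.dat"
```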

Note on filesystem permissions

...

  • Into Hyak
    • via scp, e.g. command line:
      scp filename user@mox.hyak.uw.edu:path/to/destination/directory
    • via tar, e.g. command line:
      tar -czf - local_directory | ssh user@mox.hyak.uw.edu 'tar -xvzf -'
    • via tar, e.g. command line through ssh channel (see Logging In)
      tar -czf - local_directory | ssh -S /tmp/$USER-hyak-socket mox.hyak.uw.edu "tar -xvzf -"
    • via bbcp
      bbcp mydatafile mox1.hyak.uw.edu:/gscratch/mygroup/myuserid
      bbcp mydatafile mox2.hyak.uw.edu:/gscratch/mygroup/myuserid
    • via rsync
      rsync -avz local_directory user@mox.hyak.uw.edu:path/to/destination/directory
  • Out of Hyak
    • via scp, to local working directory
      scp user@mox.hyak.washington.edu:path/to/file .
    • via tar
      ssh user@mox.hyak.washington.edu '(cd path/to/source/directory; tar -czf - .)' | tar -xvzf -
    • via tar, e.g. command line through ssh channel (see Logging In)
      ssh -S /tmp/$USER-hyak-socket mox.hyak.uw.edu "tar -czf - path/to/remote/directory" | tar -xvzf -
    • via rsync
      rsync -avz user@mox.hyak.washington.edu:path/to/source/directory .
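Every tar variant above is the same pipe, with ssh moving one end of it to the remote side; you can sanity-check the pipeline locally before adding the remote hop. A minimal sketch:

```shell
# Local demonstration of the tar pipe: pack a directory to stdout,
# unpack it from stdin somewhere else (ssh would sit between the two).
demo=$(mktemp -d)
mkdir -p "$demo/src" "$demo/dst"
echo "hello" > "$demo/src/file.txt"
tar -C "$demo" -czf - src | tar -C "$demo/dst" -xzf -
cat "$demo/dst/src/file.txt"   # prints "hello"
```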

...