To BioMade users who have been using the "smap" command to view the Slurm queue: the "smap" command has been removed in the latest Slurm software, which was deployed during last week's biomade.sr.unh.edu shutdown.  I have already received two emails wondering what happened to "smap", so I thought it made sense to send out a general message to all of the BioMade HPC Core users.

The Slurm team removed "smap" because they felt it was redundant with the "squeue" command.

See the email message below for an example of both the default "squeue" command and the output of an alias that RCC added via the "unhrcc" module, which provides a bit more useful information.  These commands can be adjusted to show more or less information.  If you come up with a variation that you find useful, let me know and I can add another alias for it.
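In the meantime, if you'd like to experiment yourself, here is a minimal sketch of how a personal variation might look (the format codes are standard squeue -o options, but the name "myqueue" and the column widths are just an illustration):

# Hypothetical example; add it to your ~/.bashrc to make it permanent.
# %A = job id, %P = partition, %u = user, %M = elapsed time, %D = node count, %R = nodelist or pending reason
alias myqueue='squeue -o "%10A %10P %10u %10M %4D %20R"'

See the -o/--format section of "man squeue" for the full list of format codes.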

Robert E. Anderson

Associate Director

Univ of NH Research Computing Center


-------- Original Message --------
Subject: Re: [biomade-users] BioMade HPC Core is back online
Date: Fri, 4 Feb 2022 12:24:36 -0500 (EST)
From: "Robert E. Anderson" <rea@sr.unh.edu>
To: "Nikolai Matukhno" <nikolai.matukhno@unh.edu>
CC: "rccops@sr.unh.edu" <rccops@sr.unh.edu>

Thanks for the heads-up, but the "smap" command has apparently been removed from Slurm because it was considered to be just a special case of squeue.

https://slurm.schedmd.com/SLUG19/Slurm_20.02_and_Beyond.pdf

What was it that you used in smap?  Perhaps an alias for squeue could provide the same information.  I have already created one alias that I find useful.  Here's an example of the default squeue output, then the output after loading the "unhrcc" module and using the squeue2 command, all followed by the actual aliases for squeue2 (and sacct2):


[rea@login02 ~]$ squeue -a
            JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
1528022_[252-1349%   general TestPara   kv1033 PD       0:00      1 (JobArrayTaskLimit)
1528222_[110-1349%   general TestPara   kv1033 PD       0:00      1 (JobArrayTaskLimit)
      1528022_251   general TestPara   kv1033  R       0:49      1 node19
      1528022_250   general TestPara   kv1033  R       1:31      1 node19
      1528022_249   general TestPara   kv1033  R       1:34      1 node19
      1528022_248   general TestPara   kv1033  R       1:50      1 node19
      1528022_247   general TestPara   kv1033  R       2:14      1 node19
      1528022_229   general TestPara   kv1033  R      18:25      1 node19
      1528022_220   general TestPara   kv1033  R      25:36      1 node19
      1528022_202   general TestPara   kv1033  R      44:25      1 node19
      1528022_193   general TestPara   kv1033  R      53:16      1 node19
      1528022_184   general TestPara   kv1033  R      57:04      1 node19
          1528221   general newParam   kv1033  R      50:26      1 node21
      1528222_109   general TestPara   kv1033  R       0:20      1 node19
      1528222_106   general TestPara   kv1033  R      10:53      1 node19
      1528222_105   general TestPara   kv1033  R      11:40      1 node19
      1528222_103   general TestPara   kv1033  R      12:25      1 node19
       1528222_94   general TestPara   kv1033  R      17:46      1 node19
       1528222_76   general TestPara   kv1033  R      30:33      1 node19
       1528222_67   general TestPara   kv1033  R      33:16      1 node19
       1528222_49   general TestPara   kv1033  R      37:56      1 node19
       1528222_22   general TestPara   kv1033  R      44:05      1 node19
       1528222_13   general TestPara   kv1033  R      46:30      1 node19
          1528012   thrust3    k89a1   sp1564  R   13:21:30      1 node10
          1528020   thrust3    k89a2   sp1564  R    2:59:49      1 node18


[rea@login02 ~]$ module load unhrcc
[rea@login02 ~]$ squeue2 -a         
Fri Feb  4 12:19:18 EST 2022
JOBID  PARTITIO SUBMIT_TIME      USER       GROUP      TIME     START_TIME       TIME_LEFT     i NODELIST(R
152802 general  2022-02-04T09:53 kv1033     thrust2    0:00     N/A              1:00:00       i (JobArrayT
152822 general  2022-02-04T11:30 kv1033     thrust2    0:00     N/A              1:00:00       i (JobArrayT
152838 general  2022-02-04T09:53 kv1033     thrust2    0:06     2022-02-04T12:19 59:54         i node19     
152838 general  2022-02-04T09:53 kv1033     thrust2    1:03     2022-02-04T12:18 58:57         i node19     
152838 general  2022-02-04T09:53 kv1033     thrust2    1:48     2022-02-04T12:17 58:12         i node19     
152838 general  2022-02-04T09:53 kv1033     thrust2    2:04     2022-02-04T12:17 57:56         i node19     
152838 general  2022-02-04T09:53 kv1033     thrust2    2:28     2022-02-04T12:16 57:32         i node19     
152834 general  2022-02-04T09:53 kv1033     thrust2    18:39    2022-02-04T12:00 41:21         i node19     
152832 general  2022-02-04T09:53 kv1033     thrust2    25:50    2022-02-04T11:53 34:10         i node19     
152824 general  2022-02-04T09:53 kv1033     thrust2    44:39    2022-02-04T11:34 15:21         i node19     
152821 general  2022-02-04T09:53 kv1033     thrust2    53:30    2022-02-04T11:25 6:30          i node19     
152820 general  2022-02-04T09:53 kv1033     thrust2    57:18    2022-02-04T11:22 2:42          i node19     
152822 general  2022-02-04T11:28 kv1033     thrust2    50:40    2022-02-04T11:28 9:20          i node21     
152838 general  2022-02-04T11:30 kv1033     thrust2    0:34     2022-02-04T12:18 59:26         i node19     
152837 general  2022-02-04T11:30 kv1033     thrust2    11:07    2022-02-04T12:08 48:53         i node19     
152836 general  2022-02-04T11:30 kv1033     thrust2    11:54    2022-02-04T12:07 48:06         i node19     
152836 general  2022-02-04T11:30 kv1033     thrust2    12:39    2022-02-04T12:06 47:21         i node19     
152835 general  2022-02-04T11:30 kv1033     thrust2    18:00    2022-02-04T12:01 42:00         i node19     
152831 general  2022-02-04T11:30 kv1033     thrust2    30:47    2022-02-04T11:48 29:13         i node19     
152830 general  2022-02-04T11:30 kv1033     thrust2    33:30    2022-02-04T11:45 26:30         i node19     
152828 general  2022-02-04T11:30 kv1033     thrust2    38:10    2022-02-04T11:41 21:50         i node19     
152825 general  2022-02-04T11:30 kv1033     thrust2    44:19    2022-02-04T11:34 15:41         i node19     
152823 general  2022-02-04T11:30 kv1033     thrust2    46:44    2022-02-04T11:32 13:16         i node19     
152801 thrust3  2022-02-03T22:57 sp1564     thrust3    13:21:44 2022-02-03T22:57 59-10:38:16   i node10     
152802 thrust3  2022-02-04T09:19 sp1564     thrust3    3:00:03  2022-02-04T09:19 59-20:59:57   i node18


[rea@login02 ~]$ alias  
alias sacct2='date ; sacct --format jobid,partition,user,jobname,state,end,elapsed'
alias squeue2='date ; squeue -o "%6A %8P %16V %10u %10g %8M %16S %14Li %10R"'
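
Nikolai, if what you used smap for was checking which nodes have free CPUs and memory before submitting, sinfo can show that as well.  A rough sketch, using standard sinfo --Format fields (adjust to taste):

# Per-node view: long state, allocated/idle/other/total CPUs, total and free memory (in MB)
sinfo -N -O NodeList,StateLong,CPUsState,Memory,FreeMem

For jobs that have already finished, the sacct2 alias above gives a similar dated summary; for example, "sacct2 -j 1528221" reports on just that one job.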

Robert E. Anderson

Associate Director

Univ of NH Research Computing Center


Nikolai Matukhno wrote:


Hi Robert,

I logged into my account on BioMade and tried to run a simple Slurm command, "smap", and it says "smap: command not found".  But if I use "sinfo" and "squeue", they work just fine.  Am I missing something here?  I usually use smap before submitting any job to make sure I can add my job with a specified number of CPUs and amount of RAM.

Any help is appreciated.

Thank you,
Nikolai

From: biomade-users <biomade-users-bounces@lists.sr.unh.edu> on behalf of Robert E. Anderson <rea@sr.unh.edu>
Sent: Thursday, February 3, 2022 12:04 PM
To: biomade-users@lists.sr.unh.edu <biomade-users@lists.sr.unh.edu>
Subject: [biomade-users] BioMade HPC Core is back online
 
RCC has completed the BIOS updates, OS patches, and new hardware installs on the BioMade HPC Core.

You may log in and run jobs at this time.

If you discover anything that is not working correctly, please let us know.

Thanks.

Robert E. Anderson

Associate Director

Univ of NH Research Computing Center