
use instantaneous fields for coupling #1942

Merged: 33 commits merged into ufs-community:develop on Nov 8, 2023

Conversation

@DeniseWorthen (Collaborator) commented Oct 16, 2023

PR Author Checklist:

  • I have linked PRs from all sub-components involved in the section below.
  • I am confirming reviews are completed in ALL sub-component PRs. (NA)
  • I have run the full RT suite on either Hera/Cheyenne AND have attached the log to this PR below this line:
  • I have added the list of all failed regression tests to the "Anticipated changes" section.
  • I have filled out all sections of the template.

Description

Changes field aliases to allow export of instantaneous fields rather than means. The instantaneous and mean fields are not bit-for-bit (B4B) identical, but generally differ at roughly O(10^-12). See the figures in #1800 (comment).
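
As a rough illustration of why this changes answers without being bit-for-bit, here is a minimal Python sketch. It is not the actual CMEPS/FV3 coupling code; `coupling_exports` and the toy field are invented for this example. It contrasts the time-mean of a field over a coupling interval with its instantaneous value at the end of the interval:

```python
# Hypothetical sketch: time-mean vs. instantaneous export of a coupling field.
# This is not UFS/CMEPS code; it only illustrates why the two exports agree
# closely (but not bit-for-bit) for a field that varies slowly over the
# coupling interval.
import numpy as np

def coupling_exports(field_at_step, nsteps):
    """Return (mean_export, inst_export) for one coupling interval.

    field_at_step(n) gives the model field at fast time step n;
    nsteps is the number of fast steps per coupling interval.
    """
    samples = np.array([field_at_step(n) for n in range(nsteps)])
    mean_export = samples.mean(axis=0)   # time-mean over the interval
    inst_export = samples[-1]            # instantaneous value at coupling time
    return mean_export, inst_export

# A nearly steady toy field: the two exports differ only at the ~1e-12 level.
base = np.random.default_rng(0).random(5)
mean_x, inst_x = coupling_exports(lambda n: base * (1.0 + 1.0e-12 * n), nsteps=4)
print(np.abs(mean_x - inst_x))   # differences of order 1e-12, not zero
```

The point of the toy example is only that swapping a mean export for an instantaneous one perturbs the coupled fields slightly, which is consistent with the baseline changes listed under Regression Tests below.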

Logs will be generated after the two dependent PRs are merged.

Linked Issues and Pull Requests

Associated UFSWM Issue to close

Subcomponent Pull Requests

Blocking Dependencies

Subcomponents involved:

  • AQM
  • CDEPS
  • CICE
  • CMEPS
  • CMakeModules
  • FV3
  • GOCART
  • HYCOM
  • MOM6
  • NOAHMP
  • WW3
  • stochastic_physics
  • none

Anticipated Changes

Input data

  • No changes are expected to input data.
  • Changes are expected to input data:
    • New input data.
    • Updated input data.

Regression Tests:

  • No changes are expected to any regression test.
  • Changes are expected to the following tests:
Tests affected by changes in this PR:

Changes are expected for all global coupled, HAFS regional, and atm-land tests.

tests/logs/log_hera: grep FAIL rt_* 
rt_001_cpld_control_p8_mixedmode_intel.log:71:Test 001 cpld_control_p8_mixedmode_intel FAIL Tries: 2
rt_002_cpld_control_gfsv17_intel.log:70:Test 002 cpld_control_gfsv17_intel FAIL Tries: 2
rt_004_cpld_control_p8_intel.log:71:Test 004 cpld_control_p8_intel FAIL Tries: 2
rt_006_cpld_control_qr_p8_intel.log:71:Test 006 cpld_control_qr_p8_intel FAIL Tries: 2
rt_008_cpld_2threads_p8_intel.log:59:Test 008 cpld_2threads_p8_intel FAIL Tries: 2
rt_009_cpld_decomp_p8_intel.log:59:Test 009 cpld_decomp_p8_intel FAIL Tries: 2
rt_010_cpld_mpi_p8_intel.log:59:Test 010 cpld_mpi_p8_intel FAIL Tries: 2
rt_011_cpld_control_ciceC_p8_intel.log:71:Test 011 cpld_control_ciceC_p8_intel FAIL Tries: 2
rt_012_cpld_control_c192_p8_intel.log:59:Test 012 cpld_control_c192_p8_intel FAIL Tries: 2
rt_014_cpld_bmark_p8_intel.log:54:Test 014 cpld_bmark_p8_intel FAIL Tries: 2
rt_016_cpld_control_noaero_p8_intel.log:70:Test 016 cpld_control_noaero_p8_intel FAIL Tries: 2
rt_017_cpld_control_nowave_noaero_p8_intel.log:68:Test 017 cpld_control_nowave_noaero_p8_intel FAIL Tries: 2
rt_018_cpld_debug_p8_intel.log:59:Test 018 cpld_debug_p8_intel FAIL Tries: 2
rt_019_cpld_debug_noaero_p8_intel.log:58:Test 019 cpld_debug_noaero_p8_intel FAIL Tries: 2
rt_020_cpld_control_noaero_p8_agrid_intel.log:68:Test 020 cpld_control_noaero_p8_agrid_intel FAIL Tries: 2
rt_021_cpld_control_c48_intel.log:56:Test 021 cpld_control_c48_intel FAIL Tries: 2
rt_022_cpld_control_p8_faster_intel.log:71:Test 022 cpld_control_p8_faster_intel FAIL Tries: 2
rt_023_cpld_control_pdlib_p8_intel.log:70:Test 023 cpld_control_pdlib_p8_intel FAIL Tries: 2
rt_026_cpld_debug_pdlib_p8_intel.log:58:Test 026 cpld_debug_pdlib_p8_intel FAIL Tries: 2
rt_134_hafs_regional_atm_ocn_intel.log:15:Test 134 hafs_regional_atm_ocn_intel FAIL Tries: 2
rt_136_hafs_regional_atm_ocn_wav_intel.log:17:Test 136 hafs_regional_atm_ocn_wav_intel FAIL Tries: 2
rt_147_hafs_regional_storm_following_1nest_atm_ocn_intel.log:15:Test 147 hafs_regional_storm_following_1nest_atm_ocn_intel FAIL Tries: 2
rt_149_hafs_regional_storm_following_1nest_atm_ocn_debug_intel.log:13:Test 149 hafs_regional_storm_following_1nest_atm_ocn_debug_intel FAIL Tries: 2
rt_150_hafs_regional_storm_following_1nest_atm_ocn_wav_intel.log:17:Test 150 hafs_regional_storm_following_1nest_atm_ocn_wav_intel FAIL Tries: 2
rt_151_hafs_regional_docn_intel.log:14:Test 151 hafs_regional_docn_intel FAIL Tries: 2
rt_152_hafs_regional_docn_oisst_intel.log:14:Test 152 hafs_regional_docn_oisst_intel FAIL Tries: 2
rt_171_control_p8_atmlnd_sbs_intel.log:91:Test 171 control_p8_atmlnd_sbs_intel FAIL Tries: 2
rt_235_cpld_control_p8_gnu.log:71:Test 235 cpld_control_p8_gnu FAIL Tries: 2
rt_236_cpld_control_nowave_noaero_p8_gnu.log:68:Test 236 cpld_control_nowave_noaero_p8_gnu FAIL Tries: 2
rt_237_cpld_debug_p8_gnu.log:59:Test 237 cpld_debug_p8_gnu FAIL Tries: 2
rt_238_cpld_control_pdlib_p8_gnu.log:70:Test 238 cpld_control_pdlib_p8_gnu FAIL Tries: 2
rt_239_cpld_debug_pdlib_p8_gnu.log:58:Test 239 cpld_debug_pdlib_p8_gnu FAIL Tries: 2

Regression tests were re-run on Hera at 3747dcd, confirming previous results.
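
The list above was produced with `grep FAIL rt_*` in the log directory; a small hypothetical Python equivalent is sketched below (the `tests/logs/log_hera` path and `rt_*.log` naming are taken from the excerpt above, and the helper is not part of the rt.sh tooling):

```python
# Hypothetical helper, equivalent in spirit to `grep FAIL rt_*`:
# scan the per-test regression logs and report which tests failed.
import glob
import re

def failed_tests(log_dir="tests/logs/log_hera"):
    """Return the names of tests whose logs contain a FAIL line."""
    failures = []
    for path in sorted(glob.glob(f"{log_dir}/rt_*.log")):
        with open(path) as f:
            for line in f:
                match = re.search(r"Test\s+\d+\s+(\S+)\s+FAIL", line)
                if match:
                    failures.append(match.group(1))
                    break   # one FAIL line per test log is enough
    return failures

if __name__ == "__main__":
    for name in failed_tests():
        print(name)
```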

Libraries

  • Not Needed
  • Needed
    • Create separate issue in JCSDA/spack-stack asking for update to library. Include library name, library version.
    • Add issue link from JCSDA/spack-stack following this item

Code Managers Log

  • This PR is up-to-date with the top of all sub-component repositories except for those sub-components which are the subject of this PR.
  • Move new/updated input data on RDHPCS Hera and propagate input data changes to all supported systems.
    • N/A

Testing Log:

  • RDHPCS
    • Hera
    • Orion
    • Hercules
    • Jet
    • Gaea
    • Cheyenne
  • WCOSS2
    • Dogwood/Cactus
    • Acorn
  • CI
    • Completed
  • opnReqTest
    • N/A
    • Log attached to comment

@github-actions (bot)

@DeniseWorthen please bring these up to date with respective authoritative repositories

  • ufs-weather-model NOT up to date
  • fv3 NOT up to date

DeniseWorthen marked this pull request as ready for review on October 21, 2023 at 15:39.
@zach1221 (Collaborator) commented Nov 7, 2023

@ulmononian I'm not sure if you're the best contact for this or Dom, but I'm having trouble with failing gnu tests on Hercules, as mvapich insists a slurm module must also be loaded. However, mvapich still fails to load even when the slurm module is loaded.
[screenshot: module load error]

@jkbk2004 (Collaborator) commented Nov 7, 2023

> @ulmononian I'm not sure if you're the best contact for this or Dom, but I'm having trouble with failing gnu tests on Hercules, as mvapich insists a slurm module must also be loaded. However, mvapich still fails to load even when the slurm module is loaded.

@ulmononian @natalie-perlin can we follow up on the spack-stack issue on Hercules?

@zach1221 (Collaborator) commented Nov 7, 2023

> @ulmononian I'm not sure if you're the best contact for this or Dom, but I'm having trouble with failing gnu tests on Hercules, as mvapich insists a slurm module must also be loaded. However, mvapich still fails to load even when the slurm module is loaded.
>
> @ulmononian @natalie-perlin can we follow up on the spack-stack issue on Hercules?

I'm talking to Natalie now.

@DeniseWorthen (Collaborator, Author)

@zach1221 What changed on Hercules between Nov 2 (the last time we have a log) and now?

@zach1221 (Collaborator) commented Nov 7, 2023

> @zach1221 What changed on Hercules between Nov 2 (the last time we have a log) and now?

Not certain exactly yet, but Dom suspects the Hercules admins changed something with the slurm modules.

@jkbk2004 (Collaborator) commented Nov 7, 2023

> @zach1221 What changed on Hercules between Nov 2 (the last time we have a log) and now?
>
> Not certain exactly yet, but Dom suspects the Hercules admins changed something with the slurm modules.

If we need a re-installation of spack-stack, then I think we should keep the PR moving and address the issue in the next PR. Certainly, the admins updated slurm on Hercules. @BrianCurtis-NOAA what do you think?

@BrianCurtis-NOAA (Collaborator)

> @zach1221 What changed on Hercules between Nov 2 (the last time we have a log) and now?
>
> Not certain exactly yet, but Dom suspects the Hercules admins changed something with the slurm modules.
>
> If we need a re-installation of spack-stack, then I think we should keep the PR moving and address the issue in the next PR. Certainly, the admins updated slurm on Hercules. @BrianCurtis-NOAA what do you think?

I think we can run on Hercules now: #1985

@zach1221 (Collaborator) commented Nov 7, 2023

@BrianCurtis-NOAA Yes, I'm going to re-try gnu on Hercules now with 1.5.0.

@zach1221 (Collaborator) commented Nov 7, 2023

Ok, we're finally done with regression testing. Sending out final reviews now.

zach1221 merged commit 0d2fb37 into ufs-community:develop on Nov 8, 2023.
SamuelTrahanNOAA added a commit to SamuelTrahanNOAA/ufs-weather-model that referenced this pull request Nov 9, 2023
Labels
  • Baseline Updates: Current baselines will be updated.
  • jenkins-ci: Jenkins CI: ORT build/test on docker container
  • Ready for Commit Queue: The PR is ready for the Commit Queue. All checkboxes in PR template have been checked.