Dissipation thorn schedules LOCAL routines after GLOBAL ones

Issue #960 closed
Roland Haas created an issue

Dissipation currently contains the schedule item

    SCHEDULE setup_epsdis AT cctk_poststep after SphericalSurface_HasBeenSet
    {
      LANG: C
      SYNC: epsdisA_group
    } "Setup spatially varying dissipation"

However, SphericalSurface_HasBeenSet is AFTER SphericalSurface_Set, which is a GLOBAL routine. Since GLOBAL routines run last in POSTSTEP (which is in EVOL), the AFTER modifier is ignored for all but the last (finest) refinement level. This can lead to the wrong surface shape being used by the local routines.
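
For illustration, a sketch of the SphericalSurface side of this ordering, paraphrased from the description above rather than copied from SphericalSurface's actual schedule.ccl:

    SCHEDULE SphericalSurface_Set AT cctk_poststep
    {
      LANG: C
      OPTIONS: global
    } "Set the spherical surfaces"

    SCHEDULE GROUP SphericalSurface_HasBeenSet AT cctk_poststep AFTER SphericalSurface_Set
    {
    } "The spherical surfaces have been set when this group runs"

Carpet traverses POSTSTEP once per refinement level for LOCAL routines and runs GLOBAL routines only once, after the per-level passes, so on all but the finest level setup_epsdis sees the surface data from the previous iteration.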

It might actually make sense to teach the flesh about GLOBAL/LOCAL etc. and refuse AFTER/BEFORE statements that span different modes. This of course depends on how much work it would be, on whether we expect the dependency- and task-based scheduler to be finished soon, and on whether there are legitimate uses for AFTER/BEFORE statements that span modes.

Keywords: Dissipation, SphericalSurface

Comments (10)

  1. Ian Hinder

    Do you have some ideas for how to replace modes with something better? I imagine this is related to the data-driven scheduler ideas.

  2. Erik Schnetter

    Instead of modes, the schedule entry should specify which regions of grid functions a routine reads and writes. We are currently thinking of regions such as "everywhere", "interior", or "boundary" -- extending this to "all components on this level" or "all levels" should do the trick (a sketch follows below).

    We may still need modes for internal routines e.g. in driver-related or multiblock-related thorns, but I hope to be able to avoid this in physics thorns.
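
    For concreteness, a sketch of what such region declarations might look like for the routine from this ticket (hypothetical syntax at the time of writing; the variable names are illustrative only):

        SCHEDULE setup_epsdis AT cctk_poststep
        {
          LANG: C
          READS: SphericalSurface::sf_radius(everywhere)
          WRITES: Dissipation::epsdisA_group(interior)
        } "Setup spatially varying dissipation"

    Given such declarations, the scheduler could derive both the ordering and the required synchronization itself, so no AFTER/BEFORE clause would need to span modes.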

  3. Roland Haas reporter
    • changed status to open

    The two attached patches (one with code changes and a compressed one with changes to the test data, to stay below the 250K attachment size limit) add special logic to schedule setup_epsdis in global mode in POSTSTEP. The extra logic ensures that SYNC only happens on the levels that the local routine would run on, rather than on all levels. As far as I can tell this cannot be achieved with SYNC statements in schedule.ccl and requires direct interaction with Carpet and CCTK_SyncGroup().

  4. Erik Schnetter
    • changed status to open

    Instead of replicating Carpet's logic, there could be a level-mode routine that does nothing except record on which levels it is called. A subsequent global loop-level mode routine could call CCTK_SyncGroup on these levels (see the sketch below). Would this work (and be simpler than the current code)?

    If not, please apply the patch.
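
    A minimal sketch of this two-routine design, assuming the thorn USES Carpet's aliased function GetRefinementLevel in its interface.ccl; the routine names and the fixed level limit are hypothetical:

        #include "cctk.h"
        #include "cctk_Arguments.h"
        #include "cctk_Functions.h"

        #define MAX_LEVELS 64
        static int level_active[MAX_LEVELS]; /* levels the local routine ran on */

        /* scheduled in level mode alongside setup_epsdis: record this level */
        void Dissipation_RecordLevel(CCTK_ARGUMENTS)
        {
          DECLARE_CCTK_ARGUMENTS;
          const int rl = GetRefinementLevel(cctkGH);
          if (rl >= 0 && rl < MAX_LEVELS)
            level_active[rl] = 1;
        }

        /* scheduled afterwards with OPTIONS: global loop-level, so Carpet calls
           it once per level and CCTK_SyncGroup acts on the current level only */
        void Dissipation_SyncRecordedLevels(CCTK_ARGUMENTS)
        {
          DECLARE_CCTK_ARGUMENTS;
          const int rl = GetRefinementLevel(cctkGH);
          if (rl >= 0 && rl < MAX_LEVELS && level_active[rl])
          {
            CCTK_SyncGroup(cctkGH, "Dissipation::epsdisA_group");
            level_active[rl] = 0; /* reset for the next iteration */
          }
        }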

  5. Roland Haas reporter

    Attached please find a patch (loose-interaction.patch) which accomplishes this. It no longer replicates Carpet's logic, but at the price of introducing a second function and more #ifdefs. Which one is nicer depends on how crucial it is to keep user thorns ignorant of Carpet's internal logic, I guess. Both versions are fine with me (though I slightly prefer the first version since it does not require a static variable). Alternatively, since I know of several other thorns that could benefit from something like this, how about defining (yet another) aliased function RunOnLevel(rl) which returns `cctk_iteration % do_every == 0` or `(cctk_iteration-1) % do_every == 0` depending on whether we are in EVOL or not (sketched below)? I.e., an aliased routine that exposes Carpet's decision (but not the logic behind it) to user code in a hopefully controlled manner.
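
    A sketch of the proposed predicate; the name is hypothetical, and do_every and the EVOL flag are taken as parameters here only because the real implementation would obtain both from Carpet's internal timing data:

        #include "cctk.h"

        static CCTK_INT RunOnLevel_sketch(CCTK_INT cctk_iteration,
                                          CCTK_INT do_every, int in_evol)
        {
          if (in_evol)
            /* in EVOL a level is processed when the iteration counter is a
               multiple of do_every */
            return cctk_iteration % do_every == 0;
          else
            /* outside EVOL the counter has already been advanced, hence the
               shift by one */
            return (cctk_iteration - 1) % do_every == 0;
        }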

  6. Erik Schnetter

    Such an aliased function is a good idea. Presumably, even Carpet itself may benefit from using it...

  7. Roland Haas reporter
    • changed status to open

    Changed to re-opened to remind me (rhaas) to write the aliased function mentioned in comment:7.

  8. Roland Haas reporter
    • changed status to resolved

    This issue is fixed, along with an issue in AHFinderDirect, in #1880. I created a new ticket, #1901, for the aliased functions proposed here.
