
Communication graph and Gantt chart #6

Merged — 31 commits merged into pc2:main on Feb 28, 2023
Conversation

@Mellich (Collaborator) commented on Feb 21, 2023:

This PR makes the following changes:

  • Extends MPIEvent with an end time to allow the generation of a Gantt chart (Gantt Chart #3); see the sketch after this list
  • Adds functionality to generate a communication graph from a merged tape
  • Adds primitive plotting functions for Gantt charts and arrows reflecting communication between ranks
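
To illustrate the first point, here is a minimal sketch of how an event with both a start and an end time maps onto a Gantt bar. The type and function names (TimedEvent, gantt_bar) are hypothetical, for illustration only; the real MPIEvent in MPITape has different fields:

```julia
# Hypothetical event type, not the actual MPIEvent API.
struct TimedEvent
    rank::Int
    funcname::String   # e.g. "MPI_Send"
    t_start::Float64
    t_end::Float64     # the newly added end time
end

# With an end time available, each event maps to one horizontal Gantt bar
# on its rank's row: x from t_start to t_end, y fixed at the rank index.
gantt_bar(ev::TimedEvent) = (y = ev.rank, x0 = ev.t_start, w = ev.t_end - ev.t_start)

gantt_bar(TimedEvent(0, "MPI_Send", 1.0, 1.5))  # (y = 0, x0 = 1.0, w = 0.5)
```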

@Mellich requested a review from @carstenbauer on Feb 21, 2023, 13:11
@@ -86,6 +86,8 @@ MPITape.print_mytape()
 tape_merged = MPITape.merge()
 if rank == 0 # Master
     MPITape.print_merged(tape_merged)
+    display(MPITape.plot_merged(tape_merged))
+    readline()
@carstenbauer (Collaborator) commented:

The plot isn't interactive, right? Might be better to just directly save as png/pdf.
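
For example, assuming plot_merged returns a Plots.jl plot object, the display/readline pair could be replaced by a direct save (a sketch, not code from this PR):

```julia
using Plots  # assumption: plot_merged yields a Plots.jl plot

if rank == 0 # Master
    p = MPITape.plot_merged(tape_merged)
    savefig(p, "tape_merged.png")  # or "tape_merged.pdf"; no window, no readline()
end
```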


function srcdest_to_rankarray(srcdest, rank)
    if srcdest in ["all", "each", "some"]
        return vcat(collect(0:getcommsize()-1))
@carstenbauer (Collaborator) commented:

The vcat here is redundant, isn't it?

It's probably also not entirely correct, since e.g. with "some" not all ranks are involved (see e.g. MPI_Scan). But it also depends on how exactly we want to display this.
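
A quick REPL check backs up the first point: collect already returns a Vector, and vcat of a single vector is just a copy of it:

```julia
julia> collect(0:3)
4-element Vector{Int64}:
 0
 1
 2
 3

julia> vcat(collect(0:3)) == collect(0:3)
true
```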

@Mellich (Collaborator, Author) replied:

Yes, we will certainly have to rework this at some point, especially if we want to support communicators. For most collectives it should be sufficient for now, I think.

    if typeof(srcdest) <: Integer
        return [srcdest]
    end
    if srcdest == nothing
@carstenbauer (Collaborator) commented:

isnothing(srcdest) is more elegant / better.
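
For reference, isnothing(x) is defined as x === nothing, so it always returns a Bool, whereas == becomes three-valued as soon as missing is involved:

```julia
julia> isnothing(nothing), isnothing(1)
(true, false)

julia> missing == nothing   # == propagates missing
missing

julia> isnothing(missing)   # isnothing is always a Bool
false
```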


function MPIEventNeighbors(ev::MPIEvent)
    srcdest = getsrcdest(ev)
    if srcdest == nothing
@carstenbauer (Collaborator) commented:

isnothing(srcdest) is more elegant / better.

    opendsts = srcdest_to_rankarray(srcdest[:dest], ev.rank)
    # Delete rank from recvs or sends if it is the root!
    if length(opensrcs) == 1 && (opensrcs[1] in opendsts)
        deleteat!(opendsts, findfirst(x->x==opensrcs[1], opendsts))
@carstenbauer (Collaborator) commented:

Tip: you can also write findfirst(x->x==opensrcs[1], opendsts) as findfirst(isequal(opensrcs[1]), opendsts).
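
The two forms are interchangeable; isequal(v) creates a callable that compares against v:

```julia
julia> opendsts = [2, 5, 7];

julia> findfirst(x -> x == 5, opendsts)
2

julia> findfirst(isequal(5), opendsts)
2
```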

        srcdest = (src = nothing, dest = nothing)
    end
    opensrcs = srcdest_to_rankarray(srcdest[:src], ev.rank)
    opendsts = srcdest_to_rankarray(srcdest[:dest], ev.rank)
@carstenbauer (Collaborator) commented:

I would maybe rename opendsts to opendests so that we consistently abbreviate source as src and destination as dest.

# Construct data structure for linked list of MPI calls
for e in tape
    push!(open_links, MPIEventNeighbors(e))
end
@carstenbauer (Collaborator) commented:

Directly open_links = MPIEventNeighbors[MPIEventNeighbors(e) for e in tape]?

    if isempty(ev.args_subset)
        return nothing
    end
    if haskey(ev.args_subset, :tag)
@carstenbauer (Collaborator) commented:

I don't quite understand the tag business. Isn't haskey(ev.args_subset, :tag) currently just always false?

@Mellich (Collaborator, Author) replied:

I use the function when matching the MPI events. At the moment it has no effect, since it always returns nothing, but as soon as we add tags to our NamedTuple, they should be taken into account directly.
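
Putting the pieces together, the excerpt above presumably completes to something like the following sketch of the described behavior (the name gettag is assumed, not taken from the PR):

```julia
# Return the tag recorded for an event, or nothing if none was captured.
# Currently this is always nothing, but it picks up :tag automatically
# once tags are added to the args_subset NamedTuple.
function gettag(ev)
    isempty(ev.args_subset) && return nothing
    haskey(ev.args_subset, :tag) && return ev.args_subset[:tag]
    return nothing
end
```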

@Mellich (Collaborator, Author) commented on Feb 21, 2023:

I also added support for #1 with the function plot_sequence_merged. It is possible to output the graph directly in the terminal.

@carstenbauer (Collaborator) commented on Feb 21, 2023:

Looks nice! At first I was slightly confused that there are no MPI_Recvs. :)

BTW, there is a dropmpiprefix function (or similar) that drops the "MPI_" part from the function names. The output should look a lot cleaner with it.
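
The stripping itself could be as simple as this sketch (the exact helper name in MPITape is uncertain, per the comment above):

```julia
# Strip a leading "MPI_" from a function name, e.g. "MPI_Send" -> "Send".
# (On Julia >= 1.8, chopprefix(name, "MPI_") does the same.)
dropmpiprefix(name::AbstractString) =
    startswith(name, "MPI_") ? name[length("MPI_")+1:end] : name

dropmpiprefix("MPI_Send")  # -> "Send"
```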

     ┌──────┐          ┌──────┐          ┌──────┐          ┌──────┐          ┌──────┐
     │Rank_0│          │Rank_1│          │Rank_2│          │Rank_4│          │Rank_3│
     └──┬───┘          └──┬───┘          └──┬───┘          └──┬───┘          └──┬───┘
        │     MPI_Send    │                 │                 │                 │
        │ ────────────────>                 │                 │                 │
        │                 │                 │                 │                 │
        │              MPI_Send             │                 │                 │
        │ ──────────────────────────────────>                 │                 │
        │                 │                 │                 │                 │
        │     MPI_Send    │                 │                 │                 │
        │ ────────────────>                 │                 │                 │
        │                 │                 │                 │                 │
        │                 │     MPI_Send    │                 │                 │
        │ ────────────────────────────────────────────────────>                 │
        │                 │                 │                 │                 │
        │              MPI_Send             │                 │                 │
        │ ──────────────────────────────────>                 │                 │
        │                 │                 │                 │                 │
        │                 │     MPI_Send    │                 │                 │
        │ ────────────────────────────────────────────────────>                 │
        │                 │                 │                 │                 │
        │                 │              MPI_Send             │                 │
        │ ──────────────────────────────────────────────────────────────────────>
        │                 │                 │                 │                 │
        │                 │              MPI_Send             │                 │
        │ ──────────────────────────────────────────────────────────────────────>
        │                 │                 │                 │                 │
        │                 │     MPI_Send    │                 │                 │
        │ ────────────────────────────────────────────────────>                 │
        │                 │                 │                 │                 │
        │     MPI_Send    │                 │                 │                 │
        │ ────────────────>                 │                 │                 │
        │                 │                 │                 │                 │
        │                 │              MPI_Send             │                 │
        │ ──────────────────────────────────────────────────────────────────────>
        │                 │                 │                 │                 │
        │              MPI_Send             │                 │                 │
        │ ──────────────────────────────────>                 │                 │
        │                 │                 │                 │                 │
        │     MPI_Send    │                 │                 │                 │
        │ <────────────────                 │                 │                 │
        │                 │                 │                 │                 │
        │              MPI_Send             │                 │                 │
        │ <──────────────────────────────────                 │                 │
        │                 │                 │                 │                 │
        │                 │              MPI_Send             │                 │
        │ <──────────────────────────────────────────────────────────────────────
        │                 │                 │                 │                 │
        │                 │     MPI_Send    │                 │                 │
        │ <────────────────────────────────────────────────────                 │
     ┌──┴───┐          ┌──┴───┐          ┌──┴───┐          ┌──┴───┐          ┌──┴───┐
     │Rank_0│          │Rank_1│          │Rank_2│          │Rank_4│          │Rank_3│
     └──────┘          └──────┘          └──────┘          └──────┘          └──────┘

@Mellich merged commit 82b9a27 into pc2:main on Feb 28, 2023.