
mpi: Add basic2 mode #2307

Open · wants to merge 11 commits into master from basic2mpi

Conversation

georgebisbas (Contributor)

Preallocated buffers using MPIMsg.
An MPIMsgEnriched version for send/recv will follow.
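The core idea of the mode, preallocating the halo-exchange buffers once rather than allocating them on every exchange, can be sketched as follows. This is a minimal Python stand-in, not Devito's actual MPIMsg: the `HaloMsg` class, its methods, and the use of numpy arrays in place of MPI-visible buffers are all illustrative assumptions.

```python
import numpy as np

class HaloMsg:
    """Hypothetical stand-in for an MPIMsg-style message: one
    preallocated send/recv buffer pair per halo exchange."""

    def __init__(self, shape, dtype=np.float64):
        # Buffers are allocated once, at construction time, and are
        # reused on every halo exchange instead of per-call allocation
        self.bufs = np.empty(shape, dtype=dtype)  # send buffer
        self.bufg = np.empty(shape, dtype=dtype)  # recv (gather) buffer

    def pack(self, data, region):
        # Copy the owned region a neighbour needs into the send buffer
        self.bufs[...] = data[region]

    def unpack(self, data, region):
        # Copy received halo values from the recv buffer into the field
        data[region] = self.bufg[...]

# One preallocated message slot per halo side of a toy 1D field
field = np.arange(10.0)
msgs = {side: HaloMsg((1,)) for side in ('left', 'right')}
```

The point of the design is that `bufs`/`bufg` keep their identity across time steps, so repeated exchanges incur no allocation cost.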


codecov bot commented Feb 10, 2024

Codecov Report

Attention: Patch coverage is 97.53086%, with 4 lines in your changes missing coverage. Please review.

Project coverage is 86.77%. Comparing base (19d4e33) to head (55b11d5).

Files                     Patch %   Lines
devito/mpi/routines.py    97.38%    1 Missing and 3 partials ⚠️
Additional details and impacted files
@@           Coverage Diff            @@
##           master    #2307    +/-   ##
========================================
  Coverage   86.76%   86.77%            
========================================
  Files         233      233            
  Lines       43740    43872   +132     
  Branches     8075     8100    +25     
========================================
+ Hits        37952    38068   +116     
- Misses       5079     5092    +13     
- Partials      709      712     +3     

☔ View full report in Codecov by Sentry.


return SendRecv('sendrecv%s' % key, iet, parameters, bufg, bufs)

def _call_sendrecv(self, name, *args, msg=None, haloid=None):
Contributor:

Why do you need a haloid here? We never had to use it. What makes this mode require it?

georgebisbas (Author):

Similarly to the same method in OverlapHalo: each call to sendrecv has its own message IndexedPointer.
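The role of the `haloid` can be sketched as indexing into an array of preallocated message slots, so that each generated sendrecv call operates on its own buffers. The `MsgSlot` class and `call_sendrecv` helper below are hypothetical illustrations, not Devito's generated code.

```python
class MsgSlot:
    """Hypothetical per-halo message slot with its own buffers."""
    def __init__(self, size):
        self.bufs = [0.0] * size   # send buffer
        self.bufg = [0.0] * size   # recv buffer

def call_sendrecv(msgs, haloid, payload):
    # Each call selects slot `haloid`; exchanges on other halos
    # therefore never clobber this call's buffers
    slot = msgs[haloid]
    slot.bufs[:len(payload)] = payload
    return slot

# One slot per halo exchange of the function
msgs = [MsgSlot(4) for _ in range(4)]
```

With a single shared buffer instead, two in-flight exchanges could overwrite each other's data, which is why the index must be threaded through the call.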

mapper[(d0, side, region)] = sizes

i = 0
for d in f.dimensions:
Contributor:

Why not:

    for i, halo in enumerate(self.halos):

like we have in the other message types?

georgebisbas (Author):

I followed the basic style, which does not yet have a method to clean up the redundant halos, as diag2 does, e.g.:

        # Only retain the halos required by the Diag scheme
        halos = sorted(i for i in hse.halos if isinstance(i.dim, tuple))

I tried this but did not manage to get it working nicely.
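The cleanup quoted above keeps only the halos whose dimension is a tuple, i.e. the genuinely diagonal exchanges. A self-contained sketch of that filter, with a hypothetical `Halo` namedtuple standing in for Devito's halo descriptors:

```python
from collections import namedtuple

# Hypothetical halo descriptor; in Devito this comes from hse.halos
Halo = namedtuple('Halo', ['dim', 'side'])

# Mixed list: plain-dimension halos plus diagonal ones whose dim is
# a tuple of dimensions
halos = [Halo('x', 'left'), Halo(('x', 'y'), 'left'),
         Halo('y', 'right'), Halo(('y', 'x'), 'right')]

# Only retain the halos required by the Diag scheme: those whose
# dimension is a tuple (diagonal exchanges)
diag_halos = sorted(h for h in halos if isinstance(h.dim, tuple))
```

Sorting gives the retained halos a deterministic order, which matters when the index of each halo selects a message slot.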

@@ -801,7 +801,7 @@ def test_trivial_eq_2d(self):
         assert np.all(f.data_ro_domain[0, :-1, -1:] == side)
         assert np.all(f.data_ro_domain[0, -1:, :-1] == side)

-    @pytest.mark.parallel(mode=[(8, 'basic'), (8, 'diag'), (8, 'overlap'),
+    @pytest.mark.parallel(mode=[(8, 'basic'), (8, 'basic2'), (8, 'diag'), (8, 'overlap'),
Contributor:

I would drop basic here and below, or perhaps overlap. These tests can be quite expensive.
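`pytest.mark.parallel` is Devito's own marker; with stock pytest, the trimmed sweep the reviewer suggests could be sketched with `parametrize`. The mode names come from the diff above; the trimmed selection and the placeholder test body are assumptions.

```python
import pytest

# Hypothetical trimmed mode list: `basic` dropped in favour of
# `basic2`, as the review suggests, to keep the parallel sweep cheap
MODES = [(8, 'basic2'), (8, 'diag'), (8, 'overlap')]

@pytest.mark.parametrize("nprocs,mode", MODES)
def test_trivial_eq_modes(nprocs, mode):
    # Placeholder body; the real test runs a trivial equation on
    # `nprocs` MPI ranks under the given halo-exchange mode
    assert nprocs == 8
    assert mode in ('basic2', 'diag', 'overlap')
```

Since `basic2` shares its buffer layout with `basic`, covering only one of the two keeps the per-mode code paths exercised while halving the runs for that pair.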

@georgebisbas georgebisbas force-pushed the basic2mpi branch 4 times, most recently from 5a5e826 to c266e14 Compare April 9, 2024 17:35
Labels: MPI (mpi-related)
Projects: None yet
Development: Successfully merging this pull request may close these issues: None yet
3 participants