
Conversation

@alanwaketan
Collaborator

Summary:
SPMD expects the mesh to be the same across the board. Therefore, introduce the concept of a global mesh so that the same mesh no longer needs to be passed around throughout the code.

Test Plan:
python test/spmd/test_xla_sharding.py -v -k test_global_mesh
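For reference, a minimal sketch of how a program-wide global mesh could be used. The `set_global_mesh`/`get_global_mesh` helper names below are assumptions for illustration (the PR text does not spell out the API), so treat this as a usage sketch rather than the exact interface:

```python
import numpy as np
import torch
import torch_xla.core.xla_model as xm
import torch_xla.runtime as xr
import torch_xla.distributed.spmd as xs

xr.use_spmd()

# Build one mesh over all devices and register it once for the whole program.
num_devices = xr.global_runtime_device_count()
mesh = xs.Mesh(np.arange(num_devices), (num_devices, 1), ('data', 'model'))
xs.set_global_mesh(mesh)  # assumed helper: store the mesh process-wide

# Later code fetches the same mesh instead of threading it through
# every function signature.
t = torch.randn(16, 128).to(xm.xla_device())
xs.mark_sharding(t, xs.get_global_mesh(), (0, None))  # assumed helper
```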

Collaborator

@jonb377 left a comment


Looks great, thanks for adding this, Jiewen!

Do you want to add a section to the spmd.md doc about the usage? It could be useful to explain why a global mesh is needed. (I'll also take an action item to update our resnet example to use a single mesh across the program.)

@alanwaketan
Collaborator Author

Thanks for the quick review, Jon. Let me do the doc update later. I'm in a rush...

@alanwaketan
Collaborator Author

Let's ignore the GPU test given it takes forever. I'm on a very tight deadline.

@alanwaketan alanwaketan merged commit 9b12334 into master Feb 8, 2024
amithrm pushed a commit to amithrm/xla that referenced this pull request Mar 1, 2024
bhavya01 pushed a commit that referenced this pull request Apr 22, 2024