tensor must have the same number of elements in all processes participating in the collective. See torch.distributed.broadcast().
• tensor (Tensor) – Data to be sent if src is the rank of the current process, and tensor to be used to save received data otherwise.
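The two roles of the tensor argument can be sketched with a minimal runnable example. This uses a single-process "gloo" group (world_size=1) so it runs standalone; a real job would launch multiple processes, and non-src ranks would allocate an empty buffer with the same number of elements to receive into.

```python
import os
import torch
import torch.distributed as dist

# Single-process group purely for illustration; the MASTER_ADDR/MASTER_PORT
# values here are arbitrary local settings, not required by the API.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group("gloo", rank=0, world_size=1)

src = 0
if dist.get_rank() == src:
    # On the src rank, tensor holds the data to be sent.
    tensor = torch.arange(4, dtype=torch.float32)
else:
    # On every other rank, tensor is a buffer with the same
    # number of elements, used to save the received data.
    tensor = torch.empty(4)

dist.broadcast(tensor, src=src)  # operates in place on every rank
dist.destroy_process_group()
```

After the call, tensor holds the src rank's data on every participating process.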