
Parallel and Distributed Machine Learning

PUBLISHED May 15, 2024 (DOI: https://doi.org/10.54985/peeref.2405p1620685)

NOT PEER REVIEWED

Authors

Kashish Agarwal¹, Aditya Bhat¹, Atharva Deshpande¹, Jayesh Bhave¹, Fatima Inamdar¹
  1. Vishwakarma Institute of Information Technology

Conference / event

IJSAE, December 2023 (Virtual)

Poster summary

This poster surveys the challenges, advantages, and distinctions between parallel and distributed machine learning (ML). It breaks down the difficulties that arise when ML workloads are parallelized or distributed, namely synchronization, fault tolerance, and communication overhead. It also highlights the benefits these techniques offer, such as improved scalability, faster training, and the capacity to handle large datasets. In addition, the poster clarifies the difference between the two approaches: parallelism exploits multiple resources within a single machine, whereas distribution spreads the work across several machines in a network. Through clear illustrations and succinct explanations, it aims to give researchers and practitioners a solid grasp of the trade-offs involved in parallel and distributed ML, enabling well-informed decision-making.
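
To make the parallel (single-machine) case concrete, the sketch below implements synchronous data-parallel gradient descent for a toy linear-regression problem: each worker process computes the gradient on its own data shard, and the per-shard gradients are averaged before every update. This example is added here for illustration only and is not part of the poster; the worker count, learning rate, and helper names such as shard_gradient are arbitrary assumptions.

```python
# Minimal, self-contained sketch (not from the poster): synchronous
# data-parallel gradient descent on one machine using worker processes.
import numpy as np
from multiprocessing import Pool

NUM_WORKERS = 4      # assumed core count; purely illustrative
LEARNING_RATE = 0.1
STEPS = 50

def shard_gradient(args):
    """Mean-squared-error gradient computed on one data shard."""
    w, X_shard, y_shard = args
    residual = X_shard @ w - y_shard
    return X_shard.T @ residual / len(y_shard)

def main():
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    true_w = np.arange(1.0, 6.0)
    y = X @ true_w + 0.01 * rng.normal(size=1000)

    # Data parallelism: one shard of the dataset per worker.
    X_shards = np.array_split(X, NUM_WORKERS)
    y_shards = np.array_split(y, NUM_WORKERS)

    w = np.zeros(5)
    with Pool(NUM_WORKERS) as pool:
        for _ in range(STEPS):
            # Synchronous step: every worker must finish before the update,
            # which is the synchronization cost highlighted in the poster.
            grads = pool.map(
                shard_gradient,
                [(w, Xs, ys) for Xs, ys in zip(X_shards, y_shards)],
            )
            w -= LEARNING_RATE * np.mean(grads, axis=0)

    print("estimated weights:", np.round(w, 2))

if __name__ == "__main__":
    main()
```

Moving from this parallel version to a distributed one mainly changes where the gradients come from: each shard would live on a separate machine and the averaging would become a networked all-reduce, which is where the communication overhead, synchronization, and fault-tolerance concerns discussed in the poster arise.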

Keywords

Parallel, Distributed, Data parallelism, Model parallelism, High Performance Computing

Research areas

Computer and Information Science, Systems Science

Funding

No data provided

Supplemental files

No data provided

Additional information

Competing interests
No competing interests were disclosed.
Data availability statement
Data sharing not applicable to this poster as no datasets were generated or analyzed during the current study.
Creative Commons license
Copyright © 2024 Agarwal et al. This is an open access work distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
