
Apache Spark - How to determine executors for a given Spark Job?

Following is a question from one of my Self-Paced Data Engineering Bootcamp 6 students.

https://kaizen.itversity.com/shop/all-courses/data-engineering-bootcamp/

Topic Link: http://discuss.itversity.com/t/apache-spark-how-to-determine-executors-for-a-given-spark-job/17742

How does a developer arrive at a decision to pass control arguments to override the executor memory and cores in a Spark job? Is there a decision-making hierarchy in engineering teams that the developer would have to go through?
As part of this live session/pre-recorded video, I will answer the above question. Here are the details that need to be understood.

* Cluster Capacity - YARN (or Mesos)
* Static Allocation vs. Dynamic Allocation
* Determining and Using Capacity Based on the Requirement (see the worked sizing sketch after this list)
* Setting Properties at Run Time
* Setting Properties Programmatically (see the PySpark sketch after this list)
* Overview of --num-executors, --executor-cores, --executor-memory
* Decision Making Hierarchy
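
To make the capacity items above concrete, here is a worked sizing sketch. The numbers are assumptions for illustration, not from the video: a hypothetical 10-node YARN cluster where each node gives YARN 16 cores and 64 GB of memory. Leaving 1 core and 1 GB per node for the OS and Hadoop daemons, and applying the common rule of thumb of 5 cores per executor, each node fits 3 executors of 5 cores and about 21 GB each (63 GB / 3); budgeting roughly 7-10% of that for off-heap overhead leaves around 19 GB of heap. Across 10 nodes that is 30 executors, minus 1 for the YARN ApplicationMaster, so 29. Passed as control arguments at run time via spark-submit (my_job.py is a placeholder), it looks like this:

spark-submit \
  --master yarn \
  --num-executors 29 \
  --executor-cores 5 \
  --executor-memory 19G \
  my_job.py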

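The same properties can also be set programmatically. A minimal PySpark sketch, again with assumed numbers; note that executor sizing only takes effect if it is set before the SparkSession (and its underlying SparkContext) is created:

from pyspark.sql import SparkSession

# Executor sizing must be configured before the session starts;
# changing these values on a running application has no effect.
spark = SparkSession.builder \
    .appName("ExecutorSizingDemo") \
    .master("yarn") \
    .config("spark.executor.instances", "29") \
    .config("spark.executor.cores", "5") \
    .config("spark.executor.memory", "19g") \
    .getOrCreate()

With dynamic allocation, YARN grows and shrinks the executor count between configured bounds instead of holding a static 29, for example:

spark = SparkSession.builder \
    .appName("DynamicAllocationDemo") \
    .master("yarn") \
    .config("spark.dynamicAllocation.enabled", "true") \
    .config("spark.shuffle.service.enabled", "true") \
    .config("spark.dynamicAllocation.minExecutors", "2") \
    .config("spark.dynamicAllocation.maxExecutors", "29") \
    .getOrCreate()
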
Demos are given using our state-of-the-art labs. If you are interested, you can sign up at https://labs.itversity.com

Connect with me or follow me at
https://www.linkedin.com/in/durga0gadiraju
https://www.facebook.com/itversity
https://github.com/dgadiraju
https://www.youtube.com/itversityin
https://twitter.com/itversity
#sparkJobs #ApacheSpark #Execution

Video information
Published: April 14, 2019, 18:56:16
Duration: 01:42:12