To address errors like the one below:
WARN scheduler.TaskSetManager: Lost task 0.3 in stage 2.0 (TID 16, n06.domain.com): org.apache.spark.SparkException: Kryo serialization failed: Buffer overflow. Available: 0, required: 3. To avoid this, increase spark.kryoserializer.buffer.max value.
In CDH, under the Spark service configuration, locate spark-defaults.conf and add the following.
spark.kryoserializer.buffer.max=64m is the property named in the error message and should be the one that takes effect; spark.kryoserializer.buffer.mb=64m is an older, deprecated form of the setting that some Spark versions still honor.
Deploy the client configuration once the modification is done.
Note: the file below is the one that gets modified:
/etc/spark/conf/spark-defaults.conf
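As a minimal sketch of what the edit amounts to, the helper below (a hypothetical `set_spark_property`, not part of Spark or CDH) idempotently sets a property in a spark-defaults.conf-style file. In a real CDH cluster you would normally make this change through Cloudera Manager rather than editing the file by hand, since deploying the client configuration can overwrite manual edits.

```python
import os
import tempfile

def set_spark_property(conf_path, key, value):
    """Append or update a key in a spark-defaults.conf-style file.

    Entries are "key value" or "key=value" lines; this sketch writes
    the "key=value" form shown in the post.
    """
    lines = []
    if os.path.exists(conf_path):
        with open(conf_path) as f:
            lines = f.read().splitlines()
    out = []
    updated = False
    for line in lines:
        stripped = line.strip()
        # Match this exact key, followed by end-of-line, whitespace, or '='
        # (so spark.kryoserializer.buffer.max does not match ...buffer.max.mb).
        if stripped.startswith(key) and stripped[len(key):len(key) + 1] in ("", " ", "\t", "="):
            out.append(f"{key}={value}")
            updated = True
        else:
            out.append(line)
    if not updated:
        out.append(f"{key}={value}")
    with open(conf_path, "w") as f:
        f.write("\n".join(out) + "\n")

# Example: raise the Kryo buffer ceiling to 64 MiB in a scratch copy
conf = os.path.join(tempfile.mkdtemp(), "spark-defaults.conf")
set_spark_property(conf, "spark.kryoserializer.buffer.max", "64m")
```

Running it twice with different values updates the existing line instead of appending a duplicate, which matters because Spark reads the last value it parses.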