Debugging “Managed memory leak detected” in Spark 1.6.0


Asked on January 2, 2019 in Apache-spark.


  • 1 Answer(s)

    You cannot create these leaks from your own code: execution memory under the unified memory manager is handled entirely by Spark, so such leaks are caused by bugs in Spark itself, for example SPARK-11293.

    When you need to track down the cause of a leak, you can proceed as follows:

    • First, download the Spark source code and make sure you can build it and that the build works.
    • In TaskMemoryManager.java, add extra logging in acquireExecutionMemory and releaseExecutionMemory: logger.error("stack trace:", new Exception()); (the sketch after this list shows what this can look like).
    • Change all the other debug-level log calls in TaskMemoryManager.java to error as well. (Easier than figuring out the logging configuration…)

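    The sketch below shows the kind of logging this means, in a form whose output can be matched up later. The method names are from TaskMemoryManager.java in Spark 1.6.0, but the surrounding signatures are omitted and the parameter names (required, size, consumer) are assumptions that may differ slightly in your checkout; use whatever variables are in scope.

        // Inside core/src/main/java/org/apache/spark/memory/TaskMemoryManager.java
        // (Spark 1.6.0). Only the added lines are shown; the original method bodies
        // stay unchanged. Parameter names are assumed and may differ in your checkout.

        // In acquireExecutionMemory(...), once the requested amount is known:
        logger.error("acquired " + required + " bytes for " + consumer
            + ", stack trace:", new Exception());

        // In releaseExecutionMemory(...), at the top of the method:
        logger.error("released " + size + " bytes for " + consumer
            + ", stack trace:", new Exception());

    Including the size and the consumer in the message is optional, but it makes the acquire and release lines much easier to pair up afterwards.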
    With this in place you can see a full stack trace for every allocation and deallocation. Match them up and look for allocations that have no matching deallocation; the stack trace of such an allocation points to the source of the leak.
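    As a rough illustration of that matching step, here is a small standalone Java sketch that reads an executor log, picks out the hypothetical "acquired"/"released" lines produced by the logging above, and reports consumers whose acquires were never fully released. The log format it parses is the one assumed in the previous sketch, not anything Spark emits by default, so adjust the pattern to whatever messages you actually log.

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.HashMap;
        import java.util.Map;
        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        // Pairs up "acquired"/"released" log lines and prints consumers whose
        // acquired bytes were never released. Assumes the hypothetical log format
        // from the logging sketch above.
        public class LeakBalance {
          private static final Pattern EVENT =
              Pattern.compile("(acquired|released) (\\d+) bytes for (.+?), stack trace:");

          public static void main(String[] args) throws IOException {
            Map<String, Long> balance = new HashMap<>();
            for (String line : Files.readAllLines(Paths.get(args[0]))) {
              Matcher m = EVENT.matcher(line);
              if (!m.find()) {
                continue;
              }
              long bytes = Long.parseLong(m.group(2));
              long delta = m.group(1).equals("acquired") ? bytes : -bytes;
              // A positive running balance means memory was acquired but not released.
              balance.merge(m.group(3), delta, Long::sum);
            }
            balance.forEach((consumer, bytes) -> {
              if (bytes > 0) {
                System.out.println(consumer + " has " + bytes + " bytes acquired but not released");
              }
            });
          }
        }

    Run it as java LeakBalance executor.log; once a leaking consumer shows up, go back to the full stack traces of its unmatched acquisitions in the log.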

    Answered on January 2, 2019.

