
    How to Kill Spark Jobs in a Hadoop Cluster?


    There are two ways to kill a Spark job that you have submitted to the Hadoop cluster:

    1. From the Resource Manager UI.
    2. From the CLI.


    1. From the Resource Manager UI.


    1. Go to the Resource Manager UI.
    2. Click on your job, or search for it in the left-hand search box.
    3. Click on the job once it is found.
    4. Click on the "Settings" button on the right-hand side.
    5. Then click on the "Kill Application" option to kill the job.
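
    The "Kill Application" button ultimately moves the application to the KILLED state in YARN. If you prefer to script this instead of clicking through the UI, the ResourceManager also exposes a REST endpoint for it. A minimal curl sketch, assuming your ResourceManager web address is rm-host:8088 (a placeholder) and the cluster is not Kerberized:

    # Ask the ResourceManager to move the application to the KILLED state
    curl -X PUT -H "Content-Type: application/json" \
         -d '{"state": "KILLED"}' \
         "http://rm-host:8088/ws/v1/cluster/apps/<Application id>/state"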


    2. From the CLI.


    First, find the application ID of your job if you do not have it already:

    yarn application -list 
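
    If the list is long, you can ask YARN to filter it for you. For example, assuming a Hadoop version that supports the -appTypes and -appStates flags, the following restricts the output to running Spark applications only:

    # List only Spark applications that are currently running
    yarn application -list -appTypes SPARK -appStates RUNNING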

    Kill the job with the following command, passing the application ID:

    yarn application -kill <Application id>

    If you want to kill all the jobs at once, follow the procedure below.

    First, collect the application IDs in a file:

    yarn application -list | awk '{print $1}' | grep '^application_' > /tmp/appid

    Then run a small loop to kill each of them:

    for i in $(cat /tmp/appid); do yarn application -kill "$i"; done
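
    If you would rather skip the temporary file, the same thing can be done in one pipeline. A sketch, assuming the -appStates flag is available and that application IDs in the listing start with "application_":

    # Kill every application that is currently running, one at a time
    yarn application -list -appStates RUNNING | awk '$1 ~ /^application_/ {print $1}' | xargs -n 1 yarn application -kill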
