Knowledge Base for Databricks on Google Cloud

Updated Apr 14, 2022



Jobs

These articles can help you troubleshoot and manage your Databricks jobs.

  • Distinguish active and dead jobs
  • How to delete all jobs using the REST API
  • Job cluster limits on notebook output
  • Job fails, but Apache Spark tasks finish
  • Job fails due to job rate limit
  • Job fails with invalid access token
  • Task deserialization time is high
  • Spark job fails with Driver is temporarily unavailable


© Databricks 2022. All rights reserved. Apache, Apache Spark, Spark, and the Spark logo are trademarks of the Apache Software Foundation.
