Notebooks are a popular way to start working with data quickly without configuring a complicated environment. Notebook authors can move from interactive analysis to sharing a collaborative workflow, mixing explanatory text with code. Often, notebooks that begin as exploration evolve into production artifacts. For example:
1. A report that runs regularly based on newer data and evolving business logic.
2. An ETL pipeline that needs to run on a regular schedule, or continuously.
3. A machine learning model that must be retrained when new data arrives.
Perhaps surprisingly, many Databricks customers find that, with small adjustments, notebooks can be packaged into production assets and integrated with best practices such as code review, testing, modularity, continuous integration, and versioned deployment.
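For instance, one such small adjustment is to parameterize a notebook and factor its transformation logic into a plain function that can be unit-tested outside the notebook. The sketch below assumes a Databricks Python notebook, where `spark` and `dbutils` are provided by the runtime; the table, column, and function names are hypothetical.

```python
# Hypothetical sketch: parameterize the notebook so a scheduled job can
# pass in a run date, and keep the transformation logic in a testable
# function. `spark` and `dbutils` are provided by the Databricks runtime.

# Declare a text widget with a default for interactive runs; a job can
# override it via notebook parameters.
dbutils.widgets.text("run_date", "2023-01-01")
run_date = dbutils.widgets.get("run_date")

def daily_revenue(orders_df, run_date):
    """Pure transformation logic, easy to unit-test with a small DataFrame."""
    return (
        orders_df
        .filter(orders_df.order_date == run_date)
        .groupBy("region")
        .sum("revenue")
    )

# Notebook "main": read, transform, write (hypothetical table names).
orders = spark.read.table("sales.orders")
daily_revenue(orders, run_date).write.mode("overwrite").saveAsTable(
    "sales.daily_revenue"
)
```

Scheduling this as a Databricks job that passes `run_date` as a notebook parameter turns the same notebook into the regularly running report or pipeline described above, while `daily_revenue` can still be exercised in a unit test against a small, hand-built DataFrame.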
Read on for several tips and recommendations.