Databricks Workflows Through Terraform
Part 1 of the blog series on deploying Workflows through Terraform. How to create complex jobs / workflows from scratch in Databricks using...

Intermittent Demand Forecasting With Nixtla on Databricks
This is a collaborative post from Nixtla and Databricks. We thank Max Mergenthaler Canseco, CEO and Co-founder, and Federico Garza Ramírez, CTO and...

How Databricks IT Transforms Business with Product-Driven Approach
The Databricks IT organization has grown rapidly over the last year, partnering closely with stakeholders across all functions to transform the business. Some...

ArcGIS GeoAnalytics Engine in Databricks
This is a collaborative post from Esri and Databricks. We thank Senior Solution Engineer Arif Masrur, Ph.D. at Esri for his contributions. Advances...

Why We Migrated From Apache Airflow to Databricks Workflows at YipitData
This is a collaborative post from Databricks and YipitData. We thank Engineering Manager Hillevi Crognale at YipitData for her contributions. YipitData is the...

An Automated Guide to Distributed and Decentralized Management of Unity Catalog
Unity Catalog provides a unified governance solution for all data and AI assets in your lakehouse on any cloud. As customers adopt Unity...

Build Reliable and Cost Effective Streaming Data Pipelines With Delta Live...
This year we announced the general availability of Delta Live Tables (DLT), the first ETL framework to use a simple, declarative approach to...

How Databricks Powers Stantec's Flood Predictor Engine
This is a collaborative post between Stantec and Databricks. We thank ML Operations Lead Assaad Mrad, Ph.D. and Data Scientist Jared Van Blitterswyk...

Announcing General Availability of Data Lineage in Unity Catalog
Today, we are excited to announce the general availability of data lineage in Unity Catalog, available on AWS and Azure. With data lineage...

Best Practices for Super Powering Your dbt Project on Databricks
dbt is a data transformation framework that enables data teams to collaboratively model, test and document data in data warehouses. Getting started with...

Streaming in Production: Collected Best Practices
Releasing any data pipeline or application into a production state requires planning, testing, monitoring, and maintenance. Streaming pipelines are no different in this...

Databricks at National Retail Federation (NRF) Retail's Big Show 2023
Request a meeting with Databricks executives/thought leaders at NRF! Retail, at its core, is about the relationship between an organization’s brand and customers -...

Scalable Kubernetes Upgrade Using Operators
At Databricks, we run our compute infrastructure on AWS, Azure, and GCP. We orchestrate containerized services using Kubernetes clusters. We develop and manage...

Building a Cybersecurity Lakehouse for CrowdStrike Falcon Events Part III
In Part I of this series, we walked through the process of setting up a Cybersecurity Lakehouse that allowed us to collect and...

Accelerating SIEM Migrations With the SPL to PySpark Transpiler
In this blog post, we introduce transpiler, a Databricks Labs open-source project that automates the translation of Splunk Search Processing Language (SPL) queries...

Databricks Named a Leader in 2022 Gartner® Magic Quadrant™ for Cloud Database...
We are excited to announce that Gartner has recognized Databricks as a Leader for a second consecutive year in the 2022 Gartner Magic...

Why Startups Build on Databricks
In the 2010s, cloud infrastructures enabled a generation of startups to build and scale their businesses. In this decade, cloud infrastructure is table...

Bending Retail's Curve – Moving Beyond Possible With Tredence
This is a collaborative post from Tredence and Databricks. We thank Morgan Seybert, Executive VP, Chief Business Officer-Retail at Tredence, for their contributions...

Introducing Upgrades to Databricks Notebooks - New Editor, Python Formatting,...
Databricks Notebooks offers a simple, unified environment for anyone building Data and AI products. Today we are excited to introduce updates to the...

Reuse Existing Workflows Through Terraform
Part 2 of the blog series on deploying Workflows through Terraform. Here we focus on how to convert inherited Workflows into Terraform IaC...