Benefits and options of becoming a Databricks partner

Becoming a Databricks partner has opened a world of benefits and options for our company. In this blog post, we will discuss the different opportunities that Databricks offers its partners and how our employees have profited from the partnership. 

1. Benefits for our company

Let’s start with company benefits that Databricks has divided into three groups: 

  • Engagement 
  • Marketing 
  • Sales & technical enablement  

Engagement 

As a Registered partner*, one of the options we have is being listed on the Databricks Directory, where potential clients can find us among more than 700 other companies that partner with them. 

Next, we receive exclusive partner insights (e.g., newsletters) and can register an opportunity to become eligible for DCIF, the Databricks Customer Investment Funds (we still haven't tried that out, but fingers crossed). 

Another feature is access to the Partner Portal, which allows us to apply the Databricks Lakehouse Platform to new use cases and take the lead on strategic projects. 

* You can also become a Select or Elite partner if you meet the criteria. 

 

Marketing 

This group of benefits gives us marketing support from Databricks and access to marketing materials in the Partner Portal mentioned above. In addition, we can use the digital tier badge to promote our partner program status. 

 

Sales & technical enablement 

Regarding the sales and technical enablement benefits, we are eligible for unlimited self-paced and instructor-led technical training, which is already proving its worth.  

We also have access to the Databricks Platform and to sales training.  

Through the Databricks Platform, we can explore industry use cases relevant to our own projects.

 

2. Benefits for employees 

In addition to the company benefits, various options are available to employees, such as access to the Databricks Community, which provides Data Engineers with all the information they need. 

The partnership has further aided our engineers by providing a sandbox environment where they can experiment without impacting the actual data; being the data enthusiasts we are, this really cheered us up. 

The Databricks Academy offers another significant advantage, including the opportunity to request certification discounts. We have already earned some certifications and plan to explore the Academy even further.  

Additionally, our Data Engineers can listen to the Data Brew and Brick by Brick podcasts while managing their data and pick up new and interesting ideas along the way.  

 

3. How we became a Databricks partner 

Becoming a Databricks partner was a simple process, as we were already a good fit thanks to our current use of Databricks in our projects (case study coming out soon). We applied through their website, received approval via email, and started the partnership program. Now it's your turn.  

 

4. A few words from our Data Engineer 

Ivan Derdić, Data Engineer @ bonsai.tech 

 

“For data processing, I take advantage of Databricks clusters with their Spark integration,” said Ivan. 

As a Data Engineer, Ivan primarily uses Databricks for data ingestion and processing. In this context, ingestion means loading data from the staging area into the Data Lakehouse; Copy Activity in Azure Synapse Analytics is used mainly to fetch the data into the staging area.  

Let’s check his point of view.  

 

4.1. Making work easier with Databricks 

Databricks makes using Spark a breeze since no setup is necessary. In addition, Spark makes data processing easy because it allows SQL-like commands in a programmatic style. For example, renaming all columns to snake case is easy with Spark, since I can iterate over all the columns, unlike plain SQL, where I would need to write a query that renames each column manually. 
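As an illustration of that column-renaming loop, here is a minimal sketch in Python. The `to_snake_case` helper and the sample column names are ours, invented for illustration; the final commented line shows how the same loop would apply to a PySpark `DataFrame` via its `columns` property and `toDF` method:

```python
import re

def to_snake_case(name: str) -> str:
    # "OrderID" -> "order_id": split on word boundaries before capitals, then lowercase.
    s = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", name)
    return re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s).lower()

# Hypothetical column names standing in for df.columns:
columns = ["CustomerName", "OrderID", "totalAmount"]
renamed = [to_snake_case(c) for c in columns]
print(renamed)  # ['customer_name', 'order_id', 'total_amount']

# On an actual PySpark DataFrame, the whole rename is one line:
# df = df.toDF(*[to_snake_case(c) for c in df.columns])
```

The equivalent SQL would require spelling out every `old_name AS new_name` pair by hand, which is exactly the tedium the programmatic loop avoids.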

 

Another strength of Databricks is Unity Catalog. Although it is only available on the pro tier, it makes table management easy. Unity Catalog also handles table storage, so I can focus on writing pipelines.  

“One complaint I have is that all code needs to be written in Jupyter-style notebooks. Notebooks are fine for development, but for production, I would much rather use a regular Python file,” Ivan says. 

4.2. Who should use Databricks 

I would recommend Databricks to any company with both Data Analysis and Data Science requirements in the same project and for any project with a complex data set, such as highly nested JSON data or unstructured CSV data. 

 

Databricks lends itself to people working with data: Data Scientists, Data Engineers, and Data Analysts. I believe Databricks is highly specialized for data work and only sees use within that domain. Although it is technically possible to use it for general computing, it is too expensive for that. 

 

If you want to know more about how we use Databricks, reach out and we will find another Data Engineer to give you her perspective on this pretty cool tool.
