DataForge has launched a data architecture tool that it says can help companies build new data lakehouses, particularly those doing so on the Databricks platform.
The goal is to allow companies to “keep teams focused on extracting value from the data, and to make smarter, more informed decisions,” says Matt Kosovec, co-founder and CEO of DataForge.
The term "data lakehouse" combines data lake and data warehouse, according to Databricks.
DataForge says it is offering an expanded catalog to complement the Databricks Unity Catalog.
Data architecture designs have often required a full rebuild every three to five years.
DataForge uses predefined structures, automatic dependency management, and automated code checking. This reduces errors, enforces best practices, and streamlines data engineering workflows, it claims.
In addition, DataForge says it enables growing enterprise brands to centralize data from hundreds of databases. Its cluster management and automation features can also help firms save more than 40% on cloud spend, it adds.
“Enterprises can hardly afford to have their data engineers focused on boilerplate code, manual management tasks, and constant rebuilds just to stay competitive,” Kosovec says.