![Custom Data Catalog Parquet File using Azure Data Factory | by Balamurugan Balakreshnan | Analytics Vidhya | Medium](https://miro.medium.com/v2/resize:fit:1200/0*YZDNhq38KiOe8qmW.jpg)
Custom Data Catalog Parquet File using Azure Data Factory | by Balamurugan Balakreshnan | Analytics Vidhya | Medium
Convert plain parquet files to Delta Lake format using Apache Spark in Azure Synapse Analytics - Microsoft Community Hub
![Process more files than ever and use Parquet with Azure Data Lake Analytics | Azure Blog | Microsoft Azure](https://azure.microsoft.com/en-us/blog/wp-content/uploads/2018/06/7fc2cd74-63e8-495a-94a0-da6b68aec79c.webp)
Process more files than ever and use Parquet with Azure Data Lake Analytics | Azure Blog | Microsoft Azure
![When we use Azure data lake store as data source for Azure Analysis services, is Parquet file formats are supported? - Stack Overflow](https://i.stack.imgur.com/pldP9.png)
When we use Azure data lake store as data source for Azure Analysis services, is Parquet file formats are supported? - Stack Overflow
![Processing 700 different parquet files to Delta Table in Databricks with load incremental | by Lucas Lira Silva | Medium](https://miro.medium.com/v2/resize:fit:948/1*w8Cl6ZguiWFWq-BQY0IgKw.png)
Processing 700 different parquet files to Delta Table in Databricks with load incremental | by Lucas Lira Silva | Medium