Data Engineering
- Construct and maintain the architecture and infrastructure necessary for the effective acquisition, storage and analysis of large volumes of data
- Set up and manage relational and non-relational databases, data warehouses and big data systems
- Create and manage the flow of data through scripts and data pipelines
- Work closely with data scientists and analysts to ensure that the data is formatted correctly and is readily available for analysis
- Continuously monitor the systems, optimize queries, and make improvements to ensure that data can be processed and retrieved as efficiently as possible
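The pipeline work described above can be sketched as a minimal extract-transform-load script. All names here (`extract`, `transform`, `load`, `run_pipeline`) are illustrative, not tied to any specific project:

```python
# Minimal sketch of a scripted data pipeline: extract, transform, load.
# The target here is a plain list standing in for a database or warehouse table.

def extract(rows):
    """Simulate acquisition from a source system (e.g. an API or flat file)."""
    return list(rows)

def transform(rows):
    """Normalize records so analysts receive consistently formatted data."""
    return [
        {"id": r["id"], "amount": round(float(r["amount"]), 2)}
        for r in rows
        if r.get("amount") is not None  # drop rows that cannot be analyzed
    ]

def load(rows, target):
    """Append transformed rows to the target store and report how many landed."""
    target.extend(rows)
    return len(rows)

def run_pipeline(source, target):
    return load(transform(extract(source)), target)

warehouse = []
loaded = run_pipeline(
    [{"id": 1, "amount": "19.999"}, {"id": 2, "amount": None}],
    warehouse,
)
```

The monitoring and optimization bullet above would wrap `run_pipeline` with timing and alerting in a real deployment.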
Data Architecture
- Data modeling (Data Vault, Star Schema, Bridge and Junk Dimensions, Data Snapshots, Entity-Attribute-Value) using modern BI tools on Azure (ADF, Synapse), Databricks, Fabric, Power BI
- Create conceptual, logical and physical models
- Create databases for acquiring and evaluating large datasets
- Slowly changing dimensions (SCD) – types 0, 1, and 2
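A type-2 slowly changing dimension can be sketched as follows: when a tracked attribute changes, the current row is closed out and a new versioned row is appended. The column names (`valid_from`, `valid_to`, `is_current`) are illustrative conventions, not from a specific schema:

```python
from datetime import date

def apply_scd2(dimension, business_key, new_attrs, as_of):
    """Type-2 SCD update: expire the current row for `business_key`
    and append the new version; unchanged rows are left as-is."""
    for row in dimension:
        if row["key"] == business_key and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dimension      # no attribute change: keep the current row
            row["valid_to"] = as_of   # expire the old version
            row["is_current"] = False
    dimension.append({
        "key": business_key,
        **new_attrs,
        "valid_from": as_of,
        "valid_to": None,             # open-ended current row
        "is_current": True,
    })
    return dimension

dim = [{"key": 42, "city": "Sofia", "valid_from": date(2020, 1, 1),
        "valid_to": None, "is_current": True}]
apply_scd2(dim, 42, {"city": "Varna"}, date(2024, 6, 1))
```

Type 0 would ignore the change entirely, and type 1 would overwrite the attribute in place without keeping history.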
Cloud Engineering
- Create all the necessary objects in the Azure Portal, Fabric, and Databricks – Subscription, Entra ID, Azure Data Factory, Synapse, Data Lake, Automation Account, Notebook, Catalog, Data Flow, etc.
- Create end-to-end data solutions in Azure, Fabric, and Databricks with logging, error handling, and error reporting
- Design, build, test, and maintain data architectures and data pipelines
- Validate data sets and data sources
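The validation bullet above can be sketched as a small check that runs after ingestion and collects failures for error reporting. The rules and the sample datasets are illustrative assumptions:

```python
def validate_dataset(rows, required_columns, min_rows=1):
    """Return a list of validation errors; an empty list means the dataset passed."""
    errors = []
    if len(rows) < min_rows:
        errors.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for i, row in enumerate(rows):
        missing = [c for c in required_columns if row.get(c) is None]
        if missing:
            errors.append(f"row {i}: missing or null columns {missing}")
    return errors

good = [{"id": 1, "name": "a"}]
bad = [{"id": 1, "name": None}]
good_errors = validate_dataset(good, ["id", "name"])
bad_errors = validate_dataset(bad, ["id", "name"])
```

In an end-to-end Azure or Databricks solution, a non-empty error list would feed the logging and error-reporting path rather than silently passing data downstream.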
Business Intelligence and Reporting
- Architecture – Star Schema, Data Vault
- Ingestion from any source – structured, semi-structured, and unstructured data: databases, APIs, flat files, and others
- ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) with SSIS, Azure, Fabric, Databricks
- MAD (Master, Analytical, Detail levels) dashboards and reports with SSRS, Power BI, Tableau
Integration
- Develop new core functionality in existing databases
- Use SQL Server Service Broker as a communication layer
- Create REST API web services to synchronize data between the client's application and SQL Server
- Use SSIS to extract parameterized SSRS reports and save them in different formats to a shared folder or deliver them via API
Backend and Automation
- Automations with PowerShell, Python and C#
- Schedule extractions and deliver data (flat files, upserts into databases, etc.) to clients
- Collect logs and raise notifications on previously specified events
- Run data validations after data ingestion, ETL, or data transformation completes
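The last two bullets can be sketched together: run a post-ETL validation, log the outcome, and raise a notification when a previously specified event (here, a failed row-count check) occurs. The `notify` hook is a stand-in for a real alerting channel such as email or a webhook:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl.validation")

notifications = []

def notify(message):
    """Illustrative stand-in for email/Teams/webhook alerting."""
    notifications.append(message)

def validate_after_etl(row_count, expected_min):
    """Check a load's row count after ETL; log and notify on failure."""
    if row_count < expected_min:
        log.error("validation failed: %d rows < %d", row_count, expected_min)
        notify(f"ETL load produced {row_count} rows; expected >= {expected_min}")
        return False
    log.info("validation passed: %d rows", row_count)
    return True

ok = validate_after_etl(row_count=10, expected_min=100)
```

A scheduler (e.g. a cron job or an Automation Account runbook) would call such a check after every ingestion or transformation run.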

