Data Engineer (PySpark, Azure Fabric)
Adecco
- București
- Permanent
- Full-time
About the client
- Our client is a global technology and software engineering company specializing in digital transformation, product development, and advanced technology consulting.
- The organization partners with enterprises across industries such as finance, retail, telecommunications, healthcare, automotive, and media, helping them design, build, and scale modern digital platforms and data-driven solutions.
Requirements
- 5+ years of experience in Data Engineering or related engineering roles
- Strong experience with Azure-based data platforms, ideally Azure Fabric
- Strong hands-on experience with Databricks
- Good understanding of Databricks features such as Unity Catalog, Delta Lake, clusters, notebooks, and jobs
- Advanced Python programming skills
- Hands-on experience with PySpark and SparkSQL
- Experience implementing batch data processing pipelines
- Experience with Spark Streaming
- Knowledge of OneLake / Delta Lake concepts
- Experience with Data Factory Gen2 dataflows and M (Power Query) code
- Experience implementing CI/CD pipelines (Azure DevOps or similar tools)
- Understanding of Azure services and cloud data architecture
- Experience integrating data platforms with Power BI
Responsibilities
- Design and implement scalable data pipelines using Azure Fabric
- Develop data processing and transformation logic using Python, PySpark and SparkSQL
- Work with OneLake and Delta Lake concepts to support modern Lakehouse data architectures
- Develop and support solutions using Cosmos DB (NoSQL API)
- Contribute to Azure Fabric workloads including Data Engineering, Data Factory Gen2 and Lakehouse
- Build and optimize Spark workloads using Databricks
- Implement CI/CD pipelines and follow DevOps best practices
- Integrate data solutions with Power BI for reporting and analytics
- Collaborate with AI, data science and product teams to enable data-driven and AI-powered solutions
- Ensure data quality, performance, reliability and security across data platforms
- Participate in Agile ceremonies and contribute to sprint deliveries
- Support production environments and contribute to continuous improvements