Consultant

Location
Pune, India

Job Title:

Consultant

Job Description

Key Responsibilities
• Data Migration and Integration: Lead the migration of data from existing host systems into a new Data Lake/Data Warehouse architecture on Databricks.
• Data Consolidation: Consolidate data from multiple sources and ensure that it is integrated into a unified data repository for reporting and analytics.
• Development of Databricks Workflows: Build, maintain, and optimize workflows and data pipelines on Databricks, ensuring reliable and timely data delivery.
• Data Quality Assurance: Ensure data quality and integrity throughout the migration process and proactively address any data quality issues.
• Technical Documentation: Create and maintain clear technical documentation for all data engineering tasks, including workflows, pipelines, and any new data models.

Required Skills & Experience
•    6+ years’ experience in data engineering in cloud-based environments such as Databricks and the Azure platform.
•    Strong experience in designing and building ETL (Extract, Transform, Load) pipelines.
•    Advanced proficiency in Python and PySpark for data manipulation and modeling.
•    Solid understanding of data modeling and data warehouse architecture.
•    Proficiency with SQL and working with large datasets.
•    Familiarity with data lake and data warehouse concepts.
•    Experience with data integration tools and technologies.
•    Familiarity with business intelligence tools (e.g., Domo, Tableau, Power BI).
•    Ability to collaborate with cross-functional teams, including data scientists and business stakeholders.
•    Experience with version control (e.g., Git) and agile development methodologies.

Location:

IND Pune - Amar Tech Centre S No.30/4A 1

Language Requirements:

Time Type:

Full time

If you are a California resident, by submitting your information, you acknowledge that you have read and have access to the Job Applicant Privacy Notice for California Residents.