Eaton Corporation Snowflake Data Engineering Expert in Pune, India

What you’ll do:

We are seeking a highly skilled data engineer with specialized experience in Snowflake to join our team as a Snowflake Data Engineering Expert. The ideal candidate will have extensive experience in designing and implementing complex data pipelines and optimizing Snowflake performance. Key responsibilities include designing and implementing complex data pipelines using Snowflake-specific features such as Snowpipe, data sharing, and multi-cluster warehouses, as well as building, maintaining, and tuning Snowflake objects such as tables, views, and stored procedures.

This role will be part of DFI, where the person will work closely with the group's data architecture and data platform team, product owner team, and project management team to ensure DFI data platforms are developed in accordance with the group's best practices.

• Analyze, design, and implement data warehouses, data lakes, data models, and data pipelines.

• Build the code and logic required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL or other programming languages, working with big data.

• Build analytics solutions that provide valuable insights for clients and end users.

• Work in an agile project team with specific goals and deliverables.

• Optimize Snowflake performance by tuning SQL queries, streamlining data loading processes, and applying best practices for data modeling and schema design (see the tuning sketch after this list).

• Implement ETL/ELT into Snowflake using native functionality such as Snowpipe (see the ingestion sketch after this list).

• Implement data governance and security policies within Snowflake, including role-based access control, data encryption, and data masking techniques (see the governance sketch after this list).

• Troubleshoot and resolve issues related to data quality, data consistency, and data integrity within Snowflake data warehouses.

• Design, develop, and maintain scalable data pipelines and ETL processes to support data analytics and machine learning initiatives.

• Utilize workflow management platforms to orchestrate and automate data engineering pipelines.

• Implement data models and schemas to support data storage, retrieval, and analysis.

• Work closely with data scientists and analysts to understand data requirements and provide data engineering solutions.

• Collaborate with software engineers, product owners, and the scrum team to integrate data pipelines with existing systems and applications.

• Implement data quality monitoring and validation processes to ensure data integrity and accuracy.

• Optimize data infrastructure and performance to meet scalability and performance requirements.
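
For illustration, here is a minimal Snowflake SQL sketch of the kind of Snowpipe-based ingestion and ELT work described above; the stage, schema, and table names are hypothetical and not part of this posting.

  -- Illustrative sketch only; stage, schema, and table names are hypothetical.
  -- Land raw JSON files continuously via Snowpipe, then transform with an ELT step.
  CREATE TABLE IF NOT EXISTS raw.orders_json (v VARIANT);

  CREATE OR REPLACE PIPE raw.orders_pipe
    AUTO_INGEST = TRUE  -- requires an external stage wired to cloud event notifications
  AS
    COPY INTO raw.orders_json
    FROM @raw.orders_stage
    FILE_FORMAT = (TYPE = 'JSON');

  -- ELT step: flatten the raw payload into a typed reporting table.
  CREATE OR REPLACE TABLE analytics.orders AS
  SELECT
      v:order_id::NUMBER        AS order_id,
      v:order_ts::TIMESTAMP_NTZ AS order_ts,
      v:amount::NUMBER(12,2)    AS amount
  FROM raw.orders_json;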
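
A similarly minimal sketch of common tuning levers referenced in the responsibilities: clustering keys for micro-partition pruning, plus warehouse sizing and multi-cluster scale-out; the warehouse and table names are hypothetical.

  -- Illustrative sketch only; warehouse and table names are hypothetical.
  ALTER TABLE analytics.orders CLUSTER BY (order_ts);

  ALTER WAREHOUSE transform_wh SET
    WAREHOUSE_SIZE    = 'MEDIUM'
    AUTO_SUSPEND      = 60   -- seconds idle before suspending, to control credit spend
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 3;   -- multi-cluster scale-out for concurrency (Enterprise edition)

  -- Check how well the clustering supports pruning for typical date filters.
  SELECT SYSTEM$CLUSTERING_INFORMATION('analytics.orders', '(order_ts)');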
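
And a minimal sketch of the governance items, combining role-based access control with a dynamic data masking policy; the role, schema, table, and column names are hypothetical.

  -- Illustrative sketch only; role, schema, table, and column names are hypothetical.
  CREATE ROLE IF NOT EXISTS analyst;
  GRANT USAGE ON SCHEMA analytics TO ROLE analyst;
  GRANT SELECT ON ALL TABLES IN SCHEMA analytics TO ROLE analyst;

  -- Mask email addresses for every role except the governance role.
  CREATE OR REPLACE MASKING POLICY analytics.email_mask AS (val STRING)
    RETURNS STRING ->
    CASE
      WHEN CURRENT_ROLE() IN ('DATA_GOVERNANCE') THEN val
      ELSE '*** MASKED ***'
    END;

  ALTER TABLE analytics.customers
    MODIFY COLUMN email SET MASKING POLICY analytics.email_mask;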

Qualifications:

  • BE in Computer Science, Electrical, Electronics, or any other equivalent degree

  • 10+ years of experience as a Snowflake expert

Skills:

PySpark, Spark SQL, Serverless SQL pool, Dedicated SQL pools, Data warehousing, Data modeling

Team player, collaboration, good communication skills
