
Data Integration Engineer

Job title: Data Integration Engineer

Company:

Job description:

· Collaborate with development teams and lead designers to define architectural requirements and ensure customer satisfaction with ETL mappings.

· Responsible for validating the data model with the Data Architect.

· Solve difficult design and development issues and manage complex ETL requirements and design.

· Lead and guide the development of an ETL architecture and ETL mappings.

· Understand the data flow and guide the team on source-to-target mappings.

· Responsible for validating ETL mappings.

· Responsible for unit testing and integration of ETL mappings.

· Implement an ETL solution that meets strict performance requirements.

· Identify, recommend and implement ETL architecture and process improvements.

· Manage the construction phase and ensure code quality, compliance with requirements, and adherence to the ETL architecture.

· Assist with and verify the solution design and the production of all design-phase deliverables.

· Provide technical leadership and manage all aspects of the SDLC: requirements analysis, design, development, testing, and support/maintenance.

· Good understanding of ETL tools such as ADF and SSIS.

· Understand, design and develop ETL mappings using SQL Server and SSIS.

· Develop and modify ETL mappings based on source-to-target mapping specifications (a minimal illustration follows this list).

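As a minimal illustration of what a source-to-target mapping expresses, here is a sketch in plain Python; all column names and transforms are hypothetical placeholders, not an SSIS or ADF construct:

```python
# Minimal sketch of a source-to-target mapping, expressed in plain Python.
# All column names and transforms here are hypothetical placeholders.
from datetime import datetime

# Each entry maps a source column to (target column, transform).
SOURCE_TO_TARGET = {
    "cust_id":    ("customer_key", int),
    "cust_name":  ("customer_name", str.strip),
    "created_dt": ("created_date", lambda v: datetime.strptime(v, "%Y-%m-%d").date()),
}

def map_row(source_row):
    """Apply the source-to-target mapping to a single source row."""
    return {
        tgt_col: transform(source_row[src_col])
        for src_col, (tgt_col, transform) in SOURCE_TO_TARGET.items()
    }

print(map_row({"cust_id": "42", "cust_name": " Ada ", "created_dt": "2021-10-06"}))
```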

Responsibilities

  • End-to-end process delivery of data extraction, transformation, and load from disparate sources into a form usable by analysis processes, for projects of moderate complexity, using strong technical skills and a feel for database performance

  • Good understanding of dimensional data modeling standards and best practices to ensure high quality
  • Batch processing – Ability to design an efficient method for processing high volumes of data when a group of transactions is collected over a period

  • Data integration (provisioning, storage and migration) – Ability to design and implement models, capabilities and solutions to manage data within the enterprise (structured and unstructured, data archiving principles, data warehousing, data sourcing, etc.). This includes data models, storage requirements, and data migration from one system to another.

  • Data Quality, Profiling and Cleansing – Ability to examine (profile) a dataset to establish its quality against a defined set of metrics and to highlight data for which corrective action (cleansing) is needed (see the sketch after this list)
  • Streaming systems – Ability to discover, integrate and ingest all available data from the machines that produce it, as quickly as it is produced, in any format and at any quality
  • Excellent interpersonal skills to network with various departments within the company to understand data and deliver business value.
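A minimal sketch of the profiling-and-cleansing idea above, assuming a toy list-of-dicts dataset and two invented quality metrics (null rate and duplicate keys):

```python
# Profiling/cleansing sketch; the dataset and metrics are invented for illustration.
from collections import Counter

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@example.com"},   # duplicate id
]

def profile(rows):
    """Measure the dataset against two simple quality metrics."""
    null_rate = sum(r["email"] is None for r in rows) / len(rows)
    dup_ids = [k for k, n in Counter(r["id"] for r in rows).items() if n > 1]
    return {"email_null_rate": null_rate, "duplicate_ids": dup_ids}

def cleanse(rows):
    """Corrective action: drop null emails, keep the first row for each id."""
    seen, clean = set(), []
    for r in rows:
        if r["email"] is not None and r["id"] not in seen:
            seen.add(r["id"])
            clean.append(r)
    return clean

print(profile(rows))   # null rate and duplicate keys flagged for cleansing
print(cleanse(rows))
```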

Role offers

  • Excellent opportunities to learn various tools and technologies used in sophisticated data architecture within business intelligence and analytics data services
  • An opportunity for candidates to showcase strong analytical and problem-solving skills
  • An exceptional opportunity to re-imagine, redesign and apply technology to add value to the business and operations
  • Learning and Growth Opportunities in Cloud and Big Data Engineering Spaces

Essential skills

  • Over 3 years of experience developing large-scale data pipelines in cloud or on-premises environments.
  • Highly proficient in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend.
  • Fundamental knowledge of Data Warehouse / Data Mart architecture and modeling
  • Define and develop data ingestion, validation, and transformation pipelines (a minimal sketch follows this list).
  • Basic knowledge of distributed data processing and storage
  • Fundamental knowledge of working with structured, unstructured and semi-structured data
  • For the cloud data engineer role, experience with ETL/ELT patterns, preferably using Azure Data Factory and Databricks tasks
  • Extensive experience applying analysis, insight and data mining to real-world business problems
  • Technical experience programming in Python and R, plus statistical software packages such as SAS
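A minimal sketch of an ingestion, validation, and transformation pipeline in plain Python; the CSV source, column names and rules are illustrative assumptions, not any particular tool's API:

```python
# Ingest -> validate -> transform pipeline sketch; all names are illustrative.
import csv
import io

RAW_CSV = "id,amount\n1,10.5\n2,not_a_number\n3,7.25\n"   # stand-in for a real source

def ingest(raw):
    """Read rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw)))

def validate(rows):
    """Keep rows whose amount parses as a number; route the rest to a reject path."""
    valid, rejected = [], []
    for r in rows:
        try:
            r["amount"] = float(r["amount"])
            valid.append(r)
        except ValueError:
            rejected.append(r)
    return valid, rejected

def transform(rows):
    """Apply a business rule: convert amounts to integer cents."""
    return [{"id": int(r["id"]), "amount_cents": round(r["amount"] * 100)} for r in rows]

valid, rejected = validate(ingest(RAW_CSV))
print(transform(valid))    # rows ready to load
print(rejected)            # rows needing quarantine or correction
```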

Essential qualification

  • BE/BTech in computer science, engineering or a relevant field


Expected salary:

Location: Mumbai, Maharashtra – Pune, Maharashtra

Job date: Wed, 06 Oct 2021 06:56:10 GMT

Apply for the job now!

