Job title: Data Integration Engineer
Company: Hexaware Technologies
Job description: Responsibilities
- End-to-end process delivery of data extraction, transformation, and load (ETL) from disparate sources into a form usable by analysis processes, for projects of moderate complexity, applying strong technical capability and a sense of database performance
- Good understanding of dimensional data modeling standards and best practices to ensure high quality
- Batch processing: ability to design an efficient method for processing high volumes of data when a group of transactions is collected over a period
- Data integration (provisioning, storage, and migration): ability to design and implement models, capabilities, and solutions to manage data within the enterprise (structured and unstructured data, data archiving principles, data warehousing, data sourcing, etc.), including data models, storage requirements, and data migration from one system to another
- Data quality, profiling, and cleansing: ability to examine (profile) a dataset to establish its quality against a defined set of metrics and to highlight data needing corrective action (cleansing)
- Stream systems: ability to discover, integrate, and ingest all available data from the machines that produce it, as quickly as it is produced, in any format and at any quality
- Excellent interpersonal skills to network with various departments within the company to understand data and deliver business value

Role offers
- Excellent opportunities to learn the tools and technologies used in sophisticated data architecture within business intelligence and analytics data services
- An opportunity to showcase strong analytical skills and problem-solving ability
- An exceptional opportunity to re-imagine, redesign, and apply technology to add value to the business and operations
- Learning and growth opportunities in cloud and big data engineering

Essential skills
- Over 3 years of experience developing large-scale data pipelines in cloud or on-premise environments
- Proficiency in one or more market-leading ETL tools such as Informatica, DataStage, SSIS, or Talend
- Fundamental knowledge of data warehouse / data mart architecture and modeling
- Ability to define and develop data ingestion, validation, and transformation pipelines
- Basic knowledge of distributed data processing and storage
- Fundamental knowledge of working with structured, unstructured, and semi-structured data
- For the cloud data engineer role, experience with ETL / ELT models, preferably using Azure Data Factory and Databricks
- Extensive experience applying analysis, domain knowledge, and data mining to real-world business problems
- Technical experience programming in Python and in statistical software packages (R, SAS)

Essential qualification
- BE / BTech in computer science, engineering, or a relevant field
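The data quality responsibility above (profiling a dataset against defined metrics, then flagging records for cleansing) can be sketched in a few lines of Python. This is a minimal illustration only; the field names, records, and completeness metric are assumptions, not part of the role description:

```python
# Minimal data profiling/cleansing sketch.
# Profiling: measure a quality metric (here, completeness) per field.
# Cleansing: flag the rows that fail and need corrective action.
# All field names and sample records below are illustrative assumptions.

def profile(rows, field):
    """Return completeness (share of non-null, non-empty values) for one field."""
    if not rows:
        return 0.0
    non_null = sum(1 for r in rows if r.get(field) not in (None, ""))
    return non_null / len(rows)

def flag_for_cleansing(rows, required_fields):
    """Return indices of rows missing any required field."""
    return [
        i for i, r in enumerate(rows)
        if any(r.get(f) in (None, "") for f in required_fields)
    ]

records = [
    {"id": 1, "email": "a@example.com", "country": "IN"},
    {"id": 2, "email": "",              "country": "IN"},
    {"id": 3, "email": "c@example.com", "country": None},
]

print(profile(records, "email"))                          # completeness of "email"
print(flag_for_cleansing(records, ["email", "country"]))  # row indices to cleanse
```

In a production pipeline the same pattern is usually expressed with a dataframe or an ETL tool's built-in profiling step, but the logic is the same: compute metrics, compare against thresholds, route failing records for correction.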
Location: Chennai, Tamil Nadu
Job date: Thu, 28 Oct 2021 06:51:00 GMT