
Datastage Lead – DL202208001

Date Posted: 08-09-2022, Job Code: DL202208001

Job Responsibilities:

  • Create ETL job designs for business requirements and implement the best possible solutions using IBM Datastage.
  • Work with stakeholders from requirement gathering through production deployment, across all phases of the Software Development Life Cycle.
  • Understand business requirements and devise technical solutions using IBM Datastage, Talend, Unix, and SQL.
  • Analyze the existing ETL Datastage components, Unix Scripts, SQL Scripts and propose enhanced solutions by following ETL tuning techniques.
  • Design ER relationship between fact and dimension tables using data warehousing methodologies.
  • Analyze source data, technical metadata, profile source data and create data mapping and transformation rules based on business requirements.
  • Work in ETL Datastage for building high-performance data integration solutions, including extraction, transformation, and load (ETL) packages for data warehousing.
  • Design dynamic ETL flows using run-time column propagation (RCP) and reusable components that run concurrently to use memory efficiently, following all best practices.
  • Design and develop ETL jobs by identifying various source files and applying complex transformations and ETL logic to them to load target databases.
  • Create complex SQL queries, Stored Procedures, Functions, Views, and Triggers using CTEs and sub-queries to implement business logic for tables with large data volumes.
  • Document artifacts including requirement analysis, design documents (HLD, LLD), unit test plans and traceability matrix to ensure that the output is as per the requirements for all new development and modifications of the ETL processes and ELT jobs.
  • Create Python scripts and Unix batch scripts for file handling and batch processing using job scheduler Autosys.
  • Create packages of ETL jobs using Git and deploy the components using Bitbucket.
  • Provide Root Cause Analysis (RCA) for technical issues raised and deliver efficient technical solutions.
  • Follow Agile methodology: break down EPICs into technical Stories and Tasks; analyze and finalize requirements; obtain solution and design review approvals; and carry out development, testing, Autosys job setup, pre-load data validation, DS and non-DS migration, historical data load, production data validation, and ongoing support activities.
  • Migrate On-Premises Data Mart to Cloud Based Data Marts.
  • Prepare implementation plans, backups and back out plans to support release activities.
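As an illustration of the Python file handling and batch processing mentioned above, here is a minimal sketch of splitting incoming records into fixed-size batch files, the kind of pre-processing step a scheduler such as Autosys might invoke; all function and file names (`split_batch`, `write_batches`, `batch_###.csv`) are hypothetical, not part of the posting.

```python
import csv
import os

def split_batch(rows, batch_size):
    """Yield fixed-size lists of rows for staged loading."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # emit the final partial batch, if any
        yield batch

def write_batches(rows, out_dir, batch_size=1000):
    """Write each batch to a numbered CSV file and return the file paths."""
    paths = []
    for i, batch in enumerate(split_batch(rows, batch_size)):
        path = os.path.join(out_dir, f"batch_{i:03d}.csv")
        with open(path, "w", newline="") as fh:
            csv.writer(fh).writerows(batch)
        paths.append(path)
    return paths
```

In practice a wrapper Unix script would call a program like this and then trigger the downstream Datastage load once all batch files are present.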

Skills Requirement:

  • Must possess a Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, Computer Applications, Information Technology, Information Systems, Electrical or Electronics Engineering, Mathematics, or a related field, or an equivalent degree.
  • Others: ETL, Python, SnowSQL, Data Warehousing, PL/SQL, Data Modeling, RDBMS, Oracle, DB2, Snowflake, XML, UNIX scripting, Linux, Mainframe, SAS, Markit EDM

Location: Charlotte, NC

Submit resume to: usitcareers@egrovesys.com

APPLY NOW:

If you would like us to email you when a position becomes available, kindly fill out the form below.

Contact Us