
Datastage Developer – DD202206003

Date Posted: 06-24-2022, Job Code: DD202206003

Job Responsibilities:

  • Work with stakeholders from requirement gathering through production deployment in all phases of the Software Development Life Cycle.
  • Understand business requirements and devise technical solutions using ETL Datastage, Python, and SnowSQL.
  • Design ER relationships between fact and dimension tables using data warehousing methodologies.
  • Analyze source data, technical metadata, profile source data and create data mapping and transformation rules based on business requirements.
  • Work in ETL Datastage for building high-performance data integration solutions, including extraction, transformation, and load (ETL) packages for data warehousing.
  • Design dynamic ETL flows using runtime column propagation (RCP) and reusable components that run concurrently to use memory efficiently, following best practices.
  • Design and develop ETL jobs by identifying various source files and applying complex transformations and ETL logic to them to load target databases.
  • Create complex SQL queries, stored procedures, functions, views, and triggers using CTEs and subqueries to implement business logic for tables with large data volumes.
  • Document artifacts including requirement analysis, design documents (HLD, LLD), unit test plans, and a traceability matrix to ensure the output meets requirements for all new development and modifications of the ETL processes and ELT jobs.
  • Create Python scripts and batch scripts for file handling and batch processing using the Control-M job scheduler.
  • Create packages of ETL jobs using Git and deploy the components using Bitbucket.
  • Provide Root Cause Analysis (RCA) for reported technical issues and deliver efficient technical solutions.
  • Follow Agile methodology: break down Epics into technical Stories and Tasks; analyze and finalize requirements; obtain solution and design review approvals; and carry out development, testing, Control-M job setup, pre-load data validation, DS and non-DS migration, historical data load, production data validation, and ongoing support activities.
  • Migrate on-premises data marts to cloud-based data marts (Snowflake).
  • Prepare implementation plans, backups, and back-out plans to support release activities.

Skills Requirement:

  • Must possess a Bachelor's degree in Computer Science/Computer Engineering/Software Engineering/Computer Applications/Information Technology/Information Systems/Electrical or Electronics/Mathematics or a related field, or an equivalent degree.
  • Others:  ETL, Python, SnowSQL, Data Warehousing, PL/SQL, Data Modeling, RDBMS, Oracle, DB2, Snowflake, XML, UNIX scripting, Linux, Mainframe, SAS, Markit EDM.

Location: Plano, TX

Submit resume to: usitcareers@egrovesys.com

APPLY NOW:

If you would like us to email you when a position becomes available, kindly fill out the form below.
