ETL Developer

Full time, TechCORE MSS, in Consulting, in Information Technology
  • 800 Corporate Drive, Suite 301, Office 309, Stafford, Virginia 22554, United States
  • Post Date : March 25, 2021
  • Apply Before : July 10, 2021
  • Salary: $100,000.00 - $162,000.00 / Yearly

Job Detail

  • Career Level: Consultant
  • Experience: 1 year
  • Qualifications: Bachelor's degree

Job Description

ETL Developer – 5 years working experience required

This position is responsible for developing data integration programs that load, transform, and extract data to/from the data warehouse, and for ensuring that architecture and development follow data warehouse best practices.

You will design, develop, and test extract, transform, and load (ETL) processes for enterprise-wide data warehouse and data store implementations, mainly with DataStage, and act as a Cloud Architect assuming overall responsibility for the data migration solution. You should be capable of working with AWS services such as S3, EC2, VPC, Glue, and Lambda, with Python and JSON, and with databases such as AWS Aurora, DynamoDB, the RDS database engines, and Redshift.

Technical Knowledge and Skills:
  • Work closely with Leadership and Data Governance to define and refine the Data Lake platform to achieve business objectives.
  • Extract, transform, and load data from various Databases.
  • Develop and automate advanced ETL processes that gather data from and provide data to different sources.
  • Document, publish and maintain ETL processes and related documentation.
  • Apply Best Practices in developing AWS Data Lake flow.
  • Create ETL flows to integrate on-premises data into AWS S3 buckets in the cloud.
  • Translate detailed business requirements into optimal database and ETL solutions.
  • Create ETL jobs that cleanse data and load it into AWS S3 buckets.
  • Create flows to load data from Amazon S3 into Redshift using Glue (see the sketch after this list).
  • Diagnose complex problems, including performance issues, and drive them to resolution.
  • Write reusable, testable, and efficient code in Python.
  • Create Collibra Data Governance Center (DGC) models, including assets, relationships, domains, and communities.
  • Configure data governance based on business requirements and specifications.
  • Assist the Cloud Architect with the overall design of the data migration solution.
  • Connect to and extract (meta)data from multiple source systems, comprising databases such as Oracle, SQL Server, Hadoop, etc., into the data governance platform.
  • Develop DataStage Parallel Extender jobs using stages such as Aggregator, Join, Merge, Lookup, Source Dataset, External Filter, Row Generator, Column Generator, Change Capture, Copy, Funnel, Sort, and Peek.
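
To illustrate the S3-to-Redshift Glue flow mentioned above, here is a minimal sketch of a Glue ETL script in Python. The bucket paths, catalog connection name, table names, and cleansing rule are hypothetical placeholders, not part of this posting; a real job would follow the project's own connection setup and data quality rules.

```python
# Minimal AWS Glue ETL sketch: read raw CSV files from S3, apply a simple
# cleansing step, and load the result into a Redshift table.
# All names (bucket, connection, table, temp dir) are hypothetical examples.
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql.functions import trim

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: raw files landed in an S3 bucket by the on-premises integration flow.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-data-lake/raw/orders/"]},
    format="csv",
    format_options={"withHeader": True},
)

# Transform: basic cleansing -- drop rows missing the key and trim a text column.
df = raw.toDF()
df = df.dropna(subset=["order_id"])
df = df.withColumn("customer_name", trim(df["customer_name"]))
cleansed = DynamicFrame.fromDF(df, glue_context, "cleansed_orders")

# Load: write to Redshift through a Glue catalog connection, staging via S3.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=cleansed,
    catalog_connection="example-redshift-connection",
    connection_options={"dbtable": "staging.orders", "database": "dw"},
    redshift_tmp_dir="s3://example-data-lake/tmp/redshift/",
)

job.commit()
```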

Qualifications:
  • Experience in coding dimension and fact tables; general understanding of dimensional modeling and star schemas.
  • Experience with data warehousing concepts for Data Quality and Governance implementation.
  • Working experience with databases like DB2 and Redshift is required.
  • Knowledge of Python and its related libraries.
  • Experience in creating technical design and implementing ETL solutions.
  • Knowledge of algorithms, data structures, and Python design patterns.
  • Must possess experience and skills in understanding data models and developing databases and database objects.
  • Solid experience in Python and great motivation to have it as your primary programming language.
  • Experience with data governance tools like IBM Information Governance Catalog, Collibra.
  • Experience in loading data from S3 to Redshift (see the sketch after this list).
  • Experience with web related technologies like JSON.
  • Excellent knowledge of SQL and PL/SQL with the ability to extract, manipulate and integrate enterprise data, developing reports and expanding reporting capability utilizing SQL backend mining tools.
  • Experience working with AWS technologies (S3, Glue, RedShift).
  • Advanced experience with Red Hat Satellite, including systems provisioning, content view management, host group management, Discovery, and Capsule management.
  • Experience designing and architecting solutions using AWS foundation services such as VPC is a plus.
  • Advise senior staff on technical issues as appropriate.
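
As one possible illustration of the S3-to-Redshift loading experience called for above, the sketch below issues a Redshift COPY command through the boto3 Redshift Data API. The cluster identifier, database, user, IAM role, table, and bucket path are hypothetical; an actual implementation might equally use a Glue job or a direct database driver.

```python
# Hypothetical sketch: load files from S3 into a Redshift table with COPY,
# submitted through the Redshift Data API (boto3). All identifiers are examples.
import boto3

redshift_data = boto3.client("redshift-data", region_name="us-east-1")

copy_sql = """
    COPY staging.orders
    FROM 's3://example-data-lake/cleansed/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS CSV
    IGNOREHEADER 1;
"""

response = redshift_data.execute_statement(
    ClusterIdentifier="example-dw-cluster",
    Database="dw",
    DbUser="etl_user",
    Sql=copy_sql,
)

# The call is asynchronous; poll describe_statement until the COPY finishes.
status = redshift_data.describe_statement(Id=response["Id"])["Status"]
print(status)
```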

Required Qualifications and Experience:
  • 4-5+ years of related information technology experience, such as cloud architecture or senior development with AWS and IBM DataStage.
  • 4+ years of proficiency and experience with the AWS cloud environment and its best practices.
  • 4+ years' experience with IBM tools such as Information Analyzer and Watson Knowledge Catalog.
  • 5+ years of experience designing and developing DataStage ETL jobs.
  • 4 years of proficiency in developing scripts and scripting languages for cloud services.
  • 4+ years of strong working knowledge of AWS CLI, S3, Redshift, EC2, API Gateway, Lambda, Python, Step Functions, etc. (see the sketch after this list).
  • 4+ years of experience developing, designing, and architecting solutions using AWS and IBM foundation services (VPC/VPN).
  • Knowledge of AWS CLI and of JSON and/or PowerShell scripting. (Preferred)
  • AWS Solutions Architect certification. (Preferred)
  • Capability to migrate and manage on-premises workload to cloud service providers.
  • Information Server Admin experience (Preferred).
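
As a small illustration of the cloud-service scripting and JSON handling referenced above, the following sketch uses boto3 to start a Glue job run and then a Step Functions execution with a JSON payload. The job name, state machine ARN, and input fields are hypothetical placeholders, not named in this posting.

```python
# Hypothetical scripting sketch: kick off a Glue ETL job and a Step Functions
# workflow from Python using boto3. All names and ARNs are example values.
import json

import boto3

glue = boto3.client("glue", region_name="us-east-1")
sfn = boto3.client("stepfunctions", region_name="us-east-1")

# Start the (hypothetical) Glue job that loads cleansed data into Redshift.
run = glue.start_job_run(
    JobName="example-orders-to-redshift",
    Arguments={"--load_date": "2021-01-01"},
)
print("Glue job run id:", run["JobRunId"])

# Hand the run id to a (hypothetical) Step Functions state machine that
# tracks downstream validation and notification steps.
execution = sfn.start_execution(
    stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:example-dw-load",
    input=json.dumps({"glueJobRunId": run["JobRunId"]}),
)
print("Started execution:", execution["executionArn"])
```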
