Data Engineer ETL in Bloomfield, CT at Fast Switch

Date Posted: 4/16/2019

Job Snapshot

Job Description

Job ID: 49359

Data Engineer, ETL 
Bloomfield, Connecticut, United States 

Our client is seeking a senior ETL engineer with strong object-oriented programming knowledge and experience with modern software engineering practices (Continuous Integration, Specification by Example, Behavior- and Test-Driven Development – BDD, TDD) and their associated tools, to join a dynamic team located in Bloomfield, CT. This is an exciting opportunity to build on your current skills as well as learn new technologies.

If you are looking for a new challenge and want to make a difference in the healthcare industry, this role is for you.

Your future duties and responsibilities

• Develops Extract, Transform, Load (ETL) scripts utilizing a thorough understanding of relational databases, object-oriented programming, available technology, tools, and existing designs.

• Designs, codes, tests, debugs, and documents programs leveraging Python and Java.

• Provides comprehensive consultation to business unit and IT management and staff at the highest technical level on all phases of application programming. 

• Maintains an overall understanding of Big Data architecture and the latest trends.

• Works closely with client and IT management and staff to identify application development solutions: new or modified programs, reuse of existing code through program development software alternatives, integration of purchased solutions, or a combination of these.

• Perform project delivery functions in support of releases.

• Responsible for delivery of medium- to high-complexity project work.

• Maintain in-depth knowledge of project tools and data sources. 

• Participate in all sprint planning, daily standup, and sprint review sessions.

• Perform sprint demos to get stakeholder agreement that sprint deliverables meet the requirements.

• Provide support for technical specifications.

• Engage in operational readiness activities to ensure that project "go-live" is stable and successful.

• Perform QA automation functions to ensure project delivery is successful.

• Maintain breadth of knowledge on all business processes.

• Communicate analytical findings/gaps/solutions to management and offer solutions to meet gaps. 

• Manage the synthesis of large data-sets into meaningful information used to drive action. 

• Identify opportunities and technical solutions to increase analytical processing efficiency.

• Communicate statuses, good and bad, to partners to keep them informed and engaged in the project, minimizing surprises.

• Develops data workflows and automated scripts in a Hadoop environment using Hive, Impala, Hue, Sqoop, and Oozie.

• Demonstrate passion for anticipating and meeting the expectations and requirements of business partners. 

• Ability to work with different groups to bridge gaps, solve problems, and develop solutions.

• Demonstrate strong accountability and a desire to deliver a quality product on time; communicate statuses or offer alternatives to meet customer objectives.

Required qualifications to be successful in this role
SQL (Expert in any RDBMS, such as Teradata, Oracle, DB2, MSSQL) 

Object Oriented Design and Development (Python, Java, C#)

Agile Engineering Practices 

Test Driven Development (TDD)

Continuous Integration

Continuous Delivery


Scripting in *nix environments

Database performance tuning

Skill Set                                                          Years of Experience   Proficiency Level

SQL (Expert in any RDBMS, such as Teradata, Oracle, DB2, MSSQL)    5                     Expert

Object Oriented Design and Development (Python, Java, C#)          3                     Advanced

Agile Engineering Practices (Test Driven Development (TDD),
Continuous Integration, Continuous Delivery, Refactoring)          2

Database performance tuning                                        5

Additional skills:

-- Design Patterns

-- Business Intelligence dashboard creation (Tableau, Looker, etc.)

-- Data automation, quality testing and validation