Bridges gaps between development and operations activities by
participating in enforcing automation in the building, testing, and
deployment of applications using Continuous Integration and
Continuous Delivery (CI/CD) methods with Jenkins, Stash, and
Docker. Builds scalable and robust Extract, Transform, and Load
(ETL) processes and data flows. Provides business solutions by
developing complex or multiple software applications.
Develops original and creative technical solutions to ongoing
business needs.
Designs applications or subsystems on major projects and for/in
Develops applications for multiple projects supporting several
corporate initiatives.
Supports and performs all phases of testing leading to
implementation.
Assists in the planning and conducting of user acceptance testing.
Develops comprehensive documentation for multiple applications
supporting several corporate initiatives.
Responsible for post-installation testing and resolution of any
problems.
Establishes project plans for projects of moderate scope.
Works on complex assignments and often multiple phases of a
project.
Performs independent and complex technical and functional
analysis for multiple projects supporting several initiatives.
Education and Experience:
Bachelor's degree (or foreign education equivalent) in Computer
Science, Engineering, Information Technology, Information Systems,
Mathematics, Physics, or a closely related field and three (3)
years of experience in the job offered or three (3) years of
experience implementing data solutions in a Data Warehouse
environment using Extract Transform Load (ETL) technologies.
Or, alternatively, Master's degree (or foreign education
equivalent) in Computer Science, Engineering, Information
Technology, Information Systems, Mathematics, Physics, or a closely
related field and one (1) year of experience in the job offered or
one (1) year of experience implementing data solutions in a Data
Warehouse environment using Extract Transform Load (ETL)
technologies.
Skills and Knowledge:
Candidate must also possess:
Demonstrated Expertise (DE) designing and implementing ETL
processes by developing mappings and workflows, using the
Informatica tool and UNIX/Python shell scripts; and developing and
managing data warehouses using Wherescape Red;
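The ETL work described above follows the classic extract, transform, load pattern. As an illustration only (a minimal stand-in sketch, not the Informatica or Wherescape tooling named in the posting), the pattern looks like this in plain Python, using SQLite as a hypothetical target database:

```python
import sqlite3

# Minimal ETL sketch: extract raw records, transform them into a
# clean shape, and load them into a target table. All names here
# (sales, extract/transform/load) are illustrative, not from any tool.

def extract(rows):
    """Extract: return raw source records as (name, amount) strings."""
    return rows

def transform(raw):
    """Transform: normalize names and cast amounts to float."""
    return [(name.strip().upper(), float(amount)) for name, amount in raw]

def load(conn, records):
    """Load: insert transformed records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
raw = [(" alice ", "10.50"), ("bob", "3")]
load(conn, transform(extract(raw)))
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In tools like Informatica, the transform step corresponds to a mapping and the orchestration of the three steps to a workflow.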
DE performing data modelling and database design, using
dimensional (Star and Snowflake) and Data Vault data modelling
techniques, and type 1, 2, and 3 dimensional structures in a data
warehouse;
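The type 1, 2, and 3 structures mentioned above refer to slowly changing dimension handling: type 1 overwrites a changed attribute, type 2 keeps full history by versioning rows, and type 3 keeps the prior value in an extra column. A sketch of a type 2 update in plain Python (dimension rows as dicts; all names here are hypothetical, not tied to any modelling tool):

```python
from datetime import date

# Type 2 slowly changing dimension update: rather than overwriting a
# changed attribute (type 1), expire the current row and insert a new
# versioned row, preserving history.

def scd2_upsert(dim, key, attrs, as_of):
    """Expire the current row for `key` if attrs changed, then insert a new version."""
    current = next((r for r in dim if r["key"] == key and r["end_date"] is None), None)
    if current is not None:
        if all(current[k] == v for k, v in attrs.items()):
            return dim  # no attribute change: keep the current row
        current["end_date"] = as_of  # expire the old version
    dim.append({"key": key, **attrs, "start_date": as_of, "end_date": None})
    return dim

dim = []
scd2_upsert(dim, "C1", {"city": "Boston"}, date(2023, 1, 1))
scd2_upsert(dim, "C1", {"city": "Chicago"}, date(2024, 6, 1))  # city changed
```

After the second call the dimension holds two rows for "C1": the expired Boston version and the open-ended Chicago version.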
DE performing Oracle PL/SQL and Snowflake application
development; developing UNIX scripts; developing Python scripts to
load data into Oracle or Snowflake data warehouses; designing and
developing Control-M jobs to automate and schedule end-to-end
processes; and designing, developing, deploying, and operating
highly available, scalable, and fault-tolerant systems using
Amazon Web Services (AWS) cloud services.
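One common building block of the fault-tolerant load processes described above is retrying a step that hit a transient failure before surfacing the error to the scheduler. A hedged sketch in plain Python (the `flaky_load` job below is hypothetical, simulating a transient error; real jobs would call the actual load logic):

```python
import time

# Illustrative retry wrapper for a data-load step. If the job keeps
# failing after the allowed attempts, the exception propagates so the
# scheduler (e.g. a Control-M job) can mark the run as failed.

def with_retries(job, attempts=3, delay=0.0):
    """Run `job`, retrying on any exception up to `attempts` times."""
    for attempt in range(1, attempts + 1):
        try:
            return job()
        except Exception:
            if attempt == attempts:
                raise  # retries exhausted: surface the failure
            time.sleep(delay)  # back off before the next attempt

calls = {"n": 0}

def flaky_load():
    """Hypothetical load step: fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return "loaded"

result = with_retries(flaky_load, attempts=3)
```

Keeping the load itself idempotent is what makes blind retries like this safe.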
DE developing and implementing Continuous Integration/Continuous
Delivery (CI/CD) pipelines, using Docker, Jenkins, Stash, and
For full job details and to apply, please visit
https://jobs.fidelity.com/ and search for job number: 2041952.