Expands and optimizes data and data pipeline architecture, using
Azkaban, Luigi, and Airflow. Supports software developers, database
architects, and data analysts on data initiatives while applying
Amazon Web Services (AWS) Cloud Services -- EC2, EMR, RDS, and
Redshift. Builds analytical tools that utilize data pipelines to
provide actionable insights into customer acquisition, operational
efficiency, and key business performance metrics. Uses business
knowledge to translate the vision for divisional initiatives into
business solutions by developing complex or multiple software
applications and conducting studies of alternatives. Analyzes and
recommends changes in project development policies, procedures,
standards, and strategies to development experts.
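As an illustration of the pipeline orchestration work named above (Azkaban, Luigi, and Airflow), here is a minimal Airflow DAG sketch; the DAG id, schedule, and task functions are hypothetical placeholders rather than part of the role description.

```python
# Minimal Airflow DAG sketch: extract data to S3, then load it into Redshift.
# The DAG id, schedule, and task bodies are hypothetical illustrations.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    """Placeholder: pull records from a source system and stage them in S3."""
    pass


def load_to_redshift(**context):
    """Placeholder: COPY the staged S3 files into a Redshift table."""
    pass


with DAG(
    dag_id="example_customer_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)

    extract >> load  # run the load only after the extract succeeds
```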
Participates in architecture design teams.
Defines and implements application level architecture.
Develops applications on complex projects, components, and
subsystems for the division.
Recommends development testing tools and methodologies and
reviews and validates test plans.
Responsible for QA readiness of software deliverables.
Develops comprehensive documentation for multiple applications.
Establishes full project life cycle plans for complex projects
across multiple platforms.
Responsible for meeting project goals on-time and on-budget.
Advises on risk assessment and risk management strategies.
Plans and coordinates project schedules and assignments.
Acts as a primary liaison for business units to resolve various issues.
Provides technology solutions to daily issues and technical
evaluation estimates on technology initiatives.
Advises senior management on technical strategy.
Mentors junior team members.
Performs independent and complex technical and functional
analysis for multiple projects supporting several divisional initiatives.
Develops original and creative technical solutions to on-going problems.
Education and Experience:
Bachelor's degree (or foreign education equivalent) in Computer
Science, Engineering, Information Technology, Information Systems,
Mathematics, Physics, or a closely related field and five (5) years
of experience in the job offered or five (5) years of experience
developing end-to-end Cloud or on-premise data pipelines within an
asset management environment.
Or, alternatively, Master's degree (or foreign education
equivalent) in Computer Science, Engineering, Information
Technology, Information Systems, Mathematics, Physics, or a closely
related field and three (3) years of experience in the job offered
or three (3) years of experience developing end-to-end Cloud or
on-premise data pipelines within an asset management environment.
Skills and Knowledge:
Candidate must also possess:
Demonstrated Expertise (DE) developing end-to-end Cloud data
pipelines on Amazon Web Services (AWS) within an asset management
environment; designing and implementing data warehouse solutions --
high-volume data movement from operational data stores (ODS) to
dimensional models -- for traditional RDBMS (Oracle Exadata) and
distributed platforms (Hive/Hadoop), using Informatica, Spark,
Autosys, and PL/SQL; and collating data from varied sources -- S3
files, Vendor files, REST APIs, SQL Server, and RDS instances.
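As a rough illustration of the ODS-to-dimensional-model movement this item describes, here is a minimal PySpark sketch; the S3 path, column names, and warehouse table are hypothetical.

```python
# Minimal PySpark sketch: read an ODS extract from S3, reshape it into a
# dimension table, and write it to a Hive-backed warehouse table.
# Bucket, paths, column names, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("ods_to_dimensional_sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Read the raw operational extract staged in S3 (hypothetical path).
ods_accounts = spark.read.parquet("s3a://example-bucket/ods/accounts/")

# Shape a simple account dimension: pick attributes and add a load timestamp.
dim_account = (
    ods_accounts
    .select("account_id", "account_name", "region")
    .dropDuplicates(["account_id"])
    .withColumn("load_ts", F.current_timestamp())
)

# Overwrite the dimension table in the warehouse (hypothetical schema/table).
dim_account.write.mode("overwrite").saveAsTable("dw.dim_account")
```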
DE applying Continuous Integration/Continuous Deployment (CI/CD)
practices to code releases -- automated deployments -- using
Jenkins, Urban Code Deploy (uDeploy), CloudFormation, Concourse,
and Artifactory; and versioning database deployments using git.
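As a rough illustration of the automated CloudFormation deployments this item describes, here is a minimal boto3 sketch (in practice such a step would typically run inside a Jenkins or Concourse pipeline); the stack name, template file, and region are hypothetical.

```python
# Minimal boto3 sketch: deploy (create or update) a CloudFormation stack as part
# of an automated release. Stack name, template file, and region are hypothetical.
import boto3
from botocore.exceptions import ClientError

cloudformation = boto3.client("cloudformation", region_name="us-east-1")

with open("pipeline-stack.yaml") as f:  # hypothetical template produced by the build
    template_body = f.read()

stack_name = "example-data-pipeline"  # hypothetical stack name

try:
    cloudformation.update_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )
    waiter = cloudformation.get_waiter("stack_update_complete")
except ClientError as err:
    # If the stack does not exist yet, create it instead of updating it.
    if "does not exist" not in str(err):
        raise
    cloudformation.create_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )
    waiter = cloudformation.get_waiter("stack_create_complete")

waiter.wait(StackName=stack_name)
```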
DE integrating reference and common data into a single
environment using Master Data Management (MDM); and simplifying
data consumption by modeling data, using normalization and denormalization
techniques in Sybase PowerDesigner.
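PowerDesigner itself is a modeling tool rather than a coding environment, so as a language-neutral stand-in, here is a small pandas sketch of the denormalization idea: joining normalized reference tables into one flat table for easier consumption; all table and column names are hypothetical.

```python
# Small pandas sketch of denormalization: join normalized reference tables into a
# single flat table that is easier for downstream consumers to query.
# All table and column names are hypothetical.
import pandas as pd

# Normalized tables: each position row references a security by id only.
securities = pd.DataFrame(
    {"security_id": [1, 2], "ticker": ["ABC", "XYZ"], "asset_class": ["Equity", "Bond"]}
)
positions = pd.DataFrame(
    {"account_id": [100, 100, 200], "security_id": [1, 2, 1], "quantity": [50, 10, 75]}
)

# Denormalized view for consumption: ticker and asset class are repeated per row,
# trading storage for simpler, join-free queries.
positions_flat = positions.merge(securities, on="security_id", how="left")

print(positions_flat)
```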
DE designing and implementing data solutions and SQL data
pipelines on Cloud platforms -- Salesforce or Amazon Web Services
(EC2, EMR, RDS, Redshift, SQS, and SNS) -- including data movement
to/from on-premise environments -- Oracle and Exadata.
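As a rough illustration of one leg of the Cloud/on-premise data movement this item describes, here is a minimal boto3 sketch that stages an on-premise extract in S3 and publishes an SNS notification for downstream consumers (for example, an SQS-fed loader); the bucket, key, topic ARN, and file name are hypothetical.

```python
# Minimal boto3 sketch: upload an extract produced on-premise to S3, then publish
# an SNS notification so downstream pipelines (e.g., SQS-backed loaders) can react.
# Bucket name, key, topic ARN, and file name are hypothetical.
import json

import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

bucket = "example-pipeline-staging"              # hypothetical bucket
key = "extracts/oracle/positions_20240101.csv"   # hypothetical object key

# Stage the extract file (produced by an on-premise Oracle/Exadata job) in S3.
s3.upload_file("positions_20240101.csv", bucket, key)

# Notify subscribers (for example, an SQS queue feeding a Redshift loader).
sns.publish(
    TopicArn="arn:aws:sns:us-east-1:123456789012:example-extract-arrived",
    Message=json.dumps({"bucket": bucket, "key": key}),
)
```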
For full job details and to apply, please visit
https://jobs.fidelity.com/ and search for job number: 2033872.