CWP - 1206 - Senior Data Engineer


Temporary Job Request Number: CWP - 1206 - Senior Data Engineer

Contingent Worker Information
Position Title: Senior Data Engineer
NYPA Business Unit: Information Technology & Cyber Security
Experience Level: 2 (5-10 years)
Work Location: White Plains Office
Anticipated Start Date: Immediate
Duration of Assignment: 12 month(s)
Project/Assignment: Project Luminate Finance, Accounting, SSM, HCM
CWP Manager:
Date of Posting: 05/19/2025
Due Date: 06/03/2025

Work Schedule
Monday through Friday, 7.5-hour work day with a 0.5-hour unpaid lunch. Paid for time worked only; there are no paid holidays, vacation, or sick days. Overtime is not permitted unless authorized in advance. The Contingent Worker is required to badge in and out of the turnstiles located in the lobby for all daily entrances and exits.

CWPO Engagement Requirements
Candidates are REQUIRED to be local to the Tri-State area and in W2 tax status as an hourly paid employee of the Service Provider. Fully remote positions are prohibited. Hybrid schedules are permissible with a minimum of 3 days on-site, depending on the assignment, and can be fully on-site depending on business needs. Remote work privileges can be revoked at any time. NYPA does not provide laptops. Instead, NYPA provides desktops and dedicated workstations which candidates will remote into from their personal equipment on virtual days. If the candidate does not have proper equipment or a working internet connection, or is deemed unable to work effectively from home, the hybrid privilege will be revoked. Vendors must disclose foreign national status and all visas at the time of candidate submission. Additionally, a Benefits Waiver and Ethics Disclosure will be required of the candidate during onboarding.

Submission Instructions
Your email submissions must contain:
- The LEGAL name of the candidate (no aliases, monikers, or nicknames)
- The Job Broadcast number in your email subject header
- Direct Hourly Labor and Hourly Billing rates
- Visa work authorization, if applicable
For job broadcasts outside your contractual compensation schedules, use competitive rates based on local and state compensation. NYPA CWP staff will contact you if your candidate is selected for an interview, which is usually conducted via MS Teams. Please inform your candidates that an interview does not indicate acceptance of the proposed labor rates. For all questions or concerns, please contact Christine Prendergast at

Project Overview
NYPA's current on-premises Enterprise Resource Planning (ERP) system, SAP ECC 6.0, has been in use for over a decade and is nearing technological obsolescence, requiring NYPA to select and implement a new ERP system. The objective is to implement a system that integrates all business functions, including finance, operations, and human resources, into a cohesive platform. This implementation aims to enhance organizational efficiency, improve data accuracy, and provide real-time reporting capabilities. The goal is to streamline processes, reduce operational costs, and support informed decision-making across all departments.

Job Functions & Responsibilities
Cloud Data Engineering & Integration:
- Design and implement data pipelines across AWS, Azure, and Google Cloud.
- Develop SAP BTP integrations with cloud and on-premise systems.
- Ensure seamless data movement and storage between cloud platforms.
ETL & Data Pipeline Development:
- Develop and optimize ETL workflows using Pentaho, Microsoft ADF, or equivalent ETL tools.
- Design scalable and efficient data transformation, movement, and ingestion processes.
- Monitor and troubleshoot ETL jobs to ensure high availability and performance.
API Development & Data Integration:
- Develop and integrate RESTful APIs to support data exchange between SAP and other platforms.
- Work with API gateways and authentication methods such as OAuth, JWT, and API keys.
- Implement API-based data extractions and real-time event-driven architectures (an illustrative sketch follows this section).
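The posting itself contains no code. As a rough sketch of the kind of API-based extraction described in the API Development & Data Integration section above, the Python example below obtains an OAuth 2.0 client-credentials token and pages through a REST endpoint before landing the results as a flat file for downstream ETL. The token URL, API endpoint, credentials, field names, and output file are hypothetical placeholders, not part of NYPA's actual landscape.

```python
# Illustrative only: endpoint URLs, credentials, and field names are hypothetical.
import csv
import requests

TOKEN_URL = "https://example.invalid/oauth/token"        # hypothetical OAuth token endpoint
API_URL = "https://example.invalid/api/v1/cost-centers"  # hypothetical REST resource


def get_access_token(client_id: str, client_secret: str) -> str:
    """Obtain a bearer token via the OAuth 2.0 client-credentials grant."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials",
              "client_id": client_id,
              "client_secret": client_secret},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def extract_all(token: str) -> list[dict]:
    """Page through the API until no results remain."""
    headers = {"Authorization": f"Bearer {token}"}
    records, page = [], 1
    while True:
        resp = requests.get(API_URL, headers=headers,
                            params={"page": page, "page_size": 500}, timeout=60)
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records


if __name__ == "__main__":
    token = get_access_token("my-client-id", "my-client-secret")  # placeholders
    rows = extract_all(token)
    # Land the raw extract as a flat file for the downstream ETL/staging step.
    if rows:
        with open("cost_centers_extract.csv", "w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=sorted(rows[0].keys()),
                                    extrasaction="ignore")
            writer.writeheader()
            writer.writerows(rows)
```

In practice an extraction step like this would be orchestrated by the Pentaho or Microsoft ADF pipelines named in the responsibilities rather than run as a standalone script; the sketch only shows the token handling and pagination pattern.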
Data Analysis & SQL Development:
- Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration.
- Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency.
- Support data transformation logic and business rules for ERP reporting needs.
Data Governance & Quality (Ataccama, Collibra):
- Work with Ataccama and Collibra to define and enforce data quality and governance policies.
- Implement data lineage, metadata management, and compliance tracking across systems.
- Ensure compliance with enterprise data security and governance standards.
Cloud & DevOps (AWS, Azure, GCP):
- Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation.
- Deploy and manage data pipelines on AWS, Azure, and Google Cloud.
- Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows (see the illustrative sketch after the Education & Certifications section).
Collaboration & Documentation:
- Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements.
- Document ETL workflows, API specifications, data models, and governance policies.
- Provide technical support and troubleshooting for data pipelines and integrations.

Skills
Required Skills & Experience:
- 7+ years of experience in Data Engineering, ETL, and SQL development.
- Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
- Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
- Proficiency in SQL (stored procedures, query optimization, performance tuning).
- Experience working with Azure DevOps, GitHub, and CI/CD for data pipelines.
- Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
- Experience working with AWS, Azure, and Google Cloud (GCP) for data integration and cloud-based workflows.
- Strong problem-solving skills and the ability to work independently in a fast-paced environment.

Preferred Qualifications:
- Experience with SAP S/4HANA and cloud-based ERP implementations.
- Familiarity with Python and PySpark for data processing and automation.
- Experience with Pentaho, Microsoft ADF, or equivalent ETL tools.
- Knowledge of event-driven architectures.
- Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).

Education & Certifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
Nice-to-have certifications:
A) Azure Data Engineer Associate
B) SAP Certified Associate - Integration Developer
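For the serverless automation responsibility referenced in the Cloud & DevOps section, the following is a minimal sketch, assuming an AWS Lambda function in Python triggered by an S3 object-created event that forwards the new file into a staging prefix for the ETL tooling to pick up. The bucket names, prefixes, and routing rule are invented for illustration and are not taken from the posting.

```python
# Illustrative only: bucket names, prefixes, and routing logic are hypothetical.
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")
STAGING_BUCKET = "example-staging-bucket"  # hypothetical landing zone for the ETL tool


def handler(event, context):
    """Triggered by an S3 ObjectCreated event; copies each new file to a staging prefix."""
    for record in event.get("Records", []):
        src_bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        src_key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        dest_key = f"inbound/{src_key.split('/')[-1]}"
        s3.copy_object(
            Bucket=STAGING_BUCKET,
            Key=dest_key,
            CopySource={"Bucket": src_bucket, "Key": src_key},
        )
        print(json.dumps({"copied": src_key, "to": f"{STAGING_BUCKET}/{dest_key}"}))

    return {"statusCode": 200}
```

An Azure Functions or Google Cloud Functions handler would follow the same shape, differing only in the trigger binding and storage SDK.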
Location:
White Plains