
Senior Data Engineer

Ampcus, Inc
United States, New York, White Plains
May 22, 2025
Ampcus Inc. is a certified global provider of a broad range of technology and business consulting services. We are in search of a highly motivated candidate to join our talented team.

Job Title: Senior Data Engineer

Location(s): White Plains, NY

Project Overview


  • The client's current on-premises Enterprise Resource Planning (ERP) system, SAP ECC 6.0, has been in use for over a decade and is nearing technological obsolescence, so the client must select and implement a new ERP system. The objective is to implement a system that integrates all business functions, including finance, operations, and human resources, into a cohesive platform. This implementation aims to enhance organizational efficiency, improve data accuracy, and provide real-time reporting capabilities. The goal is to streamline processes, reduce operational costs, and support informed decision-making across all departments.


Job Functions & Responsibilities


  • Cloud Data Engineering & Integration:
    - Design and implement data pipelines across AWS, Azure, and Google Cloud.
    - Develop SAP BTP integrations with cloud and on-premises systems.
    - Ensure seamless data movement and storage between cloud platforms.
  • ETL & Data Pipeline Development:
    - Develop and optimize ETL workflows using Pentaho, Microsoft ADF, or equivalent ETL tools.
    - Design scalable and efficient data transformation, movement, and ingestion processes.
    - Monitor and troubleshoot ETL jobs to ensure high availability and performance.
  • API Development & Data Integration:
    - Develop and integrate RESTful APIs to support data exchange between SAP and other platforms.
    - Work with API gateways and authentication methods such as OAuth, JWT, and API keys.
    - Implement API-based data extractions and real-time event-driven architectures.
  • Data Analysis & SQL Development:
    - Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration.
    - Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency.
    - Support data transformation logic and business rules for ERP reporting needs.
  • Data Governance & Quality (Ataccama, Collibra):
    - Work with Ataccama and Collibra to define and enforce data quality and governance policies.
    - Implement data lineage, metadata management, and compliance tracking across systems.
    - Ensure compliance with enterprise data security and governance standards.
  • Cloud & DevOps (AWS, Azure, GCP):
    - Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation.
    - Deploy and manage data pipelines on AWS, Azure, and Google Cloud.
    - Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows.
  • Collaboration & Documentation:
    - Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements.
    - Document ETL workflows, API specifications, data models, and governance policies.
    - Provide technical support and troubleshooting for data pipelines and integrations.


Skills


  • Required Skills & Experience:
    - 7+ years of experience in data engineering, ETL, and SQL development.
    - Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
    - Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
    - Proficiency in SQL (stored procedures, query optimization, performance tuning).
    - Experience with Azure DevOps, GitHub, and CI/CD for data pipelines.
    - Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
    - Experience with AWS, Azure, and Google Cloud (GCP) for data integration and cloud-based workflows.
    - Strong problem-solving skills and the ability to work independently in a fast-paced environment.
  • Preferred Qualifications:
    - Experience with SAP S/4HANA and cloud-based ERP implementations.
    - Familiarity with Python and PySpark for data processing and automation.
    - Experience with Pentaho, Microsoft ADF, or equivalent ETL tools.
    - Knowledge of event-driven architectures.
    - Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).


Education & Certifications


  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field. The following certifications are nice to have: A) Azure Data Engineer Associate; B) SAP Certified Associate - Integration Developer.


All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, status as a protected veteran, or disability.