Data Engineer / QA Analyst
Job Summary
Argent ARC is hiring a Data Engineer / QA Analyst to build and safeguard the cloud-native data pipelines and quality frameworks behind DoDEA’s NDSP Community Profiles platform. In this role, you’ll design scalable ingestion and transformation workflows, automate tests that protect data integrity and accessibility, and collaborate with engineers, designers, and product managers to deliver reliable, user-focused releases. From crafting ETL jobs and performance tests to monitoring dashboards and regression suites, you’ll be the linchpin for trustworthy data and high-quality software in a mission-critical environment. This position is a part-time, contingent hire.
Primary Responsibilities
Design and maintain cloud-native data pipelines. Ingest, cleanse, and transform diverse datasets using AWS services and infrastructure-as-code.
Implement automated test suites (unit, integration, regression, performance, accessibility) and embed them in the CI/CD pipeline for every build.
Validate data quality and integrity. Build checks, anomaly detection, and reconciliation reports to keep accuracy and completeness above contract thresholds.
Conduct performance and load testing; analyze results and partner with engineers to tune latency, throughput, and scalability.
Embed compliance controls. Apply Section 508/WCAG, FedRAMP Moderate, and NIST RMF requirements to data workflows and test criteria.
Monitor production health. Use dashboards, logs, and alerts to track pipeline uptime, test coverage, and key quality metrics; act quickly on incident signals.
Collaborate daily with DevSecOps, application engineers, designers, and product managers to refine data models, testable requirements, and release plans.
Document schemas, data flows, and test plans; provide clear status updates and quality reports to technical and non-technical stakeholders.
Support release and hyper-care windows, performing root-cause analysis and driving continuous improvements in reliability and security.
Coach teammates on data-engineering and QA best practices, fostering a culture of automation, measurement, and user-centered quality.
Minimum Qualifications
5–8 years of combined experience in data engineering, QA automation, or closely related roles, including 2+ years on cloud-hosted systems.
Bachelor’s degree in computer science, engineering, data science, or a similar discipline (or equivalent practical experience).
Hands-on skill building ETL/ELT pipelines on AWS (e.g., Glue, Lambda, Redshift, S3) with Python, SQL, or PySpark.
Proven ability to design and run automated test suites (unit, integration, regression, performance) integrated into CI/CD workflows such as GitHub Actions or Jenkins.
Experience monitoring pipeline health and application performance with logging, metrics, and alerting tools (CloudWatch, Grafana, or equivalent).
Working knowledge of Section 508/WCAG accessibility checks and FedRAMP Moderate / NIST RMF security requirements.
Strong analytical, problem-solving, and communication skills; comfortable collaborating with engineers, designers, and product managers.
Ability to obtain and maintain a Public Trust clearance.
Legal authorization to work in the U.S.
Desired Qualifications
Prior experience supporting Federal or DoD cloud programs.
AWS certifications such as AWS Data Analytics–Specialty or Solutions Architect – Associate.
QA or testing credentials: ISTQB Advanced Test Analyst, Certified Agile Tester, or DHS Trusted Tester v5 for Section 508.
Hands-on expertise with big-data and analytics tools (PySpark, Glue ETL, Redshift, Athena) and data-quality frameworks (Great Expectations, dbt tests).
Experience shepherding systems through FedRAMP Moderate / DoD IL-4 ATO processes and authoring security-control evidence.
Familiarity with DORA/Accelerate metrics and SRE practices to improve pipeline reliability and lead time.
- Location: US
- Job Type: Part-Time