Senior DataOps Engineer


About Dechen Consulting Group (DCG)
Dechen Consulting Group (DCG) is a rapidly expanding, innovative IT Professional Services and Management Consulting company with a track record of more than twenty-five years of delivering skilled professionals to clients across diverse sectors.
Opportunity Overview
We are currently seeking a highly skilled and experienced professional for a W2 contract opportunity in Dearborn, MI. This role has the potential to extend over multiple years, with the chance to transition to a direct hire position with our client. We provide healthcare, vacation, relocation assistance, and visa sponsorship/transfer. This is a W2 position, not C2C. THIRD PARTIES NEED NOT APPLY. This role offers excellent prospects for career progression!
Position Description
Join our EPEO DataOps team, where you will play a pivotal role in designing, building, and maintaining robust, scalable, and secure telemetry data pipelines on Google Cloud Platform (GCP). The ideal candidate will have a strong background in DataOps principles, deep expertise in GCP data services, and a solid understanding of IT operations, especially within the security and network domains. You will enable real-time visibility and actionable insights for our security and network operations centers, contributing directly to our operational excellence and threat detection capabilities.
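For concreteness, the sketch below illustrates the kind of streaming telemetry pipeline this role owns: a Dataflow job, written with the Apache Beam Python SDK, that reads log events from Cloud Pub/Sub and streams them into BigQuery. This is a minimal illustration only, not the team's actual pipeline; the project, subscription, table, and field names are hypothetical placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a JSON log event and keep only the fields the table expects.
    event = json.loads(message.decode("utf-8"))
    return {
        "timestamp": event.get("timestamp"),
        "source_ip": event.get("src_ip"),  # hypothetical field names
        "dest_ip": event.get("dst_ip"),
        "action": event.get("action"),
    }


def run() -> None:
    # streaming=True because Pub/Sub is an unbounded source; Dataflow
    # runner flags (project, region, etc.) are omitted here.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/firewall-logs"
            )
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:security_telemetry.firewall_events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()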
Key Responsibilities Design & Development: Lead the design, development, and implementation of high-performance, fault-tolerant telemetry data pipelines for ingesting, processing, and transforming large volumes of IT operational data (logs, metrics, traces) from diverse sources, with a focus on security and network telemetry. GCP Ecosystem Management: Architect and manage data solutions using a comprehensive suite of GCP services, ensuring optimal performance, cost-efficiency, and scalability. This includes leveraging services like Cloud Pub/Sub for messaging, Dataflow for real-time and batch processing, BigQuery for analytics, Cloud Logging for log management, and Cloud Monitoring for observability. DataOps Implementation: Drive the adoption and implementation of DataOps best practices, including automation, CI/CD for data pipelines, version control (e.g., Git), automated testing, data quality checks, and robust monitoring and alerting. Security & Network Focus: Develop specialized pipelines for critical security and network data sources such as VPC Flow Logs, firewall logs, intrusion detection system (IDS) logs, endpoint detection and response (EDR) data, and Security Information and Event Management (SIEM) data (e.g., Google Security Operations / Chronicle). Data Governance & Security: Implement and enforce data governance, compliance, and security measures, including data encryption (at rest and in transit), access controls (RBAC), data masking, and audit logging to protect sensitive operational data. Performance Optimization: Continuously monitor, optimize, and troubleshoot data pipelines for performance, reliability, and cost-effectiveness, identifying and resolving bottlenecks. Collaboration & Mentorship: Collaborate closely with IT operations, security analysts, network engineers, and other data stakeholders to understand data requirements and deliver solutions that meet business needs. Mentor junior engineers and contribute to the team's technical growth. Documentation: Create and maintain comprehensive documentation for data pipelines, data models, and operational procedures. Skills Required Data Analysis Cloud Infrastructure Data Governance Data Modeling Network Security Data Warehousing Data Acquisition Python Data Conversion Skills Preferred Technical Communication Troubleshooting (Problem Solving) Technologies Critical Thinking Performance Tuning Cross-functional Experience Required
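As referenced under DataOps Implementation above, data quality checks are typically automated so a pipeline fails fast rather than silently shipping bad data. The sketch below shows one such check using the google-cloud-bigquery client; the table, column names, and freshness window are hypothetical placeholders, and this is an illustration rather than a prescribed implementation.

from google.cloud import bigquery

# Hypothetical table and columns; adjust to the real schema.
QUERY = """
SELECT
  COUNTIF(source_ip IS NULL) AS missing_source_ip,
  COUNT(*) AS total_rows
FROM `my-project.security_telemetry.firewall_events`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
"""


def check_recent_rows() -> None:
    # Raise (failing the CI/CD step or scheduled job) when the last hour
    # of telemetry is empty or contains rows missing a required field.
    client = bigquery.Client()
    row = next(iter(client.query(QUERY).result()))
    if row.total_rows == 0:
        raise RuntimeError("No firewall events ingested in the last hour")
    if row.missing_source_ip > 0:
        raise RuntimeError(
            f"{row.missing_source_ip} of {row.total_rows} rows lack source_ip"
        )


if __name__ == "__main__":
    check_recent_rows()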
Experience Required
- Senior Specialist: 7 years in the relevant field.
Experience Preferred
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
- Typically 8 years of experience in data engineering, with at least 4 years in a Senior or Lead role focused on DataOps or cloud-native data platforms.

Education Required
- Master's Degree
- Bachelor's Degree

Additional Information
Core DataOps & Engineering:
- Proven experience in DataOps/Data Engineering, operationalizing large-scale, optimized batch and real-time data pipelines.
- Strong understanding of DataOps principles: CI/CD, automation, data quality, data governance, monitoring.
- Proficiency in Python.
- Experience with Infrastructure as Code (IaC) tools (e.g., Terraform).
- Solid understanding of data modeling, schema design, and data warehousing (e.g., star schema).
GCP Expertise:
- Data Ingestion & Messaging: In-depth experience with Cloud Pub/Sub, Cloud Logging, and integrating diverse log sources (e.g., VPC Flow Logs).
- Data Processing & Transformation: Hands-on experience with Dataflow (Apache Beam) and Cloud Functions.
- Data Storage & Warehousing: Proficient in BigQuery and Cloud Storage (and potentially Bigtable).
- Orchestration: Experience with Cloud Composer (Apache Airflow); a minimal DAG sketch follows this section.
- Monitoring & Observability: Strong skills in the Google Cloud Operations Suite (Cloud Monitoring, Cloud Logging, Cloud Trace).
- Security Services: Familiarity with IAM, VPC Service Controls, and KMS; understanding of Google Security Operations (Chronicle) desirable.
Domain-Specific (IT Operations, Security & Networks):
- Understanding of IT operations telemetry (logs, metrics, traces).
- Experience collecting/processing security logs (firewall, IDS/IPS, EDR, SIEM) and network flow data (VPC Flow Logs).
- Familiarity with security concepts: threat detection, incident response, compliance.
- Knowledge of network protocols and topologies.

This role requires 4 days per week in the office.
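As noted under Orchestration above, Cloud Composer runs standard Apache Airflow DAGs. The sketch below shows a minimal daily rollup DAG of that kind, assuming Airflow 2.x with the Google provider package installed; the DAG id, schedule, query, and table names are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_telemetry_rollup",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Aggregate yesterday's raw firewall events into a daily summary table.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_firewall_events",
        configuration={
            "query": {
                "query": """
                    SELECT DATE(timestamp) AS day, action, COUNT(*) AS events
                    FROM `my-project.security_telemetry.firewall_events`
                    WHERE DATE(timestamp) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
                    GROUP BY day, action
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "security_telemetry",
                    "tableId": "daily_firewall_summary",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )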
We are a people-focused company with a deep emphasis on family values, and we look forward to working with you.
Location:
Dearborn, MI, United States
Category:
Architecture And Engineering Occupations
