Lead Data Engineer

Role: Lead Data Engineer
Locations: Richmond, VA / McLean, VA / Plano, TX / Chicago, IL / New York, NY / Wilmington, DE (preferred locations; hybrid role; three days per week in office)
Job Type: Long-Term Contract

Responsibilities:
- Lead the design, development, and deployment of scalable data engineering solutions that deliver operational and analytical data to third-party systems.
- Collaborate with data scientists, analysts, and other stakeholders to design and implement data pipelines and workflows.
- Ensure smooth integration and real-time processing of data using cloud technologies such as AWS.
- Develop and maintain efficient data warehousing solutions using Snowflake.
- Oversee the implementation and management of ETL pipelines using technologies such as AWS Glue, Spark, and Databricks.
- Optimize database performance and ensure data integrity across both SQL and NoSQL systems (including DynamoDB).
- Ensure high availability and reliability of data pipelines and services.
- Lead and mentor a team of data engineers, providing guidance on best practices and troubleshooting technical challenges.
- Implement and manage monitoring and alerting systems using tools such as AWS CloudWatch, Splunk, and New Relic.
- Ensure adherence to Agile engineering practices, including the use of JIRA for issue tracking and project management.
- Work with teams on production support and resolution of production incidents, including after-hours or weekend support as required.

Required Skills and Qualifications:
- 10+ years of relevant experience.
- Extensive experience with AWS technologies such as EMR, Glue, S3, Lambda, and CloudWatch.
- Strong programming skills in Python, Scala, and Java, plus experience with Spark.
- Proficiency with Databricks, SQL, and Bash scripting.
- Experience with data warehousing solutions, particularly Snowflake.
- Hands-on experience with NoSQL databases, especially DynamoDB.
- Solid understanding of Agile engineering practices and experience with JIRA.
Preferred Skills and Qualifications:
- Familiarity with Kafka for real-time data streaming.
- Experience with Airflow for orchestrating complex workflows.
- Knowledge of open table formats such as Delta, Hudi, or Iceberg.
- Experience with monitoring tools such as Splunk and New Relic.