Lead Data Engineer

Logic Software Solutions • Remote • Posted 5 days ago via ZipRecruiter

Remote • Full-time • Senior Level

Job Highlights

AI-generated summary of the original job post

The Lead Data Engineer at Logic Software Solutions will be responsible for building and operating a data lakehouse to deliver cutting-edge security operation capabilities. This role involves leadership in driving critical initiatives, owning technical and architectural decision-making, and partnering with customers to translate their needs into data products. The position requires hands-on development of data products and capabilities, mentoring of teammates, and collaboration with other security teams.


Full Job Description

About the Role:

Our Data Engineering team is looking to hire a Lead Data Engineer who has a passion for delivering high-quality data products. The Lead Data Engineer will be responsible for building and operating a data lakehouse that delivers cutting-edge security operations capabilities. This individual will play a leadership role, driving critical initiatives and owning both technical and architectural decision-making. A key part of this role involves partnering with customers to translate their needs into data products that extract value from data in a sustainable fashion.

You will drive the following responsibilities:
  • Lead the design and development of scalable data architectures to support modern data analytics and real-time data processing
  • Build, maintain, monitor and troubleshoot streaming and batch data pipelines using Apache Spark
  • Build, maintain, and implement a unified semantic layer for security use cases
  • Develop and manage alerting, reporting, and visualizations for security partners
  • Develop clean, well-documented, and well-tested code that builds capabilities around the data lake and extracts value from the data for data customers
  • Collaborate with other security teams to understand, prioritize and meet customer demands
  • Hands-on development of data products and capabilities
  • Mentor, train, and develop teammates
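The lakehouse and semantic-layer responsibilities above can be sketched in miniature. The example below is purely illustrative and not part of the posting: it uses SQLite as a stand-in for a lakehouse SQL engine (Spark SQL / BigQuery in practice), with hypothetical table and column names, to show how a unified semantic layer normalizes source-specific schemas behind one stable interface.

```python
# Illustrative sketch only: a "unified semantic layer" exposes raw,
# source-specific tables through stable, analyst-friendly views.
# SQLite stands in for a lakehouse engine; all names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")

# Raw tables as landed from two hypothetical security sources,
# each with its own schema.
conn.executescript("""
CREATE TABLE raw_edr_events (host TEXT, ts TEXT, verdict TEXT);
CREATE TABLE raw_fw_logs (src_ip TEXT, event_time TEXT, action TEXT);

-- Semantic layer: one view per concept, normalizing names and values
CREATE VIEW security_events AS
SELECT host AS entity, ts AS event_ts,
       'edr' AS source, verdict AS outcome
FROM raw_edr_events
UNION ALL
SELECT src_ip AS entity, event_time AS event_ts,
       'firewall' AS source, action AS outcome
FROM raw_fw_logs;
""")

conn.execute("INSERT INTO raw_edr_events VALUES "
             "('host-1', '2024-01-01T00:00:00', 'blocked')")
conn.execute("INSERT INTO raw_fw_logs VALUES "
             "('10.0.0.5', '2024-01-01T00:01:00', 'deny')")

# Downstream consumers query the stable view, never the raw tables,
# so sources can be added or reshaped without breaking dashboards.
rows = conn.execute(
    "SELECT source, entity, outcome FROM security_events ORDER BY event_ts"
).fetchall()
print(rows)
```

In a real lakehouse the same pattern would typically be expressed as Spark SQL views or Delta tables over Parquet files, but the design idea (raw layer in, normalized semantic layer out) is the same.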


Position Requirements
  • B.S. or M.S. in Computer Science or related field, or equivalent experience
  • 10+ years experience designing, building and maintaining ETL pipelines
  • 10+ years experience delivering data products
  • 8+ years experience with Spark
  • 8+ years of experience in software development, including Python, the SQL stack, Scala, and Java
  • 8+ years of experience working with cloud platforms (GCP, BigQuery)
  • Strategic thinker and problem solver with excellent analytical and reporting skills
  • Strong oral and written competency, along with outstanding interpersonal skills
  • Ability to effectively plan, prioritize, and deliver on programs and projects


Preferred Experience
  • Experience working with cyber security datasets
  • Experience working with or exposure to cyber security functions
  • Experience working with CI/CD pipelines
  • Experience working with modern data infrastructure technologies (Databricks/Snowflake)
  • Experience working with modern data formats (Delta/Parquet)
  • Experience with or knowledge of Agile Software Development methodologies


Additional Requirements
Candidates must be a US citizen or Green Card holder.
This role is offered with fully remote flexibility and can be performed from anywhere within the United States. Remote arrangements are role-specific, and each team may have slight variations that we can describe in more detail during the recruiting process.