Carbon Black Company Profile

Senior Threat Data Engineer at Carbon Black (Boulder, CO)

This posting is 18 months old and is probably no longer available.

Job Description

We’re looking for a Senior Threat Data Engineer who will contribute to the design, development, and validation of our ETL processes and build APIs and applications that expose that data to numerous stakeholders within our engineering organization. You’ll join a talented team of developers using a wide variety of new and cool technologies to solve Large-scale distributed data problems with a capital “L”. You’ll build solutions and enable features that our customers are clamoring for, and that may help them prevent the next front-page security breach. You’ll make a difference and help us win a huge market.

What You’ll Do
  • Ensure data quality and availability for ongoing analytics projects through stable ETL processes
  • Build out resilient and scalable components for our stream processing machinery
  • Design, implement and validate the monitoring and alerting infrastructure as necessary to ensure uninterrupted and performant data flow
  • Develop large-scale, performant RESTful APIs and streaming applications to expose and provide access to large, disparate data sources
  • Deliver high-quality clean code along with the rest of the dev team
  • Analyze performance of services and optimize / refactor as needed
  • Investigate functional problems, finding and fixing defects as needed
  • Lead the development of new features, collaborating with other senior engineers and mentoring junior engineers
  • Design proofs of concept or working prototypes for software services
  • Ensure that software designs follow best practices and architectural guidelines
  • Separate and secure country-specific data across national boundaries utilizing AWS regions
What You’ll Bring
  • B.S. in Computer Science or equivalent experience
  • Experience with big data and machine learning is a plus
  • Demonstrated expertise programming in Python preferred; Go and/or Scala a plus
  • Experience writing applications for Spark/Hadoop and AWS environments
  • Demonstrated knowledge of message queueing and stream processing
  • Advanced knowledge of, and a demonstrated ability to work with, structured and unstructured datasets across a variety of databases
  • Demonstrated aptitude in quickly mastering new languages and technologies
  • Excellent communication and interpersonal skills
  • Excellent problem solving and troubleshooting skills
  • A DevOps mentality and an eagerness to take end-to-end responsibility for portions of the ETL / stream processing in our backend analytics environment
  • Experience building and maintaining CI/CD pipelines
  • Experience with AWS EMR, Kinesis, and S3 a big plus
  • Security experience a big plus