About the Company
You will work with a cybersecurity awareness training company that produces high-quality, Hollywood-style learning content designed to educate individuals and organizations on how to recognize and respond to cyber threats. Their platform combines engaging educational media with an integrated phishing simulator, helping businesses reduce risk through behavior change. Their solution is trusted globally by organizations seeking to strengthen their human layer of security.
The Role
We’re looking for a Data Engineer to support the core data infrastructure powering product insights, BI reporting, and future AI/ML capabilities. In this role, you’ll lead the ongoing modernization of the company’s data stack—focusing on managing a Snowflake-based warehouse, improving data quality, and delivering reliable, structured data for both internal analytics and customer-facing features. This is a hands-on engineering position requiring deep SQL expertise, ETL development, and cross-functional collaboration with engineering, product, and analytics teams.
Key Responsibilities
- Design, implement, and optimize a Snowflake-based data warehouse to support a growing range of product and business needs
- Develop and maintain robust ETL pipelines to ingest, transform, and organize data from various sources
- Monitor data pipelines for integrity, performance, and anomalies—responding quickly to issues and optimizing performance as needed
- Build and maintain clean, reliable datasets to support internal BI dashboards and reporting workflows
- Translate business requirements into structured data models and schema designs
- Support SQL performance tuning, troubleshoot query exceptions, and respond to ad hoc data support requests from internal teams
- Implement data validation and cleansing routines to ensure accuracy and consistency
- Collaborate with Product and Data Science teams to deliver structured datasets that power AI/ML model development and feature engineering
- Participate in data governance efforts by documenting processes, data sources, and best practices
- Ensure security and compliance across all stages of data collection, storage, and access
Qualifications
- 3–5 years of experience in data engineering with a focus on data warehousing, ETL pipelines, and structured analytics environments
- Proven experience with Snowflake or similar cloud data warehouse platforms
- Strong SQL skills, including complex query design, optimization, and exception handling
- Proficiency with Python or another scripting language, along with familiarity with modern cloud platforms (AWS, GCP, or Azure), BI tools (e.g., Tableau, Power BI, Looker), and orchestration frameworks such as Apache Airflow
- Experience supporting AI/ML workflows is a plus
Why Join
This is a chance to build and scale the data foundation of a cybersecurity company making a real-world impact. You’ll work with a small, agile team that values autonomy, ownership, and smart engineering choices. If you’re passionate about delivering clean, performant data systems that drive real business outcomes—and excited by the challenge of supporting AI innovation and robust BI reporting—this is the role for you.