The interview will consist of a coding test.
This is a contract position.

Job Description
We are seeking a skilled Data Engineer to design, develop, and maintain scalable data pipelines and workflows.
The ideal candidate will have strong expertise in Python, SQL, Snowflake, and Airflow, with experience in building ETL/ELT solutions and optimizing data infrastructure.
This role involves collaborating with data analysts, scientists, and business stakeholders to ensure data availability, reliability, and efficiency.

Roles & Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines to process large volumes of structured and unstructured data.
- Develop and optimize SQL queries within Snowflake for efficient data storage and retrieval.
- Implement workflow orchestration using Apache Airflow to automate data processing tasks (an illustrative sketch follows the lists below).
- Write efficient, reusable, and scalable Python scripts for data extraction, transformation, and loading (ETL).
- Monitor and troubleshoot data pipelines to ensure high availability and performance.
- Collaborate with data teams to define best practices for data modeling and maintain a structured data warehouse.
- Work with cloud platforms (AWS, GCP, or Azure) to integrate data sources and manage cloud-based data infrastructure.
- Ensure data security, governance, and compliance with industry best practices.

Required Skills & Qualifications
- Strong programming skills in Python.
- Expertise in SQL for querying, transformation, and performance tuning.
- Hands-on experience with Snowflake (schema design, performance optimization, Snowpipe, Streams, and Tasks).
- Experience with Apache Airflow for scheduling and orchestrating data pipelines.
- Knowledge of ETL/ELT processes and best practices in data engineering.
- Experience with cloud platforms (AWS, GCP, or Azure) and their data services.
- Familiarity with data modeling (Star Schema, Snowflake Schema) and data warehouse concepts.
- Experience with Git and CI/CD pipelines.

Preferred Skills
- Experience with big data processing frameworks (Spark, Databricks).
- Knowledge of Kafka, Kinesis, or other real-time data streaming tools.
- Familiarity with containerization (Docker, Kubernetes) for deploying data pipelines.
- Understanding of Data Governance, Data Quality, and Data Security principles.
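To give a concrete flavour of the Airflow-plus-Snowflake orchestration this role involves, the sketch below shows a minimal daily DAG that copies staged files into a raw Snowflake table and then rebuilds a reporting table inside Snowflake (ELT). It is illustrative only: the DAG id, connection id, stage, schemas, and table names are hypothetical placeholders, and it assumes Airflow 2.4+ with the apache-airflow-providers-snowflake package installed.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Hypothetical defaults; retries guard against transient warehouse issues.
DEFAULT_ARGS = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_orders_elt",              # hypothetical pipeline name
    default_args=DEFAULT_ARGS,
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's staged files into a raw table. Stage and table
    # names are placeholders for illustration.
    load_raw = SnowflakeOperator(
        task_id="load_raw_orders",
        snowflake_conn_id="snowflake_default",  # hypothetical connection id
        sql="""
            COPY INTO raw.orders
            FROM @raw.orders_stage
            FILE_FORMAT = (TYPE = 'JSON');
        """,
    )

    # Rebuild the reporting model from the raw layer: the transform
    # runs inside Snowflake rather than in the pipeline process itself.
    transform = SnowflakeOperator(
        task_id="build_orders_daily",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE analytics.orders_daily AS
            SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
            FROM raw.orders
            GROUP BY order_date;
        """,
    )

    load_raw >> transform
```

One design note: because Snowflake's COPY INTO tracks which staged files have already been loaded, retries and backfills of this DAG do not double-load data, which is one simple way to keep such pipelines idempotent.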