About us:
Talent has no borders. Proxify's mission is to connect top developers around the world with the opportunities they deserve. So, it doesn't matter where you are; we are here to help you fast-track your independent career in the right direction. 🙂
Since our launch, Proxify's developers have successfully worked with 1200+ happy clients to build their products and power their growth. 5000+ talented developers trust Proxify and its network to fulfill their dreams and objectives.
Proxify is shaped by a global network of supportive, talented developers interested in remote full-time jobs. Our Glassdoor (4.5/5) and Trustpilot (4.8/5) ratings reflect the trust developers place in us and our commitment to our members' success.
The Role:
We are looking for a Senior Data Engineer specializing in modern, cloud-native data platforms, with a strong focus on Amazon Web Services (AWS) and Python. You will be responsible for designing, building, and optimizing highly scalable and reliable ETL/ELT pipelines and data warehouses that power analytics, machine learning, and business intelligence for our clients.
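For a concrete sense of the kind of pipeline work this role involves, here is a minimal, illustrative Python sketch of an extract-transform-load step using boto3 and pandas. The bucket names, keys, and columns are hypothetical examples, not an actual client setup:

```python
import boto3
import pandas as pd
from io import BytesIO

# Hypothetical bucket names, for illustration only.
RAW_BUCKET = "example-raw-events"
CURATED_BUCKET = "example-curated-events"

def transform_daily_events(date: str) -> None:
    """Extract a raw CSV drop from S3, clean it, and load it back as Parquet."""
    s3 = boto3.client("s3")

    # Extract: read the raw CSV for the given date.
    obj = s3.get_object(Bucket=RAW_BUCKET, Key=f"events/{date}.csv")
    df = pd.read_csv(obj["Body"])

    # Transform: deduplicate and normalize timestamps.
    df = df.drop_duplicates(subset=["event_id"])
    df["event_time"] = pd.to_datetime(df["event_time"], utc=True)

    # Load: write a columnar Parquet file for downstream warehouse ingestion.
    buffer = BytesIO()
    df.to_parquet(buffer, index=False)
    s3.put_object(Bucket=CURATED_BUCKET, Key=f"events/{date}.parquet", Body=buffer.getvalue())
```

In practice, a step like this would typically run inside AWS Glue, Lambda, or EMR and feed a warehouse such as Redshift or Snowflake, but the shape of the work is the same.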
What we’re looking for:
- 5+ years of professional experience in data engineering.
- Expert proficiency in Python for data manipulation, scripting, and pipeline development (e.g., Pandas, PySpark).
- Deep hands-on experience with the AWS cloud platform, specifically the core services used for data ingestion, storage, and processing (S3, Glue, Lambda, EMR).
- Proven experience working with modern data warehouses (Snowflake, Amazon Redshift, or Google BigQuery/Azure Synapse).
- Solid expertise in SQL and complex query writing/optimization.
- Strong understanding of containerization and orchestration concepts (Docker, Kubernetes).
- Fluent English communication skills.
- Located in the CET time zone (±3 hours); we are unable to consider applications from candidates in other time zones.
Nice-to-have:
- Experience with Infrastructure as Code (Terraform or CloudFormation).
- Proficiency with a modern orchestration tool like Apache Airflow (see the sketch after this list).
- Familiarity with data streaming technologies (Kafka, Kinesis, or Flink).
- AWS certifications.
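As a rough illustration of the orchestration mentioned above, an Airflow pipeline declares tasks and their dependencies as a DAG. This is a skeleton with placeholder callables; the DAG id and schedule are hypothetical, not any particular client's pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    # Placeholder: pull raw data from a source system.
    print("extracting")

def load() -> None:
    # Placeholder: load transformed data into the warehouse.
    print("loading")

# Hypothetical DAG id and schedule, for illustration only.
with DAG(
    dag_id="example_daily_events",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract before load.
    extract_task >> load_task
```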
Responsibilities:
- Architect, implement, and maintain scalable data pipelines (ETL/ELT) using Python and native AWS services to ingest data from various sources (APIs, databases, streaming services) into data lakes and warehouses.
- Serve as the subject matter expert for the core AWS data services, including S3, Glue, EMR, Kinesis/MSK, and Redshift or Amazon Aurora.
- Design robust and efficient data models (e.g., star schema, snowflake, data vault) for analytical and reporting needs.
- Perform performance tuning and query optimization on large datasets within cloud data warehouses to ensure fast data delivery.
- Implement infrastructure as code (IaC) using Terraform or CloudFormation, and integrate data pipelines into modern CI/CD processes.
- Establish data quality monitoring, logging, alerting, and governance standards across the data platform.
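To make the last responsibility more tangible: a data-quality gate can be as simple as a validation function that fails a batch before it reaches the warehouse. A minimal sketch, assuming a pandas DataFrame with hypothetical event columns:

```python
import pandas as pd

def validate_events(df: pd.DataFrame) -> pd.DataFrame:
    """Fail fast if the batch violates basic quality expectations."""
    # Completeness: required columns must be present.
    required = {"event_id", "event_time", "user_id"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {missing}")

    # Uniqueness: event_id is the primary key of the batch.
    if df["event_id"].duplicated().any():
        raise ValueError("Duplicate event_id values found")

    # Validity: no null user_ids allowed in curated data.
    null_users = int(df["user_id"].isna().sum())
    if null_users:
        raise ValueError(f"{null_users} rows have a null user_id")

    return df
```

Dedicated tools such as Great Expectations generalize this pattern, but the underlying idea is the same: encode expectations explicitly and alert when a batch fails them.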
What we offer:
Get paid, not played
No more unreliable clients. Enjoy on-time monthly payments with flexible withdrawal options.
Predictable project hours
Enjoy a harmonious work-life balance with consistent 8-hour working days with clients.
Flex days, so you can recharge
Enjoy up to 24 flex days off per year without losing pay, for full-time positions found through Proxify.
Career-accelerating positions at cutting-edge companies
Discover exclusive long-term remote positions at the world's most exciting companies.
Hand-picked opportunities, just for you
Skip the typical recruitment roadblocks and biases with personally matched positions.
One seamless process, multiple opportunities
A one-time contracting process for endless opportunities, with no extra assessments.
Compensation
Enjoy the same pay every month with positions landed through Proxify.