All active Airflow roles based in Costa Rica.
CSQ326R204
About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region click here.
Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.
Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Ready to apply?
Apply to Databricks
Seeking to hire a contractor based in Mexico or Argentina for a Lead Analytics Engineering role on the Data Team
Why you'll love this role:
Data Modeling and Transformation
Data Quality and Testing
Collaboration and Communication
Infrastructure and Automation
Why you're a great fit:
Please note that, given the nature of the contract, this role will not be eligible to participate in company-sponsored benefits.
Newsela is a leading education technology company dedicated to meaningful classroom learning for every student. We deliver integrated, AI-powered solutions designed to unlock student engagement, empower teachers, and drive meaningful learning outcomes. Our suite of products supports knowledge and skill development, writing practice, daily instruction, assessment, and data-informed decision-making across K–12 classrooms. Grounded in learning science research, Newsela’s solutions integrate content, assessment, and analytics to help educators track progress, understand student outcomes, and deliver high-impact instruction that supports every learner.
#LI-Remote
Ready to apply?
Apply to Newsela
Type: Contract, per-project.
Location: Remote, within LATAM, in a time zone within ±1 hour of ET
Availability: Contractor (40 hours per week)
Job Title: Data / Platform Engineer (AWS & Data Pipelines)
We are looking for a highly motivated Data / Platform Engineer to join our team and help design, build and operate scalable data pipelines and cloud-based solutions.
In this role, you will work closely with engineering and product teams to build both streaming and batch data pipelines, contribute to system design, and help drive automation and monitoring across our data platform.
Key Responsibilities
Design, build and operate scalable streaming and batch data pipelines, with a primary focus on maintaining, monitoring, troubleshooting and continuously improving existing pipelines.
Work with AWS services, including Redshift, EMR and ECS, to support data processing and analytics workloads.
Develop and maintain data workflows using Python and SQL.
Orchestrate and monitor pipelines using Apache Airflow (illustrated briefly after this list).
Build and deploy containerized applications using Docker and Kubernetes.
Break down high-level system designs into well-defined, deliverable tasks with realistic estimates.
Collaborate with cross-functional teams in a fast-paced and distributed environment across the US and Europe.
Drive automation, observability and monitoring to improve reliability, performance and operational efficiency.
Support knowledge transfer and ownership handover as part of the planned transition to the consuming team.
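For orientation only, below is a minimal sketch of the kind of Airflow orchestration described above. The DAG name, task names, schedule and logic are illustrative assumptions and are not taken from the posting.

# Hypothetical example: names, schedule and task logic are assumptions,
# not details from the job description.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull a batch of records from a source system.
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]


def transform():
    # Placeholder: apply Python/SQL transformations before loading.
    pass


def load():
    # Placeholder: write results to a warehouse such as Redshift.
    pass


with DAG(
    dag_id="example_batch_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # assumed daily cadence (Airflow 2.4+ argument)
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task

In practice, tasks like these would more likely run queries against Redshift or submit EMR jobs rather than execute in-process Python, but the DAG structure and scheduling pattern would be the same.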
Required Qualifications
Strong professional experience with Python and SQL.
Hands-on experience with AWS, specifically Redshift, EMR and ECS. AWS experience is mandatory (other cloud providers are not considered equivalent for this role).
Proven experience building and operating both streaming and batch data pipelines.
Professional experience with Apache Airflow, Docker and Kubernetes.
Ability to translate high-level system designs into actionable technical tasks and realistic estimates.
Comfortable working in dynamic and fast-paced environments and in distributed teams.
Strong interest in automation and monitoring.
Strong hands-on experience with Apache Spark.
Senior-level profile with strong autonomy, communication skills and ability to work effectively in distributed teams.
Proven ability to transfer knowledge and support ownership handovers.
Fluent or professional working proficiency in English (both written and spoken).
Nice to Have
Previous experience in the telecom industry.
Experience with machine learning systems and/or event-driven architectures.
Experience with Apache Iceberg.
(*) SOUTHWORKS only hires individuals from countries that are not blocked or sanctioned by the United States, including those identified by the United States Office of Foreign Assets Control (OFAC).
Ready to apply?
Apply to SOUTHWORKS
Cookies & analytics
This site uses cookies from third-party services to deliver its features and to analyze traffic.