All active Databricks roles based in the Czech Republic.
Emplifi is a leading AI-powered social media marketing and customer experience platform, empowering brands to deliver meaningful, connected experiences across digital channels. Recognized as a Leader by renowned analysts and celebrated as a customer favorite, Emplifi provides innovative, data-driven insights and AI-powered tools to help brands optimize social media performance, elevate their influencer marketing strategies, and deliver impactful customer engagement across marketing, commerce, and care.
We're a group of enthusiastic business analysts, data analysts, engineers, and Salesforce developers, plus a team of developers who look after our SaaS toolsets. People matter most to us: we try to create a culture where people feel productive and creative, can learn and grow, and where every voice is heard.
Projects run on agile methodology, and we tackle interesting problems, from anomaly detection and optimal product setup to retention prediction and analyzing macro market trends using gigabytes of data. Our stack consists of Python, SQL, Tableau, Apache Spark, AWS S3, and PostgreSQL. We also manage Salesforce (both marketing and sales processes). While code quality matters, what defines us is our ability to combine strong technical solutions with a sharp business mindset.
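To make the anomaly-detection work mentioned above concrete, here is a minimal, hypothetical sketch in plain Python. The metric name, data, and threshold are invented for illustration; real pipelines on gigabytes of data would run on Spark rather than in-memory lists:

```python
from statistics import mean, stdev

def zscore_anomalies(series, threshold=3.0):
    """Flag indices whose value deviates from the series mean by more
    than `threshold` standard deviations -- a classic first-pass detector."""
    mu = mean(series)
    sigma = stdev(series)
    if sigma == 0:
        return []  # a constant series has no outliers
    return [i for i, x in enumerate(series) if abs(x - mu) / sigma > threshold]

# A steady (invented) engagement metric with one obvious spike.
# With only 10 points, a sample z-score cannot exceed ~2.85,
# so a threshold below 3 is needed to catch even an extreme outlier.
daily_posts = [100, 102, 98, 101, 99, 500, 100, 97, 103, 101]
print(zscore_anomalies(daily_posts, threshold=2.5))  # → [5]
```

The same idea translates directly to PySpark by computing the mean and standard deviation as aggregates and filtering rows against them.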
At Emplifi, data is more than numbers—it’s insight, innovation, and impact. As our Data Analyst, you’ll be at the forefront of shaping business and product decisions across the organization. We are looking for someone with natural curiosity, strong independence, and a deep sense of accountability—someone who thrives on discovering the "why" behind the data and who is motivated to deliver meaningful outcomes. You’ll uncover trends in the social media landscape that feed into global publications, deliver actionable insights that power our strategy, and support client success through fast, accurate data exports.
In this dynamic role, you'll also support Sales enablement and pipeline building, while mastering powerful technologies like Python, Databricks, PySpark, S3, and Airflow. You’ll work with terabytes of data in our evolving Data Lake, all within a collaborative and flexible Agile (SCRUM) environment where experimentation is encouraged and learning never stops.
On-Demand Insights
Autonomous Insight Generation
Automated Reporting Pipelines
Go Beyond Analytics
Must Have Skills
Your Work Style
Additional Tools
At Emplifi, we are committed to creating a workplace where everyone is valued, respected, and empowered to bring their whole selves to work. We welcome applications from individuals of all ages, races, religions, genders, sexual orientations, gender identities, and LGBTQ+ communities.
Emplifi offers a safe, inclusive, and supportive environment where every employee has the opportunity to thrive and is encouraged to be who they are.
We welcome and encourage applicants with disabilities. Accommodations are available upon request at any stage of the recruitment process.
Ready to apply?
Apply to Emplifi FR
At SentinelOne, we are driven by a clear purpose: to give the advantage to those who secure our future. As AI reshapes how organizations build, operate, and innovate, the responsibility to protect them becomes more critical than ever. When you join SentinelOne, your work helps protect global enterprises, critical infrastructure, and the technologies shaping tomorrow. If you are motivated by meaningful challenges and want your impact to be real, measurable, and global, you will find purpose here.
SentinelOne is a company at the intersection of AI and security, pioneering a new operating model for cybersecurity. Our AI-native platform unifies protection across endpoint, cloud, identity, data, and AI systems to deliver autonomous detection and response with clarity and speed. By combining real-time analytics, intelligent automation, and a unified data foundation, we reduce noise, simplify complexity, and empower security teams to focus on what truly matters.
Our teams are builders, problem-solvers, and innovators committed to shaping the future of security. If you are excited to solve hard problems alongside talented, mission-driven people, we invite you to help us build a safer future for humanity.
We’re looking for people who are relentlessly curious and committed to continuous learning. AI is reshaping every function across our business, and we enable every team member, regardless of role or level, to build fluency in AI tools and concepts. Those who thrive here actively seek out new solutions, experiment thoughtfully, and apply what they learn to drive better, faster, smarter outcomes.
As a Senior ML Engineer you will join our Static AI team which develops state-of-the-art ML models that operate on millions of machines worldwide, making sub-second decisions to stop malware before it runs. We bring together cross-functional skill sets, including data science, ML engineering, and software engineering, to research, develop, ship, and support ML malware detection engines across Windows, Mac, and Linux machines and cloud environments.
Primary responsibilities include:
Ideal candidates will have:
Experience in some of these areas is an advantage:
AI is redefining how the world operates and rewriting the rules of security in real time, and SentinelOne was built for this moment. From day one, we architected an AI-native platform designed to operate at machine speed, not as an add-on to legacy systems but as the foundation itself. If you want to build where innovation and impact move together, this is that place.
We invest in our Sentinels with comprehensive, competitive benefits designed to support you and your family:
SentinelOne is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
SentinelOne participates in the E-Verify Program for all U.S. based roles.
Ready to apply?
Apply to SentinelOne
Join the newly formed Datacraft team — the team building the next-generation data platform for Bloomreach Engagement. Datacraft owns three interconnected domains:
As a Senior SRE, you will be the reliability backbone of this AI-first data team. Your work will directly impact the deployment, reliability, and observability of the pipelines and services that hundreds of enterprise customers depend on — from data exports into Databricks and BigQuery to Loomi, the AI agent that surfaces insights.
Datacraft is an AI-first team. We believe code is a commodity and expect every engineer to fluently use coding agents (e.g., Cursor, Claude Code, Copilot, Gemini CLI) as a core part of their daily workflow. The ability to leverage AI tooling to accelerate development, prototyping, and problem-solving is not optional — it's foundational.
For candidates at the P3 / Senior SRE level, starting monthly compensation begins at 3 800 € gross, with the final offer tailored for each candidate based on their skills and experience. Stock options and a comprehensive benefits package are also included. Working in one of our Central European offices (Bratislava, Praha, Brno) or from home on a full-time basis, you'll become a core part of the Engineering team.
As a P3 (Senior) SRE at Bloomreach, you are an independent professional — expert in reliability engineering, able to decompose objectives into actionable infrastructure improvements, and lead initiatives end-to-end with minimal day-to-day guidance.
We need you to build and operate an ecosystem where data engineers can safely and efficiently develop, debug, and operate data-intensive jobs and services — spanning Kafka ingest pipelines, Iceberg data lakes, multi-DWH exports, Databricks deployment and orchestration (Airflow / Cloud Composer), and agentic AI workloads.
Languages: Python (primary), Go, SQL
Messaging & streaming: Apache Kafka
Storage & databases: Databricks, BigQuery, Apache Iceberg, GCS, Mongo, Redis
Data processing & orchestration: Apache Spark, DataFlow, Airflow / Cloud Composer
Infrastructure: GCP, Kubernetes, Terraform
AI / Agentic: LLM APIs, MCP, agent orchestration frameworks
Observability: Grafana, Prometheus, Victoria Metrics, PagerDuty, Sentry, OpenTelemetry
CI/CD & tooling: GitLab, Jira, Confluence
AI coding agents: Cursor, Claude Code
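As a small illustration of the data-driven reliability mindset this role calls for, here is how a request-based SLO error budget might be computed. This is a generic sketch, not Bloomreach's actual tooling, and the SLO target and request counts are invented:

```python
def error_budget(slo_target, total_requests, failed_requests):
    """Compute a request-based error budget.

    slo_target -- required success ratio, e.g. 0.999 for "three nines"
    Returns (allowed_failures, fraction_of_budget_consumed).
    """
    allowed_failures = (1.0 - slo_target) * total_requests
    if allowed_failures == 0:
        # A 100% SLO leaves no budget; any failure overruns it.
        return 0.0, float("inf")
    return allowed_failures, failed_requests / allowed_failures

# A 99.9% SLO over 2,000,000 requests allows ~2,000 failures;
# 500 observed failures consume ~25% of the budget.
budget, consumed = error_budget(0.999, 2_000_000, 500)
```

In practice the failure counts would come from a metrics backend such as Prometheus, and the consumed fraction would gate release velocity and alerting.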
Impact
Ownership
Systematic approach
Data-driven
Technical skills
In 30 days:
In 90 days:
In 180 days:
#LI-KP1
(*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.)
Excited? Join us and transform the future of commerce experiences!
If this position doesn't suit you but you know someone who might be a great fit, please share it with them. We will be very grateful!
Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.
#LI-Remote
Ready to apply?
Apply to Bloomreach
Writing software is your hobby. You want to team up with great people because that’s the best way to grow and learn. You want your work to make a positive impact in the world. You love the idea that your code can help combat global crime. Is this you? Then you will fit well in our team.
We are building cutting-edge solutions to create a safer world and stop money from ending up in the hands of criminals. Your work will allow our clients to identify and understand criminal, financial, and political risks, updating dynamically as new data becomes available from various sources (sanctions, watchlists, corporate registries, media, …).
As a Junior Software Engineer, you will:
Our Tech Stack:
The data-driven nature of the role and the prominence of Python in the data and ML ecosystem mean you will primarily work in Python and engineer data pipelines. Any relevant experience will be helpful, but we believe that top engineers can learn quickly and adapt to new technologies, so that we can innovate and react to a fast-changing world.
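The flavor of data work described above can be hinted at with a toy name-screening sketch in pure Python. This is an invented example using `difflib`; production sanctions screening relies on far more sophisticated entity resolution than simple string similarity:

```python
from difflib import SequenceMatcher

def screen_name(candidate, watchlist, threshold=0.85):
    """Return watchlist entries whose string similarity to `candidate`
    meets the threshold -- a toy stand-in for real screening logic."""
    candidate = candidate.lower().strip()
    hits = []
    for entry in watchlist:
        score = SequenceMatcher(None, candidate, entry.lower().strip()).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

# Invented watchlist; a slight misspelling should still match.
watchlist = ["John Doe", "Jane Roe", "Acme Holdings Ltd"]
print(screen_name("Jon Doe", watchlist))  # → [('John Doe', 0.93)]
```

A real pipeline would add normalization (transliteration, honorifics, token reordering), phonetic matching, and entity context before raising an alert.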
About you:
As a Junior Software Engineer, you will have:
Nice to haves:
What’s in it for you?
About us:
Our mission is to empower every business to eliminate financial crime.
By harnessing AI, a unified platform, and an extensive partner ecosystem, we help customers turn compliance into a catalyst for growth, operational resilience, and enduring regulatory trust.
More than 3,000 enterprises across 75 countries rely on our end-to-end platform and the world’s most comprehensive financial crime risk intelligence. With full-stack agentic automation, we help organizations automate up to 95% of KYC, AML, and sanctions reviews, cut onboarding times by 50%, reduce false positives by 70%, and handle 7x more work with the same staff.
ComplyAdvantage is headquartered in London and has global hubs in New York, Lisbon, Singapore, and Cluj-Napoca. It is backed by Balderton Capital, Index Ventures, Ontario Teachers’ Pension Plan, Goldman Sachs, and Andreessen Horowitz. Learn more about compliance re-engineered for the age of AI at complyadvantage.com.
Ready to apply?
Apply to ComplyAdvantage