All active Airflow roles based in Pune.
Come help us build the world's most reliable on-demand logistics engine for delivery! We're bringing on experienced engineers to help us advance the 24x7 global infrastructure that powers DoorDash’s three-sided marketplace of consumers, merchants, and Dashers.
The Data User Platform and Programs team works at the intersection of data infrastructure, reliability, governance, developer tooling and AI systems, supporting thousands of internal data users including engineers, analysts, data scientists and business stakeholders.
This role is part of a founding engineering team in India, offering high ownership, strong growth opportunities, and direct influence over the platform’s long-term architecture.
You will work on systems such as:
You will collaborate closely with product managers, data engineers, and analytics teams to deliver solutions that reduce friction and increase trust across the data ecosystem.
Notice to Applicants for Jobs Located in NYC or Remote Jobs Associated With Office in NYC Only
We use Covey as part of our hiring and/or promotional process for jobs in NYC, and certain features may qualify it as an AEDT in NYC. As part of the hiring and/or promotion process, we provide Covey with job requirements and candidate-submitted applications. We used Covey Scout for Inbound from August 21, 2023, through December 21, 2023, and resumed using it on June 29, 2024.
The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: Covey
At DoorDash, our mission to empower local economies shapes how our team members move quickly, learn, and iterate in order to make impactful decisions that display empathy for our range of users—from Dashers to merchant partners to consumers. We are a technology and logistics company that started by enabling door-to-door delivery, and we are looking for team members who can help us go from a company that is known as the place you order food to a company that people turn to for any and all goods.
DoorDash is growing rapidly and changing constantly, which gives our team members the opportunity to share their unique perspectives, solve new challenges, and own their careers. We're committed to supporting employees’ happiness, healthiness, and overall well-being by providing comprehensive benefits and perks.
We’re committed to growing and empowering a more inclusive community within our company, industry, and cities. That’s why we hire and cultivate diverse teams of people from all backgrounds, experiences, and perspectives. We believe that true innovation happens when everyone has room at the table and the tools, resources, and opportunity to excel.
If you need any accommodations, please inform your recruiting contact upon initial connection.
We use Covey as part of our hiring and/or promotional process for jobs in certain locations.
The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: https://getcovey.com/nyc-local-law-144
To request a reasonable accommodation under applicable law or alternate selection process, please inform your recruiting contact upon initial connection.
Ready to apply?
Apply to DoorDash India
About Us
Abacus Insights is transforming how data works for health plans. Our mission is simple: make healthcare data usable, so the people responsible for care and cost decisions can act faster, with confidence.
We help health plans break down data silos to create a single, trusted data foundation. That foundation powers better decisions—so plans can improve outcomes, reduce waste, and deliver better experiences for members and providers alike.
Backed by $100M from top investors, we’re tackling big challenges in an industry that’s ready for change. Our platform enables GenAI use cases by delivering clean, connected, and reliable healthcare data that can support automation, prioritization, and decision workflows—and it’s why we are leading the way.
Our innovation begins with people. We are bold, curious, and collaborative—because the best ideas come from working together. Ready to make an impact? Join us and let's build the future together.
About the role
We are seeking an accomplished Data Engineer to join our dynamic and rapidly expanding Tech Ops division. With significant projected growth, this is an opportunity to drive meaningful technical impact. In this role, you will work directly with customers, data vendors, and internal engineering teams to design, implement, and optimize complex data integration solutions within a modern, large‑scale cloud environment.
You will leverage advanced skills in distributed computing, data architecture, and cloud-native engineering to enable scalable, resilient, and high‑performance data ingestion and transformation pipelines. As a trusted technical advisor, you will guide customers in adopting Abacus’s core data management platform and ensure high-quality, compliant data operations across the lifecycle.
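The role above calls for resilient, high-performance ingestion pipelines. As a minimal illustration of one common resilience pattern (retry with exponential backoff around a flaky upstream source), here is a hedged sketch; the function and record names are invented for this example and are not Abacus APIs:

```python
import time

def with_retries(fn, max_attempts=3, base_delay=0.01):
    """Run fn, retrying with exponential backoff on failure.

    A common resilience pattern for ingestion steps that call
    flaky upstream sources (vendor APIs, object stores, etc.).
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky source: fails twice, then succeeds.
calls = {"n": 0}
def fetch_batch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream error")
    return [{"member_id": 1, "claim": "A"}, {"member_id": 2, "claim": "B"}]

records = with_retries(fetch_batch)
print(len(records))  # 2 records ingested after two retries
```

In a real pipeline the retry policy (attempt count, backoff, which exceptions are retryable) would be tuned per source rather than hard-coded.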
Your day to day
What you bring to the team
What we’d like to see (but not required):
What you’ll get in return:
Work arrangements:
Our Commitment as an Equal Opportunity Employer
As a mission-led technology company helping to drive better healthcare outcomes, Abacus Insights believes that the best innovation and value we can bring to our customers comes from diverse ideas, thoughts, experiences, and perspectives. Therefore, we dedicate resources to building diverse teams and providing equal employment opportunities to all applicants. Abacus prohibits discrimination and harassment regarding race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
At the heart of who we are is a commitment to continuously and intentionally building an inclusive culture—one that empowers every team member across the globe to do their best work and bring their authentic selves. We carry that same commitment into our hiring process, aiming to create an interview experience where you feel comfortable and confident showcasing your strengths. If there’s anything we can do to support that—big or small—please let us know.
Ready to apply?
Apply to Abacus Insights
About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were named Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry. The projects that will transform the financial services industry.
MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.
Role: GCP Data Engineer
Job Summary:
We are looking for an enthusiastic GCP Data Engineer to join our dynamic and growing project team. In this role, you will have the opportunity to develop your skills while contributing to large-scale data projects on Google Cloud Platform. You will work closely with senior engineers and data analysts to build and maintain robust data pipelines, helping to turn raw data into actionable insights. This is an excellent opportunity to build a strong foundation for a career in data engineering within a supportive and collaborative environment.
Key Responsibilities:
· Develop, test, and maintain data pipelines and ETL/ELT processes using Python, SQL, and GCP services.
· Write and optimize complex SQL queries in BigQuery for data transformation, extraction, and analysis.
· Assist in the design and implementation of data models within our data warehouse.
· Collaborate with the broader project team to understand data requirements and deliver effective solutions.
· Troubleshoot and resolve issues in data pipelines and data quality to ensure accuracy and availability.
· Contribute to documentation and adhere to data engineering best practices.
· Support the team in various data-related tasks and be eager to learn new technologies and techniques.
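The responsibilities above center on ETL/ELT with Python and SQL transformations in BigQuery. Since BigQuery itself requires project credentials, here is a hedged sketch of the shape of an ELT-style transform using Python's built-in sqlite3 as a stand-in warehouse; the table and column names are invented for illustration:

```python
import sqlite3

# sqlite3 stands in for BigQuery here; the pattern (load raw data,
# then transform it with SQL inside the warehouse) is the same.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id INTEGER, amount REAL);
    INSERT INTO raw_events VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# Transform: aggregate raw events into a per-user summary table.
conn.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    GROUP BY user_id
""")

rows = conn.execute(
    "SELECT user_id, total FROM user_totals ORDER BY user_id"
).fetchall()
print(rows)  # [(1, 15.0), (2, 7.5)]
```

In BigQuery the equivalent transform would typically be a `CREATE TABLE ... AS SELECT` statement submitted via the client library or scheduled through an orchestrator.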
Required Qualifications:
· Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
· 5-8 years of hands-on experience with strong programming skills in Python for data manipulation and processing.
· Proficiency in SQL for querying and data transformation.
· Hands-on experience or significant project work with Google Cloud Platform (GCP).
· Familiarity with core GCP data services, particularly BigQuery and Pub/Sub.
· Strong understanding of data warehousing concepts and ETL/ELT processes.
· A proactive and curious mindset with strong problem-solving skills.
· Excellent communication and teamwork skills, with the ability to work effectively in a large project team.
Preferred Qualifications:
· Experience with other GCP services such as Cloud Composer (Airflow), Dataflow, or Cloud Functions.
· Familiarity with version control systems, especially Git.
· Google Cloud certification (e.g., Associate Cloud Engineer) is a plus.
· A passion for data and a strong desire to grow a career in the data engineering field.
Ready to apply?
Apply to Capco
DeepIntent is leading the healthcare advertising industry with data-driven solutions built for the future. From day one, our mission has been to improve patient outcomes through the artful use of advertising, data science, and real-world clinical data. For more information, visit www.DeepIntent.com.
What You’ll Do:
In this role, you will take the lead in designing, building, and optimizing data pipelines, ensuring high-quality data integration and real-time analytics. As a Senior Staff Engineer, you will mentor and guide a team of data engineers, collaborate with cross-functional teams, and drive best practices for data architecture and engineering.
Who You Are:
We believe great work starts with great support. That’s why DeepIntent offers a competitive, holistic benefits package designed to empower you both professionally and personally.
Here’s what you can expect when you join our team based in the US: Competitive base salary plus performance-based bonus or commission, comprehensive medical, dental, and vision insurance, 401K match program, unlimited PTO policy and paid holidays, remote-friendly culture with flexible work options, career development and advanced education support, WFH and internet stipends, plus many more perks and benefits!
Here’s what you can expect when you join our team based in India: Competitive base salary plus performance-based bonus, comprehensive medical insurance, and paid holidays. Hybrid-friendly culture with flexible work options, professional development reimbursement, WiFi reimbursement, and health and wellness allowance, plus many more perks and benefits!
Here's what you can expect when you join our team based in Europe: Competitive base salary plus performance-based bonus, comprehensive medical insurance, and flexible PTO. Hybrid-friendly culture with flexible work options, professional development reimbursement, WiFi reimbursement, and parental leave plus many more perks and benefits!
DeepIntent is committed to bringing together individuals from different backgrounds and perspectives. We strive to create an inclusive environment where everyone can thrive, feel a sense of belonging, and do great work together. DeepIntent is an Equal Opportunity Employer, providing equal employment and advancement opportunities to all individuals. We recruit, hire and promote into all job levels the most qualified applicants without regard to race, color, creed, national origin, religion, sex (including pregnancy, childbirth and related medical conditions), parental status, age, disability, genetic information, citizenship status, veteran status, gender identity or expression, transgender status, sexual orientation, marital, family or partnership status, political affiliation or activities, military service, immigration status, or any other status protected under applicable federal, state and local laws. If you have a disability or special need that requires accommodation, please let us know in advance. DeepIntent’s commitment to providing equal employment opportunities extends to all aspects of employment, including job assignment, compensation, discipline and access to benefits and training.
CCPA Notice: If you are a California resident applying for a role at DeepIntent, we may collect personal information as part of the application process in accordance with the California Consumer Privacy Act (CCPA). To learn more about the categories of information we collect and your rights, please contact PeopleOps@deepintent.com to see our full Applicant Privacy Notice.
Ready to apply?
Apply to DeepIntent
About Turing
Based in San Francisco, California, Turing is the world’s leading research accelerator for frontier AI labs and a trusted partner for global enterprises looking to deploy advanced AI systems. Turing accelerates frontier research with high-quality data, specialized talent, and training pipelines that advance thinking, reasoning, coding, multimodality, and STEM. For enterprises, Turing builds proprietary intelligence systems that integrate AI into mission-critical workflows, unlock transformative outcomes, and drive lasting competitive advantage.
Recognized by Forbes, The Information, and Fast Company among the world’s top innovators, Turing’s leadership team includes AI technologists from Meta, Google, Microsoft, Apple, Amazon, McKinsey, Bain, Stanford, Caltech, and MIT. Learn more at www.turing.com.
Strategic Project Lead / Delivery Manager
Data Annotation & AI Delivery
8+ Years of Experience
Position Overview
We are looking for a seasoned Delivery Manager (8+ years of experience) to lead large-scale data annotation and curation projects in support of cutting-edge LLM and multimodal AI research. You will oversee end-to-end delivery for projects with 100+ team members, ensure world-class data quality, and drive process and tooling innovation in collaboration with AI researchers and cross-functional teams.
This role is ideal for someone who combines deep technical expertise, proven leadership in managing large teams, and a passion for advancing AI/ML innovation. Given that our key customers operate in the data analytics, data engineering, and Business Intelligence (BI) space, the ideal candidate will bring hands-on leadership experience with platforms such as Databricks and Snowflake, enabling them to deeply understand customer domains, speak their language, and deliver annotation outcomes that are contextually relevant and technically sound.
Primary Responsibilities
Required Skills & Qualifications
Preferred Skills & Qualifications
Don’t meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. Turing is proud to be an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, gender identity, sexual orientation, age, marital status, disability, protected veteran status, or any other legally protected characteristics. At Turing, we are dedicated to building a diverse, inclusive, and authentic workplace, so if you’re excited about this role but your past experience doesn’t align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
For applicants from the European Union, please review Turing's GDPR notice here.
Ready to apply?
Apply to Turing
66degrees is an end-to-end AI transformation partner that guides enterprises from complex business challenges to clear, quantifiable outcomes. Our company is the culmination of several successful firms, each a leader in its own right in cloud, artificial intelligence, and data. This convergence of talent and expertise is how we help businesses reach their own "inflection point," where chaotic data becomes a strategic asset, complexity becomes clarity, and AI becomes an engine for growth. Our ultimate vision is to be the catalyst for a future where every business operates as an intelligent enterprise, with autonomous systems unlocking human potential.
At 66degrees, we believe in thriving through challenges and winning together. These values not only guide us in achieving our goals as a company but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.
We are seeking a highly skilled Cloud Data Engineer to join our Managed Services team. In this role, you will be the technical backbone for our enterprise clients, responsible for designing, building, and maintaining high-performance data platforms using Snowflake and Azure Databricks. You will act as a trusted advisor, ensuring that our customers' data ecosystems are scalable, secure, and cost-optimized across multi-cloud environments.
| Category | Must-Have Skills |
| --- | --- |
| Snowflake | Snowpipe, Streams & Tasks, Dynamic Tables, Zero-Copy Cloning, Snowpark (Python/Java), and Data Sharing |
| Azure Databricks | Spark SQL, PySpark, Delta Lake, Unity Catalog, and Medallion Architecture (Bronze/Silver/Gold) |
| Orchestration | Azure Data Factory, Airflow, or dbt (Data Build Tool) |
| Cloud Infrastructure | Azure Data Lake Storage (ADLS Gen2), Azure DevOps (CI/CD), and Terraform/Bicep for IaC |
| Languages | Advanced SQL (window functions, CTEs) and Python |
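The "Languages" requirement above calls out window functions and CTEs. As a minimal, self-contained illustration (using Python's sqlite3 rather than Snowflake or Databricks; table and column names are invented), here is a CTE plus a window function used to pick the top sale per region:

```python
import sqlite3

# Requires the SQLite shipped with modern Python (window functions
# need SQLite >= 3.25). Names below are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 50);
""")

query = """
WITH ranked AS (
    SELECT region, amount,
           ROW_NUMBER() OVER (
               PARTITION BY region ORDER BY amount DESC
           ) AS rn
    FROM sales
)
SELECT region, amount FROM ranked WHERE rn = 1 ORDER BY region
"""
top_per_region = conn.execute(query).fetchall()
print(top_per_region)  # [('east', 300), ('west', 200)]
```

The same CTE-plus-`ROW_NUMBER()` shape works unchanged in Snowflake and Spark SQL, which is why interviews for roles like this often probe it.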
66degrees is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status or other legally protected class.
As an AI transformation partner, 66degrees leverages intelligent solutions to enhance our recruitment experience. We utilize AI tools—including LinkedIn Recruiter’s Hiring Assistant and interview transcription technologies—to assist with sourcing, role analysis, and capturing interview highlights.
These tools augment our process, but we "Commit to Our Craft" by ensuring all final hiring decisions are made by our human Talent Team. By applying, you acknowledge the use of these technologies to help us "Win Together" in finding the best fit for our team.
Ready to apply?
Apply to 66degrees
Headquartered in Santa Barbara, California, HG Insights is the global leader in technology intelligence. We help the world’s most innovative companies accelerate their go-to-market efforts with precision through advanced data science methodologies and proprietary data assets. We offer a culture that blends innovation, collaboration, and growth, where each team member is empowered to make a measurable impact.
We are looking for a Staff/Senior Machine Learning Engineer to join our growing AI and Data Platform team in Pune, India. In this role, you will be responsible for designing, building, and scaling ML systems that power our core data intelligence products. You’ll work at the intersection of data engineering and machine learning, collaborating closely with data scientists, software engineers, and product teams to turn models and agents into robust, production-ready systems.
This is a high-impact role suited for someone who thrives on ownership, scalability, and deploying real-world ML and AI solutions at scale.
Ready to build the future of AI infrastructure? Apply now to join HG Insights as a Staff Machine Learning Engineer in Pune.
Ready to apply?
Apply to HG Insights
In this role, you will:
· Provide technical leadership for a team of engineers focused on development, deployment, and operations
· Lead and contribute to multiple pods with moderate resource requirements, risk, and/or complexity
· Interface technically with a range of stakeholders with customer and business impact
· Lead others in solving complex problems
· Work within an agile, multidisciplinary DevOps team
· Migrate and re-engineer existing services from on-premises data centers to the Cloud (GCP/AWS)
· Perform ETL system development work, drawing on a background in various data models and data warehousing concepts
· Write, analyse, review, and rewrite programs to departmental and Group standards
· Understand business requirements and provide real-time solutions
· Use project development tools such as JIRA, Confluence, and Git
· Build and maintain operations tools for monitoring, notifications, trending, and analysis.
· Enhance programs to increase operating efficiency or adapt to new requirements
· Review code from team members (Analysts/Developers) as part of the quality assurance process
· Produce unit test plans with detailed expected results to fully exercise the code
· Work closely with solution architects, business analysts, and technology leads to contribute to achieving the final outcome
· Demonstrate continuous improvement in the adoption of various engineering practices
To be successful in this role, you should meet the following requirements:
· Mandatory Skills:
o Senior Data Engineer with Cloud (GCP) experience (8 to 12 years)
o Hands-on experience with Google Cloud BigQuery scripting or Cloud SQL
o SQL and PL/SQL scripting experience (high-level expertise and hands-on work)
o Shell scripting or Python skills (high-level expertise and hands-on work)
o Experience with at least one RDBMS
o Leadership experience
· Good To Have:
o GCP Data Engineer certification is an added advantage
o Experience/knowledge of GCP components such as GCS, BigQuery, Airflow, Cloud SQL, and the Google Cloud SDK
o Knowledge of a scheduling or ETL tool such as Control-M, UC4 Atomic, or Airflow (Cloud Composer)
o Awareness of the Juniper ingestion process is preferred
o Experience working within an agile, multidisciplinary DevOps team
o Change, incident, and problem management
Ready to apply?
Apply to Capco

Job Title: Control Automation Engineer (IT)
Job Location: Pune
Build and run automation that improves the effectiveness, reliability, and auditability of IT controls. You’ll use Python/Unix scripting, CI/CD (Jenkins/Groovy), orchestration (Airflow/Composer), and ServiceNow APIs to automate control checks, evidence collection, monitoring, and remediation workflows—delivered in line with SDLC and controlled deployment (DEPL/ICE) expectations and within agreed SLA/OLA.
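As a rough sketch of what such automation can look like (the function names, field values, and the control rule below are hypothetical; the endpoint path follows ServiceNow's standard Table API), a check might evaluate collected evidence and prepare an incident on failure:

```python
import json
import urllib.request

def run_control_check(evidence: dict) -> bool:
    """Evaluate a (hypothetical) IT control: every change record in the
    collected evidence must carry an approval reference."""
    return all(rec.get("approval_ref") for rec in evidence.get("changes", []))

def build_incident_payload(control_id: str, detail: str) -> dict:
    """Shape a payload for ServiceNow's Table API (POST /api/now/table/incident).
    Field names follow the standard incident table; values are illustrative."""
    return {
        "short_description": f"Control {control_id} failed automated check",
        "description": detail,
        "category": "it_controls",
        "urgency": "2",
    }

def report_failure(instance_url: str, payload: dict) -> urllib.request.Request:
    """Prepare (but do not send) the REST request; in production this would
    carry authentication headers supplied by a secrets backend."""
    return urllib.request.Request(
        f"{instance_url}/api/now/table/incident",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

In practice a check like this would run as a scheduled Airflow/Composer task, with any resulting incident routed into the remediation workflow within the agreed SLA/OLA.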
Ready to apply?
Apply to Capco
In this role, you will:
· Provide technical leadership for a team of engineers focused on development, deployment, and operations
· Lead and contribute to multiple pods with moderate resource requirements, risk, and/or complexity
· Interface technically with a range of stakeholders, with customer and business impact
· Lead others in solving complex problems
· Work within an agile, multidisciplinary DevOps team
· Migrate and re-engineer existing services from on-premises data centres to the cloud (GCP/AWS)
· Perform system development work as an ETL engineer, drawing on a background in data models and data warehousing concepts
· Write, analyse, review, and rewrite programs to departmental and Group standards
· Understand business requirements and provide real-time solutions
· Use project development tools such as JIRA, Confluence, and Git
· Build and maintain operations tools for monitoring, notifications, trending, and analysis
· Enhance programs to increase operating efficiency or adapt to new requirements
· Review code from team members (analysts/developers) as part of the quality assurance process
· Produce unit test plans with detailed expected results to fully exercise the code
· Work closely with solution architects, business analysts, and technology leads to contribute to the final outcome
· Demonstrate continuous growth in the adoption of engineering practices
To be successful in this role, you should meet the following requirements:
· Mandatory Skills:
o Senior Data Engineer with Cloud (GCP) experience (8 to 12 years)
o Hands-on Google Cloud BigQuery scripting or Cloud SQL experience (mandatory)
o SQL and PL/SQL scripting experience (mandatory; high-level expertise and hands-on work)
o Shell scripting or Python skills (mandatory; high-level expertise and hands-on work)
o Experience with at least one RDBMS
o Leadership experience
· Good to Have:
o GCP Data Engineer certification is an added advantage
o Experience with or knowledge of GCP components such as GCS, BigQuery, Airflow, Cloud SQL, and the Google Cloud SDK
o Knowledge of a scheduling or ETL tool such as Control-M, UC4/Automic, or Airflow (Cloud Composer)
o Awareness of the Juniper ingestion process is preferred
o Experience working within an agile, multidisciplinary DevOps team
o Change, Incident, and Problem Management
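To illustrate the kind of BigQuery scripting the role calls for (the dataset, table, and column names below are placeholders, not taken from the posting), a common warehouse pattern is assembling an idempotent MERGE upsert from a staging table into a target table:

```python
def build_merge_sql(target: str, staging: str, key: str, cols: list[str]) -> str:
    """Assemble a BigQuery MERGE statement for an idempotent upsert from a
    staging table into a target table. Identifiers are placeholders; real
    pipelines should validate them rather than interpolate untrusted input."""
    set_clause = ", ".join(f"T.{c} = S.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    val_list = ", ".join(f"S.{c}" for c in [key] + cols)
    return (
        f"MERGE `{target}` T\n"
        f"USING `{staging}` S\n"
        f"ON T.{key} = S.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list})"
    )
```

Because MERGE updates matched keys instead of inserting duplicates, re-running the load after a failure leaves the target table consistent.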
Ready to apply?
Apply to Capco
HG Insights is the pioneer of Revenue Growth Intelligence. For more than a decade, we have delivered comprehensive, AI-driven datasets on B2B buyers, technology adoption, IT spend, and buyer intent, sourced from billions of data points. Today, we are a trusted partner to Fortune 500 technology companies, hyperscalers, and innovative B2B vendors seeking precise go-to-market analytics and decision-making.
Through an evolving suite of AI agents that incorporate first-party data and buyer signals, HG Insights enables AI-powered GTM automation across sales, marketing, RevOps, and data analytics teams, modernizing GTM execution from strategy through activation.
Product Group: TrustRadius is a B2B technology decisioning platform that helps buyers make confident software purchase decisions and enables vendors to activate high‑intent, in‑market buyers through rich review content and intent data signals. As part of HG Insights, TrustRadius powers data‑driven go‑to‑market strategies for leading B2B technology brands worldwide.
Job Title: Principal Engineer
Role Overview
We are looking for a Principal Engineer to provide technical thought leadership for our vendor‑side data platform and pipelines. This is a senior individual contributor (IC) role with Principal Engineer scope, responsible for the architecture, evolution, and reliability of the microservices and data pipelines that power vendor lead generation and intent resolution.
You will set the long‑term technical direction, mentor senior/staff engineers, and stay hands‑on in critical code paths across NestJS/Node.js, TypeScript, Python data pipelines, and event‑driven services.
Key Responsibilities
Core Technical Skills
Nice‑to‑Have Skills
Business Outcomes and Impact
In this role, you will be accountable for:
Ready to apply?
Apply to HG Insights
About Us
HG Insights is the global leader in technology intelligence, delivering actionable, AI-driven insights through advanced data science and scalable big data solutions. Our platform informs go-to-market decisions and influences how businesses spend millions in marketing and sales budget.
What You’ll Do:
What You’ll Be Responsible For
What You’ll Need
Nice-to-Haves
Ready to apply?
Apply to HG Insights
Role: GCP Data Engineer
Job Summary:
We are looking for an enthusiastic GCP Data Engineer to join our dynamic and growing project team. In this role, you will have the opportunity to develop your skills while contributing to large-scale data projects on Google Cloud Platform. You will work closely with senior engineers and data analysts to build and maintain robust data pipelines, helping to turn raw data into actionable insights. This is an excellent opportunity to build the foundation of a career in data engineering within a supportive and collaborative environment.
Key Responsibilities:
· Develop, test, and maintain data pipelines and ETL/ELT processes using Python, SQL, and GCP services.
· Write and optimize complex SQL queries in BigQuery for data transformation, extraction, and analysis.
· Assist in the design and implementation of data models within our data warehouse.
· Collaborate with the broader project team to understand data requirements and deliver effective solutions.
· Troubleshoot and resolve issues in data pipelines and data quality to ensure accuracy and availability.
· Contribute to documentation and adhere to data engineering best practices.
· Support the team in various data-related tasks and be eager to learn new technologies and techniques.
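As a minimal, hypothetical sketch of the kind of transformation such a pipeline performs (field names are illustrative, not from the posting), a cleaning step might validate and normalize raw event rows before they are loaded to BigQuery:

```python
from datetime import datetime, timezone

def clean_records(raw_rows: list[dict]) -> list[dict]:
    """Normalize raw event rows before warehouse load: drop rows missing an
    id, trim strings, and convert timestamps to UTC ISO-8601."""
    cleaned = []
    for row in raw_rows:
        if not row.get("event_id"):
            continue  # reject rows that fail the quality gate
        ts = datetime.fromisoformat(row["event_ts"]).astimezone(timezone.utc)
        cleaned.append({
            "event_id": row["event_id"].strip(),
            "event_ts": ts.isoformat(),
            "source": row.get("source", "unknown").strip(),
        })
    return cleaned
```

Keeping steps like this as small, pure functions makes them easy to unit-test before they are wired into a Cloud Composer DAG or Dataflow job.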
Required Qualifications:
· Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
· 5-8 years of hands-on experience with strong programming skills in Python for data manipulation and processing.
· Proficiency in SQL for querying and data transformation.
· Hands-on experience or significant project work with Google Cloud Platform (GCP).
· Familiarity with core GCP data services, particularly BigQuery and Pub/Sub.
· Strong understanding of data warehousing concepts and ETL/ELT processes.
· A proactive and curious mindset with strong problem-solving skills.
· Excellent communication and teamwork skills, with the ability to work effectively in a large project team.
Preferred Qualifications:
· Experience with other GCP services such as Cloud Composer (Airflow), Dataflow, or Cloud Functions.
· Familiarity with version control systems, especially Git.
· Google Cloud certification (e.g., Associate Cloud Engineer) is a plus.
· A passion for data and a strong desire to grow a career in the data engineering field.
Ready to apply?
Apply to Capco