All active Composer roles are based in Pune.
About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were named Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspectives gives us a competitive advantage.
Role: GCP Data Engineer
Job Summary:
We are looking for an enthusiastic GCP Data Engineer to join our dynamic and growing project team. In this role, you will have the opportunity to develop your skills while contributing to large-scale data projects on Google Cloud Platform. You will work closely with senior engineers and data analysts to build and maintain robust data pipelines, helping to turn raw data into actionable insights. This is an excellent opportunity to build a strong foundation for a career in data engineering within a supportive and collaborative environment.
Key Responsibilities:
· Develop, test, and maintain data pipelines and ETL/ELT processes using Python, SQL, and GCP services.
· Write and optimize complex SQL queries in BigQuery for data transformation, extraction, and analysis.
· Assist in the design and implementation of data models within our data warehouse.
· Collaborate with the broader project team to understand data requirements and deliver effective solutions.
· Troubleshoot and resolve issues in data pipelines and data quality to ensure accuracy and availability.
· Contribute to documentation and adhere to data engineering best practices.
· Support the team in various data-related tasks and be eager to learn new technologies and techniques.
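As a purely illustrative flavor of the pipeline work above, a minimal transform step might look like the following sketch; the field names, data-quality rule, and aggregation are hypothetical, not taken from an actual Capco project:

```python
# Minimal sketch of an ETL-style transform step: raw event rows in,
# cleaned and aggregated rows out. Field names are hypothetical.

def transform(raw_rows):
    """Drop malformed rows, normalize user IDs, and aggregate amounts per user."""
    totals = {}
    for row in raw_rows:
        # Skip rows missing required fields (a simple data-quality gate).
        if not row.get("user_id") or row.get("amount") is None:
            continue
        user = str(row["user_id"]).strip().lower()
        totals[user] = totals.get(user, 0.0) + float(row["amount"])
    # Emit one aggregated row per user, sorted for deterministic output.
    return [{"user_id": u, "total_amount": t} for u, t in sorted(totals.items())]

raw = [
    {"user_id": "A1", "amount": "10.5"},
    {"user_id": " a1 ", "amount": 4.5},   # same user after normalization
    {"user_id": None, "amount": 3.0},     # malformed: dropped
    {"user_id": "B2", "amount": 7},
]
print(transform(raw))
```

In a real pipeline the same logic would typically run inside a Dataflow step or as a BigQuery SQL transformation rather than in plain Python.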
Required Qualifications:
· Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
· 5-8 years of hands-on experience, with strong Python programming skills for data manipulation and processing.
· Proficiency in SQL for querying and data transformation.
· Hands-on experience or significant project work with Google Cloud Platform (GCP).
· Familiarity with core GCP data services, particularly BigQuery and Pub/Sub.
· Strong understanding of data warehousing concepts and ETL/ELT processes.
· A proactive and curious mindset with strong problem-solving skills.
· Excellent communication and teamwork skills, with the ability to work effectively in a large project team.
Preferred Qualifications:
· Experience with other GCP services such as Cloud Composer (Airflow), Dataflow, or Cloud Functions.
· Familiarity with version control systems, especially Git.
· Google Cloud certification (e.g., Associate Cloud Engineer) is a plus.
· A passion for data and a strong desire to grow a career in the data engineering field.
Ready to apply?
Apply to Capco
Role: Senior Data Engineer (GCP)
In this role, you will:
· Provide technical leadership for a team of engineers focused on development, deployment, and operations
· Lead and contribute to multiple pods with moderate resource requirements, risk, and/or complexity
· Interface technically with a range of stakeholders, with customer and business impact
· Lead others to solve complex problems
· Work within an agile, multidisciplinary DevOps team
· Migrate and re-engineer existing services from on-premises data centers to the cloud (GCP/AWS)
· Perform ETL system development work, drawing on a background in data models and data warehousing concepts
· Write, analyse, review, and rewrite programs to departmental and Group standards
· Understand business requirements and provide real-time solutions
· Use project development tools such as JIRA, Confluence, and Git
· Build and maintain operations tools for monitoring, notifications, trending, and analysis
· Enhance programs to increase operating efficiency or adapt to new requirements
· Review code from analyst/developer team members as part of the quality assurance process
· Produce unit test plans with detailed expected results to fully exercise the code
· Work closely with the solution architect, business analyst, and technology lead to contribute to the final outcome
· Demonstrate continuous improvement in the adoption of engineering practices
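The "unit test plans with detailed expected results" responsibility can be made concrete with a small Python sketch; the date-parsing helper and its validation rules are hypothetical, not taken from an actual Capco codebase:

```python
# Sketch of a unit test plan with expected results spelled out next to
# each case. The helper below is a hypothetical example.
from datetime import date

def parse_business_date(text):
    """Parse a strict YYYY-MM-DD string into a date; reject anything else."""
    try:
        year, month, day = map(int, text.split("-"))
        return date(year, month, day)
    except (ValueError, AttributeError):
        raise ValueError(f"not a valid business date: {text!r}")

# Test plan: each case documents its input and its expected result.
assert parse_business_date("2024-02-29") == date(2024, 2, 29)  # leap day accepted
try:
    parse_business_date("29/02/2024")  # non-ISO format must be rejected
    raise AssertionError("expected ValueError for non-ISO input")
except ValueError:
    pass
print("all checks passed")
```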
To be successful in this role, you should meet the following requirements:
· Mandatory Skills:
o Senior Data Engineer with cloud (GCP) experience (8 to 12 years)
o Hands-on experience with Google Cloud BigQuery scripting or Cloud SQL (mandatory)
o SQL and PL/SQL scripting experience (expert-level, hands-on; mandatory)
o Shell scripting or Python skills (expert-level, hands-on; mandatory)
o Experience with at least one RDBMS
o Leadership experience
· Good to Have:
o GCP Data Engineer certification is an added advantage
o Experience with or knowledge of GCP components such as GCS, BigQuery, Airflow, Cloud SQL, and the Google Cloud SDK
o Knowledge of a scheduling or ETL tool such as Control-M, UC4/Automic, or Airflow (Cloud Composer)
o Awareness of the Juniper ingestion process is preferred
o Experience working within an agile, multidisciplinary DevOps team
o Experience with change, incident, and problem management
Ready to apply?
Apply to Capco
Role: GCP Lead
Job Summary:
We are seeking an experienced GCP Lead to spearhead cloud data architecture and delivery on Google Cloud Platform (GCP). In this leadership role, you will design, implement, and optimize scalable data pipelines using BigQuery, Pub/Sub, and ETL frameworks, while driving stakeholder alignment across engineering, product, and business teams. The ideal candidate thrives in fast-paced environments, mentoring teams on GCP best practices and ensuring seamless data-driven solutions for enterprise clients.
Key Responsibilities:
· Lead end-to-end GCP data platform implementations, focusing on BigQuery for analytics, Pub/Sub for real-time messaging, and ETL pipelines using Dataflow, Composer, or custom tools.
· Architect and optimize high-volume data ingestion, transformation, and querying workflows to support business intelligence and ML workloads.
· Collaborate with cross-functional stakeholders, including product owners, data scientists, and executives, to gather requirements, define roadmaps, and deliver value through clear communication and presentations.
· Mentor junior engineers on GCP services, conduct code reviews, and enforce best practices for security, cost optimization, and scalability.
· Troubleshoot complex issues in production environments, perform performance tuning, and integrate with hybrid/multi-cloud setups.
· Drive migration projects from on-premises or other clouds (e.g., AWS/Cloudera) to GCP, ensuring minimal downtime and data integrity.
Required Qualifications:
· 10-12 years of total experience in IT/cloud consulting, with at least 5 years hands-on with GCP (BigQuery, Pub/Sub, Dataflow/ETL).
· Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
· Proven track record leading GCP projects with measurable outcomes (e.g., reduced ETL latency by 40% or scaled Pub/Sub to millions of events/sec).
· Strong communication skills for stakeholder management, including executive presentations and agile ceremonies.
· GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect) preferred.
Preferred Skills:
· Experience with Dataproc, Vertex AI, or integration with tools like Snowflake/Databricks.
· Familiarity with CI/CD (Cloud Build), IAM, and monitoring (Cloud Monitoring/Logging).
· Knowledge of data governance, compliance (GDPR), and cost management in GCP.
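One recurring design point behind the Pub/Sub responsibilities above: Pub/Sub delivers messages at least once, so downstream consumers typically deduplicate on a message ID. A plain-Python sketch of that pattern follows; the message shape and in-memory storage are illustrative, and no real Pub/Sub client is used:

```python
# At-least-once delivery means a consumer may see the same message twice;
# a common pattern is to deduplicate on a message ID before processing.
# Pure-Python sketch: the message shape here is hypothetical.

class DedupConsumer:
    def __init__(self):
        self.seen_ids = set()   # in production: a TTL cache or external store
        self.processed = []

    def handle(self, message):
        msg_id = message["id"]
        if msg_id in self.seen_ids:
            return False        # duplicate delivery: acknowledge and skip
        self.seen_ids.add(msg_id)
        self.processed.append(message["payload"])
        return True

consumer = DedupConsumer()
deliveries = [
    {"id": "m1", "payload": "order-created"},
    {"id": "m2", "payload": "order-paid"},
    {"id": "m1", "payload": "order-created"},  # redelivery of m1
]
for msg in deliveries:
    consumer.handle(msg)
print(consumer.processed)  # each logical message handled exactly once
```

The same idea scales to real pipelines by keeping the seen-ID state in an external store with a TTL, since an unbounded in-memory set would grow forever.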
Ready to apply?
Apply to Capco
Job Title: Control Automation Engineer (IT)
Job Location: Pune
Build and run automation that improves the effectiveness, reliability, and auditability of IT controls. You’ll use Python/Unix scripting, CI/CD (Jenkins/Groovy), orchestration (Airflow/Composer), and ServiceNow APIs to automate control checks, evidence collection, monitoring, and remediation workflows—delivered in line with SDLC and controlled deployment (DEPL/ICE) expectations and within agreed SLA/OLA.
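To make the role concrete, an automated control check of the kind described might look like the sketch below. The control IDs, SLA window, and record shape are hypothetical; a real implementation would pull records through the ServiceNow API and feed breaches into a remediation workflow rather than operate on in-memory dicts:

```python
# Hypothetical automated control check: verify that each IT control has
# evidence newer than its SLA window, and flag breaches for remediation.
from datetime import datetime, timedelta

def check_evidence_freshness(records, now, sla_days=7):
    """Return the control IDs whose latest evidence is older than the SLA."""
    breaches = []
    for rec in records:
        age = now - rec["last_evidence"]
        if age > timedelta(days=sla_days):
            breaches.append(rec["control_id"])
    return breaches

now = datetime(2024, 5, 10)
records = [
    {"control_id": "AC-01", "last_evidence": datetime(2024, 5, 8)},   # fresh
    {"control_id": "AC-02", "last_evidence": datetime(2024, 4, 20)},  # stale
]
print(check_evidence_freshness(records, now))  # stale controls to escalate
```

A check like this would typically run on a schedule from Airflow/Composer or Jenkins, with breaches raised as ServiceNow incidents.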
Ready to apply?
Apply to Capco
Job Title: GCP Platform Engineer
Location: Pune
Experience: 12-15 years
We are seeking a motivated and talented GCP Platform Engineer to join our growing data platform team. In this role, you will be responsible for building, managing, and automating the Google Cloud Platform infrastructure that powers our large-scale data engineering projects. You will work closely with our data engineers and Platform lead to create a scalable, reliable, and secure environment for our data workloads.
Key Responsibilities:
• Design, build, and maintain scalable and reliable GCP infrastructure using Infrastructure as Code (IaC) principles, primarily with Terraform.
• Provision and manage core GCP services, including VPC, IAM, Google Cloud Storage (GCS), BigQuery, Compute, CloudSQL, Dataflow, Dataproc, and Composer.
• Develop and maintain CI/CD pipelines for infrastructure deployment and automation.
• Implement and manage monitoring, logging, and alerting solutions to ensure platform health and performance.
• Collaborate with data engineers to understand their infrastructure needs and provide support.
• Ensure the security and compliance of the GCP environment by implementing best practices and policies.
• Contribute to the continuous improvement of the platform by evaluating new technologies and automating processes.
• Work under the guidance of a platform lead in a collaborative team environment.
Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
• 9+ years of hands-on experience with Google Cloud Platform (GCP).
• Proficiency in writing Infrastructure as Code (IaC) using Terraform.
• Experience with scripting languages such as Python or Bash.
• Understanding of networking concepts (VPCs, subnets, firewalls).
• Familiarity with CI/CD concepts and tools (e.g., Google Cloud Build, Jenkins, GitLab CI).
• Strong problem-solving skills and a collaborative mindset.
• Excellent communication skills.
Preferred Qualifications:
• Experience with data engineering services on GCP (e.g., BigQuery, Compute, CloudSQL, Dataflow, Dataproc, Composer).
• GCP certification (e.g., Associate Cloud Engineer, Professional Cloud DevOps Engineer).
• Experience with monitoring and logging tools like Google Cloud's operations suite (formerly Stackdriver).
• Familiarity with Agile development methodologies.
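Platform teams of this kind often pair Terraform with lightweight policy checks in CI. As an illustrative sketch, the naming rule and required labels below are assumptions, and the input dicts mimic a simplified view of a parsed Terraform plan:

```python
# Hypothetical CI policy check for IaC: validate that planned resources
# follow a naming convention and carry required labels before apply.
import re

NAME_PATTERN = re.compile(r"^[a-z][a-z0-9-]{3,30}$")  # assumed convention
REQUIRED_LABELS = {"env", "owner"}                     # assumed policy

def validate_resources(resources):
    """Return a list of human-readable policy violations."""
    violations = []
    for res in resources:
        if not NAME_PATTERN.match(res["name"]):
            violations.append(f"{res['name']}: name violates convention")
        missing = REQUIRED_LABELS - set(res.get("labels", {}))
        if missing:
            violations.append(f"{res['name']}: missing labels {sorted(missing)}")
    return violations

planned = [
    {"name": "data-lake-bucket", "labels": {"env": "prod", "owner": "platform"}},
    {"name": "TempBucket!", "labels": {"env": "dev"}},  # violates both rules
]
print(validate_resources(planned))
```

In a pipeline, a non-empty violation list would fail the build before `terraform apply` runs, keeping policy enforcement ahead of provisioning.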
Ready to apply?
Apply to Capco
Job Title: GCP Platform Engineer
Location: Pune
Experience: 5-8 years
We are seeking a motivated and talented GCP Platform Engineer to join our growing data platform team. In this role, you will be responsible for building, managing, and automating the Google Cloud Platform infrastructure that powers our large-scale data engineering projects. You will work closely with our data engineers and Platform lead to create a scalable, reliable, and secure environment for our data workloads.
Key Responsibilities:
• Design, build, and maintain scalable and reliable GCP infrastructure using Infrastructure as Code (IaC) principles, primarily with Terraform.
• Provision and manage core GCP services, including VPC, IAM, Google Cloud Storage (GCS), BigQuery, Compute, CloudSQL, Dataflow, Dataproc, and Composer.
• Develop and maintain CI/CD pipelines for infrastructure deployment and automation.
• Implement and manage monitoring, logging, and alerting solutions to ensure platform health and performance.
• Collaborate with data engineers to understand their infrastructure needs and provide support.
• Ensure the security and compliance of the GCP environment by implementing best practices and policies.
• Contribute to the continuous improvement of the platform by evaluating new technologies and automating processes.
• Work under the guidance of a platform lead in a collaborative team environment.
Required Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
• 3-5 years of hands-on experience with Google Cloud Platform (GCP).
• Proficiency in writing Infrastructure as Code (IaC) using Terraform.
• Experience with scripting languages such as Python or Bash.
• Understanding of networking concepts (VPCs, subnets, firewalls).
• Familiarity with CI/CD concepts and tools (e.g., Google Cloud Build, Jenkins, GitLab CI).
• Strong problem-solving skills and a collaborative mindset.
• Excellent communication skills.
Preferred Qualifications:
• Experience with data engineering services on GCP (e.g., BigQuery, Compute, CloudSQL, Dataflow, Dataproc, Composer).
• GCP certification (e.g., Associate Cloud Engineer, Professional Cloud DevOps Engineer).
• Experience with monitoring and logging tools like Google Cloud's operations suite (formerly Stackdriver).
• Familiarity with Agile development methodologies.
Ready to apply?
Apply to Capco