All active dbt roles based in Utah.
This position does not offer visa sponsorship now or in the future.
We’re looking for an Analytics & Business Intelligence AVP to measurably drive growth for our Operations and Investment Management (OIPM) group at iCapital.
This role involves defining and calculating operational KPIs, transforming raw source data into processed business data, creating live dashboards to monitor performance, identifying driver metrics and statistical correlations, and finally working with business teams to enable data-driven change.
You will sit in the Analytics group of the broader Data & Analytics team within Technology, and will work closely with our Data Engineering group. Although you are initially expected to be an individual contributor, the role would eventually entail managing a team of 1-3 direct reports.
On the technical side, you will interface with our data stack of Airbyte, Snowflake, dbt, Prefect, Python and Tableau (among others). On the business side, you will work with our operations teams to produce management visibility, efficiency improvements and workflow automation.
Ideal candidates are highly proficient in our technical tooling, understand complex business problems, interface with senior stakeholders, develop execution plans, and present their work.
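The KPI workflow described above can be sketched in a few lines. This is purely an illustrative example: the record shape, field names, and SLA threshold are hypothetical, not iCapital's actual data.

```python
from datetime import datetime
from statistics import median

# Hypothetical raw operational records, as they might land from a source system.
raw_tickets = [
    {"id": "T1", "opened": "2024-01-02T09:00", "closed": "2024-01-02T11:30"},
    {"id": "T2", "opened": "2024-01-02T10:00", "closed": "2024-01-03T10:00"},
    {"id": "T3", "opened": "2024-01-03T08:00", "closed": None},  # still open
]

def turnaround_hours(ticket):
    """Hours from open to close; None for tickets still in flight."""
    if ticket["closed"] is None:
        return None
    opened = datetime.fromisoformat(ticket["opened"])
    closed = datetime.fromisoformat(ticket["closed"])
    return (closed - opened).total_seconds() / 3600

def operational_kpis(tickets, sla_hours=24):
    """Roll raw records up into the kind of KPIs a live dashboard would show."""
    durations = [h for t in tickets if (h := turnaround_hours(t)) is not None]
    return {
        "closed": len(durations),
        "open": len(tickets) - len(durations),
        "median_turnaround_h": median(durations),
        "pct_within_sla": 100 * sum(h <= sla_hours for h in durations) / len(durations),
    }

print(operational_kpis(raw_tickets))
```

In practice the transformation layer (dbt over Snowflake) would compute these rollups in SQL and a dashboard (Tableau) would read the result; the Python above just shows the shape of the metric definitions.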
Responsibilities:
Required Qualifications:
Preferred Qualifications:
Benefits
The base salary range for this role is $100,000 to $135,000 depending on level and experience. iCapital offers a compensation package which includes salary, equity for all full-time employees, and an annual performance bonus. Employees also receive a comprehensive benefits package that includes an employer matched retirement plan, generously subsidized healthcare with 100% employer paid dental, vision, telemedicine, and virtual mental health counseling, parental leave, and unlimited paid time off (PTO).
We believe the best ideas and innovation happen when we are together. Employees in this role will work in the office Monday-Thursday, with the flexibility to work remotely on Friday.
For additional information on iCapital, please visit https://www.icapitalnetwork.com/about-us Twitter: @icapitalnetwork | LinkedIn: https://www.linkedin.com/company/icapital-network-inc | Awards Disclaimer: https://www.icapitalnetwork.com/about-us/recognition/
iCapital is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender, sexual orientation, gender identity, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
This position does not offer visa sponsorship now or in the future.
iCapital is seeking an exceptional Data Engineer to help scale the data foundations that power the business. This individual will design, build, and optimize the data pipelines and infrastructure that enable data to be a strategic asset across the organization. This role is highly collaborative and will partner with the Product, Engineering, and Analytics teams and business stakeholders to quickly understand evolving challenges and deliver scalable, maintainable, and high‑quality solutions.
Data sits at the core of iCapital’s operations. The firm relies on clean, reliable, well‑structured data to drive decision‑making, enhance client experiences, and support a rapidly growing product suite. This individual will be responsible for moving and transforming data, developing a deep understanding of the business context behind it, and helping teams act on insights.
Responsibilities
Qualifications
Benefits
The base salary range for this role is $100,000 to $135,000. iCapital offers a compensation package which includes salary, equity for all full-time employees, and an annual performance bonus. Employees also receive a comprehensive benefits package that includes an employer matched retirement plan, generously subsidized healthcare with 100% employer paid dental, vision, telemedicine, and virtual mental health counseling, parental leave, and unlimited paid time off (PTO).
Flex is a growth-stage, NYC-headquartered FinTech company that is creating the best rent payment experience. It’s hard to believe that it’s 2026 and paying rent on time is expensive, inflexible, and difficult. We’re here to change that! Flex enables our users to pay rent throughout the month on a schedule that better fits their finances and budget. Our mission is to empower as many renters as possible with flexibility over their most significant recurring expense. After deliberately keeping a stealth profile as we built up unprecedented investor support and an enthusiastic user base, we are looking for motivated individuals to help us keep our mission growing. Will you be a part of the team?
The Role
Analytics Engineering at Flex sits at the intersection of Data, Product, Engineering, and business objectives. As Director, Analytics Engineering you will run the team that enables stakeholders at Flex to easily find the data they need to make decisions. You’ll be responsible for developing a world-class analytics platform that powers all data consumption at Flex, as well as managing and developing a team of high-performing Analytics Engineers. You'll also be hands-on, leveraging your technical skills to develop data models and build better data products and solutions. You'll partner with analysts, engineers, PMs and others to define the requirements while being accountable for delivering the end data product.
You Will:
You Have:
Flex takes a market-based approach to pay, and compensation may vary depending on your primary work location. Work locations are categorized into one of three tiers based on a cost of labor index for that geographic area.
Tier A (NYC/SF/Seattle): $240,000-290,000
Tier B (Austin, Washington D.C., Philadelphia, Los Angeles, San Diego, Chicago, Miami, Atlanta): $215,000-265,000
Tier C (SLC/All Other States): $200,000-250,000
Offices
Roles posted in New York, San Francisco, or Salt Lake City are hybrid positions with on-site expectations of 2-3 days per week in our local offices. For candidates outside of these areas, you may be eligible for our relocation assistance program.
We understand that it takes a diverse team of highly intelligent, curious, determined, empathetic, and self-aware people to grow a successful company. Our HQ is located in New York City, but we have employees located throughout the US, Australia, Canada and South America. We are growing quickly, but deliberately, with a focus on building an inclusive culture. Our dynamic team has incredible perspectives to share, just as we know you do, and we take great pride in being an equal opportunity workplace.
Benefits
For full-time U.S. employees we offer:
For full-time non-U.S. employees, we offer:
NetDocuments is committed to providing an excellent candidate experience and will never ask you to engage in recruitment activity without phone, video, and in-person meetings, or outside of communications from emails using the @netdocuments.com domain. If you have any concerns or questions about communications you have received, please send them to hrgroup@netdocuments.com so our team members can review.
NetDocuments is the world’s #1 trusted cloud-based content management and productivity platform that helps legal professionals do their best work. We strive to win together through passionate hard work, exploring new things and recognizing every interaction matters.
NetDocuments provides rewarding career growth in an inclusive, diverse environment where employees are encouraged to openly contribute creative ideas and innovation, backed by supportive peers and leadership working together to achieve our goals as a unified team.
At our core, we are dedicated to empowering our employees to drive successful business outcomes and better user experiences for our customers and partners. Our customer-centric approach and employee enablement have allowed us to enjoy many accolades, including being named among the 2022, 2023, & 2024 list of Inc. Magazine’s 5000 Fastest-Growing Private Companies in America.
Other recent awards include:
NetDocuments is a hybrid, remote-friendly workplace. Come join our team and work inspired each day!
About the role
We are seeking a highly skilled Senior BI Engineer to take ownership of our Snowflake data platform and help evolve it into a scalable, well-governed foundation for analytics and AI. This is a hands-on engineering role responsible for shaping data architecture, enforcing standards, and building a clean, reliable data layer that supports both business intelligence and AI-driven use cases. You will report directly to the Director of BI and have significant autonomy in how the platform evolves.
What You’ll Do
Own and Elevate the Data Platform
Architect for Scale and Clarity
Build an AI-Ready Data Layer
Establish a Real Data Engineering SDLC
Leverage AI as a Force Multiplier
Collaborate and Lead Technically
Documentation & Transparency
What You’ll Need to be Successful
Data Engineering Depth & Ownership
AI-First Mindset
What Will Make You Stand Out
What You’ll Love About NetDocuments
Compensation Transparency
The compensation range for this position is: $155,000-$165,000
The posted cash compensation for this position includes on-target earnings: base salary plus variable pay, if applicable. Some roles may qualify for overtime pay. Individual compensation packages are determined based on various factors specific to each candidate, such as career level, skills, experience, geographic location, qualifications, and other job-related factors.
Equal Opportunity
NetDocuments is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind. All employment decisions are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex (including pregnancy), national origin, age, physical and mental disability, marital status, sexual orientation, gender identity and/or expression, military and veteran status, or any other status protected by laws or regulations in the locations where we operate. NetDocuments believes diversity and inclusion among our employees is critical to our success, and we are committed to providing a work environment free of discrimination and harassment.
We Are Route
When shoppers hit "buy," a great customer experience doesn’t end, it’s just getting started. For too long, everything after checkout has been the weakest link in ecommerce: confusing tracking, lost or damaged packages, clunky returns, and missed opportunities to turn first-time buyers into lifelong fans. That's why we built Route.
Our mission is simple: we create shopper confidence that fuels brand growth. Route is the leading post-purchase platform for modern ecommerce, trusted by thousands of brands and protecting more than $20 billion in gross merchandise value to date. From package protection and industry-leading order tracking to returns and exchanges, cash back loyalty, and engaging product recommendations, Route brings every moment after checkout into one powerful platform, empowering shoppers with visibility and peace of mind while giving merchants loyalty that lasts well beyond the first sale.
Since launching in 2018, Route has raised over $250 million from leading investors including Craft Ventures, Hedosophia, and Hanaco Ventures, and has grown into a complete post-purchase ecosystem loved by millions of shoppers and the brands they buy from. Throughout that growth, we've stayed committed to building innovative products that empower our customers and to fostering a people-first, values-driven culture that makes Route a place where great work and great people thrive.
We're looking for talented people across the ecommerce space to join us on the next chapter of this adventure. Don't just take our word for it: discover what life at Route has to offer.
The team
The Data Engineering team (DAENGR) is the backbone of Route's data ecosystem. We are primarily responsible for the data infrastructure, quality, standards, frameworks, and architecture that power enterprise data, reporting, and analytics across the entire organization. We solve problems of poor data quality and limited accessibility, simplify reporting for teams outside of data engineering, and monitor the uptime, security, and consistency of the majority of Route's data lifecycle. We collaborate closely with every corner of the business to make data a first-class citizen at Route.
The opportunity
This is a rare chance to shape the data destiny of a young, fast-growing company at an inflection point. As Route moves into the AI era of data infrastructure, you'll have the opportunity to leave a lasting legacy — exploring new systems and designs, and building tools that genuinely change how Route operates day to day. We are actively migrating from legacy Snowflake pipelines into a modern Databricks-first architecture, building out an Enterprise Data Warehouse (EDW) with a normalized 3NF core, and laying the groundwork for AI-ready data systems. You'll be co-authoring the next chapter of data at Route.
The ideal candidate is organized and articulate in their thinking, able to adapt and compromise to keep pace with business needs, but firm enough to push back when security and data integrity are at stake. We aim to help Route improve profitability, cash flow, and return on investment by providing accurate data to make informed decisions and process improvements.
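To make the "normalized 3NF core" idea concrete, here is a minimal sketch of normalizing a flat feed so each fact is stored exactly once. The table and column names are hypothetical, not Route's actual schema.

```python
# Illustrative sketch of the 3NF idea behind an EDW core: a denormalized
# order feed is split so each fact lives in exactly one place. Names here
# are invented for the example.

raw_orders = [
    {"order_id": 1, "merchant_id": "m1", "merchant_name": "Acme", "amount": 40.0},
    {"order_id": 2, "merchant_id": "m1", "merchant_name": "Acme", "amount": 15.5},
    {"order_id": 3, "merchant_id": "m2", "merchant_name": "Bolt", "amount": 99.0},
]

def normalize(rows):
    """Split a flat feed into a merchant entity table and an order table
    that references merchants by key only (no repeated merchant_name)."""
    merchants = {}  # merchant_id -> attributes, stored exactly once
    orders = []
    for r in rows:
        merchants[r["merchant_id"]] = {"merchant_name": r["merchant_name"]}
        orders.append({"order_id": r["order_id"],
                       "merchant_id": r["merchant_id"],
                       "amount": r["amount"]})
    return merchants, orders

merchants, orders = normalize(raw_orders)
print(merchants)  # each merchant appears exactly once
print(orders)
```

The payoff of the normalized core is that an attribute change (say, a merchant renaming itself) is a one-row update rather than a scan-and-rewrite of every order.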
What you’ll do
What we’re looking for
Equal opportunity for all
Route is an Equal Opportunity Employer. We embrace diversity and equal opportunity in a serious way. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better our work will be.
Total Rewards
We know our team works best when everyone feels happy, healthy, and supported. We pay 95%-100% of health insurance premiums for you and your family, and offer remote or hybrid work arrangements, unlimited PTO, 401k matching, formalized growth opportunities, learning & development, DEI programs & events, and so much more.
Pay Transparency
Salary for this role: $138,000 - $146,000
The cash compensation above includes base salary, and is not reflective of potential commission for employees in eligible roles, or annual bonus targets under Route’s bonus plan for eligible roles. In addition to cash compensation, all Route employees are eligible to participate in Route’s equity incentive plan and receive stock options per the terms of the agreement. Some roles may also be eligible for overtime pay. Individual compensation packages are based on a few different factors unique to each candidate, including their career level, skills, experience, specific geographic location, qualifications, and other job-related reasons.
ABOUT LVT
LVT is redefining how businesses operate in the physical world, moving beyond traditional security solutions to deliver AI-driven, actionable intelligence that makes sites smarter, safer, and more secure. Since pioneering our first mobile, solar-powered units, our commitment to scrappy, hands-on innovation has made us an established leader and one of the fastest-growing companies in intelligent site technology. We are building the next generation of solutions—from our physical units in the field to a powerful Agentic AI platform—that allows our customers to gain unprecedented visibility and control over safety, compliance, and operations. This is your chance to join a cutting-edge team that isn't just watching the world change, but actively building the technology that is changing it.
We’re a team that’s focused on growth and innovation, and we’re proud that our crew, products, and leadership are being recognized for it.
As Data Engineer, AI, you will play a key role in building and advancing LVT’s data platform — contributing across the full data engineering lifecycle, from ingestion and transformation to semantic modeling and delivery — while also helping build the data infrastructure that powers LVT’s AI initiatives, including RAG pipelines and Snowflake Cortex models. This is a core engineering role with an AI edge: you’ll keep the data platform running with precision and reliability, and you’ll be the person who makes sure our AI systems have the clean, well-structured data they need to perform.
LVT is a flexible-first company. This role can be performed remotely, with a preference for candidates who can work from our American Fork, UT office.
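As a rough illustration of the retrieval step in a RAG pipeline like the ones this role describes: in production the embeddings would come from a model (for example via Snowflake Cortex), but toy 3-dimensional vectors keep the mechanics visible. Document names and vectors below are invented.

```python
import math

# Minimal sketch of RAG retrieval: rank documents by cosine similarity
# between their embedding and the query embedding, return the top k.
corpus = {
    "doc_sites":  [0.9, 0.1, 0.0],
    "doc_alerts": [0.1, 0.9, 0.0],
    "doc_solar":  [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, k=2):
    """Return the k corpus documents most similar to the query embedding."""
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, corpus[d]), reverse=True)
    return ranked[:k]

# A query embedding close to "doc_alerts" ranks it first.
print(retrieve([0.2, 0.8, 0.1]))  # ['doc_alerts', 'doc_sites']
```

The data engineering substance of the role is everything upstream of this snippet: making sure the documents being embedded are clean, current, and well structured before they ever reach the vector store.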
BENEFITS
We believe you do your best work when your whole life is supported. We invest in our crew’s health, families, and financial futures with a benefits package designed to support you inside and outside the office. Full-time benefits include, but are not limited to: comprehensive health, dental, and vision coverage, retirement benefits (401k match up to 4%), and flexible PTO.
LVT IS PROUD TO BE AN EQUAL OPPORTUNITY EMPLOYER. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. All candidates must pass a drug screening and background check upon employment. Some roles may also require passing a federal background check and fingerprinting. Must be authorized to work in the U.S. If reasonable accommodation is needed to participate in the job application or interview process, and/or to perform essential job functions, please reach out to your recruiter.
About the Role
We’re looking for a Senior Data Engineer to design, build, and optimize our data platforms so teams across the company can make fast, reliable, data-driven decisions. You’ll be a key technical leader, owning end-to-end data pipelines and modeling, and setting best practices around how we work with data.
You’ll work heavily with Databricks, Spark, SQL, and Python, building scalable data solutions that power analytics, reporting, and data products. Experience in payments or financial services is a strong plus.
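A common task in this kind of pipeline work is deduplicating an append-only event feed down to the latest version of each record. In Databricks this would typically be a Spark window function over a Delta table; the plain-Python sketch below shows the same logic on hypothetical toy data.

```python
# Hypothetical append-only payments feed: each update arrives as a new
# versioned row rather than mutating the original.
events = [
    {"txn_id": "a", "version": 1, "status": "pending"},
    {"txn_id": "a", "version": 2, "status": "settled"},
    {"txn_id": "b", "version": 1, "status": "pending"},
]

def latest_per_key(rows, key="txn_id", order="version"):
    """Keep only the highest-version row per key, the plain-Python
    equivalent of ROW_NUMBER() OVER (PARTITION BY key ORDER BY order DESC)."""
    best = {}
    for r in rows:
        k = r[key]
        if k not in best or r[order] > best[k][order]:
            best[k] = r
    return list(best.values())

current = latest_per_key(events)
print(current)  # transaction "a" is settled; "b" is still pending
```

The same pattern underpins current-state tables that analytics and reporting consume downstream.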
What You’ll Do
Required Qualifications
Nice to Have
What You’ll Bring
What We Offer
Role Information:
Applicants must be authorized to work for any employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.
Ever wondered how your favorite local shops compete with the big guys? That’s where we come in. We’re Quilt Software, providing Main Street's unsung heroes – from quirky cheese shops to family-run jewelry stores – with the tools they need to compete. Last year, we helped 14,000+ shops make over $2 billion in sales with our family of industry-specific software solutions.
If you get a kick out of supporting local businesses, love great software, and want to be part of a company that’s powering Main Street, we’d love to chat. Come join us in our quest to keep local retail not just alive, but thriving!
Notice - Employment Scams
Communication from our team regarding job opportunities will only be made by a Quilt Software employee with an @quiltsoftware.com email address. We do not conduct interviews over email or chat platforms, and we will never ask you to provide personal or financial information such as your mailing address, social security number, credit card numbers, or banking information. If you believe a scammer is contacting you, please mark the communication as "phishing" or “spam” and do not respond.
ABOUT LVT
ABOUT THIS ROLE
As the Sr. Manager of Data Engineering and Architecture, you will be a hands-on leader responsible for defining and executing the data engineering strategy, architecture, and technology stack. You will be responsible for the foundational data infrastructure that powers analytics and decision-making across the organization.
A critical part of this role will be building, mentoring, and managing a team of 3 data engineers. You will guide the team's efforts while also contributing directly to the development of our modern data warehouse in Snowflake using dbt and SQL, transforming raw data into reliable and accessible datasets. Crucially, you will spearhead the data migration efforts for our ongoing Oracle Fusion Cloud deployment, designing a robust, cohesive data ecosystem that bridges our new Oracle environment with Snowflake.
Your leadership will ensure the successful design, build, and maintenance of robust data pipelines that ingest data from a variety of internal and external sources into Snowflake. Your team's work will support reporting, dashboarding, and analysis across the company, enabling teams to make informed decisions based on trusted data. Given the green-field nature of this initiative, the role is expected to be approximately 30% technical leadership, strategy, and people management, and 70% direct, hands-on engineering and architecture.
This position is based in a hybrid work environment and requires regular in-office collaboration. It offers an opportunity to lead with modern data tools, build a high-performing team, and make a direct, strategic impact on data quality and accessibility during a massive phase of enterprise scaling.
ROLE RESPONSIBILITIES
Data Strategy & Architecture: Define the long-term vision, strategy, and architecture for the company’s data platform. Design a cohesive hybrid architecture that maximizes the strengths of our full tech stack, ensuring it drives measurable business value and scales efficiently to support hyper-growth.
Team Leadership & Management: Build, mentor, and manage a team of 3 data engineers, fostering a culture of technical excellence, accountability, and continuous improvement.
Data Modeling & Transformation: Lead the team in building and maintaining robust data models using dbt and SQL that support complex analytics and reporting needs. Contribute directly as an individual contributor as needed.
Snowflake Development: Oversee the design and optimization of the Snowflake data warehouse to ensure performance, scalability, and usability. Participate directly in key development efforts.
Cross-Functional Collaboration: Act as the primary technical partner to analysts, business stakeholders, and data teams to deeply understand requirements and translate them into strategic engineering solutions and delivery plans.
Performance Tuning: Guide the optimization of SQL queries and data transformations to improve execution speed and resource efficiency across the platform.
Tooling & Automation: Identify, evaluate, and implement opportunities to automate data workflows, improve pipeline reliability, and establish a formal DataOps/MLOps framework using modern orchestration tools (e.g., Airflow, Prefect, cloud-native serverless functions).
BI Tool Support: Ensure the team provides clean, well-structured data models to enable effective use of BI tools like Looker, Sigma, Tableau, or similar platforms.
Pipeline Engineering: Direct the development and maintenance of scalable data ingestion pipelines that pull data from APIs and other sources into Snowflake, including exploring solutions for near real-time data feeds.
Data Governance & Quality: Champion best practices in data governance, data lifecycle management, and dimensional modeling. Implement data validation checks, documentation standards, and lineage tracking to maintain high data integrity across all of our systems.
Oracle Fusion Cloud Migration & Integration: Lead the complex data migration strategy for our active Oracle deployment. Architect, build, and maintain secure, high-performing data flows and syncs between Oracle and Snowflake to ensure operational continuity and analytical excellence.
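One building block of the pipeline and migration work listed above is watermark-based incremental extraction. The sketch below uses in-memory lists standing in for the Oracle and Snowflake connectors, and the updated_at column is a hypothetical example, not the actual schema.

```python
# Watermark pattern for incremental sync: pull only rows changed since the
# last successful run, then advance the watermark. ISO-8601 timestamps
# compare correctly as strings, which keeps the sketch dependency-free.
source_rows = [
    {"id": 1, "updated_at": "2024-05-01T00:00"},
    {"id": 2, "updated_at": "2024-05-02T00:00"},
    {"id": 3, "updated_at": "2024-05-03T00:00"},
]

def incremental_extract(rows, watermark):
    """Return rows newer than the watermark, plus the new watermark
    (the max updated_at seen, or the old value if nothing changed)."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

batch, wm = incremental_extract(source_rows, "2024-05-01T12:00")
print(len(batch), wm)  # 2 rows newer than the watermark
```

Persisting the watermark only after the load commits is what makes the sync safely re-runnable; an orchestration tool such as Airflow or Prefect would own that state between runs.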
OUR IDEAL CANDIDATE
Experience: 8+ years in data engineering or related roles, with a strong focus on data modeling and pipeline development.
Education: Bachelor’s degree in Computer Science, Engineering, Data Analytics, or a related field.
Technical Skills:
Proven experience leading complex data migration or implementation projects.
Advanced proficiency in SQL and experience with data modeling (e.g., star/snowflake schemas).
Hands-on experience with dbt for building modular and testable data transformations.
Experience developing data pipelines using Python and workflow orchestration tools (e.g., Airflow, Prefect).
Deep understanding of Snowflake.
Demonstrated ability to design and implement a modern data architecture from scratch.
BI & Analytics Tools:
Familiarity with BI platforms such as Looker, Tableau, or Sigma is helpful.
Infrastructure & Governance:
Understanding of ELT/ETL workflows, data governance, and monitoring practices.
Experience defining and enforcing organizational standards for data quality, metadata management, and cost optimization within a cloud data warehouse (Snowflake).
Communication & Problem Solving:
Ability to clearly explain technical details to both technical and non-technical stakeholders.
Strong analytical and debugging skills; attention to detail in code and data quality.