What You'll Do

As a Data Engineer at Brex, you will be a core contributor in transforming raw data into actionable insights for departments across the organization. You'll collaborate closely with Data Scientists, Software Engineers, and business units to create efficient data models, pipelines, and analytics frameworks that drive the business forward. You'll also play a leading role in the design, implementation, and maintenance of Core Data tables, our high-quality, curated data source for a wide range of analytic applications.

Where you'll work

This role will be based in our San Francisco office. We are a hybrid environment that combines the energy and connections of being in the office with the benefits and flexibility of working from home. We currently require a minimum of two coordinated days in the office per week, Wednesday and Thursday. Starting February 2, 2026, we will require three days per week in the office: Monday, Wednesday, and Thursday. As a perk, we also offer up to four weeks per year of fully remote work!

Responsibilities:
- Design, build, and maintain data models and pipelines that scale with the company's growing number of services, products, and organizational changes.
- Collaborate closely with Data Scientists, Data Analysts, and business teams to understand their data needs, translating them into robust, efficient, and scalable data solutions that support predictive analytics, data analysis, and metrics formulation.
- Maintain data documentation and definitions, and ensure that source-of-truth tables remain high quality for data science and reporting applications.
- Develop and enable integration with various data sources, allowing for more data-driven initiatives across the company.
- Apply best practices in data management to ensure the reliability and robustness of data utilized across various analytics applications.
- Set and promote company-wide standards for data structure, quality, and expectations.
- Act as a liaison between technical and non-technical teams, bridging gaps and ensuring that data solutions align with business objectives.

Requirements:
- 3+ years of experience in Data Engineering, Data Analytics, or a related field such as Analytics Engineering.
- 2+ years of experience working with modern data transformation tools such as dbt.
- Advanced knowledge of databases and SQL with the ability to efficiently stage, process, and transform data.
- Experience integrating and orchestrating data workflows with various modern data tools and systems.
- Experience with data modeling, ETL/ELT processes, and data warehousing solutions.
- Experience working with a data warehouse such as Snowflake.
- Experience with a data workflow orchestrator tool such as Airflow.
- Experience with a programming language such as Python.
- Familiarity with a BI tool such as Looker, Tableau, or a similar platform is a plus.
- Exceptional quantitative and analytical skills.
- Strong communication skills and ability to collaborate with various stakeholders, both technical and non-technical.