Careers at 7bridges

Senior/Lead Data Engineer wanted

Help shape our software platform as we deliver AI-powered supply chains to our customers.

Control Tower

Mission

To create and manage a robust, coherent, high-quality, high-fidelity data asset spanning the data 7bridges receives and processes, and to make it easily and performantly accessible to the business domains that need it (product, data science, customer success and marketing).


Basics

  • UK-based, ideally in the London area, to work from our Soho office 2–3 days per week.
  • Permanent employment basis
  • Benefits include private health, life insurance, electric car lease scheme and more.

Outcomes

  • Build a robust data services platform and make it available to product, engineering, data science and customer success in ways appropriate to each domain. These services should be maintainable by data engineers and domain-specific analysts; their upkeep and operation should not adversely affect any other services or products, and they should meet performance targets, defined per domain, 99% of the time.
  • Costs for the data services platform are within budget and transparent to company leadership
  • Rates and transit-time estimates provided should be accurate to within 95% of the actual figure
  • Data from Logistics World should cover 90% of the logistics decisions our customers want to make (covering 90% of their logistics network)
  • It should take a domain analyst no more than 2 hours to debug a data pipeline in their field
  • The whole company should be able to understand what pipelines are available and how they can be accessed

Required Competencies

NOTE: All applicants should be legal residents of the United Kingdom.

Technical skills

  • Experience designing scalable data platforms (e.g. Snowflake, BigQuery, Databricks) - particularly BigQuery
  • Built robust ETL/ELT pipelines using tools like Airflow, Dagster, or dbt
  • Strong in Python and SQL, with sound data modeling skills (star/snowflake schema, etc.)
  • Experience with schema evolution, partitioning, and incremental loads
  • Experience using version control tools such as GitHub or GitLab.
  • Good understanding of dimensional data modelling (Kimball star schema).
  • Good understanding of Normal Forms (1NF -> 2NF -> 3NF) and how that impacts dimensional data modeling.

Data Harmonisation

  • Experience ingesting diverse document/data types: PDF, CSV, EDI, XML, JSON
  • Experience with OCR tools (Tesseract, Textract, etc.) and document parsing strategies
  • Has worked with supply chain document types (air waybills, POs, commercial invoices)
  • Able to define canonical data models and harmonization logic
  • Understands validation, lineage, and observability in data pipelines

Other technical skills

  • Building and maintaining CI/CD pipelines.
  • Understanding of stored procedures.
  • Running dbt in Kubernetes.
  • Experience using tools such as Pub/Sub, Datastream, Argo Workflows, etc.
  • Ability to write Terraform.
  • Understanding of how SQL structure affects cost on a given query engine: knows the difference between row-oriented and columnar storage, and can use that knowledge to design performant queries, formulate partitioning strategies, and structure data to optimise the storage layer for the engine in use.
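To illustrate the last point (this sketch is not part of the role requirements, and the table shape, column names and sizes are purely hypothetical): on-demand columnar engines such as BigQuery bill by bytes scanned in the referenced columns, so selecting fewer columns and pruning partitions directly reduces cost. A minimal back-of-envelope model:

```python
# Back-of-envelope model of bytes scanned by a columnar query engine.
# Assumes uniform column widths and evenly sized partitions -- a simplification.

def bytes_scanned(total_rows, col_bytes, selected_cols, partitions_total, partitions_hit):
    """Estimate bytes scanned: a columnar engine reads only the selected
    columns, and partition pruning skips partitions outside the filter."""
    row_fraction = partitions_hit / partitions_total
    return int(total_rows * row_fraction * sum(col_bytes[c] for c in selected_cols))

# Hypothetical shipments table: per-row bytes for each column.
cols = {"shipment_id": 8, "carrier": 16, "cost": 8, "raw_payload": 2048}

# SELECT carrier, cost over 1 day of a date-partitioned year (365 partitions):
narrow = bytes_scanned(365_000_000, cols, ["carrier", "cost"], 365, 1)

# SELECT * with no partition filter -- every column, every partition:
full = bytes_scanned(365_000_000, cols, list(cols), 365, 365)

print(narrow, full)  # the narrow, pruned query scans orders of magnitude less
```

The same model explains why a row-oriented engine prices differently: it must read whole rows regardless of the column list, so only the partition filter helps.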

Technical Leadership & Team Building

  • Has experience leading small engineering or data teams in high-growth environments
  • Drives adoption of CI/CD, testing, code reviews, and data engineering best practices
  • Good collaborator with product, backend, and analytics teams
  • Experience making build-vs-buy decisions (e.g., data catalog, OCR vendor, etc.)

Execution & Delivery

  • Can deliver quickly and iterate on MVP solutions
  • Writes clear documentation and plans
  • Capable of owning a data platform roadmap from 0 → 1
  • Experience balancing tech debt with shipping value

Culture & Fit

  • Comfortable in fast-paced, ambiguous environments
  • Ownership mindset, strong communicator, works autonomously
  • Collaborative and values-aligned with the mission
  • Passionate about solving real-world logistics/supply chain challenges

BI tools

  • Experience with BI tools such as Tableau, Apache Superset (preferred), Metabase, Power BI, etc.