Job description
Senior Data Engineer (Cloud, AWS, Azure, GCP)
(Salary: £50,000–£65,000. Remote-first, plus two days in London per month)
Our technology
Travtus is an AI R&D company based in London, developing AI solutions for the residential real estate industry. Adam, our digital employee for rental communities, delivers automation and front-line intelligence through natural conversation. Through Adam, we are changing the way real estate is operated, bringing automation to owners and operators and world-class customer service to renters.
About the Role
As a Senior Data Engineer in the team, you will help take our data platform to the next level. You will scope, design and implement the data pipelines, analytics, reporting and alerting that help our clients use our apps, including Adam, and benefit from the insights generated.
You will work daily with the data science and engineering teams, and you will have the opportunity to define the technical stack and drive the direction of the data platform.
Responsibilities
This is a hands-on role, working with the data and the product every day. You will:
- Understand business needs and define user requirements
- Architect complex technical solutions
- Research the latest technologies and explore how to apply them to fulfil product and client needs
- Perform technical design and architecture
- Design and implement data pipeline solutions
- Design and implement database/warehouse solutions
- Design and implement visualisation and reporting so that clients can access and analyse their data
- Design and implement a time-series alerting solution
- Work closely with the Data Science team to pipeline and leverage machine learning models
- Maintain documentation of solutions
- Define, execute and improve the release process
- Monitor and maintain the production application
Requirements
We are looking for someone who will be a great fit for the role and the team. If that's you, you will:
- Have commercial experience developing solutions with Python
- Have experience designing and implementing data pipelines (ideally using Kafka or Pika)
- Have hands-on cloud experience, ideally AWS, though GCP or Azure will be considered
- Have knowledge of serverless infrastructure (S3, Lambda, Step Functions, MSK, Athena)
- Demonstrate good database design and development skills
About the Team
Ours is a multi-disciplinary team of experts, with everyone contributing their own area of specialism: from infrastructure to knowledge graphs, real estate operations to dialogue design. By bringing all of that together, we are able to push the boundaries in this new area of technology and fundamentally challenge the way one of the largest industries in the world operates.
Benefits:
- Company pension
- Discounted or free food (Deliveroo allowance)
- Flexitime
- Private medical insurance
- Unlimited paid holidays
- Work from home

Salary: Up to £65,000.00 per year
Schedule:
- Flexitime
- Monday to Friday
Application question(s):
- How do you monitor/measure pipeline performance?
- Do you report on pipeline performance, and if so what's included in the report?
Experience:
- Python: 3 years (required)
Work Location: Hybrid remote in London EC2M 7PD