Analytics (Data Engineering) - Principal Engineer
JP Morgan
Location: Greater London
Job Type: Full time
We are not your standard bank, we are an innovation hub. We are part of the team that launched Chase as a new UK bank in 2021, and are now starting on a new initiative with big ambitions. More details will be provided during the interview but for now we can enthusiastically say: It’s challenging, it’s high stakes, it’s fun!
Our team is at the heart of building this new venture. We have created a new organisation with diverse professionals that come from a wide set of skills, backgrounds and experiences, including top tech giants, blockchain innovators and FinTech unicorns.
Team members have autonomy to collaborate and work in a cross-functional way. We value your opinion in every matter. Our inclusive culture is hugely important to us, and we are looking for intellectually curious, compassionate people who would like to expand their skills whilst working on an exciting new venture for the firm. We are about bringing new products to market that solve real-world problems for real-life customers, not just innovation for the sake of innovation. We are human first; we love working with each other, and we need personal connections and in-person bonding. We find we are happier, more motivated and more productive this way. If this sounds like something you would like to be a part of, we'd encourage you to read on and apply.
As an engineer on the team, you will:
- Structure software so that it is easy to understand, test and evolve.
- Build solutions that avoid single points of failure, using scalable architectural patterns.
- Develop secure code so that our customers and ourselves are protected from malicious actors.
- Promptly investigate and fix issues and ensure they do not resurface in the future.
- Ensure releases are fully automated.
- See that our data is written and read in a way that's optimized for our needs.
- Keep an eye on performance, making sure we use the right approach to identify and solve problems.
- Ensure our systems are reliable and easy to operate.
- Keep us current by continuously evolving our technologies and patterns.
This is a hands-on role for an Analytics (Data Engineering) Lead who wants to be part of a flat-structure organisation and influence the design & development of a green-field initiative.
Your responsibilities will include:
- Team leadership: Lead and manage a team of data engineers to design, develop, and implement data analytics platforms, pipelines, and infrastructure for our multi-cloud product across GCP and AWS.
- Architecture and design: Design and develop scalable, cost-effective, and secure data architectures and solutions, utilizing appropriate GCP and AWS services and technologies.
- Data pipeline development: Design, implement, and maintain data pipelines that efficiently collect, process, and store large volumes of data from various sources, ensuring data quality and integrity.
- Data analytics and insights: Collaborate with data analysts, data scientists, and business stakeholders to develop analytics solutions that deliver valuable insights to support data-driven decision-making.
- Real-time and batch querying: Develop the ability to query both real-time and batch data with ease and speed.
- Technology evaluation: Evaluate and recommend appropriate data storage and processing technologies, such as Bigtable, BigQuery, and other GCP and AWS services.
- Performance optimization: Continuously monitor, analyze, and optimize data pipelines and analytics solutions to improve performance, reduce costs, and ensure reliability and scalability.
- Security and compliance: Ensure that data solutions comply with relevant data security and privacy regulations, and implement best practices for securing data at rest and in transit.
- Technical leadership: Provide technical guidance and mentorship to the data engineering and analytics team, ensuring adherence to best practices, and setting the technical direction for projects.
- Stay current with industry trends: Continuously research and stay up-to-date with emerging technologies, tools, and best practices in the data engineering and analytics space, incorporating new knowledge into projects and sharing insights with the team.
We’d like you to have extensive experience in a similar role, along with a few key technical skills:
- Data engineering skills: Proficiency in designing, building, and optimizing data pipelines, as well as experience with big data processing tools like Apache Spark, Hadoop, and Dataflow.
- Cloud platforms expertise: Deep understanding of GCP and AWS services, architectures, and best practices, with experience in designing and implementing scalable and cost-effective solutions. Relevant certifications (e.g., AWS Certified Solutions Architect, Google Cloud Professional Data Engineer) are a plus, as is hands-on knowledge of Bigtable and BigQuery.
- Data storage and databases: Knowledge of various data storage options (e.g., relational databases, NoSQL, data lakes) and hands-on experience with managing and optimizing databases such as PostgreSQL, MySQL, BigQuery, and Redshift.
- Data integration: Familiarity with data integration tools and techniques, including ETL (Extract, Transform, Load) processes and real-time data streaming (e.g., using Apache Kafka, Kinesis, or Pub/Sub).
- Data analytics: Strong understanding of data analytics, data visualisation, and business intelligence tools (e.g., Tableau, Looker, Power BI) to help organisations derive insights from their data. Experience with lambda architectures is beneficial.
- Data modelling and warehousing: In-depth understanding of data modelling, ETL processes, and data warehousing concepts.
- Programming languages: Proficiency in languages commonly used in data engineering and analytics, such as Python, R, Java, or Scala.
- Problem solving: Strong problem-solving and critical-thinking skills, with the ability to break down complex problems and develop innovative solutions.
- Leadership and communication: Ability to lead a team of engineers, collaborate effectively with cross-functional teams, and communicate complex technical concepts to both technical and non-technical stakeholders.
We’d also like you to have a few key personal skills. Every organisation has its own culture, and we want you to enhance ours.
- A desire to teach others and share knowledge. We aren't looking for hero developers, more for team players. We want you to coach other team members on coding practices, design principles, and implementation patterns.
- Comfortable in uncharted waters. We are building something new. Things change quickly. We need you to learn technologies and patterns quickly.
- Ability to see the long term. We don't want you to sacrifice the future for the present. We want you to choose technologies and approaches based on the end goals.
- High standards. We are looking for people who expect personal performance and team performance to be nothing short of the best.
- Clarity of thought. We operate quickly and efficiently, and we value people who are economical with their time and clear with their opinions.
- Comfortable working within a geographically distributed team.
Our approach to technology:
- We primarily use JVM-based languages (Java/Kotlin), but we also have parts of the platform that use other languages as needed (Go, Python, etc.).
- We look to use open source/SaaS when it makes sense and build ourselves when it doesn't.
- We are entirely cloud native and want to build a truly multi-cloud solution.
- We look at each problem independently and pick the right technology to solve it.
- We aren't afraid to try new things, but we always remember that we are looking to build something to last, and we focus on solving real-world problems for real-life customers.
Compensation
Our compensation team pay close attention to the external market and endeavour to provide balanced and competitive offers, generally consisting of a base salary and incentive compensation.
Work type
We prefer to be co-located, but we understand that people need flexibility, so we expect 2-3 days a week in the office.
Holidays:
27 days / year + bank holidays
Benefits:
Private Healthcare, Pension (matching up to 6% + 6% core contribution from JPMorgan), Gym, Theatre and Entertainment discounts and many other flexible options.