Job description
Spire Global is a space-to-cloud analytics company that owns and operates the largest multi-purpose constellation of satellites. Its proprietary data and algorithms provide the most advanced maritime, aviation, and weather tracking in the world. In addition to its constellation, Spire’s data infrastructure includes a global ground station network and 24/7 operations that provide real-time global coverage of every point on Earth.
Our products are data analytics APIs consumed by businesses, governments, and nonprofits alike, whether they seek best-in-class information on weather, maritime activity, and aviation activity, or look to leverage our space program to launch and operate custom hardware.
The Data Platform team is responsible for ingestion of tens of thousands of events per second, data lakehousing (Databricks), OLTP stores (HBase, OpenSearch), and APIs. The customers of the Data Platform are the internal business units who use it for data engineering, AI/ML, and business intelligence.
Scope of the role:
- Work with your customers across the aviation, maritime, weather, and orbital services business units to improve their ability to move fast and deliver innovative new data products for their customers.
- Join a collaborative, fast-moving, high-functioning team that values on-time delivery and broad expertise: backend services, distributed and streaming data analysis, CI/CD, infrastructure, operations, etc.
- Leverage Spire’s 150-person engineering organization, including SREs, security engineers, and PaaS tooling, to accelerate the team’s delivery velocity.
Basic Qualifications:
- Computer science degree or equivalent work experience.
- Experience with compiled and interpreted languages (e.g., Python, Java, or Scala).
- Experience owning full-lifecycle software development: requirement gathering, development, testing, delivery, monitoring, and incident response.
- 3+ years of experience in data lake architecture, data modeling, and development.
- Experience with relational SQL and NoSQL databases, such as MySQL and Elasticsearch.
- Experience with big data and workflow management technologies like Hadoop, Spark, Redshift, Athena, Airflow, etc.
- Experience with visualization tools such as Grafana, QuickSight
- Experience with object-oriented and functional scripting languages: Python, Java, Scala, etc.
- Experience with Agile practices and DataOps.
- Deep understanding of data management, administration, security, and access control processes and their implementation.
- Experience setting up and managing data capabilities and platforms, and the desire to keep doing so.
- Self-organizing and self-starting.
- Continuous learner, independent worker, and strong decision maker able to operate with minimal supervision.
Preferred Qualifications / Experience:
- Domain flexibility, good communication skills, and preference for simple, robust solutions.
- Effective written communication.
- Experience with a variety of compiled and dynamic languages and runtimes, particularly Java/JVM, Go, TypeScript, Node.js, and Docker.
- Hands-on DevOps experience. Familiarity with services and tools like AWS, Kubernetes, Docker, Terraform, Concourse, Argo CD, etc.
- Experience with distributed systems and high-throughput data analysis, e.g., Kafka, Hadoop, and Flink.
- Solid understanding of algorithms and data structures.
- Highly detail-oriented and completeness-driven; won’t stop until it’s done properly.
- Drive to keep your software development and delivery skills up to date, and to incorporate what you learn into your day-to-day work.
- Ability to see the big picture, understand where the world is heading, and instinctively know the right way to do things.
- Product mentality: figure out how to maximize value and deliver that to the customer.
Spire is Global, and our success draws upon the diverse viewpoints, skills, and experiences of our employees. We are proud to be an equal opportunity employer and are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender identity, or veteran status.
Access to US export controlled software and/or technology may also be required.
Applying from California? Please review the CCPA Applicant Privacy Policy.