Job description
Summary
Our client works closely with their partners and clients to understand their needs and ensure their short- and long-term targets are met or exceeded. Their expert team designs and executes progressive, innovative solutions that help clients modernize their infrastructure and stay ahead of the constantly evolving threat landscape.
They also believe in hiring smart people and then giving them the space to thrive. Staying ahead of the pack requires not just economic vigilance but ambitious business goals and a purposeful, cohesive workforce. Our client attracts and retains professionals by consistently creating an environment based on trust, fairness, and opportunity, and by establishing open communication that encourages achievable performance expectations. By also fostering collaboration across the organization, they create positive energy and a true sense of ownership in delivering services of the highest quality.
Responsibilities
- Design, develop, and maintain data pipelines, data warehouses, and data models
- Implement and maintain data integration processes to collect, process, and manage large datasets
- Collaborate with data analysts, data scientists, and other stakeholders to understand business requirements and design data solutions that meet their needs
- Implement and maintain data quality checks and validations to ensure data accuracy and completeness
- Optimize data processing and storage to ensure high availability, scalability, and performance
- Develop and maintain automated data workflows and scheduling tools
- Troubleshoot and resolve data-related issues and incidents
- Stay up-to-date with the latest technologies and trends in data engineering and recommend improvements and optimizations to the data architecture and infrastructure
- Participate in code reviews and contribute to the development of best practices and standards for data engineering
Qualifications
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, PostgreSQL)
- Experience with data modeling, data warehousing, and ETL processes
- Proficiency in at least one programming language (e.g., Python, Java, Scala)
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark, Kafka, Cassandra)
- Knowledge of data visualization and BI tools (e.g., Tableau, Power BI)
- Excellent problem-solving skills and attention to detail
- Strong verbal and written communication skills
- A Bachelor's degree in Computer Science, Information Systems, or a related field is preferred
- Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information. The selected candidate will begin the process of obtaining an active Secret clearance.