LMI is seeking a Data Engineer to design, develop, and administer solutions in support of large Defense Business Systems. This role will have significant influence on overall strategy by defining features, driving solution architectures, and establishing best practices. The data engineer will work with Government Leads, solution architects, data owners, data stewards, policy developers, process and workflow analysts, data scientists, and business analysts to design and implement a framework that increases speed to insight across the Army enterprise. Work is currently performed via telework, eventually transitioning to a hybrid of telework and on-site work at the client site in Fort Belvoir, VA.
LMI is a consultancy dedicated to powering a future-ready, high-performing government, drawing from expertise in digital and analytic solutions, logistics, and management advisory services. We deliver integrated capabilities that incorporate emerging technologies and are tailored to customers’ unique mission needs, backed by objective research and data analysis. Founded in 1961 to help the Department of Defense resolve complex logistics management challenges, LMI continues to enable growth and transformation, enhance operational readiness and resiliency, and ensure mission success for federal civilian and defense agencies.
- Establish system interfaces between the Army’s enterprise big data management platform and authoritative data sources dispersed across the Army and DoD networks.
- Design and develop data pipelines that support advanced analytics workflows.
- Help grow a trusted data foundation through ETL, data streaming, and data processing across distributed systems.
- Develop and maintain documentation that enables platform maintenance and front-office knowledge management.
- Participate in infrastructure reviews to identify best practices and opportunities for improvement.
- Follow good practices and project guidelines in all cases, including the use of source code control, automated testing and deployment approaches, and proper documentation practices.
- Perform and/or participate in requirements discovery and design work sessions to determine best practice solution requirements.
- Architect and develop applications supporting internal, external, and intra-service communications.
- Collaborate with diverse technical delivery teams to build service-based architectures that scale and innovate.
- Articulate trade-offs and drive high-impact technology decisions on topics including (but not limited to) IaaS/PaaS providers, container orchestration, service mesh, API gateways, and commercial vs. open source software.
- Ability to work with Palantir Foundry.
- Bachelor’s degree in computer science, information technology, computer engineering or related field preferred.
- Direct relevant experience can be considered in lieu of a degree.
- 2-5+ years of experience in information systems, system administration, data engineering or network engineering.
- Experience working with a cloud provider (AWS/Azure/Google Cloud) or in-house data centers.
- Ability to understand a variety of system-to-system interface types and design solutions across on-prem and cloud infrastructures.
- Proficiency in multiple programming languages, including Python and Java.
- Familiar with ETL concepts and solutions particularly for big data engineering, including experience with Apache Spark.
- Experience with administering or operating in Linux or Unix-like environments.
- In-depth understanding of database and storage systems, such as Oracle, MSSQL, and Amazon S3.
- Demonstrated web-development experience (JavaScript/JSON, HTML, CSS, XML).
- Business capability and domain modeling experience, including an understanding of domain-driven design and event modeling.
- An understanding of data governance frameworks and methods.
- Must possess and maintain a DOD Secret Security Clearance.
- Analytical mindset – data-driven and able to problem-solve, particularly by using platform metadata or logging data to diagnose issues that platform users raise about system functionality or data pipelines.
- Good communicator – able to deliver messages effectively, particularly when translating functional requirements into technical solutions and coordinating across technical and non-technical teams.
- Self-motivated and self-starting – able to execute with minimal guidance, embrace new methods, and learn and apply new concepts.
- Attention to detail – focus on rigor and completeness of work output.