MacDill AFB, FLORIDA
Data Integration Developer
Assist in the design and development of robust data pipelines and Extract, Transform, and Load (ETL) solutions using Apache NiFi, along with other commercial-off-the-shelf (COTS) and open-source software.
Manage, perform, and develop Data as a Service (DaaS) functions, including the collection, parsing, transformation, delivery, reconciliation, de-duplication, and storage of data received from and used by multiple systems.
Document development work and changes, including white papers, SOPs, outlines, and data source surveys.
Work in conjunction with business analysts, database administrators (DBAs), data architects, and systems engineers on data warehousing solutions.
Work in conjunction with database administrators (DBAs) and systems engineers on data pipeline health monitoring, outage alerting, and data metrics reporting.
Acquire knowledge of multiple types of intelligence and operations data, with an understanding of how the information is used by the customer's user base.
Expertise with XPath, XQuery, XSLT, XML, JSON, REST/SOAP, and Java programming
Expertise with data parsing, transformation, integration, and transmission
Expertise working with structured, semi-structured, and unstructured data sets
Experience with protocols including HTTP/S, JMS, SFTP, and SharePoint OWA
Experience with Apache NiFi, MiNiFi, Apache Camel, or other dataflow frameworks
Experience with data modeling and associated tools
Experience with SQL/NoSQL databases and indexers
Experience with Linux, Windows, and mobile (Android, Windows CE) operating systems
Knowledge of DoD/IC and commercial industry object modeling and warehousing
Knowledge of cloud-based software and database solutions
Minimum of a Bachelor's degree in computer science or a related field, or equivalent work experience
Current Security+ certification, or the ability to obtain one within 90 days
Minimum of five years of relevant IT experience
Minimum of three years of experience designing, developing, and fielding DevOps solutions
Current Top Secret clearance with SCI eligibility