Kforce Data Engineer - Remote in Saint Louis, Missouri
Kforce has a Saint Louis, MO client that is seeking a Data Engineer for a remote position.

Summary:
One of our large clients is expanding its current data footprint on the cloud to provide analytics, BI, and data APIs. The majority of the data will be batch processed, with data validation, data quality checks, and transformation into a multitude of data platforms such as Redshift, Postgres, and Hive. A Senior Technical Consultant is expected to be knowledgeable in two or more technologies within a given Solutions/Practice area. The Senior Technical Consultant is expected to have strong development and programming skills in Spark, with a focus on Scala/Java, and other ETL development experience in the big data space. You are expected to be experienced and fluent in agile development and agile tools, as well as code repositories and agile SDLC/DevOps frameworks. You will work with architects and infrastructure teams to develop, test, deploy, and troubleshoot your code, and provide input into solutions and the design of the system. You will collaborate with some of the best talent in the industry to create and implement innovative, high-quality solutions focused on our clients' business needs.

Responsibilities:
Work with the data engineering team to define and develop data ingestion, validation, transformation, and data engineering code
Develop open source platform components using Spark, Scala, Java, Oozie, Hive, and other components
Document code artifacts and participate in developing user documentation and run books
Troubleshoot deployment to various environments and provide test support
Participate in design sessions, demos and prototype sessions, testing, and training workshops with business users and other IT associates
Requirements:
Bachelor's degree in Computer Science or a related field
Certification in Spark, Azure, or another cloud platform
At least 3 years of experience in developing large-scale data processing/data storage/data distribution systems
At least 3 years of experience working with large Hadoop projects using Spark on Databricks, including the Spark DataFrame and Dataset APIs with Spark SQL, as well as RDDs and Scala function literals and closures
Hands-on experience with Databricks and Cosmos DB
Must be able to design Cosmos DB collections and administer Cosmos DB as an application DBA
Experience with ELT/ETL development, patterns, and tooling; experience with Azure Data Factory and/or SSIS
Experience with Azure and cloud environments, including object storage, VMs, and HDInsight
Experience with SQL, including Postgres and MySQL RDBMS platforms
Experience with various IDEs and code repositories, as well as unit testing frameworks
Experience with code build tools such as Maven
Fundamental knowledge of distributed data processing systems and storage mechanisms
Ability to produce high-quality work products under pressure and within deadlines
Strong communication and collaborative skills
At least 5 years of experience working with a complex Big Data environment
Experience with JIRA/GitHub/Git and other code management toolsets
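For context, the Spark skills listed above (DataFrame/Dataset APIs with Spark SQL, RDDs, and Scala function literals and closures) might look like the following minimal batch-ingestion sketch. All paths, table names, and column names here are hypothetical, not from the posting:

```scala
// Hypothetical sketch: batch ingestion with validation and transformation
// using the Spark DataFrame API, plus an RDD filter with a Scala closure.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object IngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("batch-ingest-sketch")
      .getOrCreate()

    // Read a raw batch file (path is an assumption).
    val raw = spark.read.option("header", "true").csv("/data/raw/orders.csv")

    // Basic data-quality check: drop rows missing the key column.
    val validated = raw.filter(col("order_id").isNotNull)

    // Transformation with Spark SQL functions.
    val totals = validated
      .withColumn("amount", col("amount").cast("double"))
      .groupBy("customer_id")
      .agg(sum("amount").as("total_amount"))

    // RDD with a Scala function literal; `threshold` is captured as a closure.
    val threshold = 100.0
    val bigSpenders = totals.rdd
      .filter(row => row.getAs[Double]("total_amount") > threshold)
    println(s"customers over $threshold: ${bigSpenders.count()}")

    // Persist to a Hive table (target name is an assumption).
    totals.write.mode("overwrite").saveAsTable("analytics.customer_totals")

    spark.stop()
  }
}
```

In a Databricks notebook the `SparkSession` is provided as `spark`, so the builder boilerplate would be omitted; the posting's Oozie mention suggests a standalone application like the above may also be scheduled as a workflow action.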
Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.

Compensation Type: Hours
Minimum Compensation: 0.00
Maximum Compensation: 75.00