Acosta Developer, Big Data Sr in United States

Overview

The Sr Developer – Big Data, DevOps role reports to the Director, Data Sciences DevOps. The role is part of the DevOps team in charge of the daily operations of various integration technologies and is responsible for supporting workloads running on the Hadoop environment and associated technologies. This position will monitor batch jobs, resolve incidents, optimize workloads, tune jobs, and implement job enhancements.

The role will focus on production support and will also take part in the DevOps rotation for making enhancements. The role will also be involved in R&D of emerging technologies alongside the application administrators and technical architects.

Responsibilities

  1. Evaluate tools and technologies in the context of the future-state architecture and evolving business requirements.

  2. Review project artifacts during the transition phase and ensure operational needs are met.

  3. Research and develop new Hadoop and analytics technologies.

  4. Review solution and technical designs.

  5. Propose best practices and standards.

  6. Benchmark performance against the non-functional requirements.

  7. Develop and deploy operational procedures, tuning guides, and best-practices documentation.

  8. Review project deliverables for completeness, quality, and compliance with established project standards.

Qualifications

  • 3+ years developing and supporting applications leveraging the Hadoop stack

  • 3+ years of experience with data integration/ETL tools such as DataStage or alternatives

  • SDLC knowledge in both waterfall and agile methodologies

  • Hands-on experience with source code management system (SVN, Git) and continuous integration tools (Jenkins)

  • Experience with the following tools: Hive, SQL, Spark, Kafka, Flume, Sqoop, HBase, Pig, HDFS, R, NoSQL

  • Experience handling data processing and delivering distributed, highly scalable applications

  • Experience with the Hortonworks Hadoop Distribution

  • Experience with large-scale domain or enterprise solution analysis, development, selection, and implementation

  • Experience with high-volume, transaction processing software applications

  • Good understanding of workload management, schedulers, scalability and distributed platform architectures

  • Software development and architecture experience using Java EE technologies (application servers, Enterprise Service Bus, SOA, messaging, data access layers)

  • Experience in scripting and automation languages such as Bash, Perl, and Python

  • Experience in data warehousing, analytics, and business intelligence/visualization/presentation

  • Experience using SQL against relational databases

  • Working knowledge of search technologies such as Lucene and Solr

  • 5+ years of hands-on experience with Linux, AIX, and z/OS

  • Excellent communication skills (both written and oral) combined with strong interpersonal skills

  • Strong analytical skills and thought processes combined with the ability to be flexible and work analytically in a problem-solving environment

  • Attention to detail

  • Strong organizational & multi-tasking skills

Work State: US

Job ID: 2021-193321

Work City: United States

PCN: Sourcing Req

Position Type: Regular Full-Time

Work Zip: 00000

Starting Average Hours per Week: 37.5+

Category: Corporate Jobs