

Job Information

Acosta Architect, Big Data in Jacksonville, Florida


The Architect, Big Data role is part of the Data Sciences delivery team. The role works in a DevOps environment and is tasked with developing the capabilities and daily operations of the various integration technologies around our Data Sciences platform. The role is responsible for supporting workloads running on the Hadoop environment and the associated ingestion technologies (real-time, batch, and streaming). This position will be responsible for data organization and modeling in the data lake, as well as for implementing an ingestion framework to automate data loading and data transport processes. The role will also be involved in R&D of emerging technologies with the application administrators and technical architects.
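As an illustration of the kind of ingestion framework described above, here is a minimal sketch in Python. All names are hypothetical, and it writes Hive-style date partitions to the local filesystem purely for demonstration; a production framework would target HDFS and tools such as NiFi, Sqoop, or Spark.

```python
import csv
import os
from collections import defaultdict

def ingest_batch(records, lake_root):
    """Illustrative batch-ingestion step (hypothetical API).

    Groups records by their event date and writes each group into a
    Hive-style partition directory (dt=YYYY-MM-DD) under lake_root.
    Returns the list of partition directories written, in date order.
    """
    by_date = defaultdict(list)
    for rec in records:
        by_date[rec["event_date"]].append(rec)

    written = []
    for date, rows in sorted(by_date.items()):
        part_dir = os.path.join(lake_root, f"dt={date}")
        os.makedirs(part_dir, exist_ok=True)
        path = os.path.join(part_dir, "part-00000.csv")
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)
        written.append(part_dir)
    return written
```

Partitioning by date like this is what lets downstream Hive or Spark queries prune to only the partitions they need, which is one reason the data organization in the lake matters for this role.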


  • Evaluate tools and technologies in the context of the future state ingestion architecture, and evolving business requirements

  • Responsible for reviewing project artifacts during the transition phase and ensuring operational needs are met

  • Provide technology operational support as new systems/platforms are rolled out to the enterprise

  • Implement and optimize Hadoop, integration, and analytical technologies

  • Review solution and technical designs

  • Propose best practices/standards.

  • Benchmark the performance in line with the non-functional requirements

  • Previous experience in developing and deploying operational procedures, tuning guides and best practices documentation.

  • Capacity to provide technical guidance, including serving as a resource to project teams and the other teams by evaluating and proposing technical alternatives for resolving business and technology issues.

  • Attention to detail to review project deliverables for completeness, quality, and compliance with established project standards.

  • Working with data architects to ensure that data structures and models are efficiently designed to optimize loading, tagging, cleansing and reprocessing activities.

  • Acts as a positive role model for co-workers and the business group to promote high-performance solutions.

  • Develops strong relationships with the developers and managers to deliver effective services.

  • Promotes an environment of continuous learning and continuous improvements

  • Provides 2nd level and 3rd level support

  • Keep abreast of software updates and vendor strategies

  • Perform bug fixes and minor enhancements to existing Hadoop applications.


  • 3+ years developing and supporting applications leveraging the Hadoop stack

  • 3+ years of experience with data integration/ETL tools such as DataStage or preferably a pure-play ETL tool on Hadoop (CDAP, Talend)

  • 1+ years of experience with Data Vault or equivalent

  • Previous experience developing ETL / Hadoop ingestion frameworks is a must.

  • SDLC knowledge in both waterfall and agile methodologies

  • Hands-on experience with source code management systems (SVN, Git) and continuous integration tools (Jenkins)

  • Experience with the following tools: Hive, SQL, NiFi, Spark, Kafka, Flume, Sqoop, HBase, Pig, HDFS, R, NoSQL

  • Deep knowledge of Hive optimization

  • Experience handling data processing and delivering distributed, highly scalable applications

  • Experience with the Hortonworks Hadoop distribution

  • Experience with large-scale domain or enterprise solution analysis, development, selection, and implementation

  • Experience with high-volume, transaction processing software applications

  • Good understanding of workload management, schedulers, scalability and distributed platform architectures

  • Experience in software development and architecture using Java EE technologies (application servers, enterprise service buses, SOA, messaging, data access layers)

  • Experience in scripting and automation languages such as Bash, Perl, and Python

  • Experience in data warehousing, analytics, and business intelligence/visualization/presentation

  • Experience using SQL against relational databases.

  • Working knowledge of search technologies such as Lucene and Solr

  • 5+ years of hands-on experience on Linux

  • Excellent communication skills (both written and oral) combined with strong interpersonal skills

  • Strong analytical skills and thought processes combined with the ability to be flexible and work analytically in a problem solving environment

  • Attention to detail

  • Strong organizational & multi-tasking skills

  • Proven track record of continuous learning and self-improvement

  • Mosaic North America is an Equal Opportunity Employer
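The SQL skills called for above can be exercised against any relational engine. As a small, self-contained illustration, here is an aggregate query of the kind common in warehousing and BI work, run via Python's built-in sqlite3 module as a stand-in engine (the table and column names are hypothetical):

```python
import sqlite3

# In-memory database as a stand-in for a relational warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (store TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("JAX", 120.0), ("JAX", 80.0), ("MIA", 50.0)],
)

# A GROUP BY aggregation, ordered by the aggregated total.
rows = conn.execute(
    "SELECT store, SUM(amount) AS total FROM sales "
    "GROUP BY store ORDER BY total DESC"
).fetchall()
# rows == [('JAX', 200.0), ('MIA', 50.0)]
```

The same SQL would run largely unchanged on Hive, where the optimization concern shifts to partition pruning and join strategies rather than single-node execution.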

Follow us on the following:

Facebook (click here)

Twitter (click here)

YouTube (click here)

DISCLAIMER: The above statements are intended to describe the general nature and level of work being performed by people assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of personnel so classified. Mosaic reserves the right to modify all or part of any job description at its discretion in order to meet and/or exceed the needs of the business.

By submitting your application you agree with and accept the Acosta Privacy Statement and Terms and Conditions.



Work State US-FL-Jacksonville

Job ID 2021-192213

Work City Jacksonville

PCN Sourcing Req

Position Type Regular Full-Time

Work Zip 32216

Starting average hours per week 37.5+

Category Corporate Jobs