Sr. Hadoop Administrator
Location: United States, Florida, Jacksonville
Job Title: Sr. Hadoop Administrator
Duration: 12 months

Work Model: Remote

Project Scope: Iceberg implementation and other projects for EDS. This will be a heavily analytics-based project, involving SQL support specifically.

Description: The Hadoop Administrator administers database systems to protect the confidentiality, integrity, and availability of data. The Hadoop Administrator is responsible for installations, upgrades, backups, and configuration. They also maintain query languages for established database systems, including Starburst Trino; design and execute backup and recovery schemes; and implement disaster recovery (BDR) procedures for Hadoop. The Administrator works with end users and project team members to understand and advise on database (Impala, Hive, and HBase) and query requirements, as well as data science query capabilities. They are assigned to complex database systems, including those that run on data science Linux servers, and are relied upon to optimize database performance, R configuration, access, and security, and to serve on project teams contributing database subject matter expertise.

Responsibilities:
* Responsible for implementation and ongoing administration of Hadoop infrastructure, data science infrastructure, and lakehouses.
* Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments to lakehouse technologies.
* Working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
* Cluster maintenance, as well as creation and removal of nodes, using tools such as Ganglia, Nagios, Cloudera Manager Enterprise, Dell OpenManage, and other tools.
* Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
* Screening Hadoop cluster job performance and capacity planning.
* Monitoring Hadoop cluster connectivity and security.
* Managing and reviewing Hadoop log files.
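The user-onboarding duty above (Linux account, Kerberos principal, and access smoke tests) could be sketched as a short run-book script. This is only an illustrative sketch, not a procedure from the posting: the realm `EXAMPLE.COM`, the `admin/admin` principal, the `hadoop` group, and all paths are hypothetical placeholders, and a dry-run flag (on by default) prints the privileged commands instead of executing them.

```shell
#!/usr/bin/env bash
# Sketch of onboarding a new Hadoop user: Linux account, Kerberos
# principal, HDFS home directory, and access smoke tests. The realm,
# admin principal, group, and paths are hypothetical placeholders.
set -euo pipefail

REALM="EXAMPLE.COM"          # hypothetical Kerberos realm
DRY_RUN="${DRY_RUN:-1}"      # default: only print privileged commands

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "DRY-RUN: $*"
  else
    "$@"
  fi
}

onboard_user() {
  local user="$1"

  # 1. Linux account on the gateway/edge node
  run useradd -m -G hadoop "$user"

  # 2. Kerberos principal so the user can authenticate to the cluster
  run kadmin -p "admin/admin@${REALM}" -q "addprinc ${user}@${REALM}"

  # 3. HDFS home directory owned by the new user
  run hdfs dfs -mkdir -p "/user/${user}"
  run hdfs dfs -chown "${user}:${user}" "/user/${user}"

  # 4. Smoke tests: an HDFS write and a trivial Hive query as the user
  run sudo -u "$user" hdfs dfs -put /etc/hosts "/user/${user}/smoke.txt"
  run sudo -u "$user" beeline -e "SHOW DATABASES;"
}

onboard_user "jdoe"
```

Run with `DRY_RUN=0` (as root, on a kerberized edge node) to actually execute the steps; equivalent access tests for Pig and MapReduce would follow the same pattern.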
* File system management and monitoring.
* HDFS support and maintenance.
* Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
* Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
* Point of contact for vendor escalation.

DBA Responsibilities Performed by a Hadoop Administrator:
* Data modeling, design, and implementation based on recognized standards.
* Software installation and configuration.
* Database backup and recovery.
* Database connectivity and security.
* Performance monitoring and tuning.
* Disk space management.
* Software patches and upgrades.
* Automating manual tasks.

Required Experience:
* 3-5 years of DBA experience required
* 5-8 years of Hadoop Admin experience required
* Strong communication skills required
* Must have strong experience working with SQL (our analytics platform)
* Must have strong experience working with Spark (main platform)
* Must have strong experience working with Linux OS
* Exposure to data science tools highly preferred (but not required)

Required Education:
* Related Bachelor's degree in an IT-related field or relevant work experience

Position is offered by a no-fee agency.
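The backup-and-recovery and task-automation duties listed above could be sketched as a snapshot-based HDFS backup script. This is a hypothetical illustration, not a procedure from the posting: the source path, the DR cluster hostname `dr-cluster`, and the destination path are placeholders, and the same dry-run convention is used so the privileged commands are only printed by default.

```shell
#!/usr/bin/env bash
# Sketch of a nightly HDFS backup: take a snapshot for a consistent
# view, then distcp that snapshot to a DR cluster. Hostnames and
# paths are hypothetical placeholders.
set -euo pipefail

DRY_RUN="${DRY_RUN:-1}"      # default: only print privileged commands

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "DRY-RUN: $*"
  else
    "$@"
  fi
}

backup_dir() {
  local src="$1"             # e.g. /data/warehouse
  local dest="$2"            # e.g. hdfs://dr-cluster:8020/backup
  local snap="nightly-$(date +%F)"

  # Snapshots must be allowed on the directory (idempotent to repeat).
  run hdfs dfsadmin -allowSnapshot "$src"
  run hdfs dfs -createSnapshot "$src" "$snap"

  # Copy the consistent snapshot, not the live directory.
  run hadoop distcp -update "${src}/.snapshot/${snap}" "${dest}${src}"
}

backup_dir /data/warehouse hdfs://dr-cluster:8020/backup
```

A cron entry (or an Oozie/Airflow job) invoking this with `DRY_RUN=0` would automate the manual task; Cloudera's BDR service can schedule equivalent snapshot-plus-replication policies from Cloudera Manager.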