Cloudera Data Lake Architect – Remote – Now Hiring

CLOUDERA DATA LAKE ADMINISTRATOR / ARCHITECT
Location: US-Remote
6-month+ contract-to-hire
Hourly pay rate: $65/hr; W2 or C2C is OK

ALTA IT Services has a remote 6-month contract-to-hire opening for a Data Lake Administrator/Architect supporting a leading automotive OEM based in Auburn Hills, MI.

Duties and Responsibilities:
Manage and maintain Data Lake cluster infrastructure on premises and in the cloud: installation, configuration, performance tuning, and monitoring of Hadoop clusters
Demonstrated strong grasp of Unix/Linux, Windows, cloud platforms (AWS, GCP), Kubernetes, OpenShift, and Docker
Strong exposure to Cloudera Manager, Cloudera Navigator, or similar cluster management tools
Collaborate with and assist developers in implementing their code; monitor and fine-tune their processes for optimal resource utilization on the cluster; automate runtime processes
Must have strong knowledge of HDFS, Ranger/Sentry, Hive, Impala, Spark, HBase, Kudu, Kafka, Kafka Connect, Schema Registry, NiFi, Sqoop, and other Hadoop-related services
Exposure to collaborative data science tools such as Cloudera Data Science Workbench (CDSW), Cloudera Machine Learning (CML), Anaconda, etc.
Strong networking concepts: topology, proxies, F5 load balancers, firewalls
Strong security concepts: Active Directory, Kerberos, LDAP, SAML, SSL, data encryption at rest
Programming language concepts: Java, Perl, Python, PySpark, and Unix shell scripting
Experience in cluster management, upgrades, migration, and testing
Perform periodic updates to cluster and keep the stack current
Ability to expand clusters by adding new nodes and rebalancing cluster storage systems
Manage application databases, application integration, users, roles, and permissions within the cluster
Collaborate with OpenShift, Unix, network, database, and security teams on cluster-related matters
Monitor clusters for maximum uptime, research cluster issues via logs, and collaborate proactively with vendor support (a minimal monitoring sketch follows this list)
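
For illustration, here is a minimal monitoring sketch in Python. The host, credentials, and API version segment (v41 here) are assumptions to be replaced with your own; the /clusters and /clusters/{name}/services endpoints and the healthSummary field follow Cloudera Manager's REST API, but verify them against your CM release before relying on this.

```python
import requests

# Hypothetical Cloudera Manager endpoint and credentials -- replace with your
# own host, a read-only account, and the API version of your CM release.
# If CM uses a self-signed certificate, point requests' verify= at your CA bundle.
CM_BASE = "https://cm.example.com:7183/api/v41"
AUTH = ("readonly_user", "password")

def cluster_health():
    """Print a health summary for every service in every managed cluster."""
    clusters = requests.get(f"{CM_BASE}/clusters", auth=AUTH, timeout=30).json()
    for cluster in clusters.get("items", []):
        name = cluster["name"]
        services = requests.get(
            f"{CM_BASE}/clusters/{name}/services", auth=AUTH, timeout=30
        ).json()
        for svc in services.get("items", []):
            # healthSummary is e.g. GOOD, CONCERNING, or BAD in the CM API
            print(f"{name}/{svc['name']}: {svc.get('healthSummary')}")

if __name__ == "__main__":
    cluster_health()
```

A script like this can feed an alerting pipeline, though production deployments would typically rely on Cloudera Manager's own alerts in addition to external checks.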

QUALIFICATIONS:
Minimum 10 years’ experience in advanced technologies, including 5+ years as a data lake admin/architect
BS degree, preferably in Computer Science or equivalent
Strong communication skills with the right attitude to blend in with the team
Minimum of 5 years of work experience with Hadoop ecosystems (Hortonworks HDP or Cloudera CDP)
Solid experience with Cloudera data lake environments, both on premises and in the cloud
Solid experience in administration and setup, including security topics related to a data lake
Strong experience in architecting and designing solutions for new business needs
Thorough understanding of, and hands-on experience with, implementing robust logging and tracing for end-to-end system traceability
Familiarity with Cloudera’s BDR tool to perform and monitor backups of critical data and the ability to restore data when needed
Willing and ready to get hands-on with code alongside the dev team, developing and troubleshooting, and building quick proofs of concept to explore new solutions, products, etc.
Experienced in working with technical teams to discuss, analyze, understand, and negotiate business requirements, and able to explain to architects the technical considerations and their implications for the user journey, experience, and requirements
Experience tuning and optimizing the Hadoop environment to keep clusters healthy and available for end users and applications, with maximum uptime as defined in the SLA (a tuning sketch follows this list)
Deep knowledge of and hands-on experience with Hadoop and its ecosystem components, including HDFS, YARN, Hive, MapReduce, Pig, Sqoop, Oozie, Kafka, Spark, Presto, and other Hadoop components
Min Citizenship Status Required: No Restrictions
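
As an illustration of the tuning point above, here is a minimal PySpark sketch that pins executor resources explicitly instead of relying on cluster defaults. The app name, HDFS path, column name, and all values are placeholder assumptions, not recommendations; appropriate settings depend on cluster capacity and YARN queue configuration.

```python
from pyspark.sql import SparkSession

# Minimal tuning sketch: set executor resources explicitly rather than
# relying on defaults. All values below are placeholders, not recommendations.
spark = (
    SparkSession.builder
    .appName("tuning-sketch")                        # hypothetical app name
    .config("spark.executor.memory", "4g")           # per-executor heap
    .config("spark.executor.cores", "2")             # concurrent tasks per executor
    .config("spark.sql.shuffle.partitions", "200")   # shuffle parallelism
    .getOrCreate()
)

# Hypothetical dataset path on HDFS.
df = spark.read.parquet("hdfs:///data/landing/events")
df.groupBy("event_type").count().show()

spark.stop()
```

Making such settings explicit keeps resource usage predictable and reviewable, which is what keeps shared clusters healthy under SLA.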

Hourly pay rate: $65/hr; W2 or C2C is OK

Please contact Melissa McNally via [email protected] for consideration.

#M2

Ref: #855-IT Baltimore