Lead Big-Data Engineer || Remote - Dallas

Company: Arkhya Tech Inc
Location: Dallas, USA
JOB DESCRIPTION

Role: Lead Big Data Hadoop Developer (5 Positions)

Contract

Remote


In this role, you will work with our data engineering team and build big data platforms to deliver value to our clients.

Responsible for data ingestion, processing, storage, extraction, and orchestration in the big data ecosystem (Hadoop, MapR, Spark) to meet business objectives.

Work with the data validation team to address data defects and ensure they are closed in an efficient and timely manner.

Apply your expertise to build world-class solutions, solve business problems, and address technical challenges using big data platforms and technologies.

Utilize existing frameworks, standards, and patterns to create the architectural foundation and services necessary for building data pipelines that scale, and demonstrate expertise by actively researching and identifying new ways to solve data management problems in this emerging area.

Ensure assigned tasks are completed within timeline and budget requirements.

Understand the client's business problems, then define, execute, and deliver assigned tasks to meet those requirements.

Develop POCs and create POVs as required to achieve program objectives.

Your day-to-day interactions are with peers, clients, and emids management.

You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.

You will need to consistently seek and provide meaningful and actionable feedback in all interactions.

You will be expected to be constantly on the lookout for ways to enhance value for your respective stakeholders/clients.

Your decisions will impact your own work and may impact the work of others.

You will be an individual contributor and/or oversee a small work effort or team.

Be proactive toward organizational initiatives and contribute to people activities such as training and developing people by creating growth plans.

Must have skills:

PySpark, SQL, Python, Scala, Big Data (Hadoop), Data Warehousing, Financial Data, ETL


Employment type: C2C, W-2, 1099, Full Time


JOB TYPE
Work Day: Full Time
Salary: Negotiable


JOB REQUIREMENTS
Minimum experience: 10 years
