Role Description

To help manage a complex distributed platform like Hadoop, the Think Big Managed Services group requires a team of technically savvy, world-class support professionals, and we are looking for candidates to join the Managed Services, Think Big Solution Center India team. This is a technical role: to be successful, the candidate must be able to shift gears from communicating sensitive issues to customers at a high level to driving technical discussions with the support, engineering, and Professional Services teams. As a consultant, you are expected to lead from the front by demonstrating excellent troubleshooting and problem-solving skills, along with the ability to look at the big picture and help customers analyze unstructured data alongside structured, relational data to unlock the full value of their big data.

The Managed Services Application Support Consultant (MSASC) requires specific technical knowledge of the administration and control of applications deployed on Hadoop environments, including their integration with Teradata, Aster, or other data platform solutions and the associated operating systems and related tools. The MSASC provides services to administer, maintain, control, and optimize data acquisition, transformation, integration, and data management solutions.

The MSASC develops and maintains relationships with key persons on the client engagement to ensure the client's needs are met in a professional manner. The Consultant also participates in proposing, designing, and implementing solutions that address relevant people, process, information, and technology needs. The Consultant is responsible for the day-to-day execution of multiple assignments and for communicating task approach and structure to others. The Consultant provides thought leadership to their practice and client engagements, transfers knowledge and expertise to other Managed Services consultants through training, and participates in MS activities to build knowledge capital. The Consultant may be the Project Tech Lead on a project.

Grade: 10
Location: Mumbai, Pune

Qualifications

Minimum Requirements:

1. A minimum of 5 years' experience in application support (data integration, ETL, BI operations, analytics support) engagements on large-scale distributed data platforms, e.g. Teradata, DB2, Oracle, etc.
2. Experience in application support engagements on production Hadoop (any distribution) environments.
3. Focused and proactive, with strong analytical and problem-solving skills, a strong work ethic, and the ability to work independently and to collaborate with multiple (including virtual) teams.
4. Excellent oral and written communication skills in English.
5. Must be willing to provide 24x7 on-call support on a rotational basis with the team.
6. Must be willing to travel, both short-term and long-term.

Desired Experience

1. Platform operations administration.
2. Development, implementation, or deployment experience in the Hadoop ecosystem.
3. Working with Linux environments; should be proficient in shell scripting.
4. Experience with ANY ONE of the following:
   a. Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, and Flume/Kafka.
   b. Proficiency with at least one of the following: Java, Python, Perl, Ruby, C, or web-related development.
   c. Development or administration on NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
   d. Development or administration on web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
   e. Development/scripting experience with configuration management and provisioning tools, e.g. Puppet, Chef.
   f. Web/application server and SOA administration (Tomcat, JBoss, etc.).
5. Managing and handling large data sets from diverse sources, including sensors, social media/blogs, web logs, etc., and the ability to help build analytic environments.
6. Monitoring, analyzing, and troubleshooting data acquisition, transformation, and integration workflow/session failures and related issues to ensure SLAs are met.
7. Root cause analysis for job failures and data quality issues, and providing solutions.
8. Handling deployment methodologies and code/data movement between Dev, QA, and Prod environments (deployment groups, folder copy, data copy, etc.).
9. Maintaining data integration environment scheduler (Control-M, JCL, Unix/Linux cron, etc.) schedules and dependencies as required.
10. Should be able to articulate and discuss the principles of performance tuning on Hadoop.
11. Developing and producing daily/weekly operations reports and metrics as required by IT management.
12. Experience with any of the following will be an added advantage:
   a. Hadoop integration with large-scale distributed DBMSs like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
   b. Data modeling, or the ability to understand data models.
   c. Knowledge of Business Intelligence and/or Data Integration (ETL) solution delivery techniques, models, processes, and methodologies.
   d. Exposure to data acquisition, transformation, and integration tools like Talend, Informatica, etc., and BI tools like Tableau, Pentaho, etc.

Salary: Not Disclosed by Recruiter
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Software Developer

Desired Candidate Profile

Education:
UG: B.Tech/B.E.
PG: M.Tech
Doctorate: Any Doctorate - Any Specialization; Doctorate Not Required

Salary

1,000,000 - 1,700,000 INR per year

Location

Dehradun, Uttarakhand, India

Job Benefits
Free Transportation

Job Overview
Job Posted: 1 month ago
Job Expires: 2 months from now
Job Type: Full Time
Job Role: Executive
Education: Bachelor Degree
Experience: Fresher
Total Vacancies: 3
