
Jobs that require Hadoop skills

THOUGHTWORKS PTE. LTD.
18Jan
Senior Data Engineer
THOUGHTWORKS PTE. LTD.   via JobsCentral



Roles & Responsibilities

Singapore, Singapore

ThoughtWorks Singapore is looking for talented engineers who are passionate about building large-scale data processing systems to help manage the ever-growing information needs of our clients.

You will be responsible for -

Creating complex data processing pipelines as part of diverse, high-energy teams

Designing scalable implementations of the models

Hands-on programming based on TDD, usually in a pair programming environment

Deploying data pipelines in production based on Continuous Delivery practices

Advising clients on the usage of different distributed storage and computing technologies from the plethora of options available in the ecosystem

Requirements

Ideally, you should have -

5+ years of experience building and deploying large scale data processing pipelines in a production environment

Production-level, hands-on experience working with HDFS, Java MapReduce, Hive, Apache Spark, Oozie, etc.

Solid understanding of YARN, Mesos, MPP Databases, SQL-on-Hadoop solutions like Impala etc.

Experience working with, or an interest in Agile Methodologies, such as Extreme Programming (XP) and Scrum

Knowledge of software best practices, like Test-Driven Development (TDD) and Continuous Integration (CI)

Strong communication and client-facing skills, with the ability to work in a consulting environment, are essential

Senior developers (7+ years) are expected to act as the architect for small and large enterprise projects. On larger projects, you are expected to work closely with fellow architects to come up with the architecture and take it further.

Desire to contribute to the wider technical community through collaboration, coaching, and mentoring of other technologists
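The kind of pipeline work described above can be illustrated with a minimal PySpark sketch; all paths, table names and columns below are illustrative assumptions rather than details from this posting. Keeping the transformation a pure function makes it easy to pair-program and unit-test (TDD) without touching HDFS.

# Minimal PySpark batch pipeline sketch: read raw events, clean, aggregate, publish.
# All paths, column names and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

def build_daily_revenue(raw_df):
    """Pure transformation, unit-testable without a cluster."""
    return (raw_df
            .where(F.col("amount") > 0)                      # drop refunds and bad rows
            .withColumn("day", F.to_date("event_ts"))
            .groupBy("day", "product_id")
            .agg(F.sum("amount").alias("revenue")))

if __name__ == "__main__":
    spark = (SparkSession.builder
             .appName("daily-revenue")
             .enableHiveSupport()
             .getOrCreate())
    raw = spark.read.parquet("hdfs:///data/raw/transactions")  # hypothetical HDFS path
    build_daily_revenue(raw).write.mode("overwrite").saveAsTable("analytics.daily_revenue")
    spark.stop()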

If you relish the idea of being part of a community that extends beyond the work we do for our customers, you may find ThoughtWorks is the right place for you. If you share our passion for technology and want to help change the world with software, we want to hear from you!

To apply, please submit your CV and tell us why you want to join ThoughtWorks. We will ask you to write code as part of your interview process, so be prepared! Our recruiters will be in touch.

Skills
A-IT SOFTWARE SERVICES PTE LTD
18Jan
Machine Learning Engineer
A-IT SOFTWARE SERVICES PTE LTD   via JobsCentral



Roles & Responsibilities

Build the machine learning and analytics platform, and work with data scientists to create, optimize and productionize machine learning models and build data pipelines for machine learning systems for various business units within the organization.

Requirements

• Build and improve machine learning and analytics platform.

o Develop components of machine learning and analytics platform.

o Improve the machine learning workflow, from data exploration and model experimentation/prototyping to production.

o Build framework to support machine learning and data-driven business activities at large scale.

• Work with data scientists to build end-to-end machine learning and analytics solution to solve business challenges.

o Build data pipeline for machine learning systems.

o Turn advanced machine learning models created by data scientists into end-to-end production grade system.

o Build analytics platform components to support data collection, exploration, and integration from various sources, such as data APIs, RDBMS, or big data platforms.

o Optimize the efficiency of machine learning algorithms by applying state-of-the-art technologies, e.g. distributed computing, concurrent programming, or GPU parallel computing.

• Excellent understanding of software engineering principles and design patterns.

• Excellent programming skills in Python, Scala, or Java.

• Working knowledge of big data technology stack: Hadoop/Yarn, Hive, HBase, and Spark.

• Experience with one or more commercial / open source data warehouses or data analytics systems, e.g. Teradata, is a big plus.

• Experience with one or more NoSQL databases is a big plus.

• Hands-on experience in Cloud platforms, e.g. AWS, or containerization/virtualization platforms, e.g. Docker/Kubernetes, is a big plus.

• Experience with any data science or machine learning platform, e.g. IBM Data Science Experience or Cloudera Data Science Workbench, is a big plus.

• Good understanding of data science and machine learning technologies and methodologies is a big plus.

• Exposure to mainframe system is a plus.

• Passion for machine learning and data-driven intelligence systems.

• Excellent communication and presentation skills in English.

• Team player and self-starter; the ability to work on multiple projects in parallel is necessary.

• Experience in software engineering, DevOps automation, and big data engineering

• Experience working in multi-cultural environments
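As a sketch of the productionization work listed above, the snippet below wraps feature assembly and a model into a single Spark ML pipeline artefact; the feature columns, label and storage path are assumptions for illustration only.

# Sketch: packaging a data scientist's model as a reproducible Spark ML pipeline.
# Column names and the HDFS paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("ml-pipeline").getOrCreate()
train = spark.read.parquet("hdfs:///data/features/train")    # hypothetical feature table

pipeline = Pipeline(stages=[
    VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features"),
    LogisticRegression(featuresCol="features", labelCol="label"),
])

model = pipeline.fit(train)
# Persist the fitted pipeline so batch-scoring or serving jobs can reload it as one unit.
model.write().overwrite().save("hdfs:///models/churn_lr")
spark.stop()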

Skills
The Advertiser
18Jan
Senior Product Development Engineer (71663)
The Advertiser   via JobsCentral



Roles & Responsibilities

Sr Engineer, Data Science

We are looking for a senior software engineer to support our global delivery of world-class products. The ideal candidate will have professional experience developing and deploying big data solutions with a strong desire to learn new technologies and to help build better tools in a team environment.

Responsibilities:

Full-stack Engineering

Data warehouse development

Manage data pipeline integrity and quality

Analytics in Spark

Optimization/troubleshooting of low-performing and failing jobs

Solution development with a production end-state in mind

Review code and provide peer feedback relative to best practices

Work closely with architects to deliver appropriate technical solutions

Interact professionally with business partners and key contacts

Own new features from design to production release

Work with product management and operations in an Agile environment

Participate in resolution of production issues and lead efforts toward solutions

Grow a high-performance production environment in a way that makes the system increasingly reliable
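Much of the Spark optimization work mentioned above comes down to controlling shuffles; the sketch below shows one common fix, broadcasting a small dimension table, with purely illustrative table names.

# Sketch of a typical fix for a slow or failing Spark join: broadcast the small
# dimension table so the large fact table is not shuffled. Table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join-tuning").getOrCreate()

facts = spark.table("warehouse.page_views")      # large fact table (hypothetical)
dims = spark.table("warehouse.page_metadata")    # small dimension table (hypothetical)

# Hint Spark to broadcast the small side of the join.
joined = facts.join(broadcast(dims), "page_id")

# Inspect the physical plan to confirm a BroadcastHashJoin was chosen.
joined.explain()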

Requirements

BS in Computer Science and 5 years of experience (or MS and 3 years of experience)

4+ years of professional Java software development experience

3+ years of demonstrated Hadoop experience with hands on experience with Spark, MapReduce, HBase, Hive, Phoenix or Yarn

2+ years of experience with business intelligence tools such as Tableau or Power BI

Excellent communication skills with ability to work well cross-functionally across teams

Analytical problem solver passionate about delivering high-quality solutions

Practical experience managing full software lifecycles

Preferred Skills:

Background in statistics and analytics

Experience in cloud deployments (Azure/AWS)

Experience with Git best practices

Experience with Agile software development principles

Familiarity with JIRA issue management system and development workflows

Skills
CAPGEMINI SINGAPORE PTE. LTD.
18Jan
Data Engineer
CAPGEMINI SINGAPORE PTE. LTD.   via JobsCentral



Roles & Responsibilities

Activities:

• Industrialize data integration, data cleansing, data analytics programs or data management processes.

• Contribute to the design, development, testing, deployment, performance in production and maintenance of data-centric software, including APIs, cloud-based architectures, libraries and toolboxes.

• Liaise with the Data Scientists, Architects, software developers, and business experts to understand how data needs to be converted, loaded, processed and presented.

• Help the Data Architect to create an overview of the Data Lineage (from data flows and data transformations inside applications).

• Provide clear documentation of the business rules embedded in the systems, and potentially manage or help solve Data Quality issues.

• Adapt to local context and tools provided by AXA’s entity as well as local entity development standards and IT landscape.

Skills:

• Technical: Strong development skills (languages will depend on the entity, but mainly Python and Scala), Big Data experience (Spark, Hadoop suite), and experience with Git, continuous integration and delivery.

• Awareness of Data Management practices, including Data Lifecycle Management across Core IT & Big Data ecosystems as well as Data Privacy & Security constraints.

• Knowledge in Data Modelling (including third normal form, star schema, data vault modelling methods).

• Focus on the end user and customer centricity; strong oral and written communication skills; passion for learning new tools, languages and frameworks; fast adaptation to changing requirements; strong problem-solving skills; able to work with minimal direct guidance, self-motivated and proactive, in a collaborative model side by side with the business; practical, hands-on approach to getting results.

• Experienced in Cloud services & architecture.

• Discipline in writing technical & non-technical documentation.
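The data cleansing and lineage work described above tends to be easiest to document when every business rule is an explicit step; the PySpark sketch below illustrates this, with hypothetical paths, columns and rules.

# Sketch of an industrialised cleansing step where each business rule is explicit,
# which keeps documentation and lineage tracing straightforward. All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-cleansing").getOrCreate()
raw = spark.read.parquet("hdfs:///landing/customers")            # hypothetical landing zone

clean = (raw
         .dropDuplicates(["customer_id"])                         # rule 1: one row per customer
         .withColumn("email", F.lower(F.trim("email")))           # rule 2: normalise e-mail
         .filter(F.col("birth_date") <= F.current_date()))        # rule 3: reject impossible dates

clean.write.mode("overwrite").parquet("hdfs:///curated/customers")
spark.stop()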


Skills
DBS Bank Limited
18Jan
AVP / Senior Associate, Devops Lead, Middle Office Technology, Technology and Operations
DBS Bank Limited   via DBS Bank Limited



Business Functions

Group Technology and Operations (T&O) enables and empowers the bank with an efficient, nimble and resilient infrastructure through a strategic focus on productivity, quality & control, technology, people capability and innovation. In Group T&O, we manage the majority of the Bank's operational processes and inspire to delight our business partners through our multiple banking delivery channels.

Job Purpose

To work with product owners to define user stories, drive technical solutioning, perform the work breakdown and ensure work is distributed among the right squads/developers for on-time, quality delivery. To implement best practices in DevOps and automation, and to build resilient applications.

Key Accountabilities

Technical Leadership: Guide team development efforts towards successful project delivery and provide technical leadership to teammates through coaching and mentorship.

In-house Capability: Maintain high standards of software quality within the team by establishing good practices and habits while delivering solutions on time and on budget.

People Leadership: Identify and encourage areas for growth and improvement within the team.

Responsibilities

Participate in the discovery phase and translate requirements into a technical solution

Appreciation for business requirements and domain knowledge

Hands-on and keen to develop and troubleshoot complex technical scenarios

As L2/L3, provide support coverage or pilot-phase support to smoothen release processes and resolve issues

As required, work as L2 on a rotation basis to understand the challenges of the DevOps team and drive efficiency, as an input to the dev stack on an ongoing basis

Groom juniors and developers on the technical and functional domain

Participate in or lead project meetings and process activities, and ensure practices are followed in the team

Architecture, design and code review for own and other teams

Partner with business stakeholders to deliver the technical solution

Resource planning, recruitment and key talent development

Requirements

10+ years of experience in developing and supporting enterprise applications

Experience in DevOps and agile development methodologies

Experience in developing and supporting banking applications in Hadoop, Spark, Hive, Impala, Java and cloud technologies

Experience in CI/CD tools such as Bitbucket, Jenkins, Nexus, JIRA and Confluence

Experience in TWS and shell scripting

Familiarity with databases such as MySQL, MariaDB and MongoDB

Self-starter with the ability to develop resilient applications; good knowledge of the banking Finance platform

Knowledge of Elasticsearch and Grafana

Experience in Site Reliability Engineering (SRE); able to manage the business and provide timely updates on issues and resolutions; able to do capacity management and advise on capacity requirements

Responsible for application stability

Fluency in written and spoken English; good communication and interpersonal skills

Core Competencies

Dependability — Job requires being reliable, responsible, and dependable, and fulfilling obligations.

Adaptability/Flexibility — Job requires being open to change (positive or negative) and to considerable variety in the workplace.

Cooperation — Job requires being pleasant with others on the job and displaying a good-natured, cooperative attitude.

Stress Tolerance — Job requires accepting criticism and dealing calmly and effectively with high-stress situations.

Integrity — Job requires being honest and ethical.

Concern for Others — Job requires being sensitive to others' needs and feelings and being understanding and helpful on the job.

Leadership — Job requires a willingness to lead, take charge, and offer opinions and direction.

Persistence — Job requires persistence in the face of obstacles.

Analytical Thinking — Job requires analyzing information and using logic to address work-related issues and problems.

Initiative — Job requires a willingness to take on responsibilities and challenges.

Apply Now

We offer a competitive salary and benefits package and the professional advantages of a dynamic environment that supports your development and recognises your achievements.
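Part of the L2/SRE work described above is spotting failed Hadoop jobs quickly; the sketch below polls the standard YARN ResourceManager REST endpoint for recently failed applications. The ResourceManager host, port and lookback window are assumptions.

# Sketch of a small L2-support helper: list recently failed YARN applications via the
# ResourceManager's /ws/v1/cluster/apps REST endpoint so they can be triaged.
# The host, port and lookback window are hypothetical.
import time
import requests

RM_URL = "http://resourcemanager.example.com:8088"    # hypothetical ResourceManager address
LOOKBACK_MS = 6 * 3600 * 1000                          # look at the last 6 hours

def failed_apps():
    since = int(time.time() * 1000) - LOOKBACK_MS
    resp = requests.get(f"{RM_URL}/ws/v1/cluster/apps",
                        params={"states": "FAILED", "startedTimeBegin": since},
                        timeout=10)
    resp.raise_for_status()
    return (resp.json().get("apps") or {}).get("app", [])

if __name__ == "__main__":
    for app in failed_apps():
        print(app["id"], app["name"], app.get("diagnostics", "")[:120])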

Skills
HELIUS TECHNOLOGIES PTE. LTD.
18Jan
Data /Mis Business Analyst
HELIUS TECHNOLOGIES PTE. LTD.   via JobsCentral



Roles & Responsibilities

Defining functional and data requirements around ongoing as well as upcoming digibank product features to deliver analytics, MIS and dashboards

Develop and manage MIS reports and Qlikview Dashboards based on the requirements provided by business users.

Develop efficient data models/data marts required to facilitate efficient usage of data for analytics and decision management.

Define business analytics requirements clearly and engage in discussions and solutions around them to get them successfully implemented

Work in tandem with IT to develop frameworks around data exploration, and report generation with Qlikview for management reporting.

Efficient project management to ensure analytics requirements are delivered on time and with the utmost quality

Owning the end-to-end delivery of analytics requirements, from requirements gathering to testing and production deployment

Manage day-to-day ad hoc analysis requirements

Requirements

B.S., master's or equivalent degree in Statistics, Analytics, Applied Mathematics, Engineering or an equivalent quantitative or data management field preferred.

At least 8 years’ experience in industry (consumer banking, telecoms, retail) analytics and reporting using various analytical tools

Strong programming skills using analytics tools, with programming experience in Teradata SQL, SAS and Python

Experience in developing MIS dashboards and knowledge of QlikView development.

Experience in data analysis: translating business requirements into source-system data attribute identification, logical data model solution development and data attribute mapping.

Sound understanding of data models and knowledge of various data warehouse technologies

Good understanding of technology tools especially those related to analytics, data & reporting business MIS

Good written and oral communication skills

Technical Competencies

Teradata SQL / SAS / Python

Teradata database and Big Data stack

Cloud database architecture

Hadoop framework tools

Business Analysis/ Data analysis

QlikView / Tableau
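The MIS reporting described above usually boils down to shaping transactional data into dashboard-ready aggregates; the sketch below shows that step with pandas, using a tiny in-memory frame in place of a real Teradata extract so it stays self-contained.

# Sketch of the aggregation that would sit behind a QlikView/Tableau dashboard feed.
# A tiny in-memory frame stands in for data that would really come from Teradata SQL.
import pandas as pd

txns = pd.DataFrame({
    "month":   ["2019-01", "2019-01", "2019-02", "2019-02"],
    "segment": ["retail", "digital", "retail", "digital"],
    "volume":  [120, 340, 150, 410],
})

# Month-on-month volume by segment: the typical shape of a management-reporting feed.
mis = (txns.pivot_table(index="month", columns="segment",
                        values="volume", aggfunc="sum")
           .sort_index())
print(mis)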

Skills
tourego
18Jan
Full Stack Developers
tourego   via Tech In Asia

Led by Stanford and MIT alumni with years of professional and management experience, Tourego is a disruptive new force in the niche market of tourism finance, solving outbound travellers’ pain points through technology. We seek to undermine the existing order among the incumbent market players with our new ideas on how business should be done in this US$65b market. Tourego has been mentioned in Parliament as an example of an innovative homegrown start-up. More information can be found at Channel News Asia (https://www.channelnewsasia.com/news/business/tourego-gst-tax-refunds-airport-tourists-10039630).

Every member of Tourego plays a key role; this is your opportunity to be part of our fast-growing company and a highly driven team!

Responsibilities

Develop server-based components based on business needs.

Develop test plans and cases, as well as prepare business and technical documents.

Provide post-launch maintenance and support, as well as implement product enhancements.

Requirements

A degree/diploma, preferably in Computer Science, Information Technology, Information Systems, Info-communications or Engineering.

Proficient in development using the LAMP stack and PHP frameworks (Laravel, Symfony, CakePHP, CodeIgniter, etc.)

Familiar with RDBMSs (MySQL, Oracle, MS SQL Server), NoSQL and Hadoop

Familiar with jQuery, AngularJS, Bootstrap, HTML5, CSS3, etc.

Possess strong knowledge of web services (XML, JSON, SOAP, REST) and cloud technologies (AWS)

Must be self-driven, a good team player with initiative and eagerness to pick up new knowledge/technologies.

Is this the job for you? Join us now! We regret that only shortlisted candidates will be notified.

Skills
PALO IT SINGAPORE PTE. LTD.
17Jan
Senior Database Consultant - Big Data Engineer
PALO IT SINGAPORE PTE. LTD.   via JobsCentral



Roles & Responsibilities

Your profile & role on the project

YOU:

Thrive on challenge. When was the last time you failed?

Are curious & always learning. What are you up to right now?

Can deal with constant change. When were you last surprised?

Have mastered at least one skill of your trade but you’re not defined by it. What can you teach us? Can you wear many hats?

YOU AGAIN:

The DevOps Architect will install, maintain, and support an on-premises cloud infrastructure and apply DevOps practices and solutions. The person will also implement cloud-related and DevOps technologies such as AWS/Puppet/Chef/ELK/Azure/OpenStack. Other infrastructure-related activities, such as maintaining the company's internal server infrastructure and responding to consultant requests when required, will also be expected.

Install, maintain, and support on-premises and off-premises cloud stack.

Configure, maintain, and support the cloud-related infrastructures.

Act as a system administrator on different OSes (e.g. RHEL, OpenSolaris, Ubuntu, etc.) and help teams deploy their applications and automate their development and releases on the cloud.

Ability to develop solutions and self-learn new tools and technologies.

Document, and share knowledge on developed DevOps solutions.

STILL YOU:

Unix / Linux / Bash knowledge

Very good understanding of cloud computing (e.g. Technologies, Deployment, costing, HA/DR, etc.)

Good understanding of DevOps principles (e.g. testing automation, BDD, TDD, Release automation, CI/CD, etc.)

2 years' experience with cloud deployment (e.g. OpenStack, VMware, AWS, Azure, Terraform, etc.)

1 year's experience with testing automation (e.g. Maven, Selenium, HP QC, LoadRunner)

1 year's experience with release automation processes (e.g. CA-RA, Jenkins, etc.)

1 year's experience with configuration management (e.g. Ansible, SaltStack, Puppet/Chef, etc.)

1 year's experience with monitoring tools (e.g. ELK, Prometheus, Grafana and Splunk)

Experience with developing and implementing processes to handle releases from Development to Operations while respecting internal rules, and offering solutions for rollback

Experience with designing an architecture to implement development-to-production workflows.

Knowledge of SRE, Containers, Kubernetes, Openshift is a plus.

Good understanding of microservice architecture and the DevOps practices that support it.

Strong RDBMS and NoSQL skills in deploying and fine-tuning databases such as MySQL, Oracle and Elasticsearch.

Your role at PALO IT

You will be invited to take part in R&D work done within our Practices. You will have the chance to assist or be a speaker at must-attend international IT conferences. You will have the opportunity to write articles for our blog or the specialised press. As a genuine ambassador of PALO IT, you will present our offers and take an active role in the development of the company.

Your technical environment

# Cloud and DevOps-based technologies (AWS/Puppet/Chef/ELK/Azure/Opencloud)

# DevOps practices

# Linux OS, Shell Scripting, SQL

# Agile and scrum environment

Requirements

✔You hold a Bachelor, Master or PhD degree in IT, Information Management and/or Computer Science

✔You are a recent graduate or have less than 3 years of working experience

✔Good knowledge of big data technology landscape and concepts related to distributed storage / computing

✔Experience with big data frameworks (e.g. Hadoop, Spark) and distributions (Cloudera, Hortonworks, MapR)

✔Experience with batch & ETL jobs to ingest and process data from multiple data sources

✔Experience with NoSQL databases (e.g. Cassandra, MongoDB, Neo4J, ElasticSearch)

✔Experience with querying tools (e.g Hive, Spark SQL, Impala)

✔Experience with, or willingness to go into, real-time stream processing, using solutions such as Kafka, Flume and/or Spark Streaming

✔You are passionate about technology and continuous learning comes naturally to you
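For the stream-processing experience listed above, a minimal Spark Structured Streaming job reading from Kafka might look like the sketch below; the broker, topic and storage paths are assumptions, and the job additionally needs the spark-sql-kafka connector package on its classpath.

# Sketch of a real-time ingestion job: read a Kafka topic with Spark Structured
# Streaming and land it in a queryable sink. Broker, topic and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker1:9092")   # hypothetical broker
               .option("subscribe", "clickstream")                   # hypothetical topic
               .load()
               .select(col("key").cast("string"),
                       col("value").cast("string"),
                       "timestamp"))

query = (events.writeStream
               .format("parquet")
               .option("path", "hdfs:///lake/clickstream")
               .option("checkpointLocation", "hdfs:///checkpoints/clickstream")
               .start())
query.awaitTermination()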

Skills
BEATHCHAPMAN (PTE. LTD.)
17Jan
Dba Manager
BEATHCHAPMAN (PTE. LTD.)   via JobsCentral



Roles & Responsibilities

Multinational Bank

Lead DBA Team

FVP Level

On behalf of our client, BeathChapman is assisting in identifying a DBA Manager at FVP level.

The DBA Manager will lead a team within the Application Database/ Support/ Operation Domain.

Job Responsibilities:

Experience in leading and managing a team of DBAs (minimum 8 persons) in supporting the operation of the organization's databases in a highly complex and regulated environment.

Develop and own an enterprise-wide database architecture and data modeling standards

Support and collaborate with Application and Development team in projects on database requirements and the use, optimization and troubleshooting of database technologies including an understanding of Agile/SCRUM development practice

Keep leadership apprised at all times of the health and availability of production systems, with timely reports on performance and patches

Take part in technical and business discussions, share ideas and provide feedback to continually improve the database team

Own and handle escalations related to database issues, incidents and problems

Work with the application development team to ensure that appropriate and efficient SQL is being coded and tested

Establish service level agreements with business users

Responsibility for overseeing the adoption of new database technologies such as MariaDB and other cloud database technologies

Requirements

Requirements:

Bachelor's degree from a recognized university

Experience in Oracle, MS-SQL, Teradata and Hadoop Technology is a must

Prior practical DBA experience (minimum 5 years as a DBA) before moving into management is required

Team Leadership

Experience in managing database teams of at least 5 members, with the ability to manage and coordinate team members' technical and development goals

Effectively managed DBAs supporting Microsoft SQL and Oracle

Technical Skills

Highly skilled in optimization of SQL queries or stored procedures to improve application effectiveness and performance

Strong written and communication skills to effectively communicate with stakeholders and senior management

Minimum 10 years of working experience, with at least 5 years leading a team

Understand DevOps practices and tooling for automation with respect to database automation and code management

Familiar with best practices and principles of database management, security, accesses and regulations

Strong information assurance skill set such as auditing, DR, backup/restore and DB availability

Data Warehousing experience and cloud-based database technologies a plus
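The SQL tuning skills listed above can be illustrated with a small, self-contained example; SQLite is used here purely so the sketch runs anywhere, but the same plan-then-index workflow applies to Oracle, MS SQL or Teradata with their own tooling.

# Sketch of basic query tuning: inspect the plan, add an index, confirm the plan changes.
# SQLite and the table/column names are illustrative stand-ins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany("INSERT INTO txn (account, amount) VALUES (?, ?)",
                 [(f"ACC{i % 1000}", i * 0.5) for i in range(10000)])

query = "SELECT SUM(amount) FROM txn WHERE account = 'ACC42'"

print("Before index:")
for row in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(" ", row)                      # expect a full table scan

conn.execute("CREATE INDEX idx_txn_account ON txn(account)")

print("After index:")
for row in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(" ", row)                      # expect a search using idx_txn_account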

Interested candidates can forward their CVs in MS Word format to [Click Here to Email Your Resume] quoting reference number JAS/AKEN-460520/BC

Reg No. 1874652

BeathChapman Pte Ltd

Licence no. 16S8112

Skills
DBS Bank Limited
17Jan
VP / AVP, Development Engineer - (Consumer Finance Technology - Engineering Stream (Lead)), Group Consumer Banking and Big Data Analytics Technology, Technology and Operations
DBS Bank Limited   via DBS Bank Limited


Business Functions
Group Technology and Operations (T&O) enables and empowers the bank with an efficient, nimble and resilient infrastructure through a strategic focus on productivity, quality & control, technology, people capability and innovation. In Group T&O, we manage the majority of the Bank's operational processes and inspire to delight our business partners through our multiple banking delivery channels.
Job Purpose
A niche engineering team comprising architects, designers, development engineers and automation engineers. The team will be responsible for re-engineering next-gen banking products in the Cards and Unsecured Loans (UL) segment. Collectively, the team should be able to design and develop products by collaborating with Business Systems Analysts (also part of the team), selecting the best technology and architecture for the problem at hand. We are looking for problem solvers who apply best engineering practices to software development and own the outcome.

Responsibilities
Extensive experience in Java, JavaScript, Spring Boot, Hibernate, Eclipse, JUnit, Apache, open-source stacks and Linux (scripting and shell); prior experience with mainframes preferable

Experienced in ETL and legacy modernization

Well versed in hands-on development and design using SOA/microservices

Proven experience in the design and development of APIs using API gateways, including gateway deployment, configuration, policy development, migration, debugging and troubleshooting

Working knowledge of Web API, REST, XML, JSON and security (such as OAuth, OpenID Connect)

Ability to work with Linux OS to deploy and configure components

Optional: hands-on design and development experience in the TIBCO suite of BW, BPM, BE, EMS, Hawk, Adapters, etc.

Beneficial: Python, Kafka, Hadoop and Spark

Cloud-based development (PCF/AWS)

Experienced in CI/CD

Cross-functional/cross-technical knowledge of ETL, data analytics and UI development

Requirements
7+ years; extensive experience in Java, JavaScript, Spring Boot, Hibernate, Eclipse, JUnit, Apache, open-source stacks and Linux (scripting and shell)

Willing to research and innovate on various data requirements (transformation/processing)

Well versed in hands-on development and design using SOA/microservices

Proven experience in the design and development of APIs using API gateways, including gateway deployment, configuration, policy development, migration, debugging and troubleshooting

Working knowledge of Web API, REST, XML, JSON and security (such as OAuth, OpenID Connect)

Ability to work with Linux OS to deploy and configure the AXWAY gateway and other components

Optional: hands-on design and development experience in the TIBCO suite of BW, BPM, BE, EMS, Hawk, Adapters, etc.

Cloud-based development (PCF/AWS)

Experienced in CI/CD

Solid software engineering experience

Strong analytical and problem-solving skills

Strong Java and SQL skills (MariaDB)

Excellent written and verbal reasoning and communication skills

Ability to lead technical solutions end to end

Apply Now

We offer a competitive salary and benefits package and the professional advantages of a dynamic environment that supports your development and recognises your achievements.

Skills