
Python Java Hadoop Developer

Permanent contract (CDI) @cloudlink in Other
  • asphodeles 16 ghandi
  • Posted: 21 August 2021
  • Apply before: 23 September 2021

Description du Poste

Are you looking for a career, not just a job?

If you're looking to start a cloud career with international experience, an encouraging work environment, and innovative projects, then this opportunity is just for you.

Who Are We?

CloudLink takes pride in providing a "Next Gen" cloud experience and IT consulting to its customers. We are a team of highly skilled professionals who will guide and assist you in efficiently planning, designing, building, running, and optimizing your cloud environments. Our dedicated experts offer our customers a range of services, from automation and proactive monitoring to the management and maintenance of their cloud environments.

Job Description:

As a software developer, you'll be the brains behind crafting, developing, testing, deploying, and maintaining the system. You are passionate about understanding the business context for the features you build, in order to drive better customer experience and adoption.

Responsibilities:

  • Familiarity with the software development life cycle (SDLC), from analysis to deployment.
  • Belief in a systematic approach to developing the system through clear documentation (flowcharts, layouts, etc.) of functionality, addressing every use case through creative solutions.
  • Adopt a structured coding style for easy review, testing, and maintainability of the code.
  • Integrate developed functionality and/or components into a fully functional system.
  • Actively participate in troubleshooting, debugging, and updating the current live system.
  • Incorporate user feedback to make the system more stable and easier to use.
  • Work closely with analysts, designers, and other peer developers.
  • Improve the functionality of existing systems.
  • Develop within a Big Data architecture and the Hadoop stack, including HDFS clusters, Hive, Spark, and Scala.
  • Write NiFi and Spark processes to ingest data from various sources, including SFTP, mainframe, RDBMS, Kafka, etc.
  • Write Spark/Scala code to transform data using the business rules provided.
  • Load Hive/HBase and RDBMS tables.
  • Analyze source system data to understand its structure, definitions, and anomalies.
  • Design, develop, and implement ELT processes and procedures.
  • Program in SQL to perform data query, extract, transformation, and load functions.
  • Oracle relational databases and PL/SQL.
  • XML, XSD, XPath.
  • WebLogic Application Server (version 12.1.2 and above).
  • Develop WebSphere cells, clusters, and nodes.
  • Create and test applications.
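To give a flavor of the SQL extract-transform-load work listed above, here is a minimal sketch using Python's built-in sqlite3 module with an in-memory database. The table names, columns, and the EUR-to-USD business rule are all invented for illustration; real work on this team would target Oracle, Hive, or HBase rather than SQLite.

```python
import sqlite3

# Hypothetical ELT sketch: all table names, columns, and rates are invented.
conn = sqlite3.connect(":memory:")

# Extract: land raw source rows in a staging table.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 100.0, "USD"), (2, 85.0, "EUR"), (3, 50.0, "USD")],
)

# Transform and load: apply a business rule in SQL (convert EUR amounts
# to USD at a made-up rate) and write the result to a target table.
conn.execute(
    """
    CREATE TABLE orders_usd AS
    SELECT id,
           CASE currency WHEN 'EUR' THEN amount * 1.1 ELSE amount END AS amount_usd
    FROM raw_orders
    """
)

for row in conn.execute("SELECT id, amount_usd FROM orders_usd ORDER BY id"):
    print(row)
```

The same pattern (stage raw data, transform with set-based SQL, load into a target table) scales up to Hive or Spark SQL jobs.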

Requirements:

  • BS/MS/PhD in Computer Science, Engineering, Math, Physics, or a related field, completed by the start of employment
  • Strong programming skills in object-oriented Python development
  • Strong analytical and problem-solving skills
  • Ability to communicate, gather requirements, find solutions, and implement them in a clean and concise way
  • Create an easily understood and testable codebase that can change as quickly as market conditions require
  • Define, recommend, and implement highly efficient software solutions
  • Improve the quality of output from the Toolkits team through testing, code reviews, and team coding
  • Drive the creation of bundles of assets that meet targeted technical requirements for bioinformatic analysis
  • Hands-on experience with SQL or NoSQL databases
  • Experience with a unit testing framework (e.g., pytest)
  • Experience with a source control system (e.g., git)
  • Intermediate to advanced knowledge of Java
  • Familiarity with software development and project management processes and tools (Agile, JIRA, wikis, etc.)
  • Demonstrated development capability in Python 3
  • Experience developing web apps in popular web frameworks (ASP.NET, Apache Wicket, JavaServer Faces (JSF), Spring MVC, etc.)
  • Experience designing interactive applications
  • Background in engineering, with sound oral and written communication skills
  • Experience with AWS, Azure, Google Cloud, or OpenStack is a plus
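The unit-testing requirement above refers to frameworks such as pytest, which discovers and runs plain `test_*` functions containing bare `assert` statements. A minimal sketch follows; the function under test, `normalize_record`, is invented purely for illustration.

```python
# Hypothetical example of a pytest-style unit test. The function under test
# (normalize_record) is an invented helper, not part of any real codebase.

def normalize_record(record: dict) -> dict:
    """Lowercase the keys and strip whitespace from string values."""
    return {
        key.lower(): (value.strip() if isinstance(value, str) else value)
        for key, value in record.items()
    }

def test_normalize_record():
    raw = {"Name": "  Alice ", "Age": 30}
    assert normalize_record(raw) == {"name": "Alice", "age": 30}

# pytest would discover and run test_* functions automatically;
# here we call the test directly so the sketch is self-contained.
test_normalize_record()
```

In a real project this test would live in a `tests/` directory and be run with the `pytest` command rather than called by hand.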

Required Skills