Barclays is hiring Junior Developer | Pune

Barclays is hiring a Junior Developer. Interested candidates can go through the details and apply using the link provided at the bottom of the post.

About Barclays

Barclays, a renowned multinational bank headquartered in London, has established itself as a prominent player in the global financial services industry. With a rich history spanning over three centuries, Barclays has consistently demonstrated a commitment to excellence, integrity, and innovation. The bank offers a diverse range of financial products and services, catering to the needs of individuals, businesses, and institutional clients worldwide. Known for its strong customer-centric approach, Barclays strives to deliver exceptional banking experiences while maintaining the highest standards of professionalism and ethical conduct. Through its extensive network of branches and digital platforms, Barclays continues to empower individuals and businesses, facilitating economic growth and prosperity in the communities it serves.

Barclays Recruitment 2023

Company Name: Barclays
Website: www.barclays.com
Job Role: Junior Developer
Work Location: Pune, India
Job Type: Full Time
Experience: Freshers (0-2 years)
Qualification: Bachelor’s Degree
Batch: Not Mentioned
Package: Up to 8 LPA (Expected)

Job Description

What we’re seeking:

As a developer, you should have a strong understanding of distributed computing architecture, including core Hadoop components such as HDFS, Spark, Yarn, Map-Reduce, HIVE, and Impala, as well as AWS services and related technologies. Your responsibilities will involve the technical design and development of ETL/Hadoop and Analytics services/components, contributing to end-to-end architecture and process flow, and creating reusable designs based on business requirements. A result-oriented approach and the ability to provide effective solutions are crucial. Proficiency in performance improvement and fine-tuning of ETL and Hadoop implementations is desired, and you should be capable of working independently with minimal supervision. Strong analytical and problem-solving skills are required, along with exposure to SQL, including advanced SQL.

Skills that will contribute to your success in this role:

  • A Bachelor’s Degree with 0-2 years of experience.
  • Hands-on experience with at least one programming language, such as Unix Shell Scripting, Python, C, C++, or Java.
  • A strong understanding of distributed computing architecture, particularly the Hadoop ecosystem and its key services (HDFS, Yarn, Map-Reduce, HIVE, Impala, Spark).
  • Proficiency in writing SQL queries involving joins, filters, and sub-queries.
  • Familiarity with different AWS concepts and services like Availability Zones, Regions, VPC, EC2, S3, Athena, etc.
  • A solid understanding of various data storage formats such as CSV, delimited files, Parquet, Avro, etc.
  • Thorough knowledge of different components of a Data Warehouse.
  • Familiarity with ETL tools or technologies like Python, Spark, Ab Initio, Informatica, DataStage, etc.
  • Experience in automating jobs/workloads using any tool or language, including Unix Cron.
  • Strong understanding of Agile Project Delivery and different stages of SDLC.
  • Familiarity with NoSQL databases and their applications.
  • Excellent communication, organizational, and analytical skills.
  • Strong listening skills.
  • Ability to raise requests in a web portal following provided instructions, and to track them to closure by coordinating with different teams.
  • Proficiency in producing presentations using Microsoft tools and technologies.
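As a rough illustration of the SQL skills listed above (joins, filters, and sub-queries), here is a minimal, self-contained sketch using Python's standard-library sqlite3 module. The table names and data are invented purely for illustration and are not part of the job description.

```python
import sqlite3

# In-memory database with two hypothetical tables (names and rows are
# made up for illustration only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Asha', 'Pune'), (2, 'Ravi', 'Mumbai');
    INSERT INTO orders VALUES (10, 1, 500.0), (11, 1, 1200.0), (12, 2, 300.0);
""")

# Join + filter: orders placed by customers located in Pune.
rows = cur.execute("""
    SELECT c.name, o.amount
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    WHERE c.city = 'Pune'
""").fetchall()

# Sub-query: customers whose total spend exceeds the average order amount.
big_spenders = cur.execute("""
    SELECT name FROM customers
    WHERE id IN (
        SELECT customer_id FROM orders
        GROUP BY customer_id
        HAVING SUM(amount) > (SELECT AVG(amount) FROM orders)
    )
""").fetchall()

print(rows)          # [('Asha', 500.0), ('Asha', 1200.0)]
print(big_spenders)  # [('Asha',)]
conn.close()
```

The same join, filter, and sub-query patterns carry over directly to HIVE, Impala, and Athena, which the role also lists.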
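To give a flavour of the data storage formats mentioned above, the sketch below writes and reads the same (invented) records as standard CSV and as pipe-delimited text using only Python's standard library. Columnar formats such as Parquet and Avro require third-party libraries (e.g. pyarrow, fastavro) and are not shown here.

```python
import csv
import io

# Hypothetical records, invented for illustration.
records = [
    {"id": "1", "name": "Asha", "city": "Pune"},
    {"id": "2", "name": "Ravi", "city": "Mumbai"},
]

def dump(rows, delimiter):
    """Serialize rows to delimited text with a header line."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "city"],
                            delimiter=delimiter)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = dump(records, ",")   # standard comma-separated values
psv_text = dump(records, "|")   # pipe-delimited variant

# Reading back recovers the original rows regardless of delimiter.
parsed = list(csv.DictReader(io.StringIO(psv_text), delimiter="|"))
print(parsed[0]["city"])  # Pune
```

Note that delimited text stores everything as strings; typed, schema-aware formats like Parquet and Avro are preferred for large analytical workloads.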

What will your responsibilities be?

In this role, you will work hands-on with ETL tools like Ab Initio, Informatica, DataStage, or Talend. Familiarity with data visualization tools such as Tableau, QlikView, Python, or R would be advantageous, as would familiarity with the financial services domain. You will also work with REST APIs. Writing reusable code, understanding requirements, designing software solutions, analyzing data, and presenting findings to end users are all part of the role. Any certifications in AWS, RDBMS, NoSQL, or Data Visualization & Reporting tools would be considered an additional advantage.

Where will you be working?

You will be based in Pune.

How to Apply?

  • To apply for a job, read through all information provided on the job listing page carefully.
  • Locate the apply link on the job listing page.
  • Clicking on the apply link will take you to the company’s application portal.
  • Enter your personal details and any other information requested by the company in the application portal.
  • Pay close attention to the instructions provided and fill out all necessary fields accurately and completely.
  • Double-check all the information provided before submitting the application.
  • Ensure that your contact information is correct and up-to-date, and that your application accurately reflects your qualifications and experience.
  • Submitting an application with incorrect or incomplete information could harm your chances of being selected for an interview.