Iris Data Engineer Vacancy | Work Location: Noida

Iris is hiring Data Engineers in Noida. Interested candidates can go through the details below and apply using the link provided at the bottom of the post.

About Iris

Iris Software is a global information technology solutions provider that offers a range of services to help businesses achieve their goals. With over two decades of experience in the industry, Iris Software has built a reputation for delivering high-quality software development, testing, and maintenance services. The company has a team of skilled professionals who leverage the latest technologies to design and develop innovative solutions that meet the unique needs of each client. Iris Software’s commitment to excellence has earned it numerous accolades and recognition from industry experts and customers alike.

Iris Recruitment 2023

Company name: Iris Software
Website: www.irissoftware.com
Job Role: Data Engineer
Work Location: Noida, India
Job Type: Full Time
Experience: Freshers / Experienced
Qualification: B.Tech/B.E in Computer Science or a related field
Batch: Not Mentioned
Package: Up to 10 LPA

Job Description

The ideal candidate for this position should have expertise in Apache Spark, proficiency in at least one of the following programming languages: Java, Python, or Scala, and hands-on experience with Hadoop and Hive.

Along with these primary skills, the candidate should also be familiar with secondary skills such as Unix, RDBMS, HBase, and Impala.


Responsibilities

The role involves identifying existing data gaps, building new data pipelines, and providing automated solutions to deliver data to applications that support business requirements.

Candidates should have strong experience in Scala, Java, or Python, as well as expertise in Spark and Hive.

Experience in writing code using Spring Boot, Apache Kafka, and microservices is also necessary.

Candidates must be quality-focused, with a strong understanding and appreciation of code review and automated testing.

Additionally, candidates should be able to develop and maintain conceptual, logical, and physical data models, operational data stores (ODS), and data lakes, while ensuring the quality level of documentation produced and maintained.

Candidates should also have experience with continuous integration and deployment tools and approaches.

Knowledge of Snowflake cloud data warehousing would be a plus.


How to Apply?

  • Read through all the information provided on the job listing page carefully before applying.
  • Locate the apply link on the job listing page.
  • Clicking the apply link will take you to the company’s application portal.
  • Enter your personal details and any other information requested by the company in the application portal.
  • Pay close attention to the instructions provided and fill out all necessary fields accurately and completely.
  • Double-check all the information provided before submitting the application.
  • Ensure that your contact information is correct and up-to-date, and that your application accurately reflects your qualifications and experience.
  • Submitting an application with incorrect or incomplete information could harm your chances of being selected for an interview.