Position Summary
The Lakehouse Developer is responsible for designing, implementing, and maintaining data lakehouse solutions that integrate data storage and analytics. This role involves collaborating with data engineers and analysts to enable data workflows and ensure data integrity. The developer will leverage tools and technologies for efficient data processing and analysis.
Minimum Qualifications
6 years of overall IT experience, with a minimum of 4 years of work experience in the tech skills below
Tech Skills
Work experience with a data lakehouse in any of these: Apache Iceberg, Databricks Delta Lake
Proficient in Python scripting and PySpark for data processing tasks
Strong SQL capabilities, with hands-on experience managing big data using ETL tools
Experience with the AWS cloud platform and its data services
Skilled in Bash shell scripting
Preferred: Experience with Kafka and MuleSoft APIs
Understanding of healthcare data systems is a plus
Experience in Agile methodologies
Strong analytical and problem-solving skills
Effective communication and teamwork abilities
Responsibilities
Implement lakehouse architecture to integrate and optimize data storage and analytics processes
Develop ETL pipelines for efficient data ingestion, transformation, and loading from various sources (an illustrative sketch follows this list)
Optimize query performance by analyzing and tuning data workflows to minimize latency
Implement data governance policies to ensure compliance and protect sensitive data
Collaborate with data teams to gather requirements and deliver solutions that enhance data accessibility
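By way of illustration, the sketch below shows the kind of PySpark-to-Delta Lake ETL step this role works on: reading raw source files, applying light cleanup, and writing a partitioned Delta table. The paths, column names, and Spark configuration are hypothetical examples only and are not specific to any Virtusa project.

# Illustrative sketch only; bucket paths, columns, and table layout are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("lakehouse-etl-sketch")
    # Delta Lake support assumes the delta-spark package is on the classpath.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Ingest: read raw source files from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: deduplicate, drop bad records, and derive typed columns.
orders = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write a Delta table partitioned by date to support efficient queries.
(
    orders.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-bucket/lakehouse/orders/")
)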
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 36,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa is an Equal Opportunity Employer. All applicants will receive fair and impartial treatment without regard to race, color, religion, sex, national origin, ancestry, age, legally protected physical or mental disability, protected veteran status, status in the U.S. uniformed services, sexual orientation, gender identity or expression, marital status, genetic information or on any other basis which is protected under applicable federal, state or local law.
Applicants may be required to attend interviews in person or by video conference. In addition, candidates may be required to present their current state or government-issued ID during each interview. All candidates must be authorized to work in the USA.
Have any questions?
To join our bright team of professionals, you can apply directly on our website under the Careers tab and search all open jobs: https://www.virtusa.com/careers
Yes, you can. Virtusa gives you the flexibility to apply for multiple open positions that excite you about your future and align to your experience and career goals.
Yes, you can. Virtusa is a global company, and we serve our clients through our global delivery model.
Our dedicated recruitment team will review your online application and match it to all our open jobs. We update our open jobs on a daily basis and encourage you to check back often.
Our team of recruiters will review your application, relevant job experience, and skills to appropriately align it to our open jobs. From there, the recruitment team will contact the qualified candidate to start the interview process.
Want to explore the ways you can engineer your career in technology? Our thought leaders share key career insights for candidates from entry-level job seekers to senior technologists.