Join Our Team

We’re always on the lookout for new talent. Check out our current openings and apply today!

Results (1-12 of 1467)

Pune

DevOps with Harness

Harness CI/CD Expertise:
- Proficiency in setting up and managing Harness CI/CD pipelines for enterprise systems.
- Experience creating CI/CD pipelines across any 2 technologies (Java, .NET, Lambda, Python, and Database) with a focus on build, automation, and deployment optimization.
- Hands-on experience integrating SaaS platforms (e.g., JIRA, JFrog, DataPower, F5) and hosted platforms (e.g., GitHub, Jenkins, SonarQube, Checkmarx) into Harness pipelines.
- Knowledge of deployment patterns such as Blue-Green and Canary.
- Familiarity with quality gates (e.g., SonarQube, Checkmarx) and artifact repositories such as JFrog Artifactory.
- Expertise in configuring Harness Delegate provisioning, RBAC rules, and LDAP integration setups.

Cloud and Infrastructure as Code (IaC):
- Strong expertise in AWS and cloud infrastructure management.
- 4 years of experience with Terraform for provisioning across development, staging, and production environments.

CI/CD Tools and Pipelines:
- Experience with Jenkins, Harness, or similar CI/CD tools for pipeline development.
- Proficiency in pipeline scripting, including Jenkins and Harness-specific pipelines.

Version Control:
- Expertise in GitHub, including enterprise-level setups and integration with CI/CD pipelines.

Programming and Scripting:
- 4 years of experience with scripting languages such as Python, Shell, and Terraform for automation and troubleshooting.

Containers and Orchestration:
- 1 year of hands-on experience with Docker and Kubernetes (CKA certification preferred).
- Knowledge of container image repositories such as ECR.

Secondary Skills

Migration and Integration Support:
- Hands-on experience with pipeline migration strategies from Jenkins or other CI/CD tools to Harness.
- Familiarity with integrating SSO, SailPoint, ServiceNow, Jira, SonarQube, and other enterprise tools into the Harness ecosystem.

Configuration and Automation:
- Experience customizing accounts with roles, policies, OPA, and security groups using Terraform and other automation tools.

Observability and Monitoring Tools:
- Knowledge of monitoring and reporting mechanisms for CI/CD pipeline performance and reliability.

Data and Messaging Tools:
- Familiarity with Kafka, ZooKeeper, and the ELK Stack for application monitoring and data flow optimization.
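The Canary deployment pattern mentioned in this role can be illustrated with a small promotion-gate sketch. This is not part of the job description: the metric values, function names, and thresholds below are hypothetical, standing in for what a real pipeline step would fetch from a monitoring backend such as Prometheus or CloudWatch.

```python
def error_rate(version: str) -> float:
    # Hypothetical metric lookup; a real gate would query a monitoring
    # backend. Stubbed values are used here purely for illustration.
    return {"stable": 0.010, "canary": 0.008}.get(version, 1.0)

def promote_canary(threshold: float = 0.02, margin: float = 1.5) -> bool:
    # Promote the canary only if its error rate is under an absolute
    # threshold AND not meaningfully worse than the stable version.
    stable = error_rate("stable")
    canary = error_rate("canary")
    return canary <= threshold and canary <= stable * margin

if promote_canary():
    print("canary healthy: shift remaining traffic to the new version")
else:
    print("canary degraded: roll back")
```

A pipeline would run a gate like this after routing a small slice of traffic to the new version, then either complete the rollout or roll back automatically.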
CREQ210594
DevOps
India, Pune
Hyderabad

.Net Architect

Key Responsibilities:

Backend Development:
- Design, develop, and maintain server-side applications using .NET Core, Entity Framework (EF) Core, and ASP.NET.
- Ensure high performance, scalability, and reliability of backend systems.

REST API/Web API Development:
- Create robust and scalable RESTful and Web APIs for seamless communication between components.
- Implement security best practices for API development.

SOLID Principles:
- Apply SOLID principles to design maintainable and scalable code.
- Promote best practices in software development and code reviews.

Design Patterns:
- Utilize design patterns in API development to ensure code reusability and efficiency.
- Mentor team members on the effective use of design patterns.

Database Management:
- Work with SQL Server to design, implement, and optimize database structures.
- Ensure data integrity, performance tuning, and efficient query execution.

Serverless Computing:
- Use AWS Lambda or Azure Functions for serverless computing, ensuring an efficient and scalable application architecture.
- Integrate serverless solutions with existing systems and workflows.

Team Leadership:
- Guide and mentor multiple development teams, fostering a collaborative and innovative environment.
- Conduct regular code reviews and provide constructive feedback.

Proof of Concepts (PoCs):
- Build PoCs to identify the right design approach and validate technical feasibility.
- Present PoC results and recommendations to client stakeholders.

Client Engagement:
- Provide insights and recommendations to client stakeholders on digital modernization strategies.
- Collaborate with clients to understand their business needs and translate them into technical solutions.

Required Skills and Qualifications:
- Proficiency in .NET Core, Entity Framework (EF) Core, ASP.NET, REST API development, and Web API.
- Strong experience with SQL Server, including database design and optimization.
- Hands-on experience with AWS or Azure, including serverless computing (AWS Lambda or Azure Functions).
- Solid understanding of SOLID principles and design patterns.
- Expertise in Angular for front-end development.
- Excellent problem-solving skills and the ability to think critically and creatively.
- Strong communication and interpersonal skills, with the ability to engage effectively with clients and team members.
- Proven track record of leading and mentoring development teams.

Preferred Qualifications:
- Experience with other front-end frameworks and libraries.
- Knowledge of DevOps practices and CI/CD pipelines.
- Certification in AWS or Azure.
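The SOLID principles this role emphasizes apply across stacks; as one illustration, the Dependency Inversion principle can be sketched in a few lines. The listing is .NET-focused, but the sketch below uses Python purely for brevity, and the `OrderRepository`/`OrderService` names are invented for the example, not taken from the role.

```python
from abc import ABC, abstractmethod

class OrderRepository(ABC):
    """High-level code depends on this abstraction, not a concrete store."""
    @abstractmethod
    def save(self, order: dict) -> None: ...

class InMemoryOrderRepository(OrderRepository):
    """One concrete implementation; SQL Server or a test double could
    substitute without changing OrderService."""
    def __init__(self) -> None:
        self.orders: list = []
    def save(self, order: dict) -> None:
        self.orders.append(order)

class OrderService:
    # Business logic receives the abstraction via its constructor,
    # so storage can be swapped without touching this class.
    def __init__(self, repo: OrderRepository) -> None:
        self.repo = repo
    def place(self, order: dict) -> None:
        self.repo.save(order)

repo = InMemoryOrderRepository()
OrderService(repo).place({"id": 1})
```

The same shape carries over directly to C# via interfaces and constructor injection, which is the idiom an ASP.NET code review would typically look for.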
CREQ213203
Core Tech .NET
India, Hyderabad
Bangalore

DevOps with Harness

Harness CI/CD Expertise:
- Proficiency in setting up and managing Harness CI/CD pipelines for enterprise systems.
- Experience creating CI/CD pipelines across any 2 technologies (Java, .NET, Lambda, Python, and Database) with a focus on build, automation, and deployment optimization.
- Hands-on experience integrating SaaS platforms (e.g., JIRA, JFrog, DataPower, F5) and hosted platforms (e.g., GitHub, Jenkins, SonarQube, Checkmarx) into Harness pipelines.
- Knowledge of deployment patterns such as Blue-Green and Canary.
- Familiarity with quality gates (e.g., SonarQube, Checkmarx) and artifact repositories such as JFrog Artifactory.
- Expertise in configuring Harness Delegate provisioning, RBAC rules, and LDAP integration setups.

Cloud and Infrastructure as Code (IaC):
- Strong expertise in AWS and cloud infrastructure management.
- 4 years of experience with Terraform for provisioning across development, staging, and production environments.

CI/CD Tools and Pipelines:
- Experience with Jenkins, Harness, or similar CI/CD tools for pipeline development.
- Proficiency in pipeline scripting, including Jenkins and Harness-specific pipelines.

Version Control:
- Expertise in GitHub, including enterprise-level setups and integration with CI/CD pipelines.

Programming and Scripting:
- 4 years of experience with scripting languages such as Python, Shell, and Terraform for automation and troubleshooting.

Containers and Orchestration:
- 1 year of hands-on experience with Docker and Kubernetes (CKA certification preferred).
- Knowledge of container image repositories such as ECR.

Secondary Skills

Migration and Integration Support:
- Hands-on experience with pipeline migration strategies from Jenkins or other CI/CD tools to Harness.
- Familiarity with integrating SSO, SailPoint, ServiceNow, Jira, SonarQube, and other enterprise tools into the Harness ecosystem.

Configuration and Automation:
- Experience customizing accounts with roles, policies, OPA, and security groups using Terraform and other automation tools.

Observability and Monitoring Tools:
- Knowledge of monitoring and reporting mechanisms for CI/CD pipeline performance and reliability.

Data and Messaging Tools:
- Familiarity with Kafka, ZooKeeper, and the ELK Stack for application monitoring and data flow optimization.
CREQ210593
DevOps
India, Bangalore
Piscataway

Azure Solutions Architect

Job Description:

Responsibilities:
- Strategy: Create the capability and offering roadmap to accelerate the business.
- Pre-Sales: Lead growth of the offerings globally, working with client partners and the go-to-market team; actively participate in pre-sales activities and deal reviews for Application Services engagements.
- Solutioning: Build differentiated offerings that help customers achieve their business objectives.
- Partnering: Partner with hyperscalers on strategic initiatives that drive business growth; establish Virtusa as one of the top Cloud Application Services partners.
- Offerings: Work with the offering and marketing organizations to create cloud offerings and lead their socialization and pre-sales.

Requirements:
- Cross-organizational agility, communication, and leadership.
- In-depth understanding of multiple key technology platforms (e.g., Azure, Red Hat, VMware), tool chains (e.g., automation, CI/CD), and adjacent service domains (e.g., Security, Data & AI, Marketing, ERP).
- Experience helping to shape transformational deals.
- Ability to collaborate with clients and internal teams to develop strategic plans for the assigned portfolio that align with business objectives and drive revenue growth.
- Ability to collaborate with Sales, Alliance, Marketing, and Response (pre-sales) teams to identify and grow opportunities within portfolio accounts.
- Deep understanding of Azure cloud solutions, both on premises and in the cloud.
- Demonstrated ability to work and interact with high-level client executives.
- Ability to present solutions effectively via MS Office tools.
- Strong understanding of the sales drivers and value proposition for pre-sales and solutioning of cloud computing.
158478
Others
United States, Piscataway
Hyderabad

Data Lead

Responsibilities:
- Technical Leadership: Provide technical direction and mentorship to a team of data engineers, ensuring best practices in coding, architecture, and data operations.
- End-to-End Ownership: Architect, implement, and optimize end-to-end data pipelines that process and transform large-scale datasets efficiently and reliably.
- Orchestration and Automation: Design scalable workflows using orchestration tools such as Apache Airflow, ensuring high availability and fault tolerance.
- Data Warehouse and Lake Optimization: Lead the implementation and optimization of Snowflake and data lake technologies such as Apache Iceberg for storage, query performance, and scalability.
- Real-Time and Batch Processing: Build robust systems leveraging Kafka, SQS, or similar messaging technologies for real-time and batch data processing.
- Cross-Functional Collaboration: Work closely with Data Science, Product, and Engineering teams to define data requirements and deliver actionable insights.
- Data Governance and Security: Establish and enforce data governance frameworks, ensuring compliance with regulatory standards and maintaining data integrity.
- Scalability and Performance: Develop strategies to optimize performance for systems processing terabytes of data daily while ensuring scalability.
- Team Building: Foster a collaborative team environment, driving skill development, career growth, and continuous learning within the team.
- Innovation and Continuous Improvement: Stay ahead of industry trends to evaluate and incorporate new tools, technologies, and methodologies into the organization.

Qualifications

Required Skills:
- 8+ years of experience in data engineering with a proven track record of leading data projects or teams.
- Strong programming skills in Python, with expertise in building and optimizing ETL pipelines.
- Extensive experience with Snowflake or equivalent data warehouses for designing schemas, optimizing queries, and managing large datasets.
- Expertise in orchestration tools such as Apache Airflow, with experience building and managing complex workflows.
- Deep understanding of messaging queues such as Kafka, AWS SQS, or similar technologies for real-time data ingestion and processing.
- Demonstrated ability to architect and implement scalable data solutions handling terabytes of data.
- Hands-on experience with Apache Iceberg for managing and optimizing data lakes.
- Proficiency in containerization and orchestration tools like Docker and Kubernetes for deploying and managing distributed systems.
- Strong understanding of CI/CD pipelines, including version control, deployment strategies, and automated testing.
- Proven experience working in an Agile development environment and managing cross-functional team interactions.
- Strong background in data modeling, data governance, and ensuring compliance with data security standards.
- Experience working with cloud platforms such as AWS, Azure, or GCP.

Preferred Skills:
- Proficiency in stream processing frameworks such as Apache Flink for real-time analytics.
- Familiarity with programming languages such as Scala or Java for additional engineering tasks.
- Exposure to integrating data pipelines with machine learning workflows.
- Strong analytical skills to evaluate new technologies and tools for scalability and performance.
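The orchestration work this role describes (Airflow-style workflows) boils down to declaring steps and their dependencies, then executing them in dependency order. The sketch below shows that core idea with the Python standard library; the step names are invented for illustration and are not part of the role description.

```python
from graphlib import TopologicalSorter

# A tiny pipeline graph: each key is a step, each value is the set of
# steps it depends on. Mirrors the extract -> transform -> load shape
# an Airflow DAG would encode; names are illustrative only.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_snowflake": {"transform"},
    "publish_metrics": {"load_snowflake"},
}

# A scheduler runs steps in an order that respects every dependency.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators layer retries, scheduling, and parallel execution of independent branches on top of exactly this dependency-ordering core.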
CREQ209245
Data Platforms
India, Hyderabad