Job ID: POS-6710
Job Title: Snowflake Architect
Primary Skill: Strong in Snowflake, Data Modeling, and architecture solutions
Secondary Skill: PySpark, SQL
Location: Pune, Hyderabad
Experience: 12-15 years
Job Summary:
Are you someone with a strong data engineering background who is passionate about tackling complex enterprise business challenges by designing data-driven solutions? Then read on!
ValueMomentum is seeking a Snowflake Architect to join our Data Leverage team and lead technology innovation in cloud technologies. You will be part of a highly collaborative and growing team and will solve complex business challenges by leveraging modern data and analytics technologies. In this role, you will be responsible for designing, building, and maintaining data warehousing solutions using Snowflake’s cloud-based data platform. You will collaborate with data engineers, data analysts, and other stakeholders to ensure efficient data storage, retrieval, and processing for the organization’s analytical and reporting needs.
At ValueMomentum, we endorse a culture of experimentation and constantly strive for improvement and learning. You will be exposed to advanced tools, technologies, methodologies and best practices that will help you develop your skills and grow in your career.
Know your team:
ValueMomentum’s Data Leverage (DL) team is trusted by leading insurance, financial services, and healthcare payers to empower businesses with real-time insights. The DL team is focused on helping our customers modernize data infrastructure, create effective business intelligence capabilities, and develop analytics models that generate business-driven insights and outcomes. The DL team provides a wide range of capabilities to our customers, including advisory services, strategy consulting, data engineering, data management, data governance, and BI and analytics.
Responsibilities:
In presales engagements, the Data Architect will engage with clients to understand their data challenges, tailor technical solutions, and present compelling proposals. You will bridge the gap between technical intricacies and client business needs, showcasing the value of our data architecture and domain expertise, and drive presales success by leveraging a deep understanding of data systems and strategic thinking.
Solution Development and Strategy
- Develop and implement enterprise data architecture strategies aligned with organizational goals and industry best practices, tailoring solutions for presales engagements.
- Provide reference architecture and guidelines for technology adoption, emphasizing cloud migration and data modernization.
Data Solutions Delivery
- Collaborate with business stakeholders to understand data needs and design solutions.
- Create a unified data model that showcases the organization’s data assets, their relationships, and attributes.
- Design and oversee the implementation of data integration, storage, and retrieval solutions, ensuring data accessibility, reliability, and performance.
- Implement solutions that leverage AI and ML to improve productivity.
Leadership and Innovation
- Provide technical guidance to data architects, engineers, and other data-related roles.
- Drive data governance initiatives, ensuring compliance.
- Research, architect, and deliver solutions that expand the data and analytics stack.
- Work with cross-functional project teams and provide data strategy and architecture, resolving performance and technical issues during proposal and delivery stages.
Building and Growth of Practice
- Customize solutions to fit client needs and industry requirements.
- Cultivate partnerships, emphasizing the long-term benefits of robust data architecture.
- Drive innovation, integrating cutting-edge technologies into practices.
- Establish industry thought leadership through active contributions.
- Develop scalable frameworks adaptable to evolving client needs and growth.
- Foster a collaborative team culture, encouraging knowledge sharing and skill development.
- Manage the Center of Excellence for Data Engineering and Analytics.
Marketing Support
- Contribute to white papers, case studies, and technical blogs.
- Participate as a panelist in technical sessions.
- Support activities to establish or strengthen partnerships with technology product companies.
What we need in you:
- Proven experience in designing and implementing complex data solutions aligned with business objectives.
- Expertise in data modeling, integration, security, and governance.
- Hands-on experience guiding virtual data model definition and defining data virtualization architecture and deployment, with a focus on Azure/AWS, Snowflake, and PySpark technologies.
- Prior experience establishing best practices for business optimization.
- Experience with relational and non-relational data stores (Hadoop, SQL, MongoDB), ETL/ELT tools (SSIS, Informatica, Matillion, dbt), DevOps, and data lake and data fabric concepts.
- In-depth experience with data governance, data integration and related technologies.
- Proficiency in a variety of database technologies, both relational and non-relational.
- Knowledge of cloud-based data solutions (e.g., AWS, Azure).
- Excellent collaboration and communication skills.
- Experience in the insurance domain will be a differentiator.
- Data architecture certifications (e.g., TOGAF, DAMA) are a plus.
Requirements:
Candidates are required to have these mandatory skills:
- At least 12 years’ overall experience in Data Modeling/Data Warehousing.
- 5 years’ experience in Snowflake, data modeling, and architecture, including expertise in Cloning, Data Sharing, and Search Optimization (see the brief SQL sketch after this list).
- Proficient in Python and PySpark.
- Ability to write complex SQL for analysis and troubleshooting of data anomalies.
- Experience with performance management and tuning in Snowflake.
- Working knowledge of cloud platforms (AWS, Azure, Google Cloud Platform, etc.).
- Experience with IAM and role management across Snowflake and cloud-based databases.
- Familiarity with Git for development, deployment, and support of data processes and procedures.
- Detailed understanding of AWS, Google Cloud Platform, Azure, and Snowflake resources (e.g., S3, Glue, Blob Storage, EC2, containers).
- Experience with Linux and Windows environments.
- Experience with Snowpark, Snowflake Data Sharing, Cloning, and database replication for disaster recovery (DR).
- Expertise in high-volume data processing using Snowflake, Redshift, Databricks, and other relevant cloud technologies.
- Excellent communication skills, both verbal and written.
- Experience with BI tool and database integration.
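For context on the Snowflake capabilities named above, here is a minimal SQL sketch of zero-copy cloning, search optimization, and secure data sharing. All object and account names (analytics_prod, analytics_dev, claims_share, consumer_org.consumer_acct) are hypothetical placeholders, not tied to any specific engagement:

    -- Zero-copy clone of a production database for dev/test work (hypothetical names)
    CREATE DATABASE analytics_dev CLONE analytics_prod;

    -- Enable search optimization to speed up selective point-lookup queries on a table
    ALTER TABLE analytics_prod.curated.claim_events ADD SEARCH OPTIMIZATION;

    -- Secure data sharing: expose a curated table to a consumer account
    CREATE SHARE claims_share;
    GRANT USAGE ON DATABASE analytics_prod TO SHARE claims_share;
    GRANT USAGE ON SCHEMA analytics_prod.curated TO SHARE claims_share;
    GRANT SELECT ON TABLE analytics_prod.curated.claim_events TO SHARE claims_share;
    ALTER SHARE claims_share ADD ACCOUNTS = consumer_org.consumer_acct;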
About the Company:
Headquartered in New Jersey, US, ValueMomentum is the largest standalone provider of IT Services and Solutions to Insurers. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver the value we promise and drive momentum to our customers’ initiatives. ValueMomentum is amongst the top 10 insurance-focused IT services firms in North America by number of customers. Leading Insurance firms trust ValueMomentum with their Digital, Data, Core, and IT Transformation initiatives.
Benefits:
We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Some of the benefits available to you are:
- Career Advancement: Individual career development, coaching, and mentoring programs for professional and leadership skill development.
- Training: Comprehensive training and certification programs.
- Performance Management: Goal setting, continuous feedback, and year-end appraisals. Rewards and recognition for extraordinary performers.
- Benefits: Comprehensive health benefits, wellness and fitness programs. Paid time off and holidays.
- Culture: A highly transparent organization with an open-door policy and a vibrant culture.