Solutions Consultant, Snowpark

Snowflake Inc.

Build the future of data. Join the Snowflake team.

We are looking for a Solutions Consultant to join our Professional Services team and deploy cloud products and services for our customers. This person must be a hands-on self-starter who loves solving challenging problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem to Snowflake’s solution and to communicate that connection and vision to both technical and executive audiences.

The person we’re looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get things done to make Snowflake and our customers successful. It also means keeping up to date on ever-evolving data and analytics technologies and working collaboratively with a broad range of people inside and outside the company to be an authoritative resource for Snowflake and its customers.

AS A SOLUTIONS CONSULTANT AT SNOWFLAKE, YOU WILL:

  • Be responsible for delivering exceptional outcomes for our teams and customers during modernization projects. You will engage with customers to migrate from legacy environments to Snowpark/Snowflake and act as the expert for our customers and partners throughout the process.
  • In addition to customer engagements, work with our internal team to provide requirements for our SnowConvert utility based on project experiences, ensuring that our tooling continuously improves based on implementation experience.

OUR IDEAL SOLUTIONS CONSULTANT WILL HAVE:

  • University degree in computer science, engineering, mathematics or related fields, or equivalent experience
  • Minimum 5 years of experience as a solutions architect, data architect, database administrator, or data engineer
  • Willingness to forge ahead to deliver outcomes for customers in a new arena, with a new product set
  • Passion for solving complex customer problems
  • Ability to learn new technology and build repeatable solutions/processes
  • Ability to anticipate project roadblocks and have mitigation plans in-hand
  • Experience in Data Warehousing, Business Intelligence, AI/ML, application modernization, or Cloud projects
  • Experience building real-time and batch data pipelines using Spark and Scala
  • Proven track-record of results with multi-party, multi-year digital transformation engagements
  • Proven ability to communicate and translate effectively across multiple groups from design and engineering to client executives and technical leaders
  • Strong organizational skills, ability to work independently and manage multiple projects simultaneously
  • Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations
  • Hands-on experience in a technical role (SQL, data warehousing, cloud data, analytics, or ML/AI)
  • Extensive knowledge of and experience with large-scale database technology (e.g. Snowflake, Netezza, Exadata, Teradata, Greenplum, etc.)
  • Software development experience with Python, Java, Spark, and other scripting languages
  • Proficiency in implementing data security measures, access controls, and design within the Snowflake platform
  • Internal and/or external consulting experience

SKILLSET AND DELIVERY ACTIVITIES:

  • Outline the architecture of Spark and Scala environments
  • Guide customers on architecting and building data engineering pipelines on Snowflake 
  • Run workshops and design sessions with stakeholders and customers
  • Create repeatable processes and documentation as a result of customer engagements
  • Script ETL workflows using Python and shell scripts
  • Develop best practices, including knowledge transfer, so that customers are properly enabled and can extend the capabilities of Snowflake on their own
  • Weigh in on and develop frameworks for distributed computing, Apache Spark, PySpark, Python, HBase, Kafka, REST-based APIs, and machine learning as part of our tools development (SnowConvert) and overall modernization processes