JuliaCortRecruiting: Search Results
Job #:  515
Job Title:  Big Data Solutions Architect
Perm or Contract:  Permanent
Date Posted:  09-Apr-2018
Position:  SW Engineer
Location:  Mountain View, CA
Area Code:  650
Skills:  Big Data, pre-sales technical, Hadoop, Spark, Kafka, Cassandra, SQL, Cloud, Microsoft Azure and AWS, customer support
Job Description:  The Solution Architect will create and deliver the highest level of pre-sales technical interaction with our customers. This position provides subject matter expertise in driving the success of our technical development and the performance optimization of big data engines.

Candidates must be analytical, strategic thinkers who are comfortable driving tactical execution and have a “get the job done” attitude. You must have an innate ability to understand end users, be passionate about user experience, pay attention to detail, and be comfortable working in a fast-paced environment at an early-stage company.

By joining our team, you are entering a world of cutting-edge development and innovation, where you will get encouragement and guidance as well as a free hand to solve tough technical problems using good design and your personal inspiration. As a member of a well-funded, early-stage startup, you will receive an equity share and a market-rate salary, as well as good food, rocking coffee, company outings, and the freedom to be yourself. And, of course, medical and 401(k) benefits too.

Responsibilities:

- Serve as a customer advocate (voice of the customer) and become a trusted advisor to customers
- Understand and capture customers’ technical requirements to build the right architecture to fulfill their business objectives
- Prioritize in-process issues and manage the overall solution lifecycle for customers (verifying fixes to the product, coordinating delivery, and tracking/documenting each support incident)
- Manage the deployment of our products on our POC cloud and, eventually, at customers’ sites
- Monitor the success of our POC and frequently collect KPI data (e.g., TCO savings, cluster size, and processing throughput)

Qualifications:

- Bachelor’s degree in Computer Engineering or Computer Science preferred
- 2+ years of experience working with Big Data engines (Hadoop, Spark, Kafka, Cassandra, etc.) in production
- Experience with customer-facing support for on-premises and cloud environments
- Experience with SQL-based databases
- Experience with data warehousing and relational database architecture
- Solid understanding of cloud platforms, such as Microsoft Azure and AWS
- Strong familiarity with Linux and Windows operating systems and data center networking
- Knowledge of programming in Java/Scala/Python

Submit your resume or contact us