Hadoop Solutions Engineers - HW1330 (BBBH336) Los Angeles, California
Big Data is exploding as organizations across the world deploy Apache Hadoop to derive meaningful business insights. Customers are starving for Data Platform Engineers who are energetic Pre-Sales Engineers with a diverse range of talents beyond the technical. Equally important, nothing can substitute for a hands-on demonstration when it comes to closing a deal, so deep technical skills are a must.
As a Solution Engineer you will use your strong technical and business competencies and customer service orientation to provide the highest level of business and technical consultation to sales teams, prospects and customers in support of sales goals. You will work on cross-functional teams including Sales, Product Management, Product Marketing, Services, Business Development, Training and Engineering to create solutions that provide immense business value. You will become intimately acquainted with customers' business requirements, technical needs, systems, environments and service history, enabling you to present targeted business solutions. A successful Solution Engineer has the heart of a salesperson wrapped in the mind of an IT pro.
- Educate prospects on how their business priorities (use cases) map to Hadoop and integrate with the ecosystem
- Understand the customer's customers, what business each of the business units is in, and their criteria for success
- Interface with account and partner teams to share account insight and solicit inclusion for increasing account penetration, revenues and deal size
- Use your excellent technical and relationship-building skills with prospects, customers, vendors, partners and sales teams to ensure the optimum systems solutions are provided to customers
- Understand prospect insiders and how the competition sells to the account; develop insights from this and from public sources
- Start the journey toward being a trusted business advisor to the community and your customer base by balancing customers' success criteria with our aspirations
- Understand the vast unfolding Hadoop ecosystem, both mature and immature, and apply it to your customers' business priorities
- Grow your technical acumen in Apache Hadoop, the ecosystem, and customer-critical, customer-specific solutions
- Stay on top of all new product features and fixes
- Provide technical consultation and education to the Sales team by keeping them apprised on new product information
- Limited management supervision and direction is provided; this individual will operate independently and drive results both as part of a team and as an individual
- Ability to pass US Government security clearances highly desirable
- Identify customer needs and requirements
- Define all of the elements of a customer's technical computing environments
- Answer customer questions
- Provide technical support to current customers and 'try-and-buy' test projects
- Define customer acceptance criteria for 'try-and-buy' evaluation projects
- Work with Sales to respond to RFPs
- Attend customer calls with the Sales Representative
- Provide technical consultation and disseminate product information to the sales team
- Excellent written and verbal communication skills
- Good interpersonal communication and customer service skills are needed in order to work successfully with prospects, customers, and cross-functional teams to meet performance goals
- Account management and project management experience
- Strong aptitude for learning new technologies and understanding how to utilize them in a customer-facing environment.
- Ability to follow standard engineering principles and practices
- Creative approach to problem solving
- Travel to the prospective customer's sites as necessary
- Excellent Prospect and Partner presentation skills
Desired Skills and Experience:
- Experience with core enabling technologies: networking; Unix, Linux, and Windows operating systems; building and managing compute farms; scripting and Java programming; and/or data warehousing solutions
- Experience using and developing infrastructure solutions including but not limited to: Apache Hadoop (HDFS, MapReduce, Common), Hive, Pig, Sqoop, Oozie, Ambari, HCatalog, ZooKeeper, and Flume
- Experience with enterprise scale distributed NoSQL solutions including but not limited to: HBase, Cassandra, Riak, MongoDB, etc.
- Experience with Enterprise Data Warehouses including but not limited to: Teradata, Netezza, Exadata, MSSQL PDW and Greenplum