Job Description

Come be the first Hadoop Expert for a company that has grown 500% over the last 3 years. They are building out a new PaaS geared toward Big Data and Hadoop, providing companies with real-time data to improve the customer experience. If you dream about Hadoop/Spark clusters, keep reading.
What you will do:

  • Build out new Big Data infrastructure on Hadoop
  • Create DevOps automation using Spark/Hadoop clusters and application monitoring
  • Troubleshoot and solve production issues
  • Support a 24/7 global production operation in a 100% cloud environment
What you need:
  • 2+ years of experience administering Hadoop and/or Spark and maintaining ETL pipelines
  • Automation experience with Ruby, Python, or Bash
  • Experience with AWS services such as EC2, S3, and Redshift
  • Strong communication skills and ability to work cross-functionally
If this sounds like you, please email your resume to and we will reach out to you.
**No visa sponsorship available for this position**

Application Instructions

Please click the link below to apply for this position. A new window will open and direct you to our corporate careers page. We look forward to hearing from you!

Apply Online