Fractal is Hiring a Big Data Engineer
Company Name: Fractal
Role: Big Data Engineer
Location: Bangalore, India
Experience: 1 to 4 Years
Education: B.E/B.Tech in Computer Science or Related Field
Official Website: www.fractal.ai
Job Type: Full Time
Responsibilities at Fractal:
- You would be responsible for evaluating, developing, maintaining, and testing big data solutions for advanced analytics projects.
- The role would involve big data pre-processing and reporting workflows, including collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into business insights.
- The role would also involve testing various machine learning models on Big Data, and deploying learned models for ongoing scoring and prediction.
- An appreciation of the mechanics of complex machine learning algorithms would be a strong advantage.
Required Qualifications & Experience:
- 1 to 4 years of demonstrable experience designing technological solutions to complex data problems, developing & testing modular, reusable, efficient and scalable code to implement those solutions. Ideally, this would include work on the following technologies:
- Expert-level proficiency in at least one of Java, C++ or Python (preferred). Scala knowledge a strong advantage.
- Strong understanding of and experience with distributed computing frameworks, particularly Apache Hadoop (YARN, MR, HDFS) and associated technologies — one or more of Hive, Sqoop, Avro, Flume, Oozie, Zookeeper, etc.
- Hands-on experience with Apache Spark and its components (Streaming, SQL, MLlib) is a strong advantage.
- Working knowledge of cloud computing platforms (AWS/Azure ML)
- Experience working in a Linux computing environment and with command-line tools, including shell/Python scripting for automating common tasks
- Ability to work in an agile team setting, familiarity with JIRA, and a clear understanding of how Git works
In addition, the ideal candidate would have great problem-solving skills, and the ability & confidence to hack their way out of tight corners.
Key Skills:
- Scala or Python expertise
- Linux environment and shell scripting
- Distributed computing frameworks (Hadoop or Spark)
- Cloud computing platforms (AWS/Azure ML)