Data Scientist - IoT
Location: Remote
Headquarters: United States
Hiring Mode: Full Time
Experience: Mid Level
Tiger Analytics is looking for an experienced, fully remote Telematics Data Scientist to join our fast-growing advanced analytics consulting firm. Our consultants bring deep expertise in Data Science, Machine Learning, and AI. We are the trusted analytics partner for multiple Fortune 500 companies, enabling them to generate business value from data. Our business value and leadership have been recognized by various market research firms, including Forrester and Gartner. We are looking for top-notch talent as we continue to build the best analytics consulting team in the world.
As part of our team, you will apply strong expertise in AI, using machine learning, data mining, and information retrieval to design, prototype, and build next-generation advanced analytics engines and services. You will collaborate with cross-functional teams and business partners to define the technical problem statement and the hypotheses to test. You will develop efficient, accurate analytical models that mimic business decisions and incorporate those models into analytical data products and tools. You will have the opportunity to drive current and future strategy by leveraging your analytical skills to deliver business value and communicate results.
- Bachelor’s Degree in Computer Science or closely related field.
- 4 to 8 years of experience in Data Science.
- Experience with R, Python, or similar programming languages.
- Experience in vehicle telematics, logistics, pattern detection in sensor or smartphone data, and geospatial mapping.
- Experience with the development of computational algorithms to reduce computation time (e.g. MapReduce).
- Deep expertise with relevant geospatial packages (e.g. geopandas and rasterio in Python; maptools, spdep, or OpenStreetMap in R) is a major plus.
- Experience with popular machine learning and deep learning frameworks (e.g. H2O, TensorFlow, PySpark, PyTorch, MXNet, Caffe).
- Experience with distributed storage and database platforms.
- Experience working with weather and atmospheric data.
- Experience with batch, micro-batch, streaming, and distributed processing platforms such as Flink, Hadoop, Kafka, Spark, Hudi, AWS EMR, Arrow, or Storm.
- Experience working within Amazon Web Services (AWS) cloud computing environments.
- Experience working with very large datasets, at terabyte to petabyte scale.
- Familiarity with containerization tools such as Docker and Kubernetes.
- Background in spatial optimization algorithms is a major plus.
This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.