At Density, we build one of the most advanced people sensing systems in the world. The product and infrastructure are nuanced and one-of-a-kind. Building this product for scale has been an exercise in patience, creativity, remarkable engineering, laser physics, global logistics, and grit. The team is thoughtful, driven, and world-class.
We build systems that are real-time, accurate, and anonymous by design. Our systems help today’s largest companies understand how their buildings get used. We have counted hundreds of millions of people.
Counting people in “real-time” is unique and particularly hard to achieve. It allows a user to walk into a room, beneath our sensor, and see the room’s occupancy increment 700ms later.
Today alone, Density will ingest over one million events. In the coming year, our sensor network is on track to grow ten-fold. The overall system load is exploding. Maintaining our low latency standards requires an increasingly thoughtful system.
We’re architecting infrastructure where annual, unscheduled downtime is measured in minutes. We’re building intelligent redundancies so missed events are an oddity. We’re constructing an exceptional engineering team to support always-on, intelligible analytics generated on the fly.
This role reports to our Director of Data Science & Engineering.
As a Data Scientist at Density, you will work on algorithms developed for applications deployed on state-of-the-art remote devices as well as our cloud platform. The data you will work with includes ToF depth data, point clouds from radar, and device parameters. Algorithms include computer vision segmentation and identification, 3D point cloud surface analysis, and optimization algorithms for sensor tuning. Join a team of experienced data scientists as we expand what is possible.
In this role you will:
- Work with depth imaging and 3D point clouds.
- Collaborate with hardware and software engineers to design cutting edge algorithms for spatial segmentation and analysis.
- Analyze large data sets to optimize the configuration of intelligent hardware sensor devices.
- Use a deep understanding of math to adapt and extend existing algorithms to solve new problems.
- Ideate on available data sets to come up with cool new mash-ups and products.
The ideal candidate will have:
- 3+ years of experience as a data scientist.
- Extensive knowledge of Python programming for data science.
- Familiarity with Python libraries such as NumPy, SciPy, pandas, and scikit-learn.
- Knowledge of image processing with libraries like OpenCV, ImageIO, NumPy, TensorFlow, or Keras.
- Exposure to working with 3D point cloud data.
- A working knowledge of SQL.
- An awareness of your weak spots and a genuine desire to improve.
- A desire for a long-term role with a company that has long-term ambition.
- The ability to balance a demanding workload, discern priorities, and communicate tradeoffs effectively.
The icing on the cake:
- Apache Spark (PySpark) - An ability to write and debug Spark jobs.
- Apache Airflow - Experience configuring DAGs for job execution.
- GPU configuration experience with Cuda, TensorFlow, PyTorch, or OpenCV.
- Linux command line - Working knowledge of Linux command line and security.
- AWS cloud experience - EC2, S3, and IAM configuration and automation.
- ETL - Experience implementing and monitoring production data pipelines.
- Experience with Python remote kernels (with Spyder or PyCharm).
- Knowledge of big data streaming tools like Spark Streaming, Flink, or Kafka Streams.
- Familiarity with C++.
What we bring:
- A team hailing from places like Apple, Meraki, HashiCorp, Stanford, NASA, and beyond.
- $100M from investors such as Kleiner Perkins, Founders Fund, and Upfront Ventures.
- A work environment full of fun, smart, dedicated, and kind teammates.
- Our principles - Be Humble, Seek Feedback, and Solve the Fundamental Problem.
- Excellent health benefits including medical, dental, vision, mental health, reproductive, and active-lifestyle coverage. Mandatory PTO and more.