Magic Leap is an eclectic group of visionaries, rocket scientists, wizards, and gurus from the fields of film, robotics, visualization, software, computing, and user experience. We are growing quickly, and this is the time to get on board and play a role in shaping how people will interact with the world tomorrow.
The Computer Vision Data team at Magic Leap is responsible for creating large-scale datasets for training, building, and validating computer vision and machine learning algorithms. In this role you will create tools that integrate ground truth data sources with Magic Leap sensor data, and process collected data to validate its quality and extract metadata from it. The work spans a variety of platforms: on device, on local machines for integration, and in the cloud for processing at scale. You will work across the computer vision stack, building tools for head pose, world reconstruction, eye tracking, hand tracking, and more, and be exposed to many of Magic Leap's core technologies.
- Design and build tools to record and save sensor data from Magic Leap devices and other XR devices
- Design and implement integrations with ground truth sources (motion capture, LIDAR, etc.)
- Design and develop tools that summarize and visualize data to drive data use in algorithm research and development
- Design and develop tools and procedures to find, visualize, and improve curated data
- Work closely with Vision team members to define requirements and validate tool performance
- 3+ years of experience working with Python as a main programming language
- Some experience with C++ and web stack development
- Experience with Linux, Docker, and cloud platforms
- Excellent problem solving, troubleshooting and debugging skills
- Familiarity with Unity/Unreal - a plus for on-device tools
- Familiarity with computer vision/deep learning - a plus
- BS/BE in CS or equivalent.
All your information will be kept confidential according to Equal Employment Opportunity guidelines.