These folders contain the Schematic, Firmware, and Bill of Materials used in building and programming the display driver board on the Project North Star AR Headset.
A collection of terms and acronyms of varying degrees of ambiguity
Given that Northstar is a project that combines many different fields of research and expertise, it's unlikely that any one person will ever know or be able to recall all the terms, acronyms and technical jargon used across these fields. This glossary intends to define some of the more specialized terms, starting with the field of electrical engineering, and will eventually grow to cover other topics like firmware and software in more depth. This page is open to pull requests, so please feel free to submit PRs with more terms!
XR is an umbrella term for Virtual Reality, Augmented Reality and Mixed Reality. Some say it stands for Extended Reality, others say the X is a variable; check Twitter if you want to see people argue about it for literal years. In general, XR is the term the Northstar community uses when discussing VR, AR or MR. For more general discussions we tend to enjoy using "Spatial Computing".
A Combiner is an optical component that merges light from two sources into a single path. In an AR headset, the combiner reflects light from the display toward your eye while letting light from the real world pass through, so the rendered image appears overlaid on your surroundings.
So, DOF is a fun one because there are two meanings for the acronym. In traditional camera work it means Depth of Field. For Northstar and other XR devices it means "Degrees of Freedom". In a 6-DOF system you can rotate on three axes and also translate along three axes.
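The two halves of a 6-DOF pose can be sketched as a simple data structure (a minimal illustration; the field names here are our own, not from any tracking SDK):

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    # 3 translational degrees of freedom (metres)
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # 3 rotational degrees of freedom (degrees)
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

# A 3-DOF tracker (e.g. a bare IMU) only ever updates the rotational half;
# a 6-DOF system updates all six fields.
head = Pose6DOF(yaw=90.0)
```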
IPD stands for Interpupillary Distance. This is the term for the distance between your pupils, which is useful information when correcting for optical distortion.
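As a rough illustration of why IPD matters, the angle your eyes converge by when fixating a point depends directly on it (a sketch; the 63 mm figure is just a commonly cited adult average, not a Northstar parameter):

```python
import math

def vergence_angle_deg(ipd_mm: float, distance_m: float) -> float:
    """Angle between the two eyes' lines of sight when fixating a point."""
    half_ipd_m = (ipd_mm / 1000.0) / 2.0
    return math.degrees(2.0 * math.atan(half_ipd_m / distance_m))

# An object half a metre away, with a 63 mm IPD, converges the eyes by ~7.2 degrees:
theta = vergence_angle_deg(63.0, 0.5)
```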
The Eye Box is the area in which the image from a headset is clearly visible. Northstar has a fairly large Eye Box meaning you can adjust the headset to be more comfortable without worrying too much about whether or not you've got it positioned just right.
FOV stands for field of view, this is usually referring to how much you can see through a headset, and can also be used to describe the amount a camera can see. FOV can be measured horizontally, vertically or diagonally.
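For a flat (rectilinear) projection, the three measurements are related; a sketch of deriving the diagonal figure from horizontal and vertical FOV (the 90x90 numbers are purely illustrative):

```python
import math

def diagonal_fov_deg(h_fov_deg: float, v_fov_deg: float) -> float:
    """Diagonal FOV implied by horizontal and vertical FOV,
    assuming a rectilinear projection onto a flat image plane."""
    th = math.tan(math.radians(h_fov_deg) / 2)
    tv = math.tan(math.radians(v_fov_deg) / 2)
    return math.degrees(2 * math.atan(math.hypot(th, tv)))

# A hypothetical 90-degree by 90-degree display:
d = diagonal_fov_deg(90.0, 90.0)
```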
Latency is an overall term for how long a signal takes to get from one location to another, this is applicable to many areas of the northstar project. For example there is latency from when an image is rendered by the computer to when it is displayed on the northstar screen, or latency from when the image from a tracking camera makes its way to the onboard processor. In short, latency is time, and the longer something takes the more latency it has.
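A minimal sketch of how latency is reasoned about: individual stage times add up to the end-to-end figure, and any one stage can be timed with a wall clock (the stage names and numbers below are made up for illustration):

```python
import time

def measure_latency_ms(fn) -> float:
    """Wall-clock time for one call, in milliseconds; a rough proxy
    for a single pipeline stage's contribution to total latency."""
    start = time.perf_counter()
    fn()
    return (time.perf_counter() - start) * 1000.0

# End-to-end (motion-to-photon) latency is the sum of every stage:
stages_ms = {"tracking": 4.0, "render": 8.0, "display_scanout": 5.5}
total_ms = sum(stages_ms.values())
```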
VAC stands for the Vergence Accommodation Conflict. This term describes the difficulty our eyes have in focusing on objects that optically appear a certain distance away but via stereo disparity appear a different distance away.
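The conflict is often quantified in diopters (inverse metres): the optics fix the accommodation demand, while the rendered stereo disparity sets the vergence demand. A sketch, assuming a headset with a fixed focal plane (the 0.75 m figure is an assumption for illustration, not a Northstar specification):

```python
def vac_mismatch_diopters(fixed_focal_m: float, content_distance_m: float) -> float:
    """Difference between accommodation demand (fixed by the optics)
    and vergence demand (set by stereo disparity), in diopters."""
    return abs(1.0 / fixed_focal_m - 1.0 / content_distance_m)

# Content rendered at 2 m on a display focused at 0.75 m:
mismatch = vac_mismatch_diopters(0.75, 2.0)
```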
DP stands for DisplayPort. This is the display signal type that Northstar uses. In general, converting from DP to HDMI is easy; converting from HDMI to DP is incredibly difficult and rarely works. In our experience it is a waste of time to look into HDMI to DP adapters.
DP Alt Mode is a specific subsection of the USB Type-C spec that allows a USB-C port to carry a DisplayPort signal.
PD stands for Power Delivery. This is another subsection of the USB specification that indicates a USB port has the necessary components to supply more power than a standard port.
PCB stands for Printed Circuit Board. These are typically multi-layered boards containing copper traces which allow electrical signals, including power and data, to be transmitted across the board.
FPC stands for Flexible Printed Circuit. These are similar to PCBs, but the main difference is that they are flexible and can bend. These cables are used to connect the displays and other components, and are often used for board-to-board connections when PCBs sit in locations that a rigid connection isn't suited for.
MCU stands for Microcontroller Unit. Microcontrollers are little self-contained computers in a chip that execute programs called firmware. These chips are mounted on PCBs, or printed circuit boards. The programs control various peripherals that are either built into the chip or connected externally. Popular examples of microcontroller platforms include Arduino and the Raspberry Pi Pico (the full Raspberry Pi is a single-board computer rather than a microcontroller).
MIPI is a general industry group and standard for display signal interfaces and camera signal interfaces, among others. For Northstar's case we care mostly about display and camera signals. You can read more about MIPI here.
HID stands for Human Interface Device. It is a set of standards for connecting peripherals like keyboards, mice etc., intended for 'driverless' operation. You can read about the specification here; in addition, Microsoft has a great resource on HID here.
I2C stands for Inter-Integrated Circuit. It is widely used for attaching lower-speed peripheral ICs to processors and microcontrollers in short-distance, intra-board communication.
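One tiny concrete detail of the protocol: each transfer begins with the 7-bit device address shifted left one bit, with the least-significant bit selecting the direction. A minimal sketch (the 0x3C address is just a common example for small OLED displays):

```python
def i2c_first_byte(addr_7bit: int, read: bool) -> int:
    """Build the first byte of an I2C transfer: the 7-bit address is
    shifted left and the LSB selects direction (1 = read, 0 = write)."""
    assert 0 <= addr_7bit <= 0x7F, "I2C addresses are 7 bits"
    return (addr_7bit << 1) | (1 if read else 0)

# Writing to a device at address 0x3C puts 0x78 on the wire:
byte = i2c_first_byte(0x3C, read=False)
```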
The Serial Peripheral Interface (SPI) is a synchronous serial communication interface specification used for short-distance communication, primarily in embedded systems. The interface was developed by Motorola in the mid-1980s and has become a de facto standard. Typical applications include Secure Digital cards and liquid crystal displays. You can learn more about SPI here.
A DP to MIPI bridge is a specific chip designed to convert signal from a DP input to a MIPI signal to be read by displays.
Git is a common type of source control, allowing developers to maintain a history of their code, which is incredibly helpful when diagnosing undesired results of changes to that code. You can learn more about Git here: https://docs.gitlab.com/ee/topics/git/
A Pull Request is a term used to describe the act of submitting a piece of code for review and merging into the main code base. You can learn more about pull requests here
A runtime is an intermediary process between end user applications and hardware. In Northstar's case runtimes can include SteamVR, Monado and Esky.
An OpenXR runtime is software which implements the OpenXR API. There may be more than one OpenXR runtime installed on a system, but only one runtime can be active at any given time.
OpenXR is an API (Application Programming Interface) for XR applications. XR refers to a continuum of real-and-virtual combined environments generated by computers through human-machine interaction and is inclusive of the technologies associated with virtual reality (VR), augmented reality (AR) and mixed reality (MR). OpenXR is the interface between an application and a runtime. The runtime may handle such functionality as frame composition, peripheral management, and raw tracking information.
SLAM is a term that originated in robotics meaning Simultaneous Localization and Mapping. It's become the generic term for systems that can see their environment, localize themselves to a previously known location, and generate a map of that environment for other devices to localize against.
VIO stands for Visual Inertial Odometry. VIO is an important part of the processing needed for SLAM, but does not provide localization or mapping. VIO is solely performing the act of combining visual data with inertial measurements from an IMU.
IMU stands for Inertial Measurement Unit. It is a device that can read out vectors like acceleration and angular velocity. In Northstar's case these devices are used for 3DOF movement and can help with VIO and SLAM. https://en.wikipedia.org/wiki/Inertial_measurement_unit
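A sketch of why an IMU alone drifts and needs the visual half of VIO: integrating acceleration twice means any bias or noise error grows quadratically in position (a 1-D toy example, not real sensor code):

```python
def integrate_imu(accel_samples, dt: float) -> float:
    """Naive 1-D dead reckoning: integrate acceleration twice to get
    position. Any constant bias in the samples grows quadratically in
    the result, which is why VIO fuses IMU data with camera observations
    instead of relying on the IMU alone."""
    velocity = 0.0
    position = 0.0
    for accel in accel_samples:
        velocity += accel * dt
        position += velocity * dt
    return position

# One second of a constant 1 m/s^2 acceleration, sampled at 100 Hz:
pos = integrate_imu([1.0] * 100, 0.01)
```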
Project North Star is an open source augmented reality headset. It was initially released by LeapMotion (now UltraLeap) in 2018 and is currently supported by its open source community.
There have been several iterations of the headset since the project began. The latest build is Northstar Next.
As opposed to Virtual Reality, where the user's view is generated entirely by a computer, Augmented Reality projects a rendered image over the real-world perspective of the user. This gives an effect of the rendered objects existing in the real world.
To achieve this effect, an AR headset can use either a "pass-through" or a transparent optical system. A pass-through display streams the outside world to an enclosed LCD/LED screen. A transparent optical system projects an image which is reflected off a lens in front of the user's field of view. Project North Star uses a transparent optical system.
Though an AR headset is a very complex device, we can simplify it by separating its hardware and software into different components, each of which serve a specific function.
Project North Star develops and/or sources the parts for each of these components, and provides instructions to combine them into a complete headset. Each headset version is composed of different iterations of some or all of these parts.
Each of these components can be separated into the following categories:
Display: The hardware that produces the images.
Optics: Lenses which bend light from the image so it is in-focus for the user.
Sensors: Hardware and software that track the user's position and orientation through space.
Hand-tracking: Hardware and software that allows the user to interact with the artificial objects and environment.
Integrator: Hardware and software that allows each component to communicate with each other.
Runtime: Software that acts as an intermediary between user applications and hardware.
Assembly: Hardware that houses all the components to create a complete headset.
This separation of responsibility allows for different open or closed source solutions to be used in a headset, resulting in a modular design.
The challenges in this space are not limited to technological problems. Few people have the means to fabricate Printed Circuit Boards (PCBs), lenses, and LCD screens. So maintaining Project Northstar also means project contributors find and work with manufacturers and distributors for bespoke electronics. This influences both the design and availability of the headsets.
With an active community, open documentation, and modular design, Project Northstar's headsets are customizable and repairable in a way that no other closed-source headset is. Anyone can contribute to the project. So the more its users customize and upgrade their devices, the more the project benefits as a whole.
By participating in the NorthStar Community, you agree to the following code of conduct.
A primary goal of Project NorthStar is to think and work in collaboration, so we can transcend the potential of each of us, and others. We aim to be inclusive to the largest number of contributors, with the most varied and diverse backgrounds possible. We are committed to providing a friendly, safe, and welcoming environment for all, regardless of gender, sexual orientation, skills and education, ability, ethnicity, socioeconomic status, and religion (or lack thereof). This Code of Conduct outlines our expectations for all those who participate in our community, as well as the consequences for unacceptable behavior. We invite all those to help us create safe and positive experiences for everyone, and we also include ways for our attendees to report any violations. Together we can make a fun, harassment-free, magical experience for everyone, regardless of gender, gender identity, sexual orientation, disability, physical appearance, body size, race, or religion.
A supplemental goal of this Code of Conduct is to increase open [source/culture/tech] citizenship by encouraging participants to recognize and strengthen the relationships between our actions and their effects on our community. Communities mirror the societies in which they exist, and positive action is essential to counteract the many forms of inequality and abuses of power that exist in society. If you see someone who is making an extra effort to ensure our community is welcoming, friendly, and encourages all participants to contribute to the fullest extent, we want to know.
The following behaviors are expected and requested of all community members:
Communicate! Participate in an authentic and active way. In doing so, you contribute to the health and longevity of this community.
Exercise consideration and respect in your speech and actions. Come from a place of understanding.
Attempt collaboration before a conflict.
Refrain from demeaning, discriminatory, or harassing behavior and speech.
Be mindful of your surroundings and your fellow participants. Alert community leaders if you notice a dangerous situation, someone in distress, or violations of this Code of Conduct, even if they seem inconsequential.
The following actions are considered harassment and are unacceptable within our community:
Being a jerk. Being mean. Unkind intentions.
Violence, threats of violence or violent language directed against another person.
Sexist, racist, homophobic, transphobic, ableist or otherwise discriminatory jokes and language.
Posting or displaying sexually explicit or violent material, posting or threatening to post other people’s personally identifying information ("doxing").
Personal insults, particularly those related to gender, sexual orientation, race, religion, or disability.
Inappropriate photography or recording. Inappropriate physical contact. You should have someone’s consent before touching them.
Unwelcome sexual attention. This includes sexualized comments or jokes; inappropriate touching, groping, and unwelcome sexual advances.
Deliberate intimidation, stalking or following (online or in person).
Advocating for, or encouraging, any of the above behavior.
Sustained disruption of community events, including talks and presentations.
Don't be that person. We help people help people.
Unacceptable behavior from any community member, including sponsors and those with decision-making authority, will not be tolerated. Anyone asked to stop unacceptable behavior is expected to comply immediately. If a community member engages in unacceptable behavior, moderators may take any action they deem appropriate, up to and including a warning, temporary ban, or permanent expulsion from the community/event without warning. You can make a report either personally or anonymously.
Anonymous Report
Pulled from the #helpful-content channel in the North Star Discord
This page has a handful of links that will help you learn more about #ProjectNorthstar and connect with the community, we highly recommend checking out the discord and hanging out with some of the awesome people there! It's the quickest way to get your questions answered.
Joining the #ProjectNorthstar discord server is the best way to get help with any troubles you run into! It's also a fun and friendly community, come hang out!
You can order ready to build kits, or pre-built kits here!
Alexandria Heston put together an incredible collection of knowledge about design for VR and AR. You can check it out here! https://aliheston.gitbook.io/the-design-of-virtual-and-augmented-reality/
GitHub Build Guide
GitHub Repository
Forums
This medium article by @Tasuku is really good! Check it out for relevant links for Exii, 1-10.inc and other tweaks.
@mdrjjn put together this guide on how to build Exii Version 1; he made a video too!
Here's a link to Psychic Vr Lab's guide on going through the calibration process
@atlee19 put together this cool website with a bunch of Open Source Demos
@eswar made this awesome calibration walkthrough
@eswar also made this tutorial for 6dof tracking using a vive tracker!
Just joining? @callil made this awesome presentation for the New York meetup, it’s a great summary of what has happened so far!
The world is full of stereo cameras and sensors, this page will help you choose which one to get for your headset.
The Northstar Next platform is designed to be modular, this means you can swap in or use any tracking camera/platform you'd like! Given how quickly the landscape around tracking changes, and how often new components/parts come out it's important to be able to upgrade electronics instead of having to build a whole new HMD, it's the reuse part of reduce/reuse/recycle!
Below are a few recommendations for various sensors, they all provide different features and have pros/cons so be sure to ask on discord if you have any specific questions!
Generally when it comes to out-of-the-box hand tracking, Ultraleap (formerly Leap Motion) is the best in the field. They have a variety of sensors to choose from. For Northstar headsets we generally recommend the SIR170, as it's the lightest available sensor and is designed for integration into headsets. However, you can also use the first or second generation Leap Motion cameras. The Leap Motion platform currently works on Windows and Linux, with macOS support coming soon!
Throughout the duration of Project Northstar there have been many 6DOF sensors on the market. In this category, we'll show sensors that can be used "out of the box" with minimal configuration.
The Xvisio XR50 sensor is one of the smallest 6DOF tracking sensors available. It uses a mix of onboard and on host compute to calculate pose, and has some other available features like Plane Detection.
While these cameras are now in the end-of-life stage, they do still function and are generally one of the simplest tracking platforms to get started with. The T261 is just the embedded version of the T265, which means it's better suited for use in a weight-sensitive project like this one, but either sensor will function the same. The T26x sensors provide access to a stream of camera poses directly from the device, which means your computer doesn't have to process the poses.
If you want more reliable 6DOF tracking at the cost of portability, you can choose to use a SteamVR Lighthouse based solution. There are a variety of trackers available for this system, the most common being from HTC Vive and Tundra Labs.
Lighthouse based solutions use two external laser sources to track the position of the tracker. The Lighthouse sensors themselves can be acquired via Valve or HTC.
The Luxonis cameras also sport an onboard Movidius AI chip, which means you can run other computer vision tasks as well, directly on the sensor itself.
There are many variants of Luxonis cameras to choose from, we generally recommend the Oak-D-W or Oak-D-Pro-W. It is important to get the -W variants, as these are the wide FOV versions.
This page goes over the general setup you should do after you've finished building and calibrating your headset.
There are currently three methods of developing software on Northstar headsets.
This is the recommended developer experience. Esky is currently built on Unity and has built in support for the MRTK. It has video passthrough with the t261/t265, temporal reprojection, and supports both the 2D and 3D calibration methods.
This is the default unity experience, it's barebones and built for the 3D calibration rig. If you're experienced with unity and want to tinker with the original source code for Northstar this is the place for you.
The SteamVR integration allows any SteamVR game to run on a Northstar headset. Hand tracking isn't a replacement for controllers yet so you won't have a fun time in beat saber, but for demos like cat explorer or the infamous cubes demo you'll have full support for hand tracking.
Prebuilt Examples There are a handful of prebuilt demos for Northstar including LeapPaint, Galaxies and others. These will be linked on a separate page/database at a future date; for now, check the #showcase channel.
You'll want to make sure that your headset works properly: plug in power to the integrator board or driver board first, then the DisplayPort connection. On some headsets, plugging the display in first causes the driver board to get enough power through the DisplayPort connection to boot, but not enough power to run properly. If you run into this issue, unplug your headset, then plug in power first, followed by the display connection.
Your Northstar display will show up as a normal monitor and will look "upside-down" if viewed through the headset. This is normal, we compensate for this in software written for the headset. Make sure your headset is set up so it's to the right of your main display(s), and that the resolution is 2880x1600. You'll also want to make sure that the scale and layout section is set to 100%. By default, the headset will run at 90hz. You'll also want to ensure that your headset is set to extended mode and not mirrored.
Push down on both circular buttons on the headset for four seconds to power cycle the integrator board. This should cause the RealSense device to enumerate and cause windows to detect it.
If step 1 did not work you can try unplugging the headset and plugging it back in, make sure your USB connection is plugged into a USB 3.1 port.
If both of the above steps did not work you can try resetting the USB hub in device manager. This solution has solved most edge cases we've seen so far.
There are currently three methods of getting software running on North Star headsets.
Project Esky aims to be the software companion to allow development with the Deck X/North Star Display system out of the box! (Utilizing the t265/1 or a ZED system for 6DoF tracking)
Project Esky is a complete unity framework that handles
Rendering (with both the 2D (V2) and 3D (V1) calibration systems)
MRTK Integration with the Leap Motion controller (aligned with the user's view)
6DoF Head Tracking + Re-Localization events (Save map, load map, add persistence, callbacks, etc.)
Peer to Peer based co-localization and networking at the click of a button (Provided devices are on the same local network)
Spatial mapping (Via the ZED SDK, with future plans to generate point clouds via the t261/t265 sensor)
Temporal Reprojection (2D calibration only)
In order to use Esky you'll need Unity, A Northstar headset, and a or file.
Esky runs within the Unreal engine, but the Unity integration is more complete
Esky's Unity Integration uses Unity 2020.3.X. If it complains that you're using a newer version, it should be OK to upgrade as long as you stay on Unity 2020.3. We currently use 2020.3.11f1.
Note that the EskySettings.json file is located in the root of your unity project folder. (Not visible within Unity)
For the V2 calibration, Copy the contents of your NorthStarCalibration.json file into Esky's EskySettings.json file, you will be replacing the following values within the v2CalibrationValues section:
left_uv_to_rect_x
left_uv_to_rect_y
right_uv_to_rect_x
right_uv_to_rect_y
For the V1 calibration, copy the contents of your NorthStarCalibration.json file into Esky's EskySettings.json file; you will be replacing all of the values within the v1CalibrationValues section.
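The V2 copy step above can be scripted. This is a sketch that assumes the four arrays sit at the top level of NorthStarCalibration.json; check your own file's layout before relying on it:

```python
import json

# The four per-eye distortion arrays named in the docs above.
V2_KEYS = [
    "left_uv_to_rect_x", "left_uv_to_rect_y",
    "right_uv_to_rect_x", "right_uv_to_rect_y",
]

def copy_v2_calibration(northstar_path: str, esky_path: str) -> None:
    """Copy the V2 calibration arrays from NorthStarCalibration.json into
    the v2CalibrationValues section of EskySettings.json.
    Assumes the arrays sit at the top level of the calibration file."""
    with open(northstar_path) as f:
        calib = json.load(f)
    with open(esky_path) as f:
        esky = json.load(f)
    for key in V2_KEYS:
        esky["v2CalibrationValues"][key] = calib[key]
    with open(esky_path, "w") as f:
        json.dump(esky, f, indent=2)
```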
Make sure the Northstar display is not flipped via Windows display manager. Additionally, it can be helpful to have your Northstar display set up to the far right of your monitor setup, though it can be placed anywhere if necessary; you'll just need to take note of the position for later.
Then, within the EskySettings.Json file, edit the displayWindowSettings so that the DisplayXLoc and DisplayYLoc reflect the position of your northstar display relative to your main monitors.
In the below example, Monitor #2 is my NorthStar, and Monitor #3 is marked as my 'Main Display'. Since my 'Main Display' is 1920x1080 pixels in size, my DisplayXLoc and DisplayYLoc values will be 0 and 1080, respectively.
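The monitor arithmetic above can be sketched as a tiny helper (a hypothetical function, not part of Esky; it just mirrors the example's reasoning that (0, 0) is the top-left of the main display):

```python
def display_window_location(main_w: int, main_h: int, northstar_side: str):
    """Top-left position of the North Star display in the virtual desktop,
    where (0, 0) is the top-left corner of the main display."""
    if northstar_side == "right":
        return (main_w, 0)
    if northstar_side == "below":
        return (0, main_h)
    raise ValueError("unsupported layout")

# The layout described above: a 1920x1080 main display with the
# North Star positioned directly below it gives (0, 1080).
x, y = display_window_location(1920, 1080, "below")
```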
Save the EskySettings.json file. You're now free to proceed to open the Unity Project to complete the calibration
Before you can begin doing cool northstar stuff, you'll need to align your hands so that the virtual image matches the real world, and configure the MRTK to your specific North Star Setup.
For V1:
Observe the following section of your V1 (3D) calibration json:
You will need to copy the position and rotation values into the following area within the EskySettings.json file. The position maps to TranslationEyeToLeapMotion, and the rotation maps to RotationEyeToLeapMotion.
Save when complete. You can see an example of a completed EskySettings.json file in the Extras section.
For V2:
Hit play, then, click on the unity Game window (so that unity can receive the keyboard input)
Alignment is pretty straightforward, following the instructions displayed in your headset:
1) Hold your right hand up, so that the virtual hand appears in the center of the screen
2) Hold the space bar
3) While holding space, translate your right hand so that it aligns with the frozen virtual hand, NOTE: Try not to move your fingers as you do this
4) Once aligned, release the space bar
5) Repeat steps 1-4 three or four times; the alignment system will let you know when it has collected enough samples.
If you are happy with the alignment, hit 's' to save. If not, stop Unity (hit the play button again), then hit play to start the process again, repeating steps 1-5.
Further adjustment can be done with the arrow keys
If you look at the Scene Hierarchy, you will notice the MixedRealityToolkit GameObject.
The inspector will show the Mixed Reality Toolkit configuration. Click Input -> Input Data Providers
You will see the following window for configuring all of esky's settings.
The settings are explained as follows:
Rig To Use: This controls which optical setup is used for your Northstar; V1, V2, Project Ariel, or a custom rig can be selected.
Filter System To Use: We work hard to develop Project Esky, and while we have a newer (and in our opinion better designed) pose filtering system, you can change this value to revert back to the old filtering pipeline.
Reprojection Settings (V2/Ariel Only): This enables/disables the late-stage temporal reprojection built into Project Esky's native rendering pipeline.
Native Shader To Use (V2/Ariel Only, Currently not implemented): This changes the undistortion method used by the native rendering pipeline, we recommend not editing this.
Target Frame Rate (V2/Ariel Only): This changes the target frame rate for Unity. You can select 120, 90, 60, and 30 frames per second! NOTE: This frame rate is independent of the native rendering pipeline that handles composition, which always runs at 120 frames per second!
Leap Controller Orientation: This changes the way the MRTK handles the leapmotion controller, we recommend leaving this as 'Esky'
Enter/Exit pinch distance: This changes the distance between the index and thumb before the MRTK considers a pinch start/finish action (in meters).
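A sketch of what the pinch check boils down to: the distance between the two fingertip positions is compared against the configured threshold (the 2 cm default and the function name here are illustrative assumptions, not Esky's actual values):

```python
import math

def is_pinching(index_tip, thumb_tip, enter_pinch_dist_m: float = 0.02) -> bool:
    """True when the index and thumb fingertips (3-D points, in metres)
    are within the pinch-enter distance of each other."""
    return math.dist(index_tip, thumb_tip) <= enter_pinch_dist_m

# Fingertips 1.5 cm apart register as a pinch with a 2 cm threshold:
pinching = is_pinching((0.0, 0.0, 0.0), (0.015, 0.0, 0.0))
```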
Save after stopping editor: The default behaviour of Project Esky is to dump your current MRTK settings back into the EskySettings.json file, which is also copied to any build directory when you build your Unity project. You can disable this behaviour but we don't recommend it!
Use Tracker Offsets (Not Implemented): This places the virtual camera relative to the 6DoF tracker, good for external between-sensor calibrations.
Uses Camera Preview: This changes whether you intend to use image passthrough preview with the t261, keep in mind you must have USB 3.0 connected to your NorthStar in order to use it!
Uses External RGB Camera: For those with an RGB sensor, you can enable this to use the RGB image passthrough. Unless you know what you're doing, keep this disabled!
Then simply hit 'play' and you're good to go :D
Note: Some of the values you see are either controlled by the MRTK in editor, or not yet in use.
We have an anonymous report form available . We can't follow up on an anonymous report with you directly, but we will fully investigate it and take whatever action is necessary to prevent a recurrence.
This code of conduct was adopted from the code of conduct used by the wonderful people at the .
All the sensors below use a combination of Visual + Inertial data.
These cameras are no longer being manufactured or supported by Intel, and you'd likely have to find them second hand. Note that the last version of the RealSense SDK that supports these sensors is v2.53.1.
Luxonis cameras are designed for a variety of computer vision use cases, and can work well for hand tracking, but aren't explicitly designed for the task. This won't be an "out of the box" solution for hand tracking like Ultraleap, but if you're more of a tinkerer, and willing to set up a lot of the software yourself, you can try a camera from Luxonis and use open source hand tracking on Linux platforms.
There are a handful of demos that require leap motion's multi-device beta driver, located . If you're having issues with getting your leap working in demos or unity this is probably the reason. Once you have the drivers installed open the leap motion control panel and make sure that the connection is "good", as shown in the figure below.
In order to use your T261 / T265 sensor you'll need to install the RealSense SDK, located (be sure to install v2.53.1, as this is the last version that supports the RealSense Tracking range.). Once you are finished installing the SDK, open the RealSense viewer application to ensure that your RealSense device is connected. If you are using the Deck X headset and your T261 is not showing up you can take the following actions to resolve it:
This is the recommended developer experience. It has video passthrough with the T261 / T265, built-in support for the Mixed Reality Tool Kit and support for both 2D and 3D calibration methods.
This is the default unity experience, it's bare-bones and built for the 3D calibration rig. If you're experienced with unity and want to tinker with the original source code for North Star this is the place for you.
The SteamVR integration allows any SteamVR game to run on a Northstar headset. Hand tracking isn't a replacement for controllers yet so you won't have a fun time in beat saber, but for demos like cat explorer or the infamous cubes demo you'll have full support for hand tracking.
Prebuilt Examples There are a handful of pre-built demos for North Star including LeapPaint, Galaxies and others. These will be linked on a separate page/database at a future date; for now, check the #showcase channel.
Open the Unity project, then navigate to ///LMHandCalibration.unity
Now we get to the fun stuff! As of RC1, the MRTK handles all of the configuration for Project Esky! Open Project Esky in Unity, then navigate to ///HandInteractionExamples.unity
This page will walk you through installing and running the SteamVR driver. It is currently supported on Windows, with Linux support in progress. A launcher/installer is also in progress.
This SteamVR driver is still a work in progress; if you run into any issues, please reach out on Discord.
- Head Tracking
- Hand Tracking
- View Projection
- Skeletal tracking
- Basic input
- T265 Sensor integration
- Gesture recognizer
Versions of vendor libraries not included, here is where to get them:
You will need to install the leap motion multi-device drivers in order for this driver to work. - LeapDeveloperKit 4.0.0+52238
If using the structure core you will need the CrossPlatform SDK 0.7.1 and the Perception Engine 0.7.1 https://structure.io/
If using the intel realsense t265, you should install the Intel® RealSense™ SDK 2.0
We have a prebuilt version of the driver available here. You can place it in the following directory, or use vrpathreg as shown below.
In order to build from source you will need to install Visual Studio 2019 with the C++ and .NET workloads and the MSVC v142 toolset. In addition you'll also need to install Git and CMake.
All commands below are run in the Windows command prompt.
In the folder in which you want the repo to exist, run the following commands:
- Open the generated solution, set northstar as the startup project (right-click the project and choose "Set as StartUp Project"), and build.
Make sure to target x64 and a Release build to remove any object-creation slowness.
- The release will be in `build/Release/` and will consist of DLL files.
- Copy all the DLLs to wherever you want to install from; they should go in the `resources/northstar/bin/win64` directory. Create this directory if it does not exist and put all the generated DLLs inside.
- Next, register the driver with SteamVR using the vrpathreg tool found in the SteamVR bin folder. This is located at
C:\Program Files (x86)\Steam\steamapps\common\SteamVR\bin\win64\vrpathreg.exe
vrpathreg is a command-line tool; you have to run it via the Command Prompt. To do this, follow these steps:
1) Open the Command Prompt
2) Run `cd C:\Program Files (x86)\Steam\steamapps\common\SteamVR\bin\win64\`
3) Run `vrpathreg adddriver <full_path_to>/resources/northstar`
4) Verify the driver has been added by running `vrpathreg` with no arguments; it will show you a list of installed drivers.
- At this point vrpathreg should show the driver registered under "external drivers". To disable it, you can either disable it in the SteamVR developer options under "Startup / Manage Add-ons", or run `vrpathreg removedriver <full_path_to>/resources/northstar`
- Running SteamVR with the Northstar connected (and no other HMDs connected) should work at this point, but probably won't have the right configuration for your HMD. Take the .json file from calibrating your Northstar and convert it to the format found in `resources/northstar/resources/settings/default.vrsettings`
- Restart SteamVR. If the driver is taking a long time to start, chances are it got built in Debug mode; Release mode optimizes away a bunch of intermediate object creation during the lens distortion solve that happens at startup. If things are still going south, please issue a bug report here:
- If you wish to remove controller emulation, disable the Leap driver in the SteamVR developer settings.
This page describes both existing calibration methods, as well as how to align the LeapMotion Controller properly.
Believe it or not, most hardware doesn't work perfectly the moment it's assembled. Assembling headsets by hand leads to small variances in each headset that need to be compensated for in software. To get the best experience with your Project Northstar headset, you'll need to follow a few steps to compensate for these variances.
The first type of calibration you'll need to do is classified as Optical Calibration. This type of calibration uses stereo cameras to calculate the image distortion caused by the parabolic reflectors in the Project Northstar headset. There are two versions of Optical Calibration you can do based on what hardware you have access to. Both versions currently require a Calibration Stand to be 3D printed. The calibration methods distort the normal image displayed on the screens to make it appear correctly through the headset. Unity, Unreal, and other engines/compositors need to be able to take the information generated by the calibration method and plug it into a rendering method in order to properly compute the image. Please see the table below for which methods are currently supported on specific platforms.
Note that Calibration files generated by 3D Optical Calibration and 2D Optical Calibration are not currently interchangeable
3D Optical Calibration calculates the exact position of your headset's screens and Optical Combiners, and uses raytracing (not the RTX kind, don't worry) to calculate the distortion for the headset. This allows it to be adjusted for multiple Interpupillary distances and eye reliefs. This method requires using two Stereo cameras and an external monitor (in addition to your main screen) to work. This makes it more expensive/complicated to do compared to 2D Optical Calibration. We currently recommend this method for maker spaces or groups that have to calibrate multiple headsets. This is the method used for preassembled and calibrated headsets from smart-prototyping and CombineReality.
Unlike 3D Optical Calibration, 2D Optical Calibration only calculates the 2D distortion generated by the headset's optical configuration. This method is much simpler to set up. It only requires one stereo camera, which you likely already have for your headset! You can currently use the Intel T261 or T265, but if you have another stereo camera, reach out on the Discord and we will see what we can do to support it.
Hand position is dependent on the position of the Leap Motion sensor; make sure your Leap Motion sensor has the bottom metal bezel hidden behind the 3D-printed housing. You can use this application to set up your hand position: https://github.com/fmaurer/NorthStarToolbox
Both the 2D and 3D versions of the calibration setup share the same 3D printable stand. You can find the assembly instructions below. (Note that they show the instructions for the original, 3D dual camera, stand.)
Early Access Documentation for the Next Generation Northstar headsets available on https://shop.combinereality.com
You can get next generation Northstar kits from Combine Reality: https://shop.combinereality.com.
The display assembly for Northstar Next kits includes:
The new display driver board supports two BOE 1440x1600 displays at ~90Hz. The board has a USB-C input and a 12-pin serial output to connect to the USB hub board included in the next-gen kits.
The Driver board and break out boards use special FPC connectors. These connectors have the pins wrap around the connector, meaning that either side can make contact with the ribbon cable pins.
The ribbon connections on the Driver board and Adapter boards use an FPC connection called a Backflip Latch. The gates have pins on both the top and bottom of the connector. This means you must connect the display ribbon cable as shown in the instructions, otherwise you will short your displays.
In order to connect the ribbon cables to the adapter boards correctly you must do so as pictured below:
The gates on the Driver board and Adapter boards should be opened. In the photo below the backflip latch on the right of the driver board is open, and the backflip latch on the left is closed.
Move both latches to the open position.
Ensure that the ribbon cables are facing down so the pins face the PCB. Ensure that the ribbon cable is fully inserted in the backflip latch; it should sit fully parallel, as shown below. Once inserted, close the latch.
Flip the driver board over so that the gold diamond is facing upwards and the USB-C port is facing down. Then line up the Adapter board as shown, the pins of the ribbon cable should now be facing up. Connect the ribbon cable in the same manner you did on the driver board.
Align the displays so that they point in the same direction the arrows are pointing on the adapter board.
Then line up the pins and push firmly to connect the displays to the FPC connector.
Double Check your work.
Take a moment to examine each connection.
Are the cables fully inserted in each connector?
Are the backflip latches closed and fully secure?
Are the cables in the correct orientation? It's always better to be patient and double-check your work than to rush and power on a system haphazardly.
Power on your Display Driver board by connecting the USB-C cable. Be careful when handling the displays, as the USB-C cable will likely try to pull the driver board and displays along with it. The driver board has no power button, so connecting the cable powers it on automatically. The Display Driver board will automatically negotiate a handshake with your computer, and the displays will turn on and show up as a display. For now we just care about the displays turning on and showing an image, so don't worry too much about settings. You should see your desktop background on the displays. If you do not see an image or the displays don't turn on, ensure your computer is capable of outputting video over USB-C, commonly referred to as DP Alt Mode.
"The future of spatial computing deserves to be open.
We envision a future where the physical and virtual worlds blend together into a single magical experience. At the heart of this experience is hand tracking, which unlocks interactions uniquely suited to virtual and augmented reality. To explore the boundaries of interactive design in AR, we created and open sourced Project North Star, which drove us to push beyond the limitations of existing systems." - Leap Motion
Project North Star is an open-source Augmented Reality headset originally designed by LeapMotion (now UltraLeap) in June 2018. The project has had many variations since its inception, by both UltraLeap and the open-source community. Some of the variations are documented and linked to here but visit the discord server for more to-the-moment information. The headset is almost entirely 3D printable, with a handful of components like reflectors, circuit boards, cables, sensors, and screws that need to be sourced separately.
Project North Star at MIT Reality Hack 2020, photo credit: Matthew Daiter
There's also a large community of Northstar developers and builders on Discord; join the server to share your build, ask questions, or get help with your projects.
For a more detailed look into the project, check out our General FAQ, or Mechanical pages.
Project North Star has seen its fair share of revisions and updates since the original open-source files were released. To clear up any ambiguity from the outset: Release 1 was an internal release. Release 2, the first public open-source release (sometimes referred to as the initial release), was in 2018. Release 3 came in 2019 and improved on the mechanical design in many ways. The Deck X iterated on this and provided an integrated circuit board that reduces the cables from the headset to just two (USB 3 / mini-DP) by combining the USB devices into a custom-built USB hub + Arduino module. As shown in the table below, the newly released Northstar Next is probably what new users will want to start with; it has an emphasis on modularity and affordability.
The following instructions can be used for version 1 of the calibration rig
The above video shows the general process of performing a 3D calibration on your project northstar headset.
This requires:
3D Printing the mechanical assembly, search for it here: https://leapmotion.github.io/ProjectNorthStar/
Affixing TWO of these Stereo Cameras to it: https://www.amazon.com/ELP-Industrial-Application-Synchronized-ELP-960P2CAM-V90-VC/dp/B078TDLHCP/
Acquiring a large secondary monitor to use as the calibration target
Find the exact model and active area of the screen for this monitor; we'll need it in 3)
Print out an OpenCV calibration chessboard and affix it to a flat backing board.
Flatness is absolutely crucial for the calibration.
Editing the config variables at the top of dualStereoChessboardCalibration.py
with the correct:
Number of interior corners on each axis
Dimensions of each square on the checkerboard (in meters!)
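As a concrete sketch, the config block at the top of dualStereoChessboardCalibration.py covers roughly these two values. The variable names and numbers below are illustrative placeholders, not the script's actual identifiers; check the real names at the top of the file:

```python
# Illustrative config values (hypothetical names; the actual
# identifiers live at the top of dualStereoChessboardCalibration.py).

# Interior corners per axis: a board with 10x7 squares has 9x6
# interior corners. Count corners where four squares meet, not squares.
chessboard_interior_corners = (9, 6)

# Physical edge length of one printed square, in meters(!).
# Measure it after printing; "fit to page" scaling changes the size.
square_size_m = 0.024

# The corner detector expects this many points per frame.
expected_corner_count = (chessboard_interior_corners[0]
                         * chessboard_interior_corners[1])
print(expected_corner_count)  # 54
```

Getting either value wrong is a common reason the calibrator never takes a snapshot, since the detector looks for exactly that corner grid.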
Install Python 3 on your machine and run pip install numpy and pip install opencv-contrib-python
If pip is not on your PATH, run it from the Python Scripts folder, usually something like C:\Users\*USERNAME*\AppData\Local\Programs\Python\Python36\Scripts
Running dualStereoChessboardCalibration.py
First, ensure that your upper stereo camera appears on top in the camera visualizer
If not, exit the program, and unplug/replug your cameras' USB ports in various orders/ports until they do.
Hold your checkerboard in front of your camera array, moving it around to get good coverage.
Every time the calibrator takes a snapshot, it will print a notice in the terminal.
After 30 snapshots in both camera views, it will run the calibration routines and display rectified views.
If the calibration went well, you will see your live camera stream rectified such that all straight lines in the real world will appear straight in the camera image, and the images will look straightened and vertically aligned between the screens.
If this happened, congratulations! Exit out of the program and run it one more time.
Running it again verifies the calibration can be loaded AND GENERATES THE CALIBRATION JSON
If the calibration did not go well (you see horrible warping and badness), you can attempt the calibration again by:
Deleting the created dualCameraCalibration.npz
AND cameraCalibration.json
files from the main folder
Trying again: ensuring the checkerboard is flat, the config parameters are correct, and that you have good coverage (including along depth)
You should have a good cameraCalibration.json
file in the main folder (from the last step)
Ensure that your main monitor is 1920x1080, the Calibration Monitor appears to the left of the main monitor, and the North Star display appears to the right of it.
This ensures that the automatic layout algorithm detects the various monitors appropriately.
Edit config.json
to have the active area for your calibration monitor you found earlier.
Download this version of the Leap Service: https://github.com/leapmotion/UnityModules/tree/feat-multi-device/Multidevice%20Service
The calibrator was built with this version; it will complain if you don't have it :/
Now run NorthStarCalibrator.exe
You should see the top camera's images in the top right, and the bottom camera's images on the bottom.
If this is not so, please reconnect your cameras until it is (same process as for the checkerboard script)
You should also see a set of sliders and buttons running along the top.
These control the calibration process.
First, point the bare calibration rig toward the calibration monitor
Ensure it is roughly centered on the monitor, so it can see all of the vertical area.
Then Press "1) Align Monitor Transform"
This will attempt to localize the monitor in space relative to the calibration rig. This is important for the later steps.
Next, place the headset onto the calibration rig and press "2) Create Reflector Mask"
This should mask out all of the camera's FoV except the region where the screen and reflectors overlap the calibration monitor.
If it does not appear to do this, double check that all of the prior steps have been followed correctly...
Now, before we press "3) Toggle Optimization", we'll want to adjust the bottom two sliders until both camera views show roughly equal brightness.
This is important since the optimizer is trying to create a configuration that yields a perfectly gray viewing area.
Now press "3) Toggle Optimization" and observe it.
It's switching between being valid for the upper and lower camera views, so only one image is going to appear to improve at a time.
You should see it gradually discovering where the aligned camera location is.
This is the finickiest step in the process; it's possible that the headset is outside the standard build tolerances.
If you suspect this is the case, increase the simplexSize in the config.json
to increase the area it will search.
If it does converge on an aligned image, then congratulations! Toggle the optimization off again.
Press button 4) to hide the pattern, put the headset on, and use the arrow keys to adjust the view dependent/ergonomic distortion and the numpad 2,4,6,8 keys to adjust the rotation of the leap peripheral.
When satisfied, press 5) to Save the Calibration
This will save your calibration as a "Temp" calibration in the Calibrations
Folder (a shortcut is available in the main folder).
You can differentiate between calibrations by the time in which they were created.
http://blog.leapmotion.com/bending-reality-north-stars-calibration-system/
Bringing new worlds to life doesn’t end with bleeding-edge software – it’s also a battle with the laws of physics. Project North Star is a compelling glimpse into the future of AR interaction and an exciting engineering challenge, with wide-FOV displays and optics that demanded a whole new calibration and distortion system.
Just as a quick primer: the North Star headset has two screens on either side. These screens face towards the reflectors in front of the wearer. As their name suggests, the reflectors reflect the light coming from the screens, and into the wearer’s eyes.
As you can imagine, this requires a high degree of calibration and alignment, especially in AR. In VR, our brains often gloss over mismatches in time and space, because we have nothing to visually compare them to. In AR, we can see the virtual and real worlds simultaneously – an unforgiving standard that requires a high degree of accuracy.
North Star sets an even higher bar for accuracy and performance, since it must be maintained across a much wider field of view than any previous AR headset. To top it all off, North Star’s optics create a stereo-divergent off-axis distortion that can’t be modelled accurately with conventional radial polynomials.
How can we achieve this high standard? Only with a distortion model that faithfully represents the physical geometry of the optical system. The best way to model any optical system is by raytracing: tracing the path rays of light travel from the light source, through the optical system, to the eye. Raytracing makes it possible to simulate where a given ray of light entering the eye came from on the display, so we can precisely map the distortion between the eye and the screen.
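The idea can be sketched with a toy raytrace. North Star's real combiners are curved reflectors, and the sketch below uses a flat mirror tilted 45 degrees purely to show the principle of mapping an eye ray back toward a screen direction; none of these numbers come from the actual headset geometry:

```python
import math

def reflect(v, n):
    # Reflect direction v about unit normal n: r = v - 2(v.n)n
    d = sum(vi * ni for vi, ni in zip(v, n))
    return tuple(vi - 2.0 * d * ni for vi, ni in zip(v, n))

# Flat stand-in "combiner", tilted 45 degrees so a forward ray bounces
# up toward a screen mounted above the eye (as in North Star).
s = 1.0 / math.sqrt(2.0)
mirror_normal = (0.0, s, -s)

eye_ray = (0.0, 0.0, 1.0)          # wearer looking straight ahead (+z)
toward_screen = reflect(eye_ray, mirror_normal)
print(toward_screen)               # approximately (0.0, 1.0, 0.0): straight up
```

Repeating this for every eye ray, against the real curved reflector surface, is what produces the eye-to-screen distortion map described above.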
But wait! This only works properly if we know the geometry of the optical system. This is hard with modern small-scale prototyping techniques, which achieve price effectiveness at the cost of poor mechanical tolerancing (relative to the requirements of near-eye optical systems). In developing North Star, we needed a way to measure these mechanical deviations to create a valid distortion mapping.
One of the best ways to understand an optical system is… looking through it! By comparing what we see against some real-world reference, we can measure the aggregate deviation of the components in the system. A special class of algorithms called “numerical optimizers” lets us solve for the configuration of optical components that minimizes the distortion mismatch between the real-world reference and the virtual image.
Leap Motion North Star calibration combines a foundational principle of Newtonian optics with virtual jiggling. For convenience, we found it was possible to construct our calibration system entirely in the same base 3D environment that handles optical raytracing and 3D rendering. We begin by setting up one of our newer 64mm camera modules inside the headset and pointing it towards a large flat-screen LCD monitor. A pattern on the monitor lets us triangulate its position and orientation relative to the headset rig.
With this, we can render an inverted virtual monitor on the headset in the same position as the real monitor in the world. If the two versions of the monitor matched up perfectly, they would additively cancel out to uniform white. (Thanks Newton!) The module can now measure this “deviation from perfect white” as the distortion error caused by the mechanical discrepancy between the physical optical system and the CAD model the raytracer is based on.
This “one-shot” photometric cost metric allows for a speedy enough evaluation to run a gradient-less simplex Nelder-Mead optimizer in-the-loop. (Basically, it jiggles the optical elements around until the deviation is below an acceptable level.) While this might sound inefficient, in practice it lets us converge on the correct configuration with a very high degree of precision.
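The “deviation from perfect white” cost and the jiggling loop can be sketched in miniature. This is a 1-D toy, with a brute-force search standing in for the Nelder-Mead simplex optimizer and lists standing in for camera frames:

```python
import math

# Toy version of the photometric calibration loop. A pattern and its
# additive inverse sum to white (1.0) only when aligned; misalignment
# shows up directly as "deviation from perfect white".

def pattern(n):
    # arbitrary smooth test pattern with values in [0, 1]
    return [0.5 + 0.5 * math.sin(0.37 * i) for i in range(n)]

def deviation_from_white(real, virtual_inverted, shift):
    # Render the inverted virtual pattern at an integer misalignment
    # `shift` and measure the mean |real + virtual - 1.0|.
    n = len(real)
    return sum(abs(real[i] + virtual_inverted[(i + shift) % n] - 1.0)
               for i in range(n)) / n

real = pattern(256)                 # stands in for the physical monitor
virtual = [1.0 - v for v in real]   # inverted virtual monitor (thanks, Newton)

# Stand-in for the Nelder-Mead "jiggling": brute-force the best shift.
best = min(range(-8, 9), key=lambda s: deviation_from_white(real, virtual, s))
print(best)  # 0: zero misalignment cancels to uniform white
```

The real system evaluates the same kind of photometric cost over camera frames while the optimizer perturbs the poses of the optical elements instead of a single shift.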
This might be where the story ends – but there are two subtle ways that the optimizer can reach a wrong conclusion. The first kind of local minima rarely arises in practice. The more devious kind comes from the fact that there are multiple optical configurations that can yield the same geometric distortion when viewed from a single perspective. The equally devious solution is to film each eye’s optics from two cameras simultaneously. This lets us solve for a truly accurate optical system for each headset that can be raytraced from any perspective.
In static optical systems, it usually isn’t worth going through the trouble of determining per-headset optical models for distortion correction. However, near-eye displays are anything but static. Eye positions change for lots of reasons – different people’s interpupillary distances (IPDs), headset ergonomics, even the gradual shift of the headset on the head over a session. Any one of these factors alone can hamper the illusion of augmented reality.
Fortunately, by combining the raytracing model with eye tracking, we can compensate for these inconsistencies in real-time for free! We'll cover the North Star eye tracking experiments in a future blog post.
This page answers some more general Northstar-related questions. The other sections below it answer questions related to their respective topics.
Have a question not mentioned here? Feel free to ask on the community Discord
Good question! There are multiple variations of the reference design from Leap Motion. The reference design can be found on this page
The minimum system requirements for Northstar can vary based on the type of application you want to run. Your system will need a DisplayPort connection that can output 4K@60Hz.
Project Northstar uses the same displays commonly used in VR headsets to provide a high resolution and high field of view experience.
While most of the design is 3D printable, there are components, like the screens, driver board, combiners, and Leap Motion Controller, that you will have to order.
For Northstar Next you can get them from the CombineReality shop.
CombineReality is a company started by Noah Zerkin and Alex Chu to be able to build, fund, and distribute Northstar Headsets.
Most components fit within a print volume of 130 x 130 x 130mm; however, the two largest prints will need a print volume of 220 x 200 x 120mm. It is possible to split the parts, using MeshMaker, to allow them to fit on smaller print volumes. The Ender series (220 x 220 x 250mm) by Creality seems to be a fan favorite on the Discord if you're just getting started with 3D printing. If you want something with a larger print area, check out the Creality Pro (300 x 300 x 400mm). If you want other recommendations, feel free to ask on the Discord.
The Intel RealSense T265 is the most commonly used device currently. It supports 6DoF (degrees of freedom) tracking but does not support world meshing.
The Occipital Structure Core is great since it's cross-platform, not GPU dependent, and has more features, but it's more expensive than the RealSense. (Note that if you order this you need the black-and-white camera version and NOT the color version). There are members of Occipital in the Discord to answer more questions; check out the #occipital-structure-core channel. However, Occipital has discontinued support for the Perception Engine, so it is no longer recommended.
If you have a Windows PC with a GTX 1070 or above you can use the ZED Mini, but it only works with Nvidia CUDA, which limits its use.
If you have a Vive already, you can use a vive tracker for 6dof tracking, however the vive tracker requires external "lighthouse" base stations in order to function, making it more difficult to transport the headset or use it in different environments without extra setup.
Due to the nature of 3D printing and assembly, each headset is going to be slightly unique and will require going through a calibration process to display the image correctly. There are currently two ways to calibrate a Northstar headset. The first method uses two stereo cameras to calculate the 3D position of the displays and reflectors. The second method uses a single stereo camera and is currently set up to use the Intel T265 camera, which we currently recommend for 6DoF tracking. This allows Northstar developers to reuse the T265 rather than purchase two separate stereo cameras.
These numbers refer to the focal distance at which images appear to the user. The only difference between the two is the location of the screens relative to the combiner. The focal distances can be switched by replacing the display tray. Typically, we recommend a 25cm build since it is easier to get started with. 25cm is sharper for items attached to your hand or right in front of your face.
- 25cm provides a much better experience when using all the virtual wearable interfaces from Leap Motion
- 25cm allows for a slightly wider FOV
- 25cm is also much brighter because of the angle of incidence and collimation layer in the display panel
- For wandering about and batting stuff around, throwing things, sticking stuff to your wall, or making characters run around the room, 75cm is way more convincing. With 25cm, the vergence-accommodation effect is noticeable, even if you get your IPD just right
- 75 cm is harder to calibrate than 25cm
In general, 75cm is better if a lot of what you're dealing with is further away, while 25cm is best if you're prototyping hand interactions. Overall, you can still tell how far away things are with either focal distance via stereo.
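As a rough back-of-envelope illustration of why the focal distance matters, you can compute the angle your eyes converge through when fixating at each distance. The 64mm IPD below is an assumed typical value, not a number from this page:

```python
import math

def vergence_angle_deg(ipd_m, distance_m):
    # Angle between the two eyes' lines of sight when fixating a
    # point straight ahead at distance_m.
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

ipd = 0.064  # assumed 64mm IPD
print(round(vergence_angle_deg(ipd, 0.25), 1))  # ~14.6 degrees at 25cm
print(round(vergence_angle_deg(ipd, 0.75), 1))  # ~4.9 degrees at 75cm
```

The angle changes much faster at near distances, which is one way to see why a mismatch between where your eyes converge and the fixed focal plane is most noticeable for close-up content.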
So to conclude, whichever decision you opt for will work depending on what you plan on doing. However, something to keep in mind is that if you wish to switch between the two display holders, you have to put the screens into the new trays and recalibrate the headset using the stand.
There are a total of four cables connecting to the board (see picture below): two ribbon cables that send data and power to the two displays, plus two connections on either side of the board, one for power and one for the mini DisplayPort (for transferring data)
Note that the board will only work with DisplayPort connections that support 4K@60Hz; HDMI-to-DisplayPort adapters will not work. USB-C with DisplayPort functionality or DisplayPort-to-mini-DisplayPort adapters work fine, as long as they support 4K@60Hz.
When the board has adequate power and is working properly, the yellow LED turns on. The LED's location is labeled in the image attached above. If a red LED also lights up upon plugging it in, insufficient power is being fed into the board. If you have a voltmeter handy, you may want to check that, after plugging the USB into a computer or a wall-socket-to-USB converter, the voltmeter shows 5V of potential; this is the expected output for a USB connection. If the output voltage is correct but you see no LED lighting up whatsoever, there is likely an issue with the display driver board. If this is the case, reach out in the #noa-labs-display-adapter channel on the Discord server for help with debugging. Secondly, if you do see the board LED lighting up but don't see anything on the screen, you can do a couple of things to get things going:
- Plugging the power USB into the same computer you are plugging your display adapter into helps things sync up (as this common source shares the same ground)
- Disconnect everything and connect things in the following order: displays first (make sure you align the pins correctly and don't force the cables into the board, as that might damage the pins), then the power cable into your laptop, and finally the display adapter. DO NOT, under any circumstance, unplug the display cables directly from the board while power is connected. Also, always prefer connecting and disconnecting at the computer end of the cables.
There is a reset button. Using it is actually much safer than plugging cables in and out during testing.
This Unity Package contains the Unity Assets, Scenes, and Pre-warping systems necessary to build Unity applications with the Project North Star headset.
Make sure to check the Getting Started with Software page before following the instructions on this page.
These assets require the Multi-Device Beta Service to display hands.
These assets are dependent on Release 4.4.0 of the Leap Motion Unity Modules (included in the package).
Make sure your North Star AR Headset is plugged in
In Windows display settings, make sure the headset is showing at the correct resolution (2880x1600) and is to the right of the main monitor.
Create a new Project in Unity 2018.4 LTS
Import "LeapAR.unitypackage" from the Github Repo
Navigate to LeapMotion/North Star/Scenes/NorthStar.unity
Click on the ARCameraRig
game object and look for the WindowOffsetManager
component
Here, you can adjust the X and Y Shift that should be applied to the Unity Game View for it to appear on the North Star's display
When you're satisfied with the placement, press "Move Game View To Headset"
With the Game View on the Headset, you should be able to preview your experience in play mode!
Key Code Shortcuts in NorthStar.unity
(in the Editor with the Game View in focus and playing)
`C` to Toggle Visibility of Calibration Bars
We have included a pre-built version of the internal calibration tool. We can make no guarantees about the accuracy of the process in DIY environments; this pipeline is built from multiple stages, each with multiple points of failure. Included in the .zip file are a Python script for calibrating the calibration cameras, a checkerboard .pdf to be used with it, a Windows-based Calibrator .exe, and a readme describing how to execute the entire process.
While this page is under construction, check out the working drawings here:
These are also available via PDF
Below is a step-by-step guide to assembling your Deck X HMD; each image is annotated and captioned.
Your Deck X kit already comes with the integrator/T261 mount. This part is printed in carbon fiber to better handle the temperatures of the RealSense T261, so you do not need to print it yourself.
3D Printed Optics Bracket
Heat Threaded Inserts (x8)
Soldering Iron
You may also find it helpful to acquire special soldering iron tips made specifically for inserts.
Locate the Optics Bracket
Locate six heat-set threaded inserts. Note that the two ends of the inserts are different: one has a flat surface around the threaded hole, while the other is tapered. You will be installing the inserts with the tapered end down, inserting your soldering iron into the flat end. For information about installing heat-set inserts into printed plastic parts, read this article.
Locate the six 4.6mm diameter holes in the top surface of the optics bracket.
Heat your soldering iron, fitted with a blunt soldering tip, to 220°C. The TS100 digital soldering iron's tip, pictured here, works great. As always, exercise care when handling your soldering iron. Burns suck.
Place each insert into one of the six 4.6mm holes and then insert the tip of the soldering iron into the insert. Do not apply significant pressure, as you don't want the threads catching on the surface of the iron's tip. As the plastic around the insert begins to soften, gently push the insert down into the hole until the edges are flush with the surface all the way around. Do your best to keep it centered. If it isn't perfect on the first try, you can use the soldering iron to reheat the insert and gently make adjustments.
Keep in mind that the heat-set inserts are brass and hold heat for a significant time after you've removed your soldering iron. Do not try to correct a skewed insert using your finger; if the insert and plastic are still hot enough to be easily adjusted, it is still hot enough to burn your skin.
The resulting assembly should look like this.
Turn the optics bracket over so that it is resting on its top surface and locate the two 4.6mm holes illustrated here.
Install heat-set inserts into these holes
The resulting assembly should look like this.
You will need the following
Optics Bracket Assembly
Power Filter Board
M2.5x0.45 5mm Button Head Phillips Screw (1)
Locate the hole illustrated here
Inserting your screwdriver through the Leap Motion Controller mounting area, secure the power board to the optics bracket assembly using the M2.5 button-head screw as illustrated here.
Note: Do not overtighten screws that fasten directly into printed parts without threaded inserts.
As you might expect, the resulting assembly should look like this.
You will need the following components for this step:
Leap Motion Controller
Micro-USB to 6-conductor ribbon adapter board
6-conductor ribbon cable
Gather your optics bracket assembly, 6-conductor ribbon cable, and ribbon to Micro-USB adapter board. Locate and remove your Leap Motion Controller from its package.
Making sure that the latch of the ribbon connector on the adapter board is open, insert either end of the ribbon into the connector as shown.
Applying a gentle rotational force to the top of the ribbon adapter latch tab, push the tab down and back toward the ribbon to latch it into place. Since "gentle" is a relative term, you may need a little more force if the latch tab doesn't move easily, but it shouldn't take much, and the tab is easy to break, so treat it nicely.
Insert the adapter board's USB connector into the wider side of the Micro USB SuperSpeed female connector as shown here.
Note: While the Leap Motion Controller has a SuperSpeed connector, it is a USB 2 device and does not require, or benefit from, the use of a SuperSpeed cable, so long as there is enough current available on the USB 2 pins.
The male MicroUSB plug on the adapter is longer than the socket on the Leap Motion Controller and will protrude as illustrated here. This is normal, but make sure that it is inserted as far as shown in this picture.
Remove the protective film from the front of the Leap Motion Controller and lay it with the outward side down on your assembly surface. Once it's installed in the optics bracket, we'll reapply the film to protect the sensor during the rest of the assembly process.
Double back (but don't crease) the ribbon, and hold as shown here.
Tuck the doubled-over ribbon and the end of the adapter board under the small shelf on the top left side of the optics bracket cavity.
Gently press your Leap Motion controller down and forward into its mounting area. It shouldn't take much pressure to do this, but because of the layering of the 3D-printed optics bracket, you may need to wiggle it a bit to get it into position.
The result should look like this.
Making sure that the Leap Motion Controller is positioned flat against the optics bracket lip along its top edge, you can reapply the protective film as shown. Or you can put it right side up. Your call.
Optics Bracket Assembly
Leap Motion Controller mounting spacer (3D printed Part)
Display Driver Board
Self Threading Screws (x3)
Locate your Leap Motion Controller Mounting Spacer
Position the Leap Motion Controller Mounting Spacer behind the Leap Motion Controller as shown here, oriented so that the smaller "tooth" is positioned at the top.
Locate your Display Driver Board
Carefully position the Display Driver Board so that the + and - power header pins are positioned over the header pin socket on the Power Filter Board as shown.
Tucking the ribbon out of the way as necessary, hold the Power Filter Board in place with one hand while pressing the Display Driver Board's header pins into the header pin socket. Be careful not to bend the legs of the capacitor on the Power Filter Board too much.
The Display Driver Board should be resting on the shelves at the back and on both sides of the cavity.
Locate the three Display Driver mounting holes shown here
Secure the Display Driver Board to the Optics Bracket using three M2 8mm thread-forming screws.
Important Note: Do not flex the Display Driver Board significantly, as doing so can damage the solder joints between the electronic components and the board. If tightening the screws would flex the board, reposition things so that it won't, or leave the screws loose as necessary.
The result should look like this... except that our camera lens kind of makes it look like the display driver board is flexed here. It isn't, and yours shouldn't be either. Seriously. Don't flex the board.
You will need:
Optics Bracket Assembly
JST Power Jumper
Connect either end of the JST Power Jumper cable to the JST socket on the Power Filter Board as shown. Hold the Power Filter Board with one hand (not shown) while inserting the jumper cable connector with the other.
Display Trays (25cm or 75cm, depending on which focal length you want; if you're unsure, start with 25cm)
Heat-set inserts (8)
Install the heat-set inserts into the back of the display trays.
Note: In the case of the 75cm trays, make sure that the inserts don't poke through too much into the slot for the display. If necessary, push inserts back into place from the other side.
The end result should look like this:
It will be very helpful to read through this step in full before proceeding. Pay careful attention and exercise patience; this is the hardest step in the assembly of the headset. (Note that step 7-4 has multiple photos; you have to scroll down to see them.)
Patience
BOE 3.5" 1440x1600 displays
Locate your displays and place them face down on your assembly surface. Make sure that the surface is clean and clear of any debris that might scratch the display surface. Do not remove the protective film from the display.
Okay... this is the scary part, so take a break and take some deep breaths. If you've got a hair dryer handy, go get it.
If you have a hair dryer or heat gun, use it at a LOW temperature to slightly soften the adhesive holding the ribbon cables to the back of the displays. Move your hot-air-blowing-device back and forth across the ribbon. If you don't feel confident doing this, don't do it. We're not sending you free replacement displays if you screw this up.
Starting from the loose end of the display connector ribbon cable, gently peel back the ribbon using even force and being careful not to tear it. This part is pretty easy. If you're lucky, the adhesive will come off the back of the display with the ribbon. You just won the lottery and at this point can just keep peeling until you get to the part of the ribbon with the electronics on it. If not, don't fret too much. You'll do fine.
Now that we've gotten to the part where the ribbon widens out and is populated with components, we're going to carefully keep separating the ribbon from the display. If the adhesive came off with the ribbon, this should be straightforward. In the more likely scenario that the adhesive is too solidly attached to the back of the display to peel off by itself, we're just going to have to carefully work the edge free. Don't use a blade to do this.
The greatest risk here is that the ribbon comes free and you accidentally tear the much thinner and more delicate film actually connecting the FPC to the display backplane. Don't do that. Be careful. And in case you didn't catch it the first time: be careful!
The other side of the ribbon isn't as delicate, and won't tear as easily because the backlight power FPC connects at this end. Regardless, be careful!
Yay! We got... oh wait. No we didn't. The adhesive is still holding the ribbon to the back of the display.
How about we just use our screwdriver to carefully peel the adhesive from the back of the ribbon.
Seems that worked. That was quite an adventure. Let's do it again! (Go do it again with the other display. That was too intense, so we'll just wait over here for you until you're done. Good luck, friend.)
Woohoo! You made it! Have some Gummy Bears! (No, there aren't any in the kit, but if you've got some handy, have some.)
Now that our display ribbons are no longer stuck to the backs of the displays, we can slide them into the trays. Start by partially peeling the protective film back from the display using the attached tab as illustrated. Don't completely remove the film.
Slide the display down into the tray as shown. The edges of the tray should slip under the edges of the film.
You can now lower the film back into place on the display surface. Ideally, leave them there until assembly is finished.
Assuming that you're assembling a 25cm headset, per our recommendation above, fold the ribbon (never creasing it, if avoidable) as shown in the sequence of pictures that follow.
Place the left display tray into position as shown; the small tab at the top of the display tray should slide into the optics bracket.
If installed properly the threaded holes should line up with those in the optics bracket as shown.
Hold the Power Jumper Cable out of the way of the display ribbon connector, then bend (but do not crease) the display ribbon as shown.
Press the ribbon cable into the connector on the driver board as shown. This does not require much force, and it will snap into place.
Locate the four screw locations on the rear of the optics bracket and use M2.5 Screws to fasten the display tray into the headset.
Awesome! It should look like the following image! Now let's move on to the right side.
Just like the previous step, go ahead and bend the cable as shown, being careful not to crease it.
Bend the cable under itself as shown
Slide the display tray into place, lining the notch up with the opening in the optics bracket.
Ensure that the display tray also lines up with the mounting location on the bottom of the optics bracket
Verify that the notch is secured in place, then attach the ribbon cable to the display tray as shown in the next tab.
Just like in the previous step, go ahead and screw in the display tray.
Great Job! The headset should look like this currently! Next, we'll be installing the Integrator board.
We recently introduced a new product to improve the installation and safety of the display process. The ribbon cable extensions remove the need to peel the display cable and provide a cleaner overall installation experience. Please see the following video for instructions:
Newer headset kits come with a piece that does not use heat-set inserts; you can skip this step if your T261 mount looks different from the one shown below.
Locate the T261/Integrator mounting bracket. This part is printed in PETG, which has a higher thermal threshold than the previous PLA prints, so you will need to make sure your soldering iron is at a higher temperature.
Installing the heat-set inserts will generally follow the same process as the previous sets.
That was pretty straightforward, onto the next step!
Oh boy, fun stuff! Grab the included thermal paste, three heatsinks, and the T261 sensor/mount.
Align the T261 sensor with the notch on the left side of the bracket.
Ensuring that the t261 is aligned properly, attach the heatsink to the middle of the t261 sensor.
Now that you have the middle heatsink aligned properly, remove the T261 from the bracket. Next we'll be applying the thermal paste for the remaining two heatsinks.
Apply thermal paste to the two indented rectangles as shown.
When installing the heatsink, you'll want to make sure that it does not block the vent at the top or bottom of the T261; line it up as illustrated below.
Following a similar process as the previous step, attach the heatsink to the other side of the t261, aligning it as illustrated.
Awesome! Now that you've got all the heatsinks installed your t261 should look like this!
Take note of the notch on the t261 bracket, we'll be using this to align and fasten the fan.
Holding the power cable out of the way, orient the fan so that the screw hole will line up with the t261 bracket as shown.
Once you have the fan aligned, go ahead and screw it into the T261 bracket.
Flip the bracket around and fasten a second screw into the bottom right corner of the fan as illustrated below. Once this step is finished, we will move on to installing the T261 sensor.
In this step we will be installing the T261 sensor; note the two holes on the T261 and on the mounting bracket.
Line up the T261 with the two mounting holes on the T261 bracket and fasten the screws. As with previous steps, make sure not to overtighten the screws.
Installing the Integrator board is as simple as 1, 2, 3, 4: four screws and it's in! As with previous screws, be sure not to overtighten them.
The mounting bracket also has two small pins to make it easier to line up. It should look as shown below.
In this step we'll be installing the ribbon cable that connects the integrator board to the t261.
Open the gate and slide in the ribbon cable in a similar manner to how we set up the Leap Motion connection in step 3. The gate does not require force to open or close.
Connecting the ribbon cable to the Integrator board works the same way we just connected it to the adapter board; make sure you plug it into the right header as shown here and you'll be set.
Line up the connection between the adapter board and the t261 sensor. This connector works in a similar way to the display driver board.
Once the cable is attached properly your t261 bracket should look like the photo below. Next we'll be connecting the fan to the integrator.
The fan connection is a relatively standard connector for electronics. Make sure to match up the red (positive) and black (ground) to the correct polarity.
It's the home stretch now! Let's attach the Integrator to the Optics Bracket!
Before we screw in the Integrator Mount we will need to plug in the display cable. You don't need to force it in, but it should be firmly connected.
Fastening the integrator board is simple using the M2.5 screws
After fastening the Integrator bracket, we'll attach the ribbon cable from the Leap Motion Controller to the Integrator as shown below.
Next we'll attach the power jumper to the Integrator board. This allows the Integrator board to handle powering the display board as well as the LMC and the T261 off a single cable!
Starting to look impressive, right? Only a few more steps now. Your headset should look like the photo shown below.
Locate your USB-A to USB-C cable; you'll be using the USB-C connector to plug into the headset.
Plug the cable in as shown below. Note that the cable will have to loop around in order to be mounted properly; this is due to space constraints inside the lid of the headset.
Locate your lid, D-pad, and two round buttons. For this step we'll be installing the buttons and the button control board.
Take note of the alignment pin; it will help you position the D-pad, as well as the circuit board for the buttons, in the correct orientation.
The button with the small round circle goes in the top slot, and the one with the larger round circle goes in the bottom slot. Each button is unique, and switching them around will result in buttons that are difficult to push, so make sure to put them in properly.
After you've got your buttons installed, go ahead and locate your button board, four M2 button head screws, and the two cables shown below.
Before installing the board, you'll need to plug the cables in. This is similar to how we plugged the fan cable in earlier.
The metal pin on the bottom of the board is for power delivery. Make sure the jumper is connected properly; it should slide right onto the connector.
After you've got your cables connected, screw the button board into the headset as shown below, taking care to align it properly; the two pins are there to help with that.
Attaching the lid is pretty simple, though there are a lot of screws to keep track of. You'll need four M2 screws to attach to the heat-set inserts, and two self-threading screws.
Before screwing the lid on, you'll need to connect the button board cables. First up is the power jumper; the connection type is the same as it was on the button board. Simply slide the jumper onto the metal rod on the Integrator board.
Next, attach the input cable to the Integrator board as shown.
Once both cables are connected your headset should look like this.
Slide the lid over the T261 and then push inwards lightly so that the circular cutouts go around the camera lenses. It's a good idea to keep your lens caps on until after the lid is secure.
Once you've got the lid in position, maneuver the display and power cables so that they slide into the cutout on the lid. These can take a bit of pressure, but try not to wiggle the display cable too much; a loose connection to the driver board could mean your displays won't function properly. This can be resolved by re-seating the display cable, but we want to avoid having to do that.
Yay, more screws! Locate the screw holes shown below and apply the proper screws to each: if the hole has a heat-set insert, use a flat M2 screw; if it does not, use the self-threading screws that we identified earlier.
Congrats! We're almost there! Just two parts left!!
Don't skip this step! Padding is important, and 3D prints are not comfortable when applied directly to your forehead.
This step is pretty self-explanatory: just like a sticker, remove the film from the adhesive side of the Velcro and apply as shown.
Remove the backing from the 3M foam strip and apply to the forehead rest.
Once the padding is applied, it should look like this. Use two thread-forming screws to attach the forehead rest to the headset.
Installing the combiners is one of the easiest steps, no screws, no glue, just form fitting polycarbonate!
Line the combiner up so that the tabs on the top of the combiner line up with the tab cutouts in the 3D print.
Once you've done that, you can apply a small amount of force to the optics bracket and the combiners to fit the side tabs in. As always, don't go crazy with pressure here; they'll snap in with little resistance.
Once you've got the combiner in, it should look like this, note the lack of gaps shown on the edges.
Now that the Combiners are in place it's time to lock in the optics bracket. Get two (2) M2.5 screws and fasten the display trays to the optics bracket at the mounting points on the bottom of the bracket.
Once you've got the screws in, your headset should look like the photos below; note that the display trays aren't completely flush with the headset.
This page details the Bill of Materials from Smart Prototyping, it also includes the ID numbers for the 3D printed parts to make them easier to find in the GitHub repository.
This is the homepage for the Combine Reality Deck-X variant of Northstar.
The Combine Reality Deck X is a variant of Release 3 designed by Noah Zerkin's team at Smart Prototyping. It includes a new hub called "the Integrator," which includes microSD card storage, an Arduino and USB hub, an embedded Intel RealSense T261 sensor, and a control board for adjusting ergonomics like IPD and eye relief.
The Integrator is our custom-built USB hub system originally created for the CombineReality Project North Star Deck X. The Integrator cuts down the use of cables and adds customizable buttons to the headset with the following components & features:
USB-C hub with two USB 3.1 ribbon connectors and one USB 2.0 ribbon connector
3GB on-board flash drive (only works when connected to a USB 3.0 host)
Arduino-compatible microcontroller, featuring a Qwiic connector that can be used to connect additional sensors (like an IMU), as well as HID buttons that can emulate keyboard keys. A button breakout board is included, and the microcontroller is preflashed with firmware that maps the buttons to the default ergonomics adjustment keys (eye relief, eye position, and IPD). It also allows for manual power reset of the sensor USB ports via a GPIO pin.
A fan whose speed is controlled by the Arduino-compatible microcontroller, plus a thermistor for more intelligent fan speed control
A ribbon connector that lets the Arduino on the hub relay commands and debug output to and from the serial UART of the display driver board
Ribbon adapter board for the Intel® RealSense™ T261 embedded 6-DOF module (ribbon cable included)
Ribbon adapter board for the Leap Motion Controller (ribbon cable included)
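The thermistor-driven fan control mentioned in the feature list can be sketched in a few lines. This is not the Integrator's actual firmware (which runs on the Arduino-compatible microcontroller); it is a hypothetical Python model of the usual approach: convert the thermistor resistance to a temperature with the NTC beta equation, then map the temperature linearly to a PWM duty cycle. The constants (10 kΩ at 25 °C, B = 3950, a 30-60 °C fan curve) are illustrative assumptions, not values taken from the Integrator.

```python
import math

def ntc_temp_c(r_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
    """Convert an NTC thermistor resistance to degrees C via the beta equation.

    Assumes a 10 kOhm @ 25 C part with B = 3950; the Integrator's actual
    thermistor parameters may differ.
    """
    t0_k = t0_c + 273.15
    inv_t_k = 1.0 / t0_k + math.log(r_ohms / r0) / beta
    return 1.0 / inv_t_k - 273.15

def fan_duty(temp_c, t_min=30.0, t_max=60.0):
    """Map temperature to a 0.0-1.0 fan duty cycle, linear between t_min and t_max."""
    if temp_c <= t_min:
        return 0.0
    if temp_c >= t_max:
        return 1.0
    return (temp_c - t_min) / (t_max - t_min)

# At the nominal resistance the beta equation returns the nominal temperature:
print(round(ntc_temp_c(10_000.0), 2))  # 25.0
print(fan_duty(45.0))                  # 0.5
```

On the real hardware the resistance would come from an ADC reading of a voltage divider, and the duty cycle would drive a PWM pin; the mapping itself is the interesting part.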
Work-in-progress documentation on how to set up
Part Name | Quantity |
---|---|
If you have the new ribbon cable extensions, skip to
All done! Congrats on making it through the build process! To get started with software, check out the page, and the #First-Steps channel on the Discord!
Please note that the GitHub repo for the CombineReality Deck X headset has three versions. The prints in the Deck X folder are intended for users who will be assembling, hacking, and taking apart their headset multiple times; the heat-set inserts help increase the lifespan of the 3D printed parts by reducing stress and wear on the parts themselves. This version is not updated as frequently as the version below without heat-set inserts. There's also a version of the Deck X for users that don't want to use heat-set inserts; choose it only if you intend to adjust or rebuild the headset once or twice at most, since taking the headset apart and putting it back together without heat-set inserts will cause the mounting points to deteriorate over time. The prints in the 3.1.1 folder are intended for users who want to upgrade their existing 3.1 headset without reprinting the optics bracket.
The Integrator uses a modified version of the lilyPadUSB-caterina Arduino bootloader. The bootloader can be found here:
Features
3D Optical Calibration
2D Optical Calibration
IPD adjustment
Yes
In Progress
T265 Support
No
Yes
Cameras
2 (stereo)
1 (stereo)
Camera Calibration
Checkerboard Process
Built In or Checkerboard
Unity
Yes
Yes
SteamVR
Yes
In Progress
Monado/Linux
Yes
Yes (Beta)
Northstar Next Driver Board
1
Northstar Next Adapter Boards
2
BOE 3.5" 1440x1600 Displays
2
Display Adapter Ribbon Cables
2
Minimum Requirements
Recommended Requirements
AMD Ryzen Quad Core or Equivalent
AMD Ryzen 6 or 8 Core Processor
8GB Ram
16GB Ram
Vega 8 GPU or MX150
Nvidia 1060 6GB or higher
Specifications
Northstar
Hololens 2
MagicLeap
ARKit
ARCore
Operating System
Windows, Linux
Windows Holographic
Lumin
iOS
Android
Stand Alone
Compute Pack*
Headset
Compute Pack
Phone/Mobile
Phone/Mobile
FOV
~110x70
~52 Diagonal
~40 Diagonal
N/A
N/A
Functionality
Northstar
Hololens 2
MagicLeap
ARKit
ARCore
Positional Tracking
✅
✅
✅
✅
✅
Hand Tracking
✅
✅
✅
✅
❌
SteamVR
👷♂️
❌
❌
❌
❌
Plane Tracking
❌
✅
✅
✅
✅
Anchors
✅
✅
✅
✅
✅
Light Estimation
❌
❌
❌
✅
✅
Environment Probes
❌
❌
✅
✅
Face tracking
❌
❌
❌
✅
✅
Meshing
❌
✅
✅
lidar*
DepthKit
2D image tracking
Aruco
✅
✅
✅
✅
Speech Input
❌
✅
✅
Part Name | Quantity | Kit A | Kit B | Kit C | Upgrade Kit | Rebuild Kit |
Spring Left (0.304-in OD Torsion Spring 180 Deg. L-Hand Wound) | 1 | √ | √ | √ |
Spring Right (0.304-in OD Torsion Spring 180 Deg. R-Hand Wound) | 1 | √ | √ | √ |
M2.5x0.45 6mm Long Steel Flat Head Screw 90 Deg CS | 14 | √ | √ | √ | √ | √ |
M2.5x0.45 5mm Long Button Head Socket Screw | 16 | √ | √ | √ | √ | √ |
M4x0.7 20mm Steel Button Head Socket Screw | 2 | √ | √ | √ | √ | √ |
M4x0.7 Zinc Plated Nylon Insert Hex Lock Nut | 2 | √ | √ | √ | √ | √ |
M2 8mm Long Thread-Forming Screws for Plastic | 25 | √ | √ | √ | √ | √ |
M1.4 6mm Long Thread-Forming Screws for Plastic | 4 | √ | √ | √ | √ | √ |
8x16x2mm Rubber Washer | 2 | √ | √ | √ |
M2.5x0.45 3.4mm Long Heat-Set Insert for Plastics | 24 | √ | √ | √ | √ |
3x10x165mm 6061 Aluminum Bar Slide (Pre-Drilled & Threaded) | 2 | √ | √ | √ |
Gen 2 Welding Headgear | 1 | √ | √ | √ |
Foam Forehead Padding (Pre-Cut) | 1 | √ | √ | √ |
Thin Foam for Forehead Padding (Pre-Cut) | 1 | √ | √ | √ |
Thin Foam for Facial Interface (Pre-Cut) | 1 | √ | √ | √ |
2mm Thick Self-Stick Anti-Skid Rubber 3M Tape (Pre-Cut) | 4 | √ | √ | √ |
Zip Ties | 10 | √ | √ | √ | √ | √ |
Part Name | Quantity | Kit A | Kit B | Kit C | Upgrade Kit | Rebuild Kit | ID |
Deck X Optics Bracket (Project North Star 3.2 Compatible) | 1 | √ | √ |
Left Display Tray - 75cm Focal Depth | 1 | √ | √ |
Right Display Tray - 75cm Focal Depth | 1 | √ | √ |
Left Display Tray - 25cm Focal Depth | 1 | √ | √ |
Right Display Tray - 25cm Focal Depth | 1 | √ | √ |
Left Rotation Adjustment Slide Mount | 1 | √ | √ |
Right Rotation Adjustment Slide Mount | 1 | √ | √ |
Forehead Rest Base Shape | 1 | √ | √ |
Part Name | Quantity | Kit A | Kit B | Kit C | Upgrade Kit | Rebuild Kit | ID |
Forehead Main Structure (Low-Profile) | 1 | √ | √ |
Forehead Headgear Span | 1 | √ | √ |
Forehead Hinge Cap | 2 | √ | √ |
Forehead Hinge Base | 2 | √ | √ |
Rear Hinge Base | 2 | √ | √ |
Rear Hinge Cap | 2 | √ | √ |
Left Side Main Span With Slide Brake | 1 | √ | √ |
Left Brake Housing | 1 | √ | √ |
Left Brake Button | 1 | √ | √ |
Right Side Main Span With Slide Brake | 1 | √ | √ |
Right Brake Housing | 1 | √ | √ |
Right Brake Button | 1 | √ | √ |
Slide Endcap | 2 | √ | √ |
Part Name | Quantity | Kit A | Kit B | Kit C | ID |
Deck X D-Pad | 1 | √ | √ |
Deck X Button 1 | 1 | √ | √ |
Deck X Button 2 | 1 | √ | √ |
Deck X Lid (T261 Version) | 1 | √ | √ |
Deck X Leap Motion Controller Mounting Spacer | 1 | √ | √ |
Mounting Bracket for North Star Integrator (T261 Version) (PETG w/ Carbon Fiber) | 1 | √ | √ |
Part Name | Quantity | Kit A | Kit B | Kit C | ID |
North Star Optical Combiner Set | 1 |
North Star Display (3.5 inch, 1440x1600 pixels, 120fps) | 2 |
North Star Display Matte Overlay (1 pair) | 1 |
North Star Display Driver Board | 1 |
North Star Display Driver Debug Adapter | 1 |
North Star Power Stabilizer | 1 |
USB Power Adapter Board (for supplemental power if necessary) | 1 |
Part Name |
North Star Integrator (USB Hub System w/ Integrated Arduino and Flash Drive) |
Leap Motion Controller to Integrator Ribbon Adapter (Micro USB to Ribbon) |
Integrator Ribbon Adapter (NOVASTACK to Ribbon) to Intel RealSense T261 (incl. Ribbon Cable) |
Micro USB 3 to Ribbon Adapter for Intel RealSense T265 (incl. Ribbon Cable) |
Zio Qwiic 6 Button Board (incl. Jumper Wire + Qwiic Cable) |
Mini Fan (with custom 5cm cable) |
Part Name |
Leap Motion Controller |
Intel RealSense t261 |
Part Name |
Heat Sink Set (3 pcs, 14 x 14 x 6mm) |
Thermal Grease (0.5g Syringe) |
USB C to Intel RealSense T261 Adapter Bundle |
Micro USB 3 to Intel RealSense T261 Adapter Bundle |
All notable Mechanical changes will be documented in this file.
BOM for Update 3.1 Assm
Cable guides (#113-002, #114-002)
Simplified folder structure
Updated BOM with links
Exported updated CAD STEP files of assemblies
Associated part numbers with simplified assm (#130-003, #130-004, #130-005, #130-006)
Flattened bottom for easier printing (#112-002, #111-002)
Thickened and fillet corners for strength (#230-001,#240-001)
Keyed mates for faster assembly (#220-001,#220-002,#210-003,#210-004)
Thickened for strength, offset edges, removed old cable guide holes (#130-005, #130-006)
Added L+R labels to headgear for easier assembly (#230-000, #240-000)
Added corresponding screwholes for cable guides (#113-001, #114-001)
New endcap with M2.5 screw mount (#110-005, #110-006)
Cutout size and shape for aluminum bar slides (#110-003, #230-002, #240-002)
Modular lid and Vive tracker mount (#130-002, 3, 4, 5, and 6)
Blank mounting plate STEP file
New slide end cap with cable mount (#110-003)
Chamfer on optics bracket display screws, removed lip around rubber washer (#130-001 and #110-001)
Added simplified optics assembly (#130-000)
The following pages contain the original documentation for the first version of Project Northstar, the latest version is here: Northstar 3.1
This page is intended to be specifically about the purchasable kits, there will be some overlap with general FAQ. Check the sidebar on the right to jump to the correct question
Ahead.IO offers multiple kits.
Kit Zero contains all the electronics and mechanical pieces to get you started with your build, but does NOT come with 3D printed parts. You'll have to print them, and assemble the headset.
Kit One contains all the electronics and mechanical parts, along with the 3D printed parts. You'll still need to assemble and calibrate the headset.
Kit Two contains the headset fully assembled and pre-calibrated.
NOTE: All kits have 3m cables. All Kits contains one Ultraleap Stereo IR170. Although the RealSense T261 isn’t part of the kits, the kits still provide all parts to assemble the sensor. There are still some stores that are selling this particular sensor.
The kits come with a full screwdriver kit and all the screws you'll need to build the headset. A soldering iron is not included; it is used for heating the heat-set inserts and can be found at any local electronics store for around $10 USD.
Active
06 JUN 2023
CombineReality
XR50 / T261
SIR 170
BoboVR
Custom
Active
21 AUG 2020
CombineReality
T261
LMC / SIR 170
3.1
3.2
Release 3-2
In Dev.
TBD
Leap Motion
T265
LMC
3.2
3.2
Outdated
03 APR 2019
Leap Motion
T265
LMC
3.1
3.1
Outdated
23 JAN 2019
Leap Motion
T265
LMC
3
3
Release 2
Outdated
6 JUN 2018
Leap Motion
N/A
LMC
1
1
Part number
Quantity
Description
Source
#310-001
1
BASE MOUNTING PLATE
3DPRINT
#310-002
1
LEFT LEFT STAND
3DPRINT
#310-003
1
RIGHT LEFT STAND
3DPRINT
#320-001
1
DUAL STEREO CAMERA MOUNT BRACKET
3DPRINT
#320-002
2
STEREO CAMERA STIFFENER
3DPRINT
#320-003
1
CAMERA CROSSBEAM
3DPRINT
#640-001
8
M2.5x0.45 6mm LONG STEEL FLAT HEAT SCREW 90 DEG CS
#650-001
22
M2x8mm LONG THREAD-FORMING SCREWS FOR PLASTIC
#690-001
8
M2.5x0.45 3.4mm LONG HEAT-SET INSERT FOR PLASTICS
#700-003
2
ELP USB CAMERA MODULE DUAL LENS STEREO
This page was contributed by Guillermo Guillesanbri.
The Deck X headset uses the same headgear from release 3. The build process is documented below with photos.
This page was contributed by Guillermo Guillesanbri
Working backwards from the end of the document, the first page with instructions is the 230-000 instructions.
These parts were updated not long ago, so if you downloaded them recently, you will be missing the two cable guides at the top of the picture above and will have a reinforced version of 230-001/240-001.
When assembling the headgear, take into account that even if the assembly drawing doesn’t show it, you have to put the welding headgear between the hinge parts.
To put the spring in place, you will have to turn it 180 degrees to load it. This can easily be done by putting the spring in place in both parts and turning one of them around.
First, we need to bend the forehead headgear span. To do this, it's recommended to turn the heated bed of your 3D printer to 70 °C, which is enough to soften the part for shaping. If you don't have access to a heated print bed, don't worry; other options are:
Using a Heat Gun.
Using hot water.
Anything that gets the piece to ~70 °C.
You can now shape the piece with a mannequin head or your own head if you cover your skin to avoid first degree burns (not really, just be careful). Demo video by Florian Maurer.
Once we have the piece bent, it's time to wrap up the foam. I did it the way shown here; I'm not 100% sure this is the best/correct way to do it, but it worked out fine for me. Take into account that the felt-like side is the smoothest one and should be the one in contact with your forehead.
Now, let’s put the hinges in place, again, remember to put pieces 230-000 and 240-000 between the two pieces of each hinge. I found it easier to lock in place the hinge base (shown in the picture below) and screw the cap on top.
You should now have the headgear assembled, just like the pictures below, with two differences: the reinforced parts and the absence of the cable guides.
This version of the calibration system allows you to calibrate your headset with a single stereo camera
These modules require Python 3.7 in order to work.
The RealSense T265 and T261 can only be used by one application at a time; you'll have to close the RealSense Viewer.
Above: A video walking you through the 2D calibration process for North Star.
Download the realsense-integration branch of the following repo: https://github.com/BryanChrisBrown/ProjectNorthStar/tree/realsense-integration
Setup python with the following dependencies:
1) pyrealsense2
2) opencv-contrib-python
3) numpy
Print the calibration stand and the Intel RealSense T265 mount; use M3x12mm screws to mount the T265 itself.
If using a Deck X, print the Intel RealSense T261 mount instead; the screws are included in your Deck X kit.
Run through the following steps to calibrate your headset.
captureGraycodes.py: run `python captureGraycodes.py` to run this script.
Ensure that your headset is placed on the calibration stand, with the stand's camera looking through it where the users' eyes will be.
On line 44 there is a line of code that changes the window offset for the graycode generator.
This example moves the window 1920 pixels to the right and 0 pixels vertically. You'll want the X value (1920) to be your main monitor's width in pixels: `cv2.moveWindow("Graycode Viewport", 1920, 0)`
It helps to place a piece of cloth over the rig to shield the cameras + headset from ambient light.
The sequence of binary codes will culminate in a 0-1 UV mapping, saved to "./WidthCalibration.png" and "./HeightCalibration.png" in your main folder.
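The capture works by displaying a sequence of black-and-white bit patterns that, read back through the camera, uniquely label every display pixel. A minimal sketch of the underlying idea (illustrative only, not the actual captureGraycodes.py code):

```python
# Illustrative sketch of the graycode idea behind captureGraycodes.py:
# each displayed pattern is one bit-plane of the reflected gray code of
# the pixel's column index. Reading the bits back off the camera image
# recovers a 0-1 UV coordinate for every camera pixel.

def gray_encode(n):
    """Binary number -> reflected gray code."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Reflected gray code -> binary number."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def column_patterns(width):
    """One 0/1 row per bit-plane; shown in sequence, these uniquely
    label every pixel column of a display `width` pixels wide."""
    bits = max(1, (width - 1).bit_length())
    return [[(gray_encode(x) >> bit) & 1 for x in range(width)]
            for bit in reversed(range(bits))]

patterns = column_patterns(8)  # 3 bit-planes for an 8-pixel-wide "display"
decoded = [gray_decode(int("".join(str(p[x]) for p in patterns), 2))
           for x in range(8)]
print(decoded)  # [0, 1, 2, 3, 4, 5, 6, 7]: every column decodes to itself
```

Gray codes are used instead of plain binary because adjacent columns differ by a single bit, so a camera pixel that straddles a boundary can be off by at most one column.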
calibrateGraycodes.py
Running this script will fit a 3rd-Degree 2D Polynomial to the left and right "eye"'s X and Y distortions.
This polynomial will map from each display's 0-1 UV Coordinates to rectilinear coordinates (where a 3D ray direction is just (x, y, 1.0)).
Before running these scripts, ensure that your headset is plugged in and displaying imagery from your desktop; captureGraycodes.py will display a sequence of gray codes on your North Star, capturing them at the same time. When you are finished, you may paste the output of calibrateGraycodes.py into this diagnostic shadertoy to check for alignment.
Additionally, there should be a NorthStarCalibration.json in this directory, which you may use in the Unity implementation.
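The general shape of such a fit can be sketched with numpy. The data and names below are synthetic and hypothetical, not the script's actual code; they only illustrate fitting a 3rd-degree 2D polynomial from UV coordinates to one rectilinear axis via least squares:

```python
import numpy as np

# Hedged sketch of the kind of fit calibrateGraycodes.py performs: a
# 3rd-degree 2D polynomial mapping display UV (0-1) to one rectilinear
# ray coordinate. Data and names here are synthetic.

def design_matrix(u, v, degree=3):
    """Columns are u**i * v**j for all i + j <= degree (10 terms)."""
    cols = [u**i * v**j
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.stack(cols, axis=-1)

rng = np.random.default_rng(0)
u, v = rng.random(200), rng.random(200)        # sampled UV points
x = 0.3 + 1.2*u - 0.4*v + 0.05*u*v             # pretend distortion map
coeffs, *_ = np.linalg.lstsq(design_matrix(u, v), x, rcond=None)

# The fitted polynomial reproduces the mapping at an unseen point:
fit_x = design_matrix(np.float64(0.25), np.float64(0.75)) @ coeffs
print(round(float(fit_x), 6))  # 0.309375, i.e. the true map evaluated there
```

In the real pipeline a fit like this is done per eye and per axis (X and Y), which is why the script's output is a set of polynomial coefficients rather than a lookup table.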
This folder contains the mechanical assets necessary to build a Project North Star AR Headset with calibration stand.
Mechanical Release 3 bundles together all the lessons we learned into a new set of 3D files and drawings. Its main objective is to be more inviting, less hacked together, and more reliable. The design includes more adjustments and mechanisms for a larger variety of heads and facial geometries. Overall, the assembly is lighter, more balanced, and stiffer. The parts were designed for an FDM-style 3D printer.
This is a work in progress. This is not a finished guide, nor end-user friendly. Major sections are missing. Assembly requires care and patience. Nothing worth having is ever easy.
Open the Headset Mechanical Assembly and construct each sub-assembly as illustrated. A full list of the parts needed can be found in the Headset Bill of Materials (BOM). Not all sub-assemblies are required as there are multiple designs to choose from. Additionally, the CAD files are included in STEP format to help design new parts.
The headset consists of two basic sections: the optics (#100-000) and headgear (#200-000). The optics subassembly currently has two variants: Release 3 optics and Update 3-1 (i.e. the simplified optics assembly). The headgear assembly utilizes the rear adjustment mechanism from a Miller-branded welder's helmet, but several models can be made to work.
Functionally, the two optics assemblies are the same. Release 3 is closer to the original design aesthetic and is marginally (8 g) lighter. On the other hand, Update 3-1 halves the print time by removing the need for supports on the sides. Overall, the simplified 3-1 bracket is easier to print, has fewer print failures, and is stiffer too. It's recommended to start with the simplified Release 3-1 optics assembly for the first build. See drawing #100-000 for more information.
All development was done using eSun's PLA Pro / PLA+ filament. We've found it to be strong, easy to print, and have great surface finish.
A build plate of approximately 250x200mm is recommended for the largest parts.
Parts may need to be rotated to align with print bed.
Some parts have optimized versions (labeled “FDM OPTIMIZED”) with extra plastic tabs that need removing. These versions aid in minimizing warping and gripping the print surface.
0.25 mm layer height, 2 perimeter shells, 15% infill
Filament: eSun PLA Pro (PLA+)
3D printer: MakerGear M2
Slicer: Simplify3D
Several parts in the optics assembly use brass inserts for increased clamping load and the ability to swap out the components multiple times without wear. These inserts need to be heated above the plastic's melting point and pressed into the plastic. It's recommended to use an installation tip designed for brass inserts.
Demonstration of using a soldering iron to install brass inserts. The wire cutters are used to prevent the insert from pulling back out when not using an installation tip.
Parts that require inserts:
Installing brass inserts into the optics bracket is optional. They're intended as mounting points for future testing.
The headgear assembly includes parts that print flat but bend to fasten to each other. Although not necessary, it’s suggested to drape these parts around a form while the 3D print is still soft from the heated print bed. This minimizes strain inside the plastic and prolongs the life of the part.
Preheating the print bed to 70C softens a print enough to shape the part
This guide is also available as a PDF:
A video guide from Tasuku Takahashi is available below; it's best to supplement the written documentation with this video. If you run into problems or have questions, feel free to reach out on the Discord.
Item No 6 (#230-004) of the Left Brake Assembly was removed and thus no longer packaged due to undesired cable catching. The new method of managing cables is by attaching a zip tie to the end cap.
Welcome to the next generation of open source spatial computing.
You can get next generation Northstar kits from Combine Reality: https://shop.combinereality.com.
Northstar Next is a new single-cable (USB Type-C) Project North Star variant kit: cost-optimized, with an example of a lightweight and modular headset design. The single cable and the modularity are the biggest differences from the previous version. The individual parts are as follows:
Thinner, lighter, more easily mountable! Still has the massive FoV of Leap Motion’s original optics design.
USB-C with DisplayPort Alt-Mode (Not all USB-C host ports support DisplayPort; please make sure in advance that your intended host device has at least one that does.)
Uses the same 3.5" 1440 x 1600 VS035ZSM-NW0-69P0 displays from BOE (Grade B*)
LT7911D display controller (good relationship with manufacturer, good support, and a stable supply)
The D variant of the LT7911 series has two MIPI outputs, while together the displays have four. We’re pushing the limit of what a single 4-channel MIPI port can handle, with the benefit of using 25-pin ribbon cables with only half the pins of their predecessors.
Low-cost alternative to original board, but with 85 – 90Hz maximum stable refresh rate
Support for audio output
USB 2 (High-Speed) passthrough to downstream hub or direct peripheral connection
Backlight brightness control (can be switched off entirely) with non-volatile settings: when you plug in the headset, it has the same brightness as when you unplugged it. You can use a microcontroller to adjust the brightness, but you don't need one connected for the display and backlight circuitry to work.
8-pin ribbon connector for future I2S stereo audio output breakout board
Connects to display driver board and doesn’t require any additional external cables
4-port USB 2 High Speed Hub with FE1.1 controller IC.
1A integrated fuse with 1.5A cutoff
3x 6-pin ribbon connectors for USB 2
1x 12-pin ribbon connector for USB 2 with passthrough connection to display driver board’s backlight brightness control pins and one GPIO pin connection for switching power to one of the 6-pin ports (for resetting any finicky sensors with enumeration issues like the T261.)
Driver board: breakout for audio ports and brightness-control pins
Adapters: the kit includes several USB adapters so that you can connect it to a variety of different sensors in use in the North Star community:

- USB-C male connector supporting USB 2 High Speed, with both 6-pin and 12-pin ribbon connectors for direct connection to the Display Driver board or to one of the 6-pin connectors on the PNS Hub board.
- MicroUSB male connector with upstream 6-pin connector; similar to and compatible with the MicroUSB adapter included with our older kits. Works with the PNS Hub board.
- Ultraleap SIR170 / Rigel adapter with upstream 6-pin connector for use with our PNS Hub board.
- T261 USB 2 High Speed adapter with 6-pin connector. (Note: with USB 2 bandwidth, you won't be able to stream video from the T261 when used concurrently with an Ultraleap or Leap Motion hand-tracking module. You can, however, get a low-latency stream of the 6DOF tracking data generated onboard the device.) The RealSense T261 has been discontinued by Intel but can still be found; we include this adapter mainly for the convenience of those North Star builders who already have this module.
Planned microcontroller board (to be sold separately):

- Brightness control
- Eye tracking (experimental and in development)
- Additional sensors (like 3DOF with an IMU or an ambient light sensor)
- User interface (like configuration buttons)
- Compatibility with planned magnetically-coupled module connectors

Mechanical modularity:

- Easily removable and swappable core electronics bay and sensor pods
- Snap-together / snap-apart design
Input Interface: USB-C DisplayPort Alt-Mode (requires host port compatibility; a full-featured USB-C 3.2, Thunderbolt 3 or better, or spec-compliant USB 4 are all safe bets)
Display resolution: 2880x1600 pixels (2x 1600x1440 displays)
Frame Rate: 85-90Hz
Brightness control via a digital potentiometer (Microchip MCP4021T-503E/MC)
Driver board dimensions:
USB hub speed: Max 20-30MB/s
Max usable USB 2.0 High Speed downstream ports: 4
4 downstream ports: 3 with a 6-pin connector, 1 with a power switch controlled via the 4th port (12-pin connector)
USB hub board dimensions:
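The MCP4021 brightness potentiometer is adjusted through a simple two-pin up/down interface (CS and U/D) rather than I2C or SPI. As a rough model, and loudly an assumption (pin ordering and timing must be checked against the Microchip datasheet), the wiper stepping can be sketched as a pure function that emits the (CS, U/D) line states a microcontroller would drive:

```python
# Pure-logic model of stepping an up/down digital potentiometer like the
# MCP4021 that sets North Star's backlight brightness. This only builds
# the sequence of (CS, U/D) line states; real firmware would toggle GPIOs
# with the setup/hold times from the Microchip datasheet. The protocol
# shown is a simplified assumption, not a verified description.

def step_sequence(steps, up=True):
    """Return (cs, ud) tuples: assert CS low with U/D indicating the
    direction, then toggle U/D once per wiper step, then release CS."""
    direction = 1 if up else 0
    seq = [(1, 1)]                      # idle: chip deselected
    seq.append((0, direction))          # CS low, U/D level selects mode
    for _ in range(steps):
        seq.append((0, 1 - direction))  # U/D edge out...
        seq.append((0, direction))      # ...and back: one wiper step
    seq.append((1, direction))          # CS high: wiper position latched
    return seq

seq = step_sequence(3, up=True)         # brighten by three wiper steps
edges = sum(1 for a, b in zip(seq, seq[1:]) if a[1] != b[1])
print(len(seq), edges)  # 9 states, 6 U/D edges for 3 steps
```

Because the part is non-volatile, the wiper position reached this way survives a power cycle, which is what gives the "same brightness as when you unplugged it" behavior described above.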
Release 1 is the first version of the headset, which Leap Motion open sourced in 2018. Since then there have been multiple revisions to make the headset stronger, easier to print, and more comfortable. For a detailed writeup of the changes made in Release 3, check out the Release 3 Documentation.
The Leap Motion Controller was designed to support USB 3.0; however, it currently only utilizes USB 2.0. This means you can use a standard USB Micro-B connection with the sensor, which is useful for integrating it into smaller form factors, like an HMD.
Yes! The optics bracket will fit; you may have to rotate it a bit to fit properly on the build plate. Here's a screenshot for reference. As noted in the image, the dimensions of the optics bracket at a roughly 22-degree offset are (X: 224.14mm, Y: 140.51mm, Z: 101.03mm).
http://blog.leapmotion.com/northstar/
Leap Motion is a company that has always been focused on human-computer interfaces.
We believe that the fundamental limit in technology is not its size or its cost or its speed, but how we interact with it. These interactions define what we create, how we learn, how we communicate with each other. It would be no stretch of the imagination to say that the way we interact with the world around us is perhaps the very fabric of the human experience.
We believe that this human experience is on the precipice of a great change.
The coming of virtual reality has signaled a great moment in the history of our civilization. We have found in ourselves the ability to break down the very substrate of reality and create ones anew, entirely of our own design and of our own imaginations.
As we explore this newfound ability, it becomes increasingly clear that this power will not be limited to some ‘virtual world’ separate from our own. It will spill out like a great flood, uniting what has been held apart for so long: our digital and physical realities.
In preparation for the coming flood, we at Leap Motion have built a ship, and we call it Project North Star.
North Star is a full augmented reality platform that allows us to chart and sail the waters of a new world, where the digital and physical substrates exist as a single fluid experience.
The first step of this endeavor was to create a system with the technical specifications of a pair of augmented glasses from the future. This meant our prototype had to far exceed the state of the art in resolution, field-of-view, and framerate.
Borrowing components from the next generation of VR systems, we created an AR headset with two low-persistence 1600×1440 displays pushing 120 frames per second with an expansive visual field over 100 degrees. Coupled with our world-class 180° hand tracking sensor, we realized that we had a system unlike anything anyone had seen before.
All of this was possible while keeping the design of the North Star headset fundamentally simple – under one hundred dollars to produce at scale. So although this is an experimental platform right now, we expect that the design itself will spawn further endeavors that will become available to the rest of the world.
To this end, next week we will make the hardware and related software open source. The discoveries from these early endeavors should be available and accessible to everyone.
We’ve got a long way to go still, so let’s go together.
We hope that these designs will inspire a new generation of experimental AR systems that will shift the conversation from what an AR system should look like, to what an AR experience should feel like.
Over the past month we’ve hinted at some of the characteristics of this platform, with videos on Twitter that have hit the front page of Reddit and collected millions of views from people around the world.
Over the next few weeks we will be releasing blog posts and videos charting our discoveries and reflections in the hope that this will create an evolving and escalating conversation around the nature of this new world we’re heading towards.
We’re going to take a bit of time to talk about the hardware itself, but it’s important to understand that, at the end of the day, it’s the experience that matters most. This platform lets us forget the limitations of today’s systems; it lets us focus on the experience, the software and the interface, which is the core of what Leap Motion is about.
The journey towards the hardware of a perfect AR headset is not complete and will not be for some time, but Project North Star gives us perhaps the first glimpse that we’ve ever had. It helps us ask the right questions, find the right answers and start to chart the course to a future we all want to live in, where technology empowers humanity to solve the problems of today and those to come.
http://blog.leapmotion.com/our-journey-to-the-north-star/
When we embarked on this journey, there were many things we didn’t know.
What does hand tracking need to be like for an augmented reality headset? How fast does it need to be; do we need a hundred frames per second tracking or a thousand frames per second?
How does the field of view impact the interaction paradigm? How do we interact with things when we only have the central field, or a wider field? At what point does physical interaction become commonplace? How does the comfort of the interactions themselves relate to the headset’s field of view?
What are the artistic aspects that need to be considered in augmented interfaces? Can we simply throw things on as-is and make our hands occlude things and call it a day? Or are there fundamentally different styles of everything that suddenly come out when we have a display that can only ‘add light’ but not subtract it?
These are all huge things to know. They impact the roadmaps for our technology, our interaction design, the kinds of products people make, what consumers want or expect. So it was incredibly important for us to figure out a path that let us address as many of these things as possible.
To this end, we wanted to create something with the highest possible technical specifications, and then work our way down until we had something that struck a balance between performance and form-factor.
All of these systems function using ‘ellipsoidal reflectors’, or sections of curved mirror which are cut from a larger ellipsoid. Due to the unique geometry of ellipses, if a display is put on one side of the curve and the user’s eye on the other, then the resulting image will be big, clear, and in focus.
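That focal property is easy to check numerically: a ray leaving one focus of an ellipse, mirror-reflected at any point on the curve, heads straight for the other focus. A small 2D sketch (illustrative only; the real reflectors are off-axis sections of a 3D ellipsoid):

```python
import math

# Verify the ellipse focal property the reflector relies on: a ray from
# one focus (the display) reflects off the curve and passes through the
# other focus (the eye). Illustrative 2D check only.

a, b = 2.0, 1.0                      # semi-major / semi-minor axes
c = math.sqrt(a*a - b*b)             # focal distance from center
f1, f2 = (-c, 0.0), (c, 0.0)         # the two foci

t = 1.1                              # arbitrary point on the ellipse
p = (a*math.cos(t), b*math.sin(t))

# Unit surface normal at p for x^2/a^2 + y^2/b^2 = 1
n = (p[0]/a**2, p[1]/b**2)
nlen = math.hypot(*n)
n = (n[0]/nlen, n[1]/nlen)

# Ray from focus 1 to p, then mirror-reflected about the normal
d = (p[0]-f1[0], p[1]-f1[1])
dlen = math.hypot(*d); d = (d[0]/dlen, d[1]/dlen)
dot = d[0]*n[0] + d[1]*n[1]
r = (d[0] - 2*dot*n[0], d[1] - 2*dot*n[1])

# The reflected ray should point from p straight at focus 2
to_f2 = (f2[0]-p[0], f2[1]-p[1])
tlen = math.hypot(*to_f2); to_f2 = (to_f2[0]/tlen, to_f2[1]/tlen)
error = math.hypot(r[0]-to_f2[0], r[1]-to_f2[1])
print(f"direction error: {error:.2e}")   # ~0: the ray hits the second focus
```

This is why putting the display at one focus produces a large, in-focus image at the eye: every point of the reflector redirects the display's light toward the same eye position.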
We started by constructing a computer model of the system to get a sense of the design space. We decided to build it around 5.5-inch smartphone displays with the largest reflector area possible.
Next, we 3D-printed a few prototype reflectors (using the VeroClear resin with a Stratasys Objet 3D printer), which were hazy but let us prove the concept: We knew we were on the right path.
The next step was to carve a pair of prototype reflectors from a solid block of optical-grade acrylic. The reflectors needed to possess a smooth, precise surface (accurate to a fraction of a wavelength of light) in order to reflect a clear image while also being optically transparent. Manufacturing optics with this level of precision requires expensive tooling, so we “turned” to diamond turning (the process of rotating an optic on a vibration-controlled lathe with a diamond-tipped tool-piece).
Soon we had our first reflectors, which we coated with a thin layer of silver (like a mirror) to make them reflect 50% of light and transmit 50% of light. Due to the logarithmic sensitivity of the eye, this feels very clear while still reflecting significant light from the displays.
We mounted these reflectors inside of a mechanical rig that let us experiment with different angles. Behind each reflector is a 5.5″ LCD panel, with ribbon cables connecting to display drivers on the top.
While it might seem a bit funny, it was perhaps the widest field of view, and the highest-resolution AR system ever made. Each eye saw digital content approximately 105° high by 75° wide with a 60% stereo overlap, for a combined field of view of 105° by 105° with 1440×2560 resolution per eye.
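The combined field follows from simple arithmetic: with 75° per eye and a 60% stereo overlap, the eyes share 0.6 x 75° = 45°, so the total is 75° + 75° - 45° = 105°. A quick sanity check (a hypothetical helper, not from the original toolchain):

```python
# Combined horizontal FOV given per-eye FOV and the stereo overlap
# fraction: the shared region is counted once, not twice.
def combined_fov(per_eye_deg, overlap_fraction):
    return per_eye_deg * (2 - overlap_fraction)

print(combined_fov(75, 0.60))  # ~105 degrees, matching the text
```

The same formula explains the later Release 3 tradeoff, where a narrower per-eye field with a higher overlap fraction still yields a combined field close to the per-eye width.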
The vertical field of view struck us most of all; we could now look down with our eyes, put our hands at our chests and still see augmented information overlaid on top of our hands. This was not the minimal functionality required for a compelling experience, this was luxury.
This system allowed us to experiment with a variety of different fields of view, where we could artificially crop things down until we found a reasonable tradeoff between form factor and experience.
We found this sweet spot around 95° x 70° with a 20 degree vertical (downwards) tilt and a 65% stereo overlap. Once we had this selected, we could cut the reflectors to a smaller size. We found the optimal minimization amount empirically by wearing the headset and marking the reflected displays’ edges on the reflectors with tape. From here, it was a simple matter of grinding them down to their optimal size.
The second thing that struck us during this testing process was just how important the framerate of the system is. The original headset boasted an unfortunate 50 fps, creating a constant and impossible to ignore slosh in the experience. With the smaller reflectors, we could move to smaller display panels with higher refresh rates.
At this point, we needed to make our own LCD display system (nothing off the shelf goes fast enough). We settled on a system architecture that combines an Analogix display driver with two fast-switching 3.5″ LCDs from BOE Displays.
Put together, we now had a system that felt remarkably smaller:
The reduced weight and size felt exponential: every time we cut away one centimeter, it felt like we cut off three.
We ended up with something roughly the size of a virtual reality headset. In whole it has fewer parts and preserves most of our natural field of view. The combination of the open air design and the transparency generally made it feel immediately more comfortable than virtual reality systems (which was actually a bit surprising to everyone who used it).
We mounted everything on the bottom of a pivoting ‘halo’ that let you flip it up like a visor and move it in and out from your face (depending on whether you had glasses).
Sliding the reflectors slightly out from your face gave room for a wearable camera, which we threw together from a disassembled Logitech (wide-FoV) webcam.
All of the videos you’ve seen were recorded with a combination of these glasses and the headset above.
Lastly, we want to do one more revision on the design to make room for enclosed sensors and electronics, better cable management, cleaner ergonomics, better curves (why not?), and support for off-the-shelf headgear mounting systems. This is the design we are planning to open source next week.
There remain many details we feel that would be important to further progressions of this headset. Some of which are:
Inward-facing embedded cameras for automatic and precise alignment of the augmented image with the user’s eyes as well as eye and face tracking.
Head mounted ambient light sensors for 360 degree lighting estimation.
Directional speakers near the ears for discrete, localized audio feedback
Electrochromic coatings on the reflectors for electrically controllable variable transparency
Micro-actuators that move the displays by fractions of a millimeter to allow for variable and dynamic depth of field based on eye convergence
The field of view could be even further increased by moving to slightly non-ellipsoidal ‘freeform’ shapes for the reflector, or by slightly curving the displays themselves (like on many modern smartphones).
Mechanical tolerance is of the utmost importance, and without precise calibration, it’s hard to get everything to align. Expect a post about our efforts here as well as the optical specifications themselves next week.
However, on the whole, what you see here is an augmented reality system with two 120 fps, 1600×1440 displays with a field of view covering over a hundred degrees combined, coupled with hand tracking running at 150 fps over a 180°x 180° field of view. Putting this headset on, the resolution, latency, and field of view limitations of today’s systems melt away and you’re suddenly confronted with the question that lies at the heart of this endeavor:
What shall we build?
http://blog.leapmotion.com/project-north-star-mechanical-update-3/
Today we're excited to share the latest major design update for Project North Star. North Star Release 3 consolidates several months of research and insight into a new set of 3D files and drawings. Our goal with this release is to make Project North Star more inviting, less hacked together, and more reliable. The design includes more adjustments and mechanisms for a greater variety of head and facial geometries; it's lighter, more balanced, stiffer, and more inclusive.
With each design improvement and new prototype, we’ve been guided by the experiences of our test participants. One of our biggest challenges was the facial interface, providing stability without getting in the way of emoting.
Now, the headset only touches the user's forehead, and the optics simply "float" in front of you. The breakthrough was allowing the headgear and optics to self-align between face and forehead independently. As a bonus, for the first time, it's usable with glasses!
Release 3 has a lot packed into it. Here are a few more problems we tackled:
New forehead piece. While we enjoyed the flexibility of the welder’s headgear, it interfered with the optics bracket, preventing the optics from getting close enough. Because the forehead band sat so low, the welder’s headgear also required a top strap.
Our new headgear design sits higher and wider, taking on the role of the top strap while dispersing more weight. Choosing against a top strap was important to make it self-evident how the device is worn, making it more inviting and a more seamless experience. New users shouldn’t need help to put on the device.
Now, in addition to the new forehead, brakes are mounted to each side of the headgear. The one-way brake mechanism allows the user to slide the headset towards their face, but not outwards without holding the brake release. The spring is strong enough to resist slipping – even when looking straight down – but can be easily defeated by simply pulling with medium force in case of emergency.
Weight, balance, and stiffness come as a package. Most of the North Star headset's weight comes from the cables. Counterbalancing the weight of the optics by guiding the cables to the back is crucial for comfort, even if no weight is removed. Routing the cables evenly between the left and right sides ensures the headset isn't imbalanced.
By thickening certain areas and interlocking all the components, we stiffened the design so the whole structure acts cohesively. Now there is much less flexure throughout. Earlier prototypes included aluminum rods to stiffen the structure, but clever geometry and better print settings offered similar performance (with a few grams of weight saved)! Finally, instead of thread-forming screws, brass inserts were added for a more reliable and repeatable connection.
Interchangeable focal distances. Fixed focal distances are one of the leading limiting factors in current VR technology. Our eyes naturally change focus to accommodate the real world, while current VR tech renders everything to the same fixed focus. We spent considerable time determining where North Star’s focal distance should be set, and found that it depends on the application. Today we’re releasing two pairs of display mounts – one at 25cm (the same as previous releases) and the other at an arm length’s 75cm. Naturally 75cm is much more comfortable for content further away.
Finally, a little trick we developed for this headgear design: bending 3D prints. An ideal VR/AR headset is light yet strong, but 3D prints are anisotropic – strong in one direction, brittle in another. This means that printing large thin curves will likely result in breaks.
Instead, we printed most of the parts flat. While the plastic is still warm from the print bed, we drape the plastic over a mannequin head. A few seconds later, the plastic cools enough to retain the curved shape. The end result is very strong while using very little plastic.
http://blog.leapmotion.com/north-star-open-source/
At Leap Motion, we envision a future where the physical and virtual worlds blend together into a single magical experience. At the heart of this experience is hand tracking, which unlocks interactions uniquely suited to virtual and augmented reality. To explore the boundaries of interactive design in AR, we created Project North Star, which drove us to push beyond the limitations of existing systems.
Today, we're excited to share the open source schematics of the North Star headset, along with a short guide on how to build one. By open sourcing the design and putting it into the hands of the hacker community, we hope to accelerate experimentation and discussion around what augmented reality can be. The design has been published under an Apache license.
Our goal is for the reference design to be accessible and inexpensive to build, using off-the-shelf components and 3D-printed parts. At the same time, these are still early days and we’re looking forward to your feedback on this initial release. The mechanical parts and most of the software are ready for primetime, while other areas are less developed. The reflectors and display driver board are custom-made and expensive to produce in single units, but become cost-effective at scale. We’re also exploring how the custom components might be made more accessible to everyone.
The headset features two 120 fps, 1600×1440 displays with a field of view covering over a hundred degrees combined. While the classic Leap Motion Controller's FOV is significantly beyond existing AR headsets such as Microsoft HoloLens and Magic Leap One, it felt limiting on the North Star headset. As a result, we used our next-generation ultra-wide tracking module. These new modules are already being embedded directly into upcoming VR headsets, with AR on the horizon.
It’s time to look beyond platforms and form factors, to the core user experience that makes augmented reality the next great computing medium. Let’s build it together.
http://blog.leapmotion.com/project-north-star-mechanical-update-1/
This morning, we released an update to the North Star headset assembly. The project CAD files now fit the Leap Motion Controller and add support for alternate headgear and torsion spring hinges.
With these incremental additions, we want to broaden the ability to put together a North Star headset of your own. These are still works in progress as we grow more confident with what works and what doesn’t in augmented reality – both in terms of industrial design and core user experience.
This alternate 3D printed bracket is a drop-in replacement for Project North Star. Since parts had to move to fit the Leap Motion Controller at the same origin point, we took the opportunity to cover the display driver board and thicken certain areas. Overall these updates make the assembly stiffer and more rugged to handle.
When we first started developing North Star prototypes, we used 3M’s Speedglas Utility headgear. At the time, the optics would bounce around, causing the reflected image to shake wildly as we moved our heads. We minimized this by switching to the stiffer Miller headgear and continued other improvements for several months.
However, the 3M headgear was sorely missed, as it was simple to put on and less intimidating for demos. Since then we added cheek alignment features, which solved the image bounce. As a result we’ve brought back the earlier design as a supplement to the release headgear. The headgear and optics are interchangeable – only the hinges need to match the headgear. Hopefully this enables more choices in building North Star prototypes.
One of the best features of the old headgear was torsion hinges, which we’ve introduced with the latest release. Torsion hinges lighten the clamping load needed to keep the optics from pressing on users’ faces. (Think of a heavy VR HMD – the weight resting on the nose becomes uncomfortable quickly.)
We can’t wait to share more of our progress in the upcoming weeks – gathering feedback, making improvements, and seeing what you’ve been doing with Leap Motion and AR. The community’s progress so far has been inspirational. Given the barriers to producing reflectors, we’re currently exploring automating the calibration process as well as a few DIY low-cost reflector options.
http://blog.leapmotion.com/project-north-star-mechanical-and-calibration-update-3-1/
The future of open source augmented reality just got easier to build. Since our last major release, we’ve streamlined Project North Star even further, including improvements to the calibration system and a simplified optics assembly that 3D prints in half the time. Thanks to feedback from the developer community, we’ve focused on lower part counts, minimizing support material, and reducing the barriers to entry as much as possible. Here’s what’s new with version 3.1.
As we discussed in an earlier post, small variations in the headset's optical components affect the alignment of the left- and right-eye images. We have to compensate for this in software to produce a convergent image that minimizes eye strain.
Before we designed the calibration stand, each headset's screen positions and orientations had to be manually compensated for in software. With the North Star calibrator, we've automated this step using two visible-light stereo cameras. The optimization algorithm finds the best distortion parameters automatically by comparing images inside the headset with a known reference. This means auto-calibration can find the best possible image quality within a few minutes. Check out our for instructions on the calibration process.
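The shape of that search is easy to sketch. The toy below stands in for the real calibrator: a 1-D test pattern, a two-coefficient polynomial distortion model, and a brute-force parameter search in place of the actual optimizer (which also solves for per-eye screen pose from stereo camera images). All names and numbers here are illustrative, not the North Star calibrator's API.

```python
import numpy as np

# Toy version of auto-calibration: find distortion parameters that make
# the "captured" in-headset image match a known reference pattern.
x = np.linspace(-1, 1, 201)

def render(k1, k2):
    """Sample a test pattern through the distortion x -> x(1 + k1*x^2 + k2*x^4)."""
    xw = x * (1 + k1 * x**2 + k2 * x**4)
    return np.cos(3 * xw)

# Pretend the headset has unknown true distortion (0.08, -0.02).
captured = render(0.08, -0.02)

# Brute-force grid search over candidate parameters, keeping the best
# match -- a stand-in for the real optimization algorithm.
k1s = np.linspace(-0.1, 0.1, 41)
k2s = np.linspace(-0.1, 0.1, 41)
errs = [(np.sum((render(k1, k2) - captured) ** 2), k1, k2)
        for k1 in k1s for k2 in k2s]
best = min(errs)
print(best[1], best[2])  # best match: k1 ~ 0.08, k2 ~ -0.02
```

The real system replaces the grid search with a proper optimizer and a richer distortion model, but the loop is the same: render, compare against the reference, adjust parameters, repeat.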
Building on feedback from the developer community, we’ve made the assembly easier and faster to put together. Mechanical Update 3.1 introduces a simplified optics assembly, designated #130-000, that cuts print time in half (as well as being much sturdier).
The biggest cut in print time comes from the fact that we no longer need support material on the lateral overhangs. In addition, two parts were combined into one. This compounding effect saves an entire workday’s worth of print time!
Left: 1 part, 95g, 7-hour print, no supports. Right: 2 parts, 87g, 15-hour print, supports needed.
The new assembly, #130-000, is backwards compatible with Release 3. Its components replace #110-000 (the optics assembly) and #120-000 (the electronics module). Check out the assembly drawings in the for the four parts you need!
Last but not least, we’ve made a small cutout for the power pins on the driver board mount. When we received our , we quickly noticed the interference and made the change to all the assemblies.
This change makes it easy if you’re using pins or soldered wires, either on the top or bottom.
Another problem with the previous designs was slide-away optics. The optics bracket would occasionally slide away from the face, especially when the user looked downward.
While the bleeding edge of Project North Star development is in our San Francisco tech hub, the work of the open source community is a constant source of inspiration. With so many people independently 3D printing, adapting, and sharing our AR headset design, we can’t wait to see what you do next with Project North Star. You can download the latest designs from the.
Project North Star is very much a work in progress. Over the coming weeks, we’ll continue to post updates to the core release package. Let us know what you think in the comments and . If your company is interested in bringing North Star to the world, email us at .
If you’re reading this, odds are good that you already own a Leap Motion Controller. (If you don’t, it’s available today on our .) Featuring the speed and responsiveness of our , its 135° field of view extends beyond the North Star headset’s reflectors. The device can be easily taken in and out of the headset for rapid prototyping across a range of projects.
Two torsion springs constantly apply twisting force to the aluminum legs, counteracting gravity acting on the optics. The end result is that the user can hold the optics suspended just above the nose, and even flip them up completely with little effort. After dusting off the original hinge prototypes, we made rotation limits and other simple modifications (e.g. using the same screws as the rest of the assembly). Check out the for details.
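The sizing logic behind those springs is a one-line moment balance: the spring torque at the hinge has to roughly cancel the gravitational moment of the optics about the hinge axis. The mass and lever-arm numbers below are assumed for illustration only, not measured North Star values.

```python
# Back-of-the-envelope torsion-hinge check: spring torque vs. the
# gravitational moment of the optics about the hinge axis.
m = 0.30   # optics assembly mass in kg (assumed, for illustration)
g = 9.81   # gravitational acceleration, m/s^2
d = 0.10   # horizontal distance from hinge axis to center of mass, m (assumed)

gravity_torque = m * g * d       # moment gravity exerts about the hinge, N*m
per_spring = gravity_torque / 2  # two springs share the load, one per side

print(f"total {gravity_torque:.3f} N*m, {per_spring:.3f} N*m per spring")
```

With numbers like these, each spring only needs to supply a fraction of a newton-meter, which is why small off-the-shelf torsion springs (see the #630-xxx parts in the BOM) are enough to hold the optics up.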
You can catch up on the updated parts on the Project North Star , and print the latest files from our . Come back soon for the latest updates!
Want to stay in the loop on the latest North Star updates? Join the !
| Part number | Quantity | Description | Source |
| --- | --- | --- | --- |
| #110-001 | 1 | OPTICS BRACKET MAIN BODY | 3DPRINT |
| #110-002 | 2 | SLIDE ENDCAP | 3DPRINT |
| #111-001 | 0 | LEFT DISPLAY TRAY - 75cm FOCAL | 3DPRINT |
| #112-001 | 0 | RIGHT DISPLAY TRAY - 75cm FOCAL | 3DPRINT |
| #111-002 | 1 | LEFT DISPLAY TRAY - 25cm FOCAL | 3DPRINT |
| #112-002 | 1 | RIGHT DISPLAY TRAY - 25cm FOCAL | 3DPRINT |
| #113-001 | 1 | LEFT ROTATION ADJUSTMENT SLIDE MOUNT | 3DPRINT |
| #114-001 | 1 | RIGHT ROTATION ADJUSTMENT SLIDE MOUNT | 3DPRINT |
| #115-001 | 1 | FOREHEAD REST BASE SHAPE | 3DPRINT |
| #120-001 | 1 | HALO MAIN BODY | 3DPRINT |
| #120-002 | 1 | HALO LID | 3DPRINT |
| #121-001 | 1 | HALO ELECTRONICS HOLDER | 3DPRINT |
| #121-002 | 1 | HALO ELECTRONICS DP PORT HOLDER | 3DPRINT |
| #130-001 | 0 | SIMPLIFIED OPTICS MAIN BODY | 3DPRINT |
| #131-001 | 0 | SIMPLIFIED DRIVER BOARD MOUNT | 3DPRINT |
| #131-002 | 0 | SIMPLIFIED DP PORT HOLDER | 3DPRINT |
| #210-001 | 1 | FOREHEAD MAIN STRUCTURE | 3DPRINT |
| #210-002 | 1 | FOREHEAD HEADGEAR SPAN | 3DPRINT |
| #210-003 | 2 | FOREHEAD HINGE CAP | 3DPRINT |
| #210-004 | 2 | FOREHEAD HINGE BASE | 3DPRINT |
| #220-001 | 2 | REAR HINGE BASE | 3DPRINT |
| #220-002 | 2 | REAR HINGE CAP | 3DPRINT |
| #230-001 | 1 | LEFT SIDE MAIN SPAN WITH SLIDE BRAKE | 3DPRINT |
| #230-002 | 1 | LEFT BRAKE HOUSING | 3DPRINT |
| #230-003 | 1 | LEFT BRAKE BUTTON | 3DPRINT |
| #230-004 | 1 | LEFT CABLE GUIDE | 3DPRINT |
| #240-001 | 1 | RIGHT SIDE MAIN SPAN WITH SLIDE BRAKE | 3DPRINT |
| #240-002 | 1 | RIGHT BRAKE HOUSING | 3DPRINT |
| #240-003 | 1 | RIGHT BRAKE BUTTON | 3DPRINT |
| #240-004 | 1 | RIGHT CABLE GUIDE | 3DPRINT |
| #500-001 | 1 | REFLECTORS | |
| #630-001 | 1 | 0.304-in OD TORSION SPRING 180 DEG. L-HAND WOUND | |
| #630-002 | 1 | 0.304-in OD TORSION SPRING 180 DEG. R-HAND WOUND | |
| #640-001 | 28 | M2.5x0.45 6mm LONG STEEL FLAT HEAD SCREW 90 DEG CS | |
| #640-002 | 2 | M4x0.7 20mm STEEL BUTTON HEAD SOCKET SCREW | |
| #650-001 | 36 | M2x8mm LONG THREAD-FORMING SCREWS FOR PLASTIC | |
| #660-001 | 2 | M4x0.7mm ZINC PLATED NYLON INSERT HEX LOCK NUT | |
| #670-001 | 2 | 5/16 ID x 3/4 OD x 1/16in NEOPRENE WASHER | |
| #690-001 | 21 | M2.5x0.45 3.4mm LONG HEAT-SET INSERT FOR PLASTICS | |
| #700-002 | 2 | BOE 3.5-in VS035ZSM-NW0 120HZ LTPS 1440*1600 Module | |
| #800-001 | 1 | DISPLAY DRIVER BOARD | |
| #900-001 | 1 | MILLER WELDING GEN. III HEADGEAR | |
| #900-002 | 1 | 2mm THICK SELF-STICK ANTI-SKID RUBBER | |
| #900-003 | 1 | FOAM FOREHEAD PADDING | |
| #900-004 | 1 | FOAM BACKED FABRIC | |
| #900-005 | 1 | 3x10x165mm 6061 AL. BAR SLIDE | |
Pulled from https://leapmotion.github.io/ProjectNorthStar/mechanical.html
This Assembly Guide is for version 1 of the headset, for the most recent version click here
The purpose of this guide is to show, with access to a few common tools, how to make a Project North Star reference AR headset. It aims to be accessible and inexpensive to build, using as many off-the-shelf components and 3D-printed parts as possible. For now, several key components are custom-made (e.g. the reflectors, display driver board, and our custom ultra-wide hand tracking module), but we know that together we can find workarounds and alternatives.
Although not a complete AR solution, Project North Star points to a future where the physical and virtual worlds blend together into a single magical experience. With it we hope to gather like-minded people and build that future together.
The headset is made up of three modular assemblies: the optics bracket, the headgear with hinges, and the halo (which contains all the electronics). This configuration has been very useful as we iterated on different parts. The parts are 3D printed using consumer-grade 3D printers and materials, with a couple of parts from McMaster-Carr. The goal is an easily reproduced AR headset that anyone can put together.
Ignoring the reflectors, you'll need roughly 20+ hours of 3D printing, plus cutting, drilling, and tapping aluminum bar stock. Nothing expensive or major requiring a machine shop, but a few tools are expected. Read below for the list.
We are exploring different possibilities for individuals to obtain reflectors.
These parts and this guide are still a work in progress
3D printer and software: MakerGear M2, Simplify3D, print profiles: Rsilver's M2 Simplify 3D optimized profiles, Thingiverse
Filament: eSun PLA+, Amazon
T6 Torx driver for thin-plastic screws Torx T6 driver, McMaster
Phillips driver for M2.5 screws Screwdriver Phillips #1 size, Amazon
M2.5 Screw thread tap *: M2.5 x 0.45 mm Thread Tap, McMaster
Hot glue gun (on low setting < 40W), for reflectors Hot glue gun, Amazon
Latex or Nitrile gloves (minimize fingerprints on optical components) Sterile gloves, Amazon
Optional: These are our favorite 3D print finishing tools (wire cutters, chisel, file, sand paper, hand drill, etc.)
The Leap Motion hand tracking module you use determines which set of 3D prints you'll need. Check the 3D printing section for more details.
Retail Leap Motion Controller - Amazon -or- Custom ultra-wide tracking module
See the Electronics folder in the main repo for display driver schematics and firmware.
BOE produces 3.5-inch 1440*1600 modules, referred to as VS035ZSM-NW0; however, they are not listed publicly because they are only sold in large quantities. The panel listed as VS035ZSM-NH0 is no longer in production, nor have we tested it with our driver board. More broadly, it may be possible to make the hardware more accessible through bulk orders of the core components, which we are currently exploring.
What may seem intimidating at first is conceptually very simple. One property of ellipsoids is that all the light emitted from one focus reflects off the surface and converges at the second focus! Amazing. Therefore, to figure out the section of ellipsoid needed in our AR application, we traced rays backwards from the second focus through a cellphone-sized display.
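This focus-to-focus property is easy to verify numerically. The sketch below checks it on a 2-D ellipse cross-section with illustrative semi-axes (not the North Star reflector dimensions): a ray leaving one focus, reflected about the surface normal, points exactly at the other focus.

```python
import numpy as np

# Ellipse x^2/a^2 + y^2/b^2 = 1, with foci at (+/-c, 0), c = sqrt(a^2 - b^2).
a, b = 2.0, 1.0  # illustrative semi-axes, not the North Star dimensions
c = np.sqrt(a**2 - b**2)
f1, f2 = np.array([-c, 0.0]), np.array([c, 0.0])

# Pick an arbitrary point on the ellipse.
t = 0.7
p = np.array([a * np.cos(t), b * np.sin(t)])

# Outward surface normal (gradient of the implicit equation, normalized).
n = np.array([p[0] / a**2, p[1] / b**2])
n /= np.linalg.norm(n)

# Ray from focus 1 to the surface point, mirror-reflected about the normal.
d = (p - f1) / np.linalg.norm(p - f1)
r = d - 2 * np.dot(d, n) * n

# The reflected ray should point straight at focus 2.
to_f2 = (f2 - p) / np.linalg.norm(f2 - p)
print(np.allclose(r, to_f2))  # True
```

Put an eye at one focus and a display at (or optically near) the other, and every on-axis ray the display emits lands at the pupil, which is exactly the geometry the reflector section exploits.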
The resultant ellipsoid is defined by the following dimensions, chosen to allow a range of 3-to-5-inch LCD panels to be placed near the eye. The perimeter of the ellipsoid section was decided on empirically by minimizing distortion with a larger prototype, then cropping to the reflected image. This large reflector was roughly 120x120mm, and the final reflector ended up being around 75x75mm.
The reflectors are diamond-turned and milled from PMMA (a.k.a. acrylic). The outside surface needs an anti-reflective coating to prevent a secondary reflection from degrading the image. Indoors and subjectively, a 50-50% (transmissive-reflective) coating provided the best results for us.
From the ellipsoid definition, the reflectors need to be rotated and translated into position. We define the origin point as the midpoint between the two reflector foci, which should line up with the user's pupils:
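That rotate-then-translate step is an ordinary rigid transform per eye. The sketch below builds one from a yaw angle and an offset from the head-centered origin; the angle and offset values are placeholders for illustration, not the released CAD or calibration values.

```python
import numpy as np

# Sketch of placing a reflector relative to the head-centered origin
# (the midpoint between the two reflector foci, aligned with the pupils).
def reflector_pose(yaw_deg, offset):
    """4x4 rigid transform: rotate about the vertical (y) axis, then translate."""
    t = np.radians(yaw_deg)
    T = np.eye(4)
    T[:3, :3] = [[ np.cos(t), 0.0, np.sin(t)],
                 [ 0.0,       1.0, 0.0      ],
                 [-np.sin(t), 0.0, np.cos(t)]]
    T[:3, 3] = offset
    return T

# Placeholder toe-in angles and offsets (meters), mirrored left/right.
left  = reflector_pose(+25.0, [-0.032, 0.0, 0.075])
right = reflector_pose(-25.0, [+0.032, 0.0, 0.075])

# The reflector's local origin maps to its offset from the head origin.
print(left @ np.array([0.0, 0.0, 0.0, 1.0]))
```

In practice the same pose matrices feed both the mechanical CAD and the rendering/distortion pipeline, which is why small per-headset deviations in them are what the calibration step has to recover.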
Light and adjustable, with rigid mount points. Several welder's-mask headgear vendors were tested, but most were neither rigid nor comfortable enough for our application. Their general design consists of injection-molded plastic features that tighten onto either side of a welding faceshield. Although flexibility and compliance may be desirable for welding, our heavier AR assemblies caused the headgear to flex and/or dig into the user's head. Ouch. The following Miller-branded models had very little flex and were generally comfortable. It also helps that they're lightweight.
These two specific generations of headgear quickly yielded the best results when mounting our evolving electronics on a wide variety of heads. The number of adjustment points makes them work for the large majority of users.
Welding Replacement Headgear - Miller Generation IV for T94 Headgear, Amazon Part #260486 or Miller Replacement Headgear, Amazon Part #256174
Alternative:
The 3M Speedglas Utility headgear was the first headgear we used for North Star prototypes. At the time, the optics would bounce around, causing the reflected image to shake wildly as we moved around. We minimized this by switching to the stiffer Miller headgear and continued other improvements for several months. However, the 3M headgear was sorely missed, as it was simple to put on and less intimidating for demos. Since then, we've added cheek alignment features, which solved the image bounce, so it makes sense to bring this design back as a supplement to the release headgear. The headgear and optics are interchangeable; only the hinges need to match the headgear.
Welding Replacement Headgear - 3M Speedglas Welding Helmet Headband, Amazon
Machine screws, 10-32 thread size, 7/8" long "Steel Button Head Hex Drive Screw" - McMaster #92949a270
Matching nuts, 10-32 thread size, stainless "Steel Nylon-Insert Locknut" - McMaster #91831a411
Cut, drill, and tap the slides to roughly 150mm (6 inches) in length. We used a Dremel rotary tool to cut the bar stock to length, and transferred the hole pattern from the halo 3D print using a transfer punch. The two holes are tapped using an M2.5 x 0.45 mm thread tap (McMaster).
With cost and speed being the most important design parameters, this DIY method performs as well as a machined part at roughly 100x lower cost. Without access to a CNC metal workshop, it's surprisingly expensive to have one or two parts machined.
Each component has suggested orientation and print settings for the best performance and least print time. Proceed to the 3D print section for information on each print: 3D Printing Part Reference
The North Star headset has three main sub-assemblies: the optics bracket, the electronics tray, and the adjustable headgear.
(see below for alternate headgear with torsion hinge)
1) Verify the aluminum bar stock slides without resistance inside the 3D printed hinges. Tip: Repeatedly sliding the bar stock quickly through a new hinge clears out any obtrusive burrs, and the heat generated shapes the plastic.
2) Disassemble the old hinges from the headgear and set aside the o-rings:
3) Assemble the new hinge as shown below, using screw/nut H. Tighten until the o-ring is still visible and the hinge has considerable resistance when turning.
4) Slide the hinge onto the headgear, aligning it with the front of the track. Lock it in place using screw F.
1) Remove the mounting assemblies on both sides of the headgear and enlarge the hole to 18mm. We used a stepped drill bit, a little larger than the diameter of the spring we're using.
2) Place the matching spring into the rear housing, fasten it closed, and trim the spring legs.
3) Place the outer hinge housing onto the spring in the orientation shown. Using the perimeter fasteners, clamp down onto the spring leg, aligning it with the cutout.
4) Fasten the large center locknut and screw, adjusting it for the desired resistance. Trim the remaining spring leg.
5) Fasten display driver B to the 3D printed bracket using 3x F screws. Fasten hand tracking module A to the 3D print using the screws from its enclosure.
6) Place the USB cable into the USB strain relief, and fasten it to the electronics tray using 2x F screws.
7) Adjust the USB cable until the flex connector lines up with its mating connector and isn't strained.
8) Fasten the electronics tray to the halo using 2x F screws.
9) Fasten the optics bracket to the halo using 6x F screws.
10) Fasten the displays and facial interfaces into the optics bracket using 4x G screws per side (8 total).
11) Carefully connect the display flex connector to the driver board.
12) Fasten the lid to the halo using 4x F screws from the bottom as shown.
13) Secure the wires to the lid using cable ties through the anchors.
14) Verify the reflectors align with the tabs in the optics bracket; scrape away any offending plastic if not. Using hot glue, "tack" the reflectors into the halo bracket. Don't use too much, and work quickly so the glue doesn't solidify before the reflectors are in place.
15) Fasten the slides to the assembled optics bracket using 4x G screws.
16) Make adjustments for the user's head.
Not optimal for users with glasses
Electronics bracket hard to access/service once installed
No wire management for display flex connectors
Imbalanced weight due to cables: USB, display port, and power
Pulled from https://leapmotion.github.io/ProjectNorthStar/3dprints.html
Note: this reference is for the initial Version 1 release; however, it has useful information for 3D printing in general and is still worth reading.
Use your slicer's part orientation tools to ensure flat contact with print surface. In Simplify 3D, it can be found in the menu "Edit > Place Surface On Bed"
The optics bracket is the most difficult part to FDM print. The toed-in optics and long thin beams require a tuned printer and a large print bed (200x250mm). To reduce the chance of print failure, support material is manually removed where it is thin and tall (see image below). Furthermore, to reduce the finishing work, the print is separated into 2 print profiles: fast and high-res. The high-res profile's goal is to reduce stepping on the near-horizontal crossbeam. Afterwards, a file and 220-grit sandpaper very quickly make the surface passable without the need for filler. Although printing the entire bracket with one profile also produces good brackets, this hybrid solution saved time and produced better results without much effort. It makes sense if a file and sandpaper are already available.
All the optical components are mounted to this bracket. Take your time and be meticulous; the care taken in producing a good print will save time during calibration.
Choose the correct halo for the Leap Motion module you're using. There are currently two variants: one for the released Leap Motion Controller and one for our custom ultra-wide FOV module (not yet available).
Choose the matching lid for the halo you're using. There are currently two variants: one for the released Leap Motion Controller and one for our custom ultra-wide FOV module (not yet available).
Only needed for the custom hand tracking module. If you're using the Leap Motion Controller inside a metal housing, you don't need this part.
Only needed for the custom hand tracking module. If you're using the Leap Motion Controller inside a metal housing, you don't need this part.
The print is done using multiple print profiles to minimize the need for support material. By printing the feature at a lower temperature, the plastic can span longer distances without support. Also, align the print roughly as shown (~45 degrees) to prevent the support material from printing orthogonal to the lip, which makes it harder to remove in post.
After printing, remove and clean up any support material so that the display slides in easily. We used a sharp chisel to scrape away any remaining plastic burrs.
Check that the lip of the part is flat and consistent throughout to prevent extra light from leaking around the edges of the LCD panels.
An optional step is tapping the screw holes to make sure any extra plastic is removed and the screws fasten smoothly.
There are two styles of hinges to choose from depending on the headgear you're using.
Print from 0 - 5mm: 0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: yes (60% infill)

Print from 5 - 32mm: 0.15mm layer height, 20% rectilinear infill, 6 top/bottom layers, 3 perimeter walls, 200 C, supports: yes (40% infill)

Print from 32mm - end: 0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: yes (60% infill)

0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 3 perimeter walls, 200 C, supports: yes

0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: yes

0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: no

0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: no

Print from 0 - 3mm: 0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: yes

Print from 3mm - end: 0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 190 C, supports: yes

0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: yes

0.25mm layer height, 20% rectilinear infill, 3 top/bottom layers, 2 perimeter walls, 200 C, supports: yes
| BOM Ref. | Part Name | Quantity | Procurement |
| --- | --- | --- | --- |
| A | Leap Motion Controller | 1 | |
| B | Display Driver | 1 | TBD |
| C | 3.5-inch 120Hz LCDs | 2 | TBD |
| D | Acrylic reflectors | 2 | |
| E | Headgear | 1 | |
| F | Self-tapping screws for plastic | 1 pk | |
| G | Machine screws M2.5 x 6mm | 1 pk | |
| H | Hinge fasteners, #10-32 | 1 pk ea. | |
| I | Al. bar stock 10x3mm | 12-in [300 mm] | |
| J | 3D printed parts | - | |