Sensors and Autonomous Systems

The Company provides a range of advanced space-based sensors and autonomous systems for mission-critical support to human space flight and robotic missions.

XSS-11 MicroSatellite

The Company is a leading developer of ruggedized smart sensors for real-time environmental sensing in space. These optical and LiDAR solutions incorporate real-time image processing and standardized communication interfaces to support manned and unmanned applications. The Company's autonomous control systems, smart cameras, and scientific instruments for planetary exploration and Earth observation support mission-critical, long-life space exploration applications as well as short-duration space missions.

Space and defence markets are driving demand for low-power, low-mass smart sensor systems with on-board intelligence that reduces the volume of transmitted data, which in turn requires a robust, decentralized computational engine for real-time 2D and 3D imagery. These solutions include vision processing, pose estimation, and low-power/low-mass cameras. The Company has multiple products for these markets, including its 6th-generation MicroCam, a "GoPro™"-size camera system with built-in real-time High Dynamic Range (HDR) imaging and on-board data compression intended for space applications.
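The kind of on-board processing mentioned above (HDR capture followed by compression before downlink) can be illustrated with a minimal sketch. The exposure times, fusion weights, and use of zlib below are assumptions for illustration only and do not describe the MicroCam's actual pipeline.

```python
import numpy as np
import zlib

def fuse_exposures(frames, exposure_times_ms):
    """Merge bracketed uint16 exposures into a rough radiance estimate."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    weight = np.zeros_like(acc)
    for frame, t in zip(frames, exposure_times_ms):
        f = frame.astype(np.float64)
        # Weight mid-range pixels heavily; nearly dark or saturated pixels lightly.
        w = 1.0 - 2.0 * np.abs(f / 65535.0 - 0.5)
        acc += w * (f / t)          # scale by exposure time to estimate radiance
        weight += w
    return acc / np.maximum(weight, 1e-9)

def tone_map_and_compress(radiance):
    """Log tone-map to 8 bits and losslessly compress for downlink."""
    mapped = np.log1p(radiance)
    img8 = (255.0 * mapped / mapped.max()).astype(np.uint8)
    return zlib.compress(img8.tobytes())

# Simulated scene and three bracketed exposures (small frames for illustration).
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 4000.0, (512, 640))
exposures_ms = [1.0, 4.0, 16.0]
frames = [np.clip(scene * t, 0, 65535).astype(np.uint16) for t in exposures_ms]
packet = tone_map_and_compress(fuse_exposures(frames, exposures_ms))
print(f"compressed downlink payload: {len(packet)} bytes")
```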

 
SPACE CAMERAS: flight-certified camera and light systems

Since 1999, the Company has actively designed, developed, refurbished, and flight-certified multiple camera and light systems for use in low Earth orbit, including cameras and illumination systems specifically designed for long life (>10 years) in operation with the Mobile Servicing System (MSS) aboard the International Space Station (ISS).

RCAM

The Mobile Servicing System Replacement Camera (RCAM) is a space-qualified colour camera presently in development for use on the ISS. The camera has a high-resolution, high-dynamic-range image sensor and a 9:1 digital zoom capability. It supports a variety of outputs, including Camera Link and analogue National Television System Committee (NTSC) video. Telemetry for all camera settings, including a unique camera identifier and timestamp, is embedded in the header of each image. The RCAM also includes an LED illuminator. The RCAM will be used during spacecraft berthing operations, for ISS infrastructure inspection, and to support Canadarm2 and "Dextre" robotic operations as well as astronaut EVA activities.
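The header-embedded telemetry described above can be sketched as a simple packing scheme. The field layout, names, and sizes below are illustrative assumptions, not the actual RCAM image format.

```python
import struct
import time

# Assumed layout: camera ID, timestamp, width, height, gain, exposure (ms), zoom step.
HEADER_FMT = ">16s d H H f f B"

def pack_frame(camera_id: str, width: int, height: int,
               gain: float, exposure_ms: float, zoom_step: int,
               pixels: bytes) -> bytes:
    """Prepend a fixed-size telemetry header to the raw pixel payload."""
    header = struct.pack(
        HEADER_FMT,
        camera_id.encode("ascii")[:16],
        time.time(),          # acquisition timestamp
        width, height,
        gain, exposure_ms,
        zoom_step,            # e.g. 0..8 for a 9:1 digital zoom
    )
    return header + pixels

def unpack_header(frame: bytes) -> dict:
    """Recover telemetry from the front of a received frame."""
    size = struct.calcsize(HEADER_FMT)
    cam, ts, w, h, gain, exp, zoom = struct.unpack(HEADER_FMT, frame[:size])
    return {"camera_id": cam.rstrip(b"\x00").decode(), "timestamp": ts,
            "width": w, "height": h, "gain": gain,
            "exposure_ms": exp, "zoom_step": zoom}

frame = pack_frame("RCAM-01", 1920, 1200, 2.0, 8.0, 3, pixels=b"\x00" * 1920 * 1200)
print(unpack_header(frame))
```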

MicroCam

The Company's MicroCam is designed as a flexible camera platform with a minimal size, weight, and power envelope and is qualified for geostationary orbit (GEO). MicroCam is a compact 6-megapixel monochromatic camera with high-dynamic-range imaging, auto-exposure, on-sensor sub-sampling and windowing, as well as adjustable gain, gamma, and black levels. Telemetry for all camera settings, including a unique camera identifier and timestamp, is embedded in the header of each image. The camera has a variety of pre-defined interfaces, a large focal plane array, and a ruggedized C-mount. Optional LED lights are also available. A selection of lenses provides the flexibility needed to address different applications, such as robotic tool tasks, vehicle inspections, satellite deployment verification, rendezvous and proximity operations, and long-range vehicle detection.
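A purely illustrative configuration structure for the adjustable parameters listed above (windowing, sub-sampling, gain, gamma, black level) might look as follows; the field names, sensor dimensions, and limits are assumptions, not the MicroCam command set.

```python
from dataclasses import dataclass

FULL_WIDTH, FULL_HEIGHT = 2832, 2128   # assumed dimensions for a ~6 MP focal plane array

@dataclass
class MicroCamConfig:
    window_x: int = 0          # top-left corner of the readout window
    window_y: int = 0
    window_w: int = FULL_WIDTH
    window_h: int = FULL_HEIGHT
    subsample: int = 1         # read every Nth pixel to cut data volume
    gain: float = 1.0
    gamma: float = 1.0
    black_level: int = 0
    auto_exposure: bool = True

    def validate(self) -> None:
        if not (0 <= self.window_x and self.window_x + self.window_w <= FULL_WIDTH):
            raise ValueError("readout window exceeds sensor width")
        if not (0 <= self.window_y and self.window_y + self.window_h <= FULL_HEIGHT):
            raise ValueError("readout window exceeds sensor height")
        if self.subsample not in (1, 2, 4):
            raise ValueError("unsupported sub-sampling factor")

# Example: a half-resolution window for a proximity-operations task.
cfg = MicroCamConfig(window_w=1416, window_h=1064, subsample=2, gain=2.0)
cfg.validate()
```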

 
RENDEZVOUS / MAPPING LIDAR: systems for rendezvous and proximity operations

Since 2000, the Company has been actively working on "time-of-flight" LIDARs. Teaming with its terrestrial partner Optech, the Company has produced the XSS-11 LIDAR for rendezvous and proximity operations and the Phoenix meteorological LIDAR, which detected snow in the Martian atmosphere. More recently, the Company has developed a rendezvous LIDAR for NASA's OSIRIS-REx mission to rendezvous with the dark asteroid Bennu.

The Company/Optech LIDARs are uniquely designed for long life in the harsh environment of space. They are well suited to high-resolution mapping of objects at significant distances (2-8 km). At shorter ranges, the sensor can be used for relative navigation and proximity operations.
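As a worked example of the time-of-flight principle behind these LIDARs, range is simply half the round-trip travel time of a laser pulse multiplied by the speed of light; the pulse timing below is invented for illustration.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Convert a measured round-trip pulse time to a one-way range."""
    return C * round_trip_s / 2.0

# A return after ~33.4 microseconds corresponds to a target roughly 5 km away,
# i.e. within the 2-8 km mapping regime quoted above.
print(f"{tof_range_m(33.4e-6) / 1000:.2f} km")
```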


 
DATA FUSION AND PROCESSING: increasing the reliability of navigation and mapping data products
The Company has been actively working in the area of data fusion for the last 15 years. The primary objective of data fusion is to increase the reliability of navigation and mapping data products. In the area of mapping, data from visible cameras are combined with data from infrared cameras and 3D points from a LiDAR to provide a multidimensional map of an object, for example, 3D data products with superimposed colour and thermal images showing "hot spots". For relative navigation, sensor fusion increases the accuracy and robustness of relative pose computations: visible camera images and 3D points from the LiDAR are fused via an Extended Kalman Filter (EKF) algorithm to generate a real-time pose of a target satellite relative to the sensor frame.
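A simplified sketch of this kind of EKF fusion is shown below: a constant-velocity relative-motion model is updated with LiDAR 3D points and the camera image coordinates of a tracked feature. The state layout, noise values, and measurement models are assumptions for illustration; the Company's actual filter (full six-degree-of-freedom pose with attitude states) is not reproduced here.

```python
import numpy as np

dt = 0.1
F = np.eye(6); F[:3, 3:] = dt * np.eye(3)        # constant-velocity transition
Q = 1e-4 * np.eye(6)                             # process noise (assumed)
R_lidar = 1e-2 * np.eye(3)                       # LiDAR point noise, m^2 (assumed)
R_cam = 1e-5 * np.eye(2)                         # normalized-pixel noise (assumed)

x = np.array([10.0, 0.0, 50.0, 0.0, 0.0, -0.1])  # [relative position, velocity]
P = np.eye(6)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, h, H, R):
    """Generic EKF update: correct the state with the residual z - h(x)."""
    y = z - h
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(6) - K @ H) @ P

def lidar_update(x, P, point):
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # LiDAR measures position directly
    return update(x, P, point, H @ x, H, R_lidar)

def camera_update(x, P, uv):
    px, py, pz = x[:3]
    h = np.array([px / pz, py / pz])              # pinhole projection (nonlinear)
    H = np.zeros((2, 6))
    H[0, 0], H[0, 2] = 1 / pz, -px / pz**2
    H[1, 1], H[1, 2] = 1 / pz, -py / pz**2
    return update(x, P, uv, h, H, R_cam)

for _ in range(50):                               # fuse alternating measurements
    x, P = predict(x, P)
    x, P = lidar_update(x, P, np.array([10.0, 0.0, 50.0]) + 0.1 * np.random.randn(3))
    x, P = camera_update(x, P, np.array([0.2, 0.0]) + 1e-3 * np.random.randn(2))
print("estimated relative position:", x[:3])
```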
 
AUTONOMOUS NAVIGATION: enabling long-range, fully autonomous navigation
For the last decade, the Company has been developing Guidance, Navigation and Control (GN&C) capabilities to enable long-range, fully autonomous navigation. The maturity of these technologies has been greatly accelerated through field testing in planetary analogue environments using different classes of rovers, ranging from off-the-shelf laboratory rovers to underground mining load-haul-dump vehicles to prototype planetary exploration rovers for the Canadian Space Agency. The Company's extensive field trials continue to prove the robustness, accuracy, and reliability of Company-developed GN&C systems for terrestrial and space applications, and extend its heritage in providing robust control systems for flight programs. Results from recent field trials in the Mojave Desert have further demonstrated the capability for autonomous precision driving beyond line of sight.
 
SOFTWARE AND AVIONICS: design, development, and support of electrical/electro-optic systems
  • Digital Electronics for Signal Processing
  • Custom Algorithms and Signal Processing
  • Computer Architecture
  • Analogue Electronics
  • Electro-mechanical Drive and Control
  • Signal, Command and Control Interfaces
  • Power Interfacing and Conditioning
  • Optical Designs
  • Design for Environment
  • Commercial EEE parts used in space

The Company has a rich heritage in the design, development, and support of electrical/electro-optic systems for space and terrestrial applications. The Company's system designs range from full-custom designs to off-the-shelf assemblies. The main discriminators are:

  • Technical strengths of our Electrical/Electro-Optics group
  • Leveraging of COTS/MOTS components
  • Company investment and hardware in-hand to demonstrate cost-effective solutions to prospective customers

 
INTELLIGENT SYSTEMS: supervisory control layer that sits on top of a regular robotic control architecture
The Company has been conducting R&D in intelligent systems since 2005. The idea of an intelligent system is to have a supervisory control layer that sits on top of a regular robotic control architecture. This layer works in two stages: training and operation. During the training stage, its function is to learn (via Artificial Neural Networks or another self-adapting algorithm) the system behaviour when the robotic system is being controlled by human operators. During the operational stage, it monitors the system behaviour for off-nominal conditions and generates high-level commands to recover. Intelligent supervisory control has been shown to be less mathematically complex than traditional model-based approaches; the introduction of heuristics in many cases allows for simpler control strategies, especially when traditional control methods produce extremely complex mathematical solutions.
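A minimal sketch of this two-stage concept is shown below, with a simple per-channel statistical envelope standing in for the neural network or other self-adapting model; the thresholds, telemetry channels, and recovery command are illustrative assumptions.

```python
import numpy as np

class SupervisoryLayer:
    def __init__(self, threshold_sigma: float = 4.0):
        self.threshold = threshold_sigma
        self.mean = None
        self.std = None

    def train(self, telemetry: np.ndarray) -> None:
        """Training stage: learn nominal behaviour from human-operated runs.

        telemetry: (samples, channels) array, e.g. joint torques and rates."""
        self.mean = telemetry.mean(axis=0)
        self.std = telemetry.std(axis=0) + 1e-9

    def monitor(self, sample: np.ndarray):
        """Operational stage: flag off-nominal behaviour and suggest recovery."""
        deviation = np.abs(sample - self.mean) / self.std
        if np.any(deviation > self.threshold):
            worst = int(np.argmax(deviation))
            return {"command": "HOLD_AND_SAFE", "channel": worst,
                    "sigma": float(deviation[worst])}
        return None  # nominal: leave the underlying controller alone

# Training data from nominal, operator-driven motion (simulated here).
rng = np.random.default_rng(1)
nominal = rng.normal(0.0, 1.0, size=(5000, 4))
supervisor = SupervisoryLayer()
supervisor.train(nominal)

# In operation, an excessive value on channel 2 triggers a recovery command.
print(supervisor.monitor(np.array([0.1, -0.3, 9.0, 0.2])))
```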

Video: OSIRIS-REx Investigates Asteroid Bennu (2:39). Credit: NASA Goddard