Introduction
This guide gives developers an overview of the Microsoft Windows 8.1 sensor application programming interfaces (APIs) for Desktop and Windows Store applications, with a specific focus on the sensor capabilities available in Desktop mode. It summarizes the APIs that enable interactive applications using common sensors such as accelerometers, magnetometers, and gyroscopes.
Programming Choices for Windows 8.1
Developers have multiple API choices for programming sensors on Windows 8.1. The touch-friendly app environment is called “Windows Store apps.” Windows Store apps can run software developed with the Windows Runtime (WinRT) interface. The WinRT sensor API represents a portion of the overall WinRT library. For more details, please refer to the MSDN Sensor API library.
Traditional Win Forms or MFC-style apps are called “Desktop apps” because they run in the Desktop Window Manager environment. Desktop apps can use the native Win32*/COM API, a .NET-style API, or a subset of select WinRT APIs.
The following is a list of WinRT APIs that can be accessed by Desktop apps:
- Windows.Devices.Sensors (Accelerometer, Gyrometer, Ambient Light Sensor, Orientation Sensor...)
- Windows.Networking.Proximity.ProximityDevice (NFC)
- Windows.Devices.Geolocation (GPS)
- Windows.UI.Notifications.ToastNotification
- Windows.Globalization
- Windows.Security.Authentication.OnlineId (including LiveID integration)
- Windows.Security.CryptographicBuffer (useful binary encoding/decoding functions)
- Windows.ApplicationModel.DataTransfer.Clipboard (access and monitor Windows 8 Clipboard)
In both cases, the APIs go through a Windows middleware component called the Windows Sensor Framework. The Windows Sensor Framework defines the Sensor Object Model. The different APIs “bind” to that object model in slightly different ways.
Desktop and Windows Store app development differ in several ways. For brevity, this document considers only Desktop app development; for Windows Store app development, please refer to the API Reference for Windows Store apps.
Sensors
There are many kinds of sensors, but we are interested in the ones required for Windows 8.1, namely accelerometers, gyroscopes, ambient light sensors, compass, and GPS. Windows 8.1 represents the physical sensors with object-oriented abstractions. To manipulate the sensors, programmers use APIs to interact with the objects. Table 1 shows how the sensors can be accessed from both Windows 8.1 Desktop apps and Windows Store apps.
Windows 8.1 Desktop mode apps can use the first three toolsets below (C++, C#/VB, and JavaScript*/HTML5); Windows Store apps use the last two (C++, C#, VB & XAML, and Unity* 4.2).

Feature/Toolset | C++ | C#/VB | JavaScript*/HTML5 | C++, C#, VB & XAML | Unity* 4.2 |
---|---|---|---|---|---|
Orientation Sensors | Yes | Yes | Yes | | Yes |
Table 1.Features Matrix for Windows* 8.1 Developer Environments
As Figure 1 shows, there are more sensor objects than actual hardware sensors. Windows defines some “logical sensor” objects by combining information from multiple physical sensors. This is called “Sensor Fusion.”
Figure 1. Different sensors supported, starting on Windows* 8
Sensor Fusion
Physical sensor chips have some inherent natural limitations. For example:
- Accelerometers measure linear acceleration, which is a measurement of the combined relative motion and the force of Earth’s gravity. If you want to know the computer’s tilt, you’ll have to do some mathematical calculations.
- Magnetometers measure the strength of magnetic fields, which indicate the direction of the Earth’s Magnetic North Pole.
These measurements are subject to inherent drift, which can be corrected using raw data from the gyro. Both measurements also depend (scaled) on the computer’s tilt from level with respect to the Earth’s surface. For example, to obtain the computer’s heading with respect to the Earth’s True North Pole (the Magnetic North Pole is in a different position and moves over time), corrections must be applied.
Sensor Fusion (Figure 2) means obtaining raw data from multiple physical sensors, especially the accelerometer, gyro, and magnetometer, performing mathematical calculations to correct for natural sensor limitations, computing more human-usable data, and representing those results as logical sensor abstractions. The transformations from physical sensor data to the abstract sensor data must be implemented somewhere. If the system design has a SensorHub, the fusion operations take place inside the microcontroller firmware. If the system design does not have a SensorHub, the fusion operations must be done inside one or more device drivers that the IHVs and/or OEMs provide.
Figure 2. Sensor fusion via combining output from multiple sensors
Identifying Sensors
To manipulate a sensor, a system is needed to identify and refer to it. The Windows Sensor Framework defines a number of categories that sensors are grouped into. It also defines a large number of specific sensor types. Table 2 lists some of the sensors available for Desktop applications.
All sensor types belong to the umbrella category “All,” which spans the nine categories below.

Biometric | Electrical | Environmental | Light | Location | Mechanical | Motion | Orientation | Scanner |
---|---|---|---|---|---|---|---|---|
Human Presence | Capacitance | Atmospheric Pressure | Ambient Light | Broadcast | Boolean Switch | Accelerometer 1D | Compass 1D | Barcode |
Human Proximity* | Current | Humidity | Gps | Boolean Switch Array | Accelerometer 2D | Compass 2D | Rfid | |
Touch | Electrical Power | Temperature | Static | Force | Accelerometer 3D | Compass 3D | ||
Inductance | Wind Direction | Multivalue Switch | Gyrometer 1D | Device Orientation | ||||
Potentiometer | Wind Speed | Pressure | Gyrometer 2D | Distance 1D |
Resistance | Strain | Gyrometer 3D | Distance 2D | |||||
Voltage | Weight | Motion Detector | Distance 3D | |||||
Speedometer | Inclinometer 1D | |||||||
Inclinometer 2D | ||||||||
Inclinometer 3D |
Table 2. Sensor types and categories
The sensor types required by Windows 8.1 are the following:
- Accelerometer, Gyro, Compass, and Ambient Light are the required “real/physical” sensors
- Device Orientation and Inclinometer are the required “virtual/fusion” sensors (note that the Compass also includes fusion-enhanced/tilt-compensated data)
- GPS is a required sensor if a WWAN radio exists, otherwise GPS is optional
- Human Proximity is an oft-mentioned possible addition to the required list, but, for now, it’s not required.
All of these constants correspond to Globally Unique IDs (GUIDs). Table 3 below lists a sample of sensor categories and types, the names of the constants for Win32/COM and .NET, and their underlying GUID values.
Identifier | Constant (Win32*/COM) | Constant (.NET) | GUID |
---|---|---|---|
Category “All” | SENSOR_CATEGORY_ALL | SensorCategories.SensorCategoryAll | {C317C286-C468-4288-9975-D4C4587C442C} |
Category Biometric | SENSOR_CATEGORY_BIOMETRIC | SensorCategories.SensorCategoryBiometric | {CA19690F-A2C7-477D-A99E-99EC6E2B5648} |
Category Electrical | SENSOR_CATEGORY_ELECTRICAL | SensorCategories.SensorCategoryElectrical | {FB73FCD8-FC4A-483C-AC58-27B691C6BEFF} |
Category Environmental | SENSOR_CATEGORY_ENVIRONMENTAL | SensorCategories.SensorCategoryEnvironmental | {323439AA-7F66-492B-BA0C-73E9AA0A65D5} |
Category Light | SENSOR_CATEGORY_LIGHT | SensorCategories.SensorCategoryLight | {17A665C0-9063-4216-B202-5C7A255E18CE} |
Category Location | SENSOR_CATEGORY_LOCATION | SensorCategories.SensorCategoryLocation | {BFA794E4-F964-4FDB-90F6-51056BFE4B44} |
Category Mechanical | SENSOR_CATEGORY_MECHANICAL | SensorCategories.SensorCategoryMechanical | {8D131D68-8EF7-4656-80B5-CCCBD93791C5} |
Category Motion | SENSOR_CATEGORY_MOTION | SensorCategories.SensorCategoryMotion | {CD09DAF1-3B2E-4C3D-B598-B5E5FF93FD46} |
Category Orientation | SENSOR_CATEGORY_ORIENTATION | SensorCategories.SensorCategoryOrientation | {9E6C04B6-96FE-4954-B726-68682A473F69} |
Category Scanner | SENSOR_CATEGORY_SCANNER | SensorCategories.SensorCategoryScanner | {B000E77E-F5B5-420F-815D-0270A726F270} |
Type HumanProximity | SENSOR_TYPE_HUMAN_PROXIMITY | SensorTypes.SensorTypeHumanProximity | {5220DAE9-3179-4430-9F90-06266D2A34DE} |
Type AmbientLight | SENSOR_TYPE_AMBIENT_LIGHT | SensorTypes.SensorTypeAmbientLight | {97F115C8-599A-4153-8894-D2D12899918A} |
Type Gps | SENSOR_TYPE_LOCATION_GPS | SensorTypes.SensorTypeLocationGps | {ED4CA589-327A-4FF9-A560-91DA4B48275E} |
Type Accelerometer3D | SENSOR_TYPE_ACCELEROMETER_3D | SensorTypes.SensorTypeAccelerometer3D | {C2FB0F5F-E2D2-4C78-BCD0-352A9582819D} |
Type Gyrometer3D | SENSOR_TYPE_GYROMETER_3D | SensorTypes.SensorTypeGyrometer3D | {09485F5A-759E-42C2-BD4B-A349B75C8643} |
Type Compass3D | SENSOR_TYPE_COMPASS_3D | SensorTypes.SensorTypeCompass3D | {76B5CE0D-17DD-414D-93A1-E127F40BDF6E} |
Type DeviceOrientation | SENSOR_TYPE_DEVICE_ORIENTATION | SensorTypes.SensorTypeDeviceOrientation | {CDB5D8F7-3CFD-41C8-8542-CCE622CF5D6E} |
Type Inclinometer3D | SENSOR_TYPE_INCLINOMETER_3D | SensorTypes.SensorTypeInclinometer3D | {B84919FB-EA85-4976-8444-6F6F5C6D31DB} |
Table 3. Example of Constants and Globally Unique IDs (GUIDs)
Above are the most commonly used GUIDs; many more are available. At first the GUIDs might seem tedious, but there is one good reason for using them: extensibility. Since the APIs don’t care about the actual sensor names (they just pass GUIDs around), vendors can invent new GUIDs for “value add” sensors.
Generating New GUIDs
Microsoft provides a tool in Visual Studio* for generating new GUIDs. Figure 3 shows a screenshot from Visual Studio for doing this. All the vendor has to do is publish them, and new functionality can be exposed without the need to change the Microsoft APIs or any operating system code at all.
Figure 3. Defining new GUIDs for value add sensors
Using Sensor Manager Object
In order for an app to use a sensor, the Windows Sensor Framework needs a way to “bind” a sensor object to the actual hardware. It does this via Plug and Play, using a special object called the Sensor Manager Object.
Ask by Type
An app can ask for a specific type of sensor, such as Gyrometer3D. The Sensor Manager consults the list of sensor hardware present on the computer and returns a collection of matching objects bound to that hardware. Although the Sensor Collection may have 0, 1, or more objects, it usually has only one. Below is a C++ code sample illustrating the use of the Sensor Manager object’s GetSensorsByType method to search for 3-axis gyros and return them in a Sensor Collection. Note that the Sensor Manager Object must first be created with ::CoCreateInstance().
// Additional includes for sensors
#include <InitGuid.h>
#include <SensorsApi.h>
#include <Sensors.h>
// Create a COM interface to the SensorManager object.
ISensorManager* pSensorManager = NULL;
HRESULT hr = ::CoCreateInstance(CLSID_SensorManager, NULL, CLSCTX_INPROC_SERVER,
IID_PPV_ARGS(&pSensorManager));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to CoCreateInstance() the SensorManager."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// Get a collection of all 3-axis Gyros on the computer.
ISensorCollection* pSensorCollection = NULL;
hr = pSensorManager->GetSensorsByType(SENSOR_TYPE_GYROMETER_3D, &pSensorCollection);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to find any Gyros on the computer."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
Ask by Category
An app can request sensors by category, such as all motion sensors. The Sensor Manager consults the list of sensor hardware on the computer and returns a collection of motion objects bound to that hardware. The SensorCollection may have 0, 1, or more objects in it. On most computers, the collection will have two motion objects: Accelerometer3D and Gyrometer3D.
The C++ code sample below illustrates the use of the Sensor Manager object’s GetSensorsByCategory method to search for motion sensors and return them in a sensor collection.

In practice, it is most efficient for an app to request all of the sensors on the computer at once. In that case, the Sensor Manager returns a collection of all the objects bound to sensor hardware. The Sensor Collection may have 0, 1, or more objects in it; on most computers, it will have seven or more.

On Windows, as with most hardware devices, sensors are treated as Plug and Play devices, so they can be connected or disconnected while an app is running. In the context of sensors, a Plug and Play connect is called an Enter event, and a disconnect is called a Leave event. Resilient apps need to handle both. If the app is already running when a sensor is plugged in, the Sensor Manager reports an Enter event for that sensor; however, if the sensors are already plugged in when the app starts running, no Enter events are reported for them. In C++/COM, you must use the SetEventSink method to hook the callback. The callback cannot simply be a function; it must be an entire class that inherits from ISensorManagerEvents and implements IUnknown, providing callback implementations for the ISensorManagerEvents methods. The individual sensor (not the Sensor Manager) reports when a Leave event happens; hooking it works the same way as hooking the Enter callback, and the OnLeave event handler receives the ID of the leaving sensor as an argument. The corresponding code appears in the “Sensor Life Cycle – Enter and Leave Events” section below.

Different types of sensors report different information.
Microsoft calls these pieces of information Data Fields, and they are grouped together in a SensorDataReport. A computer may (potentially) have more than one type of sensor that can supply a given piece of information; the app won’t care which sensor the information came from, so long as it is available. Table 4 shows the constant names for the most commonly used Data Fields for Win32/COM and .NET. Just like sensor identifiers, these constants are just human-readable names for their associated GUIDs. This method of association provides for extensibility of Data Fields beyond those “well known” fields that Microsoft has pre-defined.

Constant (Win32*/COM) | Constant (.NET) | PROPERTYKEY (GUID,PID) |
---|---|---|
SENSOR_DATA_TYPE_TIMESTAMP | SensorDataTypeTimestamp | {DB5E0CF2-CF1F-4C18-B46C-D86011D62150},2 |
SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX | SensorDataTypeLightLevelLux | {E4C77CE2-DCB7-46E9-8439-4FEC548833A6},2 |
SENSOR_DATA_TYPE_ACCELERATION_X_G | SensorDataTypeAccelerationXG | {3F8A69A2-07C5-4E48-A965-CD797AAB56D5},2 |
SENSOR_DATA_TYPE_ACCELERATION_Y_G | SensorDataTypeAccelerationYG | {3F8A69A2-07C5-4E48-A965-CD797AAB56D5},3 |
SENSOR_DATA_TYPE_ACCELERATION_Z_G | SensorDataTypeAccelerationZG | {3F8A69A2-07C5-4E48-A965-CD797AAB56D5},4 |
SENSOR_DATA_TYPE_ANGULAR_VELOCITY_X_DEGREES_PER_SECOND | SensorDataTypeAngularVelocityXDegreesPerSecond | {3F8A69A2-07C5-4E48-A965-CD797AAB56D5},10 |
SENSOR_DATA_TYPE_ANGULAR_VELOCITY_Y_DEGREES_PER_SECOND | SensorDataTypeAngularVelocityYDegreesPerSecond | {3F8A69A2-07C5-4E48-A965-CD797AAB56D5},11 |
SENSOR_DATA_TYPE_ANGULAR_VELOCITY_Z_DEGREES_PER_SECOND | SensorDataTypeAngularVelocityZDegreesPerSecond | {3F8A69A2-07C5-4E48-A965-CD797AAB56D5},12 |
SENSOR_DATA_TYPE_TILT_X_DEGREES | SensorDataTypeTiltXDegrees | {1637D8A2-4248-4275-865D-558DE84AEDFD},2 |
SENSOR_DATA_TYPE_TILT_Y_DEGREES | SensorDataTypeTiltYDegrees | {1637D8A2-4248-4275-865D-558DE84AEDFD},3 |
SENSOR_DATA_TYPE_TILT_Z_DEGREES | SensorDataTypeTiltZDegrees | {1637D8A2-4248-4275-865D-558DE84AEDFD},4 |
SENSOR_DATA_TYPE_MAGNETIC_HEADING_COMPENSATED_MAGNETIC_NORTH_DEGREES | SensorDataTypeMagneticHeadingCompensatedTrueNorthDegrees | {1637D8A2-4248-4275-865D-558DE84AEDFD},11 |
SENSOR_DATA_TYPE_MAGNETIC_FIELD_STRENGTH_X_MILLIGAUSS | SensorDataTypeMagneticFieldStrengthXMilligauss | {1637D8A2-4248-4275-865D-558DE84AEDFD},19 |
SENSOR_DATA_TYPE_MAGNETIC_FIELD_STRENGTH_Y_MILLIGAUSS | SensorDataTypeMagneticFieldStrengthYMilligauss | {1637D8A2-4248-4275-865D-558DE84AEDFD},20 |
SENSOR_DATA_TYPE_MAGNETIC_FIELD_STRENGTH_Z_MILLIGAUSS | SensorDataTypeMagneticFieldStrengthZMilligauss | {1637D8A2-4248-4275-865D-558DE84AEDFD},21 |
SENSOR_DATA_TYPE_QUATERNION | SensorDataTypeQuaternion | {1637D8A2-4248-4275-865D-558DE84AEDFD},17 |
SENSOR_DATA_TYPE_ROTATION_MATRIX | SensorDataTypeRotationMatrix | {1637D8A2-4248-4275-865D-558DE84AEDFD},16 |
SENSOR_DATA_TYPE_LATITUDE_DEGREES | SensorDataTypeLatitudeDegrees | {055C74D8-CA6F-47D6-95C6-1ED3637A0FF4},2 |
SENSOR_DATA_TYPE_LONGITUDE_DEGREES | SensorDataTypeLongitudeDegrees | {055C74D8-CA6F-47D6-95C6-1ED3637A0FF4},3 |
SENSOR_DATA_TYPE_ALTITUDE_ELLIPSOID_METERS | SensorDataTypeAltitudeEllipsoidMeters | {055C74D8-CA6F-47D6-95C6-1ED3637A0FF4},5 |
Table 4. Data Field identifier constants

One thing that makes Data Field identifiers different from sensor IDs is the use of a data type called PROPERTYKEY. A PROPERTYKEY consists of a GUID (similar to what sensors have) plus an extra number called a “PID” (property ID). You might notice that the GUID part of a PROPERTYKEY is common for sensors in the same category.

Data Fields have a native data type for their values, such as Boolean, unsigned char, int, float, double, etc. In Win32/COM, the value of a Data Field is stored in a polymorphic data type called PROPVARIANT. In .NET, there is a CLR (Common Language Runtime) data type called “object” that does the same thing. The polymorphic data type must be queried and/or typecast to the “expected”/“documented” data type.

The SupportsDataField() method of the sensor should be used to check sensors for the Data Fields of interest. This is the most common programming idiom used to select sensors.
Depending on the usage model of the app, only a subset of the Data Fields may be required. Sensors that support the desired Data Fields should be selected; the SupportsDataField() method is used for this check (see the “Picking Sensors for an App” sample below). Typecasting is required to assign the sub-classed member variables from the base class sensor.

In addition to Data Fields, sensors have Properties that can be used for identification and configuration. Table 5 shows the most commonly used Properties. Just like Data Fields, Properties have constant names used by Win32/COM and .NET, and those constants are really PROPERTYKEY numbers underneath. Properties are extensible by vendors and also use PROPVARIANT polymorphic data types. Unlike Data Fields, which are read-only, Properties can be read/write. It is up to each individual sensor whether it rejects Write attempts; because no exception is thrown when a write attempt fails, a write-read-verify sequence should be performed.
Identification (Win32*/COM) | Identification (.NET) | PROPERTYKEY (GUID,PID) |
---|---|---|
SENSOR_PROPERTY_PERSISTENT_UNIQUE_ID | SensorID | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},5 |
WPD_FUNCTIONAL_OBJECT_CATEGORY | CategoryID | {8F052D93-ABCA-4FC5-A5AC-B01DF4DBE598},2 |
SENSOR_PROPERTY_TYPE | TypeID | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},2 |
SENSOR_PROPERTY_STATE | State | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},3 |
SENSOR_PROPERTY_MANUFACTURER | SensorManufacturer | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},6 |
SENSOR_PROPERTY_MODEL | SensorModel | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},7 |
SENSOR_PROPERTY_SERIAL_NUMBER | SensorSerialNumber | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},8 |
SENSOR_PROPERTY_FRIENDLY_NAME | FriendlyName | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},9 |
SENSOR_PROPERTY_DESCRIPTION | SensorDescription | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},10 |
SENSOR_PROPERTY_MIN_REPORT_INTERVAL | MinReportInterval | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},12 |
SENSOR_PROPERTY_CONNECTION_TYPE | SensorConnectionType | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},11 |
SENSOR_PROPERTY_DEVICE_ID | SensorDevicePath | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},15 |
SENSOR_PROPERTY_RANGE_MAXIMUM | SensorRangeMaximum | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},21 |
SENSOR_PROPERTY_RANGE_MINIMUM | SensorRangeMinimum | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},20 |
SENSOR_PROPERTY_ACCURACY | SensorAccuracy | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},17 |
SENSOR_PROPERTY_RESOLUTION | SensorResolution | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},18 |

Configuration (Win32*/COM) | Configuration (.NET) | PROPERTYKEY (GUID,PID) |
---|---|---|
SENSOR_PROPERTY_CURRENT_REPORT_INTERVAL | ReportInterval | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},13 |
SENSOR_PROPERTY_CHANGE_SENSITIVITY | ChangeSensitivity | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},14 |
SENSOR_PROPERTY_REPORTING_STATE | ReportingState | {7F8383EC-D3EC-495C-A8CF-B8BBE85C2920},27 |

The sensitivity setting is a very useful Property of a sensor. It can be used to assign a threshold that controls or filters the number of SensorDataReports sent to the host computer.
In this way, traffic can be reduced: only those DataUpdated events that are truly worth bothering the host CPU are sent up. Microsoft has defined the data type of this Sensitivity property as a container type called IPortableDeviceValues in Win32/COM and SensorPortableDeviceValues in .NET. This container holds a collection of tuples, each of which is a Data Field PROPERTYKEY followed by the sensitivity value for that Data Field. The sensitivity always uses the same units of measure and data type as the matching Data Field.

Some information provided by sensors may be considered sensitive, i.e., Personally Identifiable Information (PII). Data Fields such as the computer’s location (e.g., latitude and longitude) could be used to track the user. For this reason, Windows forces apps to get end-user permission to access the sensor. The State property of the sensor and the RequestPermissions() method of the SensorManager can be used as needed. The RequestPermissions() method takes an array of sensors as an argument, so an app can request permission for more than one sensor at a time. Note that an (ISensorCollection *) must be provided as the argument to RequestPermissions().

Sensors report data by throwing an event called a DataUpdated event. The actual Data Fields are packaged inside a SensorDataReport, which is passed to any attached DataUpdated event handlers. An app obtains the SensorDataReport by hooking a callback handler to the sensor’s DataUpdated event. The event occurs in a Windows Sensor Framework thread, which is a different thread than the message-pump thread used to update the app’s GUI. Therefore, a “hand-off” of the SensorDataReport from the event handler (e.g., Als_DataUpdate) to a separate handler (e.g., Als_UpdateGUI) that can execute on the context of the GUI thread is required. In .NET, such a handler is called a delegate function.
In C++/COM, the SetEventSink method must be used to hook the callback. The callback cannot simply be a function; it must be an entire class that inherits from ISensorEvents and also implements IUnknown, providing callback function implementations for the ISensorEvents methods.

In .NET, the DataUpdated event handler receives the SensorDataReport (and the sensor that initiated the event) as arguments. It calls the Invoke() method of the form to post those items to the delegate function. The GUI thread runs the delegate function posted to its Invoke queue and passes the arguments to it. The delegate function casts the SensorDataReport to the expected subclass, gaining access to its Data Fields, which are extracted using the GetDataField() method of the SensorDataReport object. Each Data Field must be typecast to its “expected”/“documented” data type (from the generic/polymorphic data type returned by GetDataField()). The app can then format and display the data in the GUI.

In C++/COM, the OnDataUpdated event handler receives the SensorDataReport (and the sensor that initiated the event) as arguments. The Data Fields are extracted using the GetSensorValue() method of the SensorDataReport object, and each field’s PROPVARIANT must be checked for its “expected”/“documented” data type. The app can then format and display the data in the GUI. It is not necessary to use the equivalent of a C# delegate, because C++ GUI functions (such as ::SetWindowText()) use Windows message-passing to post the GUI update to the GUI thread / message loop (the WndProc of your main window or dialog box).

Properties of the SensorDataReport object can be referenced to extract Data Fields from the SensorDataReport. This only works for the .NET API and for “well known”/“expected” Data Fields of that particular SensorDataReport subclass.
For the Win32/COM API, the GetDataField method must be used. It is also possible for the underlying driver/firmware to “piggyback” extra “Dynamic Data Fields” inside SensorDataReports; the GetDataField method is used to extract those as well.

Unlike the Desktop mode, the WinRT sensor API follows a common template for each of the sensors. Windows Store apps are often written either in JavaScript* or in C#. There are different language bindings to the API, which result in slightly different capitalization in the API names and slightly different ways that events are handled. The simplified API is easier to use; the pros and cons are listed in Table 6.

Feature | Pros | Cons |
---|---|---|
SensorManager | There is no SensorManager to deal with. Apps use the GetDefault() method to get an instance of the sensor class. | |
Events | Apps only worry about the DataUpdated event. | |
Sensor properties | Apps only worry about the ReportInterval property. | |
Data Report properties | Apps only worry about a few, pre-defined Data Fields unique to each sensor. | |
Table 6. Sensor APIs for Metro Style Apps, pros and cons

Windows 8 APIs give developers the opportunity to take advantage of sensors available on different platforms, under both the traditional Desktop mode and the new Windows Store app interface. This document presented an overview of the sensor APIs available to developers creating Windows 8 applications, focusing on the APIs and code samples for Desktop apps. Many of the Windows 8 APIs were improved with the Windows 8.1 operating system, and this article provides links to many of the relevant samples on MSDN.

Appendix
Coordinate System for Different Form Factors
To figure out the direction of rotation, use the “Right Hand Rule”:
- Point the thumb of your right hand in the direction of one of the axes.
- Your other fingers then curl in the direction of positive rotation about that axis.
These are the X, Y, and Z axes for a tablet form-factor PC or phone (left) and for a clamshell PC (right).
For more esoteric form factors (for example, a clamshell that is convertible into a tablet), the “standard” orientation is when the device is in the TABLET state. To develop a navigation application (e.g., a 3D space game), you must convert between the sensors’ “ENU” (East-North-Up) coordinate system and the one used in your program. This can be done with matrix multiplication; graphics libraries such as Direct3D* and OpenGL* have APIs for handling this.

About the Authors
Gael Hofemeier
Deepak Vembar, PhD

Intel and the Intel logo are trademarks of Intel Corporation in the US and/or other countries.
// Additional includes for sensors
#include <InitGuid.h>
#include <SensorsApi.h>
#include <Sensors.h>
// Create a COM interface to the SensorManager object.
ISensorManager* pSensorManager = NULL;
HRESULT hr = ::CoCreateInstance(CLSID_SensorManager, NULL, CLSCTX_INPROC_SERVER,
IID_PPV_ARGS(&pSensorManager));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to CoCreateInstance() the SensorManager."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// Get a collection of all 3-axis Gyros on the computer.
ISensorCollection* pSensorCollection = NULL;
hr = pSensorManager->GetSensorsByCategory(SENSOR_CATEGORY_MOTION, &pSensorCollection);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to find any Motion sensors on the computer."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
Ask by Category “All”
C++ does not have a GetAllSensors call, so you must use GetSensorsByCategory(SENSOR_CATEGORY_ALL, …) instead as shown in the sample code below.
// Additional includes for sensors
#include <InitGuid.h>
#include <SensorsApi.h>
#include <Sensors.h>
// Create a COM interface to the SensorManager object.
ISensorManager* pSensorManager = NULL;
HRESULT hr = ::CoCreateInstance(CLSID_SensorManager, NULL, CLSCTX_INPROC_SERVER,
IID_PPV_ARGS(&pSensorManager));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to CoCreateInstance() the SensorManager."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// Get a collection of all sensors on the computer.
ISensorCollection* pSensorCollection = NULL;
hr = pSensorManager->GetSensorsByCategory(SENSOR_CATEGORY_ALL, &pSensorCollection);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to find any sensors on the computer."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
Sensor Life Cycle – Enter and Leave Events
Enter Event Callback
STDMETHODIMP OnSensorEnter(ISensor *pSensor, SensorState state);
// Hook the SensorManager for any SensorEnter events.
pSensorManagerEventClass = new SensorManagerEventSink(); // create C++ class instance
// get the ISensorManagerEvents COM interface pointer
HRESULT hr = pSensorManagerEventClass->QueryInterface(IID_PPV_ARGS(&pSensorManagerEvents));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Cannot query ISensorManagerEvents interface for our callback class."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// hook COM interface of our class to SensorManager eventer
hr = pSensorManager->SetEventSink(pSensorManagerEvents);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Cannot SetEventSink on SensorManager to our callback class."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
Below is the C++/COM implementation of the Enter callback. All the initialization steps from the main loop would be performed in this function. In fact, it is more efficient to refactor the code so that the main loop merely calls OnSensorEnter to simulate an Enter event.
STDMETHODIMP SensorManagerEventSink::OnSensorEnter(ISensor *pSensor, SensorState state)
{
// Examine the SupportsDataField for SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX.
VARIANT_BOOL bSupported = VARIANT_FALSE;
HRESULT hr = pSensor->SupportsDataField(SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX, &bSupported);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Cannot check SupportsDataField for SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONINFORMATION);
return hr;
}
if (bSupported == VARIANT_FALSE)
{
// This is not the sensor we want.
return -1;
}
ISensor *pAls = pSensor; // It looks like an ALS, memorize it.
::MessageBox(NULL, _T("Ambient Light Sensor has entered."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONINFORMATION);
.
.
.
return hr;
}
Leave Event
// Hook the Sensor for any DataUpdated, Leave, or StateChanged events.
SensorEventSink* pSensorEventClass = new SensorEventSink(); // create C++ class instance
ISensorEvents* pSensorEvents = NULL;
// get the ISensorEvents COM interface pointer
HRESULT hr = pSensorEventClass->QueryInterface(IID_PPV_ARGS(&pSensorEvents));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Cannot query ISensorEvents interface for our callback class."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
hr = pSensor->SetEventSink(pSensorEvents); // hook COM interface of our class to Sensor eventer
if (FAILED(hr))
{
::MessageBox(NULL, _T("Cannot SetEventSink on the Sensor to our callback class."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
STDMETHODIMP SensorEventSink::OnLeave(REFSENSOR_ID sensorID)
{
HRESULT hr = S_OK;
::MessageBox(NULL, _T("Ambient Light Sensor has left."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONINFORMATION);
// Perform any housekeeping tasks for the sensor that is leaving.
// For example, if you have maintained a reference to the sensor,
// release it now and set the pointer to NULL.
return hr;
}
Picking Sensors for an App
(A table of commonly used sensor data fields appeared here; only fragments of the constant names survive, e.g. "…ES_PER_SECOND", "…SATED_MAGNETIC_NORTH_DEGREES", and "…X/Y/Z_MILLIGAUSS" — angular-velocity, compensated-heading, and magnetic-field-strength fields, respectively.)
// Candidate sensors; set these to NULL before running the search below.
ISensor* m_pAls;
ISensor* m_pAccel;
ISensor* m_pTilt;
// Cycle through the collection looking for sensors we care about.
ULONG ulCount = 0;
HRESULT hr = pSensorCollection->GetCount(&ulCount);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to get count of sensors on the computer."), _T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
for (int i = 0; i < (int)ulCount; i++)
{
hr = pSensorCollection->GetAt(i, &pSensor);
if (SUCCEEDED(hr))
{
VARIANT_BOOL bSupported = VARIANT_FALSE;
hr = pSensor->SupportsDataField(SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX, &bSupported);
if (SUCCEEDED(hr) && (bSupported == VARIANT_TRUE)) m_pAls = pSensor;
hr = pSensor->SupportsDataField(SENSOR_DATA_TYPE_ACCELERATION_Z_G, &bSupported);
if (SUCCEEDED(hr) && (bSupported == VARIANT_TRUE)) m_pAccel = pSensor;
hr = pSensor->SupportsDataField(SENSOR_DATA_TYPE_TILT_Z_DEGREES, &bSupported);
if (SUCCEEDED(hr) && (bSupported == VARIANT_TRUE)) m_pTilt = pSensor;
// ... checks for any additional data fields elided ...
}
}
Sensor Properties
Table 5. Commonly used sensor Properties and PIDs (the table listed each property's Win32*/COM constant alongside its .NET equivalent; the table body did not survive conversion of this document.)
Setting Sensor Sensitivity
// Configure sensitivity
// create an IPortableDeviceValues container for holding the <Data Field, Sensitivity> tuples.
IPortableDeviceValues* pInSensitivityValues;
hr = ::CoCreateInstance(CLSID_PortableDeviceValues, NULL, CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pInSensitivityValues));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to CoCreateInstance() a PortableDeviceValues collection."), _T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// fill in IPortableDeviceValues container contents here: 0.1 G sensitivity in each of X, Y, and Z axes.
PROPVARIANT pv;
PropVariantInit(&pv);
pv.vt = VT_R8; // COM type for (double)
pv.dblVal = (double)0.1;
pInSensitivityValues->SetValue(SENSOR_DATA_TYPE_ACCELERATION_X_G, &pv);
pInSensitivityValues->SetValue(SENSOR_DATA_TYPE_ACCELERATION_Y_G, &pv);
pInSensitivityValues->SetValue(SENSOR_DATA_TYPE_ACCELERATION_Z_G, &pv);
// create an IPortableDeviceValues container for holding the <SENSOR_PROPERTY_CHANGE_SENSITIVITY, pInSensitivityValues> tuple.
IPortableDeviceValues* pInValues;
hr = ::CoCreateInstance(CLSID_PortableDeviceValues, NULL, CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pInValues));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to CoCreateInstance() a PortableDeviceValues collection."), _T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// fill it in
pInValues->SetIPortableDeviceValuesValue(SENSOR_PROPERTY_CHANGE_SENSITIVITY, pInSensitivityValues);
// now actually set the sensitivity
IPortableDeviceValues* pOutValues;
hr = pAls->SetProperties(pInValues, &pOutValues);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to SetProperties() for Sensitivity."), _T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// check to see if any of the setting requests failed
DWORD dwCount = 0;
hr = pOutValues->GetCount(&dwCount);
if (FAILED(hr) || (dwCount > 0))
{
::MessageBox(NULL, _T("Failed to set one-or-more Sensitivity values."), _T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
PropVariantClear(&pv);
// Release the COM collections now that the property has been applied.
pInSensitivityValues->Release();
pInValues->Release();
pOutValues->Release();
Requesting Permissions for Sensors
// Get the sensor's state
SensorState state = SENSOR_STATE_ERROR;
HRESULT hr = pSensor->GetState(&state);
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to get sensor state."), _T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
// Check for access permissions, request permission if necessary.
if (state == SENSOR_STATE_ACCESS_DENIED)
{
// Make a SensorCollection with only the sensors we want to get permission to access.
ISensorCollection *pSensorCollection = NULL;
hr = ::CoCreateInstance(CLSID_SensorCollection, NULL, CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pSensorCollection));
if (FAILED(hr))
{
::MessageBox(NULL, _T("Unable to CoCreateInstance() a SensorCollection."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
pSensorCollection->Clear();
pSensorCollection->Add(pAls); // add 1 or more sensors to request permission for...
// Have the SensorManager prompt the end user for permission (the final TRUE requests a modal dialog).
hr = m_pSensorManager->RequestPermissions(NULL, pSensorCollection, TRUE);
if (FAILED(hr))
{
::MessageBox(NULL, _T("No permission to access sensors that we care about."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
return -1;
}
}
Sensor Data Update
STDMETHODIMP OnEvent(ISensor *pSensor, REFGUID eventID, IPortableDeviceValues *pEventData);
STDMETHODIMP OnDataUpdated(ISensor *pSensor, ISensorDataReport *pNewData);
STDMETHODIMP OnLeave(REFSENSOR_ID sensorID);
STDMETHODIMP OnStateChanged(ISensor* pSensor, SensorState state);
STDMETHODIMP SensorEventSink::OnDataUpdated(ISensor *pSensor, ISensorDataReport *pNewData)
{
HRESULT hr = S_OK;
if ((NULL == pNewData) || (NULL == pSensor)) return E_INVALIDARG;
float fLux = 0.0f;
PROPVARIANT pv = {};
hr = pNewData->GetSensorValue(SENSOR_DATA_TYPE_LIGHT_LEVEL_LUX, &pv);
if (SUCCEEDED(hr))
{
if (pv.vt == VT_R4) // make sure the PROPVARIANT holds a float as we expect
{
// Get the lux value.
fLux = pv.fltVal;
// Update the GUI
// Update the GUI
wchar_t wszLabelText[64];
swprintf_s(wszLabelText, 64, L"Illuminance Lux: %.1f", fLux);
BOOL bSuccess = ::SetWindowText(m_hwndLabel, wszLabelText);
if (bSuccess == FALSE)
{
::MessageBox(NULL, _T("Cannot SetWindowText on label control."),
_T("Sensor C++ Sample"), MB_OK | MB_ICONERROR);
}
}
}
PropVariantClear(&pv);
return hr;
}
Using Sensors in Windows Store apps
Summary
The Windows API reports X, Y, and Z axes in a manner that is compatible with the HTML5 standard (and Android*). It is also called the “ENU” system because X faces virtual “East”, Y faces virtual “North”, and Z faces “Up.”
* Positive angle rotation around that axis will follow the curve of your fingers.
MSDN Resources
About the Authors
Gael is a Software Engineer in the Developer Relations Division at Intel working with Business Client Technologies. Gael holds a BS in Math and an MBA, both from the University of New Mexico. Gael enjoys hiking, biking, and photography.
Deepak Vembar is a Research Scientist in the Interaction and Experience Research (IXR) group at Intel Labs. His research interests lie at the intersection of computer graphics and human-computer interaction, including real-time graphics, virtual reality, haptics, eye tracking, and user interaction. Prior to joining Intel Labs, Deepak was a Software Engineer in Intel's Software and Services Group (SSG), where he worked with PC game developers to optimize their games for Intel platforms, delivered courses and tutorials on heterogeneous platform optimization, and created undergraduate coursework that uses game demos as an instructional medium in school curricula.
Copyright © 2012 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.