The M300XL and M300 feature interaction methods that differ significantly from those of traditional touchscreen Android devices, and it is particularly important to keep these differences in mind when designing the user interface of an application intended to run on these devices.

Existing applications that lean heavily on touchscreen interaction do not translate well to this device, because touchscreen UIs rely on taps at specific screen coordinates, which the available interaction methods cannot provide.


Voice Commands

Voice commands are the ideal method of interacting with the device in many circumstances: they allow users to quickly control the device and provide input without having to physically interact with it and interrupt their workflow.

The device includes a Speech Recognition engine. Refer to the Speech SDK section for additional details on the engine and the speech vocabulary it supports.

Applications can leverage alternate recognition engines by including them within the application itself.

Navigation Buttons

The three navigation buttons on the device include both short and long-press functionality.

The buttons generate KeyEvents which can be intercepted and handled explicitly in your application, or can be left to the system to handle. Reference Android KeyEvent documentation for details.

Short presses on the buttons will perform the following functions:

  • Foremost Button – Moves focus to the right within a UI, or down if no focusable objects are available to the right. Generates a KEYCODE_DPAD_RIGHT KeyEvent.

  • Middle Button – Moves focus to the left within a UI, or up if no focusable objects are available to the left. Generates a KEYCODE_DPAD_LEFT KeyEvent.

  • Rearmost Button – Selects the UI element that currently has focus. Generates a KEYCODE_DPAD_CENTER KeyEvent.

Long presses on the buttons will perform the following functions:

  • Foremost Button – Opens a context menu for the current area of the UI, allowing users to access additional functions without crowding the UI. Generates a KEYCODE_MENU KeyEvent.

  • Middle Button – Returns to the Home screen. Generates a KEYCODE_HOME KeyEvent.

  • Rearmost Button – Moves back one step in the UI. Generates a KEYCODE_BACK KeyEvent.
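
The short- and long-press mappings above can be intercepted in an Activity's onKeyDown(). The sketch below keeps the dispatch logic in a plain helper class so it stands alone: the integer literals are Android's standard KeyEvent keycode constants, the Activity wiring is shown only in comments, and the action names are hypothetical placeholders for your own focus-handling code.

```java
// Sketch: dispatching M300 navigation-button KeyEvents to UI actions.
// The literals match android.view.KeyEvent's public constants
// (KEYCODE_DPAD_LEFT = 21, KEYCODE_DPAD_RIGHT = 22, KEYCODE_DPAD_CENTER = 23);
// in a real Activity, compare keyCode against KeyEvent.KEYCODE_* directly.
public class NavKeyHandler {
    static final int KEYCODE_DPAD_LEFT = 21;
    static final int KEYCODE_DPAD_RIGHT = 22;
    static final int KEYCODE_DPAD_CENTER = 23;

    // Returns the short-press action for a navigation keycode,
    // or null to let the system handle the event.
    public static String actionFor(int keyCode) {
        switch (keyCode) {
            case KEYCODE_DPAD_RIGHT:  return "move-focus-right"; // foremost button
            case KEYCODE_DPAD_LEFT:   return "move-focus-left";  // middle button
            case KEYCODE_DPAD_CENTER: return "select-focused";   // rearmost button
            default:                  return null;               // unhandled
        }
    }

    /* Hypothetical Activity wiring:
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        String action = actionFor(keyCode);
        if (action != null) {
            performUiAction(action); // your focus-handling code
            return true;             // event consumed
        }
        return super.onKeyDown(keyCode, event); // system default handling
    }
    */
}
```

Returning true from onKeyDown() marks the event as consumed; returning the superclass result leaves unhandled keys, such as the long-press KEYCODE_BACK, to the system.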


Touchpad

The device features a two-axis touchpad. Horizontal swipes can be leveraged for left/right navigation and vertical swipes for up/down navigation, while double-tap gestures perform select actions. Requiring a double tap for selection avoids unintentional input when interacting with the device.

The touchpad is implemented as a trackball device, and methods such as dispatchTrackballEvent() and onTrackballEvent() can be used to capture and process the raw touchpad events.

As a fallback, if you do not handle the trackball events in your application, the touchpad will generate KEYCODE_DPAD events which can be captured with standard Android methods. Reference Android KeyEvent documentation for details.

The touchpad generates the following key events:

  • Swipe back to front: KEYCODE_DPAD_RIGHT

  • Swipe front to back: KEYCODE_DPAD_LEFT

  • Swipe bottom to top: KEYCODE_DPAD_UP

  • Swipe top to bottom: KEYCODE_DPAD_DOWN
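
If you handle the touchpad at the trackball level instead, onTrackballEvent() delivers a MotionEvent whose getX()/getY() report relative motion. The helper below is a minimal sketch of classifying that motion into the four swipe directions listed above; the axis convention (x increasing back-to-front, y increasing top-to-bottom, as in standard Android screen coordinates) and the tie-break toward the horizontal axis are assumptions.

```java
// Sketch: mapping a relative touchpad motion (dx, dy) onto the four swipe
// directions. In an Activity this would be fed from onTrackballEvent(), e.g.
// classify(event.getX(), event.getY()).
public class TouchpadSwipe {
    public static String classify(float dx, float dy) {
        if (Math.abs(dx) >= Math.abs(dy)) {
            // Horizontal motion dominates: back-to-front vs. front-to-back.
            return dx > 0 ? "right" : "left";
        }
        // Vertical motion dominates: bottom-to-top vs. top-to-bottom.
        return dy < 0 ? "up" : "down";
    }
}
```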


Proximity Sensors

The device implements two non-contact proximity sensors, accessed through the Android Sensor API as Sensor.TYPE_PROXIMITY sensors:

  • One inward-facing TMD-26723 head sensor, which wakes the device on state change

  • One outward-facing APDS-9960 hand proximity sensor, which does not wake the device

Each of these sensors is a Sensor.REPORTING_MODE_ON_CHANGE device; it generates an event only upon state change and not periodically.

The following code excerpt demonstrates use of the hand proximity sensor:


// One of: Sensor.TYPE_ACCELEROMETER, Sensor.TYPE_AMBIENT_TEMPERATURE,
// Sensor.TYPE_GAME_ROTATION_VECTOR, Sensor.TYPE_GEOMAGNETIC_ROTATION_VECTOR,
// Sensor.TYPE_GRAVITY, Sensor.TYPE_GYROSCOPE, Sensor.TYPE_GYROSCOPE_UNCALIBRATED,
// Sensor.TYPE_HEART_RATE, Sensor.TYPE_LIGHT, Sensor.TYPE_LINEAR_ACCELERATION,
// Sensor.TYPE_MAGNETIC_FIELD, Sensor.TYPE_MAGNETIC_FIELD_UNCALIBRATED,
// Sensor.TYPE_PRESSURE, Sensor.TYPE_PROXIMITY, Sensor.TYPE_RELATIVE_HUMIDITY,
// Sensor.TYPE_ROTATION_VECTOR, Sensor.TYPE_SIGNIFICANT_MOTION,
// Sensor.TYPE_STEP_COUNTER, Sensor.TYPE_STEP_DETECTOR
private final int sensor_type = Sensor.TYPE_PROXIMITY;

// Note: For Vuzix M300XL/M300, wakeUp=true accesses the inward-facing TMD-26723 head
// sensor and wakeUp=false accesses the outward-facing APDS-9960 hand proximity sensor.
private final boolean wakeUp = false;

...

SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
SensorEventListener listener = new mySensorListener();
Sensor sens = sm.getDefaultSensor(sensor_type, wakeUp);
sm.registerListener(listener, sens, 0);

...

private class mySensorListener implements SensorEventListener {
    public void onAccuracyChanged(Sensor sens, int accuracy) {
        ...
    }
    public void onSensorChanged(SensorEvent ev) {
        ...
    }
}

Both the head and hand proximity sensors report a float value in SensorEvent.values[0], delivered through SensorEventListener.onSensorChanged(): 0.0 when proximity is detected, and a value greater than 1.0 when proximity is not detected.
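
The convention above can be wrapped in a small helper for use inside onSensorChanged(); this is a sketch assuming only the documented values (0.0 and greater than 1.0) are reported, and the listener wiring shown in comments is hypothetical.

```java
// Sketch: interpreting SensorEvent.values[0] from either proximity sensor.
// Per the convention above, 0.0 means proximity detected and a value
// greater than 1.0 means no proximity detected.
public class ProximityReading {
    public static boolean detected(float value) {
        return value == 0.0f;
    }

    /* Hypothetical use inside the SensorEventListener:
    public void onSensorChanged(SensorEvent ev) {
        if (detected(ev.values[0])) {
            // head (or hand) is present near the sensor
        }
    }
    */
}
```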
