Working With Sensors
Introduction
What you should already KNOW
What you will LEARN
What you will DO
App overview
Task 1. List the available sensors
Task 2. Get sensor data
Solution code
Coding challenge
Summary
Related concept
Learn more
Many Android-powered devices include built-in sensors that measure motion, orientation,
and environmental conditions such as ambient light or temperature. These sensors can
provide data to your app with high precision and accuracy. Sensors can be used to monitor
three-dimensional device movement or positioning, or to monitor changes in the
environment near a device, such as changes to temperature or humidity. For example, a
game might track readings from a device's accelerometer sensor to infer complex user
gestures and motions, such as tilt, shake, or rotation.
In this practical you learn about the Android sensor framework, which is used to find the
available sensors on a device and retrieve data from those sensors.
The device camera, fingerprint sensor, microphone, and GPS (location) sensor all have their
own APIs and are not considered part of the Android sensor framework.
Query the sensor manager for available sensors, and retrieve information about specific
sensors.
Register listeners for sensor data.
React to incoming sensor data.
App overview
You will build two apps in this practical. The first app lists the available sensors on the device
or emulator. The list of sensors is scrollable if it is too long to fit on the screen.
The second app, modified from the first, gets data from the ambient light and proximity
sensors, and displays that data. Light and proximity sensors are some of the most common
Android device sensors.
Attribute Value
android:layout_width "match_parent"
android:layout_height "match_parent"
android:layout_margin "16dp"
app:layout_constraintBottom_toBottomOf "parent"
app:layout_constraintTop_toTopOf "parent"
app:layout_constraintLeft_toLeftOf "parent"
app:layout_constraintRight_toRightOf "parent"
The ScrollView is here to allow the list of sensors to scroll if it is longer than the
screen.
6. Add a TextView element inside the ScrollView and give it these attributes:
Attribute Value
android:id "@+id/sensor_list"
android:layout_width "wrap_content"
android:layout_height "wrap_content"
android:text "(placeholder)"
This TextView holds the list of sensors. The placeholder text is replaced at runtime by
the actual sensor list. The layout for your app should look like this screenshot:
7. Open MainActivity and add a variable at the top of the class to hold an instance of
SensorManager :
The sensor manager is a system service that lets you access the device sensors.
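The declaration might look like this ( mSensorManager is the name used in the following steps):

```java
private SensorManager mSensorManager;
```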
8. In the onCreate() method, below the setContentView() method, get an instance of the
sensor manager from system services, and assign it to the mSensorManager variable:
mSensorManager =
(SensorManager) getSystemService(Context.SENSOR_SERVICE);
9. Get the list of all sensors from the sensor manager. Store the list in a List object
whose values are of type Sensor :
List<Sensor> sensorList =
mSensorManager.getSensorList(Sensor.TYPE_ALL);
The Sensor class represents an individual sensor and defines constants for the
available sensor types. The Sensor.TYPE_ALL constant indicates all the available
sensors.
10. Iterate over the list of sensors. For each sensor, get that sensor's official name with the
getName() method, and append that name to the sensorText string. Each line of the
sensor list is separated by the value of the line.separator property, typically a newline
character:
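A sketch of this loop, assuming a StringBuilder named sensorText :

```java
StringBuilder sensorText = new StringBuilder();
for (Sensor currentSensor : sensorList) {
    // Append each sensor's official name, one per line.
    sensorText.append(currentSensor.getName())
              .append(System.getProperty("line.separator"));
}
```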
11. Get a reference to the TextView for the sensor list, and update the text of that view with
the string containing the list of sensors:
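The TextView update might look like this; the sensor_list ID comes from the layout you defined earlier:

```java
TextView sensorTextView = (TextView) findViewById(R.id.sensor_list);
sensorTextView.setText(sensorText);
```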
Different Android devices have different sensors available, which means the SensorSurvey
app shows different results for each device. In addition, the Android emulator includes a
small set of simulated sensors.
1. Run the app on a physical device. The output of the app looks something like this
screenshot:
In this list, lines that begin with a letter/number code represent physical hardware in the
device. The letters and numbers indicate sensor manufacturers and model numbers. In
most devices the accelerometer, gyroscope, and magnetometer are physical sensors.
Lines without letter/number codes are virtual or composite sensors, that is, sensors that
are simulated in software. These sensors use the data from one or more physical
sensors. So, for example, the gravity sensor may use data from the accelerometer,
gyroscope, and magnetometer to provide the direction and magnitude of gravity in the
device's coordinate system.
2. Run the app in an emulator. The output of the app looks something like this screenshot:
Because the Android emulator is a simulated device, all the available sensors are virtual
sensors. "Goldfish" is the name of the emulator's Linux kernel.
3. Click the More button (three horizontal dots) on the emulator's control panel. The
Extended Controls window appears.
This window shows the settings and current values for the emulator's virtual sensors.
Drag the image of the device to simulate motion and acceleration with the
accelerometer. Dragging the device image may also rotate the main emulator window.
This tab shows the other available virtual sensors for the emulator, including the light,
temperature, and proximity sensors. You use more of these sensors in the next task.
The light sensor measures ambient light in lux, a standard unit of illumination. The light
sensor typically is used to automatically adjust screen brightness.
The proximity sensor measures when the device is close to another object. The
proximity sensor is often used to turn off touch events on a phone's screen when you
answer a phone call, so that touching your phone to your face does not accidentally
launch apps or otherwise interfere with the device's operation.
Attribute Value
android:id "@+id/label_light"
android:layout_width "wrap_content"
android:layout_height "wrap_content"
app:layout_constraintLeft_toLeftOf "parent"
app:layout_constraintTop_toBottomOf "parent"
The "%1$.2f" part of the text string is a placeholder code. At runtime, the Java code for your app replaces the placeholder with an actual numeric value. The placeholder code has three parts:
%1 : The first placeholder. You could include multiple placeholders in the same string.
.2 : Show the value with two decimal places.
f : Format the value as a floating-point number.
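The same format syntax is used by java.util.Formatter, so you can exercise the pattern outside Android with String.format ( Locale.ROOT is used here so the decimal separator is always a period):

```java
import java.util.Locale;

public class PlaceholderDemo {
    public static void main(String[] args) {
        // Same pattern as the light-sensor label string.
        String template = "Light Sensor: %1$.2f";
        // %1$ selects the first argument; .2f renders it as a
        // floating-point number with two decimal places.
        String result = String.format(Locale.ROOT, template, 42.0);
        System.out.println(result); // Light Sensor: 42.00
    }
}
```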
4. Copy and paste the TextView element. Change the attributes in the following table.
Extract the string into a resource called "label_proximity" . This text view will print
values from the proximity sensor.
Attribute Value
android:id "@+id/label_proximity"
app:layout_constraintTop_toBottomOf "@+id/label_light"
The layout for your app should look like this screenshot:
You'll use this message in the next task when you test if a sensor is available.
1. Open MainActivity and add private member variables at the top of the class to hold Sensor objects for the light and proximity sensors. Also add private member variables for the TextView views that display the sensor data.
2. In the onCreate() method, delete all the existing code after the line that gets the sensor manager.
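A sketch of these member variables (the m-prefixed names are the ones the rest of the practical uses):

```java
private SensorManager mSensorManager;
private Sensor mSensorProximity;
private Sensor mSensorLight;
private TextView mTextSensorLight;      // displays the light-sensor value
private TextView mTextSensorProximity;  // displays the proximity-sensor value
```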
3. Add code to onCreate() to get the two TextView views and assign them to their
respective variables:
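The initialization might look like this; the view IDs come from the layout you defined earlier:

```java
mTextSensorLight = (TextView) findViewById(R.id.label_light);
mTextSensorProximity = (TextView) findViewById(R.id.label_proximity);
```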
4. Get instances of the default light and proximity sensors. These will be instances of the
Sensor class. Assign them to their respective variables:
mSensorProximity =
mSensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
mSensorLight = mSensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
The getDefaultSensor() method is used to query the sensor manager for sensors of a
given type. The sensor types are defined by the Sensor class. If there is no sensor
available for the given type, the getDefaultSensor() method returns null .
5. Get the error string you defined earlier from the strings.xml resource:
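Assuming the error message was extracted into a string resource named error_no_sensor (the resource name is an assumption; use whatever name you defined), the lookup looks like this:

```java
String sensor_error = getResources().getString(R.string.error_no_sensor);
```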
6. Test that there is an available light sensor. If the sensor is not available (that is, if getDefaultSensor() returned null ), set the display text for the light sensor's TextView to the error message:
if (mSensorLight == null) {
mTextSensorLight.setText(sensor_error);
}
Different devices have different sensors, so it is important that your app check that a
sensor exists before using the sensor. If a sensor is not available, your app should turn
off features that use that sensor and provide helpful information to the user. If your app's
functionality relies on a sensor that is not available, your app should provide a message
and gracefully quit. Do not assume that any device will have any given sensor.
Repeat the test for the proximity sensor:
if (mSensorProximity == null) {
mTextSensorProximity.setText(sensor_error);
}
To get data from a sensor, your app implements a listener for sensor events, registers that listener with the sensor manager, and handles the new sensor data in an onSensorChanged() callback. All of these tasks are part of the SensorEventListener interface.
In this task, you register listeners for changes to the light and proximity sensors. You process new data from those sensors and display that data in the app layout.
1. At the top of the class, modify the class signature to implement the
SensorEventListener interface.
2. Click the red light bulb icon, select "implement methods," and select all methods.
The SensorEventListener interface includes two callback methods that enable your app
to handle sensor events:
onSensorChanged() : Called when new sensor data is available. You will use this method to handle incoming sensor data.
onAccuracyChanged() : Called when the sensor's accuracy changes, so that your app can react to that change. Most sensors, including the light and proximity sensors, do not report accuracy changes. In this app, you leave onAccuracyChanged() empty.
3. Override the onStart() activity lifecycle method to register your sensor listeners.
Listening to incoming sensor data uses device power and consumes battery life. Don't
register your listeners in onCreate() , as that would cause the sensors to be on and
sending data (using device power) even when your app was not in the foreground. Use
the onStart() and onStop() methods to register and unregister your sensor listeners.
@Override
protected void onStart() {
super.onStart();
if (mSensorProximity != null) {
mSensorManager.registerListener(this, mSensorProximity,
SensorManager.SENSOR_DELAY_NORMAL);
}
if (mSensorLight != null) {
mSensorManager.registerListener(this, mSensorLight,
SensorManager.SENSOR_DELAY_NORMAL);
}
}
Note: The onStart() and onStop() methods are preferred over onResume() and onPause() to register and unregister listeners. As of Android 7.0 (API 24), apps can run in multi-window mode, where a paused activity may still be visible on screen; using onStart() and onStop() keeps sensor data flowing while the activity is visible.
The registerListener() method takes three arguments:
An app or activity Context . You can use the current activity ( this ) as the context.
The Sensor object to listen to.
A delay constant from the SensorManager class. The delay constant indicates how
quickly new data is reported from the sensor. Sensors can report a lot of data very
quickly, but more reported data means that the device consumes more power.
Make sure that your listener is registered with the minimum amount of new data it
needs. In this example you use the slowest value
( SensorManager.SENSOR_DELAY_NORMAL ). For more data-intensive apps such as games, a faster rate such as SensorManager.SENSOR_DELAY_GAME may be appropriate.
4. Implement the onStop() lifecycle method to unregister your sensor listeners when the activity stops:
@Override
protected void onStop() {
super.onStop();
mSensorManager.unregisterListener(this);
}
The sensor event stores the new data from the sensor in the values array. Depending
on the sensor type, this array may contain a single piece of data or a multidimensional
array full of data. For example, the accelerometer reports data for the x -axis, y -axis,
and z -axis for every change in the values[0] , values[1] , and values[2] positions.
Both the light and proximity sensors only report one value, in values[0] .
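A sketch of the start of the onSensorChanged() callback; sensorType and currentValue are the variable names the next steps assume:

```java
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    // The sensor type constant identifies which sensor fired the event.
    int sensorType = sensorEvent.sensor.getType();
    // Light and proximity sensors report a single value in values[0].
    float currentValue = sensorEvent.values[0];
    // ...handle currentValue based on sensorType (see the next step)...
}
```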
7. Add a switch statement for the sensorType variable. Add a case for
Sensor.TYPE_LIGHT to indicate that the event was triggered by the light sensor.
switch (sensorType) {
// Event came from the light sensor.
case Sensor.TYPE_LIGHT:
// Handle light sensor
break;
default:
// do nothing
}
8. Inside the light sensor case , get the template string from the resources, and update
the value in the light sensor's TextView .
mTextSensorLight.setText(getResources().getString(
R.string.label_light, currentValue));
When you defined this TextView in the layout, the original string resource included a
placeholder code, like this:
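Assuming the light label was extracted into a string resource named label_light (the name used by the code above), that resource looks like this:

```xml
<string name="label_light">Light Sensor: %1$.2f</string>
```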
When you call getString() to get the string from the resources, you include values to
substitute into the string where the placeholder codes are. The part of the string that is
not made up of placeholders ( "Light Sensor: " ) is passed through to the new string.
Add a similar case for the proximity sensor, updating the proximity TextView with the new value:
case Sensor.TYPE_PROXIMITY:
mTextSensorProximity.setText(getResources().getString(
R.string.label_proximity, currentValue));
break;
2. Move the device towards a light source, or shine a flashlight on it. Move the device
away from the light or cover the device with your hand. Note how the light sensor
reports changes in the light level.
TIP: The light sensor is often placed on the top right of the device's screen.
The light sensor's value is generally measured in lux, a standard unit of illumination.
However, the lux value that a sensor reports may differ across different devices, and the
maximum may vary as well. If your app requires a specific range of values for the light
sensor, you must translate the raw sensor data into something your app can use.
3. Move your hand toward the device, and then move it away again. Note how the
proximity sensor reports values indicating "near" and "far." Depending on how the
proximity sensor is implemented, you may get a range of values, or you may get just
two values (for example, 0 and 5) to represent near and far.
TIP: The proximity sensor is often a virtual sensor that gets its data from the light
sensor. For that reason, covering the light sensor may produce changes to the proximity
value.
As with the light sensor, the sensor data for the proximity sensor can vary from device to
device. Proximity values may be a range between a minimum and a maximum. More
often there are only two proximity values, one to indicate "near," and one to indicate
"far." All these values may vary across devices.
4. Run the app in an emulator, and click the More button (three horizontal dots) on the
emulator's control panel to bring up the Extended controls window.
5. Click Virtual sensors, and then click the Additional sensors tab.
The sliders in this window enable you to simulate changes to sensor data that would
normally come from the hardware sensors. Changes in this window generate sensor
events in the emulator that your app can respond to.
6. Move the sliders for the light and proximity sensors and observe that the values in the
app change as well.
Solution code
Android Studio projects:
SensorSurvey
SensorListeners
Coding challenge
Note: All coding challenges are optional.
Challenge: Modify the SensorListeners app such that:
The background color of the app changes in response to the light level.
Place an ImageView or Drawable in the layout. Make the image larger or smaller based
on the value that the app receives from the proximity sensor.
Summary
The Android sensor framework provides access to data coming from a set of device
sensors. These sensors include accelerometers, gyroscopes, magnetometers,
barometers, humidity sensors, light sensors, proximity sensors, and so on.
The SensorManager service lets your app access and list sensors and listen for sensor
events ( SensorEvent ). The sensor manager is a system service you can request with
getSystemService() .
The Sensor class represents a specific sensor and contains methods to indicate the
properties and capabilities of a given sensor. It also provides constants for sensor types,
which define how the sensors behave and what data they provide.
Use getSensorList(Sensor.TYPE_ALL) to get a list of all the available sensors.
Use getDefaultSensor() with a sensor type to gain access to a particular sensor as a
Sensor object.
Sensors provide data through a series of sensor events. A SensorEvent object includes
information about the sensor that generated it, the time, and new data. The data a
sensor provides depends on the sensor type. Simple sensors such as light and
proximity sensors report only one data value, whereas motion sensors such as the
accelerometer provide multidimensional arrays of data for each event.
Your app uses sensor listeners to receive sensor data. Implement the
SensorEventListener interface to listen for sensor events.
Use the onSensorChanged() method to handle individual sensor events. From the SensorEvent object passed into that method, you can get the sensor that generated the event, the time of the event, and the new sensor data.
Related concept
The related concept documentation is in Sensor Basics.
Learn more
Android developer documentation:
Sensors Overview
Sensor
SensorEvent
SensorManager
SensorEventListener
Introduction
What you should already KNOW
What you will LEARN
What you will DO
App overview
Task 1. Build the TiltSpot app
Task 2. Add the spots
Task 3. Handle activity rotation
Solution code
Coding challenge
Summary
Related concept
Learn more
The Android platform provides several sensors that enable your app to monitor the motion or
position of a device, in addition to other sensors such as the light sensor.
Motion sensors such as the accelerometer or gyroscope are useful for monitoring device
movement such as tilt, shake, rotation, or swing. Position sensors are useful for determining
a device's physical position in relation to the Earth. For example, you can use a device's
geomagnetic field sensor to determine its position relative to the magnetic north pole.
A common use of motion and position sensors, especially for games, is to determine the
orientation of the device, that is, the device's bearing (north/south/east/west) and tilt. For
example, a driving game could allow the user to control acceleration with a forward tilt or
backward tilt, and control steering with a left tilt or right tilt.
App overview
The TiltSpot app displays the device orientation angles as numbers and as colored spots along the four edges of the device screen. There are three components to device orientation: azimuth, pitch, and roll.
When you tilt the device, the spots along the edges that are tilted up become darker.
The initial layout for the TiltSpot app includes several text views to display the device
orientation angles (azimuth, pitch, and roll)—you learn more about how these angles
work later in the practical. All those text views are nested inside their own constraint layout to center them both horizontally and vertically within the activity. You need the
nested constraint layout because later in the practical you add the spots around the
edges of the screen and around this inner text view.
3. Open MainActivity .
MainActivity in this starter app contains much of the skeleton code for managing sensors and sensor listeners. Examine the onCreate() method. This method gets an instance of the SensorManager service, and then uses the getDefaultSensor() method to retrieve specific sensors. In this app those sensors are the accelerometer and the magnetometer.
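A sketch of that initialization code in onCreate() (the member-variable names are assumptions):

```java
mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
mSensorAccelerometer =
        mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
mSensorMagnetometer =
        mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
```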
The accelerometer measures acceleration forces on the device; that is, it measures how
fast the device is accelerating, and in which direction. Acceleration force includes the
force of gravity. The accelerometer is sensitive, so even when you think you're holding
the device still or leaving it motionless on a table, the accelerometer is recording minute
forces, either from gravity or from the environment. This makes the data generated by
the accelerometer very "noisy."
The magnetometer , also known as the geomagnetic field sensor , measures the
strength of magnetic fields around the device, including Earth's magnetic field. You can
use the magnetometer to find the device's position with respect to the external world.
However, magnetic fields can also be generated by other devices in the vicinity, by
external factors such as your location on Earth (because the magnetic field is weaker
toward the equator), or even by solar winds.
Neither the accelerometer nor the magnetometer alone can determine device tilt or
orientation. However, the Android sensor framework can combine data from both
sensors to get a fairly accurate device orientation—accurate enough for the purposes of
this app, at least.
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
This line locks the activity in portrait mode, to prevent the app from automatically
rotating the activity as you tilt the device. Activity rotation and sensor data can interact in
unexpected ways. Later in the practical, you explicitly handle sensor data changes in
your app in response to activity rotation, and remove this line.
3. Examine the onStart() and onStop() methods. The onStart() method registers the
listeners for the accelerometer and magnetometer, and the onStop() method
unregisters them.
1. Open MainActivity .
2. Add member variables to hold copies of the accelerometer and magnetometer data.
When a sensor event occurs, both the accelerometer and the magnetometer produce
arrays of floating-point values representing points on the x -axis, y -axis, and z -axis of
the device's coordinate system. You will combine the data from both these sensors over several calls to onSensorChanged() , so you need to retain a copy of this data each time it changes.
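The member variables might look like this (the names match the code in the next step):

```java
// Most recent accelerometer reading (x, y, z).
private float[] mAccelerometerData = new float[3];
// Most recent magnetometer reading (x, y, z).
private float[] mMagnetometerData = new float[3];
```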
3. Scroll down to the onSensorChanged() method. Add a line to get the sensor type from
the sensor event object:
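This line reads the type constant from the event's Sensor object:

```java
int sensorType = sensorEvent.sensor.getType();
```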
4. Add tests for the accelerometer and magnetometer sensor types, and clone the event
data into the appropriate member variables:
switch (sensorType) {
case Sensor.TYPE_ACCELEROMETER:
mAccelerometerData = sensorEvent.values.clone();
break;
case Sensor.TYPE_MAGNETIC_FIELD:
mMagnetometerData = sensorEvent.values.clone();
break;
default:
return;
}
You use the clone() method to explicitly make a copy of the data in the values array.
The SensorEvent object (and the array of values it contains) is reused across calls to
onSensorChanged() . Cloning those values prevents the data you're currently interested
in from being changed by more recent data before you're done with it.
A rotation matrix is a linear algebra construct that translates the sensor data from one coordinate system to another; in this case, from the device's coordinate system to the Earth's coordinate system. That matrix is an array of nine float values, because each point (on all three axes) is expressed as a 3D vector.
In the device's coordinate system:
The x -axis is horizontal and points to the right edge of the device.
The y -axis is vertical and points to the top edge of the device.
The z -axis extends up from the surface of the screen. Negative z values fall behind the screen.
In the Earth's coordinate system:
The y -axis points to magnetic north along the surface of the Earth.
The x -axis is 90 degrees from y , pointing approximately east.
The z -axis extends up into space. Negative z extends down into the ground.
A reference to the array for the rotation matrix is passed into the getRotationMatrix()
method and modified in place. The second argument to getRotationMatrix() is an
inclination matrix, which you don't need for this app. You can use null for this argument.
6. Call the SensorManager.getOrientation() method to get the orientation angles from the
rotation matrix. As with getRotationMatrix() , the array of float values containing
those angles is supplied to the getOrientation() method and modified in place.
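Putting the two calls together, this part of onSensorChanged() might look like the following sketch (the array and variable names are assumptions):

```java
// Translate the device-coordinate sensor data into the Earth's
// coordinate system. The inclination matrix (second argument)
// is not needed here, so pass null.
float[] rotationMatrix = new float[9];
boolean rotationOK = SensorManager.getRotationMatrix(rotationMatrix,
        null, mAccelerometerData, mMagnetometerData);

// Extract the azimuth, pitch, and roll angles (in radians).
float[] orientationValues = new float[3];
if (rotationOK) {
    SensorManager.getOrientation(rotationMatrix, orientationValues);
}
```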
The angles returned by the getOrientation() method describe how far the device is oriented or tilted with respect to the Earth's coordinate system. There are three components to orientation:
Azimuth: The direction (north/south/east/west) the device is pointing. 0 is magnetic north.
Pitch: The top-to-bottom tilt of the device. 0 is flat.
Roll: The left-to-right tilt of the device. 0 is flat.
7. Create variables for the azimuth, pitch, and roll, to hold each component of the orientationValues array. You adjust this data later in the practical, which is why it is convenient to keep each component in its own variable:
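A sketch of these variables ( getOrientation() returns azimuth, pitch, and roll in that order):

```java
float azimuth = orientationValues[0];
float pitch = orientationValues[1];
float roll = orientationValues[2];
```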
8. Get the placeholder string from the resources, fill the placeholders with the orientation angles, and update all the text views:
mTextSensorAzimuth.setText(getResources().getString(
R.string.value_format, azimuth));
mTextSensorPitch.setText(getResources().getString(
R.string.value_format, pitch));
mTextSensorRoll.setText(getResources().getString(
R.string.value_format, roll));
<string name="value_format">%1$.2f</string>
1. Run the app. Place your device flat on the table. The output of the app looks something
like this:
Even a motionless device shows fluctuating values for the azimuth, pitch, and roll. Note
also that even though the device is flat, the values for pitch and roll may not be 0. This
is because the device sensors are extremely sensitive and pick up even tiny changes to
the environment, both changes in motion and changes in ambient magnetic fields.
2. Turn the device on the table from left to right, leaving it flat on the table.
Note how the value of the azimuth changes. An azimuth value of 0 indicates that the
device is pointing (roughly) north.
Note that even if the value of the azimuth is 0, the device may not be pointing exactly
north. The device magnetometer measures the strength of any magnetic fields, not just
that of the Earth. If you are in the presence of other magnetic fields (most electronics emit magnetic fields, including the device itself), the magnetometer's readings may be inaccurate.
Note: If the azimuth on your device seems very far off from actual north, you can
calibrate the magnetometer by waving the device a few times in the air in a figure-eight
motion.
3. Lift the bottom edge of the device so the screen is tilted away from you. Note the change to the pitch value. Pitch indicates the top-to-bottom angle of tilt around the device's horizontal axis.
4. Lift the left side of the device so that it is tilted to the right. Note the change to the roll
value. Roll indicates the left-to-right tilt along the device's vertical axis.
5. Pick up the device and tilt it in various directions. Note the changes to the pitch and roll
values as the device's tilt changes. What is the maximum value you can find for any tilt
direction, and in what device position does that maximum occur?
The color changes in the spots rely on dynamically changing the alpha value of a shape
drawable in response to new sensor data. The alpha determines the opacity of that
drawable, so that smaller alpha values produce a lighter shape, and larger values produce a
darker shape.
<shape
xmlns:android="http://schemas.android.com/apk/res/android"
android:shape="oval">
<solid android:color="@android:color/black"/>
</shape>
<dimen name="spot_size">84dp</dimen>
Attribute Value
android:id "@+id/spot_top"
android:layout_width "@dimen/spot_size"
android:layout_height "@dimen/spot_size"
android:layout_margin "@dimen/base_margin"
app:layout_constraintLeft_toLeftOf "parent"
app:layout_constraintRight_toRightOf "parent"
app:layout_constraintTop_toTopOf "parent"
app:srcCompat "@drawable/spot"
tools:ignore "ContentDescription"
This view places a spot drawable the size of the spot_size dimension at the top edge
of the screen. Use the app:srcCompat attribute for a vector drawable in an ImageView
(versus android:src for an actual image.) The app:srcCompat attribute is available in
the Android Support Library and provides the greatest compatibility for vector
drawables.
5. Add the following code below that first ImageView . This code adds the other three spots
along the remaining edges of the screen.
<ImageView
android:id="@+id/spot_bottom"
android:layout_width="@dimen/spot_size"
android:layout_height="@dimen/spot_size"
android:layout_marginBottom="@dimen/base_margin"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintRight_toRightOf="parent"
app:srcCompat="@drawable/spot"
tools:ignore="ContentDescription" />
<ImageView
android:id="@+id/spot_right"
android:layout_width="@dimen/spot_size"
android:layout_height="@dimen/spot_size"
android:layout_marginEnd="@dimen/base_margin"
android:layout_marginRight="@dimen/base_margin"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintRight_toRightOf="parent"
app:layout_constraintTop_toTopOf="parent"
app:srcCompat="@drawable/spot"
tools:ignore="ContentDescription"/>
<ImageView
android:id="@+id/spot_left"
android:layout_width="@dimen/spot_size"
android:layout_height="@dimen/spot_size"
android:layout_marginLeft="@dimen/base_margin"
android:layout_marginStart="@dimen/base_margin"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintTop_toTopOf="parent"
app:srcCompat="@drawable/spot"
tools:ignore="ContentDescription" />
6. Add the android:alpha attribute to all four ImageView elements, and set the value to "0.05" . The alpha is the opacity of the shape: smaller values are less opaque (less visible). Setting the value to 0.05 makes the spots very nearly invisible, but you can still see them in the layout editor.
1. In MainActivity , add member variables at the top of the class for each of the spot
ImageView objects:
2. In onCreate() , just after initializing the text views for the sensor data, initialize the spot
views:
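A sketch of the spot views (the IDs come from the layout you built in the previous task):

```java
// Member variables at the top of the class:
private ImageView mSpotTop;
private ImageView mSpotBottom;
private ImageView mSpotLeft;
private ImageView mSpotRight;

// In onCreate(), after the text views are initialized:
mSpotTop = (ImageView) findViewById(R.id.spot_top);
mSpotBottom = (ImageView) findViewById(R.id.spot_bottom);
mSpotLeft = (ImageView) findViewById(R.id.spot_left);
mSpotRight = (ImageView) findViewById(R.id.spot_right);
```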
3. In onSensorChanged() , right after the lines that initialize the azimuth, pitch, and roll
variables, reset the pitch or roll values that are close to 0 (less than the value of the
VALUE_DRIFT constant) to be 0:
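The clamping logic can be exercised outside Android as plain Java. The 0.05 threshold here is an assumption; use whatever VALUE_DRIFT your starter code defines:

```java
public class DriftDemo {
    // Readings closer to zero than this threshold are treated
    // as sensor drift and reset to 0. (Assumed value.)
    static final float VALUE_DRIFT = 0.05f;

    static float clampDrift(float value) {
        return (Math.abs(value) < VALUE_DRIFT) ? 0f : value;
    }

    public static void main(String[] args) {
        // A tiny wobble is reset to 0; a real tilt passes through.
        System.out.println(clampDrift(0.01f)); // 0.0
        System.out.println(clampDrift(-0.3f)); // -0.3
    }
}
```

In onSensorChanged() you would apply the same test to the pitch and roll variables before using them.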
When you initially ran the TiltSpot app, the sensors reported very small non-zero values
for the pitch and roll even when the device was flat and stationary. Those small values
can cause the app to flash very light-colored spots on all the edges of the screen. In this
code if the values are close to 0 (in either the positive or negative direction), you reset
them to 0.
4. Scroll down to the end of onSensorChanged() , and add these lines to set the alpha of all
the spots to 0. This resets all the spots to be invisible each time onSensorChanged() is
called. This is necessary because sometimes if you tilt the device too quickly, the old
values for the spots stick around and retain their darker color. Resetting them each time
prevents these artifacts.
mSpotTop.setAlpha(0f);
mSpotBottom.setAlpha(0f);
mSpotLeft.setAlpha(0f);
mSpotRight.setAlpha(0f);
5. Update the alpha value for the appropriate spot with the values for pitch and roll.
if (pitch > 0) {
mSpotBottom.setAlpha(pitch);
} else {
mSpotTop.setAlpha(Math.abs(pitch));
}
if (roll > 0) {
mSpotLeft.setAlpha(roll);
} else {
mSpotRight.setAlpha(Math.abs(roll));
}
Note that the pitch and roll values you calculated in the previous task are in radians, and
their values range from -π to +π. Alpha values, on the other hand, range only from 0.0
to 1.0. You could do the math to convert radian units to alpha values, but you may have
noted earlier that the higher pitch and roll values only occur when the device is tilted vertically or even held upside down. For the TiltSpot app you're only interested in displaying
dots in response to some device tilt, not the full range. This means that you can
conveniently use the radian units directly as input to the alpha.
You should now be able to tilt the device and have the edge facing "up" display a dot that becomes darker the farther you tilt the device.
You may assume that with TiltSpot, if you rotate the device from landscape to portrait, the
sensors will report the correct data for the new device orientation, and the spots will continue
to appear on the correct edges. That's not the case. When the activity rotates, the activity
drawing coordinate system rotates with it, but the sensor coordinate system remains the
same. The sensor coordinate system never changes position, regardless of the orientation
of the device.
The second tricky point for handling activity rotation is that the default or natural orientation
for your device may not be portrait. The default orientation for many tablet devices is
landscape. The sensor's coordinate system is always based on the natural orientation of a
device.
The TiltSpot starter app included a line in onCreate() to lock the orientation to portrait
mode:
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
Locking the screen to portrait mode in this way solves one problem—it prevents the
coordinate systems from getting out of sync on portrait-default devices. But on
landscape-default devices, the technique forces an activity rotation, which causes the
device and sensor coordinate systems to get out of sync.
Here's the right way to handle device and activity rotation in sensor-based drawing: First,
use the Display.getRotation() method to query the current device orientation. Then use the
SensorManager.remapCoordinateSystem() method to remap the rotation matrix from the sensor
data onto the correct axes. This is the technique you use in the TiltSpot app in this task.
The getRotation() method returns one of four integer constants, defined by the Surface
class:
ROTATION_0 : The default or "natural" orientation of the device (portrait for most phones).
ROTATION_90 : The "sideways" orientation of the device (landscape for phones). Different
devices may report this value for different physical directions of rotation.
ROTATION_180 : The device is upside down relative to its natural orientation.
ROTATION_270 : The other "sideways" orientation of the device.
Note that many devices do not have ROTATION_180 at all, or return ROTATION_90 or
ROTATION_270 regardless of which direction the device was rotated (clockwise or
counterclockwise). It is best to handle all possible rotations rather than to make assumptions
for any particular device.
//setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
3. At the end of onCreate() , get a reference to the window manager, and then get the
default display. You use the display to get the rotation in onSensorChanged() .
5. Get the current device rotation from the display and add a switch statement for that
value. Use the rotation constants from the Surface class for each case in the switch.
For ROTATION_0 , the default orientation, you don't need to remap the coordinates. You
can just clone the data in the existing rotation matrix:
switch (mDisplay.getRotation()) {
case Surface.ROTATION_0:
rotationMatrixAdjusted = rotationMatrix.clone();
break;
}
6. Add additional cases for the other rotations, and call the
SensorManager.remapCoordinateSystem() method for each of these cases.
This method takes as arguments the original rotation matrix, the two new axes on which
you want to remap the existing x-axis and y-axis, and an array to populate with the
new data. Use the axis constants from the SensorManager class to represent the
coordinate system axes.
case Surface.ROTATION_90:
SensorManager.remapCoordinateSystem(rotationMatrix,
SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
rotationMatrixAdjusted);
break;
case Surface.ROTATION_180:
SensorManager.remapCoordinateSystem(rotationMatrix,
SensorManager.AXIS_MINUS_X, SensorManager.AXIS_MINUS_Y,
rotationMatrixAdjusted);
break;
case Surface.ROTATION_270:
SensorManager.remapCoordinateSystem(rotationMatrix,
SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X,
rotationMatrixAdjusted);
break;
7. Modify the call to getOrientation() to use the new adjusted rotation matrix instead of
the original matrix.
SensorManager.getOrientation(rotationMatrixAdjusted,
orientationValues);
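The rotation-to-axes mapping from step 6 boils down to a small lookup, sketched below in plain Java. The constant values mirror those documented for android.view.Surface and android.hardware.SensorManager so the sketch is self-contained; on Android you would use the framework constants directly, and the class and method names here are illustrative:

```java
// Pure-Java sketch of the rotation -> remap-axes lookup from step 6.
// Constant values mirror android.view.Surface and
// android.hardware.SensorManager.
class RemapAxes {
    static final int ROTATION_0 = 0, ROTATION_90 = 1,
                     ROTATION_180 = 2, ROTATION_270 = 3;
    static final int AXIS_X = 1, AXIS_Y = 2;
    static final int AXIS_MINUS_X = AXIS_X | 0x80; // 129
    static final int AXIS_MINUS_Y = AXIS_Y | 0x80; // 130

    // Returns {newX, newY}: the axes to pass as the second and third
    // arguments of remapCoordinateSystem() for the given rotation.
    static int[] axesFor(int rotation) {
        switch (rotation) {
            case ROTATION_90:  return new int[] { AXIS_Y, AXIS_MINUS_X };
            case ROTATION_180: return new int[] { AXIS_MINUS_X, AXIS_MINUS_Y };
            case ROTATION_270: return new int[] { AXIS_MINUS_Y, AXIS_X };
            default:           return new int[] { AXIS_X, AXIS_Y }; // no remap
        }
    }
}
```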
8. Build and run the app again. The colors of the spots should now change on the correct
edges of the device, regardless of how the device is rotated.
Solution code
Coding challenge
Note: All coding challenges are optional.
Challenge: A general rule is to avoid doing a lot of work in the onSensorChanged() method,
because the method runs on the main thread and may be called many times per second. In
particular, the changes to the colors of the spots can look jerky if you try to do too much
work in onSensorChanged() . Rewrite onSensorChanged() to use an AsyncTask object for all
the calculations and updates to views.
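The underlying pattern is the same whatever mechanism you use: compute on a background thread, then apply the results to views on the main thread. Here is a hedged plain-Java sketch of that pattern using an ExecutorService instead of AsyncTask; the SpotWorker class and its method names are illustrative, and the UI-thread hand-off (for example runOnUiThread()) is Android-specific and shown only in comments:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch: move the per-event computation off the calling thread. On
// Android you would submit work from onSensorChanged() and post the
// results back to the UI thread (for example with runOnUiThread())
// before calling setAlpha() on any views.
class SpotWorker {
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    // Computes {top, bottom, left, right} alphas on a background thread.
    Future<float[]> submit(final float pitch, final float roll) {
        return executor.submit(new Callable<float[]>() {
            public float[] call() {
                float[] alphas = new float[4];
                if (pitch > 0) alphas[1] = pitch; else alphas[0] = Math.abs(pitch);
                if (roll > 0)  alphas[2] = roll;  else alphas[3] = Math.abs(roll);
                return alphas;
            }
        });
    }

    // Convenience for demonstration: block until the result is ready.
    float[] computeAndWait(float pitch, float roll) {
        try {
            return submit(pitch, roll).get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    void shutdown() { executor.shutdown(); }
}
```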
Summary
Motion sensors such as the accelerometer measure device movement such as tilt,
shake, rotation, or swing.
Position sensors such as the geomagnetic field sensor (magnetometer) can determine
the device's position relative to the Earth.
The accelerometer measures device acceleration, that is, how much the device is
accelerating and in which direction. Acceleration forces on the device include the force
of gravity.
The magnetometer measures the strength of magnetic fields around the device. This
includes Earth's magnetic field, although other fields nearby may affect sensor readings.
You can use combined data from motion and position sensors to determine the device's
orientation (its position in space) more accurately than with individual sensors.
The 3-axis device-coordinate system that most sensors use is relative to the device
itself in its default orientation. The y-axis is vertical and points toward the top edge of
the device, the x-axis is horizontal and points to the right edge of the device, and the
z-axis extends up from the surface of the screen.
The Earth's coordinate system is relative to the surface of the Earth, with the y-axis
pointing to magnetic north, the x-axis 90 degrees from y and pointing east, and the
z-axis extending up into space.
Orientation angles describe how far the device is oriented or tilted with respect to the
Earth's coordinate system. There are three components to orientation:
Azimuth : The direction (north/south/east/west) the device is pointing. 0 is magnetic
north.
Pitch : The top-to-bottom tilt of the device. 0 is flat.
Roll : The left-to-right tilt of the device. 0 is flat.
To determine the orientation of the device:
Use the SensorManager.getRotationMatrix() method. The method combines data from
the accelerometer and magnetometer and translates the data into the Earth's
coordinate system.
Use the SensorManager.getOrientation() method with a rotation matrix to get the
orientation angles of the device.
The alpha value determines the opacity of a drawable or view. Lower alpha values
indicate more transparency. Use the setAlpha() method to programmatically change
the alpha value for a view.
When Android automatically rotates the activity in response to device orientation, the
activity coordinate system also rotates. However, the device-coordinate system that the
sensors use remains fixed.
The device-coordinate system that sensors use is based on the natural orientation of
the device, which is not necessarily portrait; for many tablets the natural orientation
is landscape.
Query the current device orientation with the Display.getRotation() method.
Use the current device orientation to remap the coordinate system in the right
orientation with the SensorManager.remapCoordinateSystem() method.
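The getRotationMatrix() and getOrientation() steps summarized above reduce to a few trigonometric operations on the rotation matrix. The sketch below shows the angle extraction in plain Java for a 3x3 (9-element, row-major) rotation matrix; the class and method names are illustrative, and on Android you would simply call SensorManager.getOrientation():

```java
// Sketch: extract orientation angles from a 3x3 (9-element,
// row-major) rotation matrix, as SensorManager.getOrientation()
// does. Returns {azimuth, pitch, roll} in radians.
class OrientationMath {
    static float[] fromMatrix(float[] r) {
        float azimuth = (float) Math.atan2(r[1], r[4]);
        float pitch   = (float) Math.asin(-r[7]);
        float roll    = (float) Math.atan2(-r[6], r[8]);
        return new float[] { azimuth, pitch, roll };
    }
}
```

For the identity matrix (device flat and pointing to magnetic north), all three angles are zero.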
Related concept
The related concept documentation is in Motion and position sensors.
Learn more
Android developer documentation:
Sensors Overview
Motion Sensors
Position Sensors
Sensor
SensorEvent
SensorManager
SensorEventListener
Surface
Display
Other:
Accelerometer Basics
Sensor fusion and motion prediction (written for VR, but many of the basic concepts apply)