What really sets the iPhone apart from laptops and PCs is its use of onboard sensors, including those that are location-enabled. This concise book takes experienced iPhone and Mac developers on a detailed tour of iPhone and iPad hardware by explaining how these sensors work, and what they're capable of doing.
With this book, you'll build sample applications for each sensor, and learn hands-on how to take advantage of the data each sensor produces. You'll gain valuable experience that you can immediately put to work inside your own iOS applications for the iPhone, iPod touch, and iPad. This book helps you focus on:
- Camera: learn how to take pictures and video, create video thumbnails, customize video, and save media to the photo album
- Audio: use the media picker controller to access the iPod music library in your own application, and enable your app to record and play sampled audio
- Accelerometer: write an application that uses this sensor to determine device orientation
- Magnetometer: learn how this sensor measures magnetic heading to provide a compass bearing
- Core Motion: use this framework to receive motion data from both the accelerometer and the vibrating-structure (MEMS) gyroscope
This short book is part of a collection that, along with new material, will be compiled into the larger book iOS Sensor Programming. The other books in this collection are Augmented Reality in iOS, Geolocation in iOS, and iOS Sensor Apps with Arduino.
Alasdair Allan is a senior research fellow in Astronomy at the University of Exeter, where he is building an autonomous, distributed peer-to-peer network of telescopes that reactively schedule observations of time-critical events. He also runs a small technology consulting business writing bespoke software and building open hardware, and is currently developing a series of iPhone applications to monitor and manage cloud-based services and distributed sensor networks.