Forgetful geeks need never lose keys, phones or even cutlery at home again.
Two computer science researchers have developed a
depth-camera based system that keeps track of household objects as they
are moved around a building.
The project - dubbed Kinsight - relies on several of
Microsoft's Kinect sensors attached to a computer running the team's
software.
Although the project is still at an experimental stage, it has been shown to work in a "real-world scenario".
Details of the system were recently outlined at a conference in China and were subsequently reported by New Scientist.
"Imagine if we had a system that could keep account of all
the objects that we interact with in our daily lives," the researchers
said.
"By keeping track of the locations of the objects, we could
build a smart search engine for our home that could answer queries like -
where are my eye glasses, or my TV-remote, or my wallet?"
Although alternative solutions, such as radio-frequency identification (RFID) chips, already exist, the researchers said their system was many times cheaper because of the high cost of RFID readers.
What goes where
The researchers noted that running a computer program that
simultaneously tracked all the owner's objects in real-time would be too
processor-intensive.
So they based their design around the principle that objects only change locations when humans move them.
As a result the system focuses on tracking human figures and
then looking for objects that have changed position in their vicinity.
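The sketch below illustrates that person-triggered approach in Python. It is not the authors' code: the region format, thresholds and function names are illustrative assumptions, and it only shows the core idea of comparing depth snapshots of areas a tracked person has just passed through.

```python
# Minimal sketch (not the Kinsight implementation) of person-triggered
# change detection: compare depth crops of regions a person visited,
# rather than tracking every object continuously.
import numpy as np

CHANGE_THRESHOLD_M = 0.05   # ignore depth differences below 5 cm (illustrative)
MIN_CHANGED_PIXELS = 200    # ignore tiny blobs of change (illustrative)

def region_changed(before: np.ndarray, after: np.ndarray) -> bool:
    """Compare two depth crops (in metres) of the same region of interest."""
    diff = np.abs(after.astype(np.float32) - before.astype(np.float32))
    return int((diff > CHANGE_THRESHOLD_M).sum()) >= MIN_CHANGED_PIXELS

def objects_possibly_moved(person_path, snapshot_before, snapshot_after):
    """Return regions near a person's path whose depth image changed.

    person_path     -- list of (x, y, w, h) regions the tracked skeleton visited
    snapshot_before -- depth frame captured before the person arrived
    snapshot_after  -- depth frame captured after the person left
    """
    moved = []
    for (x, y, w, h) in person_path:
        before = snapshot_before[y:y + h, x:x + w]
        after = snapshot_after[y:y + h, x:x + w]
        if region_changed(before, after):
            moved.append((x, y, w, h))
    return moved
```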
Although the Kinect sensor's capabilities are limited - it only sees objects up to 11 feet (3.4m) away and only provides "skeleton data" at 15 frames per second - the Kinsight program has common-sense notions built in to improve accuracy: it knows that a coffee cup is most likely to be found on a study desk or at a kitchen sink, but not inside a bath.
"This means that, when in doubt, an object recognition algorithm can
use this knowledge to identify an object by analysing the likelihood of
it being at some location, or looking for the candidate objects in their
other locations," the researchers said.
On the move
Algorithms were also created to help the computer learn the
appearance of objects and the context they were likely to be used in by
analysing the data gathered.
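A simple way such a context model could be built up from observations - an assumption here, not the paper's stated method - is to count how often each object is seen at each location and normalise those counts into per-object priors.

```python
# Illustrative sketch of learning a location context from observations.
from collections import Counter, defaultdict

class LocationContext:
    def __init__(self):
        self._counts = defaultdict(Counter)  # object label -> Counter of locations

    def observe(self, obj: str, location: str) -> None:
        """Record that `obj` was seen at `location`."""
        self._counts[obj][location] += 1

    def prior(self, obj: str, location: str) -> float:
        """Estimated probability of finding `obj` at `location`."""
        total = sum(self._counts[obj].values())
        return self._counts[obj][location] / total if total else 0.0

ctx = LocationContext()
for loc in ["study_desk", "study_desk", "kitchen_sink"]:
    ctx.observe("coffee_cup", loc)
print(ctx.prior("coffee_cup", "study_desk"))  # ~0.67
```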
To prove the system worked the two scientists labelled 48
objects - including knives, forks, keys and a Rubik's cube - and
identified 80 possible locations around a house.
They then asked volunteers to move the items around according to randomly generated patterns.
The results suggested room for improvement - errors were more
likely if the objects were very small, far away, transparent or placed
too closely together - but the team said these problems should be
addressed by using more sensors per room and adopting more sensitive
depth-cameras.
In the meantime, they say that even when the program does lose track of possessions, it can still say where they were last seen, which may still prove helpful. (BBC)