ANN/THE STRAITS TIMES – Dr Chieko Asakawa was travelling and negotiating her surroundings, luggage in hand, when a thought struck her: What if her luggage could scan the area around her for her, and be her guide?
That was back in 2017, and even then, the visually impaired scientist was no stranger to turning big ideas into reality.
An IBM Fellow and leader of accessibility research at the technology company, Dr Asakawa is the creator of a self-voicing web browser for visually impaired users, which earned her fame and induction into the US National Inventors Hall of Fame in 2019.
About a year later, in 2018, Dr Asakawa and a team of around 10 other scientists created the first iteration of her ideal suitcase.
The team has since made several improvements to it, partly in response to feedback from around 900 people in Japan and in California, the latter gathered at an accessibility conference in 2023.
Upgrades include a smaller battery that powers the suitcase for longer, a threefold increase in the number of RGB (red, green, blue) cameras to give 360-degree coverage of its surroundings, and a more powerful and energy-efficient motor.
“The idea is for it to take visually impaired people wherever they want to go, to unfamiliar places they have never been,” said Mr Masashi Oikawa, program manager of accessibility research at IBM Research Tokyo. He was addressing reporters during a media visit to IBM’s Think Lab in Tokyo in June.
Weighing 15kg and standing at about a metre tall, the suitcase sports a speaker which allows an artificial intelligence (AI) voice assistant to give information to its user, and a microphone that allows the user to give instructions on where he or she wants to go.
It also features a light detection and ranging (lidar) system that maps stationary objects in the surroundings so the suitcase can work out its position relative to them, and a colour and depth camera that can detect humans in its vicinity.
The suitcase also contains a central processing unit and a graphics processing unit, a motion sensor and a tactile device.
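To give a rough sense of how such a sensor suite might come together in software, the sketch below combines a lidar-based position estimate with person detection from the depth camera and feeds the result to the voice assistant. The class names, stubbed sensor readings and fixed-position localiser are illustrative assumptions, not IBM's actual code.

```python
from dataclasses import dataclass

# Hypothetical sensor readings; real hardware drivers are assumed and stubbed out here.
@dataclass
class LidarScan:
    ranges_m: list[float]  # distances to static surroundings, used for localisation

@dataclass
class RgbdFrame:
    people: list[tuple[float, float]]  # (x, y) positions of nearby pedestrians, in metres

def localise(scan: LidarScan, floor_map: dict) -> tuple[float, float]:
    """Estimate the suitcase's position by matching the lidar scan against a stored map.
    A real system would use scan matching or particle filtering; this stub returns a fixed point."""
    return (0.0, 0.0)

def announce(message: str) -> None:
    """Stand-in for the speaker / voice-assistant output."""
    print(f"[voice] {message}")

def sensing_step(scan: LidarScan, frame: RgbdFrame, floor_map: dict) -> None:
    """One tick of the perception loop: localise, check for people, inform the user."""
    x, y = localise(scan, floor_map)
    if frame.people:
        announce(f"{len(frame.people)} people nearby, slowing down.")
    else:
        announce(f"Path clear at position ({x:.1f}, {y:.1f}).")

# Example tick with dummy data.
sensing_step(LidarScan(ranges_m=[2.5, 3.1, 1.8]),
             RgbdFrame(people=[(1.2, 0.4)]),
             floor_map={})
```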
To take advantage of the suitcase’s capabilities, users must first pair it with an app on their iPhone. By selecting a location on the app via a menu or voice interaction, they can communicate to the suitcase where they want to go.
The app is proprietary for now and does not yet have a specific name, said Mr Oikawa. However, he added that it should be available for download on the Apple App Store when the suitcase is made available commercially.
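The pairing protocol and message format have not been published, but conceptually the app only needs to hand the suitcase a destination the user has chosen from the menu or by voice. The sketch below shows one way such a request could be packaged; the field names and values are assumptions for illustration only.

```python
import json

# Illustrative only: the actual app-to-suitcase protocol is not public.
def build_destination_request(destination_id: str, via_voice: bool) -> bytes:
    """Package the user's chosen destination (picked from the app menu or spoken aloud)
    into a message the suitcase could act on."""
    payload = {
        "type": "navigate",
        "destination": destination_id,  # e.g. an entry from the venue's digital map
        "input_method": "voice" if via_voice else "menu",
    }
    return json.dumps(payload).encode("utf-8")

print(build_destination_request("museum_cafe", via_voice=True))
```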
AI software embedded in the suitcase also allows the navigation tool to make the necessary calculations and take its user on the safest route – such as guiding them towards a lift instead of an escalator – in a crowded mall.
The suitcase can also detect and avoid static and dynamic obstacles in its path, and match the user's walking pace to that of the people around them.
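Route choices such as preferring a lift over an escalator can be modelled as a shortest-path search over a floor-plan graph in which less safe transitions carry heavy penalties. The toy example below runs Dijkstra's algorithm on a made-up map; the node names and weights are illustrative assumptions, not the suitcase's real planner.

```python
import heapq

# A toy floor-plan graph; edge weights penalise transitions that are less safe for a
# visually impaired user (the escalator edge is deliberately expensive).
GRAPH = {
    "entrance":  {"atrium": 1.0},
    "atrium":    {"lift": 1.0, "escalator": 1.0},
    "lift":      {"level_2": 2.0},    # safe vertical transfer
    "escalator": {"level_2": 50.0},   # heavy penalty: avoid unless there is no alternative
    "level_2":   {"cafe": 1.0},
    "cafe":      {},
}

def safest_route(graph, start, goal):
    """Dijkstra's shortest path over the penalised graph: the cheapest route is the 'safest'."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph[node].items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return float("inf"), []

print(safest_route(GRAPH, "entrance", "cafe"))
# Routes through the lift, since the escalator edge is heavily penalised.
```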
“The very first version we created, we called it the vacuum cleaner, since it was so noisy because of the multiple fans installed to cool the heat generated inside,” Mr Oikawa told ST.
The latest iteration is much quieter, emitting only a low hum when in use, so any information it conveys verbally can be heard clearly.
The suitcase has so far been positioned as an AI navigation tool for the visually impaired in indoor environments such as museums, shopping malls, stadiums, airports and hospitals.
Since April 2024, visitors to the National Museum of Emerging Science and Innovation in Tokyo have been able to rent the suitcase for free.
Each suitcase currently costs more than US$30,000 to develop. This cost includes the resources needed to create the digital twin of a building for navigation.
“We had previously created a digital map of one big shopping mall in Tokyo. The shopping area consists of five floors, and is approximately 22,000 square metres. This took about one day,” said Mr Oikawa.
The challenge now is for the team to take the suitcase outdoors, said Mr Oikawa. Outdoor settings contain more unpredictable variables – road conditions, weather, cars, bicycles and traffic lights – that the suitcase has not been trained to recognise and respond to.
Dr Asakawa has big dreams of applying the navigation technology to other assistive tools such as wheelchairs and shopping carts.
“That’s why we are making our software open source, and we will encourage other companies and developers to utilise it to scale,” she added.