In 2018, two Computer Engineering students, Desmond Wong Chi-ping and Felix Wong Kwong-yat, were sifting through ideas to submit to the China Collegiate Computing Contest when Felix read a news article about the severe shortage of guide dogs in Hong Kong.

“It got us thinking about alternative forms of navigational aid for the visually impaired,” said Desmond. “Most of the existing aids, such as white canes, are technologically obsolete and unable to make use of the latest advancements in technology.”

They decided to develop a machine learning-powered technological aid, and called it Lumino. The idea earned a second-class award in the contest, which was the validation they needed to continue developing the concept. At this point they were joined by a third team member, Alvin Yu Shing-chit, an Arts and Law student at HKU, who took charge of the administrative and business development side of the venture.

The Lumino app has two functions. The navigation function gives the user directions via haptic feedback: the intensity of the vibrations increases when the algorithm detects that the user is straying off the correct path.

Surround cognition

The second function, surround cognition, is the app’s selling point. “Lumino uses machine learning to identify the user’s surrounding environment, and provides context-specific, latency-free obstacle and object recognition,” said Felix. “When obstacles and objects are detected, we also offer user-friendly feedback that visually-impaired users find more intuitive – that is, auditory and haptic feedback.”

Desmond elaborated on the details: “We have adopted a hybrid model: data unique to each user, such as their daily routines and idiosyncrasies, is stored locally on their device, while data beneficial to all users, such as data on typical structures and objects (walls, cars, keys), is stored in the cloud.
Our self-developed algorithm processes the user’s surrounding environment, obtained from the live feed of their smartphone camera, cross-references objects of interest against similar data stored in local and cloud storage, and produces suggestions for the user’s reference, or completes tasks for them in the background.

“Visually-impaired people no longer have to scour their house for misplaced keys, nor worry about impending hazards just out of reach of their white canes: Lumino is able to anticipate their needs and provide context-contingent assistance, backed up by an extensive local and cloud database.”

The students were guided throughout by Dr Vincent Tam, Principal Lecturer in the Department of Electrical and Electronic Engineering, and developed the project while taking his Embedded Systems course. Dr Tam said: “Most of the existing street navigation applications, like Apple Maps or Google Maps, do not provide detailed guidance information for visually-impaired people – for example, ‘in 300 metres, turn left’ – which is especially important given Hong Kong’s complex street layout and city design. The Lumino sensors are integrated into a very stylish prototype to guide the user via vibrations.”

“Our vision for the hardware design was inspired by high-tech soundwear,” said Desmond, “a wearable piece that sits round the shoulders like a small collar, with all the necessary sensors, actuators and machine learning-enabled computing module inside.”

The project has received start-up funding.
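The navigation behaviour described earlier – vibrations that grow stronger as the user strays from the route – can be sketched in a few lines. This is a hypothetical illustration, not Lumino's actual algorithm; the tolerance band, saturation distance and linear scaling are invented for the example.

```python
def vibration_intensity(deviation_m: float,
                        tolerance_m: float = 1.0,
                        max_deviation_m: float = 5.0) -> float:
    """Map the user's lateral deviation from the planned path (metres)
    to a haptic intensity in [0.0, 1.0].

    Within `tolerance_m` of the path, no vibration is produced; beyond
    `max_deviation_m`, the intensity saturates at full strength.
    (Parameter names and values are illustrative, not Lumino's.)
    """
    if deviation_m <= tolerance_m:
        return 0.0
    # Linear ramp between the tolerance band and the saturation point.
    ramp = (deviation_m - tolerance_m) / (max_deviation_m - tolerance_m)
    return min(ramp, 1.0)

# A user drifting further off-route feels progressively stronger vibration.
print(vibration_intensity(0.5))  # on the path -> 0.0
print(vibration_intensity(3.0))  # drifting    -> 0.5
print(vibration_intensity(8.0))  # far off     -> 1.0
```

On a phone, the returned intensity would be fed to the platform's haptics API each time the positioning algorithm re-estimates the deviation.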
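The hybrid storage model Desmond describes – per-user data kept on the device, generally useful object data in the cloud – amounts to a two-tier lookup. In the sketch below, plain dictionaries stand in for the on-device store and the cloud database; all names, fields and example records are invented for illustration.

```python
# Per-user data (routines, personal labels) stays on the device;
# data useful to all users (walls, cars, keys) lives in the cloud tier.
local_store = {
    "my_keys": {"label": "your house keys", "last_seen": "hallway shelf"},
}
cloud_store = {
    "car": {"label": "a parked car", "hazard": True},
    "wall": {"label": "a wall", "hazard": True},
    "keys": {"label": "a set of keys", "hazard": False},
}

def identify(object_id: str):
    """Resolve a detected object: check user-specific data first, then
    fall back to the shared cloud tier. Returns None if unknown to both."""
    if object_id in local_store:       # personal tier takes priority
        return local_store[object_id]
    return cloud_store.get(object_id)  # shared tier, else None

print(identify("my_keys"))  # personal record from the device
print(identify("car"))      # shared record from the cloud
print(identify("drone"))    # None - unknown object
```

Checking the personal tier first is what lets the app answer user-specific questions ("where are my keys?") without a network round trip, while still recognising common objects from the shared database.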
“We are in the 2020 cohort of the Science and Technology Entrepreneur Programme (STEP) initiated by the Hong Kong Science and Technology Parks Corporation (HKSTP),” said Felix. “So far, we’ve made use of the training sessions that STEP offers to develop the non-R&D aspects of the start-up. The funding is currently being used to develop a prototype compact wearable device that accelerates the capabilities of the app.”

User feedback

“Backed by HKU’s Knowledge Exchange Office, we’ve had the opportunity to conduct user tests of our prototypes via our partnership with the Ebenezer School and Home for the Visually Impaired,” he added. “The data we gathered from the tests was instrumental in calibrating our products, such as the sensitivity of the navigation algorithm. We’ve also listened to feedback from our test users and added features including facial recognition.”

Throughout the development of the Lumino project, Dr Tam has helped and guided the team in several aspects: “I have offered technical advice and shared my past experience in project development and management through our regular meetings,” he said. “I’ve also constantly reminded the team to seek input and feedback from the project’s stakeholders: that is, visually-impaired people, especially those at the Ebenezer School who have given valuable advice.”

He has further encouraged them, with the support of Computer Engineering Programme Director Professor Edmund Lam, to participate in local and international competitions so as to “broaden their views and horizons” through exchanges with others. In addition to the China Collegiate Computing Contest, Lumino has won awards at SG:Digital Wonderland 2019 (organised by the Infocomm Media Development Authority in Singapore) and at HKU’s first Engineering InnoShow in 2019, where it received the Best Project Award for the course ELEC3442 Embedded Systems.
“Our current priority is tweaking our prototype and carrying out more user tests within the next few months,” said Desmond. “Our next milestone is to graduate from STEP and be officially incubated in HKSTP – we’re optimistic about Lumino’s potential to reach new heights!”

Photo caption: Lumino was showcased at SG:Digital Wonderland 2019, the largest tech exposition in Singapore. From left: Dr Wilton Fok, Jasmine Poon, Desmond Wong, Alvin Yu, Felix Wong and their supervisor Dr Vincent Tam.

Photo caption: The Lumino team received the Best Project Award for the ELEC3442 Embedded Systems course from the Dean of Engineering, Professor Christopher Chao (left).

Photo caption: In partnership with the Ebenezer School and Home for the Visually Impaired, the team conducted user tests of the prototypes and modified features based on feedback from the test users.

A shortage of guide dogs in Hong Kong led two engineering students to develop an app for the visually impaired, which prioritises object identification, hazard detection and navigation.

Shedding Some Light | Knowledge Exchange | The University of Hong Kong Bulletin | November 2020