A tech expert has been invited to Silicon Valley by Google after using AI to help a disabled girl communicate with her eyes.
Earlier this year Scott Phillips, of Congleton, who runs his own computer graphics design and development agency, won a “hackathon” after programming, within three days, a system that used a basic selfie camera to accurately simulate head, face and eye muscles.
The challenge was to help a disabled girl with Rett Syndrome, a rare genetic disorder that affects brain development, resulting in severe mental and physical disability.
The system that people with paralysis use to communicate via their eyes involves infrared cameras costing thousands of pounds.
But Mr Phillips said his AI solution, using a basic camera at the top of a laptop, worked much faster and could be developed at a fraction of the cost.
He said his ambition was to take the design to market and he appealed for sponsors to back him.
During his five-day stay in Silicon Valley the 45-year-old will advise Google on how to solve complex data acquisition problems. He first programmed a computer at the age of eight and was inspired to work in what is known as “computer vision” by “Star Wars”, “Terminator 2” and, later, Tom Cruise’s sci-fi film “Minority Report”.
It is a far cry from his BTEC studies in graphic design at Mid Cheshire College in Northwich; he had previously attended Fallibroome School in Macclesfield and later graduated in 3D graphics from the University of Wolverhampton.
In the early 2000s Mr Phillips worked for a Congleton 3D graphics company, Advanced Illustration.
As his interest in computer graphics and simulation grew, coupled with advances in technology, he wanted to focus on computer vision, in which AI can extract information from images.
He was fascinated by Industrial Light & Magic, the special effects company George Lucas founded to make “Star Wars”.
His work with AI moved up a notch when he asked it to convert data from the 2D Atari “Star Wars” arcade video games of the 1980s into 3D.
He said: “I opened these files from the 80s inside of a modern 3D package. That was one of the most salient moments that I’ve had; talking to AI about how this data could be interpreted into 3D. It’s absolutely fascinating.
“Everybody is going crazy about AI, from Grok to ChatGPT, and it is amazing technology, but very expensive to run.
“The AI I’m interested in is the exact opposite – the one you can run on your smartphone or laptop.”
Spotted
During a conference for tech developers in Athens he was spotted by a hackathon host from Greece who liked his “computer vision” work.
Hackathons are events at which tech experts work rapidly on software engineering projects over a short period.
People from around Greece with various disabilities were at the hackathon, where Mr Phillips was asked to design a communication system for the girl with Rett Syndrome.
He explained: “She has tremors, which means her head moves very quickly and her body shakes but her eyes are stable.”
The solution would normally be to use infrared technology, the sort used to change channels with a television remote control.
He said: “When a disabled person with very limited motor control uses an infrared device, it sends the data back and forth using their eyeball movements. But this process is very slow and the computer is not able to make a reasonably accurate simulation of the eye muscles.”
Mr Phillips added: “But you can make that simulation with modern technology. The most successful way to do that is to just use a camera to capture data, which is then turned into the simulation.”
For the hackathon, Mr Phillips was asked to repurpose a computer game engine, a process he said was very straightforward.
“All the tools were there; I just needed to understand the conditions to be able to interpret the data so it worked for her.
“Her head movements needed to be slowed down to amplify the tiny movements around the eyes. When you combined those two you got accurate eye movements.
“Normally these things are very expensive so I used the game engine to make it real-time. It’s faster than, say, using a Nintendo Wii controller. The controller runs at around 100 milliseconds whereas my solution can run at 60 milliseconds.”
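The article does not say which software Mr Phillips built on, but the general technique he describes can be sketched in a few lines of Python. The sketch below is a minimal illustration rather than his actual code: it assumes the freely available MediaPipe face-mesh model and OpenCV, expresses the iris position relative to the eye corners so the signal follows the eyeball rather than the shaking head, then smooths what is left. The landmark indices follow MediaPipe’s documentation; the smoothing constant is an illustrative choice.

```python
# Minimal illustration only: the article does not name the libraries used.
# Assumes OpenCV (pip install opencv-python) and MediaPipe (pip install mediapipe).
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1,
    refine_landmarks=True,   # adds the iris landmarks to the face mesh
)

cap = cv2.VideoCapture(0)    # the ordinary selfie camera at the top of a laptop
smoothed = None
ALPHA = 0.2                  # illustrative smoothing constant: damps head tremor

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        continue
    lm = results.multi_face_landmarks[0].landmark

    # Express the iris centre relative to the eye corners, so the signal
    # follows the eyeball rather than the head (indices per the MediaPipe
    # docs: 468 = right-iris centre, 33 and 133 = right-eye corners).
    iris, outer, inner = lm[468], lm[33], lm[133]
    gaze_x = (iris.x - outer.x) / (inner.x - outer.x)   # roughly 0..1 across the eye

    # An exponential moving average filters residual shake at low latency.
    smoothed = gaze_x if smoothed is None else ALPHA * gaze_x + (1 - ALPHA) * smoothed
```

The smoothing constant trades shake suppression against responsiveness: a smaller value damps tremor more strongly at the cost of a slightly slower-moving signal.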
He explained: “It is like when you are on a Zoom call on your laptop. You put the camera on and then the computer, in real time, builds a model in 3D using every single frame. The data is then fed back into a user interface.
“As a result of that the girl could look at her computer screen and move very accurately across the screen and use functions displayed on the screen in real time. It’s still prototypical and very basic, but she can use it.”
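Continuing the illustrative sketch above, the filtered gaze signal could then drive an on-screen pointer, with a dwell timer standing in for a mouse click; this is one common way eye-controlled interfaces let users select a button. The screen width, the four-button layout and the dwell threshold below are assumed values for the sketch, not details of Mr Phillips’s prototype.

```python
# Illustrative continuation: map the smoothed 0..1 eye ratio onto the screen
# and fire a "click" when the user's gaze dwells in one region long enough.
import time

SCREEN_W = 1920                 # assumed display width in pixels
DWELL_SECONDS = 1.0             # illustrative: hold gaze this long to "click"

dwell_start, last_region = None, None

def gaze_to_cursor(smoothed_x: float) -> int:
    """Map the 0..1 eye ratio onto the horizontal span of the screen."""
    return int(min(max(smoothed_x, 0.0), 1.0) * SCREEN_W)

def update(smoothed_x: float) -> None:
    global dwell_start, last_region
    x = gaze_to_cursor(smoothed_x)
    region = x // (SCREEN_W // 4)          # say, four on-screen buttons
    if region != last_region:
        last_region, dwell_start = region, time.time()
    elif time.time() - dwell_start >= DWELL_SECONDS:
        print(f"select button {region}")   # stand-in for the real UI action
        dwell_start = time.time()          # reset so the gaze can dwell again
```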
He would like to bring his design to market with the help of business people, and would be happy to give away equity.
One of the topics Mr Phillips will discuss with Google in California is the scarcity of data about people’s everyday 3D environments.
Mr Phillips said: “No matter how much data is fed into computers there are blind spots all the time.”
He continued: “If people like Google have got a data scarcity issue – and I don’t want to sound arrogant – but I can definitely help.”
Of his future project plans he said: “I’ve already been approached to work on futuristic business models where you’ll be able to rent tele-operated robots and use computer vision to interact with them.”
Mr Phillips, who runs his company Inspired Labs from home, spends a lot of his time in Congleton’s cafes and pubs, which he described as being “so friendly”.
He said: “I don’t have an office but I do like to be around other people when I’m working.
“They let me sit there and put my feet up with my earphones on while I think about all of my algorithms. I really appreciate them.” He thanked the Lion and Swan, Orso Lounge, and The Loft café at Victoria Mill.