Patrick Sisson - Writer, Journalist, Cultural Documentarian, Music Lover

Beyond the Finish Line: How technology helped a blind athlete run free at the New York Marathon

The Verge

November 2017

Simon Wheatcroft, blind marathon runner, trains on his local football field, using the feel of the land to guide him through the goalposts near his home in Doncaster, UK.

 

It’s a literal road to nowhere. Stretching out from a roundabout outside the Robin Hood Airport in Doncaster, a town in Northern England, it’s a wholly unremarkable stretch of slowly cracking pavement, bushes, and weeds, an idle strip of asphalt near long-term parking and a bland business park.

For 35-year-old runner Simon Wheatcroft, however, this stretch of unused roadway may as well be his gym, training center, and proving grounds, his own private version of the 72 stone steps that make up a Rocky montage. Wheatcroft knows every inch of this one-third-mile strip of asphalt — from the contours of the roadway to the feeling of its double yellow lines of paint under his sneakers. Despite the mind-numbing bore of jogging such a short length in endless loops, Wheatcroft had to memorize it. He’s blind.

Imagine getting up from your desk or couch, closing your eyes, and walking to the other end of the room, or perhaps crossing the street in midday traffic. Most people wouldn’t have the audacity to do that without guidance or aid. Meanwhile, Wheatcroft has run the New York and Boston marathons, covered 100 miles in the Sahara Desert, and — perhaps most impressive — sprinted solo along the curving roads and streets of his small corner of rural England, sometimes alongside oncoming traffic, all without the benefit of actually seeing where he was going. Instead, he used the twin yellow lines at the road’s edge, feeling them through his sneakers, to keep from drifting into traffic. (Cars usually make it a point to avoid hitting people, he says, and honestly, they hate cyclists more.)

For the last few months, Wheatcroft has been training along these roads with renewed intensity. Though he’s finished countless races and even ultramarathons, he’s now focused on the New York Marathon, the premier event of its kind. He’s completed the race twice before, but this year carries another challenge. Thanks to the technology of a Brooklyn-based startup called WearWorks, and their prototype wearable navigation device, Wheatcroft aims to be the first blind runner to cover the course unaided and unassisted.

When it comes to technology developed for the visually impaired, “the biggest thing is accessibility and affordability,” says Wheatcroft. “How do we make visually impaired people more mobile? If these technologies exist, eventually they trickle down to people, and everybody uses them.”

The New York Marathon represents an edge case, a stress test, an extreme. Wheatcroft believes that by finding a way to navigate the route amid thousands of runners, he can help test technology that could assist the quarter of a billion people around the world today who are blind or have a vision impairment. Many of the visually impaired don’t have a job — 70–80 percent in the US are unemployed — and face varying degrees of mobility and navigational challenges.

“It upsets me that so many blind people don’t work, and a lot of that is due to mobility,” he says. “We should be at a point where we should be able to solve these things. I want to make better technology for the community as a whole.”

Photo by Abbie Trayler-Smith for The Verge

Walking through Doncaster with Wheatcroft, on the route where he takes his sons, Grayson and Franklin, to school, it’s difficult to tell he’s visually impaired. Even when he’s out walking with his guide dog, Ascot, Wheatcroft’s mental map of the surrounding roadways is so acute that he often gives precise directions to people too dependent on their smartphones to find their way without one.

“People would see me running and ask what I was doing, and eventually, I’d end up telling them where to go,” he says. “‘To your left, there’s a building, about 0.9 miles down the road, then you can turn right.’”

Wheatcroft often looks people in the eye when he talks, a force of habit from when he could see. He started losing his vision at 17 due to a degenerative eye disease called retinitis pigmentosa, a genetic condition that also blinded his uncle. (Only 1 percent of Americans who are blind are blind from birth.) At this point, Wheatcroft can only vaguely make out changes in light, or what he calls the “fog of dull color.” (He could tell when I stood in front of him and blocked the afternoon sun.)

“When I was young, I thought, ‘Oh, this’ll never really happen,’” he says of going fully blind. “I was always a little bit concerned about the things that I’d miss out on, like I wouldn’t be able to see my kids. That used to plague me. But at the same time, I thought medical science might solve the problem.”

Photo by Abbie Trayler-Smith for The Verge

Wheatcroft grew up near Doncaster and dreamed of being a fighter jet pilot, but his diagnosis ended that ambition. During high school, he rarely talked about his situation. After graduating, he went to college in Sheffield, where he received an undergraduate degree in psychology. He drifted for a bit, eventually working at a friend’s video game store for a few years before finding a job in IT. At 26, his vision rapidly deteriorated. The shift initially devastated him; he says his situation became “depressing as hell.” Without work, he felt like he had lost purpose. Over time, Wheatcroft found ways to acclimate to his condition — he recalls memorizing the route between local pubs — all part of what he says was a constant adjustment.

But during a three-week vacation traveling across the United States in 2009, Wheatcroft was reminded of his limits. He had planned to propose to his girlfriend Sian at the summit of Half Dome in Yosemite, California, a romantic vista accessible via an arduous hike. But Wheatcroft had trouble navigating the ascent, and as they crossed the tree line, Wheatcroft, with a ring in his pocket, became exhausted. The loose ground and steep incline were proving difficult. Light rain started falling, making the route even more treacherous. Sian asked him to stop and rest, and when he sat down at the halfway point, he realized he had to turn back. In the end, Wheatcroft proposed to Sian at the base of the mountain during a snack break. A few weeks later, still crisscrossing the US, they wed in Las Vegas.

Wheatcroft came back to the UK struggling with what had happened in Yosemite. He decided to take a “voluntary redundancy” and quit working. His failure to propose at the summit ate at him for weeks, then months. What if he kept giving up on his aspirations because he was blind?

It was around that time that Wheatcroft picked up a book given to him by an old university teacher: Ultramarathon Man by Dean Karnazes, a famed ultramarathon runner. Wheatcroft, who wasn’t very involved in sports as a teenager, thought that if Karnazes could endure long distances, and find significance and self-confidence in running, maybe he could, too. The idea marinated in his head for a few months. Maybe running could be his way to overcome obstacles, like the one that had forced him down the mountain.

Photo by Abbie Trayler-Smith for The Verge

In 2010, Wheatcroft started practicing in what he thought was a safe space: a soccer field in the back of an elementary school in Doncaster. He had done some weightlifting and CrossFit in high school and into his 20s, but this was different. Wheatcroft barely had the money to afford serious training: he ate candy bars from the corner store as a cheap source of calories, and when he wrote to Brooks, the running shoe company, explaining his cash-strapped situation, he got a free pair in the mail. Sprinting between posts in endless loops, he’d feel out the paint on the grass to help himself navigate, but it was far from foolproof. Occasionally he’d run into a dog walker, a post, or someone who just assumed he could see and swerve around them. Eventually he moved to the empty airport road and, after gaining confidence, ventured out onto surrounding streets and roadways.

“Had I not lived here, I don’t think I’d have even been able to start training,” he says. “Right location, right time, more than anything.”

In 2011, inspired by Karnazes and feeling confident after six months of training, Wheatcroft attempted his first ultramarathon: a 100-mile race in the Cotswolds, a rural area of rolling hills in south-central England. At mile 83, he was pulled from the race when he could no longer stand. But he didn’t stop running.

Photo by Abbie Trayler-Smith for The Verge
Photo by Abbie Trayler-Smith for The Verge

Over the next six years, he would go on to finish numerous marathons and ultramarathons: he ran the Boston Marathon in 2016 (which he finished in four hours and 45 minutes), the New York Marathon twice (in 2014, he finished in five hours and 14 minutes), and even ran the 220-mile route from New York to Boston over the course of nine days in 2014.

For most of these races, Wheatcroft ran with a guide, his friend Neil Bacon, who’s been running with him for four years. But increasingly, he’s been turning to technology to wean himself off of human guides. He attempted the Four Deserts Marathon in Namibia last May — a 155-mile-long, multi-day race through scorching, shade-free desert where temperatures climbed to 104 degrees Fahrenheit — using corrective navigation technology he helped develop with IBM engineers. The device used a series of audio cues to keep him on track; beeps would steer him and keep him within a virtual corridor mapped out by the program. They named the device “eAscot” after Wheatcroft’s dog.

Wheatcroft says the device functioned well as a proof of concept for corrective navigation, but it was a rush job and had too many functional constraints. The navigational corridor wasn’t tight enough, and the device assumed that the desert would be free of obstacles. On day two, Wheatcroft ran without Bacon trailing him; he hit an unmapped flagpole 10 miles in.

Competitive running is a notoriously injury-prone pastime, even for those with full sight. Long-distance runners face twisted ankles, runner’s knee, and shin splints. Wheatcroft says the most significant issue he and other blind runners face is drifting off their paths. He’s clipped countless lampposts and traffic lights during training, and tripped over ditches, piles of dirt, and even garbage left on the road. A few years ago, Wheatcroft was running down a roadway near his home when he unknowingly came upon a battered car, abandoned on the shoulder the day before. Wheatcroft hit the damaged vehicle running at full speed, cutting his shins. Disoriented, he tried to right himself and in the process cut his arms. He got up, dazed, covered in what he thought was sweat. When he realized it was blood, he panicked, unable to see himself, identify his injuries, or find landmarks that could help someone locate him. He found his phone amid the wreckage and called his wife, frantically telling her to come find him. Luckily, she was able to locate him by driving up and down his normal route.

“If I’d have smashed my phone,” Wheatcroft says, “I would have been fucked.”

Photo by Abbie Trayler-Smith for The Verge

Wheatcroft’s running career coincided with an advance that made his life as a blind person better: the 2009 release of Apple’s iPhone 3GS, the first smartphone with a built-in screen reader, VoiceOver.

“It was night and day,” he says. “It wasn’t just about training. Now I could read newspapers. I could cue up a song on Spotify. I can do it now, thanks to that phone.”

More important for Wheatcroft is the issue of mobility. Despite a massive market, one that’s forecast to grow as baby boomers age, there has been no truly affordable or readily attainable breakthrough navigation technology for the visually impaired. Meanwhile, the established everyday aids are imperfect: canes require environmental cues to work, and can’t provide directions to the store; guide dogs can master an area or a series of tasks, but can’t immediately learn a new neighborhood, or help navigate through an unfamiliar city.

“The basic skills we need to navigate aren’t the challenge,” says Karl Bélanger, a technology expert at the National Federation of the Blind. Canes and guide dogs work, he says, for general, day-to-day navigation. But it’s important to have supplements to basic mobility, especially in specialized circumstances.

Some new technologies have offered steps forward: Google Glass, in conjunction with a subscription service called Aira, can “see” for the blind. Aira gives the visually impaired immediate access to a remote, sighted assistant who can tell them what’s in their field of vision. (Erich Manser used Aira to run the Boston Marathon earlier this year.) It’s incredible technology, but it’s also expensive — the unlimited plan for Aira costs $329 a month — which may explain why Aira has fewer than a thousand subscribers. Other programs and devices, such as Microsoft’s Seeing AI, tap phone cameras and visual recognition software to help navigate certain scenarios, but they don’t offer wider navigation cues. And with their constant need for power and a Wi-Fi connection, they’re limiting.

“That’s why the dog and cane still reign supreme,” says Wheatcroft. “The only input a dog needs is food.”

The first technology Wheatcroft experimented with was a relatively basic app called Runkeeper, which simply told him how far he had gone with regular audio reminders. Those reminders helped jog his memory and maintain focus, as well as create detailed mental maps of his surroundings.

Photo by Abbie Trayler-Smith for The Verge

“It was just a data point, but that data point was like a comfort blanket,” he says. “That voice helped tell me what to do, and that almost becomes your internal voice. If I didn’t have that technology, I wouldn’t have had the extra confidence to go out.”

Now, Wheatcroft trains with Runkeeper and uses a treadmill at home; it’s a NordicTrack model hooked up to a program called iFit, which lets him run preprogrammed routes, practice pacing, and get used to the inclines and markers on his upcoming routes.

During races and long runs, Wheatcroft, like many other blind runners, relies on a much more low-tech way of getting around: human guides. Professional blind runners rely on volunteers and practice partners who are literally tethered to them by ropes in order to help them avoid hitting anything or anyone on the course. It’s both a liberating and a limiting factor.

“When you ask people why they run, it’s normally about freedom and independence, to go out and push yourself,” Wheatcroft says. “But you can only push yourself as much as the person you’re connected to.”

New Yorker Charles-Edouard Catherine, also a blind runner, is a member of Achilles International, an organization that pairs volunteer guides with athletes who have a variety of disabilities, including vision impairment, autism, and amputations. With chapters in more than 60 countries, the group fields a large team at marathons and other running events; at the New York Marathon, the group can field over 300 athletes with nearly 700 accompanying guides. (Many racers have multiple guide runners.) Catherine, who also has retinitis pigmentosa, says his first time running with Achilles in 2012 was life-changing.

“When you become blind, you get in a phase of denial where you do not want to accept the new condition you’re in, the new requirements that it implies. You don’t like to ask for help. I didn’t know what to do,” he says. “It was awkward. But I paired up with people depending on speed and level, and right away, it felt like a new community.”

Catherine started running regularly with Achilles, and he quickly realized the advantages and limits of running with a guide. He felt camaraderie with fellow runners, who would share the experience of a long race with him, and having someone with him to warn other runners and pedestrians to get out of the way felt like having a presidential escort. But the more Catherine trained, the more dependent he felt.

“I always need someone,” he says. “And that’s limiting. In New York in February, if it’s snowing and frozen, and you want to do hill repeats, you’re not going to find lots of volunteers.”

Photo by Andrew White for The Verge

Most of the technology Wheatcroft has used to date relies on audio cues. But audio is a constricting form of communication. Imagine a Siri or Alexa-like interface describing every single object in your field of vision. Consider the cognitive overload that it would create on an already loud street crowded with obstacles.

“When I’m walking down the street to my house, hearing that there’s a bush or a lamppost doesn’t really help me,” Wheatcroft says. “Just help me avoid it.”

That’s why Wheatcroft has become increasingly focused on the sense of touch. Haptic technology, Wheatcroft believes, can steer a visually impaired person without overloading their senses. A haptic device could be called up by a voice command to access existing GPS data for directions, then “steer” someone via gentle taps on their skin. (The system could be combined with additional sensor systems, or even a service animal or cane, to help avoid obstacles, grade changes, and immediate impediments.)

Earlier this year, Wheatcroft went searching for a company working on a haptic solution. That’s how he came across WearWorks.

Co-founded by a trio of graduate students at New York’s Pratt Institute, WearWorks traces its origins, at least in part, to visions of a kung fu suit and an “iTunes for movement.”

Keith Kirkland, a dreadlocked designer and engineer born in Camden, New Jersey, knew his way around clothing. A graduate of the Fashion Institute of Technology, a freelancer for Calvin Klein, and a one-time handbag engineer for Coach (“every bag has to be stress-tested to hold 150 pounds,” he says), Kirkland was inspired to explore haptic design while working on 3D modeling. An ex-girlfriend saw him hunched over a computer from across the room, noticed his poor posture, then walked over and shifted his shoulders.

“What if you could read my body posture and compare it to what’s right, all without being there?” he remembers thinking at the time. “What does it look like to have movement fully digitized?”

He spent months trying to fashion a crude prototype, which was the foundation for his thesis at Pratt. Imagine Neo uploading his martial arts mastery into The Matrix as a file. The end result was a crude punching meter, a sleeve that would measure the strength of a strike. The project fell apart due to the difficulty of connecting wires and motors to the elastic sleeve, but it got Kirkland thinking about haptics and feedback: how can we communicate movement instruction via touch?

Kirkland partnered with two classmates: Yangyang Wang and Kevin Yoo, a sculptor and painter turned industrial designer who had worked with Wang on a 2015 Intel competition called America’s Greatest Makers. The million-dollar contest, focused on wearable technology, was a perfect place to pool their design skills and work on a better haptic interface.

The team’s original idea was to create a general market notification device, but then Yoo remembered the story of Marcus Engel, a famous blind author and consultant whom Yoo had once heard speak. (Engel would later become a friend and adviser for the group.) The team began discussing how they might create a device to help the visually impaired navigate, “offloading” the communication of directions from verbal to tactile.

WearWorks’ early Wayband prototype didn’t win at the Intel competition, but a few weeks later, it did help them become fellows at the Next Top Makers incubator, an event sponsored by the New York Business Development Corporation. The recognition helped the team take the device to SXSW last year, and landed them a spot in the Urban-X incubator in Brooklyn’s Greenpoint neighborhood, where they recently finished a year-long residency. That’s where Wheatcroft came upon the group, and began working with them to develop and refine the technology.

“What they’ve understood is that it’s not about the maps. It’s about how you communicate with a person,” says Wheatcroft. “With verbal systems, you need to lose one of your senses for directions; hearing becomes dedicated to navigation. By using touch, which isn’t often used, you still leave audio free.”

Photo by Andrew White for The Verge
Photo by Andrew White for The Verge
Photo by Andrew White for The Verge
Photo by Andrew White for The Verge

The system developed by WearWorks uses GPS to create a map and route

The core technology behind the Wayband is relatively simple: users pair the Wayband with their phone, and it utilizes GPS to create and map a route. The path is surrounded by virtual “fencing,” and any time a user steps in the wrong direction, or approaches a mapped object or obstacle, the band buzzes in a sort of Morse code. (Four quick taps on the bracelet signal a left turn, for example, while two long taps signal a right turn.) It’s corrective navigation. Testing out an early version of the device at the Urban-X accelerator earlier this summer, I found myself slowly spinning in circles, eventually righting myself after getting the hang of the haptic cues. Kirkland compares it to creating an alphabet and vocabulary from scratch.
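To make the “Morse code” analogy concrete, here is a small illustrative sketch of how such a haptic vocabulary could be driven in code. The pulse timings, the cue names, and the vibrate callback are assumptions for illustration; this is not WearWorks’ firmware.

```python
import time

# Hypothetical haptic vocabulary: each cue is a list of (buzz_seconds, pause_seconds)
# pulses, loosely following the patterns described above.
HAPTIC_CUES = {
    "turn_left":  [(0.1, 0.1)] * 4,   # four quick taps
    "turn_right": [(0.5, 0.2)] * 2,   # two long taps
    "off_course": [(0.2, 0.2)] * 3,   # warning pattern while outside the virtual fence
}

def play_cue(name, vibrate):
    """Drive a vibration motor through one cue pattern.
    `vibrate(seconds)` stands in for whatever motor API the wearable exposes."""
    for buzz_s, pause_s in HAPTIC_CUES[name]:
        vibrate(buzz_s)
        time.sleep(pause_s)

# Demo with a fake motor that just prints instead of buzzing a wrist.
play_cue("turn_left", vibrate=lambda s: print(f"buzz for {s:.1f}s"))
```

The point of such a fixed, tiny vocabulary is exactly what Kirkland describes: an alphabet small enough to learn in minutes and read without thinking.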

“Keep it functional and simple,” says Yoo. “We actually went to the National Federation of the Blind, and they told us high-tech canes and proximity sensors are great, but what really would help us is wayfinding.”

Instead of reinventing navigation, or relying on new computer models, the device simply creates an easier-to-understand, universal system of directions, which connects to a GPS mapping system. The team is quick to note this doesn’t entirely solve the problem of navigation; though the Wayband can steer a blind person to the post office, it can’t help them avoid a pothole or cross a street. For that, Wheatcroft will be pairing the Wayband with an ultrasonic device the team devised to help with micro-scale navigation. Called the Tortoise, the green plastic device, roughly two inches square and strapped to Wheatcroft’s chest, broadcasts and receives ultrasonic vibrations. (The antenna looks like the small bump of a camera on a smartphone.) The Tortoise’s constant, low-level vibration speeds up when the reflected waves indicate another runner or object is close.
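The Tortoise’s behavior, a baseline buzz whose tempo rises as something gets closer, amounts to mapping a range reading onto a pulse rate. The sketch below shows one plausible mapping; the distance thresholds and rates are invented for illustration and are not the device’s real parameters.

```python
def pulse_rate_hz(distance_m, near_m=0.5, far_m=5.0, idle_hz=1.0, alert_hz=10.0):
    """Map an ultrasonic range reading to a vibration pulse rate.
    Beyond far_m: slow idle pulse. Inside near_m: fastest pulse.
    In between: a linear ramp, so the buzz speeds up as an obstacle approaches."""
    if distance_m >= far_m:
        return idle_hz
    if distance_m <= near_m:
        return alert_hz
    closeness = (far_m - distance_m) / (far_m - near_m)   # 0.0 far .. 1.0 near
    return idle_hz + closeness * (alert_hz - idle_hz)

# Example readings as another runner closes in:
for d in (6.0, 4.0, 2.0, 1.0, 0.4):
    print(f"{d:>4.1f} m -> {pulse_rate_hz(d):4.1f} pulses/sec")
```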

Catherine, who became one of a number of blind consultants for the WearWorks team after they reached out to him, loves the concept behind the technology.

“You have this bittersweet feeling. Why haven’t we figured this out five years ago?” he says. “I think this technology has been there for a long time.”

Throughout the last year, WearWorks and Wheatcroft have refined the technology. He tested an early prototype in April, and it was impressive enough that he was almost ready to use it for the actual race. During a visit to New York City in September, Wheatcroft briefly ran around Central Park with the updated device.

Wheatcroft loves the Wayband system because it’s what he calls a “safe sandbox.” Instead of running within a digital corridor between 10 and 50 meters wide (the system he developed with IBM), WearWorks’ Wayband works within a 2.5-meter corridor, which offers more accuracy and safety, especially in a race environment.

For the marathon, he’ll wear a larger armband-sized version of the device in addition to the Tortoise. Neil Bacon, Wheatcroft’s longtime guide runner, will be at the race as a precaution, but won’t be helping Wheatcroft along on this record-breaking attempt.

“My main concern is running into somebody,” Wheatcroft says. “If this is their first marathon, and they’ve been training for years, I don’t want to be the bloody idiot who runs into them and takes them out.”

After the race, WearWorks plans to begin selling early versions of the Wayband, including an armband-sized version for athletes, similar to what Wheatcroft will be wearing, starting at $300.

Catherine says the potential independence this device promises would be like going from a child to an adult, a graduation. It would be a different race. But he knows exactly what he’d like to do first.

“I would really love to guide someone else,” he says. “I would like to be on the other side.”

WearWorks co-founder Kevin Yoo adjusting the equipment prior to the race
 Photo by Amelia Holowaty Krales / The Verge

Wheatcroft’s bet on a haptic, rather than audio, navigation system was a smart one: the New York Marathon engulfs runners in noise.

Started in 1970 as a race that took place entirely within Central Park and had roughly 100 spectators, the New York City Marathon has become the largest and most important race of its kind. Last year, a record-setting 51,394 runners, representing every state in the US and 124 countries, completed a course that winds through each of New York City’s five boroughs. More than a million cheering and screaming fans, along with bands, DJs, and announcers, line the 26.2-mile course.

This year’s race took place on Sunday, November 5th. At 7AM, runners started to gather in corrals on Staten Island. They were itchy with nervous energy, ready to shed blankets and jackets, and — after long mornings of commuting on boats, buses, and trains to the edge of Staten Island — eager to just run.

Wheatcroft’s day started at 5AM with coffee, oatmeal, and so many press calls to UK media that he didn’t even have time to talk to his family. By 9:15, he was at the starting line, part of a group of athletes with disabilities that included other blind runners (and guides from Achilles International) as well as those using handcycles.

The 24 hours before the marathon were full of last-minute preparations. Wheatcroft and the WearWorks team ran final trials in Central Park on the eve of the race and discovered that the ultrasonic sensor wasn’t detecting objects in Wheatcroft’s vicinity. That night, the team huddled at a Thai restaurant in Manhattan to hack together a solution, and Yoo fabricated a new module overnight. Yoo, who was going to run with Wheatcroft to observe the Wayband and Tortoise in action, then made last-minute adjustments to the devices.

Press swarmed Wheatcroft with questions, and photographers snapped photos. New York Times reporter Jeré Longman was there, and would shadow Wheatcroft for the first few miles. Runners in front of Wheatcroft started asking members of the entourage if they should know who he was.

Minutes before the start, a stoic Wheatcroft, more serious and slightly more rigid than he was back in England, slipped out of his black tracksuit. At 9:57, as a slight drizzle fell on the crowd, the start gun was fired and the pack of hundreds began to move. Wheatcroft hung at the rear, and was one of the last of his group to begin crossing the Verrazano-Narrows Bridge.

Wheatcroft was running as the forward point in an invisible triangle. Though he was navigating independently, Bacon and a second guide, Croak, both of whom had guided him at the Boston Marathon last year, ran 10 feet behind. As Wheatcroft cleared the bridge with a smooth, steady gait, Bacon and Croak hung back, giving him a wide lead. A water station appeared in Wheatcroft’s path, and both guides bit their tongues to avoid tipping him off. This was the Tortoise’s first test. Wheatcroft felt the device vibrate faster, so he slowed down and weaved around the obstacle.

“Then it became a totally different race,” says Bacon. “I’d never seen him dodge things like that on his own. The hard thing was standing back and letting him go.”

From there, Wheatcroft continued through Brooklyn and Queens, picking up the pace, enjoying the freedom provided by the twin devices. Bacon and Croak, accustomed to chatting with Wheatcroft, hung back. They watched him avoid large groups of runners, the Tortoise functioning like it was meant to.

“At the beginning, it was like, ‘Oh my god, we’re doing it,’” says Wheatcroft. “It was exactly how I imagined we’d avoid people in the crowd. I was running faster because I was enjoying it working.”

But the team didn’t count on rain. Around mile 15, the Tortoise, whose performance had been steadily deteriorating as the rainfall picked up, stopped working altogether. At the same time, the Wayband was having difficulty picking up signals. The sheer volume of data and cellular traffic along the route didn’t help, says Yoo.

Photographs by Amelia Holowaty Krales / The Verge

“We had every single problem possible,” Yoo would later say, during a post-race stretch near the finish. “There was lots of high-rises causing signal issues, issues with navigation while crossing bridges. We did the hardest thing we could do: testing the Wayband during the marathon.”

As the navigation aids faltered, Wheatcroft found himself working more, forced to concentrate harder to move ahead. Combined with his early surge, the extra effort left him feeling drained. By the time they crossed the East River and headed through Manhattan, Bacon, Croak, and Yoo assumed typical guide duties. As the group passed through Manhattan’s Upper East Side, Wheatcroft and his guides ran side by side.

Wheatcroft crossed the finish line at 3:15PM, five hours and 17 minutes after the start, with Yoo and Bacon flanking him. Over the last leg of the marathon, he demonstrated the same steady gait he had at the start, but it was clear he was spent. Huddled under two blankets and clutching a cup of sugary, milky tea in the finish area, he said the sheer amount of mental energy required to navigate with the system added to the physical exhaustion of the race. He had expended too much energy at the beginning, and hadn’t anticipated how much more the navigation itself would demand.

After Wheatcroft crossed the finish line, he put his arm around Bacon and flashed a grin. He appeared excited and relieved to have met the physical challenges of the race. But the unproven technology, which showed promise under the harshest of conditions, ultimately didn’t last the entire marathon, and Wheatcroft was unable to finish unaided.

I asked Bacon what he thought of the whole endeavor: he felt it was a great success. Exhausted, Wheatcroft couldn’t muster much of a response. “Right now, I really don’t know. I’m too tired to think,” he said.

Photo by Abbie Trayler-Smith for The Verge
 Photo by Abbie Trayler-Smith for The Verge

In the hours after the race, Yoo cataloged improvements for next time: the software algorithm needs to sort out data discrepancies better, the hardware needs to stand up to harsher conditions, and they need a better GPS system. WearWorks clearly doesn’t have the budget to launch a fleet of satellites, but Yoo believes a mass-market GPS chip coming to the smartphone market next year will allow accuracy to within roughly a foot, and significantly improve the performance of their system.

Despite being exhausted, Wheatcroft lit up a little when asked about the future of the Wayband after the marathon.

“We took something we always knew was going to be an intense test,” he says. “We tested so many worst-case scenarios. Let’s take the lessons learned, and see how we can improve it.”

Wheatcroft is looking toward the future, and even more strenuous challenges. Already an advocate and occasional speaker, he’d next like to pursue triathlons. In addition, he’s consulted with tech companies about inclusivity and designing for the visually impaired, and he’s continuing his studies, including computer coding. (He’s currently working at home with a braille reader, and pursuing a master’s in computer science.) Wheatcroft wants to be more than a runner; eventually, he doesn’t just want to test the technology, he wants to help develop and build it.

“As a blind person, you always strive for independence,” says Wheatcroft. “But it’s a bit of a contradiction, because oftentimes, you’re using somebody with sight to become independent. What we’re trying to do is use this technology to really achieve true independence. This race isn’t about time, it’s proving that something is possible.”
